Sample records for accurately reproduce experimental

  1. Can cancer researchers accurately judge whether preclinical reports will reproduce?

    PubMed Central

    Mandel, David R.; Kimmelman, Jonathan

    2017-01-01

    There is vigorous debate about the reproducibility of research findings in cancer biology. Whether scientists can accurately assess which experiments will reproduce original findings is important to determining the pace at which science self-corrects. We collected forecasts from basic and preclinical cancer researchers on the first 6 replication studies conducted by the Reproducibility Project: Cancer Biology (RP:CB) to assess the accuracy of expert judgments on specific replication outcomes. On average, researchers forecasted a 75% probability of replicating the statistical significance and a 50% probability of replicating the effect size, yet none of these studies successfully replicated on either criterion (for the 5 studies with results reported). Accuracy was related to expertise: experts with higher h-indices were more accurate, whereas experts with more topic-specific expertise were less accurate. Our findings suggest that experts, especially those with specialized knowledge, were overconfident about the RP:CB replicating individual experiments within published reports; researcher optimism likely reflects a combination of overestimating the validity of original studies and underestimating the difficulties of repeating their methodologies. PMID:28662052
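
    The abstract does not state the scoring rule used to grade these forecasts; as one common way to score probabilistic forecasts against binary replication outcomes, a Brier-score sketch in Python (hypothetical numbers, not the study's data) is:

    ```python
    # Brier score: mean squared difference between forecast probability and outcome
    # (0 = perfect; 0.25 is what a constant 50% forecast would earn).
    def brier_score(forecasts, outcomes):
        """forecasts: probabilities in [0, 1]; outcomes: 1 = replicated, 0 = not."""
        return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

    # Hypothetical example mirroring the pattern described above: optimistic forecasts,
    # no successful replications among the five reported studies.
    forecasts = [0.75, 0.70, 0.80, 0.75, 0.72]
    outcomes = [0, 0, 0, 0, 0]
    print(brier_score(forecasts, outcomes))  # ~0.55, i.e., worse than an uninformed 50% guess
    ```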

  2. Systematic heterogenization for better reproducibility in animal experimentation.

    PubMed

    Richter, S Helene

    2017-08-31

    The scientific literature is full of articles discussing poor reproducibility of findings from animal experiments as well as failures to translate results from preclinical animal studies to clinical trials in humans. Critics even go so far as to talk about a "reproducibility crisis" in the life sciences, a novel headword that increasingly finds its way into numerous high-impact journals. Viewed from a cynical perspective, Fett's law of the lab "Never replicate a successful experiment" has thus taken on a completely new meaning. So far, poor reproducibility and translational failures in animal experimentation have mostly been attributed to biased animal data, methodological pitfalls, current publication ethics and animal welfare constraints. More recently, the concept of standardization has also been identified as a potential source of these problems. By reducing within-experiment variation, rigorous standardization regimes limit the inference to the specific experimental conditions. In this way, however, individual phenotypic plasticity is largely neglected, resulting in statistically significant but possibly irrelevant findings that are not reproducible under slightly different conditions. By contrast, systematic heterogenization has been proposed as a concept to improve representativeness of study populations, contributing to improved external validity and hence improved reproducibility. While some first heterogenization studies are indeed very promising, it is still not clear how this approach can be transferred into practice in a logistically feasible and effective way. Thus, further research is needed to explore different heterogenization strategies as well as alternative routes toward better reproducibility in animal experimentation.

  3. Accurate and reproducible measurements of RhoA activation in small samples of primary cells.

    PubMed

    Nini, Lylia; Dagnino, Lina

    2010-03-01

    Rho GTPase activation is essential in a wide variety of cellular processes. Measurement of Rho GTPase activation is difficult with limited material, such as tissues or primary cells that exhibit stringent culture requirements for growth and survival. We defined parameters to accurately and reproducibly measure RhoA activation (i.e., RhoA-GTP) in cultured primary keratinocytes in response to serum and growth factor stimulation using enzyme-linked immunosorbent assay (ELISA)-based G-LISA assays. We also established conditions that minimize RhoA-GTP in unstimulated cells without affecting viability, allowing accurate measurements of RhoA activation on stimulation or induction of exogenous GTPase expression. Copyright 2009 Elsevier Inc. All rights reserved.

  4. Digital image analysis: improving accuracy and reproducibility of radiographic measurement.

    PubMed

    Bould, M; Barnard, S; Learmonth, I D; Cunningham, J L; Hardy, J R

    1999-07-01

    The aim was to assess the accuracy and reproducibility of a digital image analyser and the human eye in measuring radiographic dimensions. We experimentally compared radiographic measurement using either an image analyser system or the human eye aided by a digital caliper. The assessment of total hip arthroplasty wear from radiographs relies on both the accuracy of radiographic images and the accuracy of radiographic measurement. Radiographs were taken of a slip gauge (30 ± 0.00036 mm) and of the slip gauge with a femoral stem. The projected dimensions of the radiographic images were calculated by trigonometry. The radiographic dimensions were then measured by blinded observers using both techniques. For a single radiograph, the human eye was accurate to 0.26 mm and reproducible to ±0.1 mm. In comparison, the digital image analyser system was accurate to 0.01 mm with a reproducibility of ±0.08 mm. In an arthroplasty model, where the dimensions of an object were corrected for magnification by the known dimensions of a femoral head, the human eye was accurate to 0.19 mm, whereas the image analyser system was accurate to 0.04 mm. The digital image analysis system is up to 20 times more accurate than the human eye, and in an arthroplasty model the accuracy of measurement increases four-fold. We believe such image analysis may allow more accurate and reproducible measurement of wear from standard follow-up radiographs.
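
    The magnification correction described above amounts to scaling each measured dimension by the ratio of the femoral head's known diameter to its imaged diameter; a minimal sketch with illustrative numbers (not the study's data):

    ```python
    # Correct a projected radiographic measurement using a reference object of known size
    # (here a femoral head) visible in the same image. All values in millimetres.
    def correct_for_magnification(measured_mm, head_measured_mm, head_true_mm):
        scale = head_true_mm / head_measured_mm  # inverse of the radiographic magnification
        return measured_mm * scale

    # Illustrative values only: a 28 mm head imaged at 30.9 mm implies ~9% magnification.
    print(correct_for_magnification(measured_mm=6.55, head_measured_mm=30.9, head_true_mm=28.0))
    ```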

  5. Accurate and reproducible functional maps in 127 human cell types via 2D genome segmentation

    PubMed Central

    Hardison, Ross C.

    2017-01-01

    Abstract The Roadmap Epigenomics Consortium has published whole-genome functional annotation maps in 127 human cell types by integrating data from studies of multiple epigenetic marks. These maps have been widely used for studying gene regulation in cell type-specific contexts and predicting the functional impact of DNA mutations on disease. Here, we present a new map of functional elements produced by applying a method called IDEAS on the same data. The method has several unique advantages and outperforms existing methods, including that used by the Roadmap Epigenomics Consortium. Using five categories of independent experimental datasets, we compared the IDEAS and Roadmap Epigenomics maps. While the overall concordance between the two maps is high, the maps differ substantially in the prediction details and in their consistency of annotation of a given genomic position across cell types. The annotation from IDEAS is uniformly more accurate than the Roadmap Epigenomics annotation and the improvement is substantial based on several criteria. We further introduce a pipeline that improves the reproducibility of functional annotation maps. Thus, we provide a high-quality map of candidate functional regions across 127 human cell types and compare the quality of different annotation methods in order to facilitate biomedical research in epigenomics. PMID:28973456

  6. Towards reproducible experimental studies for non-convex polyhedral shaped particles

    NASA Astrophysics Data System (ADS)

    Wilke, Daniel N.; Pizette, Patrick; Govender, Nicolin; Abriak, Nor-Edine

    2017-06-01

    The packing density and flat bottomed hopper discharge of non-convex polyhedral particles are investigated in a systematic experimental study. The motivation for this study is two-fold. Firstly, to establish an approach to deliver quality experimental particle packing data for non-convex polyhedral particles that can be used for characterization and validation purposes of discrete element codes. Secondly, to make the reproducibility of experimental setups as convenient and readily available as possible using affordable and accessible technology. The primary technology for this study is fused deposition modeling, used to 3D print polylactic acid (PLA) particles with readily available 3D printer technology. A total of 8000 biodegradable particles were printed: 1000 white and 1000 black particles for each of the four particle types considered in this study. Reproducibility is one benefit of using fused deposition modeling to print particles, but an extremely important additional benefit is that specific particle properties can be explicitly controlled. For example, the volume fraction of each particle can be controlled, i.e. the effective particle density can be adjusted. The particle volume decreases drastically as the non-convexity is increased; however, all printed white particles have the same mass to within 2% of each other.

  7. Experimental Procedures for Sensitive and Reproducible In Situ EPR Tooth Dosimetry

    PubMed Central

    Williams, Benjamin B.; Sucheta, Artur; Dong, Ruhong; Sakata, Yasuko; Iwasaki, Akinori; Burke, Gregory; Grinberg, Oleg; Lesniewski, Piotr; Kmiec, Maciej; Swartz, Harold M.

    2007-01-01

    In vivo electron paramagnetic resonance (EPR) tooth dosimetry provides a means for non-invasive retrospective assessment of personal radiation exposure. While there is a clear need for such capabilities following radiation accidents, the most pressing need for the development of this technology is the heightened likelihood of terrorist events or nuclear conflicts. This technique will enable such measurements to be made at the site of an incident, while the subject is present, to assist emergency personnel as they perform triage for the affected population. At Dartmouth Medical School this development is currently being tested with normal volunteers with irradiated teeth placed in their mouths and with patients who have undergone radiation therapy. Here we describe progress in practical procedures to provide accurate and reproducible in vivo dose estimates. PMID:18591989

  8. Understanding reproducibility of human IVF traits to predict next IVF cycle outcome.

    PubMed

    Wu, Bin; Shi, Juanzi; Zhao, Wanqiu; Lu, Suzhen; Silva, Marta; Gelety, Timothy J

    2014-10-01

    Evaluating the failed IVF cycle often provides useful prognostic information. Before undergoing another attempt, patients experiencing an unsuccessful IVF cycle frequently request information about the probability of future success. Here, we introduced the concept of reproducibility and formulae to predict the next IVF cycle outcome. The experimental design was based on the retrospective review of IVF cycle data from 2006 to 2013 in two different IVF centers and statistical analysis. The reproducibility coefficients (r) of IVF traits including number of oocytes retrieved, oocyte maturity, fertilization, embryo quality and pregnancy were estimated using the intraclass correlation coefficient between the repeated IVF cycle measurements for the same patient by variance component analysis. The formulae were designed to predict the next IVF cycle outcome. The number of oocytes retrieved from patients and their fertilization rate had the highest reproducibility coefficients (r = 0.81 ~ 0.84), which indicated a very close correlation between the first retrieval cycle and subsequent IVF cycles. Oocyte maturity and number of top quality embryos had intermediate reproducibility (r = 0.38 ~ 0.76) and pregnancy rate had relatively lower reproducibility (r = 0.23 ~ 0.27). Based on these parameters, the next outcome for these IVF traits might be accurately predicted by the designed formulae. The introduction of the concept of reproducibility to our human IVF program allows us to predict future IVF cycle outcomes. The traits of oocyte numbers retrieved, oocyte maturity, fertilization, and top quality embryos had high or intermediate reproducibility, which provides a basis for accurate prediction of future IVF outcomes. Based on this prediction, physicians may counsel their patients or change patients' stimulation plans, and laboratory embryologists may improve their IVF techniques accordingly.
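
    The abstract describes the reproducibility coefficient as a correlation between repeated cycles of the same patient estimated by variance component analysis; a minimal one-way random-effects sketch (the paper's exact model and formulae are not given here) is:

    ```python
    import numpy as np

    def reproducibility_coefficient(cycles_per_patient):
        """One-way random-effects intraclass correlation from repeated measurements.
        cycles_per_patient: list of equal-length sequences, one per patient."""
        data = np.asarray(cycles_per_patient, dtype=float)
        n, k = data.shape  # n patients, k cycles each
        grand_mean = data.mean()
        ms_between = k * ((data.mean(axis=1) - grand_mean) ** 2).sum() / (n - 1)
        ms_within = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
        return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

    # Hypothetical oocyte counts for four patients over two cycles (illustration only).
    print(reproducibility_coefficient([[12, 11], [5, 6], [20, 18], [9, 10]]))  # ~0.97
    ```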

  9. PDB-NMA of a protein homodimer reproduces distinct experimental motility asymmetry.

    PubMed

    Tirion, Monique M; Ben-Avraham, Daniel

    2018-01-16

    We have extended our analytically derived PDB-NMA formulation, Atomic Torsional Modal Analysis or ATMAN (Tirion and ben-Avraham 2015 Phys. Rev. E 91 032712), to include protein dimers using mixed internal and Cartesian coordinates. A test case on a 1.3 Å resolution model of a small homodimer, ActVA-ORF6, consisting of two 112-residue subunits identically folded in a compact 50 Å sphere, reproduces the distinct experimental Debye-Waller motility asymmetry for the two chains, demonstrating that structure sensitively selects vibrational signatures. The vibrational analysis of this PDB entry, together with biochemical and crystallographic data, demonstrates the cooperative nature of the dimeric interaction of the two subunits and suggests a mechanical model for subunit interconversion during the catalytic cycle.

  10. PDB-NMA of a protein homodimer reproduces distinct experimental motility asymmetry

    NASA Astrophysics Data System (ADS)

    Tirion, Monique M.; ben-Avraham, Daniel

    2018-03-01

    We have extended our analytically derived PDB-NMA formulation, Atomic Torsional Modal Analysis or ATMAN (Tirion and ben-Avraham 2015 Phys. Rev. E 91 032712), to include protein dimers using mixed internal and Cartesian coordinates. A test case on a 1.3 Å resolution model of a small homodimer, ActVA-ORF6, consisting of two 112-residue subunits identically folded in a compact 50 Å sphere, reproduces the distinct experimental Debye-Waller motility asymmetry for the two chains, demonstrating that structure sensitively selects vibrational signatures. The vibrational analysis of this PDB entry, together with biochemical and crystallographic data, demonstrates the cooperative nature of the dimeric interaction of the two subunits and suggests a mechanical model for subunit interconversion during the catalytic cycle.

  11. Three-dimensional endoanal ultrasound is accurate and reproducible in determining type and height of anal fistulas.

    PubMed

    Kołodziejczak, M; Santoro, G A; Obcowska, A; Lorenc, Z; Mańczak, M; Sudoł-Szopińska, I

    2017-04-01

    Surgical treatment of high anal fistulas is associated with the potential risk of faecal incontinence and recurrence. The primary aim of this study was to determine the accuracy of three-dimensional endoanal ultrasound (3D-EAUS) in the assessment of height and type of anal fistulas, compared to the intra-operative findings (gold standard). The secondary aim was to evaluate the inter-observer reproducibility of 3D-EAUS. The study design was a prospective analysis of retrospective data. 299 patients (202 men), mean age 45.3 years, who underwent surgery for anal fistulas, were included. All patients were preoperatively assessed by 3D-EAUS. Two readers independently reviewed the volumes to determine the type and height of fistulas. Sensitivity, specificity, positive and negative predictive values, proportion of agreements and Cohen's kappa coefficient (κ) were calculated for both examiners. Ultrasound findings were compared with intra-operative data (reference standard), evaluated blindly by the surgeons. At surgery, 201 (67%) were transsphincteric, 49 (16%) suprasphincteric, 47 (16%) intersphincteric and two (1%) extrasphincteric fistulas. Intra-operatively, 177 (59%) were low and 122 (41%) high fistulas. The overall accuracy of 3D-EAUS was 91% for fistula type (271/299 fistulas: 97% transsphincteric, 100% intersphincteric, 57% suprasphincteric, 0% extrasphincteric) and 92% for fistula height (275/299 fistulas: 80% high and 100% low). Both readers reported very good agreement with surgery in the assessment of fistula type (proportion of agreement 0.88, κ = 0.89) and height (proportion of agreement 0.90, κ = 0.91). 3D-EAUS is an accurate and reproducible modality for the assessment of type and height of anal fistulas. Colorectal Disease © 2016 The Association of Coloproctology of Great Britain and Ireland.
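
    For reference, the chance-corrected agreement statistic quoted above (Cohen's kappa) can be computed from two readers' paired calls as in the following sketch; the labels are hypothetical, not the study data:

    ```python
    from collections import Counter

    def cohens_kappa(reader_a, reader_b):
        """Chance-corrected agreement between two raters over the same cases."""
        n = len(reader_a)
        p_observed = sum(a == b for a, b in zip(reader_a, reader_b)) / n
        freq_a, freq_b = Counter(reader_a), Counter(reader_b)
        p_chance = sum(freq_a[label] * freq_b[label]
                       for label in set(reader_a) | set(reader_b)) / n ** 2
        return (p_observed - p_chance) / (1 - p_chance)

    # Hypothetical fistula-type calls by two readers (illustration only).
    a = ["trans", "trans", "inter", "supra", "trans", "inter"]
    b = ["trans", "trans", "inter", "trans", "trans", "inter"]
    print(cohens_kappa(a, b))  # ~0.70
    ```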

  12. Panel-based Genetic Diagnostic Testing for Inherited Eye Diseases is Highly Accurate and Reproducible and More Sensitive for Variant Detection Than Exome Sequencing

    PubMed Central

    Bujakowska, Kinga M.; Sousa, Maria E.; Fonseca-Kelly, Zoë D.; Taub, Daniel G.; Janessian, Maria; Wang, Dan Yi; Au, Elizabeth D.; Sims, Katherine B.; Sweetser, David A.; Fulton, Anne B.; Liu, Qin; Wiggs, Janey L.; Gai, Xiaowu; Pierce, Eric A.

    2015-01-01

    Purpose: Next-generation sequencing (NGS) based methods are being adopted broadly for genetic diagnostic testing, but the performance characteristics of these techniques have not been fully defined with regard to test accuracy and reproducibility. Methods: We developed a targeted enrichment and NGS approach for genetic diagnostic testing of patients with inherited eye disorders, including inherited retinal degenerations (IRDs), optic atrophy and glaucoma. In preparation for providing this Genetic Eye Disease (GEDi) test on a CLIA-certified basis, we performed experiments to measure the sensitivity, specificity, and reproducibility, as well as the clinical sensitivity, of the test. Results: The GEDi test is highly reproducible and accurate, with sensitivity and specificity for single nucleotide variant detection of 97.9% and 100%, respectively. The sensitivity for variant detection was notably better than the 88.3% achieved by whole exome sequencing (WES) using the same metrics, due to better coverage of targeted genes in the GEDi test compared to commercially available exome capture sets. Prospective testing of 192 patients with IRDs indicated that the clinical sensitivity of the GEDi test is high, with a diagnostic rate of 51%. Conclusion: The data suggest that, based on quantified performance metrics, selective targeted enrichment is preferable to WES for genetic diagnostic testing. PMID:25412400

  13. Ensemble MD simulations restrained via crystallographic data: Accurate structure leads to accurate dynamics

    PubMed Central

    Xue, Yi; Skrynnikov, Nikolai R

    2014-01-01

    Currently, the best existing molecular dynamics (MD) force fields cannot accurately reproduce the global free-energy minimum which realizes the experimental protein structure. As a result, long MD trajectories tend to drift away from the starting coordinates (e.g., crystallographic structures). To address this problem, we have devised a new simulation strategy aimed at protein crystals. An MD simulation of protein crystal is essentially an ensemble simulation involving multiple protein molecules in a crystal unit cell (or a block of unit cells). To ensure that average protein coordinates remain correct during the simulation, we introduced crystallography-based restraints into the MD protocol. Because these restraints are aimed at the ensemble-average structure, they have only minimal impact on conformational dynamics of the individual protein molecules. So long as the average structure remains reasonable, the proteins move in a native-like fashion as dictated by the original force field. To validate this approach, we have used the data from solid-state NMR spectroscopy, which is the orthogonal experimental technique uniquely sensitive to protein local dynamics. The new method has been tested on the well-established model protein, ubiquitin. The ensemble-restrained MD simulations produced lower crystallographic R factors than conventional simulations; they also led to more accurate predictions for crystallographic temperature factors, solid-state chemical shifts, and backbone order parameters. The predictions for 15N R1 relaxation rates are at least as accurate as those obtained from conventional simulations. Taken together, these results suggest that the presented trajectories may be among the most realistic protein MD simulations ever reported. In this context, the ensemble restraints based on high-resolution crystallographic data can be viewed as protein-specific empirical corrections to the standard force fields. PMID:24452989

  14. Reproducibility studies for experimental epitope detection in macrophages (EDIM).

    PubMed

    Japink, Dennis; Nap, Marius; Sosef, Meindert N; Nelemans, Patty J; Coy, Johannes F; Beets, Geerard; von Meyenfeldt, Maarten F; Leers, Math P G

    2014-05-01

    We have recently described epitope detection in macrophages (EDIM) by flow cytometry. This is a promising tool for the diagnosis and follow-up of malignancies. However, biological and technical validation is warranted before clinical applicability can be explored. The pre-analytic and analytic phases were investigated. Five different aspects were assessed: blood sample stability, intra-individual variability in healthy persons, intra-assay variation, inter-assay variation and assay transferability. The post-analytic phase was already partly standardized and described in an earlier study. The outcomes in the pre-analytic phase showed that samples are stable for 24 h after venipuncture. Biological variation over time was similar to that of serum tumor marker assays; each patient has a baseline value. Intra-assay variation showed good reproducibility, while inter-assay variation showed reproducibility similar to that of established serum tumor marker assays. Furthermore, the assay showed excellent transferability between analyzers. Under optimal analytic conditions the EDIM method is technically stable, reproducible and transferable. Biological variation over time needs further assessment in future work. Copyright © 2014 Elsevier B.V. All rights reserved.

  15. Reproducible surface-enhanced Raman quantification of biomarkers in multicomponent mixtures.

    PubMed

    De Luca, Anna Chiara; Reader-Harris, Peter; Mazilu, Michael; Mariggiò, Stefania; Corda, Daniela; Di Falco, Andrea

    2014-03-25

    Direct and quantitative detection of unlabeled glycerophosphoinositol (GroPIns), an abundant cytosolic phosphoinositide derivative, would allow rapid evaluation of several malignant cell transformations. Here we report label-free analysis of GroPIns via surface-enhanced Raman spectroscopy (SERS) with a sensitivity of 200 nM, well below its apparent concentration in cells. Crucially, our SERS substrates, based on lithographically defined gold nanofeatures, can be used to accurately predict the GroPIns concentration even in multicomponent mixtures, avoiding the preliminary separation of individual compounds. Our results represent a critical step toward the creation of a SERS-based biosensor for rapid, label-free, and reproducible detection of specific molecules, overcoming the limits of current experimental methods.

  16. Experimental and Numerical Models of Complex Clinical Scenarios; Strategies to Improve Relevance and Reproducibility of Joint Replacement Research

    PubMed Central

    Bechtold, Joan E.; Swider, Pascal; Goreham-Voss, Curtis; Soballe, Kjeld

    2016-01-01

    This research review aims to focus attention on the effect of specific surgical and host factors on implant fixation, and the importance of accounting for them in experimental and numerical models. These factors affect (a) eventual clinical applicability and (b) reproducibility of findings across research groups. Proper function and longevity for orthopedic joint replacement implants rely on secure fixation to the surrounding bone. Technology and surgical technique have improved over the last 50 years, and robust ingrowth and decades of implant survival are now routinely achieved for healthy patients and first-time (primary) implantation. Second-time (revision) implantation presents with bone loss and interfacial bone gaps in areas vital for secure mechanical fixation. Patients with medical comorbidities such as infection, smoking, congestive heart failure, kidney disease, and diabetes have a diminished healing response, poorer implant fixation, and greater revision risk. It is these more difficult clinical scenarios that require research to evaluate more advanced treatment approaches. Such treatments can include osteogenic or antimicrobial implant coatings, allo- or autogenous cellular or tissue-based approaches, local and systemic drug delivery, and surgical approaches. Regarding implant-related approaches, most experimental and numerical models do not generally impose conditions that represent mechanical instability at the implant interface, or recalcitrant healing. Many treatments will work well in forgiving settings, but fail in complex human settings with disease, bone loss, or previous surgery. Ethical considerations mandate that we justify and limit the number of animals tested, which restricts experimental permutations of treatments. Numerical models provide flexibility to evaluate multiple parameters and combinations, but generally need to employ simplifying assumptions. The objectives of this paper are (a) to highlight the importance of mechanical

  17. Experimental and theoretical oscillator strengths of Mg I for accurate abundance analysis

    NASA Astrophysics Data System (ADS)

    Pehlivan Rhodin, A.; Hartman, H.; Nilsson, H.; Jönsson, P.

    2017-02-01

    Context: With the aid of stellar abundance analysis, it is possible to study the formation and evolution of the Galaxy. Magnesium is an important element to trace the α-element evolution in our Galaxy. For chemical abundance analysis, such as magnesium abundance, accurate and complete atomic data are essential. Inaccurate atomic data lead to uncertain abundances and prevent discrimination between different evolution models. Aims: We study the spectrum of neutral magnesium from laboratory measurements and theoretical calculations. Our aim is to improve the oscillator strengths (f-values) of Mg I lines and to create a complete set of accurate atomic data, particularly for the near-IR region. Methods: We derived oscillator strengths by combining the experimental branching fractions with radiative lifetimes reported in the literature and computed in this work. A hollow cathode discharge lamp was used to produce free atoms in the plasma and a Fourier transform spectrometer recorded the intensity-calibrated high-resolution spectra. In addition, we performed theoretical calculations using the multiconfiguration Hartree-Fock program ATSP2K. Results: This project provides a set of experimental and theoretical oscillator strengths. We derived 34 experimental oscillator strengths. Except for the Mg I optical triplet lines (3p 3P°0,1,2-4s 3S1), these oscillator strengths are measured for the first time. The theoretical oscillator strengths are in very good agreement with the experimental data and complement the missing transitions of the experimental data up to n = 7 from even and odd parity terms. We present an evaluated set of oscillator strengths, gf, with uncertainties as small as 5%. The new oscillator strength values for the Mg I optical triplet lines (3p 3P°0,1,2-4s 3S1) are 0.08 dex larger than the previous measurements.
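
    The conversion from a measured branching fraction and an upper-level radiative lifetime to an absorption oscillator strength uses the standard relation f = 1.499e-16 · λ²[Å] · (g_u/g_l) · A_ul; a minimal sketch with hypothetical inputs (not the measured Mg I values) is:

    ```python
    import math

    def oscillator_strength(branching_fraction, lifetime_s, wavelength_angstrom, g_upper, g_lower):
        """Absorption f-value from an experimental branching fraction and upper-level
        lifetime: A_ul = BF / tau, then f = 1.499e-16 * lambda^2 * (g_u/g_l) * A_ul."""
        A_ul = branching_fraction / lifetime_s  # Einstein A coefficient in s^-1
        return 1.499e-16 * wavelength_angstrom ** 2 * (g_upper / g_lower) * A_ul

    # Hypothetical inputs for illustration only.
    f = oscillator_strength(0.40, 10e-9, 5184.0, g_upper=3, g_lower=3)
    print(f"f = {f:.3f}, log gf = {math.log10(3 * f):.3f}")
    ```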

  18. How reproducible are methods to measure the dynamic viscoelastic properties of poroelastic media?

    NASA Astrophysics Data System (ADS)

    Bonfiglio, Paolo; Pompoli, Francesco; Horoshenkov, Kirill V.; Rahim, Mahmud Iskandar B. Seth A.; Jaouen, Luc; Rodenas, Julia; Bécot, François-Xavier; Gourdon, Emmanuel; Jaeger, Dirk; Kursch, Volker; Tarello, Maurizio; Roozen, Nicolaas Bernardus; Glorieux, Christ; Ferrian, Fabrizio; Leroy, Pierre; Vangosa, Francesco Briatico; Dauchez, Nicolas; Foucart, Félix; Lei, Lei; Carillo, Kevin; Doutres, Olivier; Sgard, Franck; Panneton, Raymond; Verdiere, Kévin; Bertolini, Claudio; Bär, Rolf; Groby, Jean-Philippe; Geslain, Alan; Poulain, Nicolas; Rouleau, Lucie; Guinault, Alain; Ahmadi, Hamid; Forge, Charlie

    2018-08-01

    There is a considerable number of research publications on the acoustical properties of porous media with an elastic frame. A simple search through the Web of Science™ (last accessed 21 March 2018) suggests that there are at least 819 publications which deal with the acoustics of poroelastic media. A majority of these studies require accurate knowledge of the elastic properties over a broad frequency range. However, the accuracy of the measurement of the dynamic elastic properties of poroelastic media has been a contentious issue. The novelty of this paper is that it studies the reproducibility of some popular experimental methods which are used routinely to measure the key elastic properties such as the dynamic Young's modulus, loss factor and Poisson's ratio of poroelastic media. In this paper, fourteen independent sets of laboratory measurements were performed on specimens of the same porous materials. The results from these measurements suggest that the reproducibility of this type of experimental method is poor. This work can help to suggest improvements that harmonize the way the elastic properties of poroelastic media are measured worldwide.

  19. Evaluation of New Reference Genes in Papaya for Accurate Transcript Normalization under Different Experimental Conditions

    PubMed Central

    Chen, Weixin; Chen, Jianye; Lu, Wangjin; Chen, Lei; Fu, Danwen

    2012-01-01

    Real-time reverse transcription PCR (RT-qPCR) is a preferred method for rapid and accurate quantification of gene expression. Appropriate application of RT-qPCR requires accurate normalization through the use of reference genes. Because no single reference gene is universally suitable for all experiments, validation of reference gene(s) under different experimental conditions is crucial for RT-qPCR analysis. To date, only a few studies on reference genes have been done in other plants and none in papaya. In the present work, we selected 21 candidate reference genes and evaluated their expression stability in 246 papaya fruit samples using three algorithms, geNorm, NormFinder and RefFinder. The samples consisted of 13 sets collected under different experimental conditions, including various tissues, different storage temperatures, different cultivars, developmental stages, postharvest ripening, modified atmosphere packaging, 1-methylcyclopropene (1-MCP) treatment, hot water treatment, biotic stress and hormone treatment. Our results demonstrated that expression stability varied greatly between reference genes and that suitable reference gene(s), or combinations of reference genes, should be validated according to the experimental conditions. In general, the internal reference genes EIF (Eukaryotic initiation factor 4A), TBP1 (TATA binding protein 1) and TBP2 (TATA binding protein 2) performed well under most experimental conditions, whereas the most widely used reference genes, ACTIN (Actin 2), 18S rRNA (18S ribosomal RNA) and GAPDH (Glyceraldehyde-3-phosphate dehydrogenase), were not suitable under many experimental conditions. In addition, two commonly used programs, geNorm and NormFinder, proved sufficient for the validation. This work provides the first systematic analysis for the selection of superior reference genes for accurate transcript normalization in papaya under different experimental conditions. PMID
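
    As an illustration of the kind of stability ranking these algorithms produce, a geNorm-style stability measure M (the average standard deviation of pairwise log-ratios across samples; lower is more stable) can be sketched as follows; the expression matrix is hypothetical:

    ```python
    import numpy as np

    def genorm_m_values(expression):
        """geNorm-style stability M for each candidate reference gene.
        expression: samples x genes array of relative expression quantities."""
        expression = np.asarray(expression, dtype=float)
        log2 = np.log2(expression)
        n_genes = expression.shape[1]
        m_values = np.empty(n_genes)
        for j in range(n_genes):
            pairwise_sd = [np.std(log2[:, j] - log2[:, k], ddof=1)
                           for k in range(n_genes) if k != j]
            m_values[j] = np.mean(pairwise_sd)
        return m_values

    # Hypothetical relative quantities: 4 samples x 3 candidate genes (illustration only).
    print(genorm_m_values([[1.0, 2.0, 0.5], [1.1, 2.2, 0.9], [0.9, 1.9, 0.4], [1.0, 2.1, 1.5]]))
    ```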

  20. NNLOPS accurate associated HW production

    NASA Astrophysics Data System (ADS)

    Astill, William; Bizon, Wojciech; Re, Emanuele; Zanderighi, Giulia

    2016-06-01

    We present a next-to-next-to-leading order accurate description of associated HW production consistently matched to a parton shower. The method is based on reweighting events obtained with the HW plus one jet NLO accurate calculation implemented in POWHEG, extended with the MiNLO procedure, to reproduce NNLO accurate Born distributions. Since the Born kinematics is more complex than the cases treated before, we use a parametrization of the Collins-Soper angles to reduce the number of variables required for the reweighting. We present phenomenological results at 13 TeV, with cuts suggested by the Higgs Cross section Working Group.

  1. Genotypic variability enhances the reproducibility of an ecological study.

    PubMed

    Milcu, Alexandru; Puga-Freitas, Ruben; Ellison, Aaron M; Blouin, Manuel; Scheu, Stefan; Freschet, Grégoire T; Rose, Laura; Barot, Sebastien; Cesarz, Simone; Eisenhauer, Nico; Girin, Thomas; Assandri, Davide; Bonkowski, Michael; Buchmann, Nina; Butenschoen, Olaf; Devidal, Sebastien; Gleixner, Gerd; Gessler, Arthur; Gigon, Agnès; Greiner, Anna; Grignani, Carlo; Hansart, Amandine; Kayler, Zachary; Lange, Markus; Lata, Jean-Christophe; Le Galliard, Jean-François; Lukac, Martin; Mannerheim, Neringa; Müller, Marina E H; Pando, Anne; Rotter, Paula; Scherer-Lorenzen, Michael; Seyhun, Rahme; Urban-Mead, Katherine; Weigelt, Alexandra; Zavattaro, Laura; Roy, Jacques

    2018-02-01

    Many scientific disciplines are currently experiencing a 'reproducibility crisis' because numerous scientific findings cannot be repeated consistently. A novel but controversial hypothesis postulates that stringent levels of environmental and biotic standardization in experimental studies reduce reproducibility by amplifying the impacts of laboratory-specific environmental factors not accounted for in study designs. A corollary to this hypothesis is that a deliberate introduction of controlled systematic variability (CSV) in experimental designs may lead to increased reproducibility. To test this hypothesis, we had 14 European laboratories run a simple microcosm experiment using grass (Brachypodium distachyon L.) monocultures and grass and legume (Medicago truncatula Gaertn.) mixtures. Each laboratory introduced environmental and genotypic CSV within and among replicated microcosms established in either growth chambers (with stringent control of environmental conditions) or glasshouses (with more variable environmental conditions). The introduction of genotypic CSV led to 18% lower among-laboratory variability in growth chambers, indicating increased reproducibility, but had no significant effect in glasshouses where reproducibility was generally lower. Environmental CSV had little effect on reproducibility. Although there are multiple causes for the 'reproducibility crisis', deliberately including genetic variability may be a simple solution for increasing the reproducibility of ecological studies performed under stringently controlled environmental conditions.

  2. Reproducible analyses of microbial food for advanced life support systems

    NASA Technical Reports Server (NTRS)

    Petersen, Gene R.

    1988-01-01

    The use of yeasts in controlled ecological life support systems (CELSS) for microbial food regeneration in space required the accurate and reproducible analysis of intracellular carbohydrate and protein levels. The reproducible analysis of glycogen was a key element in estimating overall content of edibles in candidate yeast strains. Typical analytical methods for estimating glycogen in Saccharomyces were not found to be entirely applicable to other candidate strains. Rigorous cell lysis coupled with acid/base fractionation followed by specific enzymatic glycogen analyses was required to obtain accurate results in two strains of Candida. A profile of edible fractions of these strains was then determined. The suitability of yeasts as food sources in CELSS food production processes is discussed.

  3. Reproducibility2020: Progress and priorities

    PubMed Central

    Freedman, Leonard P.; Venugopalan, Gautham; Wisman, Rosann

    2017-01-01

    The preclinical research process is a cycle of idea generation, experimentation, and reporting of results. The biomedical research community relies on the reproducibility of published discoveries to create new lines of research and to translate research findings into therapeutic applications. Since 2012, when scientists from Amgen reported that they were able to reproduce only 6 of 53 “landmark” preclinical studies, the biomedical research community began discussing the scale of the reproducibility problem and developing initiatives to address critical challenges. Global Biological Standards Institute (GBSI) released the “Case for Standards” in 2013, one of the first comprehensive reports to address the rising concern of irreproducible biomedical research. Further attention was drawn to issues that limit scientific self-correction, including reporting and publication bias, underpowered studies, lack of open access to methods and data, and lack of clearly defined standards and guidelines in areas such as reagent validation. To evaluate the progress made towards reproducibility since 2013, GBSI identified and examined initiatives designed to advance quality and reproducibility. Through this process, we identified key roles for funders, journals, researchers and other stakeholders and recommended actions for future progress. This paper describes our findings and conclusions. PMID:28620458

  4. The ability of different materials to reproduce accurate records of interocclusal relationships in the vertical dimension.

    PubMed

    Ghazal, M; Albashaireh, Z S; Kern, M

    2008-11-01

    Restorations made on incorrectly mounted casts might require considerable intra-oral adjustments to correct the occlusion or might even necessitate a remake of the restoration. The aim of this study was to evaluate interocclusal recording materials for their ability to reproduce accurate vertical interocclusal relationships after a storage time of 1 and 48 h, respectively. A custom-made apparatus was used to simulate the maxilla and mandible. Eight interocclusal records were made in each of the following groups: G1: Aluwax (aluminium wax), G2: Beauty Pink wax (hydrocarbon wax compound), G3: Futar D, G4: Futar D Fast, G5: Futar Scan (G3-G5: vinyl polysiloxane), G6: Ramitec (polyether). The vertical discrepancies were measured by an inductive displacement transducer connected to a carrier frequency amplifier after storage of the records for two periods of 1 and 48 h. Two-way ANOVA was used for statistical analysis. The mean vertical discrepancies in μm (1/48 h) for G1 (31/35) and G2 (35/38) were statistically significantly higher than for the other groups G3 (8/10), G4 (11/12), G5 (6/8) and G6 (5/8) (P ≤ 0.05). There were no statistically significant differences between the elastomers tested. The effect of storage on the vertical discrepancies was statistically significant (P < 0.001). Vinyl polysiloxane and polyether interocclusal records can be used to relate working casts during mounting procedures without significant vertical displacement of the casts.
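
    A minimal sketch of the two-way analysis described above (recording material and storage time as factors, vertical discrepancy as the response), assuming a tidy table of measurements and using statsmodels, could look like this; the numbers are illustrative, not the study data:

    ```python
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    # Hypothetical vertical discrepancies (micrometres), two replicates per cell.
    df = pd.DataFrame({
        "material": ["Aluwax", "Aluwax", "Futar D", "Futar D", "Ramitec", "Ramitec"] * 2,
        "storage_h": [1, 48] * 6,
        "discrepancy_um": [31, 35, 8, 10, 5, 8, 30, 36, 9, 11, 6, 7],
    })

    model = ols("discrepancy_um ~ C(material) + C(storage_h) + C(material):C(storage_h)",
                data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))  # F tests for material, storage time and interaction
    ```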

  5. Experimentally Reproducing Thermal Breakdown of Rock at Earth's Surface

    NASA Astrophysics Data System (ADS)

    Eppes, M. C.; Griffiths, L.; Heap, M. J.; Keanini, R.; Baud, P.

    2016-12-01

    Thermal stressing induces microcrack growth in rock in part due to thermal expansion mismatch between different minerals, mineral phases, or crystalline axes and/or thermal gradients in the entire rock mass. This knowledge is largely derived from experimental studies of thermal microcracking, typically under conditions of very high temperatures (hundreds of °C). Thermal stressing at lower temperatures has received significantly less attention despite the fact that it may play an important role in rock breakdown at and near Earth's surface (Aldred et al., 2015; Collins and Stock, 2016). In particular, Eppes et al. (2016) attribute recorded Acoustic Emissions (AE) from a highly instrumented granite boulder sitting on the ground in natural conditions to subcritical crack growth driven by thermal stresses arising from a combination of solar- and weather-induced temperature changes; however the maximum temperature the boulder experienced was just 65 °C. In order to better understand these results without complicating factors of a natural environment, we conducted controlled laboratory experiments on cylindrical samples (40 mm length and 20 mm diameter) cored from the same granite as the Eppes et al. (2016) experiment, subjecting them to temperature fluctuations that reproduced the field measurements. We used a novel experimental configuration whereby two high temperature piezo-transducers are each in contact with an opposing face of the sample. The servo-controlled uniaxial press compensates for the thermal expansion and contraction of the pistons and the sample, keeping the coupling between the transducers and the sample, and the axial force acting on the sample, constant throughout. The system records AE, as well as P-wave velocity, both independent proxies for microfracture, as well as strain and temperature. Preliminary tests, heating and cooling granite at a rate of 1 °C/min, show that a large amount of AE occurs at temperatures as low as 100 °C. Ultimately, by

  6. ROCS: a Reproducibility Index and Confidence Score for Interaction Proteomics Studies

    PubMed Central

    2012-01-01

    published AP-MS experiments, each containing well characterized protein interactions, allowing for systematic benchmarking of ROCS. We show that our method may be used on its own to make accurate identification of specific, biologically relevant protein-protein interactions, or in combination with other AP-MS scoring methods to significantly improve inferences. Conclusions: Our method addresses important issues encountered in AP-MS datasets, making ROCS a very promising tool for this purpose, either on its own or in conjunction with other methods. We anticipate that our methodology may be used more generally in proteomics studies and databases, where experimental reproducibility issues arise. The method is implemented in the R language, and is available as an R package called “ROCS”, freely available from the CRAN repository http://cran.r-project.org/. PMID:22682516

  7. A theoretical and experimental benchmark study of core-excited states in nitrogen

    NASA Astrophysics Data System (ADS)

    Myhre, Rolf H.; Wolf, Thomas J. A.; Cheng, Lan; Nandi, Saikat; Coriani, Sonia; Gühr, Markus; Koch, Henrik

    2018-02-01

    The high resolution near edge X-ray absorption fine structure spectrum of nitrogen displays the vibrational structure of the core-excited states. This makes nitrogen well suited for assessing the accuracy of different electronic structure methods for core excitations. We report high resolution experimental measurements performed at the SOLEIL synchrotron facility. These are compared with theoretical spectra calculated using coupled cluster theory and algebraic diagrammatic construction theory. The coupled cluster singles and doubles with perturbative triples model known as CC3 is shown to accurately reproduce the experimental excitation energies as well as the spacing of the vibrational transitions. The computational results are also shown to be systematically improved within the coupled cluster hierarchy, with the coupled cluster singles, doubles, triples, and quadruples method faithfully reproducing the experimental vibrational structure.

  8. Assessment of the performance of numerical modeling in reproducing a replenishment of sediments in a water-worked channel

    NASA Astrophysics Data System (ADS)

    Juez, C.; Battisacco, E.; Schleiss, A. J.; Franca, M. J.

    2016-06-01

    The artificial replenishment of sediment is used as a method to re-establish sediment continuity downstream of a dam. However, the impact of this technique on the hydraulic conditions, and the resulting bed morphology, is yet to be understood. Several numerical tools have been developed in recent years for modeling sediment transport and morphology evolution which can be used for this application. These models range from 1D to 3D approaches: the former are overly simplistic for the simulation of such a complex geometry; the latter often require a prohibitive computational effort. However, 2D models are computationally efficient and in these cases may already provide sufficiently accurate predictions of the morphology evolution caused by the sediment replenishment in a river. Here, the 2D shallow water equations in combination with the Exner equation are solved by means of a weakly coupled strategy. The classical friction approach considered for reproducing the bed channel roughness has been modified to take into account the morphological effect of the replenishment, which causes a fining of the channel bed. Computational outcomes are compared with four sets of experimental data obtained from several replenishment configurations studied in the laboratory. The experiments differ in terms of placement volume and configuration. A set of analysis parameters is proposed for the experimental-numerical comparison, with particular attention to the spreading, covered surface and travel distance of the placed replenishment grains. The numerical tool is reliable in reproducing the overall tendency shown by the experimental data. The effect of bed fining on roughness is better reproduced with the approach proposed herein. However, it is also highlighted that the sediment clusters found in the experiment are not well reproduced numerically in the regions of the channel with a limited number of sediment grains.
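
    For reference, a generic weakly coupled form of the governing system mentioned above (standard 2D shallow water equations plus the Exner equation; the paper's specific friction and bedload closures are not reproduced here) is:

    ```latex
    % Depth h, depth-averaged velocity \mathbf{u}, bed elevation z_b, bed shear stress \boldsymbol{\tau}_b:
    \frac{\partial h}{\partial t} + \nabla\cdot(h\mathbf{u}) = 0, \qquad
    \frac{\partial (h\mathbf{u})}{\partial t}
      + \nabla\cdot\!\left(h\,\mathbf{u}\otimes\mathbf{u} + \tfrac{1}{2} g h^{2}\,\mathbf{I}\right)
      = -\,g h \nabla z_b - \frac{\boldsymbol{\tau}_b}{\rho}

    % Bed evolution (Exner), with porosity p and bedload flux \mathbf{q}_s given by an empirical
    % closure; weak coupling updates z_b with the hydrodynamic fields held fixed over the step:
    (1 - p)\,\frac{\partial z_b}{\partial t} + \nabla\cdot\mathbf{q}_s = 0
    ```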

  9. Development of accurate potentials to explore the structure of water on 2D materials

    NASA Astrophysics Data System (ADS)

    Bejagam, Karteek; Singh, Samrendra; Deshmukh, Sanket; Deshmukh Group Team; Samrendra Group Collaboration

    Water plays an important role in many biological and non-biological processes. Thus, the structure of water at various interfaces and under confinement has always been a topic of immense interest. 2-D materials have shown great potential in surface coating applications and nanofluidic devices. However, an exact atomic-level understanding of the wettability of a single layer of these 2-D materials is still lacking, mainly due to the lack of experimental techniques and computational methodologies, including accurate force-field potentials and algorithms to measure the contact angle of water. In the present study, we have developed a new algorithm to measure the accurate contact angle between water and 2-D materials. The algorithm is based on fitting the best sphere to the shape of the droplet. This novel spherical fitting method accounts for every individual molecule of the droplet, rather than those at the surface only. We employ this method of contact angle measurement to develop accurate non-bonded potentials between water and 2-D materials, including graphene and boron nitride (BN), to reproduce the experimentally observed contact angle of water on these 2-D materials. Different water models such as SPC, SPC/Fw, and TIP3P were used to study the structure of water at the interfaces.
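
    A minimal version of the sphere-fit idea (linear least squares over all droplet coordinates, then the contact angle from the fitted centre height and radius) might look like the following sketch; the function and variable names are illustrative, not the authors' code:

    ```python
    import numpy as np

    def contact_angle_from_sphere_fit(xyz, z_surface=0.0):
        """Fit a sphere to droplet coordinates (N x 3) via the linearised form
        x^2 + y^2 + z^2 = 2ax + 2by + 2cz + d, then return the contact angle in degrees.
        For a spherical cap on the plane z = z_surface, cos(theta) = -(c - z_surface) / R."""
        xyz = np.asarray(xyz, dtype=float)
        A = np.c_[2.0 * xyz, np.ones(len(xyz))]
        rhs = (xyz ** 2).sum(axis=1)
        (a, b, c, d), *_ = np.linalg.lstsq(A, rhs, rcond=None)
        radius = np.sqrt(d + a ** 2 + b ** 2 + c ** 2)
        cos_theta = -(c - z_surface) / radius
        return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

    # Synthetic check: points on a sphere of radius 10 centred 3 units above the surface
    # should give theta = arccos(-0.3), roughly 107.5 degrees.
    rng = np.random.default_rng(0)
    pts = rng.normal(size=(2000, 3))
    pts = 10.0 * pts / np.linalg.norm(pts, axis=1, keepdims=True) + [0.0, 0.0, 3.0]
    print(contact_angle_from_sphere_fit(pts))
    ```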

  10. Reproducibility of Regional Pulse Wave Velocity in Healthy Subjects

    PubMed Central

    Lee, Nak Bum

    2009-01-01

    Background/Aims Despite the clinical importance and widespread use of pulse wave velocity (PWV), there are no standards for pulse sensors or for system requirements to ensure accurate pulse wave measurement. We assessed the reproducibility of PWV values using a newly developed PWV measurement system. Methods The system used in this study was the PP-1000, which simultaneously provides regional PWV values from arteries at four different sites (carotid, femoral, radial, and dorsalis pedis). Seventeen healthy male subjects without any cardiovascular disease participated in this study. Two observers performed two consecutive measurements in the same subject in random order. To evaluate the reproducibility of the system, two sets of analyses (within-observer and between-observer) were performed. Results The means±SD of PWV for the aorta, arm, and leg were 7.0±1.48, 8.43±1.14, and 8.09±0.98 m/s as measured by observer A and 6.76±1.00, 7.97±0.80, and 7.97±0.72 m/s by observer B, respectively. Between-observer differences for the aorta, arm, and leg were 0.14±0.62, 0.18±0.84, and 0.07±0.86 m/s, respectively, and the correlation coefficients were high, especially for aortic PWV (r=0.93). All the measurements showed significant correlation coefficients, ranging from 0.94 to 0.99. Conclusions The PWV measurement system used in this study provides accurate analysis results with high reproducibility. It is necessary to provide an accurate algorithm for the detection of additional features such as flow wave, reflection wave, and dicrotic notch from a pulse waveform. PMID:19270477
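
    Regional PWV itself is simply the path length between two recording sites divided by the pulse transit time; a minimal sketch with illustrative numbers:

    ```python
    def pulse_wave_velocity(path_length_m, t_foot_proximal_s, t_foot_distal_s):
        """Regional PWV = distance between sites / foot-to-foot pulse transit time."""
        return path_length_m / (t_foot_distal_s - t_foot_proximal_s)

    # Hypothetical carotid-to-femoral example (illustration only): ~7.0 m/s.
    print(pulse_wave_velocity(0.52, t_foot_proximal_s=0.112, t_foot_distal_s=0.186))
    ```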

  11. Assessing the accuracy and reproducibility of modality independent elastography in a murine model of breast cancer

    PubMed Central

    Weis, Jared A.; Flint, Katelyn M.; Sanchez, Violeta; Yankeelov, Thomas E.; Miga, Michael I.

    2015-01-01

    Abstract. Cancer progression has been linked to mechanics. Therefore, there has been recent interest in developing noninvasive imaging tools for cancer assessment that are sensitive to changes in tissue mechanical properties. We have developed one such method, modality independent elastography (MIE), that estimates the relative elastic properties of tissue by fitting anatomical image volumes acquired before and after the application of compression to biomechanical models. The aim of this study was to assess the accuracy and reproducibility of the method using phantoms and a murine breast cancer model. Magnetic resonance imaging data were acquired, and the MIE method was used to estimate relative volumetric stiffness. Accuracy was assessed using phantom data by comparing to gold-standard mechanical testing of elasticity ratios. Validation error was <12%. Reproducibility analysis was performed on animal data, and within-subject coefficients of variation ranged from 2 to 13% at the bulk level and 32% at the voxel level. To our knowledge, this is the first study to assess the reproducibility of an elasticity imaging metric in a preclinical cancer model. Our results suggest that the MIE method can reproducibly generate accurate estimates of the relative mechanical stiffness and provide guidance on the degree of change needed in order to declare biological changes rather than experimental error in future therapeutic studies. PMID:26158120

  12. A theoretical and experimental benchmark study of core-excited states in nitrogen

    DOE PAGES

    Myhre, Rolf H.; Wolf, Thomas J. A.; Cheng, Lan; ...

    2018-02-14

    The high resolution near edge X-ray absorption fine structure spectrum of nitrogen displays the vibrational structure of the core-excited states. This makes nitrogen well suited for assessing the accuracy of different electronic structure methods for core excitations. We report high resolution experimental measurements performed at the SOLEIL synchrotron facility. These are compared with theoretical spectra calculated using coupled cluster theory and algebraic diagrammatic construction theory. The coupled cluster singles and doubles with perturbative triples model known as CC3 is shown to accurately reproduce the experimental excitation energies as well as the spacing of the vibrational transitions. In conclusion, the computational results are also shown to be systematically improved within the coupled cluster hierarchy, with the coupled cluster singles, doubles, triples, and quadruples method faithfully reproducing the experimental vibrational structure.

  13. A theoretical and experimental benchmark study of core-excited states in nitrogen

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Myhre, Rolf H.; Wolf, Thomas J. A.; Cheng, Lan

    The high resolution near edge X-ray absorption fine structure spectrum of nitrogen displays the vibrational structure of the core-excited states. This makes nitrogen well suited for assessing the accuracy of different electronic structure methods for core excitations. We report high resolution experimental measurements performed at the SOLEIL synchrotron facility. These are compared with theoretical spectra calculated using coupled cluster theory and algebraic diagrammatic construction theory. The coupled cluster singles and doubles with perturbative triples model known as CC3 is shown to accurately reproduce the experimental excitation energies as well as the spacing of the vibrational transitions. In conclusion, the computational results are also shown to be systematically improved within the coupled cluster hierarchy, with the coupled cluster singles, doubles, triples, and quadruples method faithfully reproducing the experimental vibrational structure.

  14. Reproducing impact ionization mass spectra of E and F ring ice grains at different impact speeds

    NASA Astrophysics Data System (ADS)

    Klenner, F.; Reviol, R.; Postberg, F.

    2017-09-01

    As impact speeds of E and F ring ice grains impinging onto the target of impact ionization mass spectrometers in space can vary greatly, the resulting cationic or anionic mass spectra can have very different appearances. The mass spectra can be accurately reproduced with an analog experimental setup IR-FL-MALDI-ToF-MS (Infrared Free Liquid Matrix Assisted Laser Desorption and Ionization Time of Flight Mass Spectrometry). We compare mass spectra of E and F ring ice grains taken by the Cosmic Dust Analyzer (CDA) onboard Cassini recorded at different impact speeds with our analog spectra and prove the capability of the analog experiment.

  15. Automated morphometry provides accurate and reproducible virtual staging of liver fibrosis in chronic hepatitis C

    PubMed Central

    Calès, Paul; Chaigneau, Julien; Hunault, Gilles; Michalak, Sophie; Cavaro-Menard, Christine; Fasquel, Jean-Baptiste; Bertrais, Sandrine; Rousselet, Marie-Christine

    2015-01-01

    morphometric scores provide reproducible and accurate diagnoses of fibrosis stages via “virtual expert pathologist.” PMID:26110088

  16. Technical Note: Using experimentally determined proton spot scanning timing parameters to accurately model beam delivery time.

    PubMed

    Shen, Jiajian; Tryggestad, Erik; Younkin, James E; Keole, Sameer R; Furutani, Keith M; Kang, Yixiu; Herman, Michael G; Bues, Martin

    2017-10-01

    The aim was to accurately model the beam delivery time (BDT) for a synchrotron-based proton spot scanning system using experimentally determined beam parameters. A model to simulate the proton spot delivery sequences was constructed, and BDT was calculated by summing times for layer switch, spot switch, and spot delivery. Test plans were designed to isolate and quantify the relevant beam parameters in the operation cycle of the proton beam therapy delivery system. These parameters included the layer switch time, magnet preparation and verification time, average beam scanning speeds in x- and y-directions, proton spill rate, and maximum charge and maximum extraction time for each spill. The experimentally determined parameters, as well as the nominal values initially provided by the vendor, served as inputs to the model to predict BDTs for 602 clinical proton beam deliveries. The calculated BDTs (T_BDT) were compared with the BDTs recorded in the treatment delivery log files (T_Log): ∆t = T_Log - T_BDT. The experimentally determined average layer switch time for all 97 energies was 1.91 s (ranging from 1.9 to 2.0 s for beam energies from 71.3 to 228.8 MeV), the average magnet preparation and verification time was 1.93 ms, the average scanning speeds were 5.9 m/s in the x-direction and 19.3 m/s in the y-direction, the proton spill rate was 8.7 MU/s, and the maximum proton charge available for one acceleration was 2.0 ± 0.4 nC. Some of the measured parameters differed from the nominal values provided by the vendor. The calculated BDTs using experimentally determined parameters matched the recorded BDTs of the 602 beam deliveries (∆t = -0.49 ± 1.44 s), which was significantly more accurate than BDTs calculated using nominal timing parameters (∆t = -7.48 ± 6.97 s). An accurate model for BDT prediction was achieved by using the experimentally determined proton beam therapy delivery parameters, which may be useful in modeling the interplay effect and patient throughput. The model may
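
    A highly simplified version of the timing model described above, summing layer switches, per-spot magnet preparation, scanning travel time and MU delivery at the spill rate (and ignoring spill-recharge limits), could be sketched as follows; the parameter defaults are the values quoted in the abstract, while the structure of the loop is an assumption:

    ```python
    def beam_delivery_time(layers, layer_switch_s=1.91, magnet_prep_s=0.00193,
                           v_x_m_per_s=5.9, v_y_m_per_s=19.3, spill_rate_mu_per_s=8.7):
        """layers: list of layers, each a list of (x_m, y_m, mu) spots in delivery order.
        Returns an estimated beam delivery time in seconds."""
        t = 0.0
        for layer in layers:
            t += layer_switch_s  # energy/layer change
            previous = None
            for x, y, mu in layer:
                if previous is not None:
                    dx, dy = abs(x - previous[0]), abs(y - previous[1])
                    t += magnet_prep_s + max(dx / v_x_m_per_s, dy / v_y_m_per_s)  # spot switch
                t += mu / spill_rate_mu_per_s  # spot delivery
                previous = (x, y)
        return t

    # Two hypothetical layers of three spots each (illustration only).
    layers = [[(0.00, 0.00, 0.5), (0.01, 0.00, 0.4), (0.01, 0.01, 0.6)],
              [(0.00, 0.00, 0.3), (0.02, 0.00, 0.3), (0.02, 0.02, 0.3)]]
    print(beam_delivery_time(layers))
    ```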

  17. Testing Reproducibility in Earth Sciences

    NASA Astrophysics Data System (ADS)

    Church, M. A.; Dudill, A. R.; Frey, P.; Venditti, J. G.

    2017-12-01

    Reproducibility represents how closely the results of independent tests agree when undertaken using the same materials but different conditions of measurement, such as operator, equipment or laboratory. The concept of reproducibility is fundamental to the scientific method as it prevents the persistence of incorrect or biased results. Yet currently the production of scientific knowledge emphasizes rapid publication of previously unreported findings, a culture that has emerged from pressures related to hiring, publication criteria and funding requirements. Awareness and critique of the disconnect between how scientific research should be undertaken, and how it actually is conducted, has been prominent in biomedicine for over a decade, with the fields of economics and psychology more recently joining the conversation. The purpose of this presentation is to stimulate the conversation in earth sciences where, despite implicit evidence in widely accepted classifications, formal testing of reproducibility is rare. As a formal test of reproducibility, two sets of experiments were undertaken with the same experimental procedure, at the same scale, but in different laboratories. Using narrow, steep flumes and spherical glass beads, grain size sorting was examined by introducing fine sediment of varying size and quantity into a mobile coarse bed. The general setup was identical, including flume width and slope; however, there were some variations in the materials, construction and lab environment. Comparison of the results includes examination of the infiltration profiles, sediment mobility and transport characteristics. The physical phenomena were qualitatively reproduced but not quantitatively replicated. Reproduction of results encourages more robust research and reporting, and facilitates exploration of possible variations in data in various specific contexts. Following the lead of other fields, testing of reproducibility can be incentivized through changes to journal

  18. Musical training generalises across modalities and reveals efficient and adaptive mechanisms for reproducing temporal intervals.

    PubMed

    Aagten-Murphy, David; Cappagli, Giulia; Burr, David

    2014-03-01

    Expert musicians are able to time their actions accurately and consistently during a musical performance. We investigated how musical expertise influences the ability to reproduce auditory intervals and how this generalises across different techniques and sensory modalities. We first compared various reproduction strategies and interval lengths, to examine the effects in general and to optimise experimental conditions for testing the effect of music, and found that the effects were robust and consistent across different paradigms. Focussing on a 'ready-set-go' paradigm, subjects reproduced time intervals drawn from distributions varying in total length (176, 352 or 704 ms) or in the number of discrete intervals within the total length (3, 5, 11 or 21 discrete intervals). Overall, Musicians performed more veridically than Non-Musicians, and all subjects reproduced auditory-defined intervals more accurately than visually-defined intervals. However, Non-Musicians, particularly with visual stimuli, consistently exhibited a substantial and systematic regression towards the mean interval. When subjects judged intervals from distributions of longer total length they tended to regress more towards the mean, while the ability to discriminate between discrete intervals within the distribution had little influence on subject error. These results are consistent with a Bayesian model that minimizes reproduction errors by incorporating a central tendency prior weighted by the subject's own temporal precision relative to the current distribution of intervals. Finally, a strong correlation was observed between all durations of formal musical training and total reproduction errors in both modalities (accounting for 30% of the variance). Taken together, these results demonstrate that formal musical training improves temporal reproduction, and that this improvement transfers from audition to vision. They further demonstrate the flexibility of sensorimotor mechanisms in adapting to
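
    A minimal sketch of the central-tendency account described in the record above, in which the reproduced interval is a precision-weighted combination of the current sensory sample and the mean of the recently experienced distribution. The Gaussian likelihood and prior, the noise levels, and the interval range are illustrative assumptions, not fitted values from the study.

```python
# Central-tendency (Bayesian) sketch: a noisier observer regresses reproduced
# intervals more strongly towards the mean of the stimulus distribution.
import numpy as np

rng = np.random.default_rng(0)

def reproduce(true_interval_ms, prior_mean_ms, sensory_sd_ms, prior_sd_ms):
    """Bayes-optimal estimate with a Gaussian likelihood and Gaussian prior."""
    sample = true_interval_ms + rng.normal(0.0, sensory_sd_ms)
    w_prior = sensory_sd_ms**2 / (sensory_sd_ms**2 + prior_sd_ms**2)
    return w_prior * prior_mean_ms + (1.0 - w_prior) * sample

intervals = np.linspace(500, 1204, 5)          # a hypothetical stimulus range
prior_mean, prior_sd = intervals.mean(), intervals.std()

# A precise observer (e.g., auditory intervals, musician) regresses little;
# a noisy observer (visual intervals, non-musician) regresses much more.
for sd in (30.0, 120.0):
    reps = [np.mean([reproduce(t, prior_mean, sd, prior_sd) for _ in range(500)])
            for t in intervals]
    print(f"sensory sd {sd:5.0f} ms ->", np.round(reps, 1))
```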

  19. Undefined cellulase formulations hinder scientific reproducibility

    DOE PAGES

    Himmel, Michael E.; Abbas, Charles A.; Baker, John O.; ...

    2017-11-28

    In the shadow of a burgeoning biomass-to-fuels industry, biological conversion of lignocellulose to fermentable sugars in a cost-effective manner is key to the success of second-generation and advanced biofuel production. For the effective comparison of one cellulase preparation to another, cellulase assays are typically carried out with one or more engineered cellulase formulations or natural exoproteomes of known performance serving as positive controls. When these formulations have unknown composition, as is the case with several widely used commercial products, it becomes impossible to compare or reproduce work done today to work done in the future, where, for example, such preparations may not be available. Therefore, being a critical tenet of science publishing, experimental reproducibility is endangered by the continued use of these undisclosed products. We propose the introduction of standard procedures and materials to produce specific and reproducible cellulase formulations. These formulations are to serve as yardsticks to measure improvements and performance of new cellulase formulations.

  20. Undefined cellulase formulations hinder scientific reproducibility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Himmel, Michael E.; Abbas, Charles A.; Baker, John O.

    In the shadow of a burgeoning biomass-to-fuels industry, biological conversion of lignocellulose to fermentable sugars in a cost-effective manner is key to the success of second-generation and advanced biofuel production. For the effective comparison of one cellulase preparation to another, cellulase assays are typically carried out with one or more engineered cellulase formulations or natural exoproteomes of known performance serving as positive controls. When these formulations have unknown composition, as is the case with several widely used commercial products, it becomes impossible to compare or reproduce work done today to work done in the future, where, for example, such preparations may not be available. Therefore, being a critical tenet of science publishing, experimental reproducibility is endangered by the continued use of these undisclosed products. We propose the introduction of standard procedures and materials to produce specific and reproducible cellulase formulations. These formulations are to serve as yardsticks to measure improvements and performance of new cellulase formulations.

  1. Reproducibility of preclinical animal research improves with heterogeneity of study samples

    PubMed Central

    Vogt, Lucile; Sena, Emily S.; Würbel, Hanno

    2018-01-01

    Single-laboratory studies conducted under highly standardized conditions are the gold standard in preclinical animal research. Using simulations based on 440 preclinical studies across 13 different interventions in animal models of stroke, myocardial infarction, and breast cancer, we compared the accuracy of effect size estimates between single-laboratory and multi-laboratory study designs. Single-laboratory studies generally failed to predict effect size accurately, and larger sample sizes rendered effect size estimates even less accurate. By contrast, multi-laboratory designs including as few as 2 to 4 laboratories increased coverage probability by up to 42 percentage points without a need for larger sample sizes. These findings demonstrate that within-study standardization is a major cause of poor reproducibility. More representative study samples are required to improve the external validity and reproducibility of preclinical animal research and to prevent wasting animals and resources for inconclusive research. PMID:29470495
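
    A toy simulation of the single- versus multi-laboratory comparison described in the record above, assuming the true treatment effect varies between laboratories. The parameter values, the simple two-arm design, and the confidence-interval construction are illustrative assumptions, not the authors' simulation of the 440 preclinical studies.

```python
# Coverage of a 95% CI for the treatment effect under single-lab and
# multi-lab designs when the true effect differs between laboratories.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
TRUE_EFFECT, BETWEEN_LAB_SD, WITHIN_SD, N_PER_ARM = 0.5, 0.4, 1.0, 50

def coverage(n_labs, n_sim=4000):
    """Fraction of simulated studies whose 95% CI contains the true effect."""
    hits = 0
    for _ in range(n_sim):
        lab_means = []
        for _ in range(n_labs):
            lab_effect = rng.normal(TRUE_EFFECT, BETWEEN_LAB_SD)  # lab-specific truth
            ctrl = rng.normal(0.0, WITHIN_SD, N_PER_ARM)
            trt = rng.normal(lab_effect, WITHIN_SD, N_PER_ARM)
            lab_means.append(trt.mean() - ctrl.mean())
        est = np.mean(lab_means)
        if n_labs == 1:
            # Single-lab CI: only within-lab sampling error is visible.
            se = np.sqrt(2 * WITHIN_SD**2 / N_PER_ARM)
            crit = 1.96
        else:
            # Multi-lab CI: between-lab variation enters through the lab means.
            se = np.std(lab_means, ddof=1) / np.sqrt(n_labs)
            crit = stats.t.ppf(0.975, n_labs - 1)
        hits += abs(est - TRUE_EFFECT) <= crit * se
    return hits / n_sim

for labs in (1, 2, 4):
    print(f"{labs} lab(s): coverage of the 95% CI ~ {coverage(labs):.2f}")
```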

  2. Assessing the reproducibility of discriminant function analyses

    PubMed Central

    Andrew, Rose L.; Albert, Arianne Y.K.; Renaut, Sebastien; Rennison, Diana J.; Bock, Dan G.

    2015-01-01

    Data are the foundation of empirical research, yet all too often the datasets underlying published papers are unavailable, incorrect, or poorly curated. This is a serious issue, because future researchers are then unable to validate published results or reuse data to explore new ideas and hypotheses. Even if data files are securely stored and accessible, they must also be accompanied by accurate labels and identifiers. To assess how often problems with metadata or data curation affect the reproducibility of published results, we attempted to reproduce Discriminant Function Analyses (DFAs) from the field of organismal biology. DFA is a commonly used statistical analysis that has changed little since its inception almost eight decades ago, and therefore provides an opportunity to test reproducibility among datasets of varying ages. Out of 100 papers we initially surveyed, fourteen were excluded because they did not present the common types of quantitative result from their DFA or gave insufficient details of their DFA. Of the remaining 86 datasets, there were 15 cases for which we were unable to confidently relate the dataset we received to the one used in the published analysis. The reasons included incomprehensible or absent variable labels, the DFA being performed on an unspecified subset of the data, and the dataset we received being incomplete. We focused on reproducing three common summary statistics from DFAs: the percent variance explained, the percentage correctly assigned and the largest discriminant function coefficient. The reproducibility of the first two was fairly high (20 of 26, and 44 of 60 datasets, respectively), whereas our success rate with the discriminant function coefficients was lower (15 of 26 datasets). When considering all three summary statistics, we were able to completely reproduce 46 (65%) of 71 datasets. While our results show that a majority of studies are reproducible, they highlight the fact that many studies still are not the
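
    A minimal sketch of recomputing the three DFA summary statistics checked in the record above (percent variance explained, percentage correctly assigned, largest discriminant function coefficient), using scikit-learn on a stand-in dataset (iris) rather than any of the surveyed papers' data; the standardization of coefficients is one common convention, not necessarily the one each original paper used.

```python
# Recompute DFA summary statistics with scikit-learn's
# LinearDiscriminantAnalysis on a stand-in dataset.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)

pct_var_df1 = 100 * lda.explained_variance_ratio_[0]     # % variance on DF1
pct_correct = 100 * (lda.predict(X) == y).mean()          # % correctly assigned
# Standardize DF1 coefficients so they are comparable across variables.
std_coef = lda.scalings_[:, 0] * X.std(axis=0, ddof=1)
largest_coef = std_coef[np.argmax(np.abs(std_coef))]

print(f"% variance explained by DF1: {pct_var_df1:.1f}")
print(f"% correctly assigned:        {pct_correct:.1f}")
print(f"largest DF1 coefficient:     {largest_coef:.2f}")
```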

  3. Calibrated simulations of Z opacity experiments that reproduce the experimentally measured plasma conditions

    DOE PAGES

    Nagayama, T.; Bailey, J. E.; Loisel, G.; ...

    2016-02-05

    Recently, frequency-resolved iron opacity measurements at electron temperatures of 170–200 eV and electron densities of (0.7–4.0) × 10²² cm⁻³ revealed a 30–400% disagreement with the calculated opacities [J. E. Bailey et al., Nature (London) 517, 56 (2015)]. The discrepancies have a high impact on astrophysics, atomic physics, and high-energy density physics, and it is important to verify our understanding of the experimental platform with simulations. Reliable simulations are challenging because the temporal and spatial evolution of the source radiation and of the sample plasma are both complex and incompletely diagnosed. In this article, we describe simulations that reproduce the measured temperature and density in recent iron opacity experiments performed at the Sandia National Laboratories Z facility. The time-dependent spectral irradiance at the sample is estimated using the measured time- and space-dependent source radiation distribution, in situ source-to-sample distance measurements, and a three-dimensional (3D) view-factor code. The inferred spectral irradiance is used to drive 1D sample radiation hydrodynamics simulations. The images recorded by slit-imaged space-resolved spectrometers are modeled by solving radiation transport of the source radiation through the sample. We find that the same drive radiation time history successfully reproduces the measured plasma conditions for eight different opacity experiments. These results provide a quantitative physical explanation for the observed dependence of both temperature and density on the sample configuration. Simulated spectral images for the experiments without the FeMg sample show quantitative agreement with the measured spectral images. The agreement in spectral profile, spatial profile, and brightness provides further confidence in our understanding of the backlight-radiation time history and image formation. Furthermore, these simulations bridge the static-uniform picture of the

  4. Opening Reproducible Research

    NASA Astrophysics Data System (ADS)

    Nüst, Daniel; Konkol, Markus; Pebesma, Edzer; Kray, Christian; Klötgen, Stephanie; Schutzeichel, Marc; Lorenz, Jörg; Przibytzin, Holger; Kussmann, Dirk

    2016-04-01

    Open access is not only a form of publishing such that research papers become available to the general public free of charge, it also refers to a trend in science whereby the act of doing research becomes more open and transparent. When science transforms to open access we not only mean access to papers, research data being collected, or data being generated, but also access to the data used and the procedures carried out in the research paper. Increasingly, scientific results are generated by numerical manipulation of data that were already collected, and may involve simulation experiments that are completely carried out computationally. Reproducibility of research findings, the ability to repeat experimental procedures and confirm previously found results, is at the heart of the scientific method (Pebesma, Nüst and Bivand, 2012). As opposed to the collection of experimental data in labs or nature, computational experiments lend themselves very well to reproduction. Some of the reasons why scientists do not publish data and computational procedures that allow reproduction will be hard to change, e.g. privacy concerns in the data, fear of embarrassment or of losing a competitive advantage. Other reasons, however, involve technical aspects, and include the lack of standard procedures to publish such information and the lack of benefits after publishing them. We aim to resolve these two technical aspects. We propose a system that supports the evolution of scientific publications from static papers into dynamic, executable research documents. The DFG-funded experimental project Opening Reproducible Research (ORR) aims for the main aspects of open access, by improving the exchange of, by facilitating productive access to, and by simplifying reuse of research results that are published over the Internet. Central to the project is a new form for creating and providing research results, the executable research compendium (ERC), which not only enables third parties to

  5. Urinary incontinence self-report questions: reproducibility and agreement with bladder diary.

    PubMed

    Bradley, Catherine S; Brown, Jeanette S; Van Den Eeden, Stephen K; Schembri, Michael; Ragins, Arona; Thom, David H

    2011-12-01

    This study aims to measure self-report urinary incontinence questions' reproducibility and agreement with bladder diary. Data were analyzed from the Reproductive Risk of Incontinence Study at Kaiser. Participating women reporting at least weekly incontinence completed self-report incontinence questions and a 7-day bladder diary. Self-report question reproducibility was assessed and agreement between self-reported and diary-recorded voiding and incontinence frequency was measured. Test characteristics and area under the curve were calculated for self-reported incontinence types using diary as the gold standard. Five hundred ninety-one women were included and 425 completed a diary. The self-report questions had moderate reproducibility and self-reported and diary-recorded incontinence and voiding frequencies had moderate to good agreement. Self-reported incontinence types identified stress and urgency incontinence more accurately than mixed incontinence. Self-report incontinence questions have moderate reproducibility and agreement with diary, and considering their minimal burden, are acceptable research tools in epidemiologic studies.
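
    A small sketch of the kind of agreement and test-characteristic calculation this study relies on: sensitivity, specificity, and Cohen's kappa for a self-reported incontinence type against the bladder diary treated as the gold standard. The counts and the helper name `test_characteristics` are made up for illustration, not data from the study.

```python
# Agreement statistics from a 2x2 table of self-report vs. diary results.
def test_characteristics(tp, fp, fn, tn):
    """Return sensitivity, specificity and Cohen's kappa for a 2x2 table."""
    n = tp + fp + fn + tn
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    p_obs = (tp + tn) / n
    # Chance agreement from the marginal totals of the two raters.
    p_exp = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    kappa = (p_obs - p_exp) / (1 - p_exp)
    return sens, spec, kappa

sens, spec, kappa = test_characteristics(tp=120, fp=30, fn=25, tn=250)
print(f"sensitivity {sens:.2f}, specificity {spec:.2f}, kappa {kappa:.2f}")
```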

  6. Achieving perceptually-accurate aural telepresence

    NASA Astrophysics Data System (ADS)

    Henderson, Paul D.

    Immersive multimedia requires not only realistic visual imagery but also a perceptually-accurate aural experience. A sound field may be presented simultaneously to a listener via a loudspeaker rendering system using the direct sound from acoustic sources as well as a simulation or "auralization" of room acoustics. Beginning with classical Wave-Field Synthesis (WFS), improvements are made to correct for asymmetries in loudspeaker array geometry. Presented is a new Spatially-Equalized WFS (SE-WFS) technique to maintain the energy-time balance of a simulated room by equalizing the reproduced spectrum at the listener for a distribution of possible source angles. Each reproduced source or reflection is filtered according to its incidence angle to the listener. An SE-WFS loudspeaker array of arbitrary geometry reproduces the sound field of a room with correct spectral and temporal balance, compared with classically-processed WFS systems. Localization accuracy of human listeners in SE-WFS sound fields is quantified by psychoacoustical testing. At a loudspeaker spacing of 0.17 m (equivalent to an aliasing cutoff frequency of 1 kHz), SE-WFS exhibits a localization blur of 3 degrees, nearly equal to real point sources. Increasing the loudspeaker spacing to 0.68 m (for a cutoff frequency of 170 Hz) results in a blur of less than 5 degrees. In contrast, stereophonic reproduction is less accurate with a blur of 7 degrees. The ventriloquist effect is psychometrically investigated to determine the effect of an intentional directional incongruence between audio and video stimuli. Subjects were presented with prerecorded full-spectrum speech and motion video of a talker's head as well as broadband noise bursts with a static image. The video image was displaced from the audio stimulus in azimuth by varying amounts, and the perceived auditory location measured. A strong bias was detectable for small angular discrepancies between audio and video stimuli for separations of less than 8

  7. Folding molecular dynamics simulations accurately predict the effect of mutations on the stability and structure of a vammin-derived peptide.

    PubMed

    Koukos, Panagiotis I; Glykos, Nicholas M

    2014-08-28

    Folding molecular dynamics simulations amounting to a grand total of 4 μs of simulation time were performed on two peptides (with native and mutated sequences) derived from loop 3 of the vammin protein and the results compared with the experimentally known peptide stabilities and structures. The simulations faithfully and accurately reproduce the major experimental findings and show that (a) the native peptide is mostly disordered in solution, (b) the mutant peptide has a well-defined and stable structure, and (c) the structure of the mutant is an irregular β-hairpin with a non-glycine β-bulge, in excellent agreement with the peptide's known NMR structure. Additionally, the simulations also predict the presence of a very small β-hairpin-like population for the native peptide but surprisingly indicate that this population is structurally more similar to the structure of the native peptide as observed in the vammin protein than to the NMR structure of the isolated mutant peptide. We conclude that, at least for the given system, force field, and simulation protocol, folding molecular dynamics simulations appear to be successful in reproducing the experimentally accessible physical reality to a satisfactory level of detail and accuracy.

  8. Effect of Population Heterogenization on the Reproducibility of Mouse Behavior: A Multi-Laboratory Study

    PubMed Central

    Richter, S. Helene; Garner, Joseph P.; Zipser, Benjamin; Lewejohann, Lars; Sachser, Norbert; Touma, Chadi; Schindler, Britta; Chourbaji, Sabine; Brandwein, Christiane; Gass, Peter; van Stipdonk, Niek; van der Harst, Johanneke; Spruijt, Berry; Võikar, Vootele; Wolfer, David P.; Würbel, Hanno

    2011-01-01

    In animal experiments, animals, husbandry and test procedures are traditionally standardized to maximize test sensitivity and minimize animal use, assuming that this will also guarantee reproducibility. However, by reducing within-experiment variation, standardization may limit inference to the specific experimental conditions. Indeed, we have recently shown in mice that standardization may generate spurious results in behavioral tests, accounting for poor reproducibility, and that this can be avoided by population heterogenization through systematic variation of experimental conditions. Here, we examined whether a simple form of heterogenization effectively improves reproducibility of test results in a multi-laboratory situation. Each of six laboratories independently ordered 64 female mice of two inbred strains (C57BL/6NCrl, DBA/2NCrl) and examined them for strain differences in five commonly used behavioral tests under two different experimental designs. In the standardized design, experimental conditions were standardized as much as possible in each laboratory, while they were systematically varied with respect to the animals' test age and cage enrichment in the heterogenized design. Although heterogenization tended to improve reproducibility by increasing within-experiment variation relative to between-experiment variation, the effect was too weak to account for the large variation between laboratories. However, our findings confirm the potential of systematic heterogenization for improving reproducibility of animal experiments and highlight the need for effective and practicable heterogenization strategies. PMID:21305027

  9. An accurate and efficient experimental approach for characterization of the complex oral microbiota.

    PubMed

    Zheng, Wei; Tsompana, Maria; Ruscitto, Angela; Sharma, Ashu; Genco, Robert; Sun, Yijun; Buck, Michael J

    2015-10-05

    Currently, taxonomic interrogation of microbiota is based on amplification of 16S rRNA gene sequences in clinical and scientific settings. Accurate evaluation of the microbiota depends heavily on the primers used, and genus/species resolution bias can arise with amplification of non-representative genomic regions. The latest Illumina MiSeq sequencing chemistry has extended the read length to 300 bp, enabling deep profiling of a large number of samples in a single paired-end reaction at a fraction of the cost. An increasingly large number of researchers have adopted this technology for various microbiome studies targeting the 16S rRNA V3-V4 hypervariable region. To expand the applicability of this powerful platform for further descriptive and functional microbiome studies, we standardized and tested an efficient, reliable, and straightforward workflow for the amplification, library construction, and sequencing of the 16S V1-V3 hypervariable region using the new 2 × 300 MiSeq platform. Our analysis involved 11 subgingival plaque samples from diabetic and non-diabetic human subjects suffering from periodontitis. The efficiency and reliability of our experimental protocol were compared to 16S V3-V4 sequencing data from the same samples. Comparisons were based on measures of observed taxonomic richness and species evenness, along with Procrustes analyses using beta (β)-diversity distance metrics. As an experimental control, we also analyzed a total of eight technical replicates for the V1-V3 and V3-V4 regions from a synthetic community with known bacterial species operon counts. We show that our experimental protocol accurately measures true bacterial community composition. Procrustes analyses based on unweighted UniFrac β-diversity metrics depicted significant correlation between oral bacterial composition for the V1-V3 and V3-V4 regions. However, measures of phylotype richness were higher for the V1-V3 region, suggesting that V1-V3 offers a deeper assessment of
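
    A minimal sketch of the Procrustes comparison used in the record above: given two ordinations of the same samples (for example, PCoA coordinates derived from the V1-V3 and V3-V4 β-diversity matrices), measure how well one configuration maps onto the other. The toy coordinates below stand in for real UniFrac-based ordinations; this is not the authors' pipeline.

```python
# Procrustes comparison of two ordinations of the same 11 samples.
import numpy as np
from scipy.spatial import procrustes

rng = np.random.default_rng(3)
samples = 11

ordination_v1v3 = rng.normal(size=(samples, 2))
# Simulate a second ordination that is a rotated, noisy copy of the first.
theta = np.pi / 6
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
ordination_v3v4 = ordination_v1v3 @ rotation + rng.normal(scale=0.1, size=(samples, 2))

m1, m2, disparity = procrustes(ordination_v1v3, ordination_v3v4)
print(f"Procrustes M^2 (lower = better agreement): {disparity:.3f}")
```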

  10. Fast and Accurate Metadata Authoring Using Ontology-Based Recommendations.

    PubMed

    Martínez-Romero, Marcos; O'Connor, Martin J; Shankar, Ravi D; Panahiazar, Maryam; Willrett, Debra; Egyedi, Attila L; Gevaert, Olivier; Graybeal, John; Musen, Mark A

    2017-01-01

    In biomedicine, high-quality metadata are crucial for finding experimental datasets, for understanding how experiments were performed, and for reproducing those experiments. Despite the recent focus on metadata, the quality of metadata available in public repositories continues to be extremely poor. A key difficulty is that the typical metadata acquisition process is time-consuming and error prone, with weak or nonexistent support for linking metadata to ontologies. There is a pressing need for methods and tools to speed up the metadata acquisition process and to increase the quality of metadata that are entered. In this paper, we describe a methodology and set of associated tools that we developed to address this challenge. A core component of this approach is a value recommendation framework that uses analysis of previously entered metadata and ontology-based metadata specifications to help users rapidly and accurately enter their metadata. We performed an initial evaluation of this approach using metadata from a public metadata repository.

  11. Fast and Accurate Metadata Authoring Using Ontology-Based Recommendations

    PubMed Central

    Martínez-Romero, Marcos; O’Connor, Martin J.; Shankar, Ravi D.; Panahiazar, Maryam; Willrett, Debra; Egyedi, Attila L.; Gevaert, Olivier; Graybeal, John; Musen, Mark A.

    2017-01-01

    In biomedicine, high-quality metadata are crucial for finding experimental datasets, for understanding how experiments were performed, and for reproducing those experiments. Despite the recent focus on metadata, the quality of metadata available in public repositories continues to be extremely poor. A key difficulty is that the typical metadata acquisition process is time-consuming and error prone, with weak or nonexistent support for linking metadata to ontologies. There is a pressing need for methods and tools to speed up the metadata acquisition process and to increase the quality of metadata that are entered. In this paper, we describe a methodology and set of associated tools that we developed to address this challenge. A core component of this approach is a value recommendation framework that uses analysis of previously entered metadata and ontology-based metadata specifications to help users rapidly and accurately enter their metadata. We performed an initial evaluation of this approach using metadata from a public metadata repository. PMID:29854196
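
    A toy sketch of the core idea behind a value-recommendation framework such as the one described in the two records above: suggest likely values for a metadata field by ranking values seen in previously entered records, optionally conditioned on a field that has already been filled in. The field names, the record list, and the `recommend` helper are illustrative assumptions, not the actual CEDAR implementation or API.

```python
# Frequency-based value recommendation from previously entered metadata.
from collections import Counter

previous_records = [
    {"organism": "Homo sapiens", "tissue": "liver"},
    {"organism": "Homo sapiens", "tissue": "brain"},
    {"organism": "Mus musculus", "tissue": "liver"},
    {"organism": "Homo sapiens", "tissue": "liver"},
]

def recommend(field, context=None, top_n=3):
    """Rank candidate values for `field`, filtered by a (field, value) context."""
    counts = Counter()
    for rec in previous_records:
        if context and rec.get(context[0]) != context[1]:
            continue
        if field in rec:
            counts[rec[field]] += 1
    return counts.most_common(top_n)

print(recommend("tissue"))                                      # unconditional
print(recommend("tissue", context=("organism", "Homo sapiens")))  # conditioned
```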

  12. Experimental Influences in the Accurate Measurement of Cartilage Thickness in MRI.

    PubMed

    Wang, Nian; Badar, Farid; Xia, Yang

    2018-01-01

    Objective: To study the experimental influences on the measurement of cartilage thickness by magnetic resonance imaging (MRI). Design: The complete thicknesses of healthy and trypsin-degraded cartilage were measured by high-resolution MRI under different conditions, using two intensity-based imaging sequences (ultra-short echo [UTE] and multislice-multiecho [MSME]) and three quantitative relaxation imaging sequences (T1, T2, and T1ρ). Other variables included different orientations in the magnet, two soaking solutions (saline and phosphate buffered saline [PBS]), and external loading. Results: With cartilage soaked in saline, the UTE and T1 methods yielded complete and consistent measurements of cartilage thickness, while the thickness measurements by the T2, T1ρ, and MSME methods were orientation dependent. The effect of external loading on cartilage thickness was also sequence and orientation dependent. All variations in cartilage thickness in MRI could be eliminated by using a 100 mM PBS soaking solution or by imaging with the UTE sequence. Conclusions: The appearance of articular cartilage and the measurement accuracy of cartilage thickness in MRI can be influenced by a number of experimental factors in ex vivo MRI, from the use of various pulse sequences and soaking solutions to the health of the tissue. T2-based imaging sequences, both the proton-intensity sequence and the quantitative relaxation sequence, similarly produced the largest variations. With adequate resolution, the accurate measurement of whole cartilage tissue in clinical MRI could be utilized to detect differences between healthy and osteoarthritic cartilage after compression.

  13. Reproducibility of techniques using Archimedes' principle in measuring cancellous bone volume.

    PubMed

    Zou, L; Bloebaum, R D; Bachus, K N

    1997-01-01

    Researchers have been interested in developing techniques to accurately and reproducibly measure the volume fraction of cancellous bone. Historically, bone researchers have used Archimedes' principle with water to measure the volume fraction of cancellous bone. Preliminary results in our lab suggested that the calibrated water technique did not provide reproducible results. Because of this difficulty, it was decided to compare the conventional water method to a water-with-surfactant method and a helium method using a micropycnometer. The water/surfactant and the helium methods were attempts to improve the fluid penetration into the small voids present in the cancellous bone structure. In order to compare the reproducibility of the new methods with the conventional water method, 16 cancellous bone specimens were obtained from femoral condyles of human and greyhound dog femora. The volume fraction measurements on each specimen were repeated three times with all three techniques. The results showed that the helium displacement method was more than an order of magnitude more reproducible than the two other water methods (p < 0.05). Statistical analysis also showed that the conventional water method produced the lowest reproducibility (p < 0.05). The data from this study indicate that the helium displacement technique is a very useful, rapid and reproducible tool for quantitatively characterizing anisotropic porous tissue structures such as cancellous bone.
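
    A sketch of the two displacement measurements compared in the record above, with the volume fraction computed against an envelope volume assumed to come from external specimen dimensions. The formulas are the standard Archimedes and gas-pycnometry working equations; all numbers, and the way the envelope volume is obtained, are assumptions for illustration and not the authors' protocol.

```python
# Material-volume measurements for a cancellous bone specimen.
RHO_WATER = 0.9982  # g/cm^3 at ~20 degC

def material_volume_archimedes(mass_in_air_g, apparent_mass_submerged_g,
                               rho_fluid=RHO_WATER):
    """Archimedes: if the fluid fully penetrates the pores, the buoyant mass
    loss divided by the fluid density is the volume of bone material only.
    Trapped air in unfilled pores inflates the buoyancy and hence the volume."""
    return (mass_in_air_g - apparent_mass_submerged_g) / rho_fluid

def material_volume_pycnometer(cell_volume_cm3, expansion_volume_cm3,
                               p1_kpa, p2_kpa):
    """Gas pycnometry working equation: V_s = V_cell - V_exp / (p1/p2 - 1)."""
    return cell_volume_cm3 - expansion_volume_cm3 / (p1_kpa / p2_kpa - 1.0)

envelope_volume = 1.15                                   # cm^3, from calipers
v_water = material_volume_archimedes(2.10, 1.70)         # incomplete penetration
v_helium = material_volume_pycnometer(10.0, 8.0, 120.0, 65.6)

print(f"volume fraction (water):  {v_water / envelope_volume:.2f}")
print(f"volume fraction (helium): {v_helium / envelope_volume:.2f}")
```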

  14. Composting in small laboratory pilots: Performance and reproducibility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lashermes, G.; Barriuso, E.; Le Villio-Poitrenaud, M.

    2012-02-15

    Highlights: • We design an innovative small-scale composting device including six 4-l reactors. • We investigate the performance and reproducibility of composting on a small scale. • Thermophilic conditions are established by self-heating in all replicates. • Biochemical transformations, organic matter losses and stabilisation are realistic. • The organic matter evolution exhibits good reproducibility for all six replicates. Abstract: Small-scale reactors (<10 l) have been employed in composting research, but few attempts have assessed the performance of composting considering the transformations of organic matter. Moreover, composting at small scales is often performed by imposing a fixed temperature, thus creating artificial conditions, and the reproducibility of composting has rarely been reported. The objectives of this study are to design an innovative small-scale composting device safeguarding self-heating to drive the composting process and to assess the performance and reproducibility of composting in small-scale pilots. The experimental setup included six 4-l reactors used for composting a mixture of sewage sludge and green wastes. The performance of the process was assessed by monitoring the temperature, O₂ consumption and CO₂ emissions, and characterising the biochemical evolution of organic matter. A good reproducibility was found for the six replicates with coefficients of variation for all parameters generally lower than 19%. An intense self-heating ensured the existence of a spontaneous thermophilic phase in all reactors. The average loss of total organic matter (TOM) was 46% of the initial content. Compared to the initial mixture, the hot water soluble fraction decreased by 62%, the hemicellulose-like fraction by 68%, the cellulose-like fraction by 50% and the lignin-like fractions by 12% in the

  15. The origin of replicators and reproducers

    PubMed Central

    Szathmáry, Eörs

    2006-01-01

    Replicators are fundamental to the origin of life and evolvability. Their survival depends on the accuracy of replication and the efficiency of growth relative to spontaneous decay. Infrabiological systems are built of two coupled autocatalytic systems, in contrast to minimal living systems that must comprise at least a metabolic subsystem, a hereditary subsystem and a boundary, serving respective functions. Some scenarios prefer to unite all these functions into one primordial system, as illustrated in the lipid world scenario, which is considered as a didactic example in detail. Experimentally produced chemical replicators grow parabolically owing to product inhibition. A selection consequence is survival of everybody. The chromatographized replicator model predicts that such replicators spreading on surfaces can be selected for higher replication rate because double strands are washed away slower than single strands from the surface. Analysis of real ribozymes suggests that the error threshold of replication is less severe by about one order of magnitude than thought previously. Surface-bound dynamics is predicted to play a crucial role also for exponential replicators: unlinked genes belonging to the same genome do not displace each other by competition, and efficient and accurate replicases can spread. The most efficient form of such useful population structure is encapsulation by reproducing vesicles. The stochastic corrector model shows how such a bag of genes can survive, and what the role of chromosome formation and intragenic recombination could be. Prebiotic and early evolution cannot be understood without the models of dynamics. PMID:17008217

  16. Refined Dummy Atom Model of Mg(2+) by Simple Parameter Screening Strategy with Revised Experimental Solvation Free Energy.

    PubMed

    Jiang, Yang; Zhang, Haiyang; Feng, Wei; Tan, Tianwei

    2015-12-28

    Metal ions play an important role in the catalysis of metalloenzymes. To investigate metalloenzymes via molecular modeling, a set of accurate force field parameters for metal ions is highly imperative. To extend its application range and improve the performance, the dummy atom model of metal ions was refined through a simple parameter screening strategy using the Mg(2+) ion as an example. Using the AMBER ff03 force field with the TIP3P model, the refined model accurately reproduced the experimental geometric and thermodynamic properties of Mg(2+). Compared with point charge models and previous dummy atom models, the refined dummy atom model yields an enhanced performance for producing reliable ATP/GTP-Mg(2+)-protein conformations in three metalloenzyme systems with single or double metal centers. Similar to other unbounded models, the refined model failed to reproduce the Mg-Mg distance and favored a monodentate binding of carboxylate groups, and these drawbacks needed to be considered with care. The outperformance of the refined model is mainly attributed to the use of a revised (more accurate) experimental solvation free energy and a suitable free energy correction protocol. This work provides a parameter screening strategy that can be readily applied to refine the dummy atom models for metal ions.

  17. A Simple and Accurate Method for Measuring Enzyme Activity.

    ERIC Educational Resources Information Center

    Yip, Din-Yan

    1997-01-01

    Presents methods commonly used for investigating enzyme activity using catalase and presents a new method for measuring catalase activity that is more reliable and accurate. Provides results that are readily reproduced and quantified. Can also be used for investigations of enzyme properties such as the effects of temperature, pH, inhibitors,…

  18. Competitive Abilities in Experimental Microcosms Are Accurately Predicted by a Demographic Index for R*

    PubMed Central

    Murrell, Ebony G.; Juliano, Steven A.

    2012-01-01

    Resource competition theory predicts that R*, the equilibrium resource amount yielding zero growth of a consumer population, should predict species' competitive abilities for that resource. This concept has been supported for unicellular organisms, but has not been well-tested for metazoans, probably due to the difficulty of raising experimental populations to equilibrium and measuring population growth rates for species with long or complex life cycles. We developed an index (Rindex) of R* based on demography of one insect cohort, growing from egg to adult in a non-equilibrium setting, and tested whether Rindex yielded accurate predictions of competitive abilities using mosquitoes as a model system. We estimated finite rate of increase (λ′) from demographic data for cohorts of three mosquito species raised with different detritus amounts, and estimated each species' Rindex using nonlinear regressions of λ′ vs. initial detritus amount. All three species' Rindex differed significantly, and accurately predicted competitive hierarchy of the species determined in simultaneous pairwise competition experiments. Our Rindex could provide estimates and rigorous statistical comparisons of competitive ability for organisms for which typical chemostat methods and equilibrium population conditions are impractical. PMID:22970128
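
    A sketch of estimating an R*-like index in the spirit of the record above: fit the finite rate of increase (λ′) against the initial resource (detritus) amount with a saturating curve and solve for the amount at which λ′ = 1, i.e. zero population growth. The Monod-style functional form and the data points are assumptions for illustration, not the authors' regression or data.

```python
# Fit lambda' vs. resource amount, then solve for the zero-growth resource level.
import numpy as np
from scipy.optimize import brentq, curve_fit

detritus_g = np.array([0.1, 0.25, 0.5, 1.0, 2.0, 4.0])
lam = np.array([0.55, 0.80, 1.05, 1.30, 1.55, 1.70])      # hypothetical cohort data

def monod(x, lam_max, k):
    """Saturating response of population growth to resource amount."""
    return lam_max * x / (k + x)

(lam_max, k), _ = curve_fit(monod, detritus_g, lam, p0=(2.0, 1.0))

# R_index: resource amount where the fitted curve crosses lambda' = 1.
r_index = brentq(lambda x: monod(x, lam_max, k) - 1.0, 1e-6, 10.0)
print(f"fitted lambda_max={lam_max:.2f}, k={k:.2f} g, R_index={r_index:.2f} g")
```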

  19. Composting in small laboratory pilots: performance and reproducibility.

    PubMed

    Lashermes, G; Barriuso, E; Le Villio-Poitrenaud, M; Houot, S

    2012-02-01

    Small-scale reactors (<10 l) have been employed in composting research, but few attempts have assessed the performance of composting considering the transformations of organic matter. Moreover, composting at small scales is often performed by imposing a fixed temperature, thus creating artificial conditions, and the reproducibility of composting has rarely been reported. The objectives of this study are to design an innovative small-scale composting device safeguarding self-heating to drive the composting process and to assess the performance and reproducibility of composting in small-scale pilots. The experimental setup included six 4-l reactors used for composting a mixture of sewage sludge and green wastes. The performance of the process was assessed by monitoring the temperature, O(2) consumption and CO(2) emissions, and characterising the biochemical evolution of organic matter. A good reproducibility was found for the six replicates with coefficients of variation for all parameters generally lower than 19%. An intense self-heating ensured the existence of a spontaneous thermophilic phase in all reactors. The average loss of total organic matter (TOM) was 46% of the initial content. Compared to the initial mixture, the hot water soluble fraction decreased by 62%, the hemicellulose-like fraction by 68%, the cellulose-like fraction by 50% and the lignin-like fractions by 12% in the final compost. The TOM losses, compost stabilisation and evolution of the biochemical fractions were similar to those observed in large reactors or on-site experiments, except for the lignin degradation, which was less pronounced than in full-scale systems. The reproducibility of the process and the quality of the final compost make it possible to propose the use of this experimental device for research requiring a mass reduction of the initial composted waste mixtures. Copyright © 2011 Elsevier Ltd. All rights reserved.
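
    A small sketch of the reproducibility statistic quoted in the two composting records above: the coefficient of variation (CV) of a monitored parameter across the six replicate reactors. The replicate values below are invented for illustration.

```python
# Coefficient of variation across replicate reactors.
import numpy as np

def coefficient_of_variation(values):
    """CV in percent: sample standard deviation relative to the mean."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

total_om_loss_pct = [44.5, 47.0, 46.2, 45.1, 48.3, 45.9]   # one value per reactor
print(f"CV of total organic matter loss: "
      f"{coefficient_of_variation(total_om_loss_pct):.1f}%")
```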

  20. Reproducibility and quantitation of amplicon sequencing-based detection

    PubMed Central

    Zhou, Jizhong; Wu, Liyou; Deng, Ye; Zhi, Xiaoyang; Jiang, Yi-Huei; Tu, Qichao; Xie, Jianping; Van Nostrand, Joy D; He, Zhili; Yang, Yunfeng

    2011-01-01

    To determine the reproducibility and quantitation of the amplicon sequencing-based detection approach for analyzing microbial community structure, a total of 24 microbial communities from a long-term global change experimental site were examined. Genomic DNA obtained from each community was used to amplify 16S rRNA genes with two or three barcode tags as technical replicates in the presence of a small quantity (0.1% wt/wt) of genomic DNA from Shewanella oneidensis MR-1 as the control. The technical reproducibility of the amplicon sequencing-based detection approach is quite low, with an average operational taxonomic unit (OTU) overlap of 17.2%±2.3% between two technical replicates, and 8.2%±2.3% among three technical replicates, which is most likely due to problems associated with random sampling processes. Such variations in technical replicates could have substantial effects on estimating β-diversity but less on α-diversity. A high variation was also observed in the control across different samples (for example, 66.7-fold for the forward primer), suggesting that the amplicon sequencing-based detection approach could not be quantitative. In addition, various strategies were examined to improve the comparability of amplicon sequencing data, such as increasing biological replicates, and removing singleton sequences and less-representative OTUs across biological replicates. Finally, as expected, various statistical analyses with preprocessed experimental data revealed clear differences in the composition and structure of microbial communities between warming and non-warming, or between clipping and non-clipping. Taken together, these results suggest that amplicon sequencing-based detection is useful in analyzing microbial community structure even though it is not reproducible and quantitative. However, great caution should be taken in experimental design and data interpretation when the amplicon sequencing-based detection approach is used for quantitative
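
    A minimal sketch of the technical-replicate overlap statistic described in the record above: the percentage of OTUs detected in both replicates relative to the OTUs detected in either. The toy OTU tables are dictionaries of read counts invented for illustration; the paper's exact overlap definition may differ in detail.

```python
# OTU overlap between two technical replicates of the same sample.
def otu_overlap(rep_a, rep_b):
    """Shared OTUs as a percentage of the union of detected OTUs."""
    detected_a = {otu for otu, count in rep_a.items() if count > 0}
    detected_b = {otu for otu, count in rep_b.items() if count > 0}
    shared = detected_a & detected_b
    union = detected_a | detected_b
    return 100.0 * len(shared) / len(union) if union else 0.0

rep1 = {"OTU1": 120, "OTU2": 3, "OTU3": 0, "OTU4": 45}
rep2 = {"OTU1": 98,  "OTU2": 0, "OTU3": 7, "OTU4": 51}
print(f"OTU overlap between technical replicates: {otu_overlap(rep1, rep2):.1f}%")
```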

  1. Using Copula Distributions to Support More Accurate Imaging-Based Diagnostic Classifiers for Neuropsychiatric Disorders

    PubMed Central

    Bansal, Ravi; Hao, Xuejun; Liu, Jun; Peterson, Bradley S.

    2014-01-01

    Many investigators have tried to apply machine learning techniques to magnetic resonance images (MRIs) of the brain in order to diagnose neuropsychiatric disorders. Usually the number of brain imaging measures (such as measures of cortical thickness and measures of local surface morphology) derived from the MRIs (i.e., their dimensionality) has been large (e.g. >10) relative to the number of participants who provide the MRI data (<100). Sparse data in a high dimensional space increases the variability of the classification rules that machine learning algorithms generate, thereby limiting the validity, reproducibility, and generalizability of those classifiers. The accuracy and stability of the classifiers can improve significantly if the multivariate distributions of the imaging measures can be estimated accurately. To accurately estimate the multivariate distributions using sparse data, we propose to estimate first the univariate distributions of imaging data and then combine them using a Copula to generate more accurate estimates of their multivariate distributions. We then sample the estimated Copula distributions to generate dense sets of imaging measures and use those measures to train classifiers. We hypothesize that the dense sets of brain imaging measures will generate classifiers that are stable to variations in brain imaging measures, thereby improving the reproducibility, validity, and generalizability of diagnostic classification algorithms in imaging datasets from clinical populations. In our experiments, we used both computer-generated and real-world brain imaging datasets to assess the accuracy of multivariate Copula distributions in estimating the corresponding multivariate distributions of real-world imaging data. Our experiments showed that diagnostic classifiers generated using imaging measures sampled from the Copula were significantly more accurate and more reproducible than were the classifiers generated using either the real-world imaging
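
    A minimal sketch of the copula idea described in the record above: estimate the univariate marginals empirically, couple them with a Gaussian copula fitted to rank-based normal scores, and sample a dense synthetic set of "imaging measures" that preserves the dependence structure. The data are simulated and the Gaussian-copula choice is an assumption; this is not the authors' implementation.

```python
# Gaussian copula: fit on sparse data, then sample a dense synthetic set.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)

# Sparse "real" data: 40 subjects, 3 correlated measures with mixed marginals.
latent = rng.multivariate_normal([0, 0, 0],
                                 [[1.0, 0.6, 0.3],
                                  [0.6, 1.0, 0.5],
                                  [0.3, 0.5, 1.0]], size=40)
data = np.column_stack([np.exp(latent[:, 0]),        # skewed marginal
                        latent[:, 1] * 2 + 10,
                        latent[:, 2] ** 3])

# 1. Transform each column to normal scores via its empirical CDF (ranks).
n, d = data.shape
ranks = data.argsort(axis=0).argsort(axis=0) + 1
z = norm.ppf(ranks / (n + 1))

# 2. Fit the Gaussian copula: correlation matrix of the normal scores.
corr = np.corrcoef(z, rowvar=False)

# 3. Sample densely from the copula and map back through empirical quantiles.
m = 5000
u = norm.cdf(rng.multivariate_normal(np.zeros(d), corr, size=m))
dense = np.column_stack([np.quantile(data[:, j], u[:, j]) for j in range(d)])

print("original corr:\n", np.round(np.corrcoef(data, rowvar=False), 2))
print("synthetic corr:\n", np.round(np.corrcoef(dense, rowvar=False), 2))
```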

  2. Cycle accurate and cycle reproducible memory for an FPGA based hardware accelerator

    DOEpatents

    Asaad, Sameh W.; Kapur, Mohit

    2016-03-15

    A method, system and computer program product are disclosed for using a Field Programmable Gate Array (FPGA) to simulate operations of a device under test (DUT). The DUT includes a device memory having a first number of input ports, and the FPGA is associated with a target memory having a second number of input ports, the second number being less than the first number. In one embodiment, a given set of inputs is applied to the device memory at a frequency Fd and in a defined cycle of time, and the given set of inputs is applied to the target memory at a frequency Ft. Ft is greater than Fd and cycle accuracy is maintained between the device memory and the target memory. In an embodiment, a cycle accurate model of the DUT memory is created by separating the DUT memory interface protocol from the target memory storage array.

  3. Enhancing reproducibility: Failures from Reproducibility Initiatives underline core challenges.

    PubMed

    Mullane, Kevin; Williams, Michael

    2017-08-15

    Efforts to address reproducibility concerns in biomedical research include: initiatives to improve journal publication standards and peer review; increased attention to publishing methodological details that enable experiments to be reconstructed; guidelines on standards for study design, implementation, analysis and execution; meta-analyses of multiple studies within a field to synthesize a common conclusion; and the formation of consortia to adopt uniform protocols and internally reproduce data. Another approach to addressing reproducibility is Reproducibility Initiatives (RIs): well-intended, high-profile, systematically peer-vetted initiatives that are intended to replace the traditional process of scientific self-correction. Outcomes from the RIs reported to date have questioned the usefulness of this approach, particularly when the RI outcome differs from other independent self-correction studies that have reproduced the original finding. As a failed RI attempt is a single outcome distinct from the original study, it cannot provide any definitive conclusions, necessitating additional studies that the RI approach has neither the ability nor the intent of conducting, making it a questionable replacement for self-correction. A failed RI attempt also has the potential to damage the reputation of the author of the original finding. Reproduction is frequently confused with replication, an issue that is more than semantic, with the former denoting "similarity" and the latter an "exact copy" - an impossible outcome in research because of known and unknown technical, environmental and motivational differences between the original and reproduction studies. To date, the RI framework has negatively impacted efforts to improve reproducibility, confounding attempts to determine whether a research finding is real. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Accuracy, reproducibility, and uncertainty analysis of thyroid-probe-based activity measurements for determination of dose calibrator settings.

    PubMed

    Esquinas, Pedro L; Tanguay, Jesse; Gonzalez, Marjorie; Vuckovic, Milan; Rodríguez-Rodríguez, Cristina; Häfeli, Urs O; Celler, Anna

    2016-12-01

    In the nuclear medicine department, the activity of radiopharmaceuticals is measured using dose calibrators (DCs) prior to patient injection. The DC consists of an ionization chamber that measures current generated by ionizing radiation (emitted from the radiotracer). In order to obtain an activity reading, the current is converted into units of activity by applying an appropriate calibration factor (also referred to as the DC dial setting). Accurate determination of DC dial settings is crucial to ensure that patients receive the appropriate dose in diagnostic scans or radionuclide therapies. The goals of this study were (1) to describe a practical method to experimentally determine dose calibrator settings using a thyroid-probe (TP) and (2) to investigate the accuracy, reproducibility, and uncertainties of the method. As an illustration, the TP method was applied to determine 188Re dial settings for two dose calibrator models: Atomlab 100plus and Capintec CRC-55tR. Using the TP to determine dose calibrator settings involved three measurements. First, the energy-dependent efficiency of the TP was determined from energy spectra measurements of two calibration sources (152Eu and 22Na). Second, the gamma emissions from the investigated isotope (188Re) were measured using the TP and its activity was determined using γ-ray spectroscopy methods. Ambient background, scatter, and source-geometry corrections were applied during the efficiency and activity determination steps. Third, the TP-based 188Re activity was used to determine the dose calibrator settings following the calibration curve method [B. E. Zimmerman et al., J. Nucl. Med. 40, 1508-1516 (1999)]. The interobserver reproducibility of TP measurements was determined by the coefficient of variation (COV), and uncertainties associated with each step of the measuring process were estimated. The accuracy of activity measurements using the proposed method was evaluated by comparing the TP activity estimates of 99mTc
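
    A sketch of the activity-determination and dial-setting steps of the thyroid-probe (TP) method outlined in the record above. The efficiency, emission probability, count data, and dose calibrator readings are made-up values, and the linear interpolation over trial dial settings is a simplified stand-in for the calibration-curve method; this is not the authors' analysis code.

```python
# TP-based activity from a photopeak count rate, then a dial-setting lookup.
import numpy as np

def activity_bq(net_counts, live_time_s, efficiency, emission_probability,
                geometry_correction=1.0):
    """A = net count rate / (efficiency * emission probability * corrections)."""
    count_rate = net_counts / live_time_s
    return count_rate / (efficiency * emission_probability * geometry_correction)

# Hypothetical 188Re measurement via its 155 keV photopeak (P_gamma ~ 0.155).
a_bq = activity_bq(net_counts=3.0e5, live_time_s=300.0,
                   efficiency=1.0e-5, emission_probability=0.155)
print(f"TP-based activity: {a_bq / 1e6:.0f} MBq")

# Calibration-curve idea: take dose calibrator readings of the same source at
# several trial dial settings and interpolate to the setting whose reading
# matches the TP-based activity.
trial_settings = np.array([400.0, 450.0, 500.0, 550.0])
readings_mbq = np.array([720.0, 680.0, 640.0, 600.0])    # hypothetical readings
dial = np.interp(a_bq / 1e6, readings_mbq[::-1], trial_settings[::-1])
print(f"interpolated dial setting: {dial:.0f}")
```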

  5. Reproducibility in science.

    PubMed

    Yaffe, Michael B

    2015-04-07

    The issue of reproducibility and reliability in science has come to the forefront in light of several high-profile studies that could not be reproduced. Whereas some errors in reliability can be attributed to the application of new techniques that have unappreciated caveats, some problems with reproducibility lie in the climate of intense pressure for funding and to publish faced by many researchers. Copyright © 2015, American Association for the Advancement of Science.

  6. Reliable Mechanochemistry: Protocols for Reproducible Outcomes of Neat and Liquid Assisted Ball-mill Grinding Experiments.

    PubMed

    Belenguer, Ana M; Lampronti, Giulio I; Sanders, Jeremy K M

    2018-01-23

    The equilibrium outcomes of ball mill grinding can dramatically change as a function of even tiny variations in the experimental conditions such as the presence of very small amounts of added solvent. To reproducibly and accurately capture this sensitivity, the experimentalist needs to carefully consider every single factor that can affect the ball mill grinding reaction under investigation, from ensuring the grinding jars are clean and dry before use, to accurately adding the stoichiometry of the starting materials, to validating that the delivery of solvent volume is accurate, to ensuring that the interaction between the solvent and the powder is well understood and, if necessary, a specific soaking time is added to the procedure. Preliminary kinetic studies are essential to determine the necessary milling time to achieve equilibrium. Only then can exquisite phase composition curves be obtained as a function of the solvent concentration under ball mill liquid assisted grinding (LAG). By using strict and careful procedures analogous to the ones here presented, such milling equilibrium curves can be obtained for virtually all milling systems. The system we use to demonstrate these procedures is a disulfide exchange reaction starting from the equimolar mixture of two homodimers to obtain at equilibrium quantitative heterodimer. The latter is formed by ball mill grinding as two different polymorphs, Form A and Form B. The ratio R = [Form B] / ([Form A] + [Form B]) at milling equilibrium depends on the nature and concentration of the solvent in the milling jar.

  7. Reproducibility of the cutoff probe for the measurement of electron density

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, D. W.; Oh, W. Y.; You, S. J., E-mail: sjyou@cnu.ac.kr

    2016-06-15

    Since plasma processing control based on plasma diagnostics has attracted considerable attention in industry, the reproducibility of the diagnostics used in this application has become of great interest. Because the cutoff probe is one of the potential candidates for this application, knowing the reproducibility of the cutoff probe measurement becomes quite important in cutoff probe application research. To test the reproducibility of the cutoff probe measurement, a comparative study among different cutoff probe measurements was performed in this paper. The comparative study revealed a remarkable result: the cutoff probe has great reproducibility for the electron density measurement, i.e., there is little difference among measurements by different probes made by different experimenters. The reason for this result is discussed in the paper using the basic measurement principle of the cutoff probe and a comparative experiment with a Langmuir probe.
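
    A sketch of how a cutoff probe yields electron density: the cutoff (dip) in the measured microwave transmission spectrum occurs at the electron plasma frequency, so n_e follows directly from the cutoff frequency through f_pe = (1/2π)·sqrt(n_e·e²/(ε₀·m_e)). The frequencies below are example values, not data from the paper.

```python
# Electron density from the measured cutoff (plasma) frequency.
import math

E = 1.602176634e-19       # elementary charge (C)
EPS0 = 8.8541878128e-12   # vacuum permittivity (F/m)
ME = 9.1093837015e-31     # electron mass (kg)

def electron_density_m3(cutoff_frequency_hz):
    """Invert the plasma-frequency relation for the electron density."""
    return (2.0 * math.pi * cutoff_frequency_hz) ** 2 * EPS0 * ME / E**2

for f_ghz in (0.9, 2.0, 2.8):
    n_e = electron_density_m3(f_ghz * 1e9)
    print(f"cutoff {f_ghz:.1f} GHz -> n_e = {n_e:.2e} m^-3 ({n_e/1e6:.2e} cm^-3)")
```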

  8. Accuracy and Reproducibility of Adipose Tissue Measurements in Young Infants by Whole Body Magnetic Resonance Imaging

    PubMed Central

    Bauer, Jan Stefan; Noël, Peter Benjamin; Vollhardt, Christiane; Much, Daniela; Degirmenci, Saliha; Brunner, Stefanie; Rummeny, Ernst Josef; Hauner, Hans

    2015-01-01

    Purpose: MR might be well suited to obtain reproducible and accurate measures of fat tissues in infants. This study evaluates MR-measurements of adipose tissue in young infants in vitro and in vivo. Material and Methods: MR images of ten phantoms simulating subcutaneous fat of an infant’s torso were obtained using a 1.5T MR scanner with and without simulated breathing. Scans consisted of a cartesian water-suppression turbo spin echo (wsTSE) sequence, and a PROPELLER wsTSE sequence. Fat volume was quantified directly and by MR imaging using k-means clustering and threshold-based segmentation procedures to calculate accuracy in vitro. Whole body MR was obtained in sleeping young infants (average age 67±30 days). This study was approved by the local review board. All parents gave written informed consent. To obtain reproducibility in vivo, cartesian and PROPELLER wsTSE sequences were repeated in seven and four young infants, respectively. Overall, 21 repetitions were performed for the cartesian sequence and 13 repetitions for the PROPELLER sequence. Results: In vitro accuracy errors depended on the chosen segmentation procedure, ranging from 5.4% to 76%, while the sequence showed no significant influence. Artificial breathing increased the minimal accuracy error to 9.1%. In vivo reproducibility errors for total fat volume of the sleeping infants ranged from 2.6% to 3.4%. Neither segmentation nor sequence significantly influenced reproducibility. Conclusion: With both cartesian and PROPELLER sequences an accurate and reproducible measure of body fat was achieved. Adequate segmentation was mandatory for high accuracy. PMID:25706876

  9. Accuracy and reproducibility of adipose tissue measurements in young infants by whole body magnetic resonance imaging.

    PubMed

    Bauer, Jan Stefan; Noël, Peter Benjamin; Vollhardt, Christiane; Much, Daniela; Degirmenci, Saliha; Brunner, Stefanie; Rummeny, Ernst Josef; Hauner, Hans

    2015-01-01

    MR might be well suited to obtain reproducible and accurate measures of fat tissues in infants. This study evaluates MR-measurements of adipose tissue in young infants in vitro and in vivo. MR images of ten phantoms simulating subcutaneous fat of an infant's torso were obtained using a 1.5T MR scanner with and without simulated breathing. Scans consisted of a cartesian water-suppression turbo spin echo (wsTSE) sequence, and a PROPELLER wsTSE sequence. Fat volume was quantified directly and by MR imaging using k-means clustering and threshold-based segmentation procedures to calculate accuracy in vitro. Whole body MR was obtained in sleeping young infants (average age 67±30 days). This study was approved by the local review board. All parents gave written informed consent. To obtain reproducibility in vivo, cartesian and PROPELLER wsTSE sequences were repeated in seven and four young infants, respectively. Overall, 21 repetitions were performed for the cartesian sequence and 13 repetitions for the PROPELLER sequence. In vitro accuracy errors depended on the chosen segmentation procedure, ranging from 5.4% to 76%, while the sequence showed no significant influence. Artificial breathing increased the minimal accuracy error to 9.1%. In vivo reproducibility errors for total fat volume of the sleeping infants ranged from 2.6% to 3.4%. Neither segmentation nor sequence significantly influenced reproducibility. With both cartesian and PROPELLER sequences an accurate and reproducible measure of body fat was achieved. Adequate segmentation was mandatory for high accuracy.
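
    A minimal sketch of the intensity-based segmentation used for fat quantification in the two records above: cluster voxel intensities with k-means (k = 2), call the brighter cluster "fat" (fat is bright on the water-suppressed TSE images), and convert the voxel count to a volume. The synthetic image, voxel size, and two-cluster choice are assumptions for illustration, not the study's processing pipeline.

```python
# k-means segmentation of a synthetic "fat" ring in a 2-D slice.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)

# Synthetic slice: dark background/lean tissue plus a bright subcutaneous ring.
img = rng.normal(100, 15, size=(128, 128))
yy, xx = np.mgrid[0:128, 0:128]
ring = (np.hypot(xx - 64, yy - 64) > 40) & (np.hypot(xx - 64, yy - 64) < 50)
img[ring] = rng.normal(400, 25, size=ring.sum())

# Cluster the 1-D intensity distribution into two classes.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    img.reshape(-1, 1))
fat_label = labels[np.argmax(img.reshape(-1))]          # brighter cluster
fat_mask = (labels == fat_label).reshape(img.shape)

voxel_volume_ml = 0.1 * 0.1 * 0.5                        # cm: in-plane x slice
print(f"fat voxels: {fat_mask.sum()}, "
      f"fat volume in this slice: {fat_mask.sum() * voxel_volume_ml:.1f} ml")
```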

  10. Bad Behavior: Improving Reproducibility in Behavior Testing.

    PubMed

    Andrews, Anne M; Cheng, Xinyi; Altieri, Stefanie C; Yang, Hongyan

    2018-01-24

    Systems neuroscience research is increasingly possible through the use of integrated molecular and circuit-level analyses. These studies depend on the use of animal models and, in many cases, molecular and circuit-level analyses associated with genetic, pharmacologic, epigenetic, and other types of environmental manipulations. We illustrate typical pitfalls resulting from poor validation of behavior tests. We describe experimental designs and enumerate controls needed to improve reproducibility in the investigation and reporting of behavioral phenotypes.

  11. ITK: enabling reproducible research and open science

    PubMed Central

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46. PMID:24600387

  12. ITK: enabling reproducible research and open science.

    PubMed

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46.

  13. The Need for Reproducibility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robey, Robert W.

    2016-06-27

    The purpose of this presentation is to consider issues of reproducibility: specifically, whether bitwise reproducible computation is possible, whether computational research in DOE improves its publication process, and whether reproducible results can be achieved apart from the peer review process.

  14. Quantized correlation coefficient for measuring reproducibility of ChIP-chip data.

    PubMed

    Peng, Shouyong; Kuroda, Mitzi I; Park, Peter J

    2010-07-27

    Chromatin immunoprecipitation followed by microarray hybridization (ChIP-chip) is used to study protein-DNA interactions and histone modifications on a genome-scale. To ensure data quality, these experiments are usually performed in replicates, and a correlation coefficient between replicates is used often to assess reproducibility. However, the correlation coefficient can be misleading because it is affected not only by the reproducibility of the signal but also by the amount of binding signal present in the data. We develop the Quantized correlation coefficient (QCC) that is much less dependent on the amount of signal. This involves discretization of data into a set of quantiles (quantization), a merging procedure to group the background probes, and recalculation of the Pearson correlation coefficient. This procedure reduces the influence of the background noise on the statistic, which then properly focuses more on the reproducibility of the signal. The performance of this procedure is tested in both simulated and real ChIP-chip data. For replicates with different levels of enrichment over background and coverage, we find that QCC reflects reproducibility more accurately and is more robust than the standard Pearson or Spearman correlation coefficients. The quantization and the merging procedure can also suggest a proper quantile threshold for separating signal from background for further analysis. To measure reproducibility of ChIP-chip data correctly, a correlation coefficient that is robust to the amount of signal present should be used. QCC is one such measure. The QCC statistic can also be applied in a variety of other contexts for measuring reproducibility, including analysis of array CGH data for DNA copy number and gene expression data.
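
    The abstract above outlines a concrete procedure (quantile discretization, merging of background bins, then a Pearson correlation on the quantized values). The Python toy below sketches that idea; the quantile count, background cutoff, and simulated replicates are placeholders, not the published QCC settings or data.

    ```python
    import numpy as np
    from scipy.stats import pearsonr

    def quantized_correlation(x, y, n_quantiles=10, n_background=5):
        """Toy quantized correlation: discretize each replicate into quantile
        bins, merge the lowest (background) bins into one group, then recompute
        Pearson's r on the quantized values."""
        def quantize(v):
            ranks = np.argsort(np.argsort(v))              # rank of each probe, 0..n-1
            bins = (ranks * n_quantiles) // len(v)         # quantile index per probe
            return np.where(bins < n_background, 0, bins)  # merge background bins
        qx, qy = quantize(np.asarray(x, float)), quantize(np.asarray(y, float))
        return pearsonr(qx, qy)[0]

    # Two noisy replicates sharing a small enriched region on a large background
    rng = np.random.default_rng(0)
    background = rng.normal(0, 1, size=(2, 9000))
    signal = rng.normal(4, 1, size=1000) + rng.normal(0, 0.5, size=(2, 1000))
    rep1 = np.concatenate([background[0], signal[0]])
    rep2 = np.concatenate([background[1], signal[1]])
    print("Pearson r:  ", round(pearsonr(rep1, rep2)[0], 3))
    print("Quantized r:", round(quantized_correlation(rep1, rep2), 3))
    ```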

  15. Modernized Approach for Generating Reproducible Heterogeneity Using Transmitted-Light for Flow Visualization Experiments

    NASA Astrophysics Data System (ADS)

    Jones, A. A.; Holt, R. M.

    2017-12-01

    Image capturing in flow experiments has been used for fluid mechanics research since the early 1970s. Interactions of fluid flow between the vadose zone and permanent water table are of great interest because this zone is responsible for all recharge waters, pollutant transport and irrigation efficiency for agriculture. Griffith, et al. (2011) developed an approach in which reproducible "geologically realistic" sand configurations are constructed and deposited in sand-filled experimental chambers for light-transmitted flow visualization experiments. This method creates reproducible, reverse graded, layered (stratified) thin-slab sand chambers for point source experiments visualizing multiphase flow through porous media. Reverse-graded stratification of sand chambers mimics many naturally occurring sedimentary deposits. Sand-filled chambers use light as a nonintrusive tool for measuring water saturation in two dimensions (2-D). Homogeneous and heterogeneous sand configurations can be produced to visualize the complex physics of the unsaturated zone. The experimental procedure developed by Griffith, et al. (2011) was designed using now outdated and obsolete equipment. We have modernized this approach with a new Parker Deadel linear actuator and programmed projects/code for multiple configurations. We have also updated the Roper CCD software and image processing software with the latest in industry standards. Modernization of the transmitted-light source, robotic equipment, redesigned experimental chambers, and newly developed analytical procedures have greatly reduced time and cost per experiment. We have verified the ability of the new equipment to generate reproducible heterogeneous sand-filled chambers and demonstrated the functionality of the new equipment and procedures by reproducing several gravity-driven fingering experiments conducted by Griffith (2008).

  16. NetBenchmark: a bioconductor package for reproducible benchmarks of gene regulatory network inference.

    PubMed

    Bellot, Pau; Olsen, Catharina; Salembier, Philippe; Oliveras-Vergés, Albert; Meyer, Patrick E

    2015-09-29

    In the last decade, a great number of methods for reconstructing gene regulatory networks from expression data have been proposed. However, very few tools and datasets allow those methods to be evaluated accurately and reproducibly. Hence, we propose here a new tool, able to perform a systematic, yet fully reproducible, evaluation of transcriptional network inference methods. Our open-source and freely available Bioconductor package aggregates a large set of tools to assess the robustness of network inference algorithms against different simulators, topologies, sample sizes and noise intensities. The benchmarking framework that uses various datasets highlights the specialization of some methods toward network types and data. As a result, it is possible to identify the techniques that have broad overall performances.

  17. A highly accurate ab initio potential energy surface for methane.

    PubMed

    Owens, Alec; Yurchenko, Sergei N; Yachmenev, Andrey; Tennyson, Jonathan; Thiel, Walter

    2016-09-14

    A new nine-dimensional potential energy surface (PES) for methane has been generated using state-of-the-art ab initio theory. The PES is based on explicitly correlated coupled cluster calculations with extrapolation to the complete basis set limit and incorporates a range of higher-level additive energy corrections. These include core-valence electron correlation, higher-order coupled cluster terms beyond perturbative triples, scalar relativistic effects, and the diagonal Born-Oppenheimer correction. Sub-wavenumber accuracy is achieved for the majority of experimentally known vibrational energy levels with the four fundamentals of ¹²CH₄ reproduced with a root-mean-square error of 0.70 cm⁻¹. The computed ab initio equilibrium C-H bond length is in excellent agreement with previous values despite pure rotational energies displaying minor systematic errors as J (rotational excitation) increases. It is shown that these errors can be significantly reduced by adjusting the equilibrium geometry. The PES represents the most accurate ab initio surface to date and will serve as a good starting point for empirical refinement.

  18. A synthesis approach for reproducing the response of aircraft panels to a turbulent boundary layer excitation.

    PubMed

    Bravo, Teresa; Maury, Cédric

    2011-01-01

    Random wall-pressure fluctuations due to the turbulent boundary layer (TBL) are a feature of the air flow over an aircraft fuselage under cruise conditions, creating undesirable effects such as cabin noise annoyance. In order to test potential solutions to reduce the TBL-induced noise, a cost-efficient alternative to in-flight or wind-tunnel measurements involves the laboratory simulation of the response of aircraft sidewalls to high-speed subsonic TBL excitation. Previously published work has shown that TBL simulation using a near-field array of loudspeakers is only feasible in the low frequency range due to the rapid decay of the spanwise correlation length with frequency. This paper demonstrates through theoretical criteria how the wavenumber filtering capabilities of the radiating panel reduces the number of sources required, thus dramatically enlarging the frequency range over which the response of the TBL-excited panel is accurately reproduced. Experimental synthesis of the panel response to high-speed TBL excitation is found to be feasible over the hydrodynamic coincidence frequency range using a reduced set of near-field loudspeakers driven by optimal signals. Effective methodologies are proposed for an accurate reproduction of the TBL-induced sound power radiated by the panel into a free-field and when coupled to a cavity.

  19. Optimal experimental design in an epidermal growth factor receptor signalling and down-regulation model.

    PubMed

    Casey, F P; Baird, D; Feng, Q; Gutenkunst, R N; Waterfall, J J; Myers, C R; Brown, K S; Cerione, R A; Sethna, J P

    2007-05-01

    We apply the methods of optimal experimental design to a differential equation model for epidermal growth factor receptor signalling, trafficking and down-regulation. The model incorporates the role of a recently discovered protein complex made up of the E3 ubiquitin ligase, Cbl, the guanine exchange factor (GEF), Cool-1 (β-Pix) and the Rho family G protein Cdc42. The complex has been suggested to be important in disrupting receptor down-regulation. We demonstrate that the model interactions can accurately reproduce the experimental observations, that they can be used to make predictions with accompanying uncertainties, and that we can apply ideas of optimal experimental design to suggest new experiments that reduce the uncertainty on unmeasurable components of the system.
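
    Optimal experimental design is usually formulated through the sensitivity of model predictions to parameters. As a hedged, generic illustration only (a hypothetical two-parameter decay model, not the authors' EGFR signalling model or their specific criterion), a D-optimal choice of measurement times can be sketched as:

    ```python
    import numpy as np
    from itertools import combinations

    # Generic D-optimal design toy: for a hypothetical two-parameter decay
    # y(t) = A*exp(-k*t), pick the measurement times that maximize the
    # determinant of the Fisher information built from the parameter
    # sensitivities dy/dA and dy/dk.
    def jacobian(times, A=2.0, k=0.5):
        t = np.asarray(times, float)
        e = np.exp(-k * t)
        return np.column_stack([e, -A * t * e])       # columns: dy/dA, dy/dk

    def log_det_fim(times):
        J = jacobian(times)
        return np.linalg.slogdet(J.T @ J)[1]          # D-optimality criterion

    candidates = np.linspace(0.1, 10.0, 25)           # candidate sampling times
    best = max(combinations(candidates, 3), key=log_det_fim)
    print("D-optimal 3-point design:", np.round(best, 2))
    ```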

  20. Generating clock signals for a cycle accurate, cycle reproducible FPGA based hardware accelerator

    DOEpatents

    Asaad, Sameth W.; Kapur, Mohit

    2016-01-05

    A method, system and computer program product are disclosed for generating clock signals for a cycle accurate FPGA based hardware accelerator used to simulate operations of a device-under-test (DUT). In one embodiment, the DUT includes multiple device clocks generating multiple device clock signals at multiple frequencies and at a defined frequency ratio; and the FPGA hardware accelerator includes multiple accelerator clocks generating multiple accelerator clock signals to operate the FPGA hardware accelerator to simulate the operations of the DUT. In one embodiment, operations of the DUT are mapped to the FPGA hardware accelerator, and the accelerator clock signals are generated at multiple frequencies and at the defined frequency ratio of the frequencies of the multiple device clocks, to maintain cycle accuracy between the DUT and the FPGA hardware accelerator. In an embodiment, the FPGA hardware accelerator may be used to control the frequencies of the multiple device clocks.

  1. Infrared Imaging of Carbon and Ceramic Composites: Data Reproducibility

    NASA Astrophysics Data System (ADS)

    Knight, B.; Howard, D. R.; Ringermacher, H. I.; Hudson, L. D.

    2010-02-01

    Infrared NDE techniques have proven to be superior for imaging of flaws in ceramic matrix composites (CMC) and carbon silicon carbide composites (C/SiC). Not only can one obtain accurate depth gauging of flaws such as delaminations and layered porosity in complex-shaped components such as airfoils and other aeronautical components, but also excellent reproducibility of image data is obtainable using the STTOF (Synthetic Thermal Time-of-Flight) methodology. The imaging of large complex shapes is fast and reliable. This methodology as applied to large C/SiC flight components at the NASA Dryden Flight Research Center will be described.

  2. Evolvix BEST Names for semantic reproducibility across code2brain interfaces

    PubMed Central

    Scheuer, Katherine S.; Keel, Seth A.; Vyas, Vaibhav; Liblit, Ben; Hanlon, Bret; Ferris, Michael C.; Yin, John; Dutra, Inês; Pietsch, Anthony; Javid, Christine G.; Moog, Cecilia L.; Meyer, Jocelyn; Dresel, Jerdon; McLoone, Brian; Loberger, Sonya; Movaghar, Arezoo; Gilchrist‐Scott, Morgaine; Sabri, Yazeed; Sescleifer, Dave; Pereda‐Zorrilla, Ivan; Zietlow, Andrew; Smith, Rodrigo; Pietenpol, Samantha; Goldfinger, Jacob; Atzen, Sarah L.; Freiberg, Erika; Waters, Noah P.; Nusbaum, Claire; Nolan, Erik; Hotz, Alyssa; Kliman, Richard M.; Mentewab, Ayalew; Fregien, Nathan; Loewe, Martha

    2016-01-01

    Names in programming are vital for understanding the meaning of code and big data. We define code2brain (C2B) interfaces as maps in compilers and brains between meaning and naming syntax, which help to understand executable code. While working toward an Evolvix syntax for general‐purpose programming that makes accurate modeling easy for biologists, we observed how names affect C2B quality. To protect learning and coding investments, C2B interfaces require long‐term backward compatibility and semantic reproducibility (accurate reproduction of computational meaning from coder‐brains to reader‐brains by code alone). Semantic reproducibility is often assumed until confusing synonyms degrade modeling in biology to deciphering exercises. We highlight empirical naming priorities from diverse individuals and roles of names in different modes of computing to show how naming easily becomes impossibly difficult. We present the Evolvix BEST (Brief, Explicit, Summarizing, Technical) Names concept for reducing naming priority conflicts, test it on a real challenge by naming subfolders for the Project Organization Stabilizing Tool system, and provide naming questionnaires designed to facilitate C2B debugging by improving names used as keywords in a stabilizing programming language. Our experiences inspired us to develop Evolvix using a flipped programming language design approach with some unexpected features and BEST Names at its core. PMID:27918836

  3. A reproducible approach to high-throughput biological data acquisition and integration

    PubMed Central

    Rahnavard, Gholamali; Waldron, Levi; McIver, Lauren; Shafquat, Afrah; Franzosa, Eric A.; Miropolsky, Larissa; Sweeney, Christopher

    2015-01-01

    Modern biological research requires rapid, complex, and reproducible integration of multiple experimental results generated both internally and externally (e.g., from public repositories). Although large systematic meta-analyses are among the most effective approaches both for clinical biomarker discovery and for computational inference of biomolecular mechanisms, identifying, acquiring, and integrating relevant experimental results from multiple sources for a given study can be time-consuming and error-prone. To enable efficient and reproducible integration of diverse experimental results, we developed a novel approach for standardized acquisition and analysis of high-throughput and heterogeneous biological data. This allowed, first, novel biomolecular network reconstruction in human prostate cancer, which correctly recovered and extended the NFκB signaling pathway. Next, we investigated host-microbiome interactions. In less than an hour of analysis time, the system retrieved data and integrated six germ-free murine intestinal gene expression datasets to identify the genes most influenced by the gut microbiota, which comprised a set of immune-response and carbohydrate metabolism processes. Finally, we constructed integrated functional interaction networks to compare connectivity of peptide secretion pathways in the model organisms Escherichia coli, Bacillus subtilis, and Pseudomonas aeruginosa. PMID:26157642

  4. Water trimer torsional spectrum from accurate ab initio and semiempirical potentials

    NASA Astrophysics Data System (ADS)

    van der Avoird, Ad; Szalewicz, Krzysztof

    2008-01-01

    The torsional levels of (H2O)3 and (D2O)3 were calculated in a restricted dimensionality (three-dimensional) model with several recently proposed water potentials. Comparison with the experimental data provides a critical test, not only of the pair interactions that have already been probed on the water dimer spectra, but also of the nonadditive three-body contributions to the potential. The purely ab initio CC-pol and HBB potentials that were previously shown to yield very accurate water dimer levels also reproduce the trimer levels well when supplemented with an appropriate three-body interaction potential. The TTM2.1 potential gives considerably poorer agreement with experiment. Also the semiempirical VRT(ASP-W)III potential, fitted to the water dimer vibration-rotation-tunneling levels, gives substantial disagreement with the measured water trimer levels, which shows that the latter probe the potential for geometries other than those probed by the dimer spectrum. Although the three-body nonadditive interactions significantly increase the stability of the water trimer, their effect on the torsional energy barriers and vibration-tunneling frequencies is less significant.

  5. Communication: Improved ab initio molecular dynamics by minimally biasing with experimental data

    NASA Astrophysics Data System (ADS)

    White, Andrew D.; Knight, Chris; Hocky, Glen M.; Voth, Gregory A.

    2017-01-01

    Accounting for electrons and nuclei simultaneously is a powerful capability of ab initio molecular dynamics (AIMD). However, AIMD is often unable to accurately reproduce properties of systems such as water due to inaccuracies in the underlying electronic density functionals. This shortcoming is often addressed by added empirical corrections and/or increasing the simulation temperature. We present here a maximum-entropy approach to directly incorporate limited experimental data via a minimal bias. Biased AIMD simulations of water and an excess proton in water are shown to give significantly improved properties both for observables which were biased to match experimental data and for unbiased observables. This approach also yields new physical insight into inaccuracies in the underlying density functional theory as utilized in the unbiased AIMD.

  6. Communication: Improved ab initio molecular dynamics by minimally biasing with experimental data.

    PubMed

    White, Andrew D; Knight, Chris; Hocky, Glen M; Voth, Gregory A

    2017-01-28

    Accounting for electrons and nuclei simultaneously is a powerful capability of ab initio molecular dynamics (AIMD). However, AIMD is often unable to accurately reproduce properties of systems such as water due to inaccuracies in the underlying electronic density functionals. This shortcoming is often addressed by added empirical corrections and/or increasing the simulation temperature. We present here a maximum-entropy approach to directly incorporate limited experimental data via a minimal bias. Biased AIMD simulations of water and an excess proton in water are shown to give significantly improved properties both for observables which were biased to match experimental data and for unbiased observables. This approach also yields new physical insight into inaccuracies in the underlying density functional theory as utilized in the unbiased AIMD.
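
    The "minimal bias" idea can be illustrated outside AIMD with a toy Monte Carlo sampler: a linear bias λ·f(x), the maximum-entropy form for matching a single average, is tuned on the fly until the sampled mean of an observable reaches a target value. Everything below (potential, target, update rule) is invented for illustration and is not the authors' implementation.

    ```python
    import numpy as np

    # Toy Monte Carlo analogue (not AIMD): tune the strength of a linear bias
    # lambda*x so that the sampled mean <x> matches a hypothetical
    # "experimental" target, while leaving the rest of the ensemble untouched.
    rng = np.random.default_rng(1)
    energy = lambda x: (x**2 - 1.0)**2        # unbiased double-well potential, kT = 1

    target, lam, lr = 0.6, 0.0, 0.05          # target <x>, bias strength, update rate
    x, samples = 0.0, []
    for step in range(200_000):
        x_new = x + rng.normal(0.0, 0.5)
        dU = (energy(x_new) + lam * x_new) - (energy(x) + lam * x)
        if dU <= 0 or rng.random() < np.exp(-dU):     # Metropolis acceptance
            x = x_new
        samples.append(x)
        if (step + 1) % 1000 == 0:                    # slowly adjust the multiplier
            lam += lr * (np.mean(samples[-1000:]) - target)
    print("biased <x> ~", round(float(np.mean(samples[-50_000:])), 3), "target", target)
    ```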

  7. Reproducible segmentation of white matter hyperintensities using a new statistical definition.

    PubMed

    Damangir, Soheil; Westman, Eric; Simmons, Andrew; Vrenken, Hugo; Wahlund, Lars-Olof; Spulber, Gabriela

    2017-06-01

    We present a method based on a proposed statistical definition of white matter hyperintensities (WMH), which can work with any combination of conventional magnetic resonance (MR) sequences without depending on manually delineated samples. T1-weighted, T2-weighted, FLAIR, and PD sequences acquired at 1.5 Tesla from 119 subjects from the Kings Health Partners-Dementia Case Register (healthy controls, mild cognitive impairment, Alzheimer's disease) were used. The segmentation was performed using a proposed definition for WMH based on the one-tailed Kolmogorov-Smirnov test. The presented method was verified, given all possible combinations of input sequences, against manual segmentations and a high similarity (Dice 0.85-0.91) was observed. Comparing segmentations with different input sequences to one another also yielded a high similarity (Dice 0.83-0.94) that exceeded intra-rater similarity (Dice 0.75-0.91). We compared the results with those of other available methods and showed that the segmentation based on the proposed definition has better accuracy and reproducibility in the test dataset used. Overall, the presented definition is shown to produce accurate results with higher reproducibility than manual delineation. This approach can be an alternative to other manual or automatic methods not only because of its accuracy, but also due to its good reproducibility.
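
    As a hedged illustration of how a one-tailed Kolmogorov-Smirnov criterion can flag hyperintensities, the toy below compares a candidate intensity sample against a reference sample of normal-appearing white matter; the intensities, sample sizes, and significance threshold are invented, and the published pipeline involves more than this single test.

    ```python
    import numpy as np
    from scipy.stats import ks_2samp

    # Flag a candidate cluster of FLAIR intensities when its distribution is
    # shifted above a reference sample of normal-appearing white matter (NAWM).
    rng = np.random.default_rng(2)
    nawm = rng.normal(100.0, 8.0, size=5000)        # reference NAWM intensities
    cluster_a = rng.normal(130.0, 10.0, size=60)    # hyperintense-looking candidate
    cluster_b = rng.normal(101.0, 8.0, size=60)     # normal-looking candidate

    def is_hyperintense(cluster, reference, alpha=1e-3):
        # alternative='less': the CDF of `cluster` lies below that of `reference`,
        # i.e. cluster intensities are stochastically larger (one-tailed test).
        res = ks_2samp(cluster, reference, alternative='less')
        return res.pvalue < alpha

    print("cluster A flagged:", is_hyperintense(cluster_a, nawm))
    print("cluster B flagged:", is_hyperintense(cluster_b, nawm))
    ```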

  8. Illusory Motion Reproduced by Deep Neural Networks Trained for Prediction.

    PubMed

    Watanabe, Eiji; Kitaoka, Akiyoshi; Sakamoto, Kiwako; Yasugi, Masaki; Tanaka, Kenta

    2018-01-01

    The cerebral cortex predicts visual motion to adapt human behavior to surrounding objects moving in real time. Although the underlying mechanisms are still unknown, predictive coding is one of the leading theories. Predictive coding assumes that the brain's internal models (which are acquired through learning) predict the visual world at all times and that errors between the prediction and the actual sensory input further refine the internal models. In the past year, deep neural networks based on predictive coding were reported for a video prediction machine called PredNet. If the theory substantially reproduces the visual information processing of the cerebral cortex, then PredNet can be expected to represent the human visual perception of motion. In this study, PredNet was trained with natural scene videos of the self-motion of the viewer, and the motion prediction ability of the obtained computer model was verified using unlearned videos. We found that the computer model accurately predicted the magnitude and direction of motion of a rotating propeller in unlearned videos. Surprisingly, it also represented the rotational motion for illusion images that were not moving physically, much like human visual perception. While the trained network accurately reproduced the direction of illusory rotation, it did not detect motion components in negative control pictures wherein people do not perceive illusory motion. This research supports the exciting idea that the mechanism assumed by the predictive coding theory is one of the bases of motion illusion generation. Using sensory illusions as indicators of human perception, deep neural networks are expected to contribute significantly to the development of brain research.

  9. Evolvix BEST Names for semantic reproducibility across code2brain interfaces.

    PubMed

    Loewe, Laurence; Scheuer, Katherine S; Keel, Seth A; Vyas, Vaibhav; Liblit, Ben; Hanlon, Bret; Ferris, Michael C; Yin, John; Dutra, Inês; Pietsch, Anthony; Javid, Christine G; Moog, Cecilia L; Meyer, Jocelyn; Dresel, Jerdon; McLoone, Brian; Loberger, Sonya; Movaghar, Arezoo; Gilchrist-Scott, Morgaine; Sabri, Yazeed; Sescleifer, Dave; Pereda-Zorrilla, Ivan; Zietlow, Andrew; Smith, Rodrigo; Pietenpol, Samantha; Goldfinger, Jacob; Atzen, Sarah L; Freiberg, Erika; Waters, Noah P; Nusbaum, Claire; Nolan, Erik; Hotz, Alyssa; Kliman, Richard M; Mentewab, Ayalew; Fregien, Nathan; Loewe, Martha

    2017-01-01

    Names in programming are vital for understanding the meaning of code and big data. We define code2brain (C2B) interfaces as maps in compilers and brains between meaning and naming syntax, which help to understand executable code. While working toward an Evolvix syntax for general-purpose programming that makes accurate modeling easy for biologists, we observed how names affect C2B quality. To protect learning and coding investments, C2B interfaces require long-term backward compatibility and semantic reproducibility (accurate reproduction of computational meaning from coder-brains to reader-brains by code alone). Semantic reproducibility is often assumed until confusing synonyms degrade modeling in biology to deciphering exercises. We highlight empirical naming priorities from diverse individuals and roles of names in different modes of computing to show how naming easily becomes impossibly difficult. We present the Evolvix BEST (Brief, Explicit, Summarizing, Technical) Names concept for reducing naming priority conflicts, test it on a real challenge by naming subfolders for the Project Organization Stabilizing Tool system, and provide naming questionnaires designed to facilitate C2B debugging by improving names used as keywords in a stabilizing programming language. Our experiences inspired us to develop Evolvix using a flipped programming language design approach with some unexpected features and BEST Names at its core. © 2016 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.

  10. Reproducibility of MRI-determined proton density fat fraction across two different MR scanner platforms.

    PubMed

    Kang, Geraldine H; Cruite, Irene; Shiehmorteza, Masoud; Wolfson, Tanya; Gamst, Anthony C; Hamilton, Gavin; Bydder, Mark; Middleton, Michael S; Sirlin, Claude B

    2011-10-01

    To evaluate magnetic resonance imaging (MRI)-determined proton density fat fraction (PDFF) reproducibility across two MR scanner platforms and, using MR spectroscopy (MRS)-determined PDFF as reference standard, to confirm MRI-determined PDFF estimation accuracy. This prospective, cross-sectional, crossover, observational pilot study was approved by an Institutional Review Board. Twenty-one subjects gave written informed consent and underwent liver MRI and MRS at both 1.5T (Siemens Symphony scanner) and 3T (GE Signa Excite HD scanner). MRI-determined PDFF was estimated using an axial 2D spoiled gradient-recalled echo sequence with low flip-angle to minimize T1 bias and six echo-times to permit correction of T2* and fat-water signal interference effects. MRS-determined PDFF was estimated using a stimulated-echo acquisition mode sequence with long repetition time to minimize T1 bias and five echo times to permit T2 correction. Interscanner reproducibility of MRI determined PDFF was assessed by correlation analysis; accuracy was assessed separately at each field strength by linear regression analysis using MRS-determined PDFF as reference standard. 1.5T and 3T MRI-determined PDFF estimates were highly correlated (r = 0.992). MRI-determined PDFF estimates were accurate at both 1.5T (regression slope/intercept = 0.958/-0.48) and 3T (slope/intercept = 1.020/0.925) against the MRS-determined PDFF reference. MRI-determined PDFF estimation is reproducible and, using MRS-determined PDFF as reference standard, accurate across two MR scanner platforms at 1.5T and 3T. Copyright © 2011 Wiley-Liss, Inc.
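
    The two quantitative analyses named above (interscanner correlation and regression against an MRS reference) reduce to a few lines of standard statistics. The sketch below uses invented PDFF values purely to illustrate the computation, not the study data.

    ```python
    import numpy as np
    from scipy.stats import linregress, pearsonr

    # Hypothetical paired PDFF values (%) illustrating interscanner
    # reproducibility (1.5T vs 3T) and accuracy against an MRS reference.
    mrs_ref = np.array([2.1, 4.8, 7.5, 12.3, 18.9, 25.4, 31.0])   # MRS-PDFF
    mri_15t = np.array([2.0, 4.6, 7.7, 12.0, 18.2, 24.8, 30.1])   # MRI-PDFF at 1.5T
    mri_3t  = np.array([2.3, 5.0, 7.4, 12.6, 19.3, 25.9, 31.6])   # MRI-PDFF at 3T

    r = pearsonr(mri_15t, mri_3t)[0]          # interscanner reproducibility
    fit = linregress(mrs_ref, mri_15t)        # accuracy vs. the MRS reference
    print(f"interscanner r = {r:.3f}")
    print(f"1.5T vs MRS: slope = {fit.slope:.3f}, intercept = {fit.intercept:.3f}")
    ```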

  11. ACCURATE ORBITAL INTEGRATION OF THE GENERAL THREE-BODY PROBLEM BASED ON THE D'ALEMBERT-TYPE SCHEME

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Minesaki, Yukitaka

    2013-03-15

    We propose an accurate orbital integration scheme for the general three-body problem that retains all conserved quantities except angular momentum. The scheme is provided by an extension of the d'Alembert-type scheme for constrained autonomous Hamiltonian systems. Although the proposed scheme is merely second-order accurate, it can precisely reproduce some periodic, quasiperiodic, and escape orbits. The Levi-Civita transformation plays a role in designing the scheme.

  12. Mathematical modeling and experimental testing of three bioreactor configurations based on windkessel models

    PubMed Central

    Ruel, Jean; Lachance, Geneviève

    2010-01-01

    This paper presents an experimental study of three bioreactor configurations. The bioreactor is intended to be used for the development of tissue-engineered heart valve substitutes. Therefore it must be able to reproduce physiological flow and pressure waveforms accurately. A detailed analysis of three bioreactor arrangements is presented using mathematical models based on the windkessel (WK) approach. First, a review of the many applications of this approach in medical studies enhances its fundamental nature and its usefulness. Then the models are developed with reference to the actual components of the bioreactor. This study emphasizes different conflicting issues arising in the design process of a bioreactor for biomedical purposes, where an optimization process is essential to reach a compromise satisfying all conditions. Two important aspects are the need for a simple system providing ease of use and long-term sterility, opposed to the need for an advanced (thus more complex) architecture capable of a more accurate reproduction of the physiological environment. Three classic WK architectures are analyzed, and experimental results enhance the advantages and limitations of each one. PMID:21977286

  13. Accurate optimization of amino acid form factors for computing small-angle X-ray scattering intensity of atomistic protein structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tong, Dudu; Yang, Sichun; Lu, Lanyuan

    2016-06-20

    Structure modelling via small-angle X-ray scattering (SAXS) data generally requires intensive computations of scattering intensity from any given biomolecular structure, where the accurate evaluation of SAXS profiles using coarse-grained (CG) methods is vital to improve computational efficiency. To date, most CG SAXS computing methods have been based on a single-bead-per-residue approximation but have neglected structural correlations between amino acids. To improve the accuracy of scattering calculations, accurate CG form factors of amino acids are now derived using a rigorous optimization strategy, termed electron-density matching (EDM), to best fit electron-density distributions of protein structures. This EDM method is compared with and tested against other CG SAXS computing methods, and the resulting CG SAXS profiles from EDM agree better with all-atom theoretical SAXS data. By including the protein hydration shell represented by explicit CG water molecules and the correction of protein excluded volume, the developed CG form factors also reproduce the selected experimental SAXS profiles with very small deviations. Taken together, these EDM-derived CG form factors present an accurate and efficient computational approach for SAXS computing, especially when higher molecular details (represented by the q range of the SAXS data) become necessary for effective structure modelling.
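
    Once per-bead form factors are available, SAXS intensities of a coarse-grained model are typically evaluated with the Debye formula, I(q) = sum_ij f_i(q) f_j(q) sin(q r_ij)/(q r_ij). The sketch below illustrates that formula with a placeholder Gaussian bead form factor and random coordinates; it does not use the EDM-optimized amino-acid form factors described in the abstract.

    ```python
    import numpy as np

    # Debye-formula sketch: given one form factor per coarse-grained bead, the
    # scattering intensity follows from the pair distances alone.
    def debye_intensity(qs, coords, form_factor):
        coords = np.asarray(coords, float)
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        out = []
        for q in qs:
            qd = q * d
            sinc = np.where(qd > 1e-12, np.sin(qd) / np.maximum(qd, 1e-12), 1.0)
            f = form_factor(q, len(coords))            # one form factor per bead
            out.append(np.sum(np.outer(f, f) * sinc))  # I(q) = sum_ij f_i f_j sinc(q r_ij)
        return np.array(out)

    gaussian_ff = lambda q, n: np.full(n, 10.0 * np.exp(-0.5 * (1.9 * q) ** 2))
    rng = np.random.default_rng(3)
    beads = rng.normal(0.0, 15.0, size=(50, 3))        # toy bead model (angstrom)
    qs = np.linspace(0.01, 0.5, 5)                     # q in 1/angstrom
    print(np.round(debye_intensity(qs, beads, gaussian_ff), 1))
    ```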

  14. Reproducibility of gastrocnemius medialis muscle architecture during treadmill running.

    PubMed

    Giannakou, Erasmia; Aggeloussis, Nickos; Arampatzis, Adamantios

    2011-12-01

    The purpose of this study was to assess the reproducibility of fascicle length (FL) and pennation angle (PA) of gastrocnemius medialis (GM) muscle during running in vivo. Twelve male recreational long distance runners (mean±SD; age: 24±3 years, mass: 76±7kg) ran on a treadmill at a speed of 3.0m/s, wearing their own running shoes, for two different 10min sessions that were at least 2 days apart. For each test day 10 acceptable trials were recorded. Ankle and knee joint angle data were recorded by a Vicon 624 system with three cameras operating at 120Hz. B-mode ultrasonography was used to examine fascicle length and pennation angle of gastrocnemius medialis muscle. The ultrasound probe was firmly secured on the muscle belly using a lightweight foam fixation. The results indicated that fascicle length and pennation angle demonstrated high reproducibility values during treadmill running both for within and between test days. The root mean square scores between the repeated waveforms of pennation angle and fascicle length were small (∼2° and ∼3.5mm, respectively). However, ∼14 trials for pennation angle and ∼9 trials for fascicle length may be required in order to record accurate data from muscle architecture parameters. In conclusion, ultrasound measurements may be highly reproducible during dynamic movements such as treadmill running, provided that a proper fixation is used in order to assure the constant location and orientation of the ultrasound probe throughout the movement. Copyright © 2011 Elsevier Ltd. All rights reserved.

  15. Experimental and Theoretical Reduction Potentials of Some Biologically Active ortho-Carbonyl para-Quinones.

    PubMed

    Martínez-Cifuentes, Maximiliano; Salazar, Ricardo; Ramírez-Rodríguez, Oney; Weiss-López, Boris; Araya-Maturana, Ramiro

    2017-04-04

    The rational design of quinones with specific redox properties is an issue of great interest because of their applications in pharmaceutical and material sciences. In this work, the electrochemical behavior of a series of four p-quinones was studied experimentally and theoretically. The first and second one-electron reduction potentials of the quinones were determined using cyclic voltammetry and correlated with those calculated by density functional theory (DFT) using three different functionals, BHandHLYP, M06-2x and PBE0. The differences among the experimental reduction potentials were explained in terms of structural effects on the stabilities of the formed species. DFT calculations accurately reproduced the first one-electron experimental reduction potentials with R² higher than 0.94. The BHandHLYP functional presented the best fit to the experimental values (R² = 0.957), followed by M06-2x (R² = 0.947) and PBE0 (R² = 0.942).

  16. Opinion: Is science really facing a reproducibility crisis, and do we need it to?

    PubMed Central

    Fanelli, Daniele

    2018-01-01

    Efforts to improve the reproducibility and integrity of science are typically justified by a narrative of crisis, according to which most published results are unreliable due to growing problems with research and publication practices. This article provides an overview of recent evidence suggesting that this narrative is mistaken, and argues that a narrative of epochal changes and empowerment of scientists would be more accurate, inspiring, and compelling. PMID:29531051

  17. STEM VQ Method, Using Scanning Transmission Electron Microscopy (STEM) for Accurate Virus Quantification

    DTIC Science & Technology

    2017-02-02

    Accurate virus quantification is sought, but a perfect method still eludes the scientific community. Electron ... provides morphology data and counts all viral particles, including partial or noninfectious particles; however, EM methods ... consistent, reproducible virus quantification method called Scanning Transmission Electron Microscopy – Virus Quantification (STEM-VQ) which simplifies

  18. Towards Principled Experimental Study of Autonomous Mobile Robots

    NASA Technical Reports Server (NTRS)

    Gat, Erann

    1995-01-01

    We review the current state of research in autonomous mobile robots and conclude that there is an inadequate basis for predicting the reliability and behavior of robots operating in unengineered environments. We present a new approach to the study of autonomous mobile robot performance based on formal statistical analysis of independently reproducible experiments conducted on real robots. Simulators serve as models rather than experimental surrogates. We demonstrate three new results: 1) Two commonly used performance metrics (time and distance) are not as well correlated as is often tacitly assumed. 2) The probability distributions of these performance metrics are exponential rather than normal, and 3) a modular, object-oriented simulation accurately predicts the behavior of the real robot in a statistically significant manner.

  19. Experimental demonstration of cheap and accurate phase estimation

    NASA Astrophysics Data System (ADS)

    Rudinger, Kenneth; Kimmel, Shelby; Lobser, Daniel; Maunz, Peter

    We demonstrate experimental implementation of robust phase estimation (RPE) to learn the phases of X and Y rotations on a trapped Yb+ ion qubit. Unlike many other phase estimation protocols, RPE does not require ancillae nor near-perfect state preparation and measurement operations. Additionally, its computational requirements are minimal. Via RPE, using only 352 experimental samples per phase, we estimate phases of implemented gates with errors as small as 10⁻⁴ radians, as validated using gate set tomography. We also demonstrate that these estimates exhibit Heisenberg scaling in accuracy. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  20. On the accuracy and reproducibility of a novel probabilistic atlas-based generation for calculation of head attenuation maps on integrated PET/MR scanners.

    PubMed

    Chen, Kevin T; Izquierdo-Garcia, David; Poynton, Clare B; Chonde, Daniel B; Catana, Ciprian

    2017-03-01

    To propose an MR-based method for generating continuous-valued head attenuation maps and to assess its accuracy and reproducibility. Demonstrating that novel MR-based photon attenuation correction methods are both accurate and reproducible is essential prior to using them routinely in research and clinical studies on integrated PET/MR scanners. Continuous-valued linear attenuation coefficient maps ("μ-maps") were generated by combining atlases that provided the prior probability of voxel positions belonging to a certain tissue class (air, soft tissue, or bone) and an MR intensity-based likelihood classifier to produce posterior probability maps of tissue classes. These probabilities were used as weights to generate the μ-maps. The accuracy of this probabilistic atlas-based continuous-valued μ-map ("PAC-map") generation method was assessed by calculating the voxel-wise absolute relative change (RC) between the MR-based and scaled CT-based attenuation-corrected PET images. To assess reproducibility, we performed pair-wise comparisons of the RC values obtained from the PET images reconstructed using the μ-maps generated from the data acquired at three time points. The proposed method produced continuous-valued μ-maps that qualitatively reflected the variable anatomy in patients with brain tumor and agreed well with the scaled CT-based μ-maps. The absolute RC comparing the resulting PET volumes was 1.76 ± 2.33 %, quantitatively demonstrating that the method is accurate. Additionally, we also showed that the method is highly reproducible, the mean RC value for the PET images reconstructed using the μ-maps obtained at the three visits being 0.65 ± 0.95 %. Accurate and highly reproducible continuous-valued head μ-maps can be generated from MR data using a probabilistic atlas-based approach.
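
    Two of the quantitative ideas above are easy to sketch: a continuous μ-map formed as a posterior-probability-weighted mix of tissue-class attenuation coefficients, and the voxel-wise absolute relative change (RC) used to compare reconstructions. The toy below uses nominal 511 keV attenuation values and random volumes; it illustrates the arithmetic, not the published atlas pipeline.

    ```python
    import numpy as np

    # (1) continuous attenuation per voxel as a posterior-weighted mix of
    # tissue-class coefficients; (2) voxel-wise absolute relative change (RC).
    mu_class = {"air": 0.0, "soft_tissue": 0.096, "bone": 0.151}  # 1/cm, nominal

    rng = np.random.default_rng(4)
    shape = (4, 4, 4)
    logits = rng.normal(size=(len(mu_class),) + shape)
    posterior = np.exp(logits) / np.exp(logits).sum(axis=0)     # p(class | voxel)

    mu_map = sum(p * mu for p, mu in zip(posterior, mu_class.values()))

    pet_mr = rng.uniform(0.9, 1.1, size=shape)      # PET recon with MR-based mu-map
    pet_ct = rng.uniform(0.9, 1.1, size=shape)      # PET recon with CT-based mu-map
    rc = 100.0 * np.abs(pet_mr - pet_ct) / pet_ct   # absolute relative change (%)
    print("mean mu:", round(float(mu_map.mean()), 4),
          "| mean |RC| %:", round(float(rc.mean()), 2))
    ```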

  1. Automated Generation of Technical Documentation and Provenance for Reproducible Research

    NASA Astrophysics Data System (ADS)

    Jolly, B.; Medyckyj-Scott, D.; Spiekermann, R.; Ausseil, A. G.

    2017-12-01

    Data provenance and detailed technical documentation are essential components of high-quality reproducible research; however, they are often only partially addressed during a research project. Recording and maintaining this information during the course of a project can be a difficult task to get right as it is a time-consuming and often boring process for the researchers involved. As a result, provenance records and technical documentation provided alongside research results can be incomplete or may not be completely consistent with the actual processes followed. While providing access to the data and code used by the original researchers goes some way toward enabling reproducibility, this does not count as, or replace, data provenance. Additionally, this can be a poor substitute for good technical documentation and is often more difficult for a third party to understand - particularly if they do not understand the programming language(s) used. We present and discuss a tool built from the ground up for the production of well-documented and reproducible spatial datasets that are created by applying a series of classification rules to a number of input layers. The internal model of the classification rules required by the tool to process the input data is exploited to also produce technical documentation and provenance records with minimal additional user input. Available provenance records that accompany input datasets are incorporated into those that describe the current process. As a result, each time a new iteration of the analysis is performed the documentation and provenance records are re-generated to provide an accurate description of the exact process followed. The generic nature of this tool, and the lessons learned during its creation, have wider application to other fields where the production of derivative datasets must be done in an open, defensible, and reproducible way.

  2. Illusory Motion Reproduced by Deep Neural Networks Trained for Prediction

    PubMed Central

    Watanabe, Eiji; Kitaoka, Akiyoshi; Sakamoto, Kiwako; Yasugi, Masaki; Tanaka, Kenta

    2018-01-01

    The cerebral cortex predicts visual motion to adapt human behavior to surrounding objects moving in real time. Although the underlying mechanisms are still unknown, predictive coding is one of the leading theories. Predictive coding assumes that the brain's internal models (which are acquired through learning) predict the visual world at all times and that errors between the prediction and the actual sensory input further refine the internal models. In the past year, deep neural networks based on predictive coding were reported for a video prediction machine called PredNet. If the theory substantially reproduces the visual information processing of the cerebral cortex, then PredNet can be expected to represent the human visual perception of motion. In this study, PredNet was trained with natural scene videos of the self-motion of the viewer, and the motion prediction ability of the obtained computer model was verified using unlearned videos. We found that the computer model accurately predicted the magnitude and direction of motion of a rotating propeller in unlearned videos. Surprisingly, it also represented the rotational motion for illusion images that were not moving physically, much like human visual perception. While the trained network accurately reproduced the direction of illusory rotation, it did not detect motion components in negative control pictures wherein people do not perceive illusory motion. This research supports the exciting idea that the mechanism assumed by the predictive coding theory is one of the bases of motion illusion generation. Using sensory illusions as indicators of human perception, deep neural networks are expected to contribute significantly to the development of brain research. PMID:29599739

  3. Reproducibility in a multiprocessor system

    DOEpatents

    Bellofatto, Ralph A; Chen, Dong; Coteus, Paul W; Eisley, Noel A; Gara, Alan; Gooding, Thomas M; Haring, Rudolf A; Heidelberger, Philip; Kopcsay, Gerard V; Liebsch, Thomas A; Ohmacht, Martin; Reed, Don D; Senger, Robert M; Steinmacher-Burow, Burkhard; Sugawara, Yutaka

    2013-11-26

    Fixing a problem is usually greatly aided if the problem is reproducible. To ensure reproducibility of a multiprocessor system, the following aspects are proposed: a deterministic system start state, a single system clock, phase alignment of clocks in the system, system-wide synchronization events, reproducible execution of system components, deterministic chip interfaces, zero-impact communication with the system, precise stop of the system and a scan of the system state.

  4. Reproducibility of intraocular pressure peak and fluctuation of the water-drinking test.

    PubMed

    Hatanaka, Marcelo; Alencar, Luciana M; De Moraes, Carlos G; Susanna, Remo

    2013-01-01

    The water-drinking test has been used as a stress test to evaluate the drainage system of the eye. However, in order to be clinically applicable,a test must provide reproducible results with consistent measurements. This study was performed to verify the reproducibility of intraocular pressure peaks and fluctuation detected during the water-drinking test in patients with ocular hypertension and open-angle glaucoma. A prospective analysis of patients in a tertiary care unit for glaucoma treatment. Twenty-four ocular hypertension and 64 open-angle glaucoma patients not under treatment. The water-drinking test was performed in 2 consecutive days by the same examiners in patients not under treatment. Reproducibility was assessed using the intraclass correlation coefficient. Peak and fluctuation of intraocular pressure obtained with the water-drinking test were analysed for reproducibility. Eighty-eight eyes from 24 ocular hypertension and 64 open-angle glaucoma patients not under treatment were evaluated. Test and retest intraocular pressure peak values were 28.38 ± 4.64 and 28.38 ± 4.56 mmHg, respectively (P = 1.00). Test and retest intraocular pressure fluctuation values were 5.75 ± 3.9 and 4.99 ± 2.7 mmHg, respectively (P = 0.06). Based on intraclass coefficient, reproducibility was excellent for peak intraocular pressure (intraclass correlation coefficient = 0.79) and fair for intraocular pressure fluctuation (intraclass correlation coefficient = 0.37). Intraocular pressure peaks detected during the water-drinking test presented excellent reproducibility, whereas the reproducibility of fluctuation was considered fair. © 2012 The Authors. Clinical and Experimental Ophthalmology © 2012 Royal Australian and New Zealand College of Ophthalmologists.
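
    The intraclass correlation coefficient used above can be computed directly from test-retest data. The sketch below implements the one-way random-effects ICC(1,1) on invented intraocular pressure peaks; the published study may have used a different ICC variant.

    ```python
    import numpy as np

    # One-way random-effects ICC(1,1): rows are subjects, columns are the two
    # test days (toy values, not study data).
    def icc_1_1(data):
        data = np.asarray(data, float)
        n, k = data.shape
        subject_means = data.mean(axis=1)
        grand_mean = data.mean()
        ms_between = k * np.sum((subject_means - grand_mean) ** 2) / (n - 1)
        ms_within = np.sum((data - subject_means[:, None]) ** 2) / (n * (k - 1))
        return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

    iop_peaks = np.array([[28, 29], [25, 24], [31, 30], [22, 23],
                          [27, 28], [33, 31], [26, 26], [30, 29]])
    print("ICC(1,1) for IOP peaks:", round(icc_1_1(iop_peaks), 2))
    ```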

  5. Reproducibility of MRI-Determined Proton Density Fat Fraction Across Two Different MR Scanner Platforms

    PubMed Central

    Kang, Geraldine H.; Cruite, Irene; Shiehmorteza, Masoud; Wolfson, Tanya; Gamst, Anthony C.; Hamilton, Gavin; Bydder, Mark; Middleton, Michael S.; Sirlin, Claude B.

    2016-01-01

    Purpose: To evaluate magnetic resonance imaging (MRI)-determined proton density fat fraction (PDFF) reproducibility across two MR scanner platforms and, using MR spectroscopy (MRS)-determined PDFF as reference standard, to confirm MRI-determined PDFF estimation accuracy. Materials and Methods: This prospective, cross-sectional, crossover, observational pilot study was approved by an Institutional Review Board. Twenty-one subjects gave written informed consent and underwent liver MRI and MRS at both 1.5T (Siemens Symphony scanner) and 3T (GE Signa Excite HD scanner). MRI-determined PDFF was estimated using an axial 2D spoiled gradient-recalled echo sequence with low flip-angle to minimize T1 bias and six echo-times to permit correction of T2* and fat-water signal interference effects. MRS-determined PDFF was estimated using a stimulated-echo acquisition mode sequence with long repetition time to minimize T1 bias and five echo times to permit T2 correction. Interscanner reproducibility of MRI determined PDFF was assessed by correlation analysis; accuracy was assessed separately at each field strength by linear regression analysis using MRS-determined PDFF as reference standard. Results: 1.5T and 3T MRI-determined PDFF estimates were highly correlated (r = 0.992). MRI-determined PDFF estimates were accurate at both 1.5T (regression slope/intercept = 0.958/−0.48) and 3T (slope/intercept = 1.020/0.925) against the MRS-determined PDFF reference. Conclusion: MRI-determined PDFF estimation is reproducible and, using MRS-determined PDFF as reference standard, accurate across two MR scanner platforms at 1.5T and 3T. PMID:21769986

  6. Contextual sensitivity in scientific reproducibility

    PubMed Central

    Van Bavel, Jay J.; Mende-Siedlecki, Peter; Brady, William J.; Reinero, Diego A.

    2016-01-01

    In recent years, scientists have paid increasing attention to reproducibility. For example, the Reproducibility Project, a large-scale replication attempt of 100 studies published in top psychology journals found that only 39% could be unambiguously reproduced. There is a growing consensus among scientists that the lack of reproducibility in psychology and other fields stems from various methodological factors, including low statistical power, researcher’s degrees of freedom, and an emphasis on publishing surprising positive results. However, there is a contentious debate about the extent to which failures to reproduce certain results might also reflect contextual differences (often termed “hidden moderators”) between the original research and the replication attempt. Although psychologists have found extensive evidence that contextual factors alter behavior, some have argued that context is unlikely to influence the results of direct replications precisely because these studies use the same methods as those used in the original research. To help resolve this debate, we recoded the 100 original studies from the Reproducibility Project on the extent to which the research topic of each study was contextually sensitive. Results suggested that the contextual sensitivity of the research topic was associated with replication success, even after statistically adjusting for several methodological characteristics (e.g., statistical power, effect size). The association between contextual sensitivity and replication success did not differ across psychological subdisciplines. These results suggest that researchers, replicators, and consumers should be mindful of contextual factors that might influence a psychological process. We offer several guidelines for dealing with contextual sensitivity in reproducibility. PMID:27217556

  7. Contextual sensitivity in scientific reproducibility.

    PubMed

    Van Bavel, Jay J; Mende-Siedlecki, Peter; Brady, William J; Reinero, Diego A

    2016-06-07

    In recent years, scientists have paid increasing attention to reproducibility. For example, the Reproducibility Project, a large-scale replication attempt of 100 studies published in top psychology journals found that only 39% could be unambiguously reproduced. There is a growing consensus among scientists that the lack of reproducibility in psychology and other fields stems from various methodological factors, including low statistical power, researcher's degrees of freedom, and an emphasis on publishing surprising positive results. However, there is a contentious debate about the extent to which failures to reproduce certain results might also reflect contextual differences (often termed "hidden moderators") between the original research and the replication attempt. Although psychologists have found extensive evidence that contextual factors alter behavior, some have argued that context is unlikely to influence the results of direct replications precisely because these studies use the same methods as those used in the original research. To help resolve this debate, we recoded the 100 original studies from the Reproducibility Project on the extent to which the research topic of each study was contextually sensitive. Results suggested that the contextual sensitivity of the research topic was associated with replication success, even after statistically adjusting for several methodological characteristics (e.g., statistical power, effect size). The association between contextual sensitivity and replication success did not differ across psychological subdisciplines. These results suggest that researchers, replicators, and consumers should be mindful of contextual factors that might influence a psychological process. We offer several guidelines for dealing with contextual sensitivity in reproducibility.

  8. Low-dimensional, morphologically accurate models of subthreshold membrane potential

    PubMed Central

    Kellems, Anthony R.; Roos, Derrick; Xiao, Nan; Cox, Steven J.

    2009-01-01

    The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. For, in order to understand how a cell distinguishes between input patterns we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain (ℋ2 approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speed-up in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate and fire model. PMID:19172386
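
    The abstract names Balanced Truncation as one of the reduction methods. As a hedged illustration only, the sketch below applies generic balanced truncation to an arbitrary stable linear system using SciPy; the toy state-space matrices are hypothetical stand-ins, not the authors' quasi-active neuron model.

```python
# Hedged sketch: generic balanced truncation of a stable LTI system (toy matrices,
# NOT the quasi-active neuron model of the paper).
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def psd_factor(M, eps=1e-12):
    """Return F with M ~= F @ F.T for a numerically positive semidefinite M."""
    M = 0.5 * (M + M.T)
    w, V = np.linalg.eigh(M)
    return V * np.sqrt(np.clip(w, eps, None))

def balanced_truncation(A, B, C, r):
    """Reduce dx/dt = A x + B u, y = C x to order r (A must be Hurwitz)."""
    P = solve_continuous_lyapunov(A, -B @ B.T)      # controllability Gramian
    Q = solve_continuous_lyapunov(A.T, -C.T @ C)    # observability Gramian
    Lc, Lo = psd_factor(P), psd_factor(Q)
    U, s, Vt = np.linalg.svd(Lo.T @ Lc)             # s: Hankel singular values
    Sih = np.diag(1.0 / np.sqrt(s[:r]))
    T = Lc @ Vt[:r].T @ Sih                         # balancing projection (n x r)
    Ti = Sih @ U[:, :r].T @ Lo.T                    # left inverse (r x n), Ti @ T = I_r
    return Ti @ A @ T, Ti @ B, C @ T, s

# Hypothetical 12-state stable system standing in for a large compartmental model
rng = np.random.default_rng(1)
n = 12
W = rng.standard_normal((n, n))
A = -(W @ W.T + 0.5 * np.eye(n))                    # symmetric negative definite => stable
B, C = rng.standard_normal((n, 1)), rng.standard_normal((1, n))
Ar, Br, Cr, hsv = balanced_truncation(A, B, C, r=3)
print("Hankel singular values:", np.round(hsv, 4))
```

    The reduced triple (Ar, Br, Cr) approximates the input-output behaviour of the full system; the decay of the Hankel singular values indicates how small r can be made.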

  9. The reproducibility of 31-phosphorus MRS measures of muscle energetics at 3 Tesla in trained men.

    PubMed

    Edwards, Lindsay M; Tyler, Damian J; Kemp, Graham J; Dwyer, Renee M; Johnson, Andrew; Holloway, Cameron J; Nevill, Alan M; Clarke, Kieran

    2012-01-01

    Magnetic resonance spectroscopy (MRS) provides an exceptional opportunity for the study of in vivo metabolism. MRS is widely used to measure phosphorus metabolites in trained muscle, although there are no published data regarding its reproducibility in this specialized cohort. Thus, the aim of this study was to assess the reproducibility of 31P-MRS in trained skeletal muscle. We recruited fifteen trained men (VO2peak = 4.7±0.8 L·min⁻¹; 58±8 mL·kg⁻¹·min⁻¹) and performed duplicate MR experiments during plantar flexion exercise, three weeks apart. Measures of resting phosphorus metabolites were reproducible, with 1.7 mM being the smallest detectable difference in phosphocreatine (PCr). Measures of metabolites during exercise were less reliable: exercising PCr had a coefficient of variation (CV) of 27%, compared with 8% at rest. Estimates of mitochondrial function were variable, but experimentally useful. The CV of PCr1/2t was 40%, yet much of this variance was inter-subject, such that differences of <20% were detectable with n = 15, given a significance threshold of p<0.05. 31-phosphorus MRS provides reproducible and experimentally useful measures of phosphorus metabolites and mitochondrial function in trained human skeletal muscle.
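
    The abstract reports a coefficient of variation and a smallest detectable difference from duplicate visits, but not the exact formulas used. The sketch below shows one common test-retest convention (within-subject SD from paired differences, CV, and a smallest detectable difference of 1.96·√2·SD_within) applied to simulated, hypothetical data.

```python
# Hedged sketch of common test-retest statistics (the abstract does not give the
# authors' exact formulas): within-subject SD from paired differences, CV, and a
# smallest detectable difference of 1.96 * sqrt(2) * SD_within. Data are simulated.
import numpy as np

def test_retest_stats(visit1, visit2):
    v1, v2 = np.asarray(visit1, float), np.asarray(visit2, float)
    diff = v1 - v2
    sd_within = np.sqrt(np.mean(diff ** 2) / 2.0)   # within-subject SD of a single measure
    cv = 100.0 * sd_within / np.mean(np.r_[v1, v2])
    sdd = 1.96 * np.sqrt(2.0) * sd_within           # smallest detectable difference
    return sd_within, cv, sdd

# Hypothetical resting [PCr] (mM) for 15 subjects measured on two visits
rng = np.random.default_rng(0)
true_pcr = rng.normal(33.0, 2.0, 15)
visit1 = true_pcr + rng.normal(0.0, 0.6, 15)
visit2 = true_pcr + rng.normal(0.0, 0.6, 15)
sd_w, cv, sdd = test_retest_stats(visit1, visit2)
print(f"SD_within = {sd_w:.2f} mM, CV = {cv:.1f}%, SDD = {sdd:.2f} mM")
```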

  10. Accuracy, repeatability, and reproducibility of Artemis very high-frequency digital ultrasound arc-scan lateral dimension measurements

    PubMed Central

    Reinstein, Dan Z.; Archer, Timothy J.; Silverman, Ronald H.; Coleman, D. Jackson

    2008-01-01

    Purpose To determine the accuracy, repeatability, and reproducibility of measurement of lateral dimensions using the Artemis (Ultralink LLC) very high-frequency (VHF) digital ultrasound (US) arc scanner. Setting London Vision Clinic, London, United Kingdom. Methods A test object was measured first with a micrometer and then with the Artemis arc scanner. Five sets of 10 consecutive B-scans of the test object were performed with the scanner. The test object was removed from the system between each scan set. One expert observer and one newly trained observer separately measured the lateral dimension of the test object. Two-factor analysis of variance was performed. The accuracy was calculated as the average bias of the scan set averages. The repeatability and reproducibility coefficients were calculated. The coefficient of variation (CV) was calculated for repeatability and reproducibility. Results The test object was measured to be 10.80 mm wide. The mean lateral dimension bias was 0.00 mm. The repeatability coefficient was 0.114 mm. The reproducibility coefficient was 0.026 mm. The repeatability CV was 0.38%, and the reproducibility CV was 0.09%. There was no statistically significant variation between observers (P = .0965). There was a statistically significant variation between scan sets (P = .0036) attributed to minor vertical changes in the alignment of the test object between consecutive scan sets. Conclusion The Artemis VHF digital US arc scanner obtained accurate, repeatable, and reproducible measurements of lateral dimensions of the size commonly found in the anterior segment. PMID:17081860

  11. Accurate thermoelastic tensor and acoustic velocities of NaCl

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marcondes, Michel L., E-mail: michel@if.usp.br; Chemical Engineering and Material Science, University of Minnesota, Minneapolis, 55455; Shukla, Gaurav, E-mail: shukla@physics.umn.edu

    Despite the importance of thermoelastic properties of minerals in geology and geophysics, their measurement at high pressures and temperatures is still challenging. Thus, ab initio calculations are an essential tool for predicting these properties at extreme conditions. Owing to the approximate description of the exchange-correlation energy, approximations used in calculations of vibrational effects, and numerical/methodological approximations, these methods produce systematic deviations. Hybrid schemes combining experimental data and theoretical results have emerged as a way to reconcile available information and offer more reliable predictions at experimentally inaccessible thermodynamic conditions. Here we introduce a method to improve the calculated thermoelastic tensor by using a highly accurate thermal equation of state (EoS). The corrective scheme is general, applicable to crystalline solids with any symmetry, and can produce accurate results at conditions where experimental data may not exist. We apply it to rock-salt-type NaCl, a material whose structural properties have been challenging to describe accurately by standard ab initio methods and whose acoustic/seismic properties are important for the gas and oil industry.

  12. Ranking and averaging independent component analysis by reproducibility (RAICAR).

    PubMed

    Yang, Zhi; LaConte, Stephen; Weng, Xuchu; Hu, Xiaoping

    2008-06-01

    Independent component analysis (ICA) is a data-driven approach that has exhibited great utility for functional magnetic resonance imaging (fMRI). Standard ICA implementations, however, do not provide the number and relative importance of the resulting components. In addition, ICA algorithms utilizing gradient-based optimization give decompositions that are dependent on initialization values, which can lead to dramatically different results. In this work, a new method, RAICAR (Ranking and Averaging Independent Component Analysis by Reproducibility), is introduced to address these issues for spatial ICA applied to fMRI. RAICAR utilizes repeated ICA realizations and relies on the reproducibility between them to rank and select components. Different realizations are aligned based on correlations, leading to aligned components. Each component is ranked and thresholded based on between-realization correlations. Furthermore, different realizations of each aligned component are selectively averaged to generate the final estimate of the given component. Reliability and accuracy of this method are demonstrated with both simulated and experimental fMRI data. Copyright 2007 Wiley-Liss, Inc.
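
    As a hedged sketch of the general idea described in the abstract (repeat ICA, align components across realizations by correlation, rank by between-realization reproducibility, and average the aligned components), the code below is not the published RAICAR implementation; the data, run count, and matching strategy (Hungarian assignment on absolute correlations) are assumptions for illustration.

```python
# Hedged sketch of reproducibility-ranked ICA (not the published RAICAR code):
# repeat ICA, align components across runs by absolute correlation, fix the sign
# ambiguity, rank by mean between-run correlation, then average aligned components.
import numpy as np
from scipy.optimize import linear_sum_assignment
from sklearn.decomposition import FastICA

def align_and_rank(X, n_components=5, n_runs=8):
    runs = []
    for seed in range(n_runs):
        ica = FastICA(n_components=n_components, random_state=seed, max_iter=1000)
        runs.append(ica.fit_transform(X).T)              # components x samples
    ref, aligned = runs[0], [runs[0]]
    for comp in runs[1:]:
        corr = np.corrcoef(ref, comp)[:n_components, n_components:]
        _, col = linear_sum_assignment(-np.abs(corr))    # one-to-one matching to ref
        signs = np.sign(corr[np.arange(n_components), col])
        aligned.append(signs[:, None] * comp[col])       # fix ICA sign ambiguity
    aligned = np.stack(aligned)                          # runs x components x samples
    scores = np.array([np.abs(np.corrcoef(aligned[:, c, :]))[np.triu_indices(n_runs, 1)].mean()
                       for c in range(n_components)])
    order = np.argsort(scores)[::-1]                     # most reproducible first
    return aligned.mean(axis=0)[order], scores[order]

# Hypothetical data: 100 time points x 200 "voxels"
X = np.random.default_rng(0).standard_normal((100, 200))
components, reproducibility = align_and_rank(X)
print(np.round(reproducibility, 3))
```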

  13. A SELDI mass spectrometry study of experimental autoimmune encephalomyelitis: sample preparation, reproducibility, and differential protein expression patterns.

    PubMed

    Azzam, Sausan; Broadwater, Laurie; Li, Shuo; Freeman, Ernest J; McDonough, Jennifer; Gregory, Roger B

    2013-05-01

    Experimental autoimmune encephalomyelitis (EAE) is an autoimmune, inflammatory disease of the central nervous system that is widely used as a model of multiple sclerosis (MS). Mitochondrial dysfunction appears to play a role in the development of neuropathology in MS and may also play a role in disease pathology in EAE. Here, surface enhanced laser desorption ionization mass spectrometry (SELDI-MS) has been employed to obtain protein expression profiles from mitochondrially enriched fractions derived from EAE and control mouse brain. To gain insight into experimental variation, the reproducibility of sub-cellular fractionation, anion exchange fractionation as well as spot-to-spot and chip-to-chip variation using pooled samples from brain tissue was examined. Variability of SELDI mass spectral peak intensities indicates a coefficient of variation (CV) of 15.6% and 17.6% between spots on a given chip and between different chips, respectively. Thinly slicing tissue prior to homogenization with a rotor homogenizer showed better reproducibility (CV = 17.0%) than homogenization of blocks of brain tissue with a Teflon® pestle (CV = 27.0%). Fractionation of proteins with anion exchange beads prior to SELDI-MS analysis gave overall CV values from 16.1% to 18.6%. SELDI mass spectra of mitochondrial fractions obtained from brain tissue from EAE mice and controls displayed 39 differentially expressed proteins (p≤ 0.05) out of a total of 241 protein peaks observed in anion exchange fractions. Hierarchical clustering analysis showed that protein fractions from EAE animals with severe disability clearly segregated from controls. Several components of electron transport chain complexes (cytochrome c oxidase subunit 6b1, subunit 6C, and subunit 4; NADH dehydrogenase flavoprotein 3, alpha subcomplex subunit 2, Fe-S protein 4, and Fe-S protein 6; and ATP synthase subunit e) were identified as possible differentially expressed proteins. Myelin Basic Protein isoform 8 (MBP8) (14.2 k

  14. Experimental Demonstration of a Cheap and Accurate Phase Estimation

    DOE PAGES

    Rudinger, Kenneth; Kimmel, Shelby; Lobser, Daniel; ...

    2017-05-11

    We demonstrate an experimental implementation of robust phase estimation (RPE) to learn the phase of a single-qubit rotation on a trapped Yb+ ion qubit. Here, we show this phase can be estimated with an uncertainty below 4 × 10⁻⁴ rad using as few as 176 total experimental samples, and our estimates exhibit Heisenberg scaling. Unlike standard phase estimation protocols, RPE neither assumes perfect state preparation and measurement, nor requires access to ancillae. We cross-validate the results of RPE with the more resource-intensive protocol of gate set tomography.

  15. The use of a robotic tibial rotation device and an electromagnetic tracking system to accurately reproduce the clinical dial test.

    PubMed

    Stinton, S K; Siebold, R; Freedberg, H; Jacobs, C; Branch, T P

    2016-03-01

    The purpose of this study was to: (1) determine whether a robotic tibial rotation device and an electromagnetic tracking system could accurately reproduce the clinical dial test at 30° of knee flexion; (2) compare rotation data captured at the footplates of the robotic device to tibial rotation data measured using an electromagnetic sensor on the proximal tibia. Thirty-two unilateral ACL-reconstructed patients were examined using a robotic tibial rotation device that mimicked the dial test. The data reported in this study is only from the healthy legs of these patients. Torque was applied through footplates and was measured using servomotors. Lower leg motion was measured at the foot using the motors. Tibial motion was also measured through an electromagnetic tracking system and a sensor on the proximal tibia. Load-deformation curves representing rotational motion of the foot and tibia were compared using Pearson's correlation coefficients. Off-axis motions including medial-lateral translation and anterior-posterior translation were also measured using the electromagnetic system. The robotic device and electromagnetic system were able to provide axial rotation data and translational data for the tibia during the dial test. Motion measured at the foot was not correlated to motion of the tibial tubercle in internal rotation or in external rotation. The position of the tibial tubercle was 26.9° ± 11.6° more internally rotated than the foot at torque 0 Nm. Medial-lateral translation and anterior-posterior translation were combined to show the path of the tubercle in the coronal plane during tibial rotation. The information captured during a manual dial test includes both rotation of the tibia and proximal tibia translation. All of this information can be captured using a robotic tibial axial rotation device with an electromagnetic tracking system. The pathway of the tibial tubercle during tibial axial rotation can provide additional information about knee

  16. Analytical Validation of a Highly Quantitative, Sensitive, Accurate, and Reproducible Assay (HERmark) for the Measurement of HER2 Total Protein and HER2 Homodimers in FFPE Breast Cancer Tumor Specimens.

    PubMed

    Larson, Jeffrey S; Goodman, Laurie J; Tan, Yuping; Defazio-Eli, Lisa; Paquet, Agnes C; Cook, Jennifer W; Rivera, Amber; Frankson, Kristi; Bose, Jolly; Chen, Lili; Cheung, Judy; Shi, Yining; Irwin, Sarah; Kiss, Linda D B; Huang, Weidong; Utter, Shannon; Sherwood, Thomas; Bates, Michael; Weidler, Jodi; Parry, Gordon; Winslow, John; Petropoulos, Christos J; Whitcomb, Jeannette M

    2010-06-28

    We report here the results of the analytical validation of assays that measure HER2 total protein (H2T) and HER2 homodimer (H2D) expression in Formalin Fixed Paraffin Embedded (FFPE) breast cancer tumors as well as cell line controls. The assays are based on the VeraTag technology platform and are commercially available through a central CAP-accredited clinical reference laboratory. The accuracy of H2T measurements spans a broad dynamic range (2-3 logs) as evaluated by comparison with cross-validating technologies. The measurement of H2T expression demonstrates a sensitivity that is approximately 7-10 times greater than conventional immunohistochemistry (IHC) (HercepTest). The HERmark assay is a quantitative assay that sensitively and reproducibly measures continuous H2T and H2D protein expression levels and therefore may have the potential to stratify patients more accurately with respect to response to HER2-targeted therapies than current methods which rely on semiquantitative protein measurements (IHC) or on indirect assessments of gene amplification (FISH).

  17. Wringing the last drop of optically stimulated luminescence response for accurate dating of glacial sediments

    NASA Astrophysics Data System (ADS)

    Medialdea, Alicia; Bateman, Mark D.; Evans, David J.; Roberts, David H.; Chiverrell, Richard C.; Clark, Chris D.

    2017-04-01

    BRITICE-CHRONO is a NERC-funded consortium project of more than 40 researchers aiming to establish the retreat patterns of the last British and Irish Ice Sheet. For this purpose, optically stimulated luminescence (OSL) dating, among other dating techniques, has been used in order to establish an accurate chronology. More than 150 samples from glacial environments have been dated and provide key information for modelling of the ice retreat. Nevertheless, luminescence dating of glacial sediments has proven to be challenging: first, glacial sediments were often affected by incomplete bleaching and, secondly, quartz grains within the sediments sampled often showed complex luminescence behaviour, characterized by dim signals and low reproducibility. Specific statistical approaches have been used to overcome the former, enabling the estimated ages to be based on grain populations most likely to have been well bleached. This latest work presents how issues surrounding complex luminescence behaviour were overcome in order to obtain accurate OSL ages. This study has been performed on two samples of bedded sand originating from an ice-walled lake plain in Lincolnshire, UK. Quartz extracts from each sample were artificially bleached and irradiated to known doses. Dose recovery tests have been carried out under different conditions to study the effect of preheat temperature, thermal quenching, the contribution of slow components, a hot bleach after measurement cycles, and IR stimulation. Measurements have been performed on different luminescence readers to study the possible contribution of instrument reproducibility. These have shown that great variability can be observed not only among the studied samples but also within a specific site and even a specific sample. In order to determine an accurate chronology and realistic uncertainties for the estimated ages, this variability must be taken into account. Tight acceptance criteria to measured doses from natural, not

  18. Application of the weighted-density approximation to the accurate description of electron-positron correlation effects in materials

    NASA Astrophysics Data System (ADS)

    Callewaert, Vincent; Saniz, Rolando; Barbiellini, Bernardo; Bansil, Arun; Partoens, Bart

    2017-08-01

    We discuss positron-annihilation lifetimes for a set of illustrative bulk materials within the framework of the weighted-density approximation (WDA). The WDA can correctly describe electron-positron correlations in strongly inhomogeneous systems, such as surfaces, where the applicability of (semi-)local approximations is limited. We analyze the WDA in detail and show that the electrons which cannot screen external charges efficiently, such as the core electrons, cannot be treated accurately via the pair correlation of the homogeneous electron gas. We discuss how this problem can be addressed by reducing the screening in the homogeneous electron gas by adding terms depending on the gradient of the electron density. Further improvements are obtained when core electrons are treated within the LDA and the valence electrons within the WDA. Finally, we discuss a semiempirical WDA-based approach in which a sum rule is imposed to reproduce the experimental lifetimes.

  19. Numerical and Experimental Studies of Particle Settling in Real Fracture Geometries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roy, Pratanu; Du Frane, Wyatt L.; Kanarska, Yuliya

    Proppant is a vital component of hydraulic stimulation operations, improving conductivity by maintaining fracture aperture. While correct placement is a necessary part of ensuring that proppant performs efficiently, the transport behavior of proppant in natural rock fractures is poorly understood. In particular, as companies pursue new propping strategies involving new types of proppant, more accurate models of proppant behavior are needed to help guide their deployment. A major difficulty with simulating reservoir-scale proppant behavior is that continuum models traditionally used to represent large-scale slurry behavior lose applicability in fracture geometries. Particle transport models are often based on representative volumes that are at the same scale or larger than fractures found in hydraulic fracturing operations, making them inappropriate for modeling these types of flows. In the absence of a first-principles approach, empirical closure relations are needed. However, even such empirical closure relationships are difficult to derive without an accurate understanding of proppant behavior on the particle level. Thus, there is a need for experiments and simulations capable of probing phenomena at the sub-fracture scale. In this paper, we present results from experimental and numerical studies investigating proppant behavior at the sub-fracture level, in particular, the role of particle dispersion during proppant settling. In the experimental study, three-dimensional printing techniques are used to accurately reproduce the topology of a fractured Marcellus shale sample inside a particle-flow cell.

  20. Numerical and Experimental Studies of Particle Settling in Real Fracture Geometries

    DOE PAGES

    Roy, Pratanu; Du Frane, Wyatt L.; Kanarska, Yuliya; ...

    2016-09-30

    Proppant is a vital component of hydraulic stimulation operations, improving conductivity by maintaining fracture aperture. While correct placement is a necessary part of ensuring that proppant performs efficiently, the transport behavior of proppant in natural rock fractures is poorly understood. In particular, as companies pursue new propping strategies involving new types of proppant, more accurate models of proppant behavior are needed to help guide their deployment. A major difficulty with simulating reservoir-scale proppant behavior is that continuum models traditionally used to represent large-scale slurry behavior lose applicability in fracture geometries. Particle transport models are often based on representative volumes that are at the same scale or larger than fractures found in hydraulic fracturing operations, making them inappropriate for modeling these types of flows. In the absence of a first-principles approach, empirical closure relations are needed. However, even such empirical closure relationships are difficult to derive without an accurate understanding of proppant behavior on the particle level. Thus, there is a need for experiments and simulations capable of probing phenomena at the sub-fracture scale. In this paper, we present results from experimental and numerical studies investigating proppant behavior at the sub-fracture level, in particular, the role of particle dispersion during proppant settling. In the experimental study, three-dimensional printing techniques are used to accurately reproduce the topology of a fractured Marcellus shale sample inside a particle-flow cell.

  1. Tau-independent Phase Analysis: A Novel Method for Accurately Determining Phase Shifts.

    PubMed

    Tackenberg, Michael C; Jones, Jeff R; Page, Terry L; Hughey, Jacob J

    2018-06-01

    Estimations of period and phase are essential in circadian biology. While many techniques exist for estimating period, comparatively few methods are available for estimating phase. Current approaches to analyzing phase often vary between studies and are sensitive to coincident changes in period and the stage of the circadian cycle at which the stimulus occurs. Here we propose a new technique, tau-independent phase analysis (TIPA), for quantifying phase shifts in multiple types of circadian time-course data. Through comprehensive simulations, we show that TIPA is both more accurate and more precise than the standard actogram approach. TIPA is computationally simple and therefore will enable accurate and reproducible quantification of phase shifts across multiple subfields of chronobiology.

  2. Enhancing Scientific Foundations to Ensure Reproducibility: A New Paradigm.

    PubMed

    Hsieh, Terry; Vaickus, Max H; Remick, Daniel G

    2018-01-01

    Progress in science is dependent on a strong foundation of reliable results. The publish or perish paradigm in research, coupled with an increase in retracted articles from the peer-reviewed literature, is beginning to erode the trust of both the scientific community and the public. The NIH is combating errors by requiring investigators to follow new guidelines addressing scientific premise, experimental design, biological variables, and authentication of reagents. Herein, we discuss how implementation of NIH guidelines will help investigators proactively address pitfalls of experimental design and methods. Careful consideration of the variables contributing to reproducibility helps ensure robust results. The NIH, investigators, and journals must collaborate to ensure that quality science is funded, explored, and published. Copyright © 2018 American Society for Investigative Pathology. Published by Elsevier Inc. All rights reserved.

  3. A SELDI mass spectrometry study of experimental autoimmune encephalomyelitis: sample preparation, reproducibility, and differential protein expression patterns

    PubMed Central

    2013-01-01

    Background Experimental autoimmune encephalomyelitis (EAE) is an autoimmune, inflammatory disease of the central nervous system that is widely used as a model of multiple sclerosis (MS). Mitochondrial dysfunction appears to play a role in the development of neuropathology in MS and may also play a role in disease pathology in EAE. Here, surface enhanced laser desorption ionization mass spectrometry (SELDI-MS) has been employed to obtain protein expression profiles from mitochondrially enriched fractions derived from EAE and control mouse brain. To gain insight into experimental variation, the reproducibility of sub-cellular fractionation, anion exchange fractionation as well as spot-to-spot and chip-to-chip variation using pooled samples from brain tissue was examined. Results Variability of SELDI mass spectral peak intensities indicates a coefficient of variation (CV) of 15.6% and 17.6% between spots on a given chip and between different chips, respectively. Thinly slicing tissue prior to homogenization with a rotor homogenizer showed better reproducibility (CV = 17.0%) than homogenization of blocks of brain tissue with a Teflon® pestle (CV = 27.0%). Fractionation of proteins with anion exchange beads prior to SELDI-MS analysis gave overall CV values from 16.1% to 18.6%. SELDI mass spectra of mitochondrial fractions obtained from brain tissue from EAE mice and controls displayed 39 differentially expressed proteins (p≤ 0.05) out of a total of 241 protein peaks observed in anion exchange fractions. Hierarchical clustering analysis showed that protein fractions from EAE animals with severe disability clearly segregated from controls. Several components of electron transport chain complexes (cytochrome c oxidase subunit 6b1, subunit 6C, and subunit 4; NADH dehydrogenase flavoprotein 3, alpha subcomplex subunit 2, Fe-S protein 4, and Fe-S protein 6; and ATP synthase subunit e) were identified as possible differentially expressed proteins. Myelin Basic Protein

  4. Regulating Ultrasound Cavitation in order to Induce Reproducible Sonoporation

    NASA Astrophysics Data System (ADS)

    Mestas, J.-L.; Alberti, L.; El Maalouf, J.; Béra, J.-C.; Gilles, B.

    2010-03-01

    Sonoporation is thought to be linked to cavitation, which generally appears to be a non-reproducible and non-stationary phenomenon. In order to obtain an acceptable trade-off between cell mortality and transfection, a regulated cavitation generator based on an acoustical cavitation measurement was developed and tested. The medium to be sonicated is placed in a sample tray. This tray is immersed in degassed water and positioned above the face of a flat ultrasonic transducer (frequency: 445 kHz; intensity range: 0.08-1.09 W/cm2). This configuration was considered conducive to standing-wave generation through reflection at the air/medium interface in the well, thus enhancing the cavitation phenomenon. Laterally to the transducer, a homemade hydrophone was oriented to receive the acoustical signal from the bubbles. From this spectral signal, recorded at intervals of 5 ms, a cavitation index was calculated as the mean of the cavitation spectrum integrated on a logarithmic scale, and the excitation power was automatically corrected. The device generates stable and reproducible cavitation levels for a wide range of cavitation setpoints, from stable cavitation up to fully developed inertial cavitation. For the ultrasound intensity range used, the time delay of the response is lower than 200 ms. The cavitation regulation device was evaluated in terms of the chemical effects of bubble collapse: hydroxyl radical production was measured on terephthalic acid solutions. In open loop, the results show great variability whatever the excitation power; in closed loop, by contrast, they are highly reproducible. The device was also used to study sonodynamic effects. The regulation provides more reproducible results independent of the cell medium and experimental conditions (temperature, pressure). Other applications of this regulated cavitation device concern the internalization of particles (quantum dots), molecules (siRNA), or plasmids (GFP, DsRed) into different
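
    A hedged sketch of the regulation loop described in the abstract: a cavitation index computed as the mean of the log-scale hydrophone spectrum is compared with a setpoint, and the excitation intensity is corrected automatically. The spectral plant model and proportional gain below are hypothetical.

```python
# Hedged sketch of closed-loop regulation of a cavitation index. The "plant"
# (simulated hydrophone spectrum) and the gain kp are hypothetical; only the loop
# structure mirrors the description in the abstract.
import numpy as np

def cavitation_index(spectrum):
    # Mean of the spectrum on a logarithmic scale, as described in the abstract
    return float(np.mean(np.log10(np.abs(spectrum) + 1e-12)))

def simulate_spectrum(intensity, rng, n_bins=512):
    # Hypothetical broadband response whose level rises with acoustic intensity
    return (0.1 + intensity) * rng.lognormal(0.0, 1.0, n_bins)

rng = np.random.default_rng(0)
intensity, setpoint, kp = 0.2, -0.3, 0.15        # W/cm^2, index target, proportional gain
for step in range(200):                          # one step ~ one 5 ms acquisition
    idx = cavitation_index(simulate_spectrum(intensity, rng))
    intensity = float(np.clip(intensity + kp * (setpoint - idx), 0.08, 1.09))
print(f"steady-state intensity ~ {intensity:.2f} W/cm^2, index ~ {idx:.2f}")
```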

  5. Reproducibility in Data-Scarce Environments

    NASA Astrophysics Data System (ADS)

    Darch, P. T.

    2016-12-01

    Among the usual requirements for reproducibility are large volumes of data and computationally intensive methods. Many fields within earth sciences, however, do not meet these requirements. Data are scarce and data-intensive methods are not well established. How can science be reproducible under these conditions? What changes, both infrastructural and cultural, are needed to advance reproducibility? This paper presents findings from a long-term social scientific case study of an emergent and data scarce field, the deep subseafloor biosphere. This field studies interactions between microbial communities living in the seafloor and the physical environments they inhabit. Factors such as these make reproducibility seem a distant goal for this community: - The relative newness of the field. Serious study began in the late 1990s; - The highly multidisciplinary nature of the field. Researchers come from a range of physical and life science backgrounds; - Data scarcity. Domain researchers produce much of these data in their own onshore laboratories by analyzing cores from international ocean drilling expeditions. Allocation of cores is negotiated between researchers from many fields. These factors interact in multiple ways to inhibit reproducibility: - Incentive structures emphasize producing new data and new knowledge rather than reanalysing extant data; - Only a few steps of laboratory analyses can be reproduced - such as analysis of DNA sequences, but not extraction of DNA from cores -, due to scarcity of cores; - Methodological heterogeneity is a consequence of multidisciplinarity, as researchers bring different techniques from diverse fields. - Few standards for data collection or analysis are available at this early stage of the field; - While datasets from multiple biological and physical phenomena can be integrated into a single workflow, curation tends to be divergent. Each type of dataset may be subject to different disparate policies and contributed to different

  6. Accuracy of femoral templating in reproducing anatomical femoral offset in total hip replacement.

    PubMed

    Davies, H; Foote, J; Spencer, R F

    2007-01-01

    Restoration of hip biomechanics is a crucial component of successful total hip replacement. Preoperative templating is recommended to ensure that the size and orientation of implants is optimised. We studied how closely natural femoral offset could be reproduced using the manufacturers' templates for 10 femoral stems in common use in the UK. A series of 23 consecutive preoperative radiographs from patients who had undergone unilateral total hip replacement for unilateral osteoarthritis of the hip was employed. The change in offset between the templated position of the best-fitting template and the anatomical centre of the hip was measured. The templates were then ranked according to their ability to reproduce the normal anatomical offset. The most accurate was the CPS-Plus (Root Mean Square Error 2.0 mm) followed in rank order by: C stem (2.16), CPT (2.40), Exeter (3.23), Stanmore (3.28), Charnley (3.65), Corail (3.72), ABG II (4.30), Furlong HAC (5.08) and Furlong modular (7.14). A similar pattern of results was achieved when the standard error of variability of offset was analysed. We observed a wide variation in the ability of the femoral prosthesis templates to reproduce normal femoral offset. This variation was independent of the seniority of the observer. The templates of modern polished tapered stems with high modularity were best able to reproduce femoral offset. The current move towards digitisation of X-rays may offer manufacturers an opportunity to improve template designs in certain instances, and to develop appropriate computer software.
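
    As a minimal sketch of the ranking metric used in the abstract (root-mean-square error between templated and anatomical femoral offset), the code below uses hypothetical offsets and template names.

```python
# Hedged sketch: rank templates by RMS error between templated and anatomical
# femoral offset. Offsets (mm) and stem names are hypothetical.
import numpy as np

def rmse(templated_mm, anatomical_mm):
    d = np.asarray(templated_mm, float) - np.asarray(anatomical_mm, float)
    return float(np.sqrt(np.mean(d ** 2)))

anatomical = np.array([42.0, 45.5, 38.0, 41.0])           # hypothetical anatomical offsets
templated = {"stem_A": [43.5, 44.0, 39.5, 40.0],           # hypothetical templated offsets
             "stem_B": [46.0, 48.0, 42.5, 44.0]}
for name in sorted(templated, key=lambda k: rmse(templated[k], anatomical)):
    print(name, round(rmse(templated[name], anatomical), 2), "mm RMSE")
```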

  7. Accurate Nanoscale Crystallography in Real-Space Using Scanning Transmission Electron Microscopy.

    PubMed

    Dycus, J Houston; Harris, Joshua S; Sang, Xiahan; Fancher, Chris M; Findlay, Scott D; Oni, Adedapo A; Chan, Tsung-Ta E; Koch, Carl C; Jones, Jacob L; Allen, Leslie J; Irving, Douglas L; LeBeau, James M

    2015-08-01

    Here, we report reproducible and accurate measurement of crystallographic parameters using scanning transmission electron microscopy. This is made possible by removing drift and residual scan distortion. We demonstrate real-space lattice parameter measurements with <0.1% error for complex-layered chalcogenides Bi2Te3, Bi2Se3, and a Bi2Te2.7Se0.3 nanostructured alloy. Pairing the technique with atomic resolution spectroscopy, we connect local structure with chemistry and bonding. Combining these results with density functional theory, we show that the incorporation of Se into Bi2Te3 causes charge redistribution that anomalously increases the van der Waals gap between building blocks of the layered structure. The results show that atomic resolution imaging with electrons can accurately and robustly quantify crystallography at the nanoscale.

  8. Accurate Modeling of Galaxy Clustering on Small Scales: Testing the Standard ΛCDM + Halo Model

    NASA Astrophysics Data System (ADS)

    Sinha, Manodeep; Berlind, Andreas A.; McBride, Cameron; Scoccimarro, Roman

    2015-01-01

    The large-scale distribution of galaxies can be explained fairly simply by assuming (i) a cosmological model, which determines the dark matter halo distribution, and (ii) a simple connection between galaxies and the halos they inhabit. This conceptually simple framework, called the halo model, has been remarkably successful at reproducing the clustering of galaxies on all scales, as observed in various galaxy redshift surveys. However, none of these previous studies have carefully modeled the systematics and thus truly tested the halo model in a statistically rigorous sense. We present a new accurate and fully numerical halo model framework and test it against clustering measurements from two luminosity samples of galaxies drawn from the SDSS DR7. We show that the simple ΛCDM cosmology + halo model is not able to simultaneously reproduce the galaxy projected correlation function and the group multiplicity function. In particular, the more luminous sample shows significant tension with theory. We discuss the implications of our findings and how this work paves the way for constraining galaxy formation by accurate simultaneous modeling of multiple galaxy clustering statistics.

  9. Analytical Validation of a Highly Quantitative, Sensitive, Accurate, and Reproducible Assay (HERmark®) for the Measurement of HER2 Total Protein and HER2 Homodimers in FFPE Breast Cancer Tumor Specimens

    PubMed Central

    Larson, Jeffrey S.; Goodman, Laurie J.; Tan, Yuping; Defazio-Eli, Lisa; Paquet, Agnes C.; Cook, Jennifer W.; Rivera, Amber; Frankson, Kristi; Bose, Jolly; Chen, Lili; Cheung, Judy; Shi, Yining; Irwin, Sarah; Kiss, Linda D. B.; Huang, Weidong; Utter, Shannon; Sherwood, Thomas; Bates, Michael; Weidler, Jodi; Parry, Gordon; Winslow, John; Petropoulos, Christos J.; Whitcomb, Jeannette M.

    2010-01-01

    We report here the results of the analytical validation of assays that measure HER2 total protein (H2T) and HER2 homodimer (H2D) expression in Formalin Fixed Paraffin Embedded (FFPE) breast cancer tumors as well as cell line controls. The assays are based on the VeraTag technology platform and are commercially available through a central CAP-accredited clinical reference laboratory. The accuracy of H2T measurements spans a broad dynamic range (2-3 logs) as evaluated by comparison with cross-validating technologies. The measurement of H2T expression demonstrates a sensitivity that is approximately 7–10 times greater than conventional immunohistochemistry (IHC) (HercepTest). The HERmark assay is a quantitative assay that sensitively and reproducibly measures continuous H2T and H2D protein expression levels and therefore may have the potential to stratify patients more accurately with respect to response to HER2-targeted therapies than current methods which rely on semiquantitative protein measurements (IHC) or on indirect assessments of gene amplification (FISH). PMID:21151530

  10. Periodontal repair in dogs: examiner reproducibility in the supraalveolar periodontal defect model.

    PubMed

    Koo, Ki-Tae; Polimeni, Giuseppe; Albandar, Jasim M; Wikesjö, Ulf M E

    2004-06-01

    Histometric assessments are routinely used to evaluate biologic events ascertained in histologic sections acquired from animal and human studies. The objective of this study was to evaluate the intra- and inter-examiner reproducibility of histometric assessments in the supraalveolar periodontal defect model. Histometric analysis using incandescent and polarized light microscopy, an attached digital camera system, and a PC-based image analysis system including a custom program for the supraalveolar periodontal defect model was performed on histologic sections acquired from one jaw quadrant in each of 12 dogs. The animals had received an experimental protocol including implantation of a coral biomaterial and guided tissue regeneration (GTR) barrier devices, and were evaluated following a 4-week healing interval. Histometric parameters were recorded and repeated within a 3-month interval by two examiners following brief training. Intra- and inter-examiner reproducibility was assessed using the intra-class correlation coefficient (ICC). Most parameters showed high intra-examiner ICCs. Parameters including defect height, connective tissue repair, bone regeneration (height/area), formation of a junctional epithelium, positioning of the GTR device, ankylosis, root resorption, and defect area yielded an ICC ≥ 0.9. The ICCs for bone density and biomaterial density were somewhat lower (0.8 and 0.7, respectively). The inter-examiner reproducibility was somewhat lower compared to the intra-examiner reproducibility. Nevertheless, the ICCs were generally high (ICC range: 0.6-0.9). Histometric evaluations in the supraalveolar periodontal defect model yield highly reproducible results, in particular when a single examiner performs the histometric measurements, even when the examiner was exposed to limited training.
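
    The abstract does not state which ICC form was used; a common choice for agreement studies is the two-way random-effects, single-measure ICC(2,1) of Shrout and Fleiss, sketched below with a hypothetical ratings matrix.

```python
# Hedged sketch of a two-way random-effects, single-measure ICC(2,1) (Shrout & Fleiss);
# the abstract does not state which ICC form the authors used. Data are hypothetical.
import numpy as np

def icc_2_1(ratings):
    """ratings: n_subjects x k_raters (or repeated sessions)."""
    x = np.asarray(ratings, float)
    n, k = x.shape
    grand = x.mean()
    row_means, col_means = x.mean(axis=1), x.mean(axis=0)
    ss_rows = k * np.sum((row_means - grand) ** 2)
    ss_cols = n * np.sum((col_means - grand) ** 2)
    ss_err = np.sum((x - row_means[:, None] - col_means[None, :] + grand) ** 2)
    msr, msc, mse = ss_rows / (n - 1), ss_cols / (k - 1), ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical defect-height measurements (mm): 12 sections measured in two sessions
rng = np.random.default_rng(3)
truth = rng.uniform(4.0, 7.0, 12)
session1 = truth + rng.normal(0, 0.15, 12)
session2 = truth + rng.normal(0, 0.15, 12)
print(round(icc_2_1(np.column_stack([session1, session2])), 3))
```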

  11. Organ-on-a-Chip Technology for Reproducing Multiorgan Physiology.

    PubMed

    Lee, Seung Hwan; Sung, Jong Hwan

    2018-01-01

    In the drug development process, the accurate prediction of drug efficacy and toxicity is important in order to reduce the cost, labor, and effort involved. For this purpose, conventional 2D cell culture models are used in the early phase of drug development. However, the differences between the in vitro and the in vivo systems have caused the failure of drugs in the later phase of the drug-development process. Therefore, there is a need for a novel in vitro model system that can provide accurate information for evaluating the drug efficacy and toxicity through a closer recapitulation of the in vivo system. Recently, the idea of using microtechnology for mimicking the microscale tissue environment has become widespread, leading to the development of "organ-on-a-chip." Furthermore, the system is further developed for realizing a multiorgan model for mimicking interactions between multiple organs. These advancements are still ongoing and are aimed at ultimately developing "body-on-a-chip" or "human-on-a-chip" devices for predicting the response of the whole body. This review summarizes recently developed organ-on-a-chip technologies, and their applications for reproducing multiorgan functions. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Accurate Vibrational-Rotational Parameters and Infrared Intensities of 1-Bromo-1-fluoroethene: A Joint Experimental Analysis and Ab Initio Study.

    PubMed

    Pietropolli Charmet, Andrea; Stoppa, Paolo; Giorgianni, Santi; Bloino, Julien; Tasinato, Nicola; Carnimeo, Ivan; Biczysko, Malgorzata; Puzzarini, Cristina

    2017-05-04

    The medium-resolution gas-phase infrared (IR) spectra of 1-bromo-1-fluoroethene (BrFC=CH2, 1,1-C2H2BrF) were investigated in the range 300-6500 cm⁻¹, and the vibrational analysis led to the assignment of all fundamentals as well as many overtone and combination bands up to three quanta, thus giving an accurate description of its vibrational structure. Integrated band intensity data were determined with high precision from the measurements of their corresponding absorption cross sections. The vibrational analysis was supported by high-level ab initio investigations. CCSD(T) computations accounting for extrapolation to the complete basis set and core correlation effects were employed to accurately determine the molecular structure and harmonic force field. The latter was then coupled to B2PLYP and MP2 computations in order to account for mechanical and electrical anharmonicities. Second-order perturbative vibrational theory was then applied to the thus obtained hybrid force fields to support the experimental assignment of the IR spectra.

  13. High-Reproducibility and High-Accuracy Method for Automated Topic Classification

    NASA Astrophysics Data System (ADS)

    Lancichinetti, Andrea; Sirer, M. Irmak; Wang, Jane X.; Acuna, Daniel; Körding, Konrad; Amaral, Luís A. Nunes

    2015-01-01

    Much of human knowledge sits in large databases of unstructured text. Leveraging this knowledge requires algorithms that extract and record metadata on unstructured text documents. Assigning topics to documents will enable intelligent searching, statistical characterization, and meaningful classification. Latent Dirichlet allocation (LDA) is the state of the art in topic modeling. Here, we perform a systematic theoretical and numerical analysis that demonstrates that current optimization techniques for LDA often yield results that are not accurate in inferring the most suitable model parameters. Adapting approaches from community detection in networks, we propose a new algorithm that displays high reproducibility and high accuracy and also has high computational efficiency. We apply it to a large set of documents in the English Wikipedia and reveal its hierarchical structure.
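
    As a hedged illustration of the reproducibility concern raised in the abstract (not the authors' algorithm), the sketch below fits scikit-learn's LDA twice with different seeds and matches topics across runs by cosine similarity; the toy corpus is hypothetical.

```python
# Hedged sketch: fit LDA with two different random seeds and check how well the
# inferred topics line up (Hungarian matching on topic-word cosine similarity).
# This only illustrates the run-to-run reproducibility issue; the corpus is hypothetical.
import numpy as np
from scipy.optimize import linear_sum_assignment
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = ["galaxy cluster dark matter survey", "protein assay antibody breast cancer",
        "qubit phase estimation trapped ion", "halo model galaxy clustering",
        "her2 protein tumor specimen assay", "ion qubit gate tomography"] * 20
X = CountVectorizer().fit_transform(docs)

def topics(seed, k=3):
    lda = LatentDirichletAllocation(n_components=k, random_state=seed).fit(X)
    t = lda.components_
    return t / t.sum(axis=1, keepdims=True)        # normalised topic-word distributions

a, b = topics(0), topics(1)
sim = a @ b.T / (np.linalg.norm(a, axis=1)[:, None] * np.linalg.norm(b, axis=1))
row, col = linear_sum_assignment(-sim)             # best one-to-one topic matching
print("matched topic cosine similarities:", np.round(sim[row, col], 3))
```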

  14. Correlative imaging across microscopy platforms using the fast and accurate relocation of microscopic experimental regions (FARMER) method

    NASA Astrophysics Data System (ADS)

    Huynh, Toan; Daddysman, Matthew K.; Bao, Ying; Selewa, Alan; Kuznetsov, Andrey; Philipson, Louis H.; Scherer, Norbert F.

    2017-05-01

    Imaging specific regions of interest (ROIs) of nanomaterials or biological samples with different imaging modalities (e.g., light and electron microscopy) or at subsequent time points (e.g., before and after off-microscope procedures) requires relocating the ROIs. Unfortunately, relocation is typically difficult and very time consuming to achieve. Previously developed techniques involve the fabrication of arrays of features, the procedures for which are complex, and the added features can interfere with imaging the ROIs. We report the Fast and Accurate Relocation of Microscopic Experimental Regions (FARMER) method, which only requires determining the coordinates of 3 (or more) conspicuous reference points (REFs) and employs an algorithm based on geometric operators to relocate ROIs in subsequent imaging sessions. The 3 REFs can be quickly added to various regions of a sample using simple tools (e.g., permanent markers or conductive pens) and do not interfere with the ROIs. The coordinates of the REFs and the ROIs are obtained in the first imaging session (on a particular microscope platform) using an accurate and precise encoded motorized stage. In subsequent imaging sessions, the FARMER algorithm finds the new coordinates of the ROIs (on the same or different platforms), using the coordinates of the manually located REFs and the previously recorded coordinates. FARMER is convenient, fast (3-15 min/session, at least 10-fold faster than manual searches), accurate (4.4 μm average error on a microscope with a 100x objective), and precise (almost all errors are <8 μm), even with deliberate rotating and tilting of the sample well beyond normal repositioning accuracy. We demonstrate this versatility by imaging and re-imaging a diverse set of samples and imaging methods: live mammalian cells at different time points; fixed bacterial cells on two microscopes with different imaging modalities; and nanostructures on optical and electron microscopes. FARMER can be readily
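
    The abstract does not spell out the geometric operators; a common way to relocate ROIs from 3 (or more) reference points is to fit a 2-D affine map between the two sessions' stage coordinates, sketched below with hypothetical coordinates.

```python
# Hedged sketch of ROI relocation from reference points (a generic affine fit,
# not necessarily the FARMER operators). All coordinates are hypothetical.
import numpy as np

def fit_affine(ref_old, ref_new):
    """Least-squares 2-D affine map sending ref_old (N>=3, 2) onto ref_new (N, 2)."""
    ref_old, ref_new = np.asarray(ref_old, float), np.asarray(ref_new, float)
    A = np.hstack([ref_old, np.ones((len(ref_old), 1))])    # rows: [x, y, 1]
    M, *_ = np.linalg.lstsq(A, ref_new, rcond=None)          # (3, 2) affine parameters
    return M

def relocate(rois_old, M):
    rois_old = np.asarray(rois_old, float)
    return np.hstack([rois_old, np.ones((len(rois_old), 1))]) @ M

# Hypothetical stage coordinates (micrometres) of the 3 reference marks in two sessions
refs_session1 = [[0.0, 0.0], [5000.0, 100.0], [200.0, 4000.0]]
refs_session2 = [[120.0, -80.0], [5108.0, 45.0], [330.0, 3915.0]]   # sample re-mounted
M = fit_affine(refs_session1, refs_session2)
rois_session1 = [[1500.0, 2500.0], [3200.0, 800.0]]
print(np.round(relocate(rois_session1, M), 1))                      # ROIs in session-2 frame
```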

  15. [Reproducibility of subjective refraction measurement].

    PubMed

    Grein, H-J; Schmidt, O; Ritsche, A

    2014-11-01

    Reproducibility of subjective refraction measurement is limited by various factors. The main factors affecting reproducibility include the characteristics of the measurement method and of the subject and the examiner. This article presents the results of a study on this topic, focusing on the reproducibility of subjective refraction measurement in healthy eyes. The results of previous studies are not all presented in the same way by the respective authors and cannot be fully standardized without consulting the original scientific data. To the extent that they are comparable, the results of our study largely correspond with those of previous investigations: During repeated subjective refraction measurement, 95% of the deviation from the mean value was approximately ±0.2 D to ±0.65 D for the spherical equivalent and cylindrical power. The reproducibility of subjective refraction measurement in healthy eyes is limited, even under ideal conditions. Correct assessment of refraction results is only feasible after identifying individual variability. Several measurements are required. Refraction cannot be measured without a tolerance range. The English full-text version of this article is available at SpringerLink (under supplemental).

  16. Reproducibility in light microscopy: Maintenance, standards and SOPs.

    PubMed

    Deagle, Rebecca C; Wee, Tse-Luen Erika; Brown, Claire M

    2017-08-01

    Light microscopy has grown to be a valuable asset in both the physical and life sciences. It is a highly quantitative method available in individual research laboratories and often centralized in core facilities. However, although quantitative microscopy is becoming a customary tool in research, it is rarely standardized. To achieve accurate quantitative microscopy data and reproducible results, three levels of standardization must be considered: (1) aspects of the microscope, (2) the sample, and (3) the detector. The accuracy of the data is only as reliable as the imaging system itself, thereby imposing the need for routine standard performance testing. Depending on the task, some maintenance procedures should be performed once a month, some before each imaging session, and others annually. This text is intended as a resource for researchers to integrate with their own standard operating procedures to ensure the highest quality quantitative microscopy data. Copyright © 2017. Published by Elsevier Ltd.

  17. Reproducing Epidemiologic Research and Ensuring Transparency.

    PubMed

    Coughlin, Steven S

    2017-08-15

    Measures for ensuring that epidemiologic studies are reproducible include making data sets and software available to other researchers so they can verify published findings, conduct alternative analyses of the data, and check for statistical errors or programming errors. Recent developments related to the reproducibility and transparency of epidemiologic studies include the creation of a global platform for sharing data from clinical trials and the anticipated future extension of the global platform to non-clinical trial data. Government agencies and departments such as the US Department of Veterans Affairs Cooperative Studies Program have also enhanced their data repositories and data sharing resources. The Institute of Medicine and the International Committee of Medical Journal Editors released guidance on sharing clinical trial data. The US National Institutes of Health has updated their data-sharing policies. In this issue of the Journal, Shepherd et al. (Am J Epidemiol. 2017;186:387-392) outline a pragmatic approach for reproducible research with sensitive data for studies for which data cannot be shared because of legal or ethical restrictions. Their proposed quasi-reproducible approach facilitates the dissemination of statistical methods and codes to independent researchers. Both reproducibility and quasi-reproducibility can increase transparency for critical evaluation, further dissemination of study methods, and expedite the exchange of ideas among researchers. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  18. Cyberinfrastructure to Support Collaborative and Reproducible Computational Hydrologic Modeling

    NASA Astrophysics Data System (ADS)

    Goodall, J. L.; Castronova, A. M.; Bandaragoda, C.; Morsy, M. M.; Sadler, J. M.; Essawy, B.; Tarboton, D. G.; Malik, T.; Nijssen, B.; Clark, M. P.; Liu, Y.; Wang, S. W.

    2017-12-01

    Creating cyberinfrastructure to support reproducibility of computational hydrologic models is an important research challenge. Addressing this challenge requires open and reusable code and data with machine and human readable metadata, organized in ways that allow others to replicate results and verify published findings. Specific digital objects that must be tracked for reproducible computational hydrologic modeling include (1) raw initial datasets, (2) data processing scripts used to clean and organize the data, (3) processed model inputs, (4) model results, and (5) the model code with an itemization of all software dependencies and computational requirements. HydroShare is a cyberinfrastructure under active development designed to help users store, share, and publish digital research products in order to improve reproducibility in computational hydrology, with an architecture supporting hydrologic-specific resource metadata. Researchers can upload data required for modeling, add hydrology-specific metadata to these resources, and use the data directly within HydroShare.org for collaborative modeling using tools like CyberGIS, Sciunit-CLI, and JupyterHub that have been integrated with HydroShare to run models using notebooks, Docker containers, and cloud resources. Current research aims to implement the Structure For Unifying Multiple Modeling Alternatives (SUMMA) hydrologic model within HydroShare to support hypothesis-driven hydrologic modeling while also taking advantage of the HydroShare cyberinfrastructure. The goal of this integration is to create the cyberinfrastructure that supports hypothesis-driven model experimentation, education, and training efforts by lowering barriers to entry, reducing the time spent on informatics technology and software development, and supporting collaborative research within and across research groups.

  19. Accurate bulk density determination of irregularly shaped translucent and opaque aerogels

    NASA Astrophysics Data System (ADS)

    Petkov, M. P.; Jones, S. M.

    2016-05-01

    We present a volumetric method for accurate determination of bulk density of aerogels, calculated from the extrapolated weight of the dry pure solid and volume estimates based on the Archimedes' principle of volume displacement, using packed 100 μm-sized monodispersed glass spheres as a "quasi-fluid" medium. Hard particle packing theory is invoked to demonstrate the reproducibility of the apparent density of the quasi-fluid. Accuracy rivaling that of the refractive index method is demonstrated for both translucent and opaque aerogels with different absorptive properties, as well as for aerogels with regular and irregular shapes.
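
    A hedged sketch of the displacement arithmetic implied by the abstract (the published procedure may differ in detail): the sample volume is the mass of packed-sphere "quasi-fluid" it displaces divided by the quasi-fluid's reproducible apparent density. All numbers below are hypothetical.

```python
# Hedged sketch of the quasi-fluid displacement arithmetic; all values are hypothetical
# and only illustrate how a bulk density could be derived from the quantities named
# in the abstract.
rho_apparent = 1.52             # g/cm^3, apparent density of the packed 100 um spheres
m_spheres_empty = 45.60         # g of spheres filling the reference container alone
m_spheres_with_sample = 42.91   # g of spheres filling the container around the aerogel
m_sample_dry = 0.310            # g, extrapolated dry mass of the aerogel

volume_sample = (m_spheres_empty - m_spheres_with_sample) / rho_apparent  # cm^3 displaced
bulk_density = m_sample_dry / volume_sample
print(f"V = {volume_sample:.3f} cm^3, bulk density = {bulk_density:.3f} g/cm^3")
```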

  20. The Rényi divergence enables accurate and precise cluster analysis for localisation microscopy.

    PubMed

    Staszowska, Adela D; Fox-Roberts, Patrick; Hirvonen, Liisa M; Peddie, Christopher J; Collinson, Lucy M; Jones, Gareth E; Cox, Susan

    2018-06-01

    Clustering analysis is a key technique for quantitatively characterising structures in localisation microscopy images. To build up accurate information about biological structures, it is critical that the quantification is both accurate (close to the ground truth) and precise (has small scatter and is reproducible). Here we describe how the Rényi divergence can be used for cluster radius measurements in localisation microscopy data. We demonstrate that the Rényi divergence can operate with high levels of background and provides results which are more accurate than Ripley's functions, Voronoi tessellation or DBSCAN. Data supporting this research will be made accessible via a web link. Software codes developed for this work can be accessed via http://coxphysics.com/Renyi_divergence_software.zip. Implemented in C++. Correspondence and requests for materials can also be addressed to the corresponding author: adela.staszowska@gmail.com or susan.cox@kcl.ac.uk. Supplementary data are available at Bioinformatics online.
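
    For reference, the standard definition of the Rényi divergence of order α between discrete distributions P and Q is given below; how the paper maps it onto cluster-radius estimation is described in the full text, not the abstract.

```latex
% Standard definition of the R\'enyi divergence of order \alpha between discrete
% distributions P and Q (the cluster-radius mapping is detailed in the paper itself):
\[
  D_{\alpha}(P \,\|\, Q) \;=\; \frac{1}{\alpha - 1}
  \log \sum_{i} p_i^{\alpha}\, q_i^{\,1-\alpha},
  \qquad \alpha > 0,\ \alpha \neq 1,
\]
\[
  \lim_{\alpha \to 1} D_{\alpha}(P \,\|\, Q) \;=\; \sum_{i} p_i \log \frac{p_i}{q_i}
  \quad \text{(the Kullback--Leibler divergence).}
\]
```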

  1. Reproducible research: a minority opinion

    NASA Astrophysics Data System (ADS)

    Drummond, Chris

    2018-01-01

    Reproducible research, a growing movement within many scientific fields, including machine learning, would require that the code used to generate the experimental results be published along with any paper. Probably the most compelling argument for this is that it is simply following good scientific practice, established over the years by the greats of science. The implication is that failure to follow such a practice is unscientific, not a label any machine learning researchers would like to carry. It is further claimed that misconduct is causing a growing crisis of confidence in science and that, without this practice being enforced, science would inevitably fall into disrepute. This viewpoint is becoming ubiquitous, but here I offer a differing opinion. I argue that, far from being central to science, what is being promulgated is a narrow interpretation of how science works. I contend that the consequences are somewhat overstated. I would also contend that the effort necessary to meet the movement's aims, and the general attitude it engenders, would not serve any of the research disciplines well, including our own.

  2. Towards interoperable and reproducible QSAR analyses: Exchange of datasets.

    PubMed

    Spjuth, Ola; Willighagen, Egon L; Guha, Rajarshi; Eklund, Martin; Wikberg, Jarl Es

    2010-06-30

    QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaborations and re-use of data. We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusions regarding descriptors by defining them crisply. This makes it easy to join, extend, and combine datasets and hence work collectively, but

  3. Towards interoperable and reproducible QSAR analyses: Exchange of datasets

    PubMed Central

    2010-01-01

    Background: QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaborations and re-use of data. Results: We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Conclusions: Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusion regarding descriptors by defining them crisply. This makes it easy to join, extend, and combine datasets

  4. Intra-observer reproducibility and interobserver reliability of the radiographic parameters in the Spinal Deformity Study Group's AIS Radiographic Measurement Manual.

    PubMed

    Dang, Natasha Radhika; Moreau, Marc J; Hill, Douglas L; Mahood, James K; Raso, James

    2005-05-01

    Retrospective cross-sectional assessment of the reproducibility and reliability of radiographic parameters. To measure the intra-examiner and interexaminer reproducibility and reliability of salient radiographic features. The management and treatment of adolescent idiopathic scoliosis (AIS) depends on accurate and reproducible radiographic measurements of the deformity. Ten sets of radiographs were randomly selected from a sample of patients with AIS, with initial curves between 20 degrees and 45 degrees. Fourteen measures of the deformity were measured from posteroanterior and lateral radiographs by 2 examiners, and were repeated 5 times at intervals of 3-5 days. Intra-examiner and interexaminer differences were examined. The parameters include measures of curve size, spinal imbalance, sagittal kyphosis and alignment, maximum apical vertebral rotation, T1 tilt, spondylolysis/spondylolisthesis, and skeletal age. Intra-examiner reproducibility was generally excellent for parameters measured from the posteroanterior radiographs but only fair to good for parameters from the lateral radiographs, in which some landmarks were not clearly visible. Of the 13 parameters observed, 7 had excellent interobserver reliability. The measurements from the lateral radiograph were less reproducible and reliable and, thus, may not add value to the assessment of AIS. Taking additional measures encourages a systematic and comprehensive assessment of spinal radiographs.

  5. Nonexposure Accurate Location K-Anonymity Algorithm in LBS

    PubMed Central

    2014-01-01

    This paper tackles location privacy protection in current location-based services (LBS) where mobile users have to report their exact location information to an LBS provider in order to obtain their desired services. Location cloaking has been proposed and well studied to protect user privacy. It blurs the user's accurate coordinates and replaces them with a well-shaped cloaked region. However, to obtain such an anonymous spatial region (ASR), nearly all existing cloaking algorithms require knowing the accurate locations of all users. Therefore, location cloaking without exposing the user's accurate location to any party is urgently needed. In this paper, we present two such nonexposure accurate location cloaking algorithms. They are designed for K-anonymity, and cloaking is performed based on the identifications (IDs) of the grid areas reported by all the users, instead of directly on their accurate coordinates. Experimental results show that our algorithms are more secure than existing cloaking algorithms, do not require all users to report their locations all the time, and can generate smaller ASRs. PMID:24605060
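
    To make the grid-ID idea in this record concrete, the hedged sketch below grows a square block of grid cells around a user's cell until it covers at least K reported users, so only coarse cell IDs, never exact coordinates, leave the device. The grid size, the square growth pattern and all coordinates are illustrative assumptions, not the paper's actual algorithms.

      from collections import Counter

      def grid_id(x, y, cell):
          """Map a coordinate to a coarse grid-cell ID; only this ID is shared."""
          return (int(x // cell), int(y // cell))

      def cloak(user_cells, target_cell, k, max_radius=64):
          """Grow a square block of grid cells around target_cell until it covers
          at least k reported users, then return that block as the cloaked region.
          Illustrative only; the paper's algorithms are more elaborate."""
          counts = Counter(user_cells)
          for radius in range(max_radius + 1):
              block = [(target_cell[0] + dx, target_cell[1] + dy)
                       for dx in range(-radius, radius + 1)
                       for dy in range(-radius, radius + 1)]
              if sum(counts[c] for c in block) >= k:
                  return block
          raise ValueError("not enough users to form a K-anonymous region")

      # Hypothetical reported grid IDs (no exact coordinates are ever shared).
      cells = [grid_id(x, y, cell=100.0) for x, y in
               [(120, 340), (130, 360), (95, 410), (510, 220), (140, 330)]]
      print(cloak(cells, target_cell=cells[0], k=3))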

  6. A numerical model to reproduce squeaking of ceramic-on-ceramic total hip arthroplasty. Influence of design and material.

    PubMed

    Piriou, P; Ouenzerfi, G; Migaud, H; Renault, E; Massi, F; Serrault, M

    2016-06-01

    Modern ceramic-on-ceramic (CoC) bearings for total hip arthroplasty (THA) have been used in younger patients who expect improved survivorship. However, audible squeaking produced by the implant is an annoying complication. Previous numerical simulations were not able to accurately reproduce in vitro and in vivo observations. Therefore, we developed a finite element model to: (1) reproduce in vitro squeaking and validate the model by comparing it with in vivo recordings, (2) determine why there are differences between in vivo and in vitro squeaking frequencies, (3) identify the stem's role in this squeaking, (4) predict which designs and materials are more likely to produce squeaking. A CoC THA numerical model can be developed that reproduces the squeaking frequencies observed in vivo. Numerical methods (finite element analysis [ANSYS]) and experimental methods (using a non-lubricated simulated hip with a cementless 32 mm CoC THA) were developed to reproduce squeaking. Numerical analysis was performed to identify the frequencies that cause vibrations perceived as an acoustic emission. The finite element analysis (FEA) model was enhanced by adjusting periprosthetic bone and soft tissue elements in order to reproduce the squeaking frequencies recorded in vivo. A numerical method (complex eigenvalue analysis) was used to find the acoustic frequencies of the squeaking noise. The frequencies obtained from the model and the hip simulator were compared to those recorded in vivo. The numerical results were validated by experiments with the laboratory hip simulator. The frequencies obtained (mean 2790 Hz with FEA, 2755 Hz with the simulator, decreasing to 1759 Hz when bone and soft tissue were included in the FEA) were consistent with those of squeaking hips recorded in vivo (1521 Hz). The cup and ceramic insert were the source of the vibration, but had little influence on the diffusion of the noise required to make the squeaking audible to the human ear. The FEA showed that diffusion of squeaking

  7. Pauling's electronegativity equation and a new corollary accurately predict bond dissociation enthalpies and enhance current understanding of the nature of the chemical bond.

    PubMed

    Matsunaga, Nikita; Rogers, Donald W; Zavitsas, Andreas A

    2003-04-18

    Contrary to other recent reports, Pauling's original electronegativity equation, applied as Pauling specified, describes quite accurately homolytic bond dissociation enthalpies of common covalent bonds, including highly polar ones, with an average deviation of +/-1.5 kcal mol(-1) from literature values for 117 such bonds. Dissociation enthalpies are presented for more than 250 bonds, including 79 for which experimental values are not available. Some previous evaluations of accuracy gave misleadingly poor results by applying the equation to cases for which it was not derived and for which it should not reproduce experimental values. Properly interpreted, the results of the equation provide new and quantitative insights into many facets of chemistry such as radical stabilities, factors influencing reactivity in electrophilic aromatic substitutions, the magnitude of steric effects, conjugative stabilization in unsaturated systems, rotational barriers, molecular and electronic structure, and aspects of autoxidation. A new corollary of the original equation expands its applicability and provides a rationale for previously observed empirical correlations. The equation raises doubts about a new bonding theory. Hydrogen is unique in that its electronegativity is not constant.
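
    For readers unfamiliar with the equation being discussed, the sketch below applies one common statement of Pauling's electronegativity relation (the arithmetic-mean form with the usual 23 kcal/mol per squared electronegativity unit) to a single textbook example; the abstract does not specify exactly which form or parameterisation the authors use, and the input numbers are approximate literature values, so treat this only as an orientation.

      def pauling_bde(d_aa, d_bb, chi_a, chi_b):
          """Pauling-type estimate of the A-B bond dissociation enthalpy (kcal/mol)
          from the homonuclear bond enthalpies and the electronegativity difference.
          Arithmetic-mean form: D(A-B) = 0.5*(D(A-A) + D(B-B)) + 23*(chi_A - chi_B)**2."""
          return 0.5 * (d_aa + d_bb) + 23.0 * (chi_a - chi_b) ** 2

      # Approximate literature values (kcal/mol; Pauling electronegativities).
      d_hh, d_clcl = 104.2, 58.0
      chi_h, chi_cl = 2.20, 3.16
      print(f"Predicted D(H-Cl) ~ {pauling_bde(d_hh, d_clcl, chi_h, chi_cl):.1f} kcal/mol "
            "(the experimental value is roughly 103 kcal/mol)")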

  8. Appearance Constrained Semi-Automatic Segmentation from DCE-MRI is Reproducible and Feasible for Breast Cancer Radiomics: A Feasibility Study.

    PubMed

    Veeraraghavan, Harini; Dashevsky, Brittany Z; Onishi, Natsuko; Sadinski, Meredith; Morris, Elizabeth; Deasy, Joseph O; Sutton, Elizabeth J

    2018-03-19

    We present a segmentation approach that combines GrowCut (GC) with a cancer-specific multi-parametric Gaussian Mixture Model (GCGMM) to produce accurate and reproducible segmentations. We evaluated GCGMM using a retrospectively collected set of 75 invasive ductal carcinomas with ERPR+ HER2- (n = 15), triple negative (TN) (n = 9), and ER-HER2+ (n = 57) cancers with variable presentation (mass and non-mass enhancement) and background parenchymal enhancement (mild and marked). Expert-delineated manual contours were used to assess the segmentation performance using the Dice coefficient (DSC), mean surface distance (mSD), Hausdorff distance, and volume ratio (VR). GCGMM segmentations were significantly more accurate than GrowCut (GC) and fuzzy c-means clustering (FCM). GCGMM's segmentations and the texture features computed from those segmentations were the most reproducible compared with manual delineations and the other analyzed segmentation methods. Finally, a random forest (RF) classifier trained with leave-one-out cross-validation using features extracted from the GCGMM segmentation resulted in the best accuracy for ER-HER2+ vs. ERPR+/TN (GCGMM 0.95, expert 0.95, GC 0.90, FCM 0.92) and for ERPR + HER2- vs. TN (GCGMM 0.92, expert 0.91, GC 0.77, FCM 0.83).
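
    The GCGMM method itself is not reproduced here, but its Gaussian-mixture ingredient can be pictured with a hedged sketch: fit a two-component mixture to synthetic multi-parametric voxel features and label the brighter component as lesion-like. All feature values, component counts and thresholds below are invented for the example and are not drawn from the study.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(1)
      # Synthetic multi-parametric DCE-MRI voxel features (e.g. early/late enhancement):
      # one "parenchyma" mode and one brighter "lesion" mode.
      background = rng.normal([0.2, 0.1], 0.05, size=(2000, 2))
      lesion = rng.normal([0.7, 0.5], 0.08, size=(300, 2))
      voxels = np.vstack([background, lesion])

      gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
      labels = gmm.fit_predict(voxels)

      # Call the component with the higher mean enhancement the lesion class.
      lesion_component = int(np.argmax(gmm.means_[:, 0]))
      print("voxels assigned to the lesion component:", int(np.sum(labels == lesion_component)))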

  9. Thou Shalt Be Reproducible! A Technology Perspective

    PubMed Central

    Mair, Patrick

    2016-01-01

    This article elaborates on reproducibility in psychology from a technological viewpoint. Modern open source computational environments that foster reproducibility throughout the whole research life cycle, and to which emerging psychology researchers should be sensitized, are shown and explained. First, data archiving platforms that make datasets publicly available are presented. Second, R is advocated as the data-analytic lingua franca in psychology for achieving reproducible statistical analysis. Third, dynamic report generation environments for writing reproducible manuscripts that integrate text, data analysis, and statistical outputs such as figures and tables in a single document are described. Supplementary materials are provided in order to get the reader started with these technologies. PMID:27471486

  10. Accurate virus quantitation using a Scanning Transmission Electron Microscopy (STEM) detector in a scanning electron microscope.

    PubMed

    Blancett, Candace D; Fetterer, David P; Koistinen, Keith A; Morazzani, Elaine M; Monninger, Mitchell K; Piper, Ashley E; Kuehl, Kathleen A; Kearney, Brian J; Norris, Sarah L; Rossi, Cynthia A; Glass, Pamela J; Sun, Mei G

    2017-10-01

    A method for accurate quantitation of virus particles has long been sought, but a perfect method still eludes the scientific community. Electron Microscopy (EM) quantitation is a valuable technique because it provides direct morphology information and counts of all viral particles, whether or not they are infectious. In the past, EM negative stain quantitation methods have been cited as inaccurate, non-reproducible, and with detection limits that were too high to be useful. To improve accuracy and reproducibility, we have developed a method termed Scanning Transmission Electron Microscopy - Virus Quantitation (STEM-VQ), which simplifies sample preparation and uses a high throughput STEM detector in a Scanning Electron Microscope (SEM) coupled with commercially available software. In this paper, we demonstrate STEM-VQ with an alphavirus stock preparation to present the method's accuracy and reproducibility, including a comparison of STEM-VQ to viral plaque assay and the ViroCyt Virus Counter. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  11. 10 CFR 95.43 - Authority to reproduce.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... CSA, Secret and Confidential documents may be reproduced. Reproduced copies of classified documents... material must be conspicuously marked with the same classification markings as the material being...

  12. 10 CFR 95.43 - Authority to reproduce.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... CSA, Secret and Confidential documents may be reproduced. Reproduced copies of classified documents... material must be conspicuously marked with the same classification markings as the material being...

  13. 10 CFR 95.43 - Authority to reproduce.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... CSA, Secret and Confidential documents may be reproduced. Reproduced copies of classified documents... material must be conspicuously marked with the same classification markings as the material being...

  14. 10 CFR 95.43 - Authority to reproduce.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... CSA, Secret and Confidential documents may be reproduced. Reproduced copies of classified documents... material must be conspicuously marked with the same classification markings as the material being...

  15. 10 CFR 95.43 - Authority to reproduce.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... CSA, Secret and Confidential documents may be reproduced. Reproduced copies of classified documents... material must be conspicuously marked with the same classification markings as the material being...

  16. Respiratory effort correction strategies to improve the reproducibility of lung expansion measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Du, Kaifang; Reinhardt, Joseph M.; Christensen, Gary E.

    2013-12-15

    correlated with respiratory effort difference (R = 0.744 for ELV in the cohort with tidal volume difference greater than 100 cc). In general for all subjects, global normalization, ETV and ELV significantly improved reproducibility compared to no effort correction (p = 0.009, 0.002, 0.005 respectively). When tidal volume difference was small (less than 100 cc), none of the three effort correction strategies improved reproducibility significantly (p = 0.52, 0.46, 0.46 respectively). For the cohort (N = 13) with tidal volume difference greater than 100 cc, the average gamma pass rate improves from 57.3% before correction to 66.3% after global normalization, and 76.3% after ELV. ELV was found to be significantly better than global normalization (p = 0.04 for all subjects, and p = 0.003 for the cohort with tidal volume difference greater than 100 cc). Conclusions: All effort correction strategies improve the reproducibility of the authors' pulmonary ventilation measures, and the improvement of reproducibility is highly correlated with the changes in respiratory effort. ELV gives better results as the effort difference increases, followed by ETV, then global normalization. However, based on the spatial and temporal heterogeneity in the lung expansion rate, a single scaling factor (e.g., global normalization) appears to be less accurate at correcting the ventilation map when changes in respiratory effort are large.
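
    As a rough illustration of the simplest of the three strategies compared in this record, global normalization can be thought of as rescaling a repeat-scan expansion map by a single tidal-volume ratio; the sketch below does exactly that with made-up numbers. The ETV and ELV corrections described in the record operate on the deformation field itself and are not shown; this is only an interpretation for illustration, not the authors' implementation.

      import numpy as np

      def global_normalization(expansion_map, tidal_volume, reference_tidal_volume):
          """Scale a per-voxel lung-expansion (ventilation) map by a single factor so
          that its overall respiratory effort matches a reference scan.
          A crude sketch of 'global normalization' only."""
          return expansion_map * (reference_tidal_volume / tidal_volume)

      # Hypothetical repeat scan acquired with a ~20% larger tidal volume (cc).
      rng = np.random.default_rng(2)
      repeat_scan = rng.uniform(1.0, 1.5, size=(4, 4))   # Jacobian-like expansion values
      corrected = global_normalization(repeat_scan, tidal_volume=600.0,
                                       reference_tidal_volume=500.0)
      print(corrected.mean() / repeat_scan.mean())        # -> 500/600, about 0.83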

  17. Accounting for reciprocal host-microbiome interactions in experimental science.

    PubMed

    Stappenbeck, Thaddeus S; Virgin, Herbert W

    2016-06-09

    Mammals are defined by their metagenome, a combination of host and microbiome genes. This knowledge presents opportunities to further basic biology with translation to human diseases. However, the now-documented influence of the metagenome on experimental results and the reproducibility of in vivo mammalian models present new challenges. Here we provide the scientific basis for calling on all investigators, editors and funding agencies to embrace changes that will enhance reproducible and interpretable experiments by accounting for metagenomic effects. Implementation of new reporting and experimental design principles will improve experimental work, speed discovery and translation, and properly use substantial investments in biomedical research.

  18. Characterization and reproducibility of HepG2 hanging drop spheroids toxicology in vitro.

    PubMed

    Hurrell, Tracey; Ellero, Andrea Antonio; Masso, Zelie Flavienne; Cromarty, Allan Duncan

    2018-02-21

    Hepatotoxicity remains a major challenge in drug development despite preclinical toxicity screening using hepatocytes of human origin. To overcome some limitations of reproducing the hepatic phenotype, more structurally and functionally authentic cultures in vitro can be introduced by growing cells in 3D spheroid cultures. Characterisation and reproducibility of HepG2 spheroid cultures using a high-throughput hanging drop technique were assessed, and features contributing to potential phenotypic variation were highlighted. Cultured HepG2 cells were seeded into Perfecta 3D® 96-well hanging drop plates and assessed over time for morphology, viability, cell cycle distribution, protein content and protein-mass profiles. Divergent aspects which were assessed included cell stocks, seeding density, volume of culture medium and use of extracellular matrix additives. Hanging drops are advantageous because no complex culture matrix is present, enabling background-free extractions for downstream experimentation. Varying characteristics were observed across cell stocks and batches, seeding density, culture medium volume and extracellular matrix when using immortalized HepG2 cells. These factors contribute to wide-ranging cellular responses and highlight concerns with respect to generating a reproducible phenotype in HepG2 hanging drop spheroids. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. Z-scan theoretical and experimental studies for accurate measurements of the nonlinear refractive index and absorption of optical glasses near damage threshold

    NASA Astrophysics Data System (ADS)

    Olivier, Thomas; Billard, Franck; Akhouayri, Hassan

    2004-06-01

    Self-focusing is one of the dramatic phenomena that may occur during the propagation of a high power laser beam in a nonlinear material. This phenomenon leads to a degradation of the wave front and may also lead to photoinduced damage of the material. Realistic simulations of the propagation of high power laser beams require accurate knowledge of the nonlinear refractive index γ. In the particular case of fused silica and in the nanosecond regime, it seems that electronic mechanisms as well as electrostriction and thermal effects can lead to a significant refractive index variation. Compared to the other methods used to measure this parameter, the Z-scan method is simple, offers good sensitivity and may give absolute measurements if the incident beam is accurately studied. However, this method requires a very good knowledge of the incident beam and of its propagation inside a nonlinear sample. We used a split-step propagation algorithm to simulate Z-scan curves for arbitrary beam shape, sample thickness and nonlinear phase shift. According to our simulations and a rigorous analysis of the Z-scan measured signal, it appears that some unwarranted approximations lead to significant errors. Thus, by reducing possible errors in the interpretation of Z-scan experimental studies, we performed accurate measurements of the nonlinear refractive index of fused silica that show the significant contribution of nanosecond mechanisms.
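
    The record mentions a split-step propagation algorithm for simulating Z-scan curves. The sketch below is a minimal, schematic split-step Fourier propagator for a 2-D field in a Kerr medium: half a diffraction step in Fourier space, a nonlinear phase step in real space, then the second half step. The grid, beam and material parameters, and the use of |E|^2 in place of a properly calibrated intensity, are all assumptions made for illustration, not the authors' validated code.

      import numpy as np

      def split_step_kerr(field, dx, wavelength, n0, n2, dz, steps):
          """Schematic paraxial split-step Fourier propagation through a Kerr medium."""
          n = field.shape[0]
          k0 = 2 * np.pi / wavelength          # vacuum wavenumber
          k = n0 * k0                          # wavenumber in the medium
          fx = np.fft.fftfreq(n, d=dx)
          kx, ky = np.meshgrid(2 * np.pi * fx, 2 * np.pi * fx)
          half_linear = np.exp(-1j * (kx**2 + ky**2) * dz / (4 * k))
          for _ in range(steps):
              field = np.fft.ifft2(np.fft.fft2(field) * half_linear)
              field = field * np.exp(1j * k0 * n2 * np.abs(field)**2 * dz)  # Kerr phase
              field = np.fft.ifft2(np.fft.fft2(field) * half_linear)
          return field

      # Hypothetical Gaussian beam and thin sample; units are schematic, not calibrated.
      n = 256
      x = (np.arange(n) - n / 2) * 2e-6
      xx, yy = np.meshgrid(x, x)
      beam = np.exp(-(xx**2 + yy**2) / (50e-6)**2).astype(complex)
      out = split_step_kerr(beam, dx=2e-6, wavelength=1.064e-6,
                            n0=1.45, n2=1e-16, dz=1e-4, steps=10)
      print(np.abs(out).max())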

  20. Research Reproducibility in Geosciences: Current Landscape, Practices and Perspectives

    NASA Astrophysics Data System (ADS)

    Yan, An

    2016-04-01

    Reproducibility of research can gauge the validity of its findings. Yet we currently lack understanding of how much of a problem research reproducibility is in geosciences. We conducted an online survey of faculty and graduate students in geosciences, and received 136 responses from research institutions and universities in the Americas, Asia, Europe and other parts of the world. This survey examined (1) the current state of research reproducibility in geosciences, by asking about researchers' experiences with unsuccessful replication work and the obstacles that led to their replication failures; (2) the current reproducibility practices in the community, by asking what efforts researchers made to try to reproduce others' work and make their own work reproducible, and what the underlying factors that contribute to irreproducibility are; (3) perspectives on reproducibility, by collecting researchers' thoughts and opinions on this issue. The survey results indicated that nearly 80% of respondents who had ever tried to reproduce a published study had failed at least once. Only one third of the respondents received helpful feedback when they contacted the authors of a published study for data, code, or other information. The primary factors that lead to unsuccessful replication attempts are insufficient details of instructions in published literature, and inaccessibility of the data, code and tools needed in the study. Our findings suggest a remarkable lack of research reproducibility in geoscience. Changing the incentive mechanism in academia, as well as developing policies and tools that facilitate open data and code sharing, are promising ways for the geosciences community to alleviate this reproducibility problem.

  1. How accurately do force fields represent protein side chain ensembles?

    PubMed

    Petrović, Dušan; Wang, Xue; Strodel, Birgit

    2018-05-23

    Although the protein backbone is the most fundamental part of the structure, the fine-tuning of side-chain conformations is important for protein function, for example, in protein-protein and protein-ligand interactions, and also in enzyme catalysis. While several benchmarks testing the performance of protein force fields for side chain properties have already been published, they often considered only a few force fields and were not tested against the same experimental observables; hence, they are not directly comparable. In this work, we explore the ability of twelve force fields, which are different flavors of AMBER, CHARMM, OPLS, or GROMOS, to reproduce average rotamer angles and rotamer populations obtained from extensive NMR studies of the 3J and residual dipolar coupling constants for two small proteins: ubiquitin and GB3. Based on a total of 196 μs of sampling time, our results reveal that all force fields identify the correct side chain angles, while the AMBER and CHARMM force fields clearly outperform the OPLS and GROMOS force fields in estimating rotamer populations. The three best force fields for representing the protein side chain dynamics are AMBER 14SB, AMBER 99SB*-ILDN, and CHARMM36. Furthermore, we observe that the side chain ensembles of buried amino acid residues are generally more accurately represented than those of the surface exposed residues. This article is protected by copyright. All rights reserved. © 2018 Wiley Periodicals, Inc.
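
    As a small post-processing illustration related to this kind of benchmark (not the NMR analysis or any specific force field's toolchain), the sketch below bins a chi1 dihedral time series into the three canonical wells near -60, +60 and 180 degrees and reports their populations; the trajectory here is synthetic and the bin boundaries are an assumption.

      import numpy as np

      def rotamer_populations(chi1_deg):
          """Assign chi1 dihedral samples (degrees) to the three canonical wells
          near -60, +60 and 180 degrees and return their populations.
          A minimal post-processing sketch only."""
          chi1 = np.mod(np.asarray(chi1_deg) + 180.0, 360.0) - 180.0   # wrap to [-180, 180)
          minus = np.mean((chi1 >= -120) & (chi1 < 0))
          plus = np.mean((chi1 >= 0) & (chi1 < 120))
          trans = 1.0 - minus - plus
          return {"minus60": float(minus), "plus60": float(plus), "trans": float(trans)}

      # Hypothetical chi1 trajectory: mostly in the -60 degree well with excursions.
      rng = np.random.default_rng(3)
      traj = np.concatenate([rng.normal(-60, 15, 8000),
                             rng.normal(180, 15, 1500),
                             rng.normal(60, 15, 500)])
      print(rotamer_populations(traj))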

  2. EXPERIMENTAL AND RESEARCH WORK IN NEUTRON DOSIMETRY. Final Summary Report for the Period May 15, 1959-June 15, 1960

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorton, H.C.; Mengali, O.J.; Zacaroli, A.R.

    A practical, prototype silicon p-n junction fast-neutron dosimeter, sensitive in the same range as human tissue, was developed, together with an associated read-out circuit to facilitate the accurate measurement of accumulated dose. From both theoretical and experimental considerations, it was demonstrated that the dosimeter is essentially insensitive to the gamma and thermal components of a uranium fission spectrum. It was shown that accumulated damage effects appear to be environmentally stable up to an ambient temperature of 100 C. A rather marked reversible temperature dependence of the read-out parameters requires either control of the read-out temperature or temperature compensation in the read-out device. A high degree of reproducibility of dosimeter characteristics from one device to another was not achieved. The lack of reproducibility was attributed to uncontrolled variables in the bulk silicon from which the devices are fabricated, and in the production procedure. (auth)

  3. TRIC: an automated alignment strategy for reproducible protein quantification in targeted proteomics

    PubMed Central

    Röst, Hannes L.; Liu, Yansheng; D’Agostino, Giuseppe; Zanella, Matteo; Navarro, Pedro; Rosenberger, George; Collins, Ben C.; Gillet, Ludovic; Testa, Giuseppe; Malmström, Lars; Aebersold, Ruedi

    2016-01-01

    Large scale, quantitative proteomic studies have become essential for the analysis of clinical cohorts, large perturbation experiments and systems biology studies. While next-generation mass spectrometric techniques such as SWATH-MS have substantially increased throughput and reproducibility, ensuring consistent quantification of thousands of peptide analytes across multiple LC-MS/MS runs remains a challenging and laborious manual process. To produce highly consistent and quantitatively accurate proteomics data matrices in an automated fashion, we have developed the TRIC software which utilizes fragment ion data to perform cross-run alignment, consistent peak-picking and quantification for high throughput targeted proteomics. TRIC uses a graph-based alignment strategy based on non-linear retention time correction to integrate peak elution information from all LC-MS/MS runs acquired in a study. When compared to state-of-the-art SWATH-MS data analysis, the algorithm was able to reduce the identification error by more than 3-fold at constant recall, while correcting for highly non-linear chromatographic effects. On a pulsed-SILAC experiment performed on human induced pluripotent stem (iPS) cells, TRIC was able to automatically align and quantify thousands of light and heavy isotopic peak groups and substantially increased the quantitative completeness and biological information in the data, providing insights into protein dynamics of iPS cells. Overall, this study demonstrates the importance of consistent quantification in highly challenging experimental setups, and proposes an algorithm to automate this task, constituting the last missing piece in a pipeline for automated analysis of massively parallel targeted proteomics datasets. PMID:27479329

  4. Accurate experimental and theoretical comparisons between superconductor-insulator-superconductor mixers showing weak and strong quantum effects

    NASA Technical Reports Server (NTRS)

    Mcgrath, W. R.; Richards, P. L.; Face, D. W.; Prober, D. E.; Lloyd, F. L.

    1988-01-01

    A systematic study of the gain and noise in superconductor-insulator-superconductor mixers employing Ta based, Nb based, and Pb-alloy based tunnel junctions was made. These junctions displayed both weak and strong quantum effects at a signal frequency of 33 GHz. The effects of energy gap sharpness and subgap current were investigated and are quantitatively related to mixer performance. Detailed comparisons are made of the mixing results with the predictions of a three-port model approximation to the Tucker theory. Mixer performance was measured with a novel test apparatus which is accurate enough to allow for the first quantitative tests of theoretical noise predictions. It is found that the three-port model of the Tucker theory underestimates the mixer noise temperature by a factor of about 2 for all of the mixers. In addition, predicted values of available mixer gain are in reasonable agreement with experiment when quantum effects are weak. However, as quantum effects become strong, the predicted available gain diverges to infinity, which is in sharp contrast to the experimental results. Predictions of coupled gain do not always show such divergences.

  5. Accurate atomistic potentials and training sets for boron-nitride nanostructures

    NASA Astrophysics Data System (ADS)

    Tamblyn, Isaac

    Boron nitride nanotubes exhibit exceptional structural, mechanical, and thermal properties. They are optically transparent and have high thermal stability, suggesting a wide range of opportunities for structural reinforcement of materials. Modeling can play an important role in determining the optimal approach to integrating nanotubes into a supporting matrix. Developing accurate, atomistic scale models of such nanoscale interfaces embedded within composites is challenging, however, due to the mismatch of length scales involved. Typical nanotube diameters range from 5 to 50 nm, with a length as large as a micron (i.e. a relevant length-scale for structural reinforcement). Unlike for their carbon-based counterparts, well-tested and transferable interatomic force fields are not common for BNNTs. In light of this, we have developed an extensive training database of BN-rich materials, under conditions relevant for BNNT synthesis and composites, based on extensive first principles molecular dynamics simulations. Using these data, we have produced an artificial neural network potential capable of reproducing the accuracy of first principles data at significantly reduced computational cost, allowing for accurate simulation at the much larger length scales needed for composite design.

  6. Interlaboratory Reproducibility of Droplet Digital Polymerase Chain Reaction Using a New DNA Reference Material Format.

    PubMed

    Pinheiro, Leonardo B; O'Brien, Helen; Druce, Julian; Do, Hongdo; Kay, Pippa; Daniels, Marissa; You, Jingjing; Burke, Daniel; Griffiths, Kate; Emslie, Kerry R

    2017-11-07

    Use of droplet digital PCR technology (ddPCR) is expanding rapidly in the diversity of applications and number of users around the world. Access to relatively simple and affordable commercial ddPCR technology has attracted wide interest in use of this technology as a molecular diagnostic tool. For ddPCR to effectively transition to a molecular diagnostic setting, processes for method validation and verification and demonstration of reproducible instrument performance are required. In this study, we describe the development and characterization of a DNA reference material (NMI NA008 High GC reference material) comprising a challenging methylated GC-rich DNA template in a novel 96-well microplate format. A scalable process using high precision acoustic dispensing technology was validated to produce the DNA reference material with a certified reference value expressed in amount of DNA molecules per well. An interlaboratory study, conducted using blinded NA008 High GC reference material to assess reproducibility among seven independent laboratories, demonstrated a reproducibility relative standard deviation of less than 4.5%. With the exclusion of one laboratory, laboratories had appropriate technical competency, fully functional instrumentation, and suitable reagents to perform accurate ddPCR based DNA quantification measurements at the time of the study. The study results confirmed that NA008 High GC reference material is fit for the purpose of being used for quality control of ddPCR systems, consumables, instrumentation, and workflow.
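
    The quantification step that such a reference material is meant to control can be summarised by the standard Poisson correction used in droplet digital PCR: with a fraction p of positive droplets, the mean number of copies per droplet is lambda = -ln(1 - p). The sketch below applies that formula; the droplet volume and the counts are illustrative assumptions, not values from the interlaboratory study.

      import math

      def ddpcr_concentration(positive, total, droplet_volume_nl=0.85):
          """Poisson estimate of target concentration from droplet digital PCR counts.

          positive, total   : numbers of positive and accepted droplets
          droplet_volume_nl : assumed droplet volume in nanolitres (instrument-specific)
          Returns (mean copies per droplet, copies per microlitre of reaction)."""
          p = positive / total
          lam = -math.log(1.0 - p)                           # mean copies per droplet
          copies_per_ul = lam / (droplet_volume_nl * 1e-3)   # nL -> uL
          return lam, copies_per_ul

      # Hypothetical well: 4,500 positive droplets out of 15,000 accepted.
      lam, conc = ddpcr_concentration(4500, 15000)
      print(f"{lam:.3f} copies/droplet, ~{conc:.0f} copies/uL")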

  7. Voxel size dependency, reproducibility and sensitivity of an in vivo bone loading estimation algorithm

    PubMed Central

    Christen, Patrik; Schulte, Friederike A.; Zwahlen, Alexander; van Rietbergen, Bert; Boutroy, Stephanie; Melton, L. Joseph; Amin, Shreyasee; Khosla, Sundeep; Goldhahn, Jörg; Müller, Ralph

    2016-01-01

    A bone loading estimation algorithm was previously developed that provides in vivo loading conditions required for in vivo bone remodelling simulations. The algorithm derives a bone's loading history from its microstructure as assessed by high-resolution (HR) computed tomography (CT). This reverse engineering approach showed accurate and realistic results based on micro-CT and HR-peripheral quantitative CT images. However, its voxel size dependency, reproducibility and sensitivity still need to be investigated, which is the purpose of this study. Voxel size dependency was tested on cadaveric distal radii with micro-CT images scanned at 25 µm and downscaled to 50, 61, 75, 82, 100, 125 and 150 µm. Reproducibility was calculated with repeated in vitro as well as in vivo HR-pQCT measurements at 82 µm. Sensitivity was defined using HR-pQCT images from women with fracture versus non-fracture, and low versus high bone volume fraction, expecting similar and different loading histories, respectively. Our results indicate that the algorithm is voxel size independent within an average (maximum) error of 8.2% (32.9%) at 61 µm, but that the dependency increases considerably at voxel sizes bigger than 82 µm. In vitro and in vivo reproducibility are up to 4.5% and 10.2%, respectively, which is comparable to other in vitro studies and slightly higher than in other in vivo studies. Subjects with different bone volume fraction were clearly distinguished but not subjects with and without fracture. This is in agreement with bone adapting to customary loading but not to fall loads. We conclude that the in vivo bone loading estimation algorithm provides reproducible, sensitive and fairly voxel size independent results at up to 82 µm, but that smaller voxel sizes would be advantageous. PMID:26790999

  8. An experimental, theoretical and event-driven computational study of narrow vibrofluidised granular materials

    NASA Astrophysics Data System (ADS)

    Thornton, Anthony; Windows-Yule, Kit; Parker, David; Luding, Stefan

    2017-06-01

    We review simulations, experiments and a theoretical treatment of vertically vibrated granular media. The systems considered are confined in narrow quasi-two-dimensional and quasi-one-dimensional (column) geometries, where the vertical extension of the container is much larger than one or both horizontal lengths. The additional geometric constraint present in the column setup frustrates the convection state that is normally observed in wider geometries. We start by showing that the Event Driven (ED) simulation method is able to accurately reproduce the previously experimentally determined phase diagram for vibrofluidised granular materials. We then review two papers that used ED simulations to study narrow quasi-one-dimensional systems, revealing a new phenomenon: collective oscillations of the grains with a characteristic frequency that is much lower than the frequency of energy injection. Theoretical work was then undertaken that is able to accurately predict the frequency of such an oscillation, and Positron Emission Particle Tracking (PEPT) experiments were undertaken to provide the first experimental evidence of this new phenomenon. Finally, we briefly discuss ongoing work to create an open-source version of this ED code via its integration into the existing open-source package MercuryDPM (http://MercuryDPM.org), which has many advanced features that are not found in other codes.

  9. Machine Learning of Accurate Energy-Conserving Molecular Force Fields

    NASA Astrophysics Data System (ADS)

    Chmiela, Stefan; Tkatchenko, Alexandre; Sauceda, Huziel; Poltavsky, Igor; Schütt, Kristof; Müller, Klaus-Robert; GDML Collaboration

    Efficient and accurate access to the Born-Oppenheimer potential energy surface (PES) is essential for long time scale molecular dynamics (MD) simulations. Using conservation of energy - a fundamental property of closed classical and quantum mechanical systems - we develop an efficient gradient-domain machine learning (GDML) approach to construct accurate molecular force fields using a restricted number of samples from ab initio MD trajectories (AIMD). The GDML implementation is able to reproduce global potential-energy surfaces of intermediate-size molecules with an accuracy of 0.3 kcal/mol for energies and 1 kcal/mol/Å for atomic forces using only 1000 conformational geometries for training. We demonstrate this accuracy for AIMD trajectories of molecules, including benzene, toluene, naphthalene, malonaldehyde, ethanol, uracil, and aspirin. The challenge of constructing conservative force fields is accomplished in our work by learning in a Hilbert space of vector-valued functions that obey the law of energy conservation. The GDML approach enables quantitative MD simulations for molecules at a fraction of cost of explicit AIMD calculations, thereby allowing the construction of efficient force fields with the accuracy and transferability of high-level ab initio methods.

  10. Robust tissue classification for reproducible wound assessment in telemedicine environments

    NASA Astrophysics Data System (ADS)

    Wannous, Hazem; Treuillet, Sylvie; Lucas, Yves

    2010-04-01

    In telemedicine environments, a standardized and reproducible assessment of wounds, using a simple free-handled digital camera, is an essential requirement. However, to ensure robust tissue classification, particular attention must be paid to the complete design of the color processing chain. We introduce the key steps, including color correction, merging of expert labeling, and segmentation-driven classification based on support vector machines. The tool thus developed ensures stability under changes in lighting conditions, viewpoint, and camera, to achieve accurate and robust classification of skin tissues. Clinical tests demonstrate that such an advanced tool, which forms part of a complete 3-D and color wound assessment system, significantly improves the monitoring of the healing process. It achieves an overlap score of 79.3% against 69.1% for a single expert, after mapping onto the medical reference developed from the image labeling by a college of experts.
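
    The classification stage described in this record can be pictured with a hedged sketch: a support-vector classifier trained on colour features of labelled tissue regions. The three classes, feature values and SVM settings below are invented for illustration and do not reproduce the authors' colour-processing chain or clinical data.

      import numpy as np
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(4)
      # Synthetic per-region colour features (e.g. mean R, G, B after colour correction)
      # for three hypothetical wound-tissue classes.
      granulation = rng.normal([180, 60, 70], 15, size=(100, 3))
      slough = rng.normal([200, 190, 120], 15, size=(100, 3))
      necrosis = rng.normal([50, 40, 40], 15, size=(100, 3))
      X = np.vstack([granulation, slough, necrosis])
      y = np.repeat(["granulation", "slough", "necrosis"], 100)

      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
      clf.fit(X, y)
      print(clf.predict([[185, 65, 75], [55, 45, 42]]))   # expected: granulation, necrosis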

  11. Three Dimensions of Reproducibility in Natural Language Processing.

    PubMed

    Cohen, K Bretonnel; Xia, Jingbo; Zweigenbaum, Pierre; Callahan, Tiffany J; Hargraves, Orin; Goss, Foster; Ide, Nancy; Névéol, Aurélie; Grouin, Cyril; Hunter, Lawrence E

    2018-05-01

    Despite considerable recent attention to problems with reproducibility of scientific research, there is a striking lack of agreement about the definition of the term. That is a problem, because the lack of a consensus definition makes it difficult to compare studies of reproducibility, and thus to have even a broad overview of the state of the issue in natural language processing. This paper proposes an ontology of reproducibility in that field. Its goal is to enhance both future research and communication about the topic, and retrospective meta-analyses. We show that three dimensions of reproducibility, corresponding to three kinds of claims in natural language processing papers, can account for a variety of types of research reports. These dimensions are reproducibility of a conclusion, of a finding, and of a value. Three biomedical natural language processing papers by the authors of this paper are analyzed with respect to these dimensions.

  12. Rotary head type reproducing apparatus

    DOEpatents

    Takayama, Nobutoshi; Edakubo, Hiroo; Kozuki, Susumu; Takei, Masahiro; Nagasawa, Kenichi

    1986-01-01

    In an apparatus of the kind arranged to reproduce, with a plurality of rotary heads, an information signal from a record bearing medium having many recording tracks which are parallel to each other with the information signal recorded therein and with a plurality of different pilot signals of different frequencies also recorded one by one, one in each of the recording tracks, a plurality of different reference signals of different frequencies are simultaneously generated. A tracking error is detected by using the different reference signals together with the pilot signals which are included in signals reproduced from the plurality of rotary heads.

  13. 4D Visualization of Experimental Procedures in Rock Physics

    NASA Astrophysics Data System (ADS)

    Vanorio, T.; di Bonito, C.

    2010-12-01

    Engaging students in laboratory classes in geophysics is becoming more and more difficult. This is primarily because of an ever-widening gap between the less appealing aspects that characterize these courses (e.g., lengthiness of the experimental operations, high student/instrument ratio, limited time associated with lack of previous hands-on experience, and logistical and safety concerns) and the lifestyle of 21st century generations (i.e., extensive exposure to high-tech tools, high-speed communications and computing, 3D graphics and HD videos). To bridge the gap and enhance the teaching strategy of laboratory courses in geophysics, we have created simulator-training tools for use in preparation for the actual experimental phase. We are using a modeling, animation, and rendering package to create (a) 3D models that accurately reproduce actual scenarios and instruments used for the measurement of rock physics properties and (b) 4D interactive animations that simulate hands-on demonstrations of the experimental procedures. We present here a prototype describing step-by-step the experimental protocol and the principles behind the measurement of rock porosity. The tool reproduces an actual helium porosimeter and makes use of interactive animations, guided text, and a narrative voice guiding the audience through the different phases of the experimental process. Our strategy is to make the most of new technologies while preserving the accuracy of classical laboratory methods and practices. These simulations are not intended to replace traditional lab work; rather they provide students with the opportunity for review and repetition. The primary goal is thus to help students familiarize themselves during their earlier curricula with lab methodologies, thus minimizing apparent hesitation and frustration in later classes. This may also increase the level of interest and involvement of undergraduate students and, in turn, enhance their keenness to pursue their

  14. Hydrodynamic characteristics of the two-phase flow field at gas-evolving electrodes: numerical and experimental studies

    NASA Astrophysics Data System (ADS)

    Liu, Cheng-Lin; Sun, Ze; Lu, Gui-Min; Yu, Jian-Guo

    2018-05-01

    The gas-evolving vertical electrode system is a typical electrochemical industrial reactor. Gas bubbles are released from the surfaces of the anode and affect the electrolyte flow pattern and even the cell performance. In the current work, the hydrodynamics induced by the air bubbles in a cold model was experimentally and numerically investigated. Particle image velocimetry and volumetric three-component velocimetry techniques were applied to experimentally visualize the hydrodynamics characteristics and flow fields in a two-dimensional (2D) plane and a three-dimensional (3D) space, respectively. Measurements were performed at different gas rates. Furthermore, the corresponding mathematical model was developed under identical conditions for the qualitative and quantitative analyses. The experimental measurements were compared with the numerical results based on the mathematical model. The study of the time-averaged flow field, three velocity components, instantaneous velocity and turbulent intensity indicates that the numerical model qualitatively reproduces liquid motion. The 3D model predictions capture the flow behaviour more accurately than the 2D model in this study.

  15. Hydrodynamic characteristics of the two-phase flow field at gas-evolving electrodes: numerical and experimental studies.

    PubMed

    Liu, Cheng-Lin; Sun, Ze; Lu, Gui-Min; Yu, Jian-Guo

    2018-05-01

    The gas-evolving vertical electrode system is a typical electrochemical industrial reactor. Gas bubbles are released from the surfaces of the anode and affect the electrolyte flow pattern and even the cell performance. In the current work, the hydrodynamics induced by the air bubbles in a cold model was experimentally and numerically investigated. Particle image velocimetry and volumetric three-component velocimetry techniques were applied to experimentally visualize the hydrodynamics characteristics and flow fields in a two-dimensional (2D) plane and a three-dimensional (3D) space, respectively. Measurements were performed at different gas rates. Furthermore, the corresponding mathematical model was developed under identical conditions for the qualitative and quantitative analyses. The experimental measurements were compared with the numerical results based on the mathematical model. The study of the time-averaged flow field, three velocity components, instantaneous velocity and turbulent intensity indicates that the numerical model qualitatively reproduces liquid motion. The 3D model predictions capture the flow behaviour more accurately than the 2D model in this study.

  16. Hydrodynamic characteristics of the two-phase flow field at gas-evolving electrodes: numerical and experimental studies

    PubMed Central

    Lu, Gui-Min; Yu, Jian-Guo

    2018-01-01

    The gas-evolving vertical electrode system is a typical electrochemical industrial reactor. Gas bubbles are released from the surfaces of the anode and affect the electrolyte flow pattern and even the cell performance. In the current work, the hydrodynamics induced by the air bubbles in a cold model was experimentally and numerically investigated. Particle image velocimetry and volumetric three-component velocimetry techniques were applied to experimentally visualize the hydrodynamics characteristics and flow fields in a two-dimensional (2D) plane and a three-dimensional (3D) space, respectively. Measurements were performed at different gas rates. Furthermore, the corresponding mathematical model was developed under identical conditions for the qualitative and quantitative analyses. The experimental measurements were compared with the numerical results based on the mathematical model. The study of the time-averaged flow field, three velocity components, instantaneous velocity and turbulent intensity indicates that the numerical model qualitatively reproduces liquid motion. The 3D model predictions capture the flow behaviour more accurately than the 2D model in this study. PMID:29892347

  17. PSYCHOLOGY. Estimating the reproducibility of psychological science.

    PubMed

    2015-08-28

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams. Copyright © 2015, American Association for the Advancement of Science.
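
    One of the replication criteria quoted in this abstract, namely whether the original effect size falls inside the 95% confidence interval of the replication effect, is easy to state concretely; the sketch below checks it for a correlation via the Fisher z transform. The numbers are hypothetical and this is a simplified reading of the criterion, not the project's analysis code.

      import math

      def original_in_replication_ci(r_original, r_replication, n_replication):
          """Check whether the original correlation falls inside the 95% CI of the
          replication estimate, using the Fisher z transform. Simplified sketch."""
          z_rep = math.atanh(r_replication)
          se = 1.0 / math.sqrt(n_replication - 3)
          lo, hi = z_rep - 1.96 * se, z_rep + 1.96 * se
          return lo <= math.atanh(r_original) <= hi

      # Hypothetical numbers: a strong original effect, a weaker replication.
      print(original_in_replication_ci(r_original=0.45, r_replication=0.18,
                                       n_replication=120))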

  18. Reproducible model development in the cardiac electrophysiology Web Lab.

    PubMed

    Daly, Aidan C; Clerx, Michael; Beattie, Kylie A; Cooper, Jonathan; Gavaghan, David J; Mirams, Gary R

    2018-05-26

    The modelling of the electrophysiology of cardiac cells is one of the most mature areas of systems biology. This extended concentration of research effort brings with it new challenges, foremost among which is that of choosing which of these models is most suitable for addressing a particular scientific question. In a previous paper, we presented our initial work in developing an online resource for the characterisation and comparison of electrophysiological cell models in a wide range of experimental scenarios. In that work, we described how we had developed a novel protocol language that allowed us to separate the details of the mathematical model (the majority of cardiac cell models take the form of ordinary differential equations) from the experimental protocol being simulated. We developed a fully-open online repository (which we termed the Cardiac Electrophysiology Web Lab) which allows users to store and compare the results of applying the same experimental protocol to competing models. In the current paper we describe the most recent and planned extensions of this work, focused on supporting the process of model building from experimental data. We outline the necessary work to develop a machine-readable language to describe the process of inferring parameters from wet lab datasets, and illustrate our approach through a detailed example of fitting a model of the hERG channel using experimental data. We conclude by discussing the future challenges in making further progress in this domain towards our goal of facilitating a fully reproducible approach to the development of cardiac cell models. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  19. Experimental determination of heat transfer coefficients in roll bite and air cooling for computer simulations of 1100 MPa carbon steel rolling

    NASA Astrophysics Data System (ADS)

    Leinonen, Olli; Ilmola, Joonas; Seppälä, Oskari; Pohjonen, Aarne; Paavola, Jussi; Koskenniska, Sami; Larkiola, Jari

    2018-05-01

    In modeling of hot rolling pass schedules, the heat transfer phenomena have to be known. Radiation to the ambient and between the rolls and the steel slab, as well as heat transfer in the contacts, must be considered to achieve an accurate temperature distribution and thereby accurate material behavior in simulations. Additional heat is generated by friction between the slab and the work roll and by plastic deformation. These phenomena must be taken into account when the effective heat transfer coefficient is determined from experimental data. In this paper we determine the effective heat transfer coefficient at the contact interface and the emissivity factor of the slab surface for 1100 MPa strength carbon steel for hot rolling simulations. Experimental pilot rolling tests were carried out, and slab temperatures were gathered right below the interface and at the mid-thickness of the slab. Emissivity factor tests were carried out in the same manner but without rolling. The experimental data are used to derive the contact heat transfer coefficient at the interface and the emissivity factor of the slab surface. The pilot rolling test is reproduced in an FE analysis to further refine the heat transfer coefficient and emissivity factor. Material mechanical properties at rolling temperatures were determined with a Gleeble™ thermo-mechanical simulator and the IDS thermodynamic-kinetic-empirical software.

  20. Where next for the reproducibility agenda in computational biology?

    PubMed

    Lewis, Joanna; Breeze, Charles E; Charlesworth, Jane; Maclaren, Oliver J; Cooper, Jonathan

    2016-07-15

    The concept of reproducibility is a foundation of the scientific method. With the arrival of fast and powerful computers over the last few decades, there has been an explosion of results based on complex computational analyses and simulations. The reproducibility of these results has been addressed mainly in terms of exact replicability or numerical equivalence, ignoring the wider issue of the reproducibility of conclusions through equivalent, extended or alternative methods. We use case studies from our own research experience to illustrate how concepts of reproducibility might be applied in computational biology. Several fields have developed 'minimum information' checklists to support the full reporting of computational simulations, analyses and results, and standardised data formats and model description languages can facilitate the use of multiple systems to address the same research question. We note the importance of defining the key features of a result to be reproduced, and the expected agreement between original and subsequent results. Dynamic, updatable tools for publishing methods and results are becoming increasingly common, but sometimes come at the cost of clear communication. In general, the reproducibility of computational research is improving but would benefit from additional resources and incentives. We conclude with a series of linked recommendations for improving reproducibility in computational biology through communication, policy, education and research practice. More reproducible research will lead to higher quality conclusions, deeper understanding and more valuable knowledge.

  1. A novel methodology to reproduce previously recorded six-degree of freedom kinematics on the same diarthrodial joint.

    PubMed

    Moore, Susan M; Thomas, Maribeth; Woo, Savio L-Y; Gabriel, Mary T; Kilger, Robert; Debski, Richard E

    2006-01-01

    The objective of this study was to develop a novel method to more accurately reproduce previously recorded 6-DOF kinematics of the tibia with respect to the femur using robotic technology. Furthermore, the effect of performing only a single registration or multiple registrations and the effect of robot joint configuration were investigated. A single registration consisted of registering the tibia and femur with respect to the robot at full extension and reproducing all kinematics, while multiple registrations consisted of registering the bones at each flexion angle and reproducing only the kinematics of the corresponding flexion angle. Kinematics of the knee in response to anterior (134 N) and combined internal/external (+/-10 N m) and varus/valgus (+/-5 N m) loads were collected at 0, 15, 30, 60, and 90 degrees of flexion. A six-axis, serial-articulated robotic manipulator (PUMA Model 762) was calibrated and the working volume was reduced to improve the robot's accuracy. The effect of the robot joint configuration was determined by performing single and multiple registrations for three selected configurations. For each robot joint configuration, the accuracy in position of the reproduced kinematics improved after multiple registrations (0.7+/-0.3, 1.2+/-0.5, and 0.9+/-0.2 mm, respectively) when compared to only a single registration (1.3+/-0.9, 2.0+/-1.0, and 1.5+/-0.7 mm, respectively) (p<0.05). The accuracy in position of each robot joint configuration was unique, as significant differences were detected between each of the configurations. These data demonstrate that the number of registrations and the robot joint configuration both affect the accuracy of the reproduced kinematics. Therefore, when using robotic technology to reproduce previously recorded kinematics, it may be necessary to perform these analyses for each individual robotic system and for each diarthrodial joint, as different joints will require the robot to be placed in

  2. Simulation and experimental verification of prompt gamma-ray emissions during proton irradiation.

    PubMed

    Schumann, A; Petzoldt, J; Dendooven, P; Enghardt, W; Golnik, C; Hueso-González, F; Kormoll, T; Pausch, G; Roemer, K; Fiedler, F

    2015-05-21

    Irradiation with protons and light ions offers new possibilities for tumor therapy but has a strong need for novel imaging modalities for treatment verification. The development of new detector systems, which can provide an in vivo range assessment or dosimetry, requires an accurate knowledge of the secondary radiation field and reliable Monte Carlo simulations. This paper presents multiple measurements to characterize the prompt γ-ray emissions during proton irradiation and benchmarks the latest Geant4 code against the experimental findings. Within the scope of this work, the total photon yield for different target materials, the energy spectra as well as the γ-ray depth profile were assessed. Experiments were performed at the superconducting AGOR cyclotron at KVI-CART, University of Groningen. Properties of the γ-ray emissions were experimentally determined. The prompt γ-ray emissions were measured utilizing a conventional HPGe detector system (Clover) and quantitatively compared to simulations. With the selected physics list QGSP_BIC_HP, Geant4 strongly overestimates the photon yield in most cases, sometimes up to 50%. The shape of the spectrum and qualitative occurrence of discrete γ lines is reproduced accurately. A sliced phantom was designed to determine the depth profile of the photons. The position of the distal fall-off in the simulations agrees with the measurements, albeit the peak height is also overestimated. Hence, Geant4 simulations of prompt γ-ray emissions from irradiation with protons are currently far less reliable as compared to simulations of the electromagnetic processes. Deviations from experimental findings were observed and quantified. Although there has been a constant improvement of Geant4 in the hadronic sector, there is still a gap to close.

  3. 10 CFR 1016.35 - Authority to reproduce Restricted Data.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Title 10 (Energy), Department of Energy (General Provisions), Safeguarding of Restricted Data, Control of Information, § 1016.35 Authority to reproduce Restricted Data: Secret Restricted Data will not be reproduced...

  4. 10 CFR 1016.35 - Authority to reproduce Restricted Data.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    Title 10 (Energy), Department of Energy (General Provisions), Safeguarding of Restricted Data, Control of Information, § 1016.35 Authority to reproduce Restricted Data: Secret Restricted Data will not be reproduced...

  5. Reproducible research in palaeomagnetism

    NASA Astrophysics Data System (ADS)

    Lurcock, Pontus; Florindo, Fabio

    2015-04-01

    The reproducibility of research findings is attracting increasing attention across all scientific disciplines. In palaeomagnetism as elsewhere, computer-based analysis techniques are becoming more commonplace, complex, and diverse. Analyses can often be difficult to reproduce from scratch, both for the original researchers and for others seeking to build on the work. We present a palaeomagnetic plotting and analysis program designed to make reproducibility easier. Part of the problem is the divide between interactive and scripted (batch) analysis programs. An interactive desktop program with a graphical interface is a powerful tool for exploring data and iteratively refining analyses, but usually cannot operate without human interaction. This makes it impossible to re-run an analysis automatically, or to integrate it into a larger automated scientific workflow - for example, a script to generate figures and tables for a paper. In some cases the parameters of the analysis process itself are not saved explicitly, making it hard to repeat or improve the analysis even with human interaction. Conversely, non-interactive batch tools can be controlled by pre-written scripts and configuration files, allowing an analysis to be 'replayed' automatically from the raw data. However, this advantage comes at the expense of exploratory capability: iteratively improving an analysis entails a time-consuming cycle of editing scripts, running them, and viewing the output. Batch tools also tend to require more computer expertise from their users. PuffinPlot is a palaeomagnetic plotting and analysis program which aims to bridge this gap. First released in 2012, it offers both an interactive, user-friendly desktop interface and a batch scripting interface, both making use of the same core library of palaeomagnetic functions. We present new improvements to the program that help to integrate the interactive and batch approaches, allowing an analysis to be interactively explored and refined

  6. Accurate mass measurement: terminology and treatment of data.

    PubMed

    Brenton, A Gareth; Godfrey, A Ruth

    2010-11-01

    High-resolution mass spectrometry has become ever more accessible with improvements in instrumentation, such as modern FT-ICR and Orbitrap mass spectrometers. This has resulted in an increase in the number of articles submitted for publication quoting accurate mass data. There is a plethora of terms related to accurate mass analysis that are in current usage, many employed incorrectly or inconsistently. This article is based on a set of notes prepared by the authors for research students and staff in our laboratories as a guide to the correct terminology and basic statistical procedures to apply in relation to mass measurement, particularly for accurate mass measurement. It elaborates on the editorial by Gross in 1994 regarding the use of accurate masses for structure confirmation. We have presented and defined the main terms in use with reference to the International Union of Pure and Applied Chemistry (IUPAC) recommendations for nomenclature and symbolism for mass spectrometry. The correct use of statistics and treatment of data is illustrated as a guide to new and existing mass spectrometry users with a series of examples as well as statistical methods to compare different experimental methods and datasets. Copyright © 2010. Published by Elsevier Inc.
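
    As a brief, hedged illustration of the basic arithmetic such guides cover (not taken from the article itself), the sketch below computes mass measurement error in parts per million and summarizes a set of hypothetical replicate measurements; all m/z values are invented for demonstration.

        # Illustrative sketch only: standard ppm mass-error arithmetic applied to
        # hypothetical replicate measurements of a single ion.
        import statistics

        def mass_error_ppm(measured_mz: float, theoretical_mz: float) -> float:
            """Mass error in ppm: 1e6 * (measured - theoretical) / theoretical."""
            return 1e6 * (measured_mz - theoretical_mz) / theoretical_mz

        theoretical = 556.2771                                  # hypothetical exact m/z
        replicates = [556.2768, 556.2775, 556.2770, 556.2773]   # hypothetical measurements

        errors = [mass_error_ppm(m, theoretical) for m in replicates]
        print(f"mean error = {statistics.mean(errors):.2f} ppm, "
              f"SD = {statistics.stdev(errors):.2f} ppm")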

  7. Can Atmospheric Reanalysis Data Sets Be Used to Reproduce Flooding Over Large Scales?

    NASA Astrophysics Data System (ADS)

    Andreadis, Konstantinos M.; Schumann, Guy J.-P.; Stampoulis, Dimitrios; Bates, Paul D.; Brakenridge, G. Robert; Kettner, Albert J.

    2017-10-01

    Floods are costly to global economies and can be exceptionally lethal. The ability to produce consistent flood hazard maps over large areas could provide a significant contribution to reducing such losses, as the lack of knowledge concerning flood risk is a major factor in the transformation of river floods into flood disasters. In order to accurately reproduce flooding in river channels and floodplains, high spatial resolution hydrodynamic models are needed. Despite being computationally expensive, recent advances have made their continental to global implementation feasible, although inputs for long-term simulations may require the use of reanalysis meteorological products, especially in data-poor regions. We employ a coupled hydrologic/hydrodynamic model cascade forced by the 20CRv2 reanalysis data set and evaluate its ability to reproduce flood inundation area and volume for Australia during the 1973-2012 period. Ensemble simulations using the reanalysis data were performed to account for uncertainty in the meteorology and compared with a validated benchmark simulation. Results show that the reanalysis ensemble captures the inundated areas and volumes relatively well, with correlations for the ensemble mean of 0.82 and 0.85 for area and volume, respectively, although the meteorological ensemble spread propagates into large uncertainty in the simulated flood characteristics.

  8. Evaluation of 12 blood glucose monitoring systems for self-testing: system accuracy and measurement reproducibility.

    PubMed

    Freckmann, Guido; Baumstark, Annette; Schmid, Christina; Pleus, Stefan; Link, Manuela; Haug, Cornelia

    2014-02-01

    Systems for self-monitoring of blood glucose (SMBG) have to provide accurate and reproducible blood glucose (BG) values in order to ensure adequate therapeutic decisions by people with diabetes. Twelve SMBG systems were compared in a standardized manner under controlled laboratory conditions: nine systems were available on the German market and were purchased from a local pharmacy, and three systems were obtained from the manufacturer (two systems were available on the U.S. market, and one system was not yet introduced to the German market). System accuracy was evaluated following DIN EN ISO (International Organization for Standardization) 15197:2003. In addition, measurement reproducibility was assessed following a modified TNO (Netherlands Organization for Applied Scientific Research) procedure. Comparison measurements were performed with either the glucose oxidase method (YSI 2300 STAT Plus™ glucose analyzer; YSI Life Sciences, Yellow Springs, OH) or the hexokinase method (cobas(®) c111; Roche Diagnostics GmbH, Mannheim, Germany) according to the manufacturer's measurement procedure. The 12 evaluated systems showed between 71.5% and 100% of the measurement results within the required system accuracy limits. With the evaluated test strip lot, ten systems fulfilled the minimum accuracy requirements specified by DIN EN ISO 15197:2003. In addition, accuracy limits of the recently published revision ISO 15197:2013 were applied and showed between 54.5% and 100% of the systems' measurement results within the required accuracy limits. Regarding measurement reproducibility, each of the 12 tested systems met the applied performance criteria. In summary, 83% of the systems fulfilled the minimum system accuracy requirements of DIN EN ISO 15197:2003 with the evaluated test strip lot. Each of the tested systems showed acceptable measurement reproducibility. In order to ensure sufficient measurement quality of each distributed test strip lot, regular evaluations are required.
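
    The sketch below is a minimal, hedged illustration of an ISO 15197:2013-style accuracy check. The limits used (±15 mg/dL below 100 mg/dL, otherwise ±15%, with at least 95% of results required to comply) are quoted from common summaries of the 2013 revision and should be verified against the standard itself; the reference/meter pairs are hypothetical.

        # Minimal sketch of an ISO 15197:2013-style system accuracy check.
        # Thresholds are quoted from memory of the 2013 revision; data are hypothetical.
        def within_iso_2013_limits(reference_mgdl: float, meter_mgdl: float) -> bool:
            if reference_mgdl < 100.0:
                return abs(meter_mgdl - reference_mgdl) <= 15.0
            return abs(meter_mgdl - reference_mgdl) <= 0.15 * reference_mgdl

        pairs = [(85, 92), (85, 105), (160, 150), (160, 190), (240, 250)]  # (reference, meter)
        ok = sum(within_iso_2013_limits(r, m) for r, m in pairs)
        print(f"{100 * ok / len(pairs):.1f}% of results within the accuracy limits "
              f"(criterion: at least 95%)")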

  9. Considering RNAi experimental design in parasitic helminths.

    PubMed

    Dalzell, Johnathan J; Warnock, Neil D; McVeigh, Paul; Marks, Nikki J; Mousley, Angela; Atkinson, Louise; Maule, Aaron G

    2012-04-01

    Almost a decade has passed since the first report of RNA interference (RNAi) in a parasitic helminth. Whilst much progress has been made with RNAi informing gene function studies in disparate nematode and flatworm parasites, substantial and seemingly prohibitive difficulties have been encountered in some species, hindering progress. An appraisal of current practices, trends and ideals of RNAi experimental design in parasitic helminths is both timely and necessary for a number of reasons: firstly, the increasing availability of parasitic helminth genome/transcriptome resources means there is a growing need for gene function tools such as RNAi; secondly, fundamental differences and unique challenges exist for parasite species which do not apply to model organisms; thirdly, the inherent variation in experimental design, and reported difficulties with reproducibility undermine confidence. Ideally, RNAi studies of gene function should adopt standardised experimental design to aid reproducibility, interpretation and comparative analyses. Although the huge variations in parasite biology and experimental endpoints make RNAi experimental design standardization difficult or impractical, we must strive to validate RNAi experimentation in helminth parasites. To aid this process we identify multiple approaches to RNAi experimental validation and highlight those which we deem to be critical for gene function studies in helminth parasites.

  10. The temperature dependence of optical properties of tungsten in the visible and near-infrared domains: an experimental and theoretical study

    NASA Astrophysics Data System (ADS)

    Minissale, Marco; Pardanaud, Cedric; Bisson, Régis; Gallais, Laurent

    2017-11-01

    The knowledge of optical properties of tungsten at high temperatures is of crucial importance in fields such as nuclear fusion and aerospace applications. The optical properties of tungsten are well known at room temperature, but little has been done at temperatures between 300 K and 1000 K in the visible and near-infrared domains. Here, we investigate the temperature dependence of tungsten reflectivity from the ambient to high temperatures (<1000 K) in the 500-1050 nm spectral range, a region where interband transitions make a strong contribution. Experimental measurements, performed via a spectroscopic system coupled with laser remote heating, show that tungsten’s reflectivity increases with temperature and wavelength. We have described these dependences through a Fresnel model and two Lorentz-Drude models. The Fresnel model accurately reproduces the experimental curve at a given temperature, but it is able to simulate the temperature dependence of reflectivity only through an ad hoc choice of temperature formulae for the refractive indices. Thus, a less empirical approach, based on Lorentz-Drude models, is preferred to describe the interaction of light and charge carriers in the solid. The first Lorentz-Drude model, which includes a temperature dependence of intraband transitions, fits the experimental results only qualitatively. The second Lorentz-Drude model additionally includes a temperature dependence of interband transitions. It is able to reproduce the experimental results quantitatively, highlighting a non-trivial dependence of interband transitions as a function of temperature. Finally, we use these temperature-dependent Lorentz-Drude models to evaluate the total emissivity of tungsten from 300 K to 3500 K, and we compare our experimental and theoretical findings with previous results.
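
    For readers unfamiliar with this model family, the following sketch evaluates a generic Lorentz-Drude dielectric function (a Drude intraband term plus Lorentz interband oscillators) and the resulting normal-incidence reflectivity. It is not the paper's temperature-dependent parameterization; all parameter values below are placeholders.

        # Generic Lorentz-Drude sketch (not the paper's fitted tungsten model):
        # Drude intraband term plus Lorentz interband oscillators, and the
        # normal-incidence reflectivity from the complex refractive index.
        import numpy as np

        def lorentz_drude_epsilon(omega, omega_p, f0, gamma0, oscillators):
            """Complex dielectric function eps(omega); oscillators = [(f_j, omega_j, gamma_j), ...]."""
            eps = 1.0 - f0 * omega_p**2 / (omega**2 + 1j * gamma0 * omega)   # intraband (Drude)
            for f_j, omega_j, gamma_j in oscillators:                        # interband (Lorentz)
                eps += f_j * omega_p**2 / (omega_j**2 - omega**2 - 1j * gamma_j * omega)
            return eps

        def normal_incidence_reflectivity(eps):
            n_complex = np.sqrt(eps)
            return np.abs((n_complex - 1) / (n_complex + 1)) ** 2

        omega = np.linspace(1.2, 2.5, 200)   # photon energy in eV (placeholder range)
        eps = lorentz_drude_epsilon(omega, omega_p=13.2, f0=0.2, gamma0=0.06,
                                    oscillators=[(0.05, 1.0, 0.5), (0.2, 1.9, 1.3)])
        R = normal_incidence_reflectivity(eps)
        print(f"reflectivity range: {R.min():.3f} to {R.max():.3f}")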

  11. Automated selected reaction monitoring software for accurate label-free protein quantification.

    PubMed

    Teleman, Johan; Karlsson, Christofer; Waldemarson, Sofia; Hansson, Karin; James, Peter; Malmström, Johan; Levander, Fredrik

    2012-07-06

    Selected reaction monitoring (SRM) is a mass spectrometry method with documented ability to quantify proteins accurately and reproducibly using labeled reference peptides. However, the use of labeled reference peptides becomes impractical if large numbers of peptides are targeted and when high flexibility is desired when selecting peptides. We have developed a label-free quantitative SRM workflow that relies on a new automated algorithm, Anubis, for accurate peak detection. Anubis efficiently removes interfering signals from contaminating peptides to estimate the true signal of the targeted peptides. We evaluated the algorithm on a published multisite data set and achieved results in line with manual data analysis. In complex peptide mixtures from whole proteome digests of Streptococcus pyogenes we achieved a technical variability across the entire proteome abundance range of 6.5-19.2%, which was considerably below the total variation across biological samples. Our results show that the label-free SRM workflow with automated data analysis is feasible for large-scale biological studies, opening up new possibilities for quantitative proteomics and systems biology.

  12. Making Early Modern Medicine: Reproducing Swedish Bitters.

    PubMed

    Ahnfelt, Nils-Otto; Fors, Hjalmar

    2016-05-01

    Historians of science and medicine have rarely applied themselves to reproducing the experiments and practices of medicine and pharmacy. This paper delineates our efforts to reproduce "Swedish Bitters," an early modern composite medicine in wide European use from the 1730s to the present. In its original formulation, it was made from seven medicinal simples: aloe, rhubarb, saffron, myrrh, gentian, zedoary and agarikon. These were mixed in alcohol together with some theriac, a composite medicine of classical origin. The paper delineates the compositional history of Swedish Bitters and the medical rationale underlying its composition. It also describes how we go about reproducing the medicine in a laboratory using early modern pharmaceutical methods, and analysing it using contemporary methods of pharmaceutical chemistry. Our aim is twofold: first, to show how reproducing medicines may provide a path towards a deeper understanding of the role of sensual and practical knowledge in the wider context of early modern medical culture; and second, how it may yield interesting results from the point of view of contemporary pharmaceutical science.

  13. Experimentally reproduced textures and mineral chemistries of high-titanium mare basalts

    NASA Technical Reports Server (NTRS)

    Usselman, T. M.; Lofgren, G. E.; Williams, R. J.; Donaldson, C. H.

    1975-01-01

    Many of the textures, morphologies, and mineral chemistries of the high-titanium mare basalts have been experimentally duplicated using single-stage cooling histories. Lunar high-titanium mare basalts are modeled in a 1 m thick gravitationally differentiating flow based on cooling rates, thermal models, and modal olivine contents. The low-pressure equilibrium phase relations of a synthetic high-titanium basalt composition were investigated as a function of oxygen fugacity, and petrographic criteria are developed for the recognition of phenocrysts which were present in the liquid at the time of eruption.

  14. Repeatability and reproducibility of corneal thickness using SOCT Copernicus HR.

    PubMed

    Vidal, Silvia; Viqueira, Valentín; Mas, David; Domenech, Begoña

    2013-05-01

    The aim of this study is to determine the reliability of corneal thickness measurements derived from SOCT Copernicus HR (Fourier domain OCT). Thirty healthy eyes of 30 subjects were evaluated. One eye of each patient was chosen randomly. Images were obtained of the central (up to 2.0 mm from the corneal apex) and paracentral (2.0 to 4.0 mm) cornea. We assessed corneal thickness (central and paracentral) and epithelium thickness. The intra-observer repeatability data were analysed using the intra-class correlation coefficient (ICC) with its 95 per cent confidence interval, the within-subject standard deviation (Sw) and the within-subject coefficient of variation (Cw). The level of agreement from Bland-Altman analysis was also reported to study the reproducibility between observers and the agreement between measurement methods (automatic versus manual). The mean value of the central corneal thickness (CCT) was 542.4 ± 30.1 μm (SD). There was high intra-observer agreement, with the best result in the central sector (intra-class correlation coefficient of 0.99, 95 per cent CI 0.989 to 0.997) and the worst in the minimum corneal thickness (intra-class correlation coefficient of 0.672, 95 per cent CI 0.417 to 0.829). Reproducibility between observers was very high. The best result was found for the central sector thickness obtained both manually and automatically, with an intra-class correlation coefficient of 0.990 in both cases, and the worst result for the maximum corneal thickness, with an intra-class correlation coefficient of 0.827. The agreement between measurement methods was also very high, with intra-class correlation coefficients greater than 0.91. On the other hand, the repeatability and reproducibility of epithelial measurements were poor. Pachymetric mapping with SOCT Copernicus HR was found to be highly repeatable and reproducible. We found that the device lacks an appropriate ergonomic design, as proper focusing of the laser beam onto the
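
    As a hedged illustration of the repeatability statistics named in this record, the sketch below computes the within-subject standard deviation (Sw), the within-subject coefficient of variation (Cw) and the repeatability coefficient from two repeated measurements per eye; the thickness values are hypothetical.

        # Hypothetical data: two repeated central corneal thickness readings per eye (um).
        import numpy as np

        first  = np.array([541.0, 555.0, 529.0, 560.0, 548.0])   # repeat 1
        second = np.array([543.0, 552.0, 531.0, 561.0, 546.0])   # repeat 2

        d = first - second
        s_w = np.sqrt(np.mean(d**2) / 2.0)                        # within-subject SD for paired repeats
        c_w = s_w / np.mean(np.concatenate([first, second]))      # within-subject CV
        repeatability = 1.96 * np.sqrt(2) * s_w                   # 95% limit for a repeat difference

        print(f"Sw = {s_w:.2f} um, Cw = {100 * c_w:.2f}%, repeatability = {repeatability:.2f} um")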

  15. New experimental methodology, setup and LabView program for accurate absolute thermoelectric power and electrical resistivity measurements between 25 and 1600 K: application to pure copper, platinum, tungsten, and nickel at very high temperatures.

    PubMed

    Abadlia, L; Gasser, F; Khalouk, K; Mayoufi, M; Gasser, J G

    2014-09-01

    In this paper we describe an experimental setup designed to measure simultaneously and very accurately the resistivity and the absolute thermoelectric power, also called absolute thermopower or absolute Seebeck coefficient, of solid and liquid conductors/semiconductors over a wide range of temperatures (room temperature to 1600 K in the present work). A careful analysis of the existing experimental data allowed us to extend the absolute thermoelectric power scale of platinum to the range 0-1800 K with two new polynomial expressions. The experimental device is controlled by a LabView program. A detailed description of the accurate dynamic measurement methodology is given in this paper. We measure the absolute thermoelectric power and the electrical resistivity and deduce with good accuracy the thermal conductivity using the relations between the three electronic transport coefficients, going beyond the classical Wiedemann-Franz law. We use this experimental setup and methodology to give new, very accurate results for pure copper, platinum, and nickel, especially at very high temperatures. But measuring resistivity and absolute thermopower can be more than an objective in itself. Resistivity characterizes the bulk of a material, while absolute thermoelectric power characterizes the material at the point where the electrical contact is established with a couple of metallic elements (forming a thermocouple). In a forthcoming paper we will show that measuring resistivity and absolute thermoelectric power advantageously characterizes phase changes, probably as well as DSC (if not better), since phase changes can easily be followed over several hours or days at constant temperature.
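
    The classical Wiedemann-Franz relation that the authors go beyond is the usual starting point for deducing thermal conductivity from resistivity; a minimal sketch is given below, with a rough room-temperature copper resistivity used purely for illustration.

        # Classical Wiedemann-Franz estimate: kappa = L0 * T / rho, with the
        # Sommerfeld Lorenz number L0 = 2.44e-8 W*Ohm/K^2. Values are illustrative.
        L0 = 2.44e-8          # Lorenz number, W*Ohm/K^2

        def thermal_conductivity_wf(resistivity_ohm_m: float, temperature_k: float) -> float:
            """Electronic thermal conductivity (W/(m*K)) from the Wiedemann-Franz law."""
            return L0 * temperature_k / resistivity_ohm_m

        rho_cu_300k = 1.7e-8   # approximate resistivity of pure copper at 300 K (Ohm*m)
        print(f"kappa(Cu, 300 K) ~ {thermal_conductivity_wf(rho_cu_300k, 300.0):.0f} W/(m*K)")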

  16. New experimental methodology, setup and LabView program for accurate absolute thermoelectric power and electrical resistivity measurements between 25 and 1600 K: Application to pure copper, platinum, tungsten, and nickel at very high temperatures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abadlia, L.; Mayoufi, M.; Gasser, F.

    2014-09-15

    In this paper we describe an experimental setup designed to measure simultaneously and very accurately the resistivity and the absolute thermoelectric power, also called absolute thermopower or absolute Seebeck coefficient, of solid and liquid conductors/semiconductors over a wide range of temperatures (room temperature to 1600 K in the present work). A careful analysis of the existing experimental data allowed us to extend the absolute thermoelectric power scale of platinum to the range 0-1800 K with two new polynomial expressions. The experimental device is controlled by a LabView program. A detailed description of the accurate dynamic measurement methodology is given in this paper. We measure the absolute thermoelectric power and the electrical resistivity and deduce with good accuracy the thermal conductivity using the relations between the three electronic transport coefficients, going beyond the classical Wiedemann-Franz law. We use this experimental setup and methodology to give new, very accurate results for pure copper, platinum, and nickel, especially at very high temperatures. But measuring resistivity and absolute thermopower can be more than an objective in itself. Resistivity characterizes the bulk of a material, while absolute thermoelectric power characterizes the material at the point where the electrical contact is established with a couple of metallic elements (forming a thermocouple). In a forthcoming paper we will show that measuring resistivity and absolute thermoelectric power advantageously characterizes phase changes, probably as well as DSC (if not better), since phase changes can easily be followed over several hours or days at constant temperature.

  17. Experimental Evidence for LENR in a Polarized Pd/D Lattice

    NASA Astrophysics Data System (ADS)

    Szpak, S.

    2005-03-01

    Experimental evidence in support of claims that excess enthalpy production in a polarized Pd/D lattice is of a nuclear origin is questioned on various grounds, e.g., marginal intensity and difficulty in reproducing. Here, evidence is presented that is 100% reproducible and of sufficient intensity to be well outside of experimental errors. In addition to the thermal behavior, the nuclear manifestations include X-ray emission; tritium production; and, when an operating cell is placed in an external electric field, fusion to create heavier metals such as Ca, Al, Mg, and Zn.

  18. Modeling an Excitable Biosynthetic Tissue with Inherent Variability for Paired Computational-Experimental Studies.

    PubMed

    Gokhale, Tanmay A; Kim, Jong M; Kirkton, Robert D; Bursac, Nenad; Henriquez, Craig S

    2017-01-01

    To understand how excitable tissues give rise to arrhythmias, it is crucial to understand the electrical dynamics of cells in the context of their environment. Multicellular monolayer cultures have proven useful for investigating arrhythmias and other conduction anomalies, and because of their relatively simple structure, these constructs lend themselves to paired computational studies that often help elucidate mechanisms of the observed behavior. However, tissue cultures of cardiomyocyte monolayers currently require the use of neonatal cells with ionic properties that change rapidly during development and have thus been poorly characterized and modeled to date. Recently, Kirkton and Bursac demonstrated the ability to create biosynthetic excitable tissues from genetically engineered and immortalized HEK293 cells with well-characterized electrical properties and the ability to propagate action potentials. In this study, we developed and validated a computational model of these excitable HEK293 cells (called "Ex293" cells) using existing electrophysiological data and a genetic search algorithm. In order to reproduce not only the mean but also the variability of experimental observations, we examined what sources of variation were required in the computational model. Random cell-to-cell and inter-monolayer variation in both ionic conductances and tissue conductivity was necessary to explain the experimentally observed variability in action potential shape and macroscopic conduction, and the spatial organization of cell-to-cell conductance variation was found not to impact macroscopic behavior; the resulting model accurately reproduces both normal and drug-modified conduction behavior. The development of a computational Ex293 cell and tissue model provides a novel framework to perform paired computational-experimental studies to study normal and abnormal conduction in multidimensional excitable tissue, and the methodology of modeling variation can be
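
    Purely as a hypothetical sketch of the kind of cell-to-cell variability described here (not the published Ex293 implementation), the snippet below draws a population of maximal conductances by scaling baseline values with lognormal factors; the parameter names and the 15% spread are assumptions made for illustration.

        # Hypothetical illustration of random cell-to-cell conductance variation.
        import numpy as np

        rng = np.random.default_rng(seed=0)
        baseline = {"g_Na": 12.0, "g_K": 3.6, "g_leak": 0.05}   # placeholder conductances (mS/cm^2)
        n_cells, sigma = 500, 0.15                              # population size, lognormal spread

        population = {
            name: value * rng.lognormal(mean=0.0, sigma=sigma, size=n_cells)
            for name, value in baseline.items()
        }

        for name, samples in population.items():
            print(f"{name}: mean = {samples.mean():.3f}, CV = {samples.std() / samples.mean():.2%}")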

  19. Towards Reproducibility in Computational Hydrology

    NASA Astrophysics Data System (ADS)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit

    2017-04-01

    Reproducibility is a foundational principle in scientific research. The ability to independently re-run an experiment helps to verify the legitimacy of individual findings, to evolve (or reject) hypotheses and models of how environmental systems function, and to move them from specific circumstances to more general theory. Yet in computational hydrology (and in environmental science more widely) the code and data that produce published results are not regularly made available, and even if they are made available, there remains a multitude of generally unreported choices that an individual scientist may have made that impact the study result. This situation strongly inhibits the ability of our community to reproduce and verify previous findings, as all the information and boundary conditions required to set up a computational experiment simply cannot be reported in an article's text alone. In Hutton et al. 2016 [1], we argue that a cultural change is required in the computational hydrological community, in order to advance and make more robust the process of knowledge creation and hypothesis testing. We need to adopt common standards and infrastructures to: (1) make code readable and re-useable; (2) create well-documented workflows that combine re-useable code together with data to enable published scientific findings to be reproduced; (3) make code and workflows available, easy to find, and easy to interpret, using code and code metadata repositories. To create change we argue for improved graduate training in these areas. In this talk we reflect on our progress in achieving reproducible, open science in computational hydrology, which is relevant to the broader computational geoscience community. In particular, we draw on our experience in the Switch-On (EU funded) virtual water science laboratory (http://www.switch-on-vwsl.eu/participate/), which is an open platform for collaboration in hydrological experiments (e.g. [2]). While we use computational hydrology as

  20. Participant Nonnaiveté and the reproducibility of cognitive psychology.

    PubMed

    Zwaan, Rolf A; Pecher, Diane; Paolacci, Gabriele; Bouwmeester, Samantha; Verkoeijen, Peter; Dijkstra, Katinka; Zeelenberg, René

    2017-07-25

    Many argue that there is a reproducibility crisis in psychology. We investigated nine well-known effects from the cognitive psychology literature-three each from the domains of perception/action, memory, and language, respectively-and found that they are highly reproducible. Not only can they be reproduced in online environments, but they also can be reproduced with nonnaïve participants with no reduction of effect size. Apparently, some cognitive tasks are so constraining that they encapsulate behavior from external influences, such as testing situation and prior recent experience with the experiment to yield highly robust effects.

  1. Quantifying reproducibility in computational biology: the case of the tuberculosis drugome.

    PubMed

    Garijo, Daniel; Kinnings, Sarah; Xie, Li; Xie, Lei; Zhang, Yinliang; Bourne, Philip E; Gil, Yolanda

    2013-01-01

    How easy is it to reproduce the results found in a typical computational biology paper? Either through experience or intuition the reader will already know that the answer is "with difficulty" or "not at all". In this paper we attempt to quantify this difficulty by reproducing a previously published paper for different classes of users (ranging from users with little expertise to domain experts) and suggest ways in which the situation might be improved. Quantification is achieved by estimating the time required to reproduce each of the steps in the method described in the original paper and to make them part of an explicit workflow that reproduces the original results. Reproducing the method took several months of effort, and required using new versions and new software that posed challenges to reconstructing and validating the results. The quantification leads to "reproducibility maps" that reveal that novice researchers would only be able to reproduce a few of the steps in the method, and that only expert researchers with advanced knowledge of the domain would be able to reproduce the method in its entirety. The workflow itself is published as an online resource together with supporting software and data. The paper concludes with a brief discussion of the complexities of requiring reproducibility in terms of cost versus benefit, and desiderata with our observations and guidelines for improving reproducibility. This has implications not only for reproducing the work of others from published papers, but also for reproducing work from one's own laboratory.

  2. Theory of bi-molecular association dynamics in 2D for accurate model and experimental parameterization of binding rates

    PubMed Central

    Yogurtcu, Osman N.; Johnson, Margaret E.

    2015-01-01

    The dynamics of association between diffusing and reacting molecular species are routinely quantified using simple rate-equation kinetics that assume both well-mixed concentrations of species and a single rate constant for parameterizing the binding rate. In two dimensions (2D), however, even when systems are well-mixed, the assumption of a single characteristic rate constant for describing association is not generally accurate, due to the properties of diffusional searching in dimensions d ≤ 2. Establishing rigorous bounds for discriminating between 2D reactive systems that will be accurately described by rate equations with a single rate constant, and those that will not, is critical for both modeling and experimentally parameterizing binding reactions restricted to surfaces such as cellular membranes. We show here that in regimes where the intrinsic reaction rate (ka) and the diffusion constant (D) satisfy ka/D > 0.05, a single rate constant cannot be fit to the dynamics of concentrations of associating species independently of the initial conditions. Instead, a more sophisticated multi-parametric description than rate equations is necessary to robustly characterize bimolecular reactions from experiment. Our quantitative bounds derive from our new analysis of 2D rate-behavior predicted from Smoluchowski theory. Using a recently developed single-particle reaction-diffusion algorithm that we extend here to 2D, we are able to test and validate the predictions of Smoluchowski theory and several other theories of reversible reaction dynamics in 2D for the first time. Finally, our results also mean that simulations of reactive systems in 2D using rate equations must be undertaken with caution when reactions have ka/D > 0.05, regardless of the simulation volume. We introduce here a simple formula for an adaptive concentration-dependent rate constant for these chemical kinetics simulations which improves on existing formulas to better capture non-equilibrium reaction dynamics from dilute
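
    A quick sketch of the dimensionless criterion quoted above: in 2D both the intrinsic rate ka and the diffusion constant D carry units of area per time, so ka/D is dimensionless, and ratios above roughly 0.05 signal that single-rate-constant kinetics will not describe the association well. The numerical values below are illustrative only.

        # Illustrative check of the ka/D criterion described in the record.
        def single_rate_constant_ok(ka_um2_per_s: float, d_um2_per_s: float,
                                    threshold: float = 0.05) -> bool:
            return (ka_um2_per_s / d_um2_per_s) <= threshold

        cases = [(0.01, 1.0), (0.1, 1.0), (0.5, 0.5)]   # (ka, D) in um^2/s
        for ka, d in cases:
            print(f"ka/D = {ka / d:.2f} -> rate-equation description adequate: "
                  f"{single_rate_constant_ok(ka, d)}")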

  3. Response to Comment on "Estimating the reproducibility of psychological science".

    PubMed

    Anderson, Christopher J; Bahník, Štěpán; Barnett-Cowan, Michael; Bosco, Frank A; Chandler, Jesse; Chartier, Christopher R; Cheung, Felix; Christopherson, Cody D; Cordes, Andreas; Cremata, Edward J; Della Penna, Nicolas; Estel, Vivien; Fedor, Anna; Fitneva, Stanka A; Frank, Michael C; Grange, James A; Hartshorne, Joshua K; Hasselman, Fred; Henninger, Felix; van der Hulst, Marije; Jonas, Kai J; Lai, Calvin K; Levitan, Carmel A; Miller, Jeremy K; Moore, Katherine S; Meixner, Johannes M; Munafò, Marcus R; Neijenhuijs, Koen I; Nilsonne, Gustav; Nosek, Brian A; Plessow, Franziska; Prenoveau, Jason M; Ricker, Ashley A; Schmidt, Kathleen; Spies, Jeffrey R; Stieger, Stefan; Strohminger, Nina; Sullivan, Gavin B; van Aert, Robbie C M; van Assen, Marcel A L M; Vanpaemel, Wolf; Vianello, Michelangelo; Voracek, Martin; Zuni, Kellylynn

    2016-03-04

    Gilbert et al. conclude that evidence from the Open Science Collaboration's Reproducibility Project: Psychology indicates high reproducibility, given the study methodology. Their very optimistic assessment is limited by statistical misconceptions and by causal inferences from selectively interpreted, correlational data. Using the Reproducibility Project: Psychology data, both optimistic and pessimistic conclusions about reproducibility are possible, and neither is yet warranted. Copyright © 2016, American Association for the Advancement of Science.

  4. Toward more accurate loss tangent measurements in reentrant cavities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moyer, R. D.

    1980-05-01

    Karpova has described an absolute method for measurement of dielectric properties of a solid in a coaxial reentrant cavity. His cavity resonance equation yields very accurate results for dielectric constants. However, he presented only approximate expressions for the loss tangent. This report presents more exact expressions for that quantity and summarizes some experimental results.

  5. Repeatability and reproducibility of ribotyping and its computer interpretation.

    PubMed

    Lefresne, Gwénola; Latrille, Eric; Irlinger, Françoise; Grimont, Patrick A D

    2004-04-01

    Many molecular typing methods are difficult to interpret because their repeatability (within-laboratory variance) and reproducibility (between-laboratory variance) have not been thoroughly studied. In the present work, ribotyping of coryneform bacteria was the basis of a study involving within-gel and between-gel repeatability and between-laboratory reproducibility (two laboratories involved). The effect of different technical protocols, different algorithms, and different software for fragment size determination was studied. Analysis of variance (ANOVA) showed, within a laboratory, that there was no significant added variance between gels. However, between-laboratory variance was significantly higher than within-laboratory variance. This may be due to the use of different protocols. An experimental function was calculated to transform the data and make them compatible (i.e., erase the between-laboratory variance). The use of different interpolation algorithms (spline, Schaffer and Sederoff) was a significant source of variation in one laboratory only. The use of either Taxotron (Institut Pasteur) or GelCompar (Applied Maths) was not a significant source of added variation when the same algorithm (spline) was used. However, the use of Bio-Gene (Vilber Lourmat) dramatically increased the error (within laboratory, within gel) in one laboratory, while decreasing the error in the other laboratory; this might be due to automatic normalization attempts. These results were taken into account for building a database and performing automatic pattern identification using Taxotron. Conversion of the data considerably improved the identification of patterns irrespective of the laboratory in which the data were obtained.

  6. Modified method of recording and reproducing natural head position with a multicamera system and a laser level.

    PubMed

    Liu, Xiao-jing; Li, Qian-qian; Pang, Yuan-jie; Tian, Kai-yue; Xie, Zheng; Li, Zi-li

    2015-06-01

    As computer-assisted surgical design becomes increasingly popular in maxillofacial surgery, recording patients' natural head position (NHP) and reproducing it in the virtual environment are vital for preoperative design and postoperative evaluation. Our objective was to test the repeatability and accuracy of recording NHP using a multicamera system and a laser level. A laser level was used to project a horizontal reference line on a physical model, and a 3-dimensional image was obtained using a multicamera system. In surgical simulation software, the recorded NHP was reproduced in the virtual head position by registering the coordinate axes with the horizontal reference on both the frontal and lateral views. The repeatability and accuracy of the method were assessed using a gyroscopic procedure as the gold standard. The intraclass correlation coefficients for pitch and roll were 0.982 (0.966, 0.991) and 0.995 (0.992, 0.998), respectively, indicating a high degree of repeatability. Regarding accuracy, the lack of agreement in orientation between the new method and the gold standard was within the range (-0.69°, 1.71°) for pitch and (-0.92°, 1.20°) for roll; these differences have no clinical significance. This method of recording and reproducing NHP with a multicamera system and a laser level is repeatable, accurate, and clinically feasible. Copyright © 2015 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.

  7. Integrated experimental and theoretical approach for the structural characterization of Hg2+ aqueous solutions

    NASA Astrophysics Data System (ADS)

    D'Angelo, Paola; Migliorati, Valentina; Mancini, Giordano; Barone, Vincenzo; Chillemi, Giovanni

    2008-02-01

    The structural and dynamic properties of the solvated Hg2+ ion in aqueous solution have been investigated by a combined experimental-theoretical approach employing x-ray absorption spectroscopy and molecular dynamics (MD) simulations. This method allows one to perform a quantitative analysis of the x-ray absorption near-edge structure (XANES) spectra of ionic solutions using a proper description of the thermal and structural fluctuations. XANES spectra have been computed starting from the MD trajectory, without carrying out any minimization in the structural parameter space. The XANES experimental data are accurately reproduced by a first-shell heptacoordinated cluster only if the second hydration shell is included in the calculations. These results confirm at the same time the existence of a sevenfold first hydration shell for the Hg2+ ion in aqueous solution and the reliability of the potentials used in the MD simulations. The combination of MD and XANES is found to be very helpful to get important new insights into the quantitative estimation of structural properties of disordered systems.

  8. Effect of Initial Conditions on Reproducibility of Scientific Research

    PubMed Central

    Djulbegovic, Benjamin; Hozo, Iztok

    2014-01-01

    Background: It is estimated that about half of currently published research cannot be reproduced. Many reasons have been offered as explanations for failure to reproduce scientific research findings, from fraud to issues related to the design, conduct, analysis, or publishing of scientific research. We also postulate a sensitive dependence on initial conditions, by which small changes can result in large differences in the research findings when reproduction is attempted at later times. Methods: We employed a simple logistic regression equation to model the effect of covariates on the initial study findings. We then fed the input from the logistic equation into a logistic map function to model the stability of the results in repeated experiments over time. We illustrate the approach by modeling effects of different factors on the choice of correct treatment. Results: We found that reproducibility of the study findings depended both on the initial values of all independent variables and on the rate of change in the baseline conditions, the latter being more important. When the rate of change in the baseline conditions between experiments is about 3.5 to 4, no research findings could be reproduced. However, when the rate of change between experiments is ≤2.5, the results become highly predictable between experiments. Conclusions: Many results cannot be reproduced because of changes in the initial conditions between experiments. Better control of the baseline conditions between experiments may help improve the reproducibility of scientific findings. PMID:25132705

  9. Effect of initial conditions on reproducibility of scientific research.

    PubMed

    Djulbegovic, Benjamin; Hozo, Iztok

    2014-06-01

    It is estimated that about half of currently published research cannot be reproduced. Many reasons have been offered as explanations for failure to reproduce scientific research findings, from fraud to issues related to the design, conduct, analysis, or publishing of scientific research. We also postulate a sensitive dependence on initial conditions, by which small changes can result in large differences in the research findings when reproduction is attempted at later times. We employed a simple logistic regression equation to model the effect of covariates on the initial study findings. We then fed the input from the logistic equation into a logistic map function to model the stability of the results in repeated experiments over time. We illustrate the approach by modeling effects of different factors on the choice of correct treatment. We found that reproducibility of the study findings depended both on the initial values of all independent variables and on the rate of change in the baseline conditions, the latter being more important. When the rate of change in the baseline conditions between experiments is about 3.5 to 4, no research findings could be reproduced. However, when the rate of change between experiments is ≤2.5, the results become highly predictable between experiments. Many results cannot be reproduced because of changes in the initial conditions between experiments. Better control of the baseline conditions between experiments may help improve the reproducibility of scientific findings.
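
    The behaviour invoked in these two records is that of the logistic map: the sketch below (with arbitrarily chosen starting values) shows that for a growth parameter of 2.5 two slightly different initial conditions converge to the same fixed point, whereas in the chaotic band between roughly 3.5 and 4 the same small difference is amplified.

        # Logistic map illustration of sensitivity to initial conditions.
        def iterate_logistic(r: float, x0: float, n: int) -> float:
            x = x0
            for _ in range(n):
                x = r * x * (1.0 - x)
            return x

        for r in (2.5, 3.9):
            a = iterate_logistic(r, 0.200, 60)
            b = iterate_logistic(r, 0.201, 60)   # slightly perturbed starting value
            print(f"r = {r}: x_a = {a:.4f}, x_b = {b:.4f}, |difference| = {abs(a - b):.4f}")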

  10. Language-Agnostic Reproducible Data Analysis Using Literate Programming.

    PubMed

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators in the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is open-source software available at github.com/borisvassilev/lir.

  11. Language-Agnostic Reproducible Data Analysis Using Literate Programming

    PubMed Central

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators in the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is open-source software available at github.com/borisvassilev/lir. PMID:27711123

  12. Reproducibility of ad libitum energy intake with the use of a computerized vending machine system

    PubMed Central

    Votruba, Susanne B; Franks, Paul W; Krakoff, Jonathan; Salbe, Arline D

    2010-01-01

    Background: Accurate assessment of energy intake is difficult but critical for the evaluation of eating behavior and intervention effects. Consequently, methods to assess ad libitum energy intake under controlled conditions have been developed. Objective: Our objective was to evaluate the reproducibility of ad libitum energy intake with the use of a computerized vending machine system. Design: Twelve individuals (mean ± SD: 36 ± 8 y old; 41 ± 8% body fat) consumed a weight-maintaining diet for 3 d; subsequently, they self-selected all food with the use of a computerized vending machine system for an additional 3 d. Mean daily energy intake was calculated from the actual weight of foods consumed and expressed as a percentage of weight-maintenance energy needs (%WMEN). Subjects repeated the study multiple times during 2 y. The within-person reproducibility of energy intake was determined through the calculation of the intraclass correlation coefficients (ICCs) between visits. Results: Daily energy intake for all subjects was 5020 ± 1753 kcal during visit 1 and 4855 ± 1615 kcal during visit 2. There were no significant associations between energy intake and body weight, body mass index, or percentage body fat while subjects used the vending machines, which indicates that intake was not driven by body size or need. Despite overconsumption (%WMEN = 181 ± 57%), the reproducibility of intake between visits, whether expressed as daily energy intake (ICC = 0.90), %WMEN (ICC = 0.86), weight of food consumed (ICC = 0.87), or fat intake (g/d; ICC = 0.87), was highly significant (P < 0.0001). Conclusion: Although ad libitum energy intake exceeded %WMEN, the within-person reliability of this intake across multiple visits was high, which makes this a reproducible method for the measurement of ad libitum intake in subjects who reside in a research unit. This trial was registered at clinicaltrials.gov as NCT00342732. PMID:19923376

  13. Spatial mapping and statistical reproducibility of an array of 256 one-dimensional quantum wires

    NASA Astrophysics Data System (ADS)

    Al-Taie, H.; Smith, L. W.; Lesage, A. A. J.; See, P.; Griffiths, J. P.; Beere, H. E.; Jones, G. A. C.; Ritchie, D. A.; Kelly, M. J.; Smith, C. G.

    2015-08-01

    We utilize a multiplexing architecture to measure the conductance properties of an array of 256 split gates. We investigate the reproducibility of the pinch-off and one-dimensional definition voltages as a function of spatial location on two different cooldowns, and after illuminating the device. The reproducibility of both these properties on the two cooldowns is high, the result of the density of the two-dimensional electron gas returning to a similar state after thermal cycling. The spatial variation of the pinch-off voltage reduces after illumination; however, the variation of the one-dimensional definition voltage increases due to an anomalous feature in the center of the array. A technique which quantifies the homogeneity of split-gate properties across the array is developed which captures the experimentally observed trends. In addition, the one-dimensional definition voltage is used to probe the density of the wafer at each split gate in the array on a micron scale using a capacitive model.

  14. Multispectral Image Compression for Improvement of Colorimetric and Spectral Reproducibility by Nonlinear Spectral Transform

    NASA Astrophysics Data System (ADS)

    Yu, Shanshan; Murakami, Yuri; Obi, Takashi; Yamaguchi, Masahiro; Ohyama, Nagaaki

    2006-09-01

    The article proposes a multispectral image compression scheme using nonlinear spectral transform for better colorimetric and spectral reproducibility. In the method, we show the reduction of colorimetric error under a defined viewing illuminant and also that spectral accuracy can be improved simultaneously using a nonlinear spectral transform called Labplus, which takes into account the nonlinearity of human color vision. Moreover, we show that the addition of diagonal matrices to Labplus can further preserve the spectral accuracy and has a generalized effect of improving the colorimetric accuracy under other viewing illuminants than the defined one. Finally, we discuss the usage of the first-order Markov model to form the analysis vectors for the higher order channels in Labplus to reduce the computational complexity. We implement a multispectral image compression system that integrates Labplus with JPEG2000 for high colorimetric and spectral reproducibility. Experimental results for a 16-band multispectral image show the effectiveness of the proposed scheme.

  15. Object strength--an accurate measure for small objects that is insensitive to partial volume effects.

    PubMed

    Tofts, P S; Silver, N C; Barker, G J; Gass, A

    2005-07-01

    There are currently four problems in characterising small nonuniform lesions or other objects in Magnetic Resonance images where partial volume effects are significant. Object size is over- or under-estimated; boundaries are often not reproducible; mean object value cannot be measured; and fuzzy borders cannot be accommodated. A new measure, Object Strength, is proposed. This is the sum of all abnormal intensities, above a uniform background value. For a uniform object, this is simply the product of the increase in intensity and the size of the object. Biologically, this could be at least as relevant as existing measures of size or mean intensity. We hypothesise that Object Strength will perform better than traditional area measurements in characterising small objects. In a pilot study, the reproducibility of object strength measurements was investigated using MR images of small multiple sclerosis (MS) lesions. In addition, accuracy was investigated using artificial lesions of known volume (0.3-6.2 ml) and realistic appearance. Reproducibility approached that of area measurements (in 33/90 lesion reports the difference between repeats was less than for area measurements). Total lesion volume was accurate to 0.2%. In conclusion, Object Strength has potential for improved characterisation of small lesions and objects in imaging and possibly spectroscopy.
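
    A minimal sketch of the proposed measure as defined above, using a synthetic image: Object Strength is the sum of above-background intensity over the object, which for a uniform object reduces to the intensity increase multiplied by the object size.

        # Object Strength on a synthetic image: sum of above-background intensity.
        import numpy as np

        image = np.full((64, 64), 100.0)        # uniform background of 100 (arbitrary units)
        image[20:30, 20:28] += 40.0             # a uniform "lesion": +40 over an 80-pixel area
        background = 100.0

        excess = np.clip(image - background, a_min=0.0, a_max=None)
        object_strength = excess.sum()          # = 40 * 80 = 3200 for this uniform object
        print(f"Object Strength = {object_strength:.0f} (intensity units x pixels)")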

  16. REPRODUCIBILITY OF MACULAR PIGMENT OPTICAL DENSITY MEASUREMENT BY TWO-WAVELENGTH AUTOFLUORESCENCE IN A CLINICAL SETTING.

    PubMed

    You, Qi Sheng; Bartsch, Dirk-Uwe G; Espina, Mark; Alam, Mostafa; Camacho, Natalia; Mendoza, Nadia; Freeman, William R

    2016-07-01

    Macular pigment, composed of lutein, zeaxanthin, and meso-zeaxanthin, is postulated to protect against age-related macular degeneration, likely because it filters blue light and has antioxidant properties. Macular pigment optical density (MPOD) is reported to be associated with macular function evaluated by visual acuity and multifocal electroretinogram. Given the importance of macular pigment, reliable and accurate measurement methods are important. The main purpose of this study is to determine the reproducibility of MPOD measurement by the two-wavelength autofluorescence method using scanning laser ophthalmoscopy. Sixty-eight eyes of 39 persons were enrolled in the study, including 11 normal eyes, 16 eyes with wet age-related macular degeneration, 16 eyes with dry age-related macular degeneration, 11 eyes with macular edema due to diabetes mellitus, branch retinal vein occlusion or macular telangiectasia, and 14 eyes with tractional maculopathy, including vitreomacular traction, epiretinal membrane, or macular hole. MPOD was measured with a two-wavelength (488 and 514 nm) autofluorescence method with the Spectralis HRA + OCT after pupil dilation. The measurement was repeated for each eye 10 minutes later. The analysis of variance and Bland-Altman plot were used to assess the reproducibility between the two measurements. The mean MPOD at eccentricities of 1° and 2° was 0.36 ± 0.17 (range: 0.04-0.69) and 0.15 ± 0.08 (range: -0.03 to 0.35) for the first measurement and 0.35 ± 0.17 (range: 0.02-0.68) and 0.15 ± 0.08 (range: -0.01 to 0.33) for the second measurement, respectively. The difference between the 2 measurements was not statistically significant, and the Bland-Altman plot showed 7.4% and 5.9% of points outside the 95% limits of agreement, indicating overall excellent reproducibility. Similarly, there is no significant difference between the first and second measurements of MPOD volume within eccentricities of 1°, 2°, and 6° radius, and the Bland
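
    As a hedged aside on the analysis used in this record, the sketch below computes the Bland-Altman bias and 95% limits of agreement (mean difference ± 1.96 × SD of the paired differences) for hypothetical repeated MPOD readings.

        # Bland-Altman bias and 95% limits of agreement on hypothetical MPOD pairs.
        import numpy as np

        mpod_1 = np.array([0.36, 0.41, 0.28, 0.52, 0.33, 0.47])   # first measurement
        mpod_2 = np.array([0.35, 0.43, 0.27, 0.50, 0.35, 0.46])   # repeat 10 minutes later

        diff = mpod_1 - mpod_2
        bias = diff.mean()
        loa_low = bias - 1.96 * diff.std(ddof=1)
        loa_high = bias + 1.96 * diff.std(ddof=1)
        print(f"bias = {bias:+.3f}, 95% limits of agreement = ({loa_low:+.3f}, {loa_high:+.3f})")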

  17. Highly accurate nephelometric titrimetry.

    PubMed

    Zhan, Xiancheng; Li, Chengrong; Li, Zhiyi; Yang, Xiucen; Zhong, Shuguang; Yi, Tao

    2004-02-01

    A method that accurately indicates the end-point of precipitation reactions by measurement of the relative intensity of the scattered light in the titrate is presented. A new nephelometric titrator with an internal nephelometric sensor has been devised. The operation of the titrator, including the sensor, and the changes in the turbidity of the titrate and in the intensity of the scattered light are described. The accuracy of the nephelometric titrimetry is discussed theoretically. The titration of NaCl with AgNO(3) serves as a model. The relative error as well as the deviation is within 0.2% under the experimental conditions. The applicability of the titrimetry in pharmaceutical analyses, for example of phenytoin sodium and procaine hydrochloride, is generally illustrated. Copyright 2004 Wiley-Liss, Inc. and the American Pharmacists Association

  18. A Machine Learned Classifier That Uses Gene Expression Data to Accurately Predict Estrogen Receptor Status

    PubMed Central

    Bastani, Meysam; Vos, Larissa; Asgarian, Nasimeh; Deschenes, Jean; Graham, Kathryn; Mackey, John; Greiner, Russell

    2013-01-01

    Background Selecting the appropriate treatment for breast cancer requires accurately determining the estrogen receptor (ER) status of the tumor. However, the standard for determining this status, immunohistochemical analysis of formalin-fixed paraffin-embedded samples, suffers from numerous technical and reproducibility issues. Assessment of ER status based on RNA expression can provide more objective, quantitative and reproducible test results. Methods To learn a parsimonious RNA-based classifier of hormone receptor status, we applied a machine learning tool to a training dataset of gene expression microarray data obtained from 176 frozen breast tumors, whose ER status was determined by applying ASCO-CAP guidelines to standardized immunohistochemical testing of formalin-fixed tumor. Results This produced a three-gene classifier that can predict the ER status of a novel tumor, with a cross-validation accuracy of 93.17±2.44%. When applied to an independent validation set and to four other public databases, some on different platforms, this classifier obtained over 90% accuracy in each. In addition, we found that this prediction rule separated the patients' recurrence-free survival curves with a hazard ratio lower than the one based on the IHC analysis of ER status. Conclusions Our efficient and parsimonious classifier lends itself to high-throughput, highly accurate and low-cost RNA-based assessments of ER status, suitable for routine high-throughput clinical use. This analytic method provides a proof-of-principle that may be applicable to developing effective RNA-based tests for other biomarkers and conditions. PMID:24312637
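
    The snippet below is an illustrative, hedged sketch of the general workflow described here (fit a small classifier on a few genes and estimate accuracy by cross-validation), written with scikit-learn on synthetic data; it is not the authors' tool or dataset.

        # Illustrative cross-validated classifier on synthetic "expression" data.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(seed=1)
        n_tumors = 176
        er_status = rng.integers(0, 2, size=n_tumors)                 # 0 = ER-, 1 = ER+
        # three synthetic "genes" whose expression shifts with ER status, plus noise
        expression = rng.normal(size=(n_tumors, 3)) + 1.5 * er_status[:, None]

        clf = LogisticRegression(max_iter=1000)
        scores = cross_val_score(clf, expression, er_status, cv=10)
        print(f"10-fold cross-validation accuracy: {scores.mean():.1%} ± {scores.std():.1%}")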

  19. TU-AB-BRC-05: Creation of a Monte Carlo TrueBeam Model by Reproducing Varian Phase Space Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O’Grady, K; Davis, S; Seuntjens, J

    Purpose: To create a Varian TrueBeam 6 MV FFF Monte Carlo model using BEAMnrc/EGSnrc that accurately reproduces the Varian representative dataset, followed by tuning the model’s source parameters to accurately reproduce in-house measurements. Methods: A BEAMnrc TrueBeam model for 6 MV FFF has been created by modifying a validated 6 MV Varian CL21EX model. Geometric dimensions and materials were adjusted in a trial and error approach to match the fluence and spectra of TrueBeam phase spaces output by the Varian VirtuaLinac. Once the model’s phase space matched Varian’s counterpart using the default source parameters, it was validated to match 10 × 10 cm² Varian representative data obtained with the IBA CC13. The source parameters were then tuned to match in-house 5 × 5 cm² PTW microDiamond measurements. All dose to water simulations included detector models to account for the effects of volume averaging and the non-water equivalence of the chamber materials, allowing for more accurate source parameter selection. Results: The Varian phase space spectra and fluence were matched with excellent agreement. The in-house model’s PDD agreement with CC13 TrueBeam representative data was within 0.9% local percent difference beyond the first 3 mm. Profile agreement at 10 cm depth was within 0.9% local percent difference and 1.3 mm distance-to-agreement in the central axis and penumbra regions, respectively. Once the source parameters were tuned, PDD agreement with microDiamond measurements was within 0.9% local percent difference beyond 2 mm. The microDiamond profile agreement at 10 cm depth was within 0.6% local percent difference and 0.4 mm distance-to-agreement in the central axis and penumbra regions, respectively. Conclusion: An accurate in-house Monte Carlo model of the Varian TrueBeam was achieved independently of the Varian phase space solution and was tuned to in-house measurements. KO acknowledges partial support by the CREATE Medical

  20. Reproducibility in Computational Neuroscience Models and Simulations

    PubMed Central

    McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.

    2016-01-01

    Objective Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: 1. standardized neural simulators, 2. shared computational resources, 3. declarative model descriptors, ontologies and standardized annotations; 4. model sharing repositories and sharing standards. Conclusion A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845

  1. Chimeric Mice with Competent Hematopoietic Immunity Reproduce Key Features of Severe Lassa Fever.

    PubMed

    Oestereich, Lisa; Lüdtke, Anja; Ruibal, Paula; Pallasch, Elisa; Kerber, Romy; Rieger, Toni; Wurr, Stephanie; Bockholt, Sabrina; Pérez-Girón, José V; Krasemann, Susanne; Günther, Stephan; Muñoz-Fontela, César

    2016-05-01

    Lassa fever (LASF) is a highly severe viral syndrome endemic to West African countries. Despite the annual high morbidity and mortality caused by LASF, very little is known about the pathophysiology of the disease. Basic research on LASF has been precluded due to the lack of relevant small animal models that reproduce the human disease. Immunocompetent laboratory mice are resistant to infection with Lassa virus (LASV) and, to date, only immunodeficient mice, or mice expressing human HLA, have shown some degree of susceptibility to experimental infection. Here, transplantation of wild-type bone marrow cells into irradiated type I interferon receptor knockout mice (IFNAR-/-) was used to generate chimeric mice that reproduced important features of severe LASF in humans. This included high lethality, liver damage, vascular leakage and systemic virus dissemination. In addition, this model indicated that T cell-mediated immunopathology was an important component of LASF pathogenesis that was directly correlated with vascular leakage. Our strategy allows easy generation of a suitable small animal model to test new vaccines and antivirals and to dissect the basic components of LASF pathophysiology.

  2. Wavelet SVM in Reproducing Kernel Hilbert Space for hyperspectral remote sensing image classification

    NASA Astrophysics Data System (ADS)

    Du, Peijun; Tan, Kun; Xing, Xiaoshi

    2010-12-01

    Combining Support Vector Machine (SVM) with wavelet analysis, we constructed a wavelet SVM (WSVM) classifier based on wavelet kernel functions in Reproducing Kernel Hilbert Space (RKHS). In conventional kernel theory, SVM is faced with the bottleneck of kernel parameter selection, which results in time-consuming computation and low classification accuracy. The wavelet kernel in RKHS is a kind of multidimensional wavelet function that can approximate arbitrary nonlinear functions. Implications on semiparametric estimation are proposed in this paper. An airborne Operational Modular Imaging Spectrometer II (OMIS II) hyperspectral remote sensing image with 64 bands and Reflective Optics System Imaging Spectrometer (ROSIS) data with 115 bands were used to evaluate the performance and accuracy of the proposed WSVM classifier. The experimental results indicate that the WSVM classifier can obtain the highest accuracy when using the Coiflet kernel function in the wavelet transform. In contrast with some traditional classifiers, including Spectral Angle Mapping (SAM) and Minimum Distance Classification (MDC), and an SVM classifier using the Radial Basis Function kernel, the proposed wavelet SVM classifier using the wavelet kernel function in Reproducing Kernel Hilbert Space clearly improves classification accuracy.
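
    A wavelet kernel of this kind can be plugged into a standard SVM implementation as a callable kernel. The sketch below is a minimal illustration on synthetic data rather than the OMIS II or ROSIS imagery; it uses the commonly cited Morlet-type wavelet kernel K(x, y) = Π_i cos(1.75 u_i) exp(−u_i²/2) with u_i = (x_i − y_i)/a, and the dilation parameter a is an assumed value that would normally be tuned by cross-validation.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC

        def wavelet_kernel(X, Y, a=5.0):
            # Gram matrix of the Morlet-type wavelet kernel with dilation parameter a.
            u = (X[:, None, :] - Y[None, :, :]) / a          # shape: (n_X, n_Y, n_features)
            return np.prod(np.cos(1.75 * u) * np.exp(-0.5 * u ** 2), axis=2)

        # Synthetic stand-in for per-pixel spectra with class labels
        X, y = make_classification(n_samples=300, n_features=8, n_informative=4, random_state=0)
        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

        svm = SVC(kernel=lambda A, B: wavelet_kernel(A, B, a=5.0), C=10.0)
        svm.fit(X_train, y_train)
        print("test accuracy:", svm.score(X_test, y_test))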

  3. Molecular Interactions of the Min Protein System Reproduce Spatiotemporal Patterning in Growing and Dividing Escherichia coli Cells.

    PubMed

    Walsh, James C; Angstmann, Christopher N; Duggin, Iain G; Curmi, Paul M G

    2015-01-01

    Oscillations of the Min protein system are involved in the correct midcell placement of the divisome during Escherichia coli cell division. Based on molecular interactions of the Min system, we formulated a mathematical model that reproduces Min patterning during cell growth and division. Specifically, the increase in the residence time of MinD attached to the membrane as its own concentration increases, is accounted for by dimerisation of membrane-bound MinD and its interaction with MinE. Simulation of this system generates unparalleled correlation between the waveshape of experimental and theoretical MinD distributions, suggesting that the dominant interactions of the physical system have been successfully incorporated into the model. For cells where MinD is fully-labelled with GFP, the model reproduces the stationary localization of MinD-GFP for short cells, followed by oscillations from pole to pole in larger cells, and the transition to the symmetric distribution during cell filamentation. Cells containing a secondary, GFP-labelled MinD display a contrasting pattern. The model is able to account for these differences, including temporary midcell localization just prior to division, by increasing the rate constant controlling MinD ATPase and heterotetramer dissociation. For both experimental conditions, the model can explain how cell division results in an equal distribution of MinD and MinE in the two daughter cells, and accounts for the temperature dependence of the period of Min oscillations. Thus, we show that while other interactions may be present, they are not needed to reproduce the main characteristics of the Min system in vivo.

  4. Accurate lithography simulation model based on convolutional neural networks

    NASA Astrophysics Data System (ADS)

    Watanabe, Yuki; Kimura, Taiki; Matsunawa, Tetsuaki; Nojima, Shigeki

    2017-07-01

    Lithography simulation is an essential technique for today's semiconductor manufacturing process. In order to simulate an entire chip in realistic time, a compact resist model is commonly used because it is fast to evaluate. To obtain an accurate compact resist model, it is necessary to fit a complicated non-linear model function. However, it is difficult to choose an appropriate function manually because there are many options. This paper proposes a new compact resist model using a CNN (convolutional neural network), one of the deep learning techniques. The CNN model makes it possible to determine an appropriate model function and achieve accurate simulation. Experimental results show the CNN model can reduce CD prediction errors by 70% compared with the conventional model.
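
    A compact CNN-based resist model can be sketched as a small regression network that maps an aerial-image intensity patch to a predicted critical dimension (CD) value. The architecture below is entirely hypothetical (layer counts, channel widths, and the 64 × 64 patch size are assumptions, not the network described in the paper) and is shown only to make the idea concrete.

        import torch
        import torch.nn as nn

        class ResistCNN(nn.Module):
            # Toy CNN mapping a 1-channel aerial-image patch to a scalar CD prediction.
            def __init__(self):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1),
                )
                self.head = nn.Linear(32, 1)

            def forward(self, x):
                return self.head(self.features(x).flatten(1))

        model = ResistCNN()
        patches = torch.randn(8, 1, 64, 64)   # a batch of hypothetical aerial-image patches
        print(model(patches).shape)           # torch.Size([8, 1])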

  5. Accurate calculation of mutational effects on the thermodynamics of inhibitor binding to p38α MAP kinase: a combined computational and experimental study.

    PubMed

    Zhu, Shun; Travis, Sue M; Elcock, Adrian H

    2013-07-09

    A major current challenge for drug design efforts focused on protein kinases is the development of drug resistance caused by spontaneous mutations in the kinase catalytic domain. The ubiquity of this problem means that it would be advantageous to develop fast, effective computational methods that could be used to determine the effects of potential resistance-causing mutations before they arise in a clinical setting. With this long-term goal in mind, we have conducted a combined experimental and computational study of the thermodynamic effects of active-site mutations on a well-characterized and high-affinity interaction between a protein kinase and a small-molecule inhibitor. Specifically, we developed a fluorescence-based assay to measure the binding free energy of the small-molecule inhibitor, SB203580, to the p38α MAP kinase and used it to measure the inhibitor's affinity for five different kinase mutants involving two residues (Val38 and Ala51) that contact the inhibitor in the crystal structure of the inhibitor-kinase complex. We then conducted long, explicit-solvent thermodynamic integration (TI) simulations in an attempt to reproduce the experimental relative binding affinities of the inhibitor for the five mutants; in total, a combined simulation time of 18.5 μs was obtained. Two widely used force fields - OPLS-AA/L and Amber ff99SB-ILDN - were tested in the TI simulations. Both force fields produced excellent agreement with experiment for three of the five mutants; simulations performed with the OPLS-AA/L force field, however, produced qualitatively incorrect results for the constructs that contained an A51V mutation. Interestingly, the discrepancies with the OPLS-AA/L force field could be rectified by the imposition of position restraints on the atoms of the protein backbone and the inhibitor without destroying the agreement for other mutations; the ability to reproduce experiment depended, however, upon the strength of the restraints' force constant
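
    For reference, the quantities being compared follow the standard thermodynamic-integration and thermodynamic-cycle relations (written here in generic form; this is textbook formalism rather than a detail taken from the paper):

        \Delta G_{\mathrm{WT}\to\mathrm{mut}} = \int_0^1 \left\langle \frac{\partial H(\lambda)}{\partial \lambda} \right\rangle_{\lambda} \, d\lambda,
        \qquad
        \Delta\Delta G_{\mathrm{bind}} = \Delta G^{\mathrm{complex}}_{\mathrm{WT}\to\mathrm{mut}} - \Delta G^{\mathrm{free}}_{\mathrm{WT}\to\mathrm{mut}}

    where λ couples the wild-type and mutant end states and ΔΔG_bind is the predicted change in the inhibitor's binding free energy upon mutation.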

  6. Accurate Critical Stress Intensity Factor Griffith Crack Theory Measurements by Numerical Techniques

    PubMed Central

    Petersen, Richard C.

    2014-01-01

    Critical stress intensity factor (KIc) has been an approximation for fracture toughness using only load-cell measurements. However, artificial man-made cracks several orders of magnitude longer and wider than natural flaws have required a correction factor term (Y) that can be up to about 3 times the recorded experimental value [1-3]. In fact, over 30 years ago a National Academy of Sciences advisory board stated that empirical KIc testing was of serious concern and further requested that an accurate bulk fracture toughness method be found [4]. Now that fracture toughness can be calculated accurately by numerical integration from the load/deflection curve as resilience, work of fracture (WOF) and strain energy release (SIc) [5, 6], KIc appears to be unnecessary. However, the large body of previous KIc experimental test results found in the literature offers the opportunity for continued meta analysis with other more practical and accurate fracture toughness results using energy methods and numerical integration. Therefore, KIc is derived from the classical Griffith Crack Theory [6] to include SIc as a more accurate term for strain energy release rate (𝒢Ic), along with crack surface energy (γ), crack length (a), modulus (E), applied stress (σ), Y, crack-tip plastic zone defect region (rp) and yield strength (σys) that can all be determined from load and deflection data. Polymer matrix discontinuous quartz fiber-reinforced composites were prepared for flexural mechanical testing to accentuate toughness differences, comprising 3 mm fibers at different volume percentages from 0-54.0 vol% and at 28.2 vol% with different fiber lengths from 0.0-6.0 mm. Results provided a new correction factor and regression analyses between several numerical integration fracture toughness test methods to support KIc results. Further, accurate bulk KIc experimental values are compared with empirical test results found in the literature. Also, several fracture toughness mechanisms
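
    The classical relations referred to above can be summarized as follows (plane-stress forms, using the symbols defined in the abstract; the Irwin estimate of the plastic zone size is included as one common convention, not a result quoted from the paper):

        K_{Ic} = Y\,\sigma\sqrt{\pi a}, \qquad
        \mathcal{G}_{Ic} = \frac{K_{Ic}^{2}}{E} = 2\gamma, \qquad
        r_{p} = \frac{1}{2\pi}\left(\frac{K_{Ic}}{\sigma_{ys}}\right)^{2}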

  7. How to Write a Reproducible Paper

    NASA Astrophysics Data System (ADS)

    Irving, D. B.

    2016-12-01

    The geosciences have undergone a computational revolution in recent decades, to the point where almost all modern research relies heavily on software and code. Despite this profound change in the research methods employed by geoscientists, the reporting of computational results has changed very little in academic journals. This lag has led to something of a reproducibility crisis, whereby it is impossible to replicate and verify most of today's published computational results. While it is tempting to decry the slow response of journals and funding agencies in the face of this crisis, there are very few examples of reproducible research upon which to base new communication standards. In an attempt to address this deficiency, this presentation will describe a procedure for reporting computational results that was employed in a recent Journal of Climate paper. The procedure was developed to be consistent with recommended computational best practices and seeks to minimize the time burden on authors, which has been identified as the most important barrier to publishing code. It should provide a starting point for geoscientists looking to publish reproducible research, and could be adopted by journals as a formal minimum communication standard.

  8. Tools and techniques for computational reproducibility.

    PubMed

    Piccolo, Stephen R; Frampton, Michael B

    2016-07-11

    When reporting research findings, scientists document the steps they followed so that others can verify and build upon the research. When those steps have been described in sufficient detail that others can retrace the steps and obtain similar results, the research is said to be reproducible. Computers play a vital role in many research disciplines and present both opportunities and challenges for reproducibility. Computers can be programmed to execute analysis tasks, and those programs can be repeated and shared with others. The deterministic nature of most computer programs means that the same analysis tasks, applied to the same data, will often produce the same outputs. However, in practice, computational findings often cannot be reproduced because of complexities in how software is packaged, installed, and executed, and because of limitations associated with how scientists document analysis steps. Many tools and techniques are available to help overcome these challenges; here we describe seven such strategies. With a broad scientific audience in mind, we describe the strengths and limitations of each approach, as well as the circumstances under which each might be applied. No single strategy is sufficient for every scenario; thus we emphasize that it is often useful to combine approaches.

  9. Experimental validation of the TOPAS Monte Carlo system for passive scattering proton therapy

    PubMed Central

    Testa, M.; Schümann, J.; Lu, H.-M.; Shin, J.; Faddegon, B.; Perl, J.; Paganetti, H.

    2013-01-01

    nuclear interaction models used in the simulations. Results: SOBPs’ range and modulation width were reproduced, on average, with an accuracy of +1, −2 and ±3 mm, respectively. OF simulations reproduced measured data within ±3%. Simulated 2D dose-profiles show field flatness and average field radius within ±3% of measured profiles. The field symmetry resulted, on average, in ±3% agreement with commissioned profiles. TOPAS accuracy in reproducing measured dose profiles downstream of the half beam shifter is better than 2%. Dose rate function simulation reproduced the measurements within ∼2%, showing that the four-dimensional modeling of the passive modulation system was implemented correctly and millimeter accuracy can be achieved in reproducing measured data. For MLFCs simulations, 2% agreement was found between TOPAS and both sets of experimental measurements. The overall results show that TOPAS simulations are within the clinically accepted tolerances for all QA measurements performed at our institution. Conclusions: Our Monte Carlo simulations accurately reproduced the experimental data acquired through all the measurements performed in this study. Thus, TOPAS can reliably be applied to quality assurance for proton therapy and also as an input for commissioning of commercial treatment planning systems. This work also provides the basis for routine clinical dose calculations in patients for all passive scattering proton therapy centers using TOPAS. PMID:24320505

  10. The determination of accurate dipole polarizabilities alpha and gamma for the noble gases

    NASA Technical Reports Server (NTRS)

    Rice, Julia E.; Taylor, Peter R.; Lee, Timothy J.; Almlof, Jan

    1991-01-01

    Accurate static dipole polarizabilities alpha and gamma of the noble gases He through Xe were determined using wave functions of similar quality for each system. Good agreement with experimental data for the static polarizability gamma was obtained for Ne and Xe, but not for Ar and Kr. Calculations suggest that the experimental values for these latter gases are too low.

  11. Extended Eden model reproduces growth of an acellular slime mold.

    PubMed

    Wagner, G; Halvorsrud, R; Meakin, P

    1999-11-01

    A stochastic growth model was used to simulate the growth of the acellular slime mold Physarum polycephalum on substrates where the nutrients were confined in separate drops. Growth of Physarum on such substrates was previously studied experimentally and found to produce a range of different growth patterns [Phys. Rev. E 57, 941 (1998)]. The model represented the aging of cluster sites and differed from the original Eden model in that the occupation probability of perimeter sites depended on the time of occupation of adjacent cluster sites. This feature led to a bias in the selection of growth directions. A moderate degree of persistence was found to be crucial to reproduce the biological growth patterns under various conditions. Persistence in growth combined quick propagation in heterogeneous environments with a high probability of locating sources of nutrients.
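
    The aging rule can be captured in a compact lattice simulation. The sketch below is a schematic reimplementation under assumed parameters (lattice size, number of growth steps, and the persistence exponent are illustrative, and the exact weighting used in the published model may differ): empty perimeter sites are occupied with a probability that increases with how recently their newest occupied neighbour was added, which biases the selection of growth directions.

        import numpy as np

        rng = np.random.default_rng(1)
        L, steps, persistence = 81, 800, 2.0
        occupied_at = np.full((L, L), -1.0)     # occupation time of each site; -1 means empty
        occupied_at[L // 2, L // 2] = 0.0       # seed cluster in the centre

        def neighbours(i, j):
            return [(i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)]

        for t in range(1, steps):
            # Weight each empty perimeter site by the "youth" of its newest occupied neighbour.
            perimeter = {}
            for i, j in np.argwhere(occupied_at >= 0):
                w = (1.0 + occupied_at[i, j]) ** persistence
                for ni, nj in neighbours(i, j):
                    if 0 <= ni < L and 0 <= nj < L and occupied_at[ni, nj] < 0:
                        perimeter[(ni, nj)] = max(perimeter.get((ni, nj), 0.0), w)
            sites = list(perimeter)
            weights = np.array([perimeter[s] for s in sites])
            chosen = sites[rng.choice(len(sites), p=weights / weights.sum())]
            occupied_at[chosen] = float(t)

        print("cluster size:", int((occupied_at >= 0).sum()))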

  12. Extended Eden model reproduces growth of an acellular slime mold

    NASA Astrophysics Data System (ADS)

    Wagner, Geri; Halvorsrud, Ragnhild; Meakin, Paul

    1999-11-01

    A stochastic growth model was used to simulate the growth of the acellular slime mold Physarum polycephalum on substrates where the nutrients were confined in separate drops. Growth of Physarum on such substrates was previously studied experimentally and found to produce a range of different growth patterns [Phys. Rev. E 57, 941 (1998)]. The model represented the aging of cluster sites and differed from the original Eden model in that the occupation probability of perimeter sites depended on the time of occupation of adjacent cluster sites. This feature led to a bias in the selection of growth directions. A moderate degree of persistence was found to be crucial to reproduce the biological growth patterns under various conditions. Persistence in growth combined quick propagation in heterogeneous environments with a high probability of locating sources of nutrients.

  13. Compared to X-ray, three-dimensional computed tomography measurement is a reproducible radiographic method for normal proximal humerus.

    PubMed

    Jia, Xiaoyang; Chen, Yanxi; Qiang, Minfei; Zhang, Kun; Li, Haobo; Jiang, Yuchen; Zhang, Yijie

    2016-07-15

    Accurate comprehension of the normal humeral morphology is crucial for anatomical reconstruction in shoulder arthroplasty. However, traditional morphological measurements of the humerus were mainly based on cadavers and radiography. The purpose of this study was to provide a series of precise and repeatable parameters of the normal proximal humerus for arthroplasty, based on three-dimensional (3-D) measurements. Radiographic and 3-D computed tomography (CT) measurements of the proximal humerus were performed in a sample of 120 consecutive adults. Sex differences, differences between the two image modalities, and correlations of the parameters were evaluated. Intra- and inter-observer reproducibility was evaluated using intraclass correlation coefficients (ICCs). In the male group, all parameters except the neck-shaft angle of the humerus, based on 3-D CT images, were greater than those in the female group (P < 0.05). All variables were significantly different between the two image modalities (P < 0.05). In the 3-D CT measurement, all parameters except the neck-shaft angle were correlated with each other (P < 0.001), particularly between the two diameters of the humeral head (r = 0.907). All parameters in the 3-D CT measurement had excellent reproducibility (ICC range, 0.878 to 0.936), which was higher than that in the radiographs (ICC range, 0.741 to 0.858). The present study suggested that 3-D CT was more reproducible than plain radiography in the assessment of the morphology of the normal proximal humerus. Therefore, this reproducible modality could be utilized in preoperative planning. Our data could serve as an effective guideline for humeral component selection and improve the design of shoulder prostheses.

  14. Reproducibility of Heart Rate Variability Is Parameter and Sleep Stage Dependent.

    PubMed

    Herzig, David; Eser, Prisca; Omlin, Ximena; Riener, Robert; Wilhelm, Matthias; Achermann, Peter

    2017-01-01

    Objective: Measurements of heart rate variability (HRV) during sleep have become increasingly popular as sleep could provide an optimal state for HRV assessments. While sleep stages have been reported to affect HRV, the effect of sleep stages on the variance of HRV parameters has hardly been investigated. We aimed to assess the variance of HRV parameters during the different sleep stages. Further, we tested the accuracy of an algorithm using HRV to identify a 5-min segment within an episode of slow wave sleep (SWS, deep sleep). Methods: Polysomnographic (PSG) sleep recordings of 3 nights of 15 healthy young males were analyzed. Sleep was scored according to conventional criteria. HRV parameters of consecutive 5-min segments were analyzed within the different sleep stages. The total variance of HRV parameters was partitioned into between-subjects variance, between-nights variance, and between-segments variance and compared between the different sleep stages. Intra-class correlation coefficients of all HRV parameters were calculated for all sleep stages. To identify an SWS segment based on HRV, Pearson correlation coefficients of consecutive R-R intervals (rRR) were computed for moving 5-min windows (20-s steps). The linear trend was removed from the rRR time series and the first segment with rRR values 0.1 units below the mean rRR for at least 10 min was identified. A 5-min segment was placed in the middle of such an identified segment and the corresponding sleep stage was used to assess the accuracy of the algorithm. Results: Good reproducibility within and across nights was found for heart rate in all sleep stages and for high frequency (HF) power in SWS. Reproducibility of low frequency (LF) power and of LF/HF was poor in all sleep stages. Of all the 5-min segments selected based on HRV data, 87% were accurately located within SWS. Conclusions: SWS, a stable state that, in contrast to waking, is unaffected by internal and external factors, is a reproducible state that allows
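
    The segment-selection algorithm can be sketched directly from this description. The code below is an illustrative reimplementation: the 5-min window, 20-s step, 0.1-unit threshold and 10-min duration come from the text, while the linear detrending choice, sampling details, and variable names are assumptions, and the input is a hypothetical R-R interval series rather than the study recordings.

        import numpy as np

        def rrr_series(t_beats, rr, win=300.0, step=20.0):
            # Pearson correlation of consecutive R-R intervals (rRR) in moving
            # 5-min windows advanced in 20-s steps.
            centres, rrr = [], []
            start, t_end = t_beats[0], t_beats[-1]
            while start + win <= t_end:
                x = rr[(t_beats >= start) & (t_beats < start + win)]
                if x.size > 10:
                    centres.append(start + win / 2)
                    rrr.append(np.corrcoef(x[:-1], x[1:])[0, 1])
                start += step
            return np.asarray(centres), np.asarray(rrr)

        def pick_sws_segment(centres, rrr, depth=0.1, min_dur=600.0):
            # Remove the linear trend, find the first run of at least 10 min with
            # rRR 0.1 units below the mean, and centre a 5-min segment in it.
            detrended = rrr - np.polyval(np.polyfit(centres, rrr, 1), centres)
            below = detrended < detrended.mean() - depth
            run_start = None
            for i, flag in enumerate(below):
                if flag and run_start is None:
                    run_start = i
                elif not flag and run_start is not None:
                    if centres[i - 1] - centres[run_start] >= min_dur:
                        return 0.5 * (centres[run_start] + centres[i - 1])
                    run_start = None
            if run_start is not None and centres[-1] - centres[run_start] >= min_dur:
                return 0.5 * (centres[run_start] + centres[-1])
            return None

        # Hypothetical overnight R-R series (seconds); real input would come from the PSG recording.
        rng = np.random.default_rng(0)
        rr = 0.9 + 0.05 * rng.standard_normal(30000)
        t_beats = np.cumsum(rr)
        centres, rrr = rrr_series(t_beats, rr)
        print(pick_sws_segment(centres, rrr))   # returns None when no sustained rRR drop is present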

  15. Reproducibility of Heart Rate Variability Is Parameter and Sleep Stage Dependent

    PubMed Central

    Herzig, David; Eser, Prisca; Omlin, Ximena; Riener, Robert; Wilhelm, Matthias; Achermann, Peter

    2018-01-01

    Objective: Measurements of heart rate variability (HRV) during sleep have become increasingly popular as sleep could provide an optimal state for HRV assessments. While sleep stages have been reported to affect HRV, the effect of sleep stages on the variance of HRV parameters has hardly been investigated. We aimed to assess the variance of HRV parameters during the different sleep stages. Further, we tested the accuracy of an algorithm using HRV to identify a 5-min segment within an episode of slow wave sleep (SWS, deep sleep). Methods: Polysomnographic (PSG) sleep recordings of 3 nights of 15 healthy young males were analyzed. Sleep was scored according to conventional criteria. HRV parameters of consecutive 5-min segments were analyzed within the different sleep stages. The total variance of HRV parameters was partitioned into between-subjects variance, between-nights variance, and between-segments variance and compared between the different sleep stages. Intra-class correlation coefficients of all HRV parameters were calculated for all sleep stages. To identify an SWS segment based on HRV, Pearson correlation coefficients of consecutive R-R intervals (rRR) were computed for moving 5-min windows (20-s steps). The linear trend was removed from the rRR time series and the first segment with rRR values 0.1 units below the mean rRR for at least 10 min was identified. A 5-min segment was placed in the middle of such an identified segment and the corresponding sleep stage was used to assess the accuracy of the algorithm. Results: Good reproducibility within and across nights was found for heart rate in all sleep stages and for high frequency (HF) power in SWS. Reproducibility of low frequency (LF) power and of LF/HF was poor in all sleep stages. Of all the 5-min segments selected based on HRV data, 87% were accurately located within SWS. Conclusions: SWS, a stable state that, in contrast to waking, is unaffected by internal and external factors, is a reproducible state that allows

  16. An experimental system for symmetric capacitive rf discharge studies

    NASA Astrophysics Data System (ADS)

    Godyak, V. A.; Piejak, R. B.; Alexandrovich, B. M.

    1990-09-01

    An experimental system has been designed and built to comprehensively study the electrical and plasma characteristics in symmetric capacitively coupled rf discharges at low gas pressures. Descriptions of the system concept, the discharge chamber, the vacuum-gas control system, and the rf matching and electrical measurement system are presented together with some results of electrical measurements carried out in an argon discharge at 13.56 MHz. The system has been specifically designed to facilitate external discharge parameter measurements and probe measurements and to be compatible with a wide variety of other diagnostics. External electrical measurements and probe measurements within the discharge show that it is an ideal vehicle to study low-pressure rf discharge physics. Measurements from this system should be comparable to one-dimensional rf symmetric capacitive discharge theories and may help to verify them. Although only a few results are given here, the system has been operated reliably over a wide range of gas pressures and should give reproducible and accurate results for discharge electrical characteristics and plasma parameters over a wide range of driving frequency and gas components.

  17. Show and tell: disclosure and data sharing in experimental pathology.

    PubMed

    Schofield, Paul N; Ward, Jerrold M; Sundberg, John P

    2016-06-01

    Reproducibility of data from experimental investigations using animal models is increasingly under scrutiny because of the potentially negative impact of poor reproducibility on the translation of basic research. Histopathology is a key tool in biomedical research, in particular for the phenotyping of animal models to provide insights into the pathobiology of diseases. Failure to disclose and share crucial histopathological experimental details compromises the validity of the review process and reliability of the conclusions. We discuss factors that affect the interpretation and validation of histopathology data in publications and the importance of making these data accessible to promote replicability in research. © 2016. Published by The Company of Biologists Ltd.

  18. Reproducible Bioinformatics Research for Biologists

    USDA-ARS?s Scientific Manuscript database

    This book chapter describes the current Big Data problem in Bioinformatics and the resulting issues with performing reproducible computational research. The core of the chapter provides guidelines and summaries of current tools/techniques that a noncomputational researcher would need to learn to pe...

  19. Shear wave elastography for breast masses is highly reproducible.

    PubMed

    Cosgrove, David O; Berg, Wendie A; Doré, Caroline J; Skyba, Danny M; Henry, Jean-Pierre; Gay, Joel; Cohen-Bacrie, Claude

    2012-05-01

    To evaluate intra- and interobserver reproducibility of shear wave elastography (SWE) for breast masses. For intraobserver reproducibility, each observer obtained three consecutive SWE images of 758 masses that were visible on ultrasound. 144 (19%) were malignant. Weighted kappa was used to assess the agreement of qualitative elastographic features; the reliability of quantitative measurements was assessed by intraclass correlation coefficients (ICC). For the interobserver reproducibility, a blinded observer reviewed images and agreement on features was determined. Mean age was 50 years; mean mass size was 13 mm. Qualitatively, SWE images were at least reasonably similar for 666/758 (87.9%). Intraclass correlation for SWE diameter, area and perimeter was almost perfect (ICC ≥ 0.94). Intraobserver reliability for maximum and mean elasticity was almost perfect (ICC = 0.84 and 0.87) and was substantial for the ratio of mass-to-fat elasticity (ICC = 0.77). Interobserver agreement was moderate for SWE homogeneity (κ = 0.57), substantial for qualitative colour assessment of maximum elasticity (κ = 0.66), fair for SWE shape (κ = 0.40), fair for B-mode mass margins (κ = 0.38), and moderate for B-mode mass shape (κ = 0.58), orientation (κ = 0.53) and BI-RADS assessment (κ = 0.59). SWE is highly reproducible for assessing elastographic features of breast masses within and across observers. SWE interpretation is at least as consistent as that of BI-RADS ultrasound B-mode features. • Shear wave ultrasound elastography can measure the stiffness of breast tissue • It provides a qualitatively and quantitatively interpretable colour-coded map of tissue stiffness • Intraobserver reproducibility of SWE is almost perfect while interobserver reproducibility of SWE proved to be moderate to substantial • The most reproducible SWE features between observers were SWE image homogeneity and maximum elasticity.

  20. Reproducibility of the Online Food4Me Food-Frequency Questionnaire for Estimating Dietary Intakes across Europe.

    PubMed

    Marshall, Steven J; Livingstone, Katherine M; Celis-Morales, Carlos; Forster, Hannah; Fallaize, Rosalind; O'Donovan, Clare B; Woolhead, Clara; Marsaux, Cyril Fm; Macready, Anna L; Navas-Carretero, Santiago; San-Cristobal, Rodrigo; Kolossa, Silvia; Tsirigoti, Lydia; Lambrinou, Christina P; Moschonis, George; Godlewska, Magdalena; Surwiłło, Agnieszka; Drevon, Christian A; Manios, Yannis; Traczyk, Iwona; Martínez, J Alfredo; Saris, Wim H; Daniel, Hannelore; Gibney, Eileen R; Brennan, Lorraine; Walsh, Marianne C; Lovegrove, Julie A; Gibney, Mike; Mathers, John C

    2016-05-01

    Accurate dietary assessment is key to understanding nutrition-related outcomes and is essential for estimating dietary change in nutrition-based interventions. The objective of this study was to assess the pan-European reproducibility of the Food4Me food-frequency questionnaire (FFQ) in assessing the habitual diet of adults. Participants from the Food4Me study, a 6-mo, Internet-based, randomized controlled trial of personalized nutrition conducted in the United Kingdom, Ireland, Spain, Netherlands, Germany, Greece, and Poland, were included. Screening and baseline data (both collected before commencement of the intervention) were used in the present analyses, and participants were included only if they completed FFQs at screening and at baseline within a 1-mo timeframe before the commencement of the intervention. Sociodemographic (e.g., sex and country) and lifestyle [e.g., body mass index (BMI, in kg/m(2)) and physical activity] characteristics were collected. Linear regression, correlation coefficients, concordance (percentage) in quartile classification, and Bland-Altman plots for daily intakes were used to assess reproducibility. In total, 567 participants (59% female), with a mean ± SD age of 38.7 ± 13.4 y and BMI of 25.4 ± 4.8, completed both FFQs within 1 mo (mean ± SD: 19.2 ± 6.2 d). Exact plus adjacent classification of total energy intake in participants was highest in Ireland (94%) and lowest in Poland (81%). Spearman correlation coefficients (ρ) in total energy intake between FFQs ranged from 0.50 for obese participants to 0.68 and 0.60 in normal-weight and overweight participants, respectively. Bland-Altman plots showed a mean difference between FFQs of 210 kcal/d, with the agreement deteriorating as energy intakes increased. There was little variation in reproducibility of total energy intakes between sex and age groups. The online Food4Me FFQ was shown to be reproducible across 7 European countries when administered within a 1-mo period to a

  1. An Experimentally-Supported Genome-Scale Metabolic Network Reconstruction for Yersinia pestis CO92

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charusanti, Pep; Chauhan, Sadhana; Mcateer, Kathleen

    2011-10-13

    Yersinia pestis is a gram-negative bacterium that causes plague, a disease linked historically to the Black Death in Europe during the Middle Ages and to several outbreaks during the modern era. Metabolism in Y. pestis displays remarkable flexibility and robustness, allowing the bacterium to proliferate in both warm-blooded mammalian hosts and cold-blooded insect vectors such as fleas. Here we report a genome-scale reconstruction and mathematical model of metabolism for Y. pestis CO92 and supporting experimental growth and metabolite measurements. The model contains 815 genes, 678 proteins, 963 unique metabolites and 1678 reactions, accurately simulates growth on a range of carbon sources both qualitatively and quantitatively, and identifies gaps in several key biosynthetic pathways and suggests how those gaps might be filled. Furthermore, our model presents hypotheses to explain certain known nutritional requirements characteristic of this strain. Y. pestis continues to be a dangerous threat to human health during modern times. The Y. pestis genome-scale metabolic reconstruction presented here, which has been benchmarked against experimental data and correctly reproduces known phenotypes, thus provides an in silico platform with which to investigate the metabolism of this important human pathogen.
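
    Growth simulations with a reconstruction of this kind are conventionally posed as a flux-balance linear program; stated generically (this is the standard constraint-based formulation, not a detail quoted from the paper), with S the stoichiometric matrix, v the vector of reaction fluxes, c selecting the biomass reaction, and lb/ub the flux bounds:

        \max_{v} \; c^{\top} v \quad \text{subject to} \quad S\,v = 0, \qquad v^{\mathrm{lb}} \le v \le v^{\mathrm{ub}}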

  2. [The reproducibility of multifocal ERG recordings].

    PubMed

    Meigen, T; Friedrich, A

    2002-09-01

    Multifocal electroretinogram recordings (mfERG) can be used to detect a local dysfunction of the retina. In this study we tested both the intrasessional and inter-sessional reproducibility of mfERG amplitudes. MfERGs from 6 eyes of 6 normal subjects were recorded on two different days using DTL electrodes. The relative coefficient of variation (RCV) was used to quantify the amplitude reproducibility. We tested the effect of (a) session (inter- vs. intrasessional), (b) recording duration (7.3 vs. 3.6 min), (c) trace type (hexagon traces vs. ring averages), and (d) amplitude definition (peak-trough analysis vs. scalar product) on RCV. RCV was 6.5 ± 0.4% (mean ± SEM, n=96) when averaged across all recording conditions and all subjects. The ANOVA showed a significant difference (p=0.018) between hexagon traces and ring averages. Another significant effect (p=0.016) occurred for the interaction of (a) and (b). MfERGs can be recorded with a high degree of reproducibility even for short recording durations and single hexagon traces. As the factor (a) did not show a significant effect, the new placement of the DTL electrode in the second session does not necessarily increase the retest variability compared to a second recording within the same session.

  3. Acquisition of reproducible transmission near-infrared (NIR) spectra of solid samples with inconsistent shapes by irradiation with isotropically diffused radiation using polytetrafluoroethylene (PTFE) beads.

    PubMed

    Lee, Jinah; Duy, Pham Khac; Yoon, Jihye; Chung, Hoeil

    2014-06-21

    A bead-incorporated transmission scheme (BITS) has been demonstrated for collecting reproducible transmission near-infrared (NIR) spectra of samples with inconsistent shapes. Isotropically diffused NIR radiation was applied around a sample and the surrounding radiation was allowed to interact homogeneously with the sample for transmission measurement. Samples were packed in 1.40 mm polytetrafluoroethylene (PTFE) beads, ideal diffusers without NIR absorption, and then transmission spectra were collected by illuminating the sample-containing beads using NIR radiation. When collimated radiation was directly applied, a small portion of the non-fully diffused radiation (NFDR) propagated through the void space of the packing and eventually degraded the reproducibility. Pre-diffused radiation was introduced by placing an additional PTFE disk in front of the packing to diminish NFDR, which produced more reproducible spectral features. The proposed scheme was evaluated by analyzing two different solid samples: density determination for individual polyethylene (PE) pellets and identification of mining locality for tourmalines. Because spectral collection was reproducible, the use of the spectrum acquired from one PE pellet was sufficient to accurately determine the density of nine other pellets with different shapes. The differentiation of tourmalines, which are even more dissimilar in appearance, according to their mining locality was also feasible with the help of the scheme.

  4. High-Q and highly reproducible microdisks and microlasers.

    PubMed

    Zhang, Nan; Wang, Yujie; Sun, Wenzhao; Liu, Shuai; Huang, Can; Jiang, Xiaoshun; Xiao, Min; Xiao, Shumin; Song, Qinghai

    2018-01-25

    High quality (Q) factor microdisks are fundamental building blocks of on-chip integrated photonic circuits and biological sensors. The resonant modes in microdisks circulate near their boundaries, making their performances strongly dependent upon surface roughness. Surface-tension-induced microspheres and microtoroids are superior to other dielectric microdisks when comparing Q factors. However, most photonic materials such as silicon and negative photoresists are hard to reflow, and thus the realization of high-Q microdisks is strongly dependent on electron-beam lithography. Herein, we demonstrate a robust, cost-effective, and highly reproducible technique to fabricate ultrahigh-Q microdisks. By using silica microtoroids as masks, we have successfully replicated their ultrasmooth boundaries in a photoresist via anisotropic dry etching. The experimentally recorded Q factors of passive microdisks can be as large as 1.5 × 10⁶. Similarly, ultrahigh-Q microdisk lasers have also been replicated in dye-doped polymeric films. The laser linewidth is only 8 pm, which is limited by the spectrometer and is much narrower than that in previous reports. Meanwhile, high-Q deformed microdisks have also been fabricated by controlling the shape of microtoroids, making the internal ray dynamics and external directional laser emissions controllable. Interestingly, this technique also applies to other materials. Silicon microdisks with Q > 10⁶ have been experimentally demonstrated with a similar process. We believe this research will be important for the advances of high-Q micro-resonators and their applications.

  5. Reproducibility and Accuracy of Quantitative Myocardial Blood Flow Using 82Rb-PET: Comparison with 13N-Ammonia

    PubMed Central

    Fakhri, Georges El

    2011-01-01

    =0.843) and stress (r2=0.761). The Bland-Altman plots show no significant presence of proportional error at rest or stress, nor a dependence of the variations on the amplitude of the myocardial blood flow at rest or stress. A small systematic overestimation of 13N-ammonia MBF was observed with 82Rb at rest (0.129 ml/g/min) and the opposite, i.e., underestimation, at stress (0.22 ml/g/min). Conclusions Our results show that absolute quantitation of myocardial blood flow is reproducible and accurate with 82Rb dynamic cardiac PET as compared to 13N-ammonia. The reproducibility of the quantitation approach itself was very good as well as inter-observer reproducibility. PMID:19525467

  6. A method to reproduce alpha-particle spectra measured with semiconductor detectors.

    PubMed

    Timón, A Fernández; Vargas, M Jurado; Sánchez, A Martín

    2010-01-01

    A method is proposed to reproduce alpha-particle spectra measured with silicon detectors, combining analytical and computer simulation techniques. The procedure includes the use of the Monte Carlo method to simulate the tracks of alpha-particles within the source and in the detector entrance window. The alpha-particle spectrum is finally obtained by the convolution of this simulated distribution and the theoretical distributions representing the contributions of the alpha-particle spectrometer to the spectrum. Experimental spectra from (233)U and (241)Am sources were compared with the predictions given by the proposed procedure, showing good agreement. The proposed method can be an important aid for the analysis and deconvolution of complex alpha-particle spectra. Copyright 2009 Elsevier Ltd. All rights reserved.
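
    The final convolution step can be illustrated with a short sketch. The Gaussian-plus-exponential-tail line shape used below is a commonly used analytic form assumed here for illustration, and the "Monte Carlo" distribution is a hypothetical stand-in for the simulated energy-loss spectrum rather than output of the authors' code.

        import numpy as np

        e = np.linspace(5200.0, 5600.0, 2001)     # energy grid in keV (hypothetical)
        de = e[1] - e[0]

        # Stand-in for the Monte Carlo energy-loss distribution of a thin alpha source
        mc = np.exp(-0.5 * ((e - 5485.6) / 3.0) ** 2)
        mc /= mc.sum() * de

        # Spectrometer response: Gaussian broadening convolved with an exponential low-energy tail
        sigma, tau = 6.0, 15.0
        x = e - e.mean()
        gauss = np.exp(-0.5 * (x / sigma) ** 2)
        tail = np.where(x < 0, np.exp(x / tau), 0.0)
        response = np.convolve(gauss, tail, mode="same")
        response /= response.sum() * de

        spectrum = np.convolve(mc, response, mode="same") * de   # predicted pulse-height spectrum
        print("peak position (keV):", round(e[np.argmax(spectrum)], 1))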

  7. Novel TPLO Alignment Jig/Saw Guide Reproduces Freehand and Ideal Osteotomy Positions

    PubMed Central

    2016-01-01

    Objectives To evaluate the ability of an alignment jig/saw guide to reproduce appropriate osteotomy positions in the tibial plateau leveling osteotomy (TPLO) in the dog. Methods Lateral radiographs of 65 clinical TPLO procedures using an alignment jig and freehand osteotomy performed by experienced TPLO surgeons using a 24 mm radial saw blade between Dec 2005–Dec 2007 and Nov 2013–Nov 2015 were reviewed. The freehand osteotomy position was compared to potential osteotomy positions using the alignment jig/saw guide. The proximal and distal jig pin holes on postoperative radiographs were used to align the jig to the bone; saw guide position was selected to most closely match the osteotomy performed. The guide-to-osteotomy fit was categorized by the distance between the actual osteotomy and proposed saw guide osteotomy at its greatest offset (≤1 mm = excellent; ≤2 mm = good; ≤3 mm = satisfactory; >3 mm = poor). Results Sixty-four of 65 TPLO osteotomies could be matched satisfactorily by the saw guide. Proximal jig pin placement 3–4 mm from the joint surface and pin location in a craniocaudal plane on the proximal tibia were significantly associated with the guide-to-osteotomy fit (P = 0.021 and P = 0.047, respectively). Clinical Significance The alignment jig/saw guide can be used to reproduce appropriate freehand osteotomy position for TPLO. Furthermore, an ideal osteotomy position centered on the tibial intercondylar tubercles also is possible. Accurate placement of the proximal jig pin is a crucial step for correct positioning of the saw guide in either instance. PMID:27556230

  8. Novel TPLO Alignment Jig/Saw Guide Reproduces Freehand and Ideal Osteotomy Positions.

    PubMed

    Mariano, Abigail D; Kowaleski, Michael P; Boudrieau, Randy J

    2016-01-01

    To evaluate the ability of an alignment jig/saw guide to reproduce appropriate osteotomy positions in the tibial plateau leveling osteotomy (TPLO) in the dog. Lateral radiographs of 65 clinical TPLO procedures using an alignment jig and freehand osteotomy performed by experienced TPLO surgeons using a 24 mm radial saw blade between Dec 2005-Dec 2007 and Nov 2013-Nov 2015 were reviewed. The freehand osteotomy position was compared to potential osteotomy positions using the alignment jig/saw guide. The proximal and distal jig pin holes on postoperative radiographs were used to align the jig to the bone; saw guide position was selected to most closely match the osteotomy performed. The guide-to-osteotomy fit was categorized by the distance between the actual osteotomy and proposed saw guide osteotomy at its greatest offset (≤1 mm = excellent; ≤2 mm = good; ≤3 mm = satisfactory; >3 mm = poor). Sixty-four of 65 TPLO osteotomies could be matched satisfactorily by the saw guide. Proximal jig pin placement 3-4 mm from the joint surface and pin location in a craniocaudal plane on the proximal tibia were significantly associated with the guide-to-osteotomy fit (P = 0.021 and P = 0.047, respectively). The alignment jig/saw guide can be used to reproduce appropriate freehand osteotomy position for TPLO. Furthermore, an ideal osteotomy position centered on the tibial intercondylar tubercles also is possible. Accurate placement of the proximal jig pin is a crucial step for correct positioning of the saw guide in either instance.

  9. Reproducibility and Consistency of In Vitro Nucleosome Reconstitutions Demonstrated by Invitrosome Isolation and Sequencing

    PubMed Central

    Kempton, Colton E.; Heninger, Justin R.; Johnson, Steven M.

    2014-01-01

    Nucleosomes and their positions in the eukaryotic genome play an important role in regulating gene expression by influencing accessibility to DNA. Many factors influence a nucleosome's final position in the chromatin landscape including the underlying genomic sequence. One of the primary reasons for performing in vitro nucleosome reconstitution experiments is to identify how the underlying DNA sequence will influence a nucleosome's position in the absence of other compounding cellular factors. However, concerns have been raised about the reproducibility of data generated from these kinds of experiments. Here we present data for in vitro nucleosome reconstitution experiments performed on linear plasmid DNA that demonstrate that, when coverage is deep enough, these reconstitution experiments are exquisitely reproducible and highly consistent. Our data also suggests that a coverage depth of 35X be maintained for maximal confidence when assaying nucleosome positions, but lower coverage levels may be generally sufficient. These coverage depth recommendations are sufficient in the experimental system and conditions used in this study, but may vary depending on the exact parameters used in other systems. PMID:25093869

  10. Experimental validation of Monte Carlo (MANTIS) simulated x-ray response of columnar CsI scintillator screens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freed, Melanie; Miller, Stuart; Tang, Katherine

    Purpose: MANTIS is a Monte Carlo code developed for the detailed simulation of columnar CsI scintillator screens in x-ray imaging systems. Validation of this code is needed to provide a reliable and valuable tool for system optimization and accurate reconstructions for a variety of x-ray applications. Whereas previous validation efforts have focused on matching of summary statistics, in this work the authors examine the complete point response function (PRF) of the detector system in addition to relative light output values. Methods: Relative light output values and high-resolution PRFs have been experimentally measured with a custom setup. A corresponding set of simulated light output values and PRFs have also been produced, where detailed knowledge of the experimental setup and CsI:Tl screen structures is accounted for in the simulations. Four different screens were investigated with different thicknesses, column tilt angles, and substrate types. A quantitative comparison between the experimental and simulated PRFs was performed for four different incidence angles (0 deg., 15 deg., 30 deg., and 45 deg.) and two different x-ray spectra (40 and 70 kVp). The figure of merit (FOM) used measures the normalized differences between the simulated and experimental data averaged over a region of interest. Results: Experimental relative light output values ranged from 1.456 to 1.650 and were in approximate agreement for aluminum substrates, but in poor agreement for graphite substrates. The FOMs for all screen types, incidence angles, and energies ranged from 0.1929 to 0.4775. To put these FOMs in context, the same FOM was computed for 2D symmetric Gaussians fit to the same experimental data. These FOMs ranged from 0.2068 to 0.8029. Our analysis demonstrates that MANTIS reproduces experimental PRFs with higher accuracy than a symmetric 2D Gaussian fit to the experimental data in the majority of cases. Examination of the spatial distribution of differences between the
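
    The figure of merit described above, normalized differences between simulated and experimental data averaged over a region of interest, can be written in a few lines. The exact normalization is not spelled out in the record, so the peak-normalized, ROI-averaged absolute difference below is an assumption made for illustration, and the two PRFs are synthetic.

        import numpy as np

        def prf_fom(simulated, measured, roi):
            # Average normalized absolute difference between two point response
            # functions over a boolean region-of-interest mask.
            s = np.asarray(simulated, float)
            m = np.asarray(measured, float)
            s, m = s / s.max(), m / m.max()        # normalize each PRF to unit peak
            return np.abs(s - m)[roi].mean() / m[roi].mean()

        # Hypothetical 2-D PRFs on a 64 x 64 grid
        yy, xx = np.mgrid[0:64, 0:64]
        measured = np.exp(-((xx - 32) ** 2 + (yy - 32) ** 2) / 50.0)
        simulated = np.exp(-((xx - 32) ** 2 + (yy - 30) ** 2) / 55.0)
        roi = measured > 0.05
        print("FOM:", round(prf_fom(simulated, measured, roi), 4))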

  11. Graded Interface Models for more accurate Determination of van der Waals-London Dispersion Interactions across Grain Boundaries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    van Benthem, Klaus; Tan, Guolong; French, Roger H

    2006-01-01

    Attractive van der Waals-London dispersion interactions between two half crystals arise from local physical property gradients within the interface layer separating the crystals. Hamaker coefficients and London dispersion energies were quantitatively determined for Σ5 and near-Σ13 grain boundaries in SrTiO3 by analysis of spatially resolved valence electron energy-loss spectroscopy (VEELS) data. From the experimental data, local complex dielectric functions were determined, from which optical properties can be locally analysed. Both local electronic structures and optical properties revealed gradients within the grain boundary cores of both investigated interfaces. The obtained results show that even in the presence of atomically structured grain boundary cores with widths of less than 1 nm, optical properties have to be represented with gradual changes across the grain boundary structures to quantitatively reproduce accurate van der Waals-London dispersion interactions. London dispersion energies of the order of 10% of the apparent interface energies of SrTiO3 were observed, demonstrating their significance in the grain boundary formation process. The application of different models to represent optical property gradients shows that long-range van der Waals-London dispersion interactions scale significantly with local, i.e., atomic-length-scale property variations.

  12. Modelling soil erosion at European scale: towards harmonization and reproducibility

    NASA Astrophysics Data System (ADS)

    Bosco, C.; de Rigo, D.; Dewitte, O.; Poesen, J.; Panagos, P.

    2015-02-01

    Soil erosion by water is one of the most widespread forms of soil degradation. The loss of soil as a result of erosion can lead to decline in organic matter and nutrient contents, breakdown of soil structure and reduction of the water-holding capacity. Measuring soil loss across the whole landscape is impractical and thus research is needed to improve methods of estimating soil erosion with computational modelling, upon which integrated assessment and mitigation strategies may be based. Despite the efforts, the prediction value of existing models is still limited, especially at regional and continental scale, because a systematic knowledge of local climatological and soil parameters is often unavailable. A new approach for modelling soil erosion at regional scale is here proposed. It is based on the joint use of low-data-demanding models and innovative techniques for better estimating model inputs. The proposed modelling architecture has at its basis the semantic array programming paradigm and a strong effort towards computational reproducibility. An extended version of the Revised Universal Soil Loss Equation (RUSLE) has been implemented merging different empirical rainfall-erosivity equations within a climatic ensemble model and adding a new factor for a better consideration of soil stoniness within the model. Pan-European soil erosion rates by water have been estimated through the use of publicly available data sets and locally reliable empirical relationships. The accuracy of the results is corroborated by a visual plausibility check (63% of a random sample of grid cells are accurate, 83% at least moderately accurate, bootstrap p ≤ 0.05). A comparison with country-level statistics of pre-existing European soil erosion maps is also provided.
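
    For reference, the RUSLE structure that the extended model builds on is multiplicative; writing the additional stoniness correction as an extra factor St (the symbol St is shorthand used here, not notation quoted from the paper):

        A = R \cdot K \cdot L \cdot S \cdot C \cdot P \cdot St

    where A is the mean annual soil loss, R the rainfall erosivity, K the soil erodibility, L and S the slope length and steepness factors, C the cover management factor and P the support practice factor.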

  13. Reproducible nucleation sites for flux dendrites in MgB 2

    NASA Astrophysics Data System (ADS)

    Johansen, T. H.; Shantsev, D. V.; Olsen, Å. A. F.; Roussel, M.; Pan, A. V.; Dou, S. X.

    2007-12-01

    Magneto-optical imaging was used to study dendritic flux penetration in films of MgB 2. By repeating experiments under the same external conditions, reproducible features were seen in the pattern formation; dendrites tend to nucleate from fixed locations along the edge. However, their detailed structure deeper inside the film is never reproduced. The reproducibility in nucleation sites is explained as a result of edge roughness causing field hot spots.

  14. Reproducibility of radiomics for deciphering tumor phenotype with imaging

    NASA Astrophysics Data System (ADS)

    Zhao, Binsheng; Tan, Yongqiang; Tsai, Wei-Yann; Qi, Jing; Xie, Chuanmiao; Lu, Lin; Schwartz, Lawrence H.

    2016-03-01

    Radiomics (radiogenomics) characterizes tumor phenotypes based on quantitative image features derived from routine radiologic imaging to improve cancer diagnosis, prognosis, prediction and response to therapy. Although radiomic features must be reproducible to qualify as biomarkers for clinical care, little is known about how routine imaging acquisition techniques/parameters affect reproducibility. To begin to fill this knowledge gap, we assessed the reproducibility of a comprehensive, commonly-used set of radiomic features using a unique, same-day repeat computed tomography data set from lung cancer patients. Each scan was reconstructed at 6 imaging settings, varying slice thicknesses (1.25 mm, 2.5 mm and 5 mm) and reconstruction algorithms (sharp, smooth). Reproducibility was assessed using the repeat scans reconstructed at identical imaging setting (6 settings in total). In separate analyses, we explored differences in radiomic features due to different imaging parameters by assessing the agreement of these radiomic features extracted from the repeat scans reconstructed at the same slice thickness but different algorithms (3 settings in total). Our data suggest that radiomic features are reproducible over a wide range of imaging settings. However, smooth and sharp reconstruction algorithms should not be used interchangeably. These findings will raise awareness of the importance of properly setting imaging acquisition parameters in radiomics/radiogenomics research.
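
    Test-retest agreement of a single radiomic feature between the two same-day scans is often summarized with Lin's concordance correlation coefficient; the record does not name the exact statistic, so the CCC below is an assumed, commonly used choice, evaluated on made-up feature values.

        import numpy as np

        def concordance_ccc(x, y):
            # Lin's concordance correlation coefficient between paired feature values.
            x, y = np.asarray(x, float), np.asarray(y, float)
            mx, my = x.mean(), y.mean()
            cov = ((x - mx) * (y - my)).mean()
            return 2.0 * cov / (x.var() + y.var() + (mx - my) ** 2)

        # Hypothetical test-retest values of one texture feature for 10 patients
        test = np.array([1.2, 0.8, 1.5, 2.1, 0.9, 1.7, 1.1, 1.4, 2.0, 0.7])
        retest = np.array([1.1, 0.9, 1.6, 2.0, 1.0, 1.6, 1.2, 1.3, 2.1, 0.8])
        print("CCC:", round(concordance_ccc(test, retest), 3))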

  15. Metaresearch for Evaluating Reproducibility in Ecology and Evolution.

    PubMed

    Fidler, Fiona; Chee, Yung En; Wintle, Bonnie C; Burgman, Mark A; McCarthy, Michael A; Gordon, Ascelin

    2017-03-01

    Recent replication projects in other disciplines have uncovered disturbingly low levels of reproducibility, suggesting that those research literatures may contain unverifiable claims. The conditions contributing to irreproducibility in other disciplines are also present in ecology. These include a large discrepancy between the proportion of "positive" or "significant" results and the average statistical power of empirical research, incomplete reporting of sampling stopping rules and results, journal policies that discourage replication studies, and a prevailing publish-or-perish research culture that encourages questionable research practices. We argue that these conditions constitute sufficient reason to systematically evaluate the reproducibility of the evidence base in ecology and evolution. In some cases, the direct replication of ecological research is difficult because of strong temporal and spatial dependencies, so here, we propose metaresearch projects that will provide proxy measures of reproducibility.

  16. Coplanar electrode microfluidic chip enabling accurate sheathless impedance cytometry.

    PubMed

    De Ninno, Adele; Errico, Vito; Bertani, Francesca Romana; Businaro, Luca; Bisegna, Paolo; Caselli, Federica

    2017-03-14

    Microfluidic impedance cytometry offers a simple non-invasive method for single-cell analysis. Coplanar electrode chips are especially attractive due to ease of fabrication, yielding miniaturized, reproducible, and ultimately low-cost devices. However, their accuracy is challenged by the dependence of the measured signal on particle trajectory within the interrogation volume, which manifests itself as an error in the estimated particle size unless a focusing system is used. In this paper, we present an original five-electrode coplanar chip enabling accurate particle sizing without the need for focusing. The chip layout is designed to provide a peculiar signal shape from which a new metric correlating with particle trajectory can be extracted. This metric is exploited to correct the estimated size of polystyrene beads of 5.2, 6 and 7 μm nominal diameter, reaching coefficients of variation lower than the manufacturers' quoted values. The potential impact of the proposed device in the field of life sciences is demonstrated with an application to Saccharomyces cerevisiae yeast.

  17. A machine learning method for fast and accurate characterization of depth-of-interaction gamma cameras

    NASA Astrophysics Data System (ADS)

    Pedemonte, Stefano; Pierce, Larry; Van Leemput, Koen

    2017-11-01

    Measuring the depth-of-interaction (DOI) of gamma photons makes it possible to increase the resolution of emission imaging systems. Several design variants of DOI-sensitive detectors have recently been introduced to improve the performance of scanners for positron emission tomography (PET). However, the accurate characterization of the response of DOI detectors, necessary to measure the DOI accurately, remains an unsolved problem. Numerical simulations are, at the state of the art, imprecise, while direct experimental measurement of the characteristics of DOI detectors is hindered by the impossibility of imposing the depth of interaction in an experimental set-up. In this article we introduce a machine learning approach for extracting accurate forward models of gamma imaging devices from simple pencil-beam measurements, using a nonlinear dimensionality reduction technique in combination with a finite mixture model. The method is purely data-driven, does not require simulations, and is applicable to a wide range of detector types. The proposed method was evaluated both in a simulation study and with data acquired using a monolithic gamma camera designed for PET (the cMiCE detector), demonstrating accurate recovery of the DOI characteristics. The combination of the proposed calibration technique with maximum a posteriori estimation of the coordinates of interaction provided a depth resolution of ≈1.14 mm for the simulated PET detector and ≈1.74 mm for the cMiCE detector. The software and experimental data are made available at http://occiput.mgh.harvard.edu/depthembedding/.
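
    The general recipe described (nonlinear dimensionality reduction of the detector response followed by a finite mixture model) can be illustrated with off-the-shelf components. The sketch below uses scikit-learn's Isomap and GaussianMixture as generic stand-ins on synthetic pencil-beam-like data; it is an assumption-laden illustration, not the paper's algorithm.

```python
import numpy as np
from sklearn.manifold import Isomap
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic "pencil-beam" events: a vector of photodetector responses whose
# light spread broadens with the (hidden) interaction depth.
n_events, n_pixels = 2000, 64
depth = rng.uniform(0.0, 10.0, n_events)               # hidden DOI in mm
centres = np.linspace(-1.0, 1.0, n_pixels)
width = 0.2 + 0.05 * depth[:, None]                    # deeper -> broader spread
signals = np.exp(-centres[None, :] ** 2 / (2.0 * width ** 2))
signals += 0.02 * rng.standard_normal(signals.shape)   # measurement noise

# 1) Nonlinear dimensionality reduction: embed events onto a 1-D manifold.
embedding = Isomap(n_components=1, n_neighbors=10).fit_transform(signals)

# 2) Finite mixture model over the latent coordinate: each component acts as
#    a depth "bin" of the learned detector response model.
gmm = GaussianMixture(n_components=8, random_state=0).fit(embedding)
print(np.bincount(gmm.predict(embedding)))             # events assigned per depth bin

# Sanity check: the latent coordinate should track the true depth closely.
print(abs(np.corrcoef(embedding[:, 0], depth)[0, 1]))
```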

  18. Multimodal Spatial Calibration for Accurately Registering EEG Sensor Positions

    PubMed Central

    Chen, Shengyong; Xiao, Gang; Li, Xiaoli

    2014-01-01

    This paper proposes a fast and accurate calibration method for multiple multimodal sensors, using a novel photogrammetry system for fast localization of EEG sensors. The EEG sensors are placed on a human head and multimodal sensors are installed around the head to simultaneously obtain all EEG sensor positions. A multiple-view calibration process is implemented to obtain the transformations between views. We first develop an efficient local repair algorithm to improve the depth map, and then a special calibration body is designed. Based on these, accurate and robust calibration results can be achieved. We evaluate the proposed method using the corners of a chessboard calibration plate. Experimental results demonstrate that the proposed method achieves good performance and can be further applied to EEG source localization on the human brain. PMID:24803954

  19. A Framework for Reproducible Latent Fingerprint Enhancements.

    PubMed

    Carasso, Alfred S

    2014-01-01

    Photoshop processing of latent fingerprints is the preferred methodology among law enforcement forensic experts, but that approach is not fully reproducible and may lead to questionable enhancements. Alternative, independent, fully reproducible enhancements, using IDL Histogram Equalization and IDL Adaptive Histogram Equalization, can produce better-defined ridge structures, along with considerable background information. Applying a systematic slow-motion smoothing procedure to such IDL enhancements, based on the rapid FFT solution of a Lévy stable fractional diffusion equation, can attenuate background detail while preserving ridge information. The resulting smoothed latent print enhancements are comparable to, but distinct from, forensic Photoshop images suitable for input into automated fingerprint identification systems (AFIS). In addition, this progressive smoothing procedure can be reexamined by displaying the suite of progressively smoother IDL images. That suite can be stored, providing an audit trail that allows monitoring for possible loss of useful information in transit to the user-selected optimal image. Such independent and fully reproducible enhancements provide a valuable frame of reference that may be helpful in informing, complementing, and possibly validating the forensic Photoshop methodology.
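
    The two ingredients named in the abstract (adaptive histogram equalization followed by FFT-based smoothing governed by a Lévy-stable fractional diffusion kernel) can be sketched as follows, using scikit-image and NumPy as assumed stand-ins for the IDL routines; the test image is a generic placeholder rather than a latent print.

```python
import numpy as np
from skimage import data, exposure

def levy_stable_smooth(img, t=2.0, alpha=1.0):
    """Smooth an image by evolving a fractional diffusion equation in Fourier
    space; progressively larger t gives a progressively smoother image."""
    ky = np.fft.fftfreq(img.shape[0])[:, None]
    kx = np.fft.fftfreq(img.shape[1])[None, :]
    k = np.sqrt(kx ** 2 + ky ** 2)
    kernel = np.exp(-t * k ** alpha)        # Levy-stable attenuation, 0 < alpha <= 2
    return np.real(np.fft.ifft2(np.fft.fft2(img) * kernel))

img = data.camera() / 255.0                 # placeholder image

# 1) Adaptive histogram equalization to bring out ridge-like detail.
enhanced = exposure.equalize_adapthist(img, clip_limit=0.03)

# 2) Suite of progressively smoother images, forming the kind of audit trail
#    from which a user-selected optimal image can be chosen.
suite = [levy_stable_smooth(enhanced, t=t) for t in (0.5, 1.0, 2.0, 4.0)]
print([round(float(s.std()), 4) for s in suite])   # contrast drops as smoothing grows
```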

  20. A Framework for Reproducible Latent Fingerprint Enhancements

    PubMed Central

    Carasso, Alfred S.

    2014-01-01

    Photoshop processing of latent fingerprints is the preferred methodology among law enforcement forensic experts, but that approach is not fully reproducible and may lead to questionable enhancements. Alternative, independent, fully reproducible enhancements, using IDL Histogram Equalization and IDL Adaptive Histogram Equalization, can produce better-defined ridge structures, along with considerable background information. Applying a systematic slow-motion smoothing procedure to such IDL enhancements, based on the rapid FFT solution of a Lévy stable fractional diffusion equation, can attenuate background detail while preserving ridge information. The resulting smoothed latent print enhancements are comparable to, but distinct from, forensic Photoshop images suitable for input into automated fingerprint identification systems (AFIS). In addition, this progressive smoothing procedure can be reexamined by displaying the suite of progressively smoother IDL images. That suite can be stored, providing an audit trail that allows monitoring for possible loss of useful information in transit to the user-selected optimal image. Such independent and fully reproducible enhancements provide a valuable frame of reference that may be helpful in informing, complementing, and possibly validating the forensic Photoshop methodology. PMID:26601028

  1. Inflow-vascular space occupancy (iVASO) reproducibility in the hippocampus and cortex at different blood water nulling times.

    PubMed

    Rane, Swati; Talati, Pratik; Donahue, Manus J; Heckers, Stephan

    2016-06-01

    Inflow-vascular space occupancy (iVASO) measures arterial cerebral blood volume (aCBV) using accurate blood water nulling (inversion time [TI]) when arterial blood reaches the capillary, i.e., at the arterial arrival time. This work assessed the reproducibility of iVASO measurements in the hippocampus and cortex at multiple TIs. The iVASO approach was implemented at multiple TIs in 10 healthy volunteers at 3 Tesla. aCBV values were measured at each TI in the left and right hippocampus, and the cortex. Reproducibility of aCBV measurements within scans (same day) and across sessions (different days) was assessed using the intraclass correlation coefficient (ICC). Overall hippocampal aCBV was significantly higher than cortical aCBV, likely due to higher gray matter volume. Hippocampal ICC values were high at short TIs (≤914 ms; intrascan values = 0.80-0.96, interscan values = 0.61-0.91). Cortically, high ICC values were observed at intermediate TIs of 914 (intra: 0.93, inter: 0.87) and 1034 ms (intra: 0.96, inter: 0.86). The ICC values were comparable to established contrast-based CBV measures. iVASO measurements are reproducible within and across sessions. TIs for iVASO measurements should be chosen carefully, taking into account heterogeneous arterial arrival times in different brain regions. Magn Reson Med 75:2379-2387, 2016. © 2015 Wiley Periodicals, Inc.

  2. A comprehensive assessment of RNA-seq accuracy, reproducibility and information content by the Sequencing Quality Control consortium

    PubMed Central

    2014-01-01

    We present primary results from the Sequencing Quality Control (SEQC) project, coordinated by the United States Food and Drug Administration. Examining Illumina HiSeq, Life Technologies SOLiD and Roche 454 platforms at multiple laboratory sites using reference RNA samples with built-in controls, we assess RNA sequencing (RNA-seq) performance for junction discovery and differential expression profiling and compare it to microarray and quantitative PCR (qPCR) data using complementary metrics. At all sequencing depths, we discover unannotated exon-exon junctions, with >80% validated by qPCR. We find that measurements of relative expression are accurate and reproducible across sites and platforms if specific filters are used. In contrast, RNA-seq and microarrays do not provide accurate absolute measurements, and gene-specific biases are observed for these platforms as well as for qPCR. Measurement performance depends on the platform and data analysis pipeline, and variation is large for transcript-level profiling. The complete SEQC data sets, comprising >100 billion reads (10 Tb), provide unique resources for evaluating RNA-seq analyses for clinical and regulatory settings. PMID:25150838

  3. Central Corneal Thickness Reproducibility among Ten Different Instruments.

    PubMed

    Pierro, Luisa; Iuliano, Lorenzo; Gagliardi, Marco; Ambrosi, Alessandro; Rama, Paolo; Bandello, Francesco

    2016-11-01

    To assess agreement between one ultrasonic (US) and nine optical instruments for the measurement of central corneal thickness (CCT), and to evaluate intra- and inter-operator reproducibility. In this observational cross-sectional study, two masked operators measured CCT thickness twice in 28 healthy eyes. We used seven spectral-domain optical coherence tomography (SD-OCT) devices, one time-domain OCT, one Scheimpflug camera, and one US-based instrument. Inter- and intra-operator reproducibility was evaluated by intraclass correlation coefficient (ICC), coefficient of variation (CV), and Bland-Altman test analysis. Instrument-to-instrument reproducibility was determined by ANOVA for repeated measurements. We also tested how the devices disagreed regarding systemic bias and random error using a structural equation model. Mean CCT of all instruments ranged from 536 ± 42 μm to 577 ± 40 μm. An instrument-to-instrument correlation test showed high values among the 10 investigated devices (correlation coefficient range 0.852-0.995; p values <0.0001 in all cases). The highest correlation coefficient values were registered between 3D OCT-2000 Topcon-Spectral OCT/SLO Opko (0.995) and Cirrus HD-OCT Zeiss-RS-3000 Nidek (0.995), whereas the lowest were seen between SS-1000 CASIA and Spectral OCT/SLO Opko (0.852). ICC and CV showed excellent inter- and intra-operator reproducibility for all optic-based devices, except for the US-based device. Bland-Altman analysis demonstrated low mean biases between operators. Despite highlighting good intra- and inter-operator reproducibility, we found that a scale bias between instruments might interfere with thorough CCT monitoring. We suggest that optimal monitoring is achieved with the same operator and the same device.
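
    For readers unfamiliar with the statistics used throughout these reproducibility studies, the sketch below computes a simple one-way random-effects intraclass correlation coefficient, ICC(1,1), from repeated measurements; it is a generic illustration with made-up numbers, not the specific ICC variant reported here.

```python
import numpy as np

def icc_oneway(measurements):
    """ICC(1,1) from an (n_subjects x k_repeats) array, via one-way ANOVA
    mean squares: (MSB - MSW) / (MSB + (k - 1) * MSW)."""
    y = np.asarray(measurements, dtype=float)
    n, k = y.shape
    grand = y.mean()
    msb = k * ((y.mean(axis=1) - grand) ** 2).sum() / (n - 1)               # between subjects
    msw = ((y - y.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))  # within subjects
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical central corneal thickness values (micrometres), 5 eyes x 2 repeats.
cct = [[540, 543], [565, 561], [528, 531], [577, 574], [552, 555]]
print(round(icc_oneway(cct), 3))
```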

  4. A rabbit ventricular action potential model replicating cardiac dynamics at rapid heart rates.

    PubMed

    Mahajan, Aman; Shiferaw, Yohannes; Sato, Daisuke; Baher, Ali; Olcese, Riccardo; Xie, Lai-Hua; Yang, Ming-Jim; Chen, Peng-Sheng; Restrepo, Juan G; Karma, Alain; Garfinkel, Alan; Qu, Zhilin; Weiss, James N

    2008-01-15

    Mathematical modeling of the cardiac action potential has proven to be a powerful tool for illuminating various aspects of cardiac function, including cardiac arrhythmias. However, no currently available detailed action potential model accurately reproduces the dynamics of the cardiac action potential and intracellular calcium (Ca(i)) cycling at rapid heart rates relevant to ventricular tachycardia and fibrillation. The aim of this study was to develop such a model. Using an existing rabbit ventricular action potential model, we modified the L-type calcium (Ca) current (I(Ca,L)) and Ca(i) cycling formulations based on new experimental patch-clamp data obtained in isolated rabbit ventricular myocytes, using the perforated patch configuration at 35-37 degrees C. Incorporating a minimal seven-state Markovian model of I(Ca,L) that reproduced Ca- and voltage-dependent kinetics in combination with our previously published dynamic Ca(i) cycling model, the new model replicates experimentally observed action potential duration and Ca(i) transient alternans at rapid heart rates, and accurately reproduces experimental action potential duration restitution curves obtained by either dynamic or S1S2 pacing.

  5. Metaresearch for Evaluating Reproducibility in Ecology and Evolution

    PubMed Central

    Fidler, Fiona; Chee, Yung En; Wintle, Bonnie C.; Burgman, Mark A.; McCarthy, Michael A.; Gordon, Ascelin

    2017-01-01

    Recent replication projects in other disciplines have uncovered disturbingly low levels of reproducibility, suggesting that those research literatures may contain unverifiable claims. The conditions contributing to irreproducibility in other disciplines are also present in ecology. These include a large discrepancy between the proportion of “positive” or “significant” results and the average statistical power of empirical research, incomplete reporting of sampling stopping rules and results, journal policies that discourage replication studies, and a prevailing publish-or-perish research culture that encourages questionable research practices. We argue that these conditions constitute sufficient reason to systematically evaluate the reproducibility of the evidence base in ecology and evolution. In some cases, the direct replication of ecological research is difficult because of strong temporal and spatial dependencies, so here, we propose metaresearch projects that will provide proxy measures of reproducibility. PMID:28596617

  6. Accurate Determination of Tunneling-Affected Rate Coefficients: Theory Assessing Experiment.

    PubMed

    Zuo, Junxiang; Xie, Changjian; Guo, Hua; Xie, Daiqian

    2017-07-20

    The thermal rate coefficients of a prototypical bimolecular reaction are determined on an accurate ab initio potential energy surface (PES) using ring polymer molecular dynamics (RPMD). It is shown that quantum effects such as tunneling and zero-point energy (ZPE) are of critical importance for the HCl + OH reaction at low temperatures, while the heavier deuterium substitution renders tunneling less facile in the DCl + OH reaction. The calculated RPMD rate coefficients are in excellent agreement with experimental data for the HCl + OH reaction in the entire temperature range of 200-1000 K, confirming the accuracy of the PES. On the other hand, the RPMD rate coefficients for the DCl + OH reaction agree with some, but not all, experimental values. The self-consistency of the theoretical results thus allows a quality assessment of the experimental data.

  7. An International Ki67 Reproducibility Study

    PubMed Central

    2013-01-01

    Background In breast cancer, immunohistochemical assessment of proliferation using the marker Ki67 has potential use in both research and clinical management. However, lack of consistency across laboratories has limited Ki67’s value. A working group was assembled to devise a strategy to harmonize Ki67 analysis and increase scoring concordance. Toward that goal, we conducted a Ki67 reproducibility study. Methods Eight laboratories received 100 breast cancer cases arranged into 1-mm core tissue microarrays—one set stained by the participating laboratory and one set stained by the central laboratory, both using antibody MIB-1. Each laboratory scored Ki67 as the percentage of positively stained invasive tumor cells using its own method. Six laboratories repeated scoring of 50 locally stained cases on 3 different days. Sources of variation were analyzed using random effects models with log2-transformed measurements. Reproducibility was quantified by the intraclass correlation coefficient (ICC), and approximate two-sided 95% confidence intervals (CIs) for the true intraclass correlation coefficients in these experiments were provided. Results Intralaboratory reproducibility was high (ICC = 0.94; 95% CI = 0.93 to 0.97). Interlaboratory reproducibility was only moderate (central staining: ICC = 0.71, 95% CI = 0.47 to 0.78; local staining: ICC = 0.59, 95% CI = 0.37 to 0.68). Geometric means of Ki67 values for each laboratory across the 100 cases ranged from 7.1% to 23.9% with central staining and from 6.1% to 30.1% with local staining. Factors contributing to interlaboratory discordance included tumor region selection, counting method, and subjective assessment of staining positivity. Formal counting methods gave more consistent results than visual estimation. Conclusions Substantial variability in Ki67 scoring was observed among some of the world’s most experienced laboratories. Ki67 values and cutoffs for clinical decision-making cannot be transferred between laboratories without

  8. Accurate Modeling of Dark-Field Scattering Spectra of Plasmonic Nanostructures.

    PubMed

    Jiang, Liyong; Yin, Tingting; Dong, Zhaogang; Liao, Mingyi; Tan, Shawn J; Goh, Xiao Ming; Allioux, David; Hu, Hailong; Li, Xiangyin; Yang, Joel K W; Shen, Zexiang

    2015-10-27

    Dark-field microscopy is a widely used tool for measuring the optical resonance of plasmonic nanostructures. However, current numerical simulations of dark-field scattering spectra are typically carried out with plane-wave illumination, either at normal incidence or at an oblique angle from one direction. In actual experiments, light is focused onto the sample through an annular ring within a range of glancing angles. In this paper, we present a theoretical model capable of accurately simulating a dark-field light source with an annular ring. The simulations correctly reproduce a counterintuitive blue shift in the scattering spectra of gold nanodisks with diameters beyond 140 nm. We believe that our proposed simulation method can be applied as a general tool for simulating the dark-field scattering spectra of plasmonic nanostructures, as well as other dielectric nanostructures with sizes beyond the quasi-static limit.

  9. Reliable and accurate extraction of Hamaker constants from surface force measurements.

    PubMed

    Miklavcic, S J

    2018-08-15

    A simple and accurate closed-form expression for the Hamaker constant that best represents experimental surface force data is presented. Numerical comparisons are made with the current standard least-squares approach, which falsely assumes error-free separation measurements, and with a nonlinear version that assumes independent measurements of force and separation are subject to error. The comparisons demonstrate that the proposed formula is not only easily implemented but also considerably more accurate. This option is appropriate for any value of the Hamaker constant, high or low, and certainly for any interacting system exhibiting an inverse-square distance-dependent van der Waals force. Copyright © 2018 Elsevier Inc. All rights reserved.

  10. Economical and accurate protocol for calculating hydrogen-bond-acceptor strengths.

    PubMed

    El Kerdawy, Ahmed; Tautermann, Christofer S; Clark, Timothy; Fox, Thomas

    2013-12-23

    A series of density functional/basis set combinations and second-order Møller-Plesset calculations have been used to test their ability to reproduce the trends observed experimentally for the strengths of hydrogen-bond acceptors in order to identify computationally efficient techniques for routine use in the computational drug-design process. The effects of functionals, basis sets, counterpoise corrections, and constraints on the optimized geometries were tested and analyzed, and recommendations (M06-2X/cc-pVDZ and X3LYP/cc-pVDZ with single-point counterpoise corrections or X3LYP/aug-cc-pVDZ without counterpoise) were made for suitable moderately high-throughput techniques.

  11. 46 CFR 56.80-10 - Forming (reproduces 129.2).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Forming (reproduces 129.2). 56.80-10 Section 56.80-10 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PIPING SYSTEMS AND APPURTENANCES Bending and Forming § 56.80-10 Forming (reproduces 129.2). (a) Piping components may be formed...

  12. 46 CFR 56.80-10 - Forming (reproduces 129.2).

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Forming (reproduces 129.2). 56.80-10 Section 56.80-10 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PIPING SYSTEMS AND APPURTENANCES Bending and Forming § 56.80-10 Forming (reproduces 129.2). (a) Piping components may be formed...

  13. Experimental Hydromechanical Characterization and Numerical Modelling of a Fractured and Porous Sandstone

    NASA Astrophysics Data System (ADS)

    Souley, Mountaka; Lopez, Philippe; Boulon, Marc; Thoraval, Alain

    2015-05-01

    The experimental device previously used to study the hydromechanical behaviour of individual fractures on a laboratory scale was adapted to make it possible to measure flow through porous rock mass samples in addition to fracture flows. A first series of tests was performed to characterize the hydromechanical behaviour of the fracture individually as well as the porous matrix (sandstone) comprising the fracture walls. A third test in this series was used to validate the experimental approach. These tests showed a non-linear evolution of the contact area on the fracture walls with respect to effective normal stress. Consequently, a non-linear relationship was noted between the hydraulic aperture on the one hand, and the effective normal stress and mechanical opening on the other. The results of the three tests were then analysed by numerical modelling. The VIPLEF/HYDREF numerical codes used take into account the dual porosity of the sample (fracture + rock matrix) and can reproduce hydromechanical loading accurately. The analyses show that the relationship between the hydraulic aperture of the fracture and the mechanical closure has a significant effect on fracture flow rate predictions. By taking simultaneous measurements of flow in both the fracture and the rock matrix, we were able to carry out a global evaluation of the conceptual approach used.

  14. Towards the Geometry of Reproducing Kernels

    NASA Astrophysics Data System (ADS)

    Galé, J. E.

    2010-11-01

    It is shown here how one is naturally led to consider a category whose objects are reproducing kernels of Hilbert spaces, and how in this way a differential geometry for such kernels may be established.
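
    As background, a reproducing kernel $K$ on a set $X$ is tied to its Hilbert space $\mathcal{H}$ of real-valued functions by the standard defining relations below (stated here for the real case as general context, not as a summary of the paper's construction):

```latex
\begin{align}
  K(\cdot, x) &\in \mathcal{H} && \text{for every } x \in X,\\
  f(x) &= \langle f,\, K(\cdot, x) \rangle_{\mathcal{H}} && \text{for all } f \in \mathcal{H},\ x \in X,\\
  K(x, y) &= \langle K(\cdot, y),\, K(\cdot, x) \rangle_{\mathcal{H}} && \text{for all } x, y \in X.
\end{align}
```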

  15. Comment on "Estimating the reproducibility of psychological science".

    PubMed

    Gilbert, Daniel T; King, Gary; Pettigrew, Stephen; Wilson, Timothy D

    2016-03-04

    A paper from the Open Science Collaboration (Research Articles, 28 August 2015, aac4716) attempting to replicate 100 published studies suggests that the reproducibility of psychological science is surprisingly low. We show that this article contains three statistical errors and provides no support for such a conclusion. Indeed, the data are consistent with the opposite conclusion, namely, that the reproducibility of psychological science is quite high. Copyright © 2016, American Association for the Advancement of Science.

  16. Dysplastic naevus: histological criteria and their inter-observer reproducibility.

    PubMed

    Hastrup, N; Clemmensen, O J; Spaun, E; Søndergaard, K

    1994-06-01

    Forty melanocytic lesions were examined in a pilot study, which was followed by a final series of 100 consecutive melanocytic lesions, in order to evaluate the inter-observer reproducibility of the histological criteria proposed for the dysplastic naevus. The specimens were examined in a blind fashion by four observers. Analysis by kappa statistics showed poor reproducibility of nuclear features, while reproducibility of architectural features was acceptable, improving in the final series. Consequently, we cannot apply the combined criteria of cytological and architectural features with any confidence in the diagnosis of dysplastic naevus, and, until further studies have documented that architectural criteria alone will suffice in the diagnosis of dysplastic naevus, we, as pathologists, shall avoid this term.
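
    For context, the kappa statistic used to quantify inter-observer agreement corrects the observed agreement for agreement expected by chance; in its basic two-observer (Cohen) form it reads as follows (the study's multi-observer analysis may use a related variant):

```latex
\kappa \;=\; \frac{p_o - p_e}{1 - p_e},
\qquad
p_e \;=\; \sum_{c} p_{1,c}\, p_{2,c},
```

    where $p_o$ is the observed proportion of agreement and $p_{i,c}$ is the proportion of cases that observer $i$ assigns to category $c$.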

  17. Reproducible and controllable induction voltage adder for scaled beam experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sakai, Yasuo; Nakajima, Mitsuo; Horioka, Kazuhiko

    2016-08-15

    A reproducible and controllable induction adder was developed using solid-state switching devices and Finemet cores for scaled beam compression experiments. A gate controlled MOSFET circuit was developed for the controllable voltage driver. The MOSFET circuit drove the induction adder at low magnetization levels of the cores which enabled us to form reproducible modulation voltages with jitter less than 0.3 ns. Preliminary beam compression experiments indicated that the induction adder can improve the reproducibility of modulation voltages and advance the beam physics experiments.

  18. Thermodynamics of mixing water with dimethyl sulfoxide, as seen from computer simulations.

    PubMed

    Idrissi, Abdenacer; Marekha, Bogdan; Barj, Mohamed; Jedlovszky, Pál

    2014-07-24

    The Helmholtz free energy, energy, and entropy of mixing of eight different models of dimethyl sulfoxide (DMSO) with four widely used water models are calculated at 298 K over the entire composition range by means of thermodynamic integration along a suitably chosen thermodynamic path, and compared with experimental data. All 32 model combinations considered are able to reproduce the experimental values rather well, within RT (free energy and energy) and R (entropy) at any composition, and quite often the deviation from the experimental data is even smaller, being on the order of the uncertainty of the calculated free energy and energy values (0.1 kJ/mol) and entropy values (0.1 J/(mol K)), respectively. On the other hand, none of the model combinations considered can accurately reproduce all three experimental functions simultaneously. Furthermore, the fact that the entropy of mixing changes sign with increasing DMSO mole fraction is reproduced by only a handful of model pairs. Model combinations are identified that (i) give the best reproduction of the experimental free energy while still reasonably well reproducing the experimental energy and entropy of mixing, and (ii) give the best reproduction of the experimental energy and entropy while still reasonably well reproducing the experimental free energy of mixing.
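
    The thermodynamic integration referred to here follows the standard coupling-parameter formula; in generic form (the specific path used in the paper is not reproduced):

```latex
\Delta A \;=\; \int_{0}^{1}
  \left\langle \frac{\partial U(\lambda)}{\partial \lambda} \right\rangle_{\lambda}\, d\lambda ,
```

    where $U(\lambda)$ interpolates between the potential energies of the reference and target states and $\langle\cdot\rangle_{\lambda}$ denotes an ensemble average at fixed $\lambda$.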

  19. Data Identifiers and Citations Enable Reproducible Science

    NASA Astrophysics Data System (ADS)

    Tilmes, C.

    2011-12-01

    Modern science often involves data processing with tremendous volumes of data. Keeping track of that data has been a growing challenge for data centers. Researchers who access and use that data do not always reference and cite their data sources adequately for consumers of their research to follow their methodology or reproduce their analyses or experiments. Recent research has led to recommendations for good identifiers and citations that can help address this problem. This paper will describe some of the best practices in data identifiers, reference and citation. Using a simplified example scenario based on a long-term remote sensing satellite mission, it will explore issues in identifying dynamic data sets and the importance of good data citations for reproducibility. It will describe the difference between granule- and collection-level identifiers, using UUIDs and DOIs to illustrate some recommendations for developing identifiers and assigning them during data processing. As data processors create data products, the provenance of the input products and the precise steps that led to their creation are recorded and published for users of the data to see. As researchers access the data from an archive, they can use the provenance to help understand the genesis of the data, which could affect their usage of the data. By citing the data when publishing their research, others can retrieve the precise data used in their research and reproduce the analyses and experiments to confirm the results. Describing the experiment to a sufficient extent to reproduce the research enforces a formal approach that lends credibility to the results and, ultimately, to the policies of decision makers depending on that research.

  20. Repeatability and Reproducibility of Virtual Subjective Refraction.

    PubMed

    Perches, Sara; Collados, M Victoria; Ares, Jorge

    2016-10-01

    To establish the repeatability and reproducibility of a virtual refraction process using simulated retinal images. With simulation software, aberrated images corresponding to each step of the refraction process were calculated following the typical protocol of conventional subjective refraction. Fifty external examiners judged simulated retinal images until the best sphero-cylindrical refraction and the best visual acuity were achieved, starting from the aberrometry data of three patients. Data analyses were performed to assess repeatability and reproducibility of the virtual refraction as a function of pupil size and the aberrometric profile of different patients. SD values achieved in the three components of refraction (M, J0, and J45) are lower than 0.25 D in the repeatability analysis. Regarding reproducibility, we found SD values lower than 0.25 D in most cases. When the results of virtual refraction with different pupil diameters (4 and 6 mm) were compared, the mean of differences (MoD) obtained was not clinically significant (less than 0.25 D). Only one of the aberrometry profiles, with high uncorrected astigmatism, showed poor results for the M component in the reproducibility and pupil-size dependence analyses. In all cases, vision achieved was better than 0 logMAR. A comparison between the compensation obtained with virtual and conventional subjective refraction was made as an example of this application, showing good-quality retinal images in both processes. The present study shows that virtual refraction has similar levels of precision to conventional subjective refraction. Moreover, virtual refraction has also shown that when high low-order astigmatism is present, the refraction result is less precise and highly dependent on pupil size.
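
    The refraction components M, J0, and J45 reported here are the standard power-vector representation of a sphero-cylindrical correction with sphere $S$, cylinder $C$, and axis $\alpha$ (quoted as general background, following the usual convention):

```latex
M = S + \tfrac{C}{2}, \qquad
J_{0} = -\tfrac{C}{2}\,\cos 2\alpha, \qquad
J_{45} = -\tfrac{C}{2}\,\sin 2\alpha .
```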

  1. Data management routines for reproducible research using the G-Node Python Client library.

    PubMed

    Sobolev, Andrey; Stoewer, Adrian; Pereira, Michael; Kellner, Christian J; Garbers, Christian; Rautenberg, Philipp L; Wachtler, Thomas

    2014-01-01

    Structured, efficient, and secure storage of experimental data and associated meta-information constitutes one of the most pressing technical challenges in modern neuroscience, particularly in electrophysiology. The German INCF Node aims to provide open-source solutions for this domain that support the scientific data management and analysis workflow, and thus facilitate future data access and reproducible research. G-Node provides a data management system, accessible through an application interface, that is based on a combination of standardized data representation and flexible data annotation to account for the variety of experimental paradigms in electrophysiology. The G-Node Python Library exposes these services to the Python environment, enabling researchers to organize and access their experimental data using their familiar tools while gaining the advantages that centralized storage entails. The library provides powerful query features, including data slicing and selection by metadata, as well as fine-grained permission control for collaboration and data sharing. Here we demonstrate key actions in working with experimental neuroscience data, such as building a metadata structure, organizing recorded data in datasets, annotating data, or selecting data regions of interest, that can be automated to a large degree using the library. Compliant with existing de facto standards, the G-Node Python Library is compatible with many Python tools in the field of neurophysiology and thus enables seamless integration of data organization into the scientific data workflow.

  2. From Peer-Reviewed to Peer-Reproduced in Scholarly Publishing: The Complementary Roles of Data Models and Workflows in Bioinformatics

    PubMed Central

    Zhao, Jun; Avila-Garcia, Maria Susana; Roos, Marco; Thompson, Mark; van der Horst, Eelke; Kaliyaperumal, Rajaram; Luo, Ruibang; Lee, Tin-Lap; Lam, Tak-wah; Edmunds, Scott C.; Sansone, Susanna-Assunta

    2015-01-01

    Motivation Reproducing the results from a scientific paper can be challenging due to the absence of data and the computational tools required for their analysis. In addition, details relating to the procedures used to obtain the published results can be difficult to discern due to the use of natural language when reporting how experiments have been performed. The Investigation/Study/Assay (ISA), Nanopublications (NP), and Research Objects (RO) models are conceptual data modelling frameworks that can structure such information from scientific papers. Computational workflow platforms can also be used to reproduce analyses of data in a principled manner. We assessed the extent to which ISA, NP, and RO models, together with the Galaxy workflow system, can capture the experimental processes and reproduce the findings of a previously published paper reporting on the development of SOAPdenovo2, a de novo genome assembler. Results Executable workflows were developed using Galaxy, which reproduced results that were consistent with the published findings. A structured representation of the information in the SOAPdenovo2 paper was produced by combining the use of ISA, NP, and RO models. By structuring the information in the published paper using these data and scientific workflow modelling frameworks, it was possible to explicitly declare elements of experimental design, variables, and findings. The models served as guides in the curation of scientific information, and this led to the identification of inconsistencies in the original published paper, thereby allowing its authors to publish corrections in the form of an erratum. Availability SOAPdenovo2 scripts, data, and results are available through the GigaScience Database: http://dx.doi.org/10.5524/100044; the workflows are available from GigaGalaxy: http://galaxy.cbiit.cuhk.edu.hk; and the representations using the ISA, NP, and RO models are available through the SOAPdenovo2 case study website http

  3. Accurate Phylogenetic Tree Reconstruction from Quartets: A Heuristic Approach

    PubMed Central

    Reaz, Rezwana; Bayzid, Md. Shamsuzzoha; Rahman, M. Sohel

    2014-01-01

    Supertree methods construct trees on a set of taxa (species) by combining many smaller trees on overlapping subsets of the entire set of taxa. A ‘quartet’ is an unrooted tree over four taxa; hence, quartet-based supertree methods combine many four-taxon unrooted trees into a single coherent tree over the complete set of taxa. Quartet-based phylogeny reconstruction methods have received considerable attention in recent years. An accurate and efficient quartet-based method might be competitive with the current best phylogenetic tree reconstruction methods (such as maximum likelihood or Bayesian MCMC analyses), without being as computationally intensive. In this paper, we present a novel and highly accurate quartet-based phylogenetic tree reconstruction method. We performed an extensive experimental study to evaluate the accuracy and scalability of our approach on both simulated and biological datasets. PMID:25117474

  4. Light Field Imaging Based Accurate Image Specular Highlight Removal

    PubMed Central

    Wang, Haoqian; Xu, Chenxue; Wang, Xingzheng; Zhang, Yongbing; Peng, Bo

    2016-01-01

    Specular reflection removal is indispensable to many computer vision tasks. However, most existing methods fail or degrade in complex real scenarios because of their individual drawbacks. Benefiting from light field imaging technology, this paper proposes a novel and accurate approach to remove specularity and improve image quality. We first capture images with specularity using a light field camera (Lytro ILLUM). After accurately estimating the image depth, a simple and concise threshold strategy is adopted to cluster the specular pixels into “unsaturated” and “saturated” categories. Finally, a color variance analysis of multiple views and a local color refinement are individually conducted on the two categories to recover diffuse color information. Experimental evaluation, by comparison with existing methods on our light field dataset together with the Stanford light field archive, verifies the effectiveness of the proposed algorithm. PMID:27253083

  5. Optimizing laboratory animal stress paradigms: The H-H* experimental design.

    PubMed

    McCarty, Richard

    2017-01-01

    Major advances in behavioral neuroscience have been facilitated by the development of consistent and highly reproducible experimental paradigms that have been widely adopted. In contrast, many different experimental approaches have been employed to expose laboratory mice and rats to acute versus chronic intermittent stress. An argument is advanced in this review that more consistent approaches to the design of chronic intermittent stress experiments would provide greater reproducibility of results across laboratories and greater reliability relating to various neural, endocrine, immune, genetic, and behavioral adaptations. As an example, the H-H* experimental design incorporates control, homotypic (H), and heterotypic (H*) groups and allows for comparisons across groups, where each animal is exposed to the same stressor, but that stressor has vastly different biological and behavioral effects depending upon each animal's prior stress history. Implementation of the H-H* experimental paradigm makes possible a delineation of transcriptional changes and neural, endocrine, and immune pathways that are activated in precisely defined stressor contexts. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. GeoTrust Hub: A Platform For Sharing And Reproducing Geoscience Applications

    NASA Astrophysics Data System (ADS)

    Malik, T.; Tarboton, D. G.; Goodall, J. L.; Choi, E.; Bhatt, A.; Peckham, S. D.; Foster, I.; Ton That, D. H.; Essawy, B.; Yuan, Z.; Dash, P. K.; Fils, G.; Gan, T.; Fadugba, O. I.; Saxena, A.; Valentic, T. A.

    2017-12-01

    Recent requirements of scholarly communication emphasize the reproducibility of scientific claims. Text-based research papers are considered poor mediums to establish reproducibility. Papers must be accompanied by "research objects", aggregations of digital artifacts that together with the paper provide an authoritative record of a piece of research. We will present GeoTrust Hub (http://geotrusthub.org), a platform for creating, sharing, and reproducing reusable research objects. GeoTrust Hub provides tools for scientists to create `geounits'--reusable research objects. Geounits are self-contained, annotated, and versioned containers that describe and package computational experiments in an efficient and light-weight manner. Geounits can be shared on public repositories such as HydroShare and FigShare and, using their respective APIs, reproduced on provisioned clouds. The latter feature enables science applications to have a lifetime beyond sharing, wherein they can be independently verified and trust established as they are repeatedly reused. Through research use cases from several geoscience laboratories across the United States, we will demonstrate how the tools provided by GeoTrust Hub, along with HydroShare as its public repository for geounits, are advancing the state of reproducible research in the geosciences. For each use case, we will address different computational reproducibility requirements. Our first use case will be an example of setup reproducibility, which enables a scientist to set up and reproduce an output from a model with complex configuration and development environments. Our second use case will be an example of algorithm/data reproducibility, wherein a shared data science model/dataset can be substituted with an alternate one to verify model output results, and finally an example of interactive reproducibility, in which an experiment is dependent on specific versions of data to produce the result. Toward this we will use software and data

  7. Maximizing the quantitative accuracy and reproducibility of Förster resonance energy transfer measurement for screening by high throughput widefield microscopy

    PubMed Central

    Schaufele, Fred

    2013-01-01

    Förster resonance energy transfer (FRET) between fluorescent proteins (FPs) provides insights into the proximities and orientations of FPs as surrogates of the biochemical interactions and structures of the factors to which the FPs are genetically fused. As powerful as FRET methods are, technical issues have impeded their broad adoption in the biological sciences. One hurdle to accurate and reproducible FRET microscopy measurement stems from variable fluorescence backgrounds both within a field and between different fields. Those variations introduce errors into the precise quantification of fluorescence levels, on which the quantitative accuracy of FRET measurement is highly dependent. This measurement error is particularly problematic for screening campaigns, since minimal well-to-well variation is necessary to faithfully identify wells with altered values. High content screening also depends upon maximizing the number of cells imaged, which is best achieved by low-magnification high-throughput microscopy. However, low magnification introduces flat-field correction issues that degrade the accuracy of background correction and cause poor reproducibility in FRET measurement. For live cell imaging, fluorescence of the cell culture media in the fluorescence collection channels of the FPs commonly used for FRET analysis is a major source of background error. These signal-to-noise problems are compounded by the desire to express proteins at biologically meaningful levels that may be only marginally above the strong fluorescence background. Here, techniques are presented that correct for background fluctuations. Accurate calculation of FRET is realized even from images in which a non-flat background is 10-fold higher than the signal. PMID:23927839
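
    A minimal sketch of the kind of per-field background and flat-field correction discussed, subtracting a background frame and normalizing by an illumination reference, is given below; the frames and numbers are synthetic assumptions, not the paper's procedure.

```python
import numpy as np

def correct_field(raw, background, flat):
    """Background-subtract and flat-field-normalize one image field.

    raw        : acquired fluorescence image
    background : signal-free background image (e.g. cell-free medium)
    flat       : illumination reference acquired with a uniform fluorophore
    """
    flat_bg = flat - background
    flat_norm = flat_bg / flat_bg.mean()      # unit-mean illumination profile
    return (raw - background) / flat_norm

rng = np.random.default_rng(1)
illum = np.outer(np.hanning(64) + 0.5, np.hanning(64) + 0.5)  # uneven illumination
truth = 100.0                                                 # true signal level
background = 1000.0 * illum                                   # medium autofluorescence
raw = (truth + 1000.0) * illum + rng.normal(0.0, 2.0, (64, 64))
flat = 2000.0 * illum

corrected = correct_field(raw, background, flat)
print(round(float(corrected.mean()), 1))   # ~100 despite a background 10x the signal
```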

  8. Reply to comment by Melsen et al. on "Most computational hydrology is not reproducible, so is it really science?"

    NASA Astrophysics Data System (ADS)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit

    2017-03-01

    In this article, we reply to a comment made by Melsen et al. [2017] on our previous commentary regarding reproducibility in computational hydrology. Re-executing someone else's code and workflow to derive a set of published results does not by itself constitute reproducibility. However, it forms a key part of the process: it demonstrates that all the degrees of freedom and choices made by the scientist in running the experiment are contained within that code and workflow. This does not only allow us to build and extend directly from the original work, but with full knowledge of decisions made in the original experimental setup, we can then focus our attention to the degrees of freedom of interest: those that occur in hydrological systems that are ultimately our subject of study.

  9. Understanding the effect of side groups in ionic liquids on carbon-capture properties: a combined experimental and theoretical effort

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Fangyong; Lartey, Michael; Damodaran, Krishnan

    2013-01-01

    Ionic liquids are an emerging class of materials with applications in a variety of fields. Steady progress has been made in the creation of ionic liquids tailored to specific applications. However, the understanding of the underlying structure–property relationships has been slower to develop. As a step in the effort to alleviate this deficiency, the influence of side groups on ionic liquid properties has been studied through an integrated approach utilizing synthesis, experimental determination of properties, and simulation techniques. To achieve this goal, a classical force field in the framework of the OPLS/Amber force fields has been developed to predict ionic liquid properties accurately. Cu(I)-catalyzed click chemistry was employed to synthesize triazolium-based ionic liquids with diverse side groups. Values of densities were predicted within 3% of experimental values, whereas self-diffusion coefficients were underestimated by about an order of magnitude, although the trends were in excellent agreement; the activation energy calculated in simulation correlates well with experimental values. The predicted Henry coefficient for CO2 solubility reproduced the experimentally observed trends. This study highlights the importance of integrating experimental and computational approaches in property prediction and materials development, which is not only useful in the development of ionic liquids for CO2 capture but has application in many technological fields.

  10. Understanding the effect of side groups in ionic liquids on carbon-capture properties: a combined experimental and theoretical effort

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Fangyong; Lartey, Michael; Damodaran, Krishnan

    Ionic liquids are an emerging class of materials with applications in a variety of fields. Steady progress has been made in the creation of ionic liquids tailored to specific applications. However, the understanding of the underlying structure–property relationships has been slower to develop. As a step in the effort to alleviate this deficiency, the influence of side groups on ionic liquid properties has been studied through an integrated approach utilizing synthesis, experimental determination of properties, and simulation techniques. To achieve this goal, a classical force field in the framework of the OPLS/Amber force fields has been developed to predict ionic liquid properties accurately. Cu(I)-catalyzed click chemistry was employed to synthesize triazolium-based ionic liquids with diverse side groups. Values of densities were predicted within 3% of experimental values, whereas self-diffusion coefficients were underestimated by about an order of magnitude, although the trends were in excellent agreement; the activation energy calculated in simulation correlates well with experimental values. The predicted Henry coefficient for CO2 solubility reproduced the experimentally observed trends. This study highlights the importance of integrating experimental and computational approaches in property prediction and materials development, which is not only useful in the development of ionic liquids for CO2 capture but has application in many technological fields.

  11. Reproducibility of UAV-based earth surface topography based on structure-from-motion algorithms.

    NASA Astrophysics Data System (ADS)

    Clapuyt, François; Vanacker, Veerle; Van Oost, Kristof

    2014-05-01

    A representation of the earth surface at very high spatial resolution is crucial to accurately map small geomorphic landforms with high precision. Very high resolution digital surface models (DSMs) can then be used to quantify changes in earth surface topography over time, based on differencing of DSMs taken at various moments in time. However, it is essential to have both high accuracy for each topographic representation and consistency between measurements over time, as DSM differencing automatically leads to error propagation. This study investigates the reproducibility of reconstructions of earth surface topography based on structure-from-motion (SfM) algorithms. To this end, we equipped an eight-propeller drone with a standard reflex camera. This equipment can easily be deployed in the field, as it is a lightweight, low-cost system in comparison with classic aerial photo surveys and terrestrial or airborne LiDAR scanning. Four sets of aerial photographs were created for one test field. The sets of air photos differ in focal length and viewing angle, i.e. nadir view and ground-level view. In addition, the importance of the accuracy of ground control points for the construction of a georeferenced point cloud was assessed using two different GPS devices with horizontal accuracy at the sub-meter and sub-decimeter level, respectively. The air photo datasets were processed with the SfM algorithm and the resulting point clouds were georeferenced. Then, the surface representations were compared with each other to assess the reproducibility of the earth surface topography. Finally, consistency between independent datasets is discussed.

  12. Development and reproducibility evaluation of a Monte Carlo-based standard LINAC model for quality assurance of multi-institutional clinical trials.

    PubMed

    Usmani, Muhammad Nauman; Takegawa, Hideki; Takashina, Masaaki; Numasaki, Hodaka; Suga, Masaki; Anetai, Yusuke; Kurosu, Keita; Koizumi, Masahiko; Teshima, Teruki

    2014-11-01

    Technical developments in radiotherapy (RT) have created a need for systematic quality assurance (QA) to ensure that clinical institutions deliver prescribed radiation doses consistent with the requirements of clinical protocols. For QA, an ideal dose verification system should be independent of the treatment-planning system (TPS). This paper describes the development and reproducibility evaluation of a Monte Carlo (MC)-based standard LINAC model as a preliminary requirement for independent verification of dose distributions. The BEAMnrc MC code is used for characterization of the 6-, 10- and 15-MV photon beams for a wide range of field sizes. The modeling of the LINAC head components is based on the specifications provided by the manufacturer. MC dose distributions are tuned to match Varian Golden Beam Data (GBD). For reproducibility evaluation, calculated beam data is compared with beam data measured at individual institutions. For all energies and field sizes, the MC and GBD agreed to within 1.0% for percentage depth doses (PDDs), 1.5% for beam profiles and 1.2% for total scatter factors (Scps.). Reproducibility evaluation showed that the maximum average local differences were 1.3% and 2.5% for PDDs and beam profiles, respectively. MC and institutions' mean Scps agreed to within 2.0%. An MC-based standard LINAC model developed to independently verify dose distributions for QA of multi-institutional clinical trials and routine clinical practice has proven to be highly accurate and reproducible and can thus help ensure that prescribed doses delivered are consistent with the requirements of clinical protocols. © The Author 2014. Published by Oxford University Press on behalf of The Japan Radiation Research Society and Japanese Society for Radiation Oncology.

  13. Reproducible research in vadose zone sciences

    USDA-ARS?s Scientific Manuscript database

    A significant portion of present-day soil and Earth science research is computational, involving complex data analysis pipelines, advanced mathematical and statistical models, and sophisticated computer codes. Opportunities for scientific progress are greatly diminished if reproducing and building o...

  14. Rapid, cost-effective and accurate quantification of Yucca schidigera Roezl. steroidal saponins using HPLC-ELSD method.

    PubMed

    Tenon, Mathieu; Feuillère, Nicolas; Roller, Marc; Birtić, Simona

    2017-04-15

    Yucca GRAS-labelled saponins have been and are increasingly used in the food/feed, pharmaceutical and cosmetic industries. Existing techniques for Yucca steroidal saponin quantification remain either inaccurate and misleading, or accurate but time-consuming and cost-prohibitive. The method reported here addresses all of the above challenges. The HPLC/ELSD technique is an accurate and reliable method that yields results of appropriate repeatability and reproducibility. This method does not over- or under-estimate levels of steroidal saponins. The HPLC/ELSD method does not require each and every pure saponin standard to quantify the group of steroidal saponins. The method is a time- and cost-effective technique suitable for routine industrial analyses. HPLC/ELSD methods yield a saponin fingerprint specific to the plant species. As the method is capable of distinguishing saponin profiles from taxonomically distant species, it can unravel plant adulteration issues. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.

  15. An Open Science and Reproducible Research Primer for Landscape Ecologists

    EPA Science Inventory

    In recent years many funding agencies, some publishers, and even the United States government have enacted policies that encourage open science and strive for reproducibility; however, the knowledge and skills to implement open science and enable reproducible research are not yet...

  16. PH Tester Gauge Repeatability and Reproducibility Study for WO3 Nanostructure Hydrothermal Growth Process

    NASA Astrophysics Data System (ADS)

    Abd Rashid, Amirul; Hayati Saad, Nor; Bien Chia Sheng, Daniel; Yee, Lee Wai

    2014-06-01

    pH value is one of the important variables for the tungsten trioxide (WO3) nanostructure hydrothermal synthesis process. The morphology of the synthesized nanostructure can be properly controlled by measuring and controlling the pH value of the solution used in this facile synthesis route. It is therefore crucial to ensure that the gauge used for pH measurement is reliable in order to achieve the expected result. In this study, the gauge repeatability and reproducibility (GR&R) method was used to assess the repeatability and reproducibility of the pH tester. Based on the ANOVA method, the experimental design as well as the results of the experiment were analyzed using Minitab software. It was found that the initial GR&R value for the tester was 17.55%, which is considered acceptable. To further improve the GR&R level, a new pH measuring procedure was introduced. With the new procedure, the GR&R value was reduced to 2.05%, which means the tester is statistically well suited to measure the pH of the solution prepared for the WO3 hydrothermal synthesis process.
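
    A crossed gauge R&R study of this kind can also be evaluated outside Minitab with a two-way ANOVA (operator, part, interaction) and the usual variance-component formulas; the sketch below is an illustration on simulated data with hypothetical factor names, not the study's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Simulated crossed study: each operator measures each solution several times.
rng = np.random.default_rng(2)
n_operators, n_parts, n_reps = 3, 5, 2
true_ph = rng.uniform(1.5, 2.5, n_parts)        # part-to-part variation
op_bias = rng.normal(0.0, 0.02, n_operators)    # reproducibility (operator) effect
rows = []
for o in range(n_operators):
    for p in range(n_parts):
        for r in range(n_reps):
            y = true_ph[p] + op_bias[o] + rng.normal(0.0, 0.01)  # 0.01 = repeatability
            rows.append({"operator": o, "part": p, "pH": y})
df = pd.DataFrame(rows)

# Two-way ANOVA with interaction, then mean squares per term.
model = ols("pH ~ C(operator) * C(part)", data=df).fit()
aov = sm.stats.anova_lm(model, typ=2)
ms = aov["sum_sq"] / aov["df"]
ms_o, ms_p = ms["C(operator)"], ms["C(part)"]
ms_op, ms_e = ms["C(operator):C(part)"], ms["Residual"]

# Standard crossed-design variance components (negative estimates clipped to 0).
repeatability   = ms_e
reproducibility = max((ms_o - ms_op) / (n_parts * n_reps), 0.0)
interaction     = max((ms_op - ms_e) / n_reps, 0.0)
part_to_part    = max((ms_p - ms_op) / (n_operators * n_reps), 0.0)

grr = repeatability + reproducibility + interaction
total = grr + part_to_part
print(f"%GR&R (study variation) = {100.0 * np.sqrt(grr / total):.1f}%")
```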

  17. Hot-Volumes as Uniform and Reproducible SERS-Detection Enhancers in Weakly-Coupled Metallic Nanohelices

    NASA Astrophysics Data System (ADS)

    Caridad, José M.; Winters, Sinéad; McCloskey, David; Duesberg, Georg S.; Donegan, John F.; Krstić, Vojislav

    2017-03-01

    Reproducible and enhanced optical detection of molecules in low concentrations demands simultaneously intense and homogeneous electric fields acting as robust signal amplifiers. To generate such sophisticated optical near-fields, different plasmonic nanostructures were investigated in recent years. These, however, exhibit either a high enhancement factor (EF) or spatial homogeneity but not both. Small interparticle gaps or sharp nanostructures show enormous EFs but no near-field homogeneity. Meanwhile, approaches using rounded and separated monomers create uniform near-fields with moderate EFs. Here, guided by numerical simulations, we show how arrays of weakly-coupled Ag nanohelices achieve both homogeneous and strong near-field enhancements, reaching even the limit for reproducible detection of individual molecules. The unique near-field distribution of a single nanohelix consists of broad hot-spots, merging with those from neighbouring nanohelices in specific array configurations and generating a wide and uniform detection zone (“hot-volume”). We experimentally assessed these nanostructures via surface-enhanced Raman spectroscopy, obtaining a corresponding EF of ~10⁷ and a relative standard deviation <10%. These values demonstrate arrays of nanohelices as state-of-the-art substrates for reproducible optical detection as well as compelling nanostructures for related fields such as near-field imaging.

  18. Reliability, robustness, and reproducibility in mouse behavioral phenotyping: a cross-laboratory study

    PubMed Central

    Mandillo, Silvia; Tucci, Valter; Hölter, Sabine M.; Meziane, Hamid; Banchaabouchi, Mumna Al; Kallnik, Magdalena; Lad, Heena V.; Nolan, Patrick M.; Ouagazzal, Abdel-Mouttalib; Coghill, Emma L.; Gale, Karin; Golini, Elisabetta; Jacquot, Sylvie; Krezel, Wojtek; Parker, Andy; Riet, Fabrice; Schneider, Ilka; Marazziti, Daniela; Auwerx, Johan; Brown, Steve D. M.; Chambon, Pierre; Rosenthal, Nadia; Tocchini-Valentini, Glauco; Wurst, Wolfgang

    2008-01-01

    Establishing standard operating procedures (SOPs) as tools for the analysis of behavioral phenotypes is fundamental to mouse functional genomics. It is essential that the tests designed provide reliable measures of the process under investigation but most importantly that these are reproducible across both time and laboratories. For this reason, we devised and tested a set of SOPs to investigate mouse behavior. Five research centers were involved across France, Germany, Italy, and the UK in this study, as part of the EUMORPHIA program. All the procedures underwent a cross-validation experimental study to investigate the robustness of the designed protocols. Four inbred reference strains (C57BL/6J, C3HeB/FeJ, BALB/cByJ, 129S2/SvPas), reflecting their use as common background strains in mutagenesis programs, were analyzed to validate these tests. We demonstrate that the operating procedures employed, which includes open field, SHIRPA, grip-strength, rotarod, Y-maze, prepulse inhibition of acoustic startle response, and tail flick tests, generated reproducible results between laboratories for a number of the test output parameters. However, we also identified several uncontrolled variables that constitute confounding factors in behavioral phenotyping. The EUMORPHIA SOPs described here are an important start-point for the ongoing development of increasingly robust phenotyping platforms and their application in large-scale, multicentre mouse phenotyping programs. PMID:18505770

  19. A Structural Molar Volume Model for Oxide Melts Part I: Li2O-Na2O-K2O-MgO-CaO-MnO-PbO-Al2O3-SiO2 Melts—Binary Systems

    NASA Astrophysics Data System (ADS)

    Thibodeau, Eric; Gheribi, Aimen E.; Jung, In-Ho

    2016-04-01

    A structural molar volume model was developed to accurately reproduce the molar volume of molten oxides. As the non-linearity of molar volume is related to the change in structure of molten oxides, the silicate tetrahedral Q-species, calculated from the modified quasichemical model with an optimized thermodynamic database, were used as basic structural units in the present model. Experimental molar volume data for unary and binary melts in the Li2O-Na2O-K2O-MgO-CaO-MnO-PbO-Al2O3-SiO2 system were critically evaluated. The molar volumes of unary oxide components and binary Q-species, which are model parameters of the present structural model, were determined to accurately reproduce the experimental data across the entire binary composition range and over a wide range of temperatures. The non-linear behavior of the molar volume and thermal expansivity of binary melts as a function of SiO2 content is well reproduced by the present model.

  20. The multiscale coarse-graining method. XI. Accurate interactions based on the centers of charge of coarse-grained sites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cao, Zhen; Voth, Gregory A., E-mail: gavoth@uchicago.edu

    It is essential to be able to systematically construct coarse-grained (CG) models that can efficiently and accurately reproduce key properties of higher-resolution models such as all-atom. To fulfill this goal, a mapping operator is needed to transform the higher-resolution configuration to a CG configuration. Certain mapping operators, however, may lose information related to the underlying electrostatic properties. In this paper, a new mapping operator based on the centers of charge of CG sites is proposed to address this issue. Four example systems are chosen to demonstrate this concept. Within the multiscale coarse-graining framework, CG models that use this mapping operator are found to better reproduce the structural correlations of atomistic models. The present work also demonstrates the flexibility of the mapping operator and the robustness of the force matching method. For instance, important functional groups can be isolated and emphasized in the CG model.
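
    As a minimal illustration of the center-of-charge idea, the sketch below maps one atomistic group onto a single CG site placed at its charge-weighted mean position. The coordinates and partial charges are made-up placeholders, and weighting by absolute charges (so that a nearly neutral group still has a well-defined site) is an assumption of this sketch rather than a detail taken from the paper; the force-matching step of the multiscale coarse-graining method is not shown.

```python
# Sketch: map an atomistic group to one CG site at its "center of charge".
# Absolute partial charges are used as weights (an assumption made here so that
# nearly neutral groups still give a well-defined site position).
import numpy as np

def center_of_charge(positions: np.ndarray, charges: np.ndarray) -> np.ndarray:
    """positions: (N, 3) Cartesian coordinates; charges: (N,) partial charges."""
    weights = np.abs(charges)
    if weights.sum() == 0.0:
        # Fall back to the geometric center for a completely uncharged group.
        return positions.mean(axis=0)
    return (weights[:, None] * positions).sum(axis=0) / weights.sum()

# Placeholder water-like group: one O and two H atoms with TIP3P-style charges.
pos = np.array([[0.000, 0.000, 0.000],
                [0.957, 0.000, 0.000],
                [-0.240, 0.927, 0.000]])
q = np.array([-0.834, 0.417, 0.417])

print("center of charge:", center_of_charge(pos, q))
print("center of geometry:", pos.mean(axis=0))
```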

  1. Accurately estimating PSF with straight lines detected by Hough transform

    NASA Astrophysics Data System (ADS)

    Wang, Ruichen; Xu, Liangpeng; Fan, Chunxiao; Li, Yong

    2018-04-01

    This paper presents an approach to estimating the point spread function (PSF) from low resolution (LR) images. Existing techniques usually rely on accurate detection of the ending points of the profile normal to edges. In practice, however, it is often a great challenge to accurately localize edge profiles in an LR image, which leads to a poor estimate of the PSF of the lens that took the LR image. For precisely estimating the PSF, this paper proposes first estimating a 1-D PSF kernel with straight lines, and then robustly obtaining the 2-D PSF from the 1-D kernel by least-squares techniques and random sample consensus. The Canny operator is applied to the LR image to obtain edges, and then the Hough transform is utilized to extract straight lines of all orientations. Estimating the 1-D PSF kernel with straight lines effectively alleviates the influence of inaccurate edge detection on PSF estimation. The proposed method is investigated on both natural and synthetic images for estimating the PSF. Experimental results show that the proposed method outperforms the state-of-the-art and does not rely on accurate edge detection.
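
    The edge- and line-extraction stage described above maps directly onto standard image-processing tools. The OpenCV sketch below detects edges with the Canny operator and extracts straight lines with the Hough transform; the file name and thresholds are placeholders, and the subsequent 1-D/2-D PSF estimation steps are only indicated in comments.

```python
# Sketch of the line-extraction stage: Canny edges + Hough transform.
# File name and thresholds are illustrative placeholders.
import cv2
import numpy as np

image = cv2.imread("low_res_input.png", cv2.IMREAD_GRAYSCALE)

# Edge map from the Canny operator (hysteresis thresholds chosen ad hoc).
edges = cv2.Canny(image, 50, 150)

# Standard Hough transform: each detected line is (rho, theta) in pixel/radian units.
lines = cv2.HoughLines(edges, rho=1, theta=np.pi / 180, threshold=120)

if lines is not None:
    for rho, theta in lines[:, 0]:
        print(f"line: rho={rho:.1f} px, theta={np.degrees(theta):.1f} deg")
# Profiles sampled perpendicular to these lines would then feed the 1-D PSF
# kernel estimate, followed by the least-squares / RANSAC 2-D reconstruction.
```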

  2. Gene expression profiling of whole blood: Comparison of target preparation methods for accurate and reproducible microarray analysis

    PubMed Central

    Vartanian, Kristina; Slottke, Rachel; Johnstone, Timothy; Casale, Amanda; Planck, Stephen R; Choi, Dongseok; Smith, Justine R; Rosenbaum, James T; Harrington, Christina A

    2009-01-01

    Background: Peripheral blood is an accessible and informative source of transcriptomal information for many human disease and pharmacogenomic studies. While there can be significant advantages to analyzing RNA isolated from whole blood, particularly in clinical studies, the preparation of samples for microarray analysis is complicated by the need to minimize artifacts associated with highly abundant globin RNA transcripts. The impact of globin RNA transcripts on expression profiling data can potentially be reduced by using RNA preparation and labeling methods that remove or block globin RNA during the microarray assay. We compared four different methods for preparing microarray hybridization targets from human whole blood collected in PAXGene tubes. Three of the methods utilized the Affymetrix one-cycle cDNA synthesis/in vitro transcription protocol but varied treatment of input RNA as follows: i. no treatment; ii. treatment with GLOBINclear; or iii. treatment with globin PNA oligos. In the fourth method cDNA targets were prepared with the Ovation amplification and labeling system. Results: We find that microarray targets generated with labeling methods that reduce globin mRNA levels or minimize the impact of globin transcripts during hybridization detect more transcripts in the microarray assay compared with the standard Affymetrix method. Comparison of microarray results with quantitative PCR analysis of a panel of genes from the NF-kappa B pathway shows good correlation of transcript measurements produced with all four target preparation methods, although method-specific differences in overall correlation were observed. The impact of freezing blood collected in PAXGene tubes on data reproducibility was also examined. Expression profiles show little or no difference when RNA is extracted from either fresh or frozen blood samples. Conclusion: RNA preparation and labeling methods designed to reduce the impact of globin mRNA transcripts can significantly improve the

  3. How reproducible is cutaneous electrogastrography? An in-depth evidence-based study.

    PubMed

    Jonderko, K; Kasicka-Jonderko, A; Krusiec-Swidergoł, B; Dzielicki, M; Strój, L; Doliński, M; Doliński, K; Błońska-Fajfrowska, B

    2005-12-01

    The aim was to check the reproducibility of parameters of the cutaneous electrogastrogram registered at a close or a distant time span. Twenty-two volunteers recruited by an advertisement (11 females and 11 males, median age 25 years, range: 18-35) underwent three surface electrogastrography examinations, of which two were taken on consecutive days and the third was accomplished at least 2 weeks before or after the two other sessions. The examination involved a 30-min fasted recording, followed by a 90-min postprandial registration after intake of a 394-kcal mixed solid-liquid test meal. Parameters of the electrogastrogram pertaining to the frequency of the gastric slow waves exhibited good to moderate reproducibility, whereas fair reproducibility characterized parameters expected to describe the power of gastric slow waves. With the exception of the difference fed minus fasted power (ΔDP), in no instance was the medium term reproducibility any worse than the short term one. Categorical data analysis revealed that the relative time share of normogastria postprandially exhibited a better reproducibility than in the fasted period. The Cohen's kappa-value of 0.459 for the ΔDP for the medium term reproducibility placed this parameter within the range of moderate agreement between repeat examinations. Of the two two-parameter combinations considered, the alliance of the fasted and fed normogastria performed worse than either of those parameters considered alone, whereas a combination of the ΔDP with the fed-state normogastria revealed a kappa-value amounting to 0.510 for the medium term reproducibility. The ability of some electrogastrographic parameters to convey clinically useful information may be hampered by their fair reproducibility. Recoding of parameters of the cutaneous electrogastrogram from primary continuous to secondary categorical may help achieve a better agreement between repeat examinations.
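
    The categorical agreement statistic quoted above (Cohen's kappa) is straightforward to compute once a parameter such as ΔDP has been recoded into categories. The sketch below uses scikit-learn on made-up session labels, which are placeholders rather than the study's data.

```python
# Cohen's kappa between two repeat examinations after recoding a parameter
# into categories. The labels below are illustrative placeholders.
from sklearn.metrics import cohen_kappa_score

session_1 = ["normal", "normal", "abnormal", "normal", "abnormal", "normal"]
session_2 = ["normal", "abnormal", "abnormal", "normal", "abnormal", "abnormal"]

kappa = cohen_kappa_score(session_1, session_2)
print(f"Cohen's kappa = {kappa:.3f}")  # 0.400 for this toy example
```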

  4. Three-dimensional reproducibility of natural head position.

    PubMed

    Weber, Diana W; Fallis, Drew W; Packer, Mark D

    2013-05-01

    Although natural head position has proven to be reliable in the sagittal plane, with an increasing interest in 3-dimensional craniofacial analysis, a determination of its reproducibility in the coronal and axial planes is essential. This study was designed to evaluate the reproducibility of natural head position over time in the sagittal, coronal, and axial planes of space with 3-dimensional imaging. Three-dimensional photographs were taken of 28 adult volunteers (ages, 18-40 years) in natural head position at 5 times: baseline, 4 hours, 8 hours, 24 hours, and 1 week. Using the true vertical and horizontal laser lines projected in an iCAT cone-beam computed tomography machine (Imaging Sciences International, Hatfield, Pa) for orientation, we recorded references for natural head position on the patient's face with semipermanent markers. By using a 3-dimensional camera system, photographs were taken at each time point to capture the orientation of the reference points. By superimposing each of the 5 photographs on stable anatomic surfaces, changes in the position of the markers were recorded and assessed for parallelism by using 3dMDvultus (3dMD, Atlanta, Ga) and software (Dolphin Imaging & Management Solutions, Chatsworth, Calif). No statistically significant differences were observed between the 5 time points in any of the 3 planes of space. However, a statistically significant difference was observed between the mean angular deviations of 3 reference planes, with a hierarchy of natural head position reproducibility established as coronal > axial > sagittal. Within the parameters of this study, natural head position was found to be reproducible in the sagittal, coronal, and axial planes of space. The coronal plane had the least variation over time, followed by the axial and sagittal planes. Copyright © 2013 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.

  5. Vesicular Disease in 9-Week-Old Pigs Experimentally Infected with Senecavirus A

    DOE PAGES

    Montiel, Nestor; Buckley, Alexandra; Guo, Baoqing; ...

    2016-07-01

    Senecavirus A has been infrequently associated with vesicular disease in swine since 1988. However, clinical disease has not been reproduced after experimental infection with this virus. Here we report vesicular disease in 9-week-old pigs after Senecavirus A infection by the intranasal route under experimental conditions.

  6. Vesicular Disease in 9-Week-Old Pigs Experimentally Infected with Senecavirus A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Montiel, Nestor; Buckley, Alexandra; Guo, Baoqing

    Senecavirus A has been infrequently associated with vesicular disease in swine since 1988. However, clinical disease has not been reproduced after experimental infection with this virus. Here we report vesicular disease in 9-week-old pigs after Senecavirus A infection by the intranasal route under experimental conditions.

  7. Reproducibility of the time to peak torque and the joint angle at peak torque on knee of young sportsmen on the isokinetic dynamometer.

    PubMed

    Bernard, P-L; Amato, M; Degache, F; Edouard, P; Ramdani, S; Blain, H; Calmels, P; Codine, P

    2012-05-01

    Although peak torque has shown acceptable reproducibility, this may not be the case with two other often-used parameters: time to peak torque (TPT) and the angle of peak torque (APT), two parameters often proposed for the characterization of muscular adaptations in athletes. The isokinetic performance of the knee extensors and flexors in both limbs was measured in 29 male athletes. The experimental protocol consisted of three consecutive identical paradigms separated by 45 min breaks. Each test consisted of four maximal concentric efforts performed at 60 and 180°/s. Reproducibility was quantified by the standard error of measurement (SEM), the coefficient of variation (CV) and intra-class correlation coefficients (ICCs), with six forms of ICC calculated. Using ICC as the indicator of reproducibility, the correlations for TPT of both limbs showed a range of 0.51-0.65 in extension and 0.50-0.63 in flexion. For APT, the values were 0.46-0.60 and 0.51-0.81, respectively. In addition, the calculated SEM and CV scores confirmed the low level of absolute reproducibility. Due to their low reproducibility, neither TPT nor APT can serve as independent isokinetic parameters of knee flexor and extensor performance, and they should not be used for the characterization of muscular adaptations in athletes. Copyright © 2012 Elsevier Masson SAS. All rights reserved.
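
    For readers wanting to reproduce this kind of relative and absolute reproducibility analysis, the sketch below computes one of the six ICC forms mentioned above, ICC(2,1) (two-way random effects, single measure, per Shrout and Fleiss), together with the SEM and CV, from a synthetic subjects × sessions matrix. The data, and the convention of expressing CV as the SEM relative to the grand mean, are assumptions of this sketch.

```python
# Sketch: ICC(2,1) (two-way random effects, single measure), SEM and CV
# from a subjects x sessions matrix of time-to-peak-torque values. Data are synthetic.
import numpy as np

x = np.array([[410., 430., 425.],     # each row = one athlete,
              [520., 500., 515.],     # each column = one test session (ms)
              [380., 420., 400.],
              [460., 455., 470.],
              [500., 540., 520.]])
n, k = x.shape
grand = x.mean()
row_mean = x.mean(axis=1)
col_mean = x.mean(axis=0)

ms_rows = k * np.sum((row_mean - grand) ** 2) / (n - 1)            # subjects
ms_cols = n * np.sum((col_mean - grand) ** 2) / (k - 1)            # sessions
resid = x - row_mean[:, None] - col_mean[None, :] + grand
ms_err = np.sum(resid ** 2) / ((n - 1) * (k - 1))

icc_2_1 = (ms_rows - ms_err) / (
    ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)
sem = x.std(ddof=1) * np.sqrt(1 - icc_2_1)   # standard error of measurement
cv = 100 * sem / grand                       # coefficient of variation, %

print(f"ICC(2,1) = {icc_2_1:.2f}, SEM = {sem:.1f} ms, CV = {cv:.1f}%")
```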

  8. Measures of low back function: a review of reproducibility studies.

    PubMed

    Essendrop, Morten; Maul, Irina; Läubli, Thomas; Riihimäki, Hilkka; Schibye, Bente

    2002-05-01

    The objective of the present study was to conduct a systematic literature review, with preset quality criteria, of the reproducibility of low back tests of strength, endurance and range of motion. Literature in Medline and local databases was reviewed for articles concerning the reproducibility of strength, endurance, and range of motion measurements. Measures of low back function are widely used, and are important for both clinical and research purposes in relation to low back problems. A review of the reproducibility of these tests has not previously been conducted. After extensive discussion among all the authors, general evaluation parameters were defined for the quality assessment. Every study was graded from 0 to 2 for each parameter. Parameters evaluated were: number of subjects, subject description, method description, test/retest interval, description of results, and statistics. The literature search revealed a total of 79 studies. Most studies suffered from methodological weaknesses, and only eleven studies received ten or more quality points (maximum 14). The results from the highest graded studies are highlighted. It may be concluded that there is a considerable lack of information about the reproducibility of functional measures for the low back, and therefore a recommendation for consensus is difficult. However, most tests performed in the sagittal plane are reliable for use on groups. Measures of low back function are thought to be of great importance for clinicians and low back researchers in general. A review of reproducibility is helpful both as a survey of available tests and as a source of information on their usefulness in relation to their level of reproducibility.

  9. Benchmarking contactless acquisition sensor reproducibility for latent fingerprint trace evidence

    NASA Astrophysics Data System (ADS)

    Hildebrandt, Mario; Dittmann, Jana

    2015-03-01

    Optical, nanometer-range, contactless, non-destructive sensor devices are promising acquisition techniques in crime scene trace forensics, e.g. for digitizing latent fingerprint traces. Before new approaches are introduced in crime investigations, innovations need to be positively tested and quality ensured. In this paper we investigate sensor reproducibility by studying different scans from four sensors: two chromatic white light sensors (CWL600/CWL1mm), one confocal laser scanning microscope, and one NIR/VIS/UV reflection spectrometer. First, we perform intra-sensor reproducibility testing for the CWL600 with a privacy-conform test set of artificial-sweat-printed, computer-generated fingerprints. We use 24 different fingerprint patterns as original samples (printing samples/templates) for printing with artificial sweat (physical trace samples) and their acquisition with the contactless sensor, resulting in 96 sensor images, called scan or acquired samples. The second test set, for inter-sensor reproducibility assessment, consists of the first three patterns from the first test set, acquired in two consecutive scans using each device. We suggest using a simple feature set in the spatial and frequency domains, known from signal processing, and test its suitability with six different classifiers that classify scan data into small differences (reproducible) and large differences (non-reproducible). Furthermore, we suggest comparing the classification results with biometric verification scores (calculated with NBIS, with a threshold of 40) as a biometric reproducibility score. In our experiments the Bagging classifier is the most reliable classifier in nearly all cases, and the results are also confirmed by the biometric matching rates.
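
    The classification idea sketched above (simple spatial- and frequency-domain features fed to a Bagging classifier that separates reproducible from non-reproducible scan pairs) can be illustrated with scikit-learn. The feature definitions, the synthetic scan pairs and the labels below are illustrative assumptions, not the paper's exact feature space or data.

```python
# Sketch of the classification idea: simple spatial- and frequency-domain
# features per scan pair, classified as reproducible / non-reproducible.
# Feature definitions and labels are illustrative assumptions only.
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

def pair_features(scan_a, scan_b):
    diff = scan_a - scan_b
    spectrum = np.abs(np.fft.fft2(diff))
    return [diff.mean(), diff.std(),            # spatial-domain statistics
            spectrum.mean(), spectrum.std()]    # frequency-domain statistics

# Synthetic scan pairs: class 1 = small differences, class 0 = large differences.
X, y = [], []
for label, noise in [(1, 0.02), (0, 0.30)]:
    for _ in range(40):
        base = rng.random((32, 32))
        X.append(pair_features(base, base + rng.normal(scale=noise, size=base.shape)))
        y.append(label)

clf = BaggingClassifier(n_estimators=50, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, np.array(X), y, cv=5).mean())
```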

  10. The MIMIC Code Repository: enabling reproducibility in critical care research.

    PubMed

    Johnson, Alistair Ew; Stone, David J; Celi, Leo A; Pollard, Tom J

    2018-01-01

    Lack of reproducibility in medical studies is a barrier to the generation of a robust knowledge base to support clinical decision-making. In this paper we outline the Medical Information Mart for Intensive Care (MIMIC) Code Repository, a centralized code base for generating reproducible studies on an openly available critical care dataset. Code is provided to load the data into a relational structure, create extractions of the data, and reproduce entire analysis plans including research studies. Concepts extracted include severity of illness scores, comorbid status, administrative definitions of sepsis, physiologic criteria for sepsis, organ failure scores, treatment administration, and more. Executable documents are used for tutorials and reproduce published studies end-to-end, providing a template for future researchers to replicate. The repository's issue tracker enables community discussion about the data and concepts, allowing users to collaboratively improve the resource. The centralized repository provides a platform for users of the data to interact directly with the data generators, facilitating greater understanding of the data. It also provides a location for the community to collaborate on necessary concepts for research progress and share them with a larger audience. Consistent application of the same code for underlying concepts is a key step in ensuring that research studies on the MIMIC database are comparable and reproducible. By providing open source code alongside the freely accessible MIMIC-III database, we enable end-to-end reproducible analysis of electronic health records. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.
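
    The kind of reproducible extraction the repository standardizes can be sketched as follows, assuming a local PostgreSQL build of MIMIC-III; the icustays table is part of the public MIMIC-III schema, but the connection string, schema name and the specific query are placeholders for illustration and are not code taken from the MIMIC Code Repository.

```python
# Sketch: a reproducible extraction against a local PostgreSQL build of MIMIC-III.
# Connection string and schema name are placeholders.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:password@localhost:5432/mimic")  # placeholder

query = """
SELECT ie.subject_id,
       ie.hadm_id,
       ie.intime,
       ie.outtime,
       EXTRACT(EPOCH FROM (ie.outtime - ie.intime)) / 86400.0 AS icu_los_days
FROM mimiciii.icustays AS ie
ORDER BY ie.subject_id
LIMIT 100;
"""

icu_stays = pd.read_sql(query, engine)
print(icu_stays.head())
print("median ICU length of stay (days):", icu_stays["icu_los_days"].median())
```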

  11. A study of accurate exchange-correlation functionals through adiabatic connection

    NASA Astrophysics Data System (ADS)

    Singh, Rabeet; Harbola, Manoj K.

    2017-10-01

    A systematic way of improving exchange-correlation energy functionals of density functional theory has been to make them satisfy more and more exact relations. Starting from the initial generalized gradient approximation (GGA) functionals, this has culminated in the recently proposed SCAN (strongly constrained and appropriately normed) functional that satisfies several known constraints and is appropriately normed. The ultimate test for the functionals developed is the accuracy of the energy calculated by employing them. In this paper, we test these exchange-correlation functionals—the GGA hybrid functionals B3LYP and PBE0 and the meta-GGA functional SCAN—from a different perspective. We study how accurately these functionals reproduce the exchange-correlation energy when the electron-electron interaction is scaled as αVee with α varying between 0 and 1. Our study reveals an interesting comparison between these functionals and the associated difference Tc between the interacting and the non-interacting kinetic energy for the same density.
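
    For reference, the α-scaling used above is the standard adiabatic-connection construction of density functional theory; the textbook relations below (not formulas quoted from this paper) define the integrand W_xc(α) at fixed density n, the exchange-correlation energy as its coupling-constant integral, and the kinetic correlation energy T_c referred to in the abstract.

```latex
% Adiabatic connection at fixed density n(\mathbf{r}), with the
% electron-electron interaction scaled as \alpha \hat{V}_{ee}, 0 \le \alpha \le 1.
\begin{align}
  W_{xc}(\alpha) &= \langle \Psi_\alpha^{\min} \,|\, \hat{V}_{ee} \,|\, \Psi_\alpha^{\min} \rangle - U_H[n], \\
  E_{xc}[n]      &= \int_0^1 W_{xc}(\alpha)\, \mathrm{d}\alpha, \\
  T_c[n]         &= E_{xc}[n] - W_{xc}(1).
\end{align}
```

    Here Ψ_α^min is the minimizing wave function yielding the density n at coupling strength α, U_H[n] is the Hartree energy, and T_c is the difference between the interacting and non-interacting kinetic energies mentioned in the abstract.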

  12. Materials for interocclusal records and their ability to reproduce a 3-dimensional jaw relationship.

    PubMed

    Ockert-Eriksson, G; Eriksson, A; Lockowandt, P; Eriksson, O

    2000-01-01

    The purpose of this study was to determine if accuracy and dimensional stability of vinyl polysiloxanes and irreversible hydrocolloids stabilized by a tray used for fixed prosthodontics, removable partial, and complete denture cases are comparable to those of waxes and record rims and if storage time (24 hours or 6 days) affects dimensional stability of the tested materials. Two waxes, two record rims, three vinyl polysiloxanes, and one irreversible hydrocolloid (alginate) were examined. Three pairs of master casts with measuring steel rods were mounted on an articulator (initial position). Five records were made of each material, and the upper cast was remounted after 24 hours or 6 days so that deviations from the initial position could be measured. Vinyl polysiloxanes reinforced by a stabilization tray were the most accurate materials able to reproduce a settled interocclusal position. Mounting casts (fixed prosthodontics cases) without records gave accuracy similar to wax records. Record rims used for removable partial and complete denture cases produced lesser accuracy than vinyl polysiloxanes and irreversible hydrocolloid stabilized by a tray. Accuracy was not significantly affected by storage time. The results show that accuracy of vinyl polysiloxanes and irreversible hydrocolloids reinforced by a tray is superior to that of record rims with regard to the complete denture case and is among the most accurate with regard to the removable partial denture case. For fixed prosthodontics, however, reinforcement is unnecessary.

  13. Robust and accurate vectorization of line drawings.

    PubMed

    Hilaire, Xavier; Tombre, Karl

    2006-06-01

    This paper presents a method for vectorizing the graphical parts of paper-based line drawings. The method consists of separating the input binary image into layers of homogeneous thickness, skeletonizing each layer, segmenting the skeleton by a method based on random sampling, and simplifying the result. The segmentation method is robust with a best bound of 50 percent noise reached for indefinitely long primitives. Accurate estimation of the recognized vector's parameters is enabled by explicitly computing their feasibility domains. Theoretical performance analysis and expression of the complexity of the segmentation method are derived. Experimental results and comparisons with other vectorization systems are also provided.
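
    The skeletonization and random-sampling segmentation steps described above can be illustrated with scikit-image and scikit-learn; the synthetic stroke, the single-line model and the RANSAC settings below are assumptions of this sketch, not the authors' algorithm.

```python
# Sketch: skeletonize a binary line drawing and fit one straight primitive to the
# skeleton pixels by random sampling (RANSAC). The input image is synthetic.
import numpy as np
from skimage.morphology import skeletonize
from sklearn.linear_model import RANSACRegressor

# Synthetic binary drawing: a thick, roughly diagonal stroke.
img = np.zeros((100, 100), dtype=bool)
for t in range(100):
    r = t
    c = int(0.7 * t) + 10
    img[max(r - 2, 0):r + 3, max(c - 2, 0):min(c + 3, 100)] = True

skeleton = skeletonize(img)                      # one-pixel-wide centreline
rows, cols = np.nonzero(skeleton)

# RANSAC fit of col = a*row + b, robust to spurious skeleton branches.
ransac = RANSACRegressor(random_state=0)
ransac.fit(rows.reshape(-1, 1), cols)
a = ransac.estimator_.coef_[0]
b = ransac.estimator_.intercept_
print(f"fitted primitive: col = {a:.2f} * row + {b:.2f}, "
      f"inliers = {ransac.inlier_mask_.sum()} / {len(rows)}")
```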

  14. Using prediction markets to estimate the reproducibility of scientific research.

    PubMed

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A; Johannesson, Magnus

    2015-12-15

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants' individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a "statistically significant" finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications.

  15. Using prediction markets to estimate the reproducibility of scientific research

    PubMed Central

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A.; Johannesson, Magnus

    2015-01-01

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants’ individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a “statistically significant” finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications. PMID:26553988

  16. Reproducibility of dynamically represented acoustic lung images from healthy individuals

    PubMed Central

    Maher, T M; Gat, M; Allen, D; Devaraj, A; Wells, A U; Geddes, D M

    2008-01-01

    Background and aim: Acoustic lung imaging offers a unique method for visualising the lung. This study was designed to demonstrate reproducibility of acoustic lung images recorded from healthy individuals at different time points and to assess intra- and inter-rater agreement in the assessment of dynamically represented acoustic lung images. Methods: Recordings from 29 healthy volunteers were made on three separate occasions using vibration response imaging. Reproducibility was measured using quantitative, computerised assessment of vibration energy. Dynamically represented acoustic lung images were scored by six blinded raters. Results: Quantitative measurement of acoustic recordings was highly reproducible with an intraclass correlation score of 0.86 (very good agreement). Intraclass correlations for inter-rater agreement and reproducibility were 0.61 (good agreement) and 0.86 (very good agreement), respectively. There was no significant difference found between the six raters at any time point. Raters ranged from 88% to 95% in their ability to identically evaluate the different features of the same image presented to them blinded on two separate occasions. Conclusion: Acoustic lung imaging is reproducible in healthy individuals. Graphic representation of lung images can be interpreted with a high degree of accuracy by the same and by different reviewers. PMID:18024534

  17. Reproducibility of an optical measurement system for the clinical evaluation of active knee rotation in weight-bearing, healthy subjects.

    PubMed

    Testa, R; Chouteau, J; Viste, A; Cheze, L; Fessy, M-H; Moyen, B

    2012-04-01

    A knee is typically evaluated passively by a clinician during an office visit, without using dedicated measurement tools. When the knee is evaluated with the patient standing and actively participating in the movement, the results will differ from those obtained when the knee is passively moved through its range of motion by the surgeon. If a precise measurement system were available, it could provide additional information to the clinician during this evaluation. The goal of this study was to verify the reproducibility of a fast, flexible optical measurement system to measure rotational knee laxity during weight-bearing. Two passive reflective targets were placed on the legs of 11 subjects to monitor femur and tibia displacements in three dimensions. Subjects performed internal and external rotation movements with the knee extended or flexed 30°. During each movement, seven variables were measured: internal rotation, external rotation and overall laxity in extension and 30° flexion, along with the neutral rotation value in 30° flexion. Measurement accuracy was also assessed and the right and left knees were compared. Reproducibility was assessed over two measurement sessions. The calculated intra-class correlation coefficient (ICC) for reproducibility was above 0.9 for five of the seven variables measured. The calculated ICC for the right/left comparison was above 0.75 for five of the seven variables measured. These results confirmed that the proposed system provides reproducible measurements. Our right/left comparison results were consistent with the published literature. This system is fast, reproducible and flexible, which makes it suitable for assessing various weight-bearing movements during clinical evaluations. Level III, experimental study. Copyright © 2012 Elsevier Masson SAS. All rights reserved.

  18. Logical Experimental Design and Execution in the Biomedical Sciences.

    PubMed

    Holder, Daniel J; Marino, Michael J

    2017-03-17

    Lack of reproducibility has been highlighted as a significant problem in biomedical research. The present unit is devoted to describing ways to help ensure that research findings can be replicated by others, with a focus on the design and execution of laboratory experiments. Essential components for this include clearly defining the question being asked, using available information or information from pilot studies to aid in the design of the experiment, and choosing manipulations under a logical framework based on Mill's "methods of knowing" to build confidence in putative causal links. Final experimental design requires systematic attention to detail, including the choice of controls, sample selection, blinding to avoid bias, and the use of power analysis to determine the sample size. Execution of the experiment is done with care to ensure that the independent variables are controlled and the measurements of the dependent variables are accurate. While there are always differences among laboratories with respect to technical expertise, equipment, and suppliers, execution of the steps itemized in this unit will ensure well-designed and well-executed experiments to answer any question in biomedical research. © 2017 by John Wiley & Sons, Inc.
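
    As a concrete instance of the power-analysis step mentioned above, the sketch below uses statsmodels to solve for the per-group sample size of a two-sample t test; the effect size, significance level and target power are illustrative choices, not recommendations taken from the unit.

```python
# Sketch: sample-size determination for a two-group comparison via power analysis.
# Effect size, alpha and power below are illustrative choices.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.8,   # Cohen's d expected from pilot data
                                   alpha=0.05,        # two-sided significance level
                                   power=0.80,        # desired statistical power
                                   alternative="two-sided")
print(f"required sample size per group: {n_per_group:.1f} (round up)")
```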

  19. Reproducibility of cerebrospinal venous blood flow and vessel anatomy with the use of phase contrast-vastly undersampled isotropic projection reconstruction and contrast-enhanced MRA.

    PubMed

    Schrauben, E M; Johnson, K M; Huston, J; Del Rio, A M; Reeder, S B; Field, A; Wieben, O

    2014-05-01

    The chronic cerebrospinal venous insufficiency hypothesis raises interest in cerebrospinal venous blood flow imaging, which is more complex and less established than in arteries. For accurate assessment of venous flow in chronic cerebrospinal venous insufficiency diagnosis and research, we must account for physiologic changes in flow patterns. This study examines day-to-day flow variability in cerebrospinal veins by use of 4D MR flow and contrast-enhanced MRA under typical, uncontrolled conditions in healthy individuals. Ten healthy volunteers were scanned in a test-retest fashion by use of a 4D flow MR imaging technique and contrast-enhanced MRA. Flow parameters obtained from phase contrast-vastly undersampled isotropic projection reconstruction and contrast-enhanced MRA scoring measurements in the head, neck, and chest veins were analyzed for internal consistency and interscan reproducibility. Internal consistency was satisfied at the torcular herophili, with an input-output difference of 2.2%. Percentages of variations in flow were 20.3%, internal jugular vein; 20.4%, azygos vein; 6.8%, transverse sinus; and 5.1%, common carotid artery. Retrograde flow was found in the lower internal jugular vein (4.8%) and azygos vein (7.2%). Contrast-enhanced MRA interscan κ values for the internal jugular vein (left: 0.474, right: 0.366) and azygos vein (-0.053) showed poor interscan agreement. Phase contrast-vastly undersampled isotropic projection reconstruction blood flow measurements are reliable and highly reproducible in intracranial veins and in the common carotid artery but not in veins of the neck (internal jugular vein) and chest (azygos vein) because of normal physiologic variation. Retrograde flow normally may be observed in the lower internal jugular vein and azygos vein. Low interrater agreement in contrast-enhanced MRA scans was observed. These findings have important implications for imaging diagnosis and experimental research of chronic cerebrospinal venous

  20. Accurate Identification of Fear Facial Expressions Predicts Prosocial Behavior

    PubMed Central

    Marsh, Abigail A.; Kozak, Megan N.; Ambady, Nalini

    2009-01-01

    The fear facial expression is a distress cue that is associated with the provision of help and prosocial behavior. Prior psychiatric studies have found deficits in the recognition of this expression by individuals with antisocial tendencies. However, no prior study has shown accuracy for recognition of fear to predict actual prosocial or antisocial behavior in an experimental setting. In 3 studies, the authors tested the prediction that individuals who recognize fear more accurately will behave more prosocially. In Study 1, participants who identified fear more accurately also donated more money and time to a victim in a classic altruism paradigm. In Studies 2 and 3, participants’ ability to identify the fear expression predicted prosocial behavior in a novel task designed to control for confounding variables. In Study 3, accuracy for recognizing fear proved a better predictor of prosocial behavior than gender, mood, or scores on an empathy scale. PMID:17516803

  1. Accurate identification of fear facial expressions predicts prosocial behavior.

    PubMed

    Marsh, Abigail A; Kozak, Megan N; Ambady, Nalini

    2007-05-01

    The fear facial expression is a distress cue that is associated with the provision of help and prosocial behavior. Prior psychiatric studies have found deficits in the recognition of this expression by individuals with antisocial tendencies. However, no prior study has shown accuracy for recognition of fear to predict actual prosocial or antisocial behavior in an experimental setting. In 3 studies, the authors tested the prediction that individuals who recognize fear more accurately will behave more prosocially. In Study 1, participants who identified fear more accurately also donated more money and time to a victim in a classic altruism paradigm. In Studies 2 and 3, participants' ability to identify the fear expression predicted prosocial behavior in a novel task designed to control for confounding variables. In Study 3, accuracy for recognizing fear proved a better predictor of prosocial behavior than gender, mood, or scores on an empathy scale.

  2. Fast and Accurate Exhaled Breath Ammonia Measurement

    PubMed Central

    Solga, Steven F.; Mudalel, Matthew L.; Spacek, Lisa A.; Risby, Terence H.

    2014-01-01

    This exhaled breath ammonia method is based on a fast and highly sensitive spectroscopic technique known as quartz-enhanced photoacoustic spectroscopy (QEPAS), which uses a quantum cascade laser. The monitor is coupled to a sampler that measures mouth pressure and carbon dioxide. The system is temperature controlled and specifically designed to address the reactivity of this compound. The sampler provides immediate feedback to the subject and the technician on the quality of the breath effort. Together with the quick response time of the monitor, this system is capable of accurately measuring exhaled breath ammonia representative of deep lung systemic levels. Because the system is easy to use and produces real time results, it has enabled experiments to identify factors that influence measurements. For example, mouth rinse and oral pH reproducibly and significantly affect results and therefore must be controlled. Temperature and mode of breathing are other examples. As our understanding of these factors evolves, error is reduced, and clinical studies become more meaningful. This system is very reliable and individual measurements are inexpensive. The sampler is relatively inexpensive and quite portable, but the monitor is neither. This limits options for some clinical studies and provides a rationale for future innovations. PMID:24962141

  3. Accurate experimental determination of the isotope effects on the triple point temperature of water. I. Dependence on the 2H abundance

    NASA Astrophysics Data System (ADS)

    Faghihi, V.; Peruzzi, A.; Aerts-Bijma, A. T.; Jansen, H. G.; Spriensma, J. J.; van Geel, J.; Meijer, H. A. J.

    2015-12-01

    Variation in the isotopic composition of water is one of the major contributors to uncertainty in the realization of the triple point of water (TPW). Although the dependence of the TPW on the isotopic composition of the water has been known for years, there is still a lack of a detailed and accurate experimental determination of the values for the correction constants. This paper is the first of two articles (Part I and Part II) that address quantification of isotope abundance effects on the triple point temperature of water. In this paper, we describe our experimental assessment of the 2H isotope effect. We manufactured five triple point cells with prepared water mixtures spanning a range of 2H isotopic abundances that widely encompasses the natural abundance range, while the 18O and 17O isotopic abundances were kept approximately constant and the 18O-17O ratio was close to the Meijer-Li relationship for natural waters. The selected range of 2H isotopic abundances led to cells that realised TPW temperatures between approximately -140 μK and +2500 μK with respect to the TPW temperature as realized by VSMOW (Vienna Standard Mean Ocean Water). Our experiment led to the determination of the value of the δ2H correction parameter, A2H = 673 μK / (‰ deviation of δ2H from VSMOW), with a combined uncertainty of 4 μK (k = 1, or 1σ).

  4. Machine learning of accurate energy-conserving molecular force fields.

    PubMed

    Chmiela, Stefan; Tkatchenko, Alexandre; Sauceda, Huziel E; Poltavsky, Igor; Schütt, Kristof T; Müller, Klaus-Robert

    2017-05-01

    Using conservation of energy, a fundamental property of closed classical and quantum mechanical systems, we develop an efficient gradient-domain machine learning (GDML) approach to construct accurate molecular force fields using a restricted number of samples from ab initio molecular dynamics (AIMD) trajectories. The GDML implementation is able to reproduce global potential energy surfaces of intermediate-sized molecules with an accuracy of 0.3 kcal mol⁻¹ for energies and 1 kcal mol⁻¹ Å⁻¹ for atomic forces using only 1000 conformational geometries for training. We demonstrate this accuracy for AIMD trajectories of molecules, including benzene, toluene, naphthalene, ethanol, uracil, and aspirin. The challenge of constructing conservative force fields is accomplished in our work by learning in a Hilbert space of vector-valued functions that obey the law of energy conservation. The GDML approach enables quantitative molecular dynamics simulations for molecules at a fraction of the cost of explicit AIMD calculations, thereby allowing the construction of efficient force fields with the accuracy and transferability of high-level ab initio methods.

  5. Machine learning of accurate energy-conserving molecular force fields

    PubMed Central

    Chmiela, Stefan; Tkatchenko, Alexandre; Sauceda, Huziel E.; Poltavsky, Igor; Schütt, Kristof T.; Müller, Klaus-Robert

    2017-01-01

    Using conservation of energy—a fundamental property of closed classical and quantum mechanical systems—we develop an efficient gradient-domain machine learning (GDML) approach to construct accurate molecular force fields using a restricted number of samples from ab initio molecular dynamics (AIMD) trajectories. The GDML implementation is able to reproduce global potential energy surfaces of intermediate-sized molecules with an accuracy of 0.3 kcal mol⁻¹ for energies and 1 kcal mol⁻¹ Å⁻¹ for atomic forces using only 1000 conformational geometries for training. We demonstrate this accuracy for AIMD trajectories of molecules, including benzene, toluene, naphthalene, ethanol, uracil, and aspirin. The challenge of constructing conservative force fields is accomplished in our work by learning in a Hilbert space of vector-valued functions that obey the law of energy conservation. The GDML approach enables quantitative molecular dynamics simulations for molecules at a fraction of the cost of explicit AIMD calculations, thereby allowing the construction of efficient force fields with the accuracy and transferability of high-level ab initio methods. PMID:28508076

  6. Accurate deuterium spectroscopy for fundamental studies

    NASA Astrophysics Data System (ADS)

    Wcisło, P.; Thibault, F.; Zaborowski, M.; Wójtewicz, S.; Cygan, A.; Kowzan, G.; Masłowski, P.; Komasa, J.; Puchalski, M.; Pachucki, K.; Ciuryło, R.; Lisak, D.

    2018-07-01

    We present an accurate measurement of the weak quadrupole S(2) 2-0 line in self-perturbed D2 and theoretical ab initio calculations of both collisional line-shape effects and the energy of this rovibrational transition. The spectra were collected over the 247-984 Torr pressure range with a frequency-stabilized cavity ring-down spectrometer linked to an optical frequency comb (OFC) referenced to a primary time standard. Our line-shape modeling employed quantum calculations of molecular scattering (the pressure broadening and shift and their speed dependencies were calculated, while the complex frequency of optical velocity-changing collisions was fitted to experimental spectra). The velocity-changing collisions are handled with the hard-sphere collisional kernel. The experimental and theoretical pressure broadening and shift are consistent within 5% and 27%, respectively (the discrepancy for the shift is 8% when referred not to the speed-averaged value, which is close to zero, but to the range of variability of the speed-dependent shift). We use our high-pressure measurement to determine the energy, ν0, of the S(2) 2-0 transition. The ab initio line-shape calculations allowed us to mitigate the expected collisional systematics, reaching a 410 kHz accuracy for ν0. We report a theoretical determination of ν0 taking into account relativistic and QED corrections up to α⁵. Our estimation of the accuracy of the theoretical ν0 is 1.3 MHz. We observe a 3.4σ discrepancy between the experimental and theoretical ν0.

  7. TRIC: an automated alignment strategy for reproducible protein quantification in targeted proteomics.

    PubMed

    Röst, Hannes L; Liu, Yansheng; D'Agostino, Giuseppe; Zanella, Matteo; Navarro, Pedro; Rosenberger, George; Collins, Ben C; Gillet, Ludovic; Testa, Giuseppe; Malmström, Lars; Aebersold, Ruedi

    2016-09-01

    Next-generation mass spectrometric (MS) techniques such as SWATH-MS have substantially increased the throughput and reproducibility of proteomic analysis, but ensuring consistent quantification of thousands of peptide analytes across multiple liquid chromatography-tandem MS (LC-MS/MS) runs remains a challenging and laborious manual process. To produce highly consistent and quantitatively accurate proteomics data matrices in an automated fashion, we developed TRIC (http://proteomics.ethz.ch/tric/), a software tool that utilizes fragment-ion data to perform cross-run alignment, consistent peak-picking and quantification for high-throughput targeted proteomics. TRIC reduced the identification error compared to a state-of-the-art SWATH-MS analysis without alignment by more than threefold at constant recall while correcting for highly nonlinear chromatographic effects. On a pulsed-SILAC experiment performed on human induced pluripotent stem cells, TRIC was able to automatically align and quantify thousands of light and heavy isotopic peak groups. Thus, TRIC fills a gap in the pipeline for automated analysis of massively parallel targeted proteomics data sets.

  8. Quantification of atrial dynamics using cardiovascular magnetic resonance: inter-study reproducibility.

    PubMed

    Kowallick, Johannes T; Morton, Geraint; Lamata, Pablo; Jogiya, Roy; Kutty, Shelby; Hasenfuß, Gerd; Lotz, Joachim; Nagel, Eike; Chiribiri, Amedeo; Schuster, Andreas

    2015-05-17

    Cardiovascular magnetic resonance (CMR) offers quantification of phasic atrial functions based on volumetric assessment and more recently, on CMR feature tracking (CMR-FT) quantitative strain and strain rate (SR) deformation imaging. Inter-study reproducibility is a key requirement for longitudinal studies but has not been defined for CMR-based quantification of left atrial (LA) and right atrial (RA) dynamics. Long-axis 2- and 4-chamber cine images were acquired at 9:00 (Exam A), 9:30 (Exam B) and 14:00 (Exam C) in 16 healthy volunteers. LA and RA reservoir, conduit and contractile booster pump functions were quantified by volumetric indexes as derived from fractional volume changes and by strain and SR as derived from CMR-FT. Exam A and B were compared to assess the inter-study reproducibility. Morning and afternoon scans were compared to address possible diurnal variation of atrial function. Inter-study reproducibility was within acceptable limits for all LA and RA volumetric, strain and SR parameters. Inter-study reproducibility was better for volumetric indexes and strain than for SR parameters and better for LA than for RA dynamics. For the LA, reservoir function showed the best reproducibility (intraclass correlation coefficient (ICC) 0.94-0.97, coefficient of variation (CoV) 4.5-8.2%), followed by conduit (ICC 0.78-0.97, CoV 8.2-18.5%) and booster pump function (ICC 0.71-0.95, CoV 18.3-22.7). Similarly, for the RA, reproducibility was best for reservoir function (ICC 0.76-0.96, CoV 7.5-24.0%) followed by conduit (ICC 0.67-0.91, CoV 13.9-35.9) and booster pump function (ICC 0.73-0.90, CoV 19.4-32.3). Atrial dynamics were not measurably affected by diurnal variation between morning and afternoon scans. Inter-study reproducibility for CMR-based derivation of LA and RA functions is acceptable using either volumetric, strain or SR parameters with LA function showing higher reproducibility than RA function assessment. Amongst the different functional components

  9. Advancements in RNASeqGUI towards a Reproducible Analysis of RNA-Seq Experiments

    PubMed Central

    Russo, Francesco; Righelli, Dario

    2016-01-01

    We present the advancements and novelties recently introduced in RNASeqGUI, a graphical user interface that helps biologists to handle and analyse large data collected in RNA-Seq experiments. This work focuses on the concept of reproducible research and shows how it has been incorporated in RNASeqGUI to provide reproducible (computational) results. The novel version of RNASeqGUI combines graphical interfaces with tools for reproducible research, such as literate statistical programming, human readable report, parallel executions, caching, and interactive and web-explorable tables of results. These features allow the user to analyse big datasets in a fast, efficient, and reproducible way. Moreover, this paper represents a proof of concept, showing a simple way to develop computational tools for Life Science in the spirit of reproducible research. PMID:26977414

  10. Repeatability and reproducibility of intracellular molar concentration assessed by synchrotron-based x-ray fluorescence microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merolle, L., E-mail: lucia.merolle@elettra.eu; Gianoncelli, A.; Malucelli, E., E-mail: emil.malucelli@unibo.it

    2016-01-28

    Elemental analysis of biological samples can give information about the content and distribution of elements essential for human life or of trace elements whose absence is the cause of abnormal biological function or development. However, biological systems contain an ensemble of cells with heterogeneous chemistry and elemental content; therefore, accurate characterization of samples with high cellular heterogeneity may only be achieved by analyzing single cells. Powerful methods in molecular biology are abundant; among them, X-ray microscopy based on synchrotron light sources has been gaining increasing attention thanks to its extreme sensitivity. However, the reproducibility and repeatability of these measurements are among the major obstacles to achieving statistical significance in single-cell population analysis. In this study, we compared the elemental content of human colon adenocarcinoma cells obtained during three distinct accesses to synchrotron radiation light.

  11. The application of midbond basis sets in efficient and accurate ab initio calculations on electron-deficient systems

    NASA Astrophysics Data System (ADS)

    Choi, Chu Hwan

    2002-09-01

    Ab initio chemistry has shown great promise in reproducing experimental results and in its predictive power. The many complicated computational models and methods seem impenetrable to an inexperienced scientist, and the reliability of the results is not easily interpreted. The application of midbond orbitals is used to develop a general method for calculating weak intermolecular interactions, especially those involving electron-deficient systems. Using the criteria of consistency, flexibility, accuracy and efficiency, we propose a supermolecular method of calculation using the full counterpoise (CP) method of Boys and Bernardi, coupled with Moller-Plesset (MP) perturbation theory as an efficient electron-correlation method. We also advocate the use of the highly efficient and reliable correlation-consistent polarized valence basis sets of Dunning. To these basis sets, we add a general set of midbond orbitals and demonstrate greatly enhanced efficiency in the calculation. The H2-H2 dimer is taken as a benchmark test case for our method, and details of the computation are elaborated. Our method reproduces with great accuracy the dissociation energies of other previous theoretical studies. The added efficiency of extending the basis sets by conventional means is compared with the performance of our midbond-extended basis sets, and the improvement found with midbond functions is notably superior in every case tested. Finally, a novel application of midbond functions to the BH5 complex is presented. The system is an unusual van der Waals complex. The interaction potential curves are presented for several standard basis sets and midbond-enhanced basis sets, as well as for two popular, alternative correlation methods. We report that MP theory appears to be superior to coupled-cluster (CC) in speed, while it is more stable than B3LYP, a widely used density functional theory (DFT) method. Application of our general method yields excellent results for the midbond basis sets

  12. An Effective and Reproducible Model of Ventricular Fibrillation in Crossbred Yorkshire Swine (Sus scrofa) for Use in Physiologic Research.

    PubMed

    Burgert, James M; Johnson, Arthur D; Garcia-Blanco, Jose C; Craig, W John; O'Sullivan, Joseph C

    2015-10-01

    Transcutaneous electrical induction (TCEI) has been used to induce ventricular fibrillation (VF) in laboratory swine for physiologic and resuscitation research. Many studies do not describe the method of TCEI in detail, thus making replication by future investigators difficult. Here we describe a detailed method of electrically inducing VF that was used successfully in a prospective, experimental resuscitation study. Specifically, an electrical current was passed through the heart to induce VF in crossbred Yorkshire swine (n = 30); the current was generated by using two 22-gauge spinal needles, with one placed above and one below the heart, and three 9V batteries connected in series. VF developed in 28 of the 30 pigs (93%) within 10 s of beginning the procedure. In the remaining 2 swine, VF was induced successfully after medial redirection of the superior parasternal needle. The TCEI method is simple, reproducible, and cost-effective. TCEI may be especially valuable to researchers with limited access to funding, sophisticated equipment, or colleagues experienced in interventional cardiology techniques. The TCEI method might be most appropriate for pharmacologic studies requiring VF, VF resulting from the R-on-T phenomenon (as in prolonged QT syndrome), and VF arising from other ectopic or reentrant causes. However, the TCEI method does not accurately model the most common cause of VF, acute coronary occlusive disease. Researchers must consider the limitations of TCEI that may affect internal and external validity of collected data, when designing experiments using this model of VF.

  13. Data management routines for reproducible research using the G-Node Python Client library

    PubMed Central

    Sobolev, Andrey; Stoewer, Adrian; Pereira, Michael; Kellner, Christian J.; Garbers, Christian; Rautenberg, Philipp L.; Wachtler, Thomas

    2014-01-01

    Structured, efficient, and secure storage of experimental data and associated meta-information constitutes one of the most pressing technical challenges in modern neuroscience, and does so particularly in electrophysiology. The German INCF Node aims to provide open-source solutions for this domain that support the scientific data management and analysis workflow, and thus facilitate future data access and reproducible research. G-Node provides a data management system, accessible through an application interface, that is based on a combination of standardized data representation and flexible data annotation to account for the variety of experimental paradigms in electrophysiology. The G-Node Python Library exposes these services to the Python environment, enabling researchers to organize and access their experimental data using their familiar tools while gaining the advantages that centralized storage entails. The library provides powerful query features, including data slicing and selection by metadata, as well as fine-grained permission control for collaboration and data sharing. Here we demonstrate key actions in working with experimental neuroscience data, such as building a metadata structure, organizing recorded data in datasets, annotating data, or selecting data regions of interest, that can be automated to a large degree using the library. Compliant with existing de-facto standards, the G-Node Python Library is compatible with many Python tools in the field of neurophysiology and thus enables seamless integration of data organization into the scientific data workflow. PMID:24634654
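
    As an illustration of the kind of workflow described above (building a metadata structure, organizing recorded data, and selecting regions of interest), the following minimal Python sketch uses only generic containers and NumPy. All names and structures are hypothetical stand-ins and do not reflect the actual G-Node Python Library API.

        import numpy as np

        # Hypothetical in-memory stand-in for an annotated recording; a real G-Node
        # workflow would create and store such objects through the client library.
        recording = {
            "metadata": {                                   # nested key/value metadata
                "subject": {"species": "mouse", "id": "M042"},
                "stimulus": {"type": "grating", "contrast": 0.8},
            },
            "sampling_rate_hz": 20000.0,
            "signal": np.random.randn(20000 * 10),          # 10 s of fake voltage samples
        }

        def select_region(rec, t_start, t_stop):
            """Return the samples between t_start and t_stop (in seconds)."""
            fs = rec["sampling_rate_hz"]
            return rec["signal"][int(t_start * fs):int(t_stop * fs)]

        def query_by_metadata(recordings, key_path, value):
            """Select recordings whose nested metadata matches value, e.g. ('subject', 'species')."""
            hits = []
            for rec in recordings:
                node = rec["metadata"]
                for key in key_path:
                    node = node.get(key, {})
                if node == value:
                    hits.append(rec)
            return hits

        roi = select_region(recording, 2.0, 2.5)                          # 0.5 s region of interest
        mice = query_by_metadata([recording], ("subject", "species"), "mouse")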

  14. Planar heterojunction perovskite solar cells with superior reproducibility

    PubMed Central

    Jeon, Ye-Jin; Lee, Sehyun; Kang, Rira; Kim, Jueng-Eun; Yeo, Jun-Seok; Lee, Seung-Hoon; Kim, Seok-Soon; Yun, Jin-Mun; Kim, Dong-Yu

    2014-01-01

    Perovskite solar cells (PeSCs) have been considered one of the competitive next generation power sources. To date, light-to-electric conversion efficiencies have rapidly increased to over 10%, and further improvements are expected. However, the poor device reproducibility of PeSCs ascribed to their inhomogeneously covered film morphology has hindered their practical application. Here, we demonstrate high-performance PeSCs with superior reproducibility by introducing small amounts of N-cyclohexyl-2-pyrrolidone (CHP) as a morphology controller into N,N-dimethylformamide (DMF). As a result, highly homogeneous film morphology, similar to that achieved by vacuum-deposition methods, as well as a high PCE of 10% and an extremely small performance deviation within 0.14% were achieved. This study represents a method for realizing efficient and reproducible planar heterojunction (PHJ) PeSCs through morphology control, taking a major step forward in the low-cost and rapid production of PeSCs by solving one of the biggest problems of PHJ perovskite photovoltaic technology through a facile method. PMID:25377945

  15. Accurate SERS detection of malachite green in aquatic products on basis of graphene wrapped flexible sensor.

    PubMed

    Ouyang, Lei; Yao, Ling; Zhou, Taohong; Zhu, Lihua

    2018-10-16

    Malachite Green (MG) is a banned pesticide in aquaculture products. As a required inspection item, its fast and accurate determination before the products reach the market is very important. Surface enhanced Raman scattering (SERS) is a promising tool for MG sensing, but several problems must be overcome, such as fairly poor sensitivity and reproducibility, and especially laser-induced chemical conversion and photo-bleaching during SERS observation. By using a graphene-wrapped Ag array based flexible membrane sensor, a modified SERS strategy was proposed for the sensitive and accurate detection of MG. The graphene layer functioned as an inert protector that impedes chemical conversion of the bioproduct Leucomalachite Green (LMG) to MG during SERS detection, and as a heat transmitter that prevents laser-induced photo-bleaching, which enables the separate detection of MG and LMG in fish extracts. The combination of the Ag array and the graphene cover also produced plentiful, densely and uniformly distributed hot spots, leading to an analytical enhancement factor of up to 3.9 × 10^8 and excellent reproducibility (relative standard deviation as low as 5.8% for 70 runs). The proposed method was easily used for MG detection with a limit of detection (LOD) as low as 2.7 × 10^-11 mol L^-1. The flexibility of the sensor makes it suitable for in-field fast detection of MG residues on the scale of a living fish through a surface extraction and paste transferring manner. The developed strategy was successfully applied in the analysis of real samples, showing good prospects for both fast inspection and quantitative detection of MG. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. Research Elements: new article types by Elsevier to facilitate reproducibility in science

    NASA Astrophysics Data System (ADS)

    Zudilova-Seinstra, Elena; van Hensbergen, Kitty; Wacek, Bart

    2016-04-01

    When researchers start to make plans for new experiments, this is the beginning of a whole cycle of work, including experimental designs, tweaking of existing methods, developing protocols, writing code, collecting and processing experimental data, etc. A large part of this very useful information rarely gets published, which makes experiments difficult to reproduce. The same holds for experimental data, which is not always provided in a reusable format and lacks descriptive information. Furthermore, many types of data, such as a replication data, negative datasets or data from "intermediate experiments" often don't get published because they have no place in a research journal. To address this concern, Elsevier launched a series of peer-reviewed journal titles grouped under the umbrella of Research Elements (https://www.elsevier.com/books-and-journals/research-elements) that allow researchers to publish their data, software, materials and methods and other elements of the research cycle in a brief article format. To facilitate reproducibility, Research Elements have thoroughly thought out submission templates that include all necessary information and metadata as well as peer-review criteria defined per article type. Research Elements can be applicable to multiple research areas; for example, a number of multidisciplinary journals (Data in Brief, SoftwareX, MethodsX) welcome submissions from a large number of subject areas. At other times, these elements are better served within a single field; therefore, a number of domain-specific journals (e.g.: Genomics Data, Chemical Data Collections, Neurocomputing) support the new article formats, too. Upon publication, all Research Elements are assigned with persistent identifiers for direct citation and easy discoverability. Persistent identifiers are also used for interlinking Research Elements and relevant research papers published in traditional journals. Some Research Elements allow post-publication article updates

  17. Methods to increase reproducibility in differential gene expression via meta-analysis

    PubMed Central

    Sweeney, Timothy E.; Haynes, Winston A.; Vallania, Francesco; Ioannidis, John P.; Khatri, Purvesh

    2017-01-01

    Findings from clinical and biological studies are often not reproducible when tested in independent cohorts. Due to the testing of a large number of hypotheses and relatively small sample sizes, results from whole-genome expression studies in particular are often not reproducible. Compared to single-study analysis, gene expression meta-analysis can improve reproducibility by integrating data from multiple studies. However, there are multiple choices in designing and carrying out a meta-analysis. Yet, clear guidelines on best practices are scarce. Here, we hypothesized that studying subsets of very large meta-analyses would allow for systematic identification of best practices to improve reproducibility. We therefore constructed three very large gene expression meta-analyses from clinical samples, and then examined meta-analyses of subsets of the datasets (all combinations of datasets with up to N/2 samples and K/2 datasets) compared to a ‘silver standard’ of differentially expressed genes found in the entire cohort. We tested three random-effects meta-analysis models using this procedure. We showed relatively greater reproducibility when more-stringent effect-size thresholds were combined with relaxed significance thresholds; relatively lower reproducibility when imposing extraneous constraints on residual heterogeneity; and an underestimation of the actual false positive rate by Benjamini–Hochberg correction. In addition, multivariate regression showed that the accuracy of a meta-analysis increased significantly with more included datasets even when controlling for sample size. PMID:27634930
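
    To make the random-effects step concrete, the sketch below combines per-gene effect sizes across studies with the DerSimonian-Laird estimator, a standard random-effects model, though not necessarily one of the exact models compared in the study; the effect sizes and variances are invented for illustration.

        import numpy as np

        def dersimonian_laird(effects, variances):
            """Random-effects summary effect for one gene across studies (DerSimonian-Laird)."""
            effects = np.asarray(effects, dtype=float)
            variances = np.asarray(variances, dtype=float)
            w = 1.0 / variances                              # fixed-effect (inverse-variance) weights
            fixed = np.sum(w * effects) / np.sum(w)
            q = np.sum(w * (effects - fixed) ** 2)           # Cochran's Q heterogeneity statistic
            c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
            tau2 = max(0.0, (q - (len(effects) - 1)) / c)    # between-study variance estimate
            w_re = 1.0 / (variances + tau2)                  # random-effects weights
            summary = np.sum(w_re * effects) / np.sum(w_re)
            se = np.sqrt(1.0 / np.sum(w_re))
            return summary, se, tau2

        # Hypothetical log2 fold-changes and variances for one gene in five datasets.
        summary, se, tau2 = dersimonian_laird([0.8, 0.5, 1.1, 0.2, 0.9],
                                              [0.04, 0.09, 0.06, 0.12, 0.05])
        z_score = summary / se                               # basis for the significance threshold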

  18. 36 CFR 903.12 - Fees for furnishing and reproducing records.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Title 36 (Parks, Forests, and Public Property), Pennsylvania Avenue Development Corporation, Privacy Act, § 903.12, Fees for furnishing and reproducing records. (a) Individuals will not be...

  19. Reproducibility of Macular Pigment Optical Density Measurement by Two-wave Length Auto-fluorescence in a Clinical Setting

    PubMed Central

    You, Qi-Sheng; Bartsch, Dirk-Uwe G.; Espina, Mark; Alam, Mostafa; Camacho, Natalia; Mendoza, Nadia; Freeman, William

    2015-01-01

    Purpose: Macular pigment, composed of lutein, zeaxanthin, and meso-zeaxanthin, is postulated to protect against age-related macular degeneration (AMD), likely by filtering blue light and through its antioxidant properties. Macular pigment optical density (MPOD) is reported to be associated with macular function as evaluated by visual acuity and multifocal electroretinogram. Given the importance of macular pigment, reliable and accurate measurement methods are important. The main purpose of the current study was to determine the reproducibility of MPOD measurement by the two-wavelength auto-fluorescence method using scanning laser ophthalmoscopy. Methods: Sixty-eight eyes of 39 persons were enrolled in the study, including 11 normal eyes, 16 eyes with wet AMD, 16 eyes with dry AMD, 11 eyes with macular edema due to diabetes mellitus, branch retinal vein occlusion or macular telangiectasia, and 14 eyes with tractional maculopathy including vitreomacular traction, epiretinal membrane or macular hole. MPOD was measured with a two-wavelength (488 and 514 nm) auto-fluorescence method with the Spectralis HRA+OCT after pupil dilation. The measurement was repeated for each eye 10 minutes later. Analysis of variance (ANOVA) and Bland-Altman plots were used to assess the reproducibility between the two measurements. Results: The mean MPOD at eccentricities of 1° and 2° was 0.36 ± 0.17 (range: 0.04 to 0.69) and 0.15 ± 0.08 (range: −0.03 to 0.35) for the first measurement and 0.35 ± 0.17 (range: 0.02 to 0.68) and 0.15 ± 0.08 (range: −0.01 to 0.33) for the second measurement, respectively. The difference between the two measurements was not statistically significant, and the Bland-Altman plot showed 7.4% and 5.9% of points outside the 95% limits of agreement, indicating an overall excellent reproducibility. Similarly, there was no significant difference between the first and second measurements of MPOD volume within eccentricities of 1°, 2° and 6° radius, and the Bland-Altman plot showed 8.8%, 2.9% and

  20. Microbiologic tests in epidemiologic studies: are they reproducible?

    PubMed

    Aass, A M; Preus, H R; Zambon, J J; Gjermo, P

    1994-12-01

    Microbiologic assessments are often included in longitudinal studies to elucidate the significance of the association of certain Gram-negative bacteria and the development of periodontal diseases. In such studies, the reliability of methods is crucial. There are several methods to identify putative pathogens, and some of them are commercially available. The purpose of the present study was to compare the reproducibility of four different methods for detecting Actinobacillus actinomycetemcomitans, Porphyromonas gingivalis, and Prevotella intermedia in order to evaluate their usefulness in epidemiologic studies. The test panel consisted of 10 young subjects and 10 adult periodontitis patients. Subgingival plaque was sampled from sites showing bone loss and "healthy" control sites. The four different methods for detecting the target bacteria were 1) cultivation, 2) Evalusite (a chair-side kit based on ELISA), 3) OmniGene, Inc, based on DNA probes, and 4) indirect immunofluorescence (IIF). The test procedure was repeated after a 1-wk interval and was performed by one examiner. Sites reported to be positive for a microorganism by any of the four methods at one or both examinations were considered to be positive for that organism and included in the analysis. The reproducibility of the four methods was low. The IIF and the cultivation methods showed somewhat higher reproducibility than did the commercial systems. A second test was done for Evalusite, three paper points for sampling being used instead of one as described in the manual. The reproducibility of the second test was improved, indicating that the detection level of the system may influence the reliability.

  1. Experimental investigation of a transonic potential flow around a symmetric airfoil

    NASA Technical Reports Server (NTRS)

    Hiller, W. J.; Meier, G. E. A.

    1981-01-01

    Experimental flow investigations on smooth airfoils were carried out using numerical solutions for transonic airfoil flow with a shockless supersonic region. The experimental flow reproduced essential sections of the theoretically computed frictionless solution. Agreement is better in the expansion part of the flow than in the compression part. The flow was nearly stationary over the entire velocity range investigated.

  2. Simultaneous hermaphrodites reproducing in pairs self-fertilize some of their eggs: an experimental test of predictions of mixed-mating and Hermaphrodite's Dilemma theory.

    PubMed

    Lüscher, A; Milinski, M

    2003-09-01

    Theory predicts (1) that mixed-mating systems (i.e. reproduction through both selfing and outcrossing) should usually not evolve and (2) that reproducing simultaneous hermaphrodites should be in a conflict over the preferred sexual role (The Hermaphrodite's Dilemma). In an in vitro system with the endoparasitic cestode Schistocephalus solidus, a simultaneous hermaphrodite, we tested predictions of both the mixed-mating and the Hermaphrodite's Dilemma theory. Using microsatellite markers, we measured the proportion of selfed offspring and the total reproductive output of each worm within pairs varying in mean weight and weight difference. Worms produced more outbred offspring not only with increasing total weight of the pair, but also with decreasing weight difference between the two paired worms. These results suggest: (1) that this parasite species reproduces by mixed-mating, which may be maintained by stochastic density fluctuations in the definitive host and hence unpredictability of self reproduction and (2) reproductive conflict may prevent worm pairs from achieving an optimal intermediate selfing rate.

  3. Automatic, accurate, and reproducible segmentation of the brain and cerebro-spinal fluid in T1-weighted volume MRI scans and its application to serial cerebral and intracranial volumetry

    NASA Astrophysics Data System (ADS)

    Lemieux, Louis

    2001-07-01

    A new fully automatic algorithm for the segmentation of the brain and cerebro-spinal fluid (CSF) from T1-weighted volume MRI scans of the head was specifically developed in the context of serial intra-cranial volumetry. The method is an extension of a previously published brain extraction algorithm. The brain mask is used as a basis for CSF segmentation based on morphological operations, automatic histogram analysis and thresholding. Brain segmentation is then obtained by iterative tracking of the brain-CSF interface. Grey matter (GM), white matter (WM) and CSF volumes are calculated based on a model of intensity probability distribution that includes partial volume effects. Accuracy was assessed using a digital phantom scan. Reproducibility was assessed by segmenting pairs of scans from 20 normal subjects scanned 8 months apart and 11 patients with epilepsy scanned 3.5 years apart. Segmentation accuracy as measured by overlap was 98% for the brain and 96% for the intra-cranial tissues. The volume errors were: total brain volume (TBV): -1.0%; intra-cranial volume (ICV): 0.1%; CSF: +4.8%. For repeated scans, matching resulted in improved reproducibility. In the controls, the coefficient of reliability (CR) was 1.5% for the TBV and 1.0% for the ICV. In the patients, the CR for the ICV was 1.2%.
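
    The masking, histogram-based thresholding, and morphological clean-up steps mentioned above can be illustrated generically; the sketch below uses scikit-image and SciPy on a synthetic volume, is not the author's algorithm, and the voxel size is an assumption.

        import numpy as np
        from scipy import ndimage
        from skimage.filters import threshold_otsu

        # Synthetic stand-ins: a T1-weighted intensity volume and a brain mask from a
        # prior brain-extraction step.
        t1_volume = np.random.rand(64, 64, 64)
        brain_mask = np.ones_like(t1_volume, dtype=bool)

        # Histogram-based threshold inside the mask: CSF is dark on T1-weighted images,
        # so voxels below the threshold are labelled CSF and the rest as brain tissue.
        thr = threshold_otsu(t1_volume[brain_mask])
        csf = brain_mask & (t1_volume < thr)
        brain = brain_mask & ~csf

        # Morphological opening removes small spurious CSF clusters before volumetry.
        csf = ndimage.binary_opening(csf, structure=np.ones((3, 3, 3)))

        voxel_volume_mm3 = 1.0                               # assumed isotropic 1 mm voxels
        csf_volume_mm3 = csf.sum() * voxel_volume_mm3
        brain_volume_mm3 = brain.sum() * voxel_volume_mm3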

  4. Testing high SPF sunscreens: a demonstration of the accuracy and reproducibility of the results of testing high SPF formulations by two methods and at different testing sites.

    PubMed

    Agin, Patricia Poh; Edmonds, Susan H

    2002-08-01

    The goals of this study were (i) to demonstrate that existing and widely used sun protection factor (SPF) test methodologies can produce accurate and reproducible results for high SPF formulations and (ii) to provide data on the number of test-subjects needed, the variability of the data, and the appropriate exposure increments needed for testing high SPF formulations. Three high SPF formulations were tested, according to the Food and Drug Administration's (FDA) 1993 tentative final monograph (TFM) 'very water resistant' test method and/or the 1978 proposed monograph 'waterproof' test method, within one laboratory. A fourth high SPF formulation was tested at four independent SPF testing laboratories, using the 1978 waterproof SPF test method. All laboratories utilized xenon arc solar simulators. The data illustrate that the testing conducted within one laboratory, following either the 1978 proposed or the 1993 TFM SPF test method, was able to reproducibly determine the SPFs of the formulations tested, using either the statistical analysis method in the proposed monograph or the statistical method described in the TFM. When one formulation was tested at four different laboratories, the anticipated variation in the data owing to the equipment and other operational differences was minimized through the use of the statistical method described in the 1993 monograph. The data illustrate that either the 1978 proposed monograph SPF test method or the 1993 TFM SPF test method can provide accurate and reproducible results for high SPF formulations. Further, these results can be achieved with panels of 20-25 subjects with an acceptable level of variability. Utilization of the statistical controls from the 1993 sunscreen monograph can help to minimize lab-to-lab variability for well-formulated products.

  5. Improvement of experimental testing and network training conditions with genome-wide microarrays for more accurate predictions of drug gene targets

    PubMed Central

    2014-01-01

    Background Genome-wide microarrays have been useful for predicting chemical-genetic interactions at the gene level. However, interpreting genome-wide microarray results can be overwhelming due to the vast output of gene expression data combined with off-target transcriptional responses many times induced by a drug treatment. This study demonstrates how experimental and computational methods can interact with each other, to arrive at more accurate predictions of drug-induced perturbations. We present a two-stage strategy that links microarray experimental testing and network training conditions to predict gene perturbations for a drug with a known mechanism of action in a well-studied organism. Results S. cerevisiae cells were treated with the antifungal, fluconazole, and expression profiling was conducted under different biological conditions using Affymetrix genome-wide microarrays. Transcripts were filtered with a formal network-based method, sparse simultaneous equation models and Lasso regression (SSEM-Lasso), under different network training conditions. Gene expression results were evaluated using both gene set and single gene target analyses, and the drug’s transcriptional effects were narrowed first by pathway and then by individual genes. Variables included: (i) Testing conditions – exposure time and concentration and (ii) Network training conditions – training compendium modifications. Two analyses of SSEM-Lasso output – gene set and single gene – were conducted to gain a better understanding of how SSEM-Lasso predicts perturbation targets. Conclusions This study demonstrates that genome-wide microarrays can be optimized using a two-stage strategy for a more in-depth understanding of how a cell manifests biological reactions to a drug treatment at the transcription level. Additionally, a more detailed understanding of how the statistical model, SSEM-Lasso, propagates perturbations through a network of gene regulatory interactions is achieved
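
    As a simplified stand-in for the sparse regression step, the sketch below fits an ordinary Lasso model (scikit-learn) relating gene expression to a perturbation response; it only illustrates the sparsity idea, is not the SSEM-Lasso model itself, and uses synthetic data throughout.

        import numpy as np
        from sklearn.linear_model import Lasso

        rng = np.random.default_rng(0)

        # Synthetic compendium: 100 expression profiles (samples) x 500 genes.
        X = rng.normal(size=(100, 500))
        # Synthetic perturbation score driven by three "target" genes plus noise.
        y = 2.0 * X[:, 10] - 1.5 * X[:, 42] + 1.0 * X[:, 99] + rng.normal(scale=0.5, size=100)

        model = Lasso(alpha=0.1).fit(X, y)                   # L1 penalty zeroes most coefficients
        candidates = np.flatnonzero(model.coef_)             # genes retained by the model
        ranked = candidates[np.argsort(-np.abs(model.coef_[candidates]))]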

  6. Reproducibility of corneal astigmatism measurements with a hand held keratometer in preschool children.

    PubMed Central

    Harvey, E M; Miller, J M; Dobson, V

    1995-01-01

    AIMS--To evaluate the overall accuracy and reproducibility of the Alcon portable autokeratometer (PAK) measurements in infants and young children. METHODS--The accuracy of the Alcon PAK in measuring toric reference surfaces (1, 3, 5, and 7 D) under various suboptimal measurement conditions was assessed, and the reproducibility of PAK measurements of corneal astigmatism in newborn infants (n = 5), children (n = 19, age 3-5 years), and adults (n = 14) was evaluated. RESULTS--Measurements of toric reference surfaces indicated (a) no significant effect of distance (17-30 mm) on accuracy of measurements, (b) no systematic relation between amount of toricity and accuracy of measurements, (c) no systematic relation between angle of measurement and accuracy, (d) no difference in accuracy of measurements when the PAK is hand held in comparison with when it is mounted, (e) no difference in accuracy of measurements when axis of toricity is oriented obliquely than when it is oriented horizontally, with respect to the PAK, and (f) a small positive bias (+0.16 D) in measurement of spherical equivalent. The PAK did not prove useful for screening newborns. However, measurements were successfully obtained from 18/19 children and 14/14 adults. There was no significant difference in median measurement deviation (deviation of a subject's five measurements from his/her mean) between children (0.21 D) and adults (0.13 D). CONCLUSIONS--The PAK produces accurate measurements of surface curvature under a variety of suboptimal conditions. Variability of PAK measurements in preschool children is small enough to suggest that it would be useful for screening for corneal astigmatism in young children. PMID:8534668

  7. Accurate acoustic power measurement for low-intensity focused ultrasound using focal axial vibration velocity

    NASA Astrophysics Data System (ADS)

    Tao, Chenyang; Guo, Gepu; Ma, Qingyu; Tu, Juan; Zhang, Dong; Hu, Jimin

    2017-07-01

    Low-intensity focused ultrasound is a form of therapy that can have reversible acoustothermal effects on biological tissue, depending on the exposure parameters. The acoustic power (AP) should be chosen with caution for the sake of safety. To recover the energy of counteracted radial vibrations at the focal point, an accurate AP measurement method using the focal axial vibration velocity (FAVV) is proposed in explicit formulae and is demonstrated experimentally using a laser vibrometer. The experimental APs for two transducers agree well with theoretical calculations and numerical simulations, showing that AP is proportional to the square of the FAVV, with a fixed power gain determined by the physical parameters of the transducers. The favorable results suggest that the FAVV can be used as a valuable parameter for non-contact AP measurement, providing a new strategy for accurate power control for low-intensity focused ultrasound in biomedical engineering.
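
    The reported proportionality can be summarized compactly (symbols chosen here for illustration, not taken from the paper):

        W_{\mathrm{ac}} = G \, v_{F}^{2}

    where W_{\mathrm{ac}} is the acoustic power, v_{F} the focal axial vibration velocity amplitude, and G a fixed gain constant determined by the physical parameters of the transducer.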

  8. Reproducibility of a four-point clinical severity score for glabellar frown lines.

    PubMed

    Honeck, P; Weiss, C; Sterry, W; Rzany, B

    2003-08-01

    Focal injections of botulinum toxin A are used successfully for the treatment of hyperkinetic facial wrinkles. Efficacy can be measured by several methods. However, so far none has been investigated for its reproducibility. Objectives: To investigate the reproducibility of a clinical 0-3 score for glabellar frown lines. In the first part of the study, a standardized photographic documentation of glabellar frown lines was produced. Based on the results of this phase, a consensus atlas of glabellar frown lines was developed and participants were trained using this atlas. In the main study, 50 standardized photographs were shown on two consecutive days to 28 dermatologists. The reproducibility of the score was investigated by conventional kappa statistics. In the main study, we found an unweighted kappa according to Fleiss of 0.62 for interobserver reproducibility. Intraobserver reproducibility showed an unweighted kappa according to Cohen of between 0.57 and 0.91 for each observer, and a weighted kappa according to Cicchetti and Allison of between 0.68 and 0.94. The clinical 0-3 score for glabellar frown lines shows a good inter- and intraobserver reproducibility.
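
    A minimal sketch of the agreement statistics used here, computed with scikit-learn's Cohen's kappa (unweighted and weighted) on hypothetical ratings of the same photographs from two sessions of one observer; for the multi-rater interobserver case, a Fleiss kappa implementation such as the one in statsmodels can be used instead.

        from sklearn.metrics import cohen_kappa_score

        # Hypothetical 0-3 severity ratings of the same 10 photographs on two days.
        day1 = [0, 1, 2, 3, 1, 2, 0, 3, 2, 1]
        day2 = [0, 1, 2, 2, 1, 3, 0, 3, 2, 1]

        kappa_unweighted = cohen_kappa_score(day1, day2)                      # intraobserver agreement
        kappa_weighted = cohen_kappa_score(day1, day2, weights="linear")      # partial credit for near-misses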

  9. Reproducibility in cyclostratigraphy: initiating an intercomparison project

    NASA Astrophysics Data System (ADS)

    Sinnesael, Matthias; De Vleeschouwer, David; Zeeden, Christian; Claeys, Philippe

    2017-04-01

    The study of astronomical climate forcing and the application of cyclostratigraphy have experienced a spectacular growth over the last decades. In the field of cyclostratigraphy a broad range of methodological approaches exist. However, comparative study between the different approaches is lacking. Different cases demand different approaches, but with the growing importance of the field, questions arise about reproducibility, uncertainties and standardization of results. The radioisotopic dating community, in particular, has made far-reaching efforts to improve reproducibility and intercomparison of radioisotopic dates and their errors. To satisfy this need in cyclostratigraphy, we initiate a comparable framework for the community. The aims are to investigate and quantify the reproducibility of, and uncertainties related to, cyclostratigraphic studies and to provide a platform to discuss the merits and pitfalls of different methodologies and their applicability. With this poster, we ask for feedback from the community on how to design this comparative framework in a useful, meaningful and productive manner. In parallel, we would like to discuss how reproducibility should be tested and what uncertainties should stand for in cyclostratigraphy. On the other hand, we intend to trigger interest in a cyclostratigraphic intercomparison project. This intercomparison project would imply the analysis of artificial and genuine geological records by individual researchers. All participants would be free to determine their method of choice. However, a handful of criteria will be required for the outcomes to be comparable. The different results would be compared (e.g. during a workshop or a special session), and the lessons learned from the comparison could potentially be reported in a review paper. The aim of an intercomparison project is not to rank the different methods according to their merits, but to get insight into which specific methods are most suitable for which

  10. ENVIRONMENT: a computational platform to stochastically simulate reacting and self-reproducing lipid compartments

    NASA Astrophysics Data System (ADS)

    Mavelli, Fabio; Ruiz-Mirazo, Kepa

    2010-09-01

    'ENVIRONMENT' is a computational platform that has been developed in the last few years with the aim to simulate stochastically the dynamics and stability of chemically reacting protocellular systems. Here we present and describe some of its main features, showing how the stochastic kinetics approach can be applied to study the time evolution of reaction networks in heterogeneous conditions, particularly when supramolecular lipid structures (micelles, vesicles, etc) coexist with aqueous domains. These conditions are of special relevance to understand the origins of cellular, self-reproducing compartments, in the context of prebiotic chemistry and evolution. We contrast our simulation results with real lab experiments, with the aim to bring together theoretical and experimental research on protocell and minimal artificial cell systems.
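
    The stochastic kinetics approach referred to above is typically implemented with a Gillespie-type algorithm; the sketch below runs the standard direct method on a toy two-reaction network (the reactions and rate constants are illustrative and are not those used in ENVIRONMENT).

        import numpy as np

        def gillespie(x0, stoich, propensities, t_max, seed=0):
            """Gillespie direct method: x0 are initial counts, stoich[j] the state change of reaction j."""
            rng = np.random.default_rng(seed)
            t, x = 0.0, np.array(x0, dtype=float)
            trajectory = [(t, x.copy())]
            while t < t_max:
                a = np.array([p(x) for p in propensities])
                a_total = a.sum()
                if a_total == 0.0:
                    break                                    # no reaction can fire any more
                t += rng.exponential(1.0 / a_total)          # waiting time to the next event
                j = rng.choice(len(a), p=a / a_total)        # which reaction fires
                x += stoich[j]
                trajectory.append((t, x.copy()))
            return trajectory

        # Toy network: free lipid L joins an aggregate A (L + A -> 2A); aggregates decay (A -> 0).
        stoich = [np.array([-1.0, +1.0]), np.array([0.0, -1.0])]
        props = [lambda x: 0.001 * x[0] * x[1], lambda x: 0.05 * x[1]]
        traj = gillespie([5000, 10], stoich, props, t_max=100.0)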

  11. Progress toward openness, transparency, and reproducibility in cognitive neuroscience.

    PubMed

    Gilmore, Rick O; Diaz, Michele T; Wyble, Brad A; Yarkoni, Tal

    2017-05-01

    Accumulating evidence suggests that many findings in psychological science and cognitive neuroscience may prove difficult to reproduce; statistical power in brain imaging studies is low and has not improved recently; software errors in analysis tools are common and can go undetected for many years; and, a few large-scale studies notwithstanding, open sharing of data, code, and materials remain the rare exception. At the same time, there is a renewed focus on reproducibility, transparency, and openness as essential core values in cognitive neuroscience. The emergence and rapid growth of data archives, meta-analytic tools, software pipelines, and research groups devoted to improved methodology reflect this new sensibility. We review evidence that the field has begun to embrace new open research practices and illustrate how these can begin to address problems of reproducibility, statistical power, and transparency in ways that will ultimately accelerate discovery. © 2017 New York Academy of Sciences.

  12. The quest for improved reproducibility in MALDI mass spectrometry.

    PubMed

    O'Rourke, Matthew B; Djordjevic, Steven P; Padula, Matthew P

    2018-03-01

    Reproducibility has been one of the biggest hurdles faced when attempting to develop quantitative protocols for MALDI mass spectrometry. The heterogeneous nature of sample recrystallization has made automated sample acquisition somewhat "hit and miss" with manual intervention needed to ensure that all sample spots have been analyzed. In this review, we explore the last 30 years of literature and anecdotal evidence that has attempted to address and improve reproducibility in MALDI MS. Though many methods have been attempted, we have discovered a significant publication history surrounding the use of nitrocellulose as a substrate to improve homogeneity of crystal formation and therefore reproducibility. We therefore propose that this is the most promising avenue of research for developing a comprehensive and universal preparation protocol for quantitative MALDI MS analysis. © 2016 Wiley Periodicals, Inc. Mass Spec Rev 37:217-228, 2018.

  13. An Open, Large-Scale, Collaborative Effort to Estimate the Reproducibility of Psychological Science.

    PubMed

    2012-11-01

    Reproducibility is a defining feature of science. However, because of strong incentives for innovation and weak incentives for confirmation, direct replication is rarely practiced or published. The Reproducibility Project is an open, large-scale, collaborative effort to systematically examine the rate and predictors of reproducibility in psychological science. So far, 72 volunteer researchers from 41 institutions have organized to openly and transparently replicate studies published in three prominent psychological journals in 2008. Multiple methods will be used to evaluate the findings, calculate an empirical rate of replication, and investigate factors that predict reproducibility. Whatever the result, a better understanding of reproducibility will ultimately improve confidence in scientific methodology and findings. © The Author(s) 2012.

  14. The Road to Reproducibility in Animal Research.

    PubMed

    Jilka, Robert L

    2016-07-01

    Reproducibility of research findings is the hallmark of scientific advance. However, the recently noted lack of reproducibility and transparency of published research using animal models of human biology and disease has alarmed funders, scientists, and the public. Improved reporting of methodology and better use of statistical tools are needed to enhance the quality and utility of published research. Reporting guidelines like Animal Research: Reporting In Vivo Experiments (ARRIVE) have been devised to achieve these goals, but most biomedical research journals, including the JBMR, have not been able to obtain high compliance. Cooperative efforts among authors, reviewers, and editors, empowered by increased awareness of their responsibilities and enabled by user-friendly guidelines, are needed to solve this problem. © 2016 American Society for Bone and Mineral Research.

  15. Reproducibility of structural strength and stiffness for graphite-epoxy aircraft spoilers

    NASA Technical Reports Server (NTRS)

    Howell, W. E.; Reese, C. D.

    1978-01-01

    Structural strength reproducibility of graphite epoxy composite spoilers for the Boeing 737 aircraft was evaluated by statically loading fifteen spoilers to failure at conditions simulating aerodynamic loads. Spoiler strength and stiffness data were statistically modeled using a two parameter Weibull distribution function. Shape parameter values calculated for the composite spoiler strength and stiffness were within the range of corresponding shape parameter values calculated for material property data of composite laminates. This agreement showed that reproducibility of full scale component structural properties was within the reproducibility range of data from material property tests.
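
    A two-parameter Weibull fit of the kind described reduces to a few lines with SciPy; the strength values below are invented for illustration, and the location parameter is fixed at zero to keep the model two-parameter.

        import numpy as np
        from scipy import stats

        # Hypothetical failure loads (arbitrary units) for fifteen spoilers.
        strength = np.array([9.1, 9.8, 10.2, 10.4, 10.6, 10.9, 11.0, 11.2,
                             11.3, 11.5, 11.7, 11.9, 12.1, 12.4, 12.8])

        # floc=0 pins the location parameter, leaving the shape (Weibull modulus) and scale free.
        shape, loc, scale = stats.weibull_min.fit(strength, floc=0)

        # Probability that a spoiler survives a given load under the fitted distribution.
        survival_at_10 = stats.weibull_min.sf(10.0, shape, loc=loc, scale=scale)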

  16. Reproducibility Between Brain Uptake Ratio Using Anatomic Standardization and Patlak-Plot Methods.

    PubMed

    Shibutani, Takayuki; Onoguchi, Masahisa; Noguchi, Atsushi; Yamada, Tomoki; Tsuchihashi, Hiroko; Nakajima, Tadashi; Kinuya, Seigo

    2015-12-01

    The Patlak-plot and conventional methods of determining brain uptake ratio (BUR) have some problems with reproducibility. We formulated a method of determining BUR using anatomic standardization (BUR-AS) in a statistical parametric mapping algorithm to improve reproducibility. The objective of this study was to demonstrate the inter- and intraoperator reproducibility of mean cerebral blood flow as determined using BUR-AS in comparison to the conventional-BUR (BUR-C) and Patlak-plot methods. The images of 30 patients who underwent brain perfusion SPECT were retrospectively used in this study. The images were reconstructed using ordered-subset expectation maximization and processed using an automatic quantitative analysis for cerebral blood flow of ECD tool. The mean SPECT count was calculated from axial basal ganglia slices of the normal side (slices 31-40) drawn using a 3-dimensional stereotactic region-of-interest template after anatomic standardization. The mean cerebral blood flow was calculated from the mean SPECT count. Reproducibility was evaluated using coefficient of variation and Bland-Altman plotting. For both inter- and intraoperator reproducibility, the BUR-AS method had the lowest coefficient of variation and smallest error range about the Bland-Altman plot. Mean CBF obtained using the BUR-AS method had the highest reproducibility. Compared with the Patlak-plot and BUR-C methods, the BUR-AS method provides greater inter- and intraoperator reproducibility of cerebral blood flow measurement. © 2015 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
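
    The reproducibility metrics used here (coefficient of variation and Bland-Altman limits of agreement) reduce to a few lines of NumPy; the paired operator measurements below are hypothetical.

        import numpy as np

        # Hypothetical mean CBF values (mL/100 g/min) from two operators on the same scans.
        op1 = np.array([42.1, 38.5, 45.0, 40.2, 39.8, 44.3])
        op2 = np.array([41.7, 39.0, 44.2, 40.8, 39.5, 44.9])

        diff = op1 - op2
        bias = diff.mean()                                   # Bland-Altman bias
        loa = (bias - 1.96 * diff.std(ddof=1),               # 95% limits of agreement
               bias + 1.96 * diff.std(ddof=1))

        # Within-pair coefficient of variation (root-mean-square form), in percent.
        pair_means = (op1 + op2) / 2.0
        cv_percent = np.sqrt(np.mean((diff / pair_means) ** 2) / 2.0) * 100.0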

  17. Accurate modelling of unsteady flows in collapsible tubes.

    PubMed

    Marchandise, Emilie; Flaud, Patrice

    2010-01-01

    The context of this paper is the development of a general and efficient numerical haemodynamic tool to help clinicians and researchers understand physiological flow phenomena. We propose an accurate one-dimensional Runge-Kutta discontinuous Galerkin (RK-DG) method coupled with lumped parameter models for the boundary conditions. The suggested model has already been successfully applied to haemodynamics in arteries and is now extended to the flow in collapsible tubes such as veins. The main difference with cardiovascular simulations is that the flow may become supercritical and elastic jumps may appear, with the numerical consequence that the scheme may not remain monotone if no limiting procedure is introduced. We show that our second-order RK-DG method equipped with an approximate Roe's Riemann solver and a slope-limiting procedure allows us to capture elastic jumps accurately. Moreover, this paper demonstrates that the complex physics associated with such flows is more accurately modelled than with traditional methods such as finite difference methods or finite volumes. We present various benchmark problems that show the flexibility and applicability of the numerical method. Our solutions are compared with analytical solutions when they are available and with solutions obtained using other numerical methods. Finally, to illustrate the clinical interest, we study the emptying process in a calf vein squeezed by contracting skeletal muscle in a normal and a pathological subject. We compare our results with experimental simulations and discuss the sensitivity of our model to its parameters.
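
    The slope-limiting idea can be illustrated with the classic minmod limiter applied to cell-average data; this is a generic sketch of the technique, not the specific limiter used in the paper.

        import numpy as np

        def minmod(a, b):
            """Return the argument of smaller magnitude when the signs agree, otherwise zero."""
            return np.where(a * b > 0.0, np.where(np.abs(a) < np.abs(b), a, b), 0.0)

        def limited_slopes(u):
            """Limited slopes for a piecewise-linear reconstruction of interior cell averages u."""
            du_left = u[1:-1] - u[:-2]            # backward differences
            du_right = u[2:] - u[1:-1]            # forward differences
            return minmod(du_left, du_right)      # zero slope at extrema keeps the reconstruction monotone

        u = np.array([1.0, 1.0, 1.0, 0.5, 0.0, 0.0, 0.0])    # a discrete jump, analogous to an elastic jump
        slopes = limited_slopes(u)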

  18. Reproducing Domestic Laborers through Office Education.

    ERIC Educational Resources Information Center

    Valli, Linda R.

    Scholars have long acknowledged the role schools have in reproducing a sexual division of labor. Despite the reemergence of a feminist movement and anti-sex-discrimination legislation, schools are still places where boys and girls tend to study different curricula and where traditional sex roles are perpetuated. Physics, calculus, and shop classes…

  19. An accurate computational method for the diffusion regime verification

    NASA Astrophysics Data System (ADS)

    Zhokh, Alexey A.; Strizhak, Peter E.

    2018-04-01

    The diffusion regime (sub-diffusive, standard, or super-diffusive) is defined by the order of the derivative in the corresponding transport equation. We develop an accurate computational method for the direct estimation of the diffusion regime. The method is based on the derivative order estimation using the asymptotic analytic solutions of the diffusion equation with the integer order and the time-fractional derivatives. The robustness and the computational cheapness of the proposed method are verified using the experimental methane and methyl alcohol transport kinetics through the catalyst pellet.
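
    For context, the transport equation underlying this classification can be written in its standard time-fractional form (symbols assumed here):

        \frac{\partial^{\alpha} C(x,t)}{\partial t^{\alpha}} = D_{\alpha} \frac{\partial^{2} C(x,t)}{\partial x^{2}}

    where the derivative order \alpha identifies the regime: 0 < \alpha < 1 is sub-diffusive, \alpha = 1 recovers standard Fickian diffusion, and 1 < \alpha \le 2 is super-diffusive. Estimating \alpha from the asymptotic behaviour of the measured transport kinetics is what identifies the regime.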

  20. Accurate determinations of alpha(s) from realistic lattice QCD.

    PubMed

    Mason, Q; Trottier, H D; Davies, C T H; Foley, K; Gray, A; Lepage, G P; Nobes, M; Shigemitsu, J

    2005-07-29

    We obtain a new value for the QCD coupling constant by combining lattice QCD simulations with experimental data for hadron masses. Our lattice analysis is the first to (1) include vacuum polarization effects from all three light-quark flavors (using MILC configurations), (2) include third-order terms in perturbation theory, (3) systematically estimate fourth and higher-order terms, (4) use an unambiguous lattice spacing, and (5) use an O(a^2)-accurate QCD action. We use 28 different (but related) short-distance quantities to obtain alpha_MSbar^(5)(M_Z) = 0.1170(12).

  1. On the Possibility to Combine the Order Effect with Sequential Reproducibility for Quantum Measurements

    NASA Astrophysics Data System (ADS)

    Basieva, Irina; Khrennikov, Andrei

    2015-10-01

    In this paper we study whether quantum observables can be used to describe a possible combination of the order effect with sequential reproducibility for quantum measurements. By the order effect we mean a dependence of probability distributions (of measurement results) on the order of measurements. We consider two types of sequential reproducibility: adjacent reproducibility (A-A) (the standard perfect repeatability) and separated reproducibility (A-B-A). The first is reproducibility with probability 1 of the result of measuring some observable A twice, one A measurement after the other. The second, A-B-A, is reproducibility with probability 1 of the result of an A measurement when another quantum observable B is measured between the two A measurements. Heuristically, it is clear that the second type of reproducibility is complementary to the order effect. We show that, surprisingly, this may not be the case. The order effect can coexist with separated reproducibility as well as adjacent reproducibility for both observables A and B. However, the additional constraint in the form of separated reproducibility of the B-A-B type makes this coexistence impossible. The problem under consideration was motivated by attempts to apply the quantum formalism outside of physics, especially in cognitive psychology and psychophysics. However, it is also important for the foundations of quantum physics as a part of the problem of the structure of sequential quantum measurements.

  2. A novel method for the accurate evaluation of Poisson's ratio of soft polymer materials.

    PubMed

    Lee, Jae-Hoon; Lee, Sang-Soo; Chang, Jun-Dong; Thompson, Mark S; Kang, Dong-Joong; Park, Sungchan; Park, Seonghun

    2013-01-01

    A new method with a simple algorithm was developed to accurately measure Poisson's ratio of soft materials such as polyvinyl alcohol hydrogel (PVA-H) with a custom experimental apparatus consisting of a tension device, a micro X-Y stage, an optical microscope, and a charge-coupled device camera. In the proposed method, the initial positions of the four vertices of an arbitrarily selected quadrilateral from the sample surface were first measured to generate a 2D 1st-order 4-node quadrilateral element for finite element numerical analysis. Next, minimum and maximum principal strains were calculated from differences between the initial and deformed shapes of the quadrilateral under tension. Finally, Poisson's ratio of PVA-H was determined by the ratio of minimum principal strain to maximum principal strain. This novel method has an advantage in the accurate evaluation of Poisson's ratio despite misalignment between specimens and experimental devices. In this study, Poisson's ratio of PVA-H was 0.44 ± 0.025 (n = 6) for 2.6-47.0% elongations with a tendency to decrease with increasing elongation. The current evaluation method of Poisson's ratio with a simple measurement system can be employed to a real-time automated vision-tracking system which is used to accurately evaluate the material properties of various soft materials.
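
    A minimal sketch of the strain computation described above: an affine deformation map is fitted to the four tracked vertices by least squares, the small-strain tensor is formed from the deformation gradient, and Poisson's ratio is taken as minus the ratio of the principal strains. The coordinates are hypothetical, and the published method uses a 2D 1st-order 4-node quadrilateral finite element rather than this simple affine fit.

        import numpy as np

        # Hypothetical vertex coordinates (mm) of the tracked quadrilateral, before and after tension.
        X0 = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])          # undeformed
        X1 = np.array([[0.0, 0.0], [1.10, 0.0], [1.10, 0.956], [0.0, 0.956]])    # deformed

        # Fit the affine map x1 = F @ x0 + c to the four vertices by least squares.
        A = np.hstack([X0, np.ones((4, 1))])
        coeffs, *_ = np.linalg.lstsq(A, X1, rcond=None)
        F = coeffs[:2].T                                     # deformation gradient (2 x 2)

        eps = 0.5 * (F + F.T) - np.eye(2)                    # small-strain tensor
        principal = np.sort(np.linalg.eigvalsh(eps))         # [minimum, maximum] principal strains
        poisson_ratio = -principal[0] / principal[1]         # about 0.44 for the numbers above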

  3. Highly Accurate Quantitative Analysis Of Enantiomeric Mixtures from Spatially Frequency Encoded 1H NMR Spectra.

    PubMed

    Plainchont, Bertrand; Pitoux, Daisy; Cyrille, Mathieu; Giraud, Nicolas

    2018-02-06

    We propose an original concept to measure accurately enantiomeric excesses on proton NMR spectra, which combines high-resolution techniques based on a spatial encoding of the sample, with the use of optically active weakly orienting solvents. We show that it is possible to simulate accurately dipolar edited spectra of enantiomers dissolved in a chiral liquid crystalline phase, and to use these simulations to calibrate integrations that can be measured on experimental data, in order to perform a quantitative chiral analysis. This approach is demonstrated on a chemical intermediate for which optical purity is an essential criterion. We find that there is a very good correlation between the experimental and calculated integration ratios extracted from G-SERF spectra, which paves the way to a general method of determination of enantiomeric excesses based on the observation of 1 H nuclei.
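
    The quantity ultimately extracted from the calibrated integrations is the enantiomeric excess, which in its basic form is (standard definition, not specific to this work):

        \mathrm{ee} = \frac{I_{R} - I_{S}}{I_{R} + I_{S}} \times 100\%

    where I_{R} and I_{S} are the integrated signal areas assigned to the two enantiomers in the dipolar edited spectrum.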

  4. Enhancing reproducibility of ultrasonic measurements by new users

    NASA Astrophysics Data System (ADS)

    Pramanik, Manojit; Gupta, Madhumita; Krishnan, Kajoli Banerjee

    2013-03-01

    The operator's perception influences ultrasound image acquisition and processing. Lower costs are attracting new users to medical ultrasound. Anticipating an increase in this trend, we conducted a study to quantify the variability in ultrasonic measurements made by novice users and identify methods to reduce it. We designed a protocol with four presets and trained four new users to scan and manually measure the head circumference of a fetal phantom with an ultrasound scanner. In the first phase, the users followed this protocol in seven distinct sessions. They then received feedback on the quality of the scans from an expert. In the second phase, two of the users repeated the entire protocol aided by visual cues provided to them during scanning. We performed off-line measurements on all the images using a fully automated algorithm capable of measuring the head circumference from fetal phantom images. The ground truth (198.1±1.6 mm) was based on sixteen scans and measurements made by an expert. Our analysis shows that: (1) the inter-observer variability of manual measurements was 5.5 mm, whereas the inter-observer variability of automated measurements was only 0.6 mm in the first phase; (2) consistency of image appearance improved and the mean manual measurement was 4-5 mm closer to the ground truth in the second phase; (3) automated measurements were more precise, accurate and less sensitive to different presets compared to manual measurements in both phases. Our results show that visual aids and automation can bring more reproducibility to ultrasonic measurements made by new users.

  5. Reproducibility of CMRO2 determination using dynamic 17 O MRI.

    PubMed

    Niesporek, Sebastian C; Umathum, Reiner; Lommen, Jonathan M; Behl, Nicolas G R; Paech, Daniel; Bachert, Peter; Ladd, Mark E; Nagel, Armin M

    2018-06-01

    To assess the reproducibility of 17O MRI-based determination of the cerebral metabolic rate of oxygen consumption (CMRO2) in healthy volunteers, and to assess the influence of image acquisition and reconstruction parameters on dynamic quantification of functional parameters such as CMRO2. Dynamic 17O MRI data were simulated and used to investigate the influences of temporal resolution (Δt) and partial volume correction (PVC) on the determination of CMRO2. Three healthy volunteers were examined in two separate examinations. In vivo 17O MRI measurements were conducted with a nominal spatial resolution of (7.5 mm)^3 using a density-adapted radial sequence with a golden angle acquisition scheme. In each measurement, 4.0 ± 0.1 L of 70%-enriched 17O gas were administered using a rebreathing system. Data were corrected with a PVC algorithm, and CMRO2 was determined in gray matter (GM) and white matter (WM) compartments using a three-phase metabolic model (baseline, 17O inhalation, decay phase). Comparison with the ground truth of the simulations revealed improved CMRO2 determination after application of PVC and with Δt ≤ 2:00 min. Evaluation of the in vivo data yields CMRO2,GM = 2.31 ± 0.1 μmol/g/min and CMRO2,WM = 0.69 ± 0.04 μmol/g/min, with coefficients of variation (CoV) of 0.3-5.5% and 4.3-5.0% for intra-volunteer and inter-volunteer data, respectively. This in vivo 17O inhalation study demonstrated that the proposed experimental setup enables reproducible determination of CMRO2 in healthy volunteers. Magn Reson Med 79:2923-2934, 2018. © 2017 International Society for Magnetic Resonance in Medicine.

  6. Magnetohydrodynamic generator experimental studies

    NASA Technical Reports Server (NTRS)

    Pierson, E. S.

    1972-01-01

    The results for an experimental study of a one wavelength MHD induction generator operating on a liquid flow are presented. First the design philosophy and the experimental generator design are summarized, including a description of the flow loop and instrumentation. Next a Fourier series method of treating the fact that the magnetic flux density produced by the stator is not a pure traveling sinusoid is described and some results summarized. This approach appears to be of interest after revisions are made, but the initial results are not accurate. Finally, some of the experimental data is summarized for various methods of excitation.

  7. Accurate Determination of the Frequency Response Function of Submerged and Confined Structures by Using PZT-Patches†.

    PubMed

    Presas, Alexandre; Valentin, David; Egusquiza, Eduard; Valero, Carme; Egusquiza, Mònica; Bossio, Matias

    2017-03-22

    To accurately determine the dynamic response of a structure is of relevant interest in many engineering applications. Particularly, it is of paramount importance to determine the Frequency Response Function (FRF) for structures subjected to dynamic loads in order to avoid resonance and fatigue problems that can drastically reduce their useful life. One challenging case is the experimental determination of the FRF of submerged and confined structures, such as hydraulic turbines, which are greatly affected by dynamic problems as reported in many cases in the past. The utilization of classical and calibrated exciters such as instrumented hammers or shakers to determine the FRF in such structures can be very complex due to the confinement of the structure and because their use can disturb the boundary conditions, affecting the experimental results. For such cases, Piezoelectric Patches (PZTs), which are very light, thin and small, could be a very good option. Nevertheless, the main drawback of these exciters is that their calibration as dynamic force transducers (voltage/force relationship) has not been successfully obtained in the past. Therefore, in this paper, a method to accurately determine the FRF of submerged and confined structures by using PZTs is developed and validated. The method consists of experimentally determining some characteristic parameters that define the FRF, with an uncalibrated PZT exciting the structure. These parameters, which have been experimentally determined, are then introduced in a validated numerical model of the tested structure. In this way, the FRF of the structure can be estimated with good accuracy. With respect to previous studies, where only the natural frequencies and mode shapes were considered, this paper discusses and experimentally demonstrates the excitation characteristics best suited to also obtaining the damping ratios, and proposes a procedure to fully determine the FRF. The method proposed here has been validated for the structure vibrating
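
    A common general route from measured excitation and response signals to an FRF is the H1 estimator (cross-spectrum divided by the input auto-spectrum); the SciPy sketch below illustrates that generic approach on synthetic signals and does not reproduce the PZT parameter-identification procedure developed in the paper.

        import numpy as np
        from scipy import signal

        fs = 2048                                            # sampling rate in Hz (assumed)
        t = np.arange(0, 10, 1 / fs)
        rng = np.random.default_rng(1)

        excitation = rng.normal(size=t.size)                 # stand-in for the exciter drive signal
        # Synthetic response: band-pass filtering emulates a lightly damped mode near 100 Hz.
        sos = signal.butter(2, [95, 105], btype="bandpass", fs=fs, output="sos")
        response = signal.sosfilt(sos, excitation) + 0.01 * rng.normal(size=t.size)

        f, s_xx = signal.welch(excitation, fs=fs, nperseg=4096)              # input auto-spectrum
        f, s_xy = signal.csd(excitation, response, fs=fs, nperseg=4096)      # cross-spectrum
        h1 = s_xy / s_xx                                     # H1 FRF estimate (complex-valued)
        peak_frequency = f[np.argmax(np.abs(h1))]            # close to 100 Hz for this synthetic case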

  8. Accurate prediction of secondary metabolite gene clusters in filamentous fungi.

    PubMed

    Andersen, Mikael R; Nielsen, Jakob B; Klitgaard, Andreas; Petersen, Lene M; Zachariasen, Mia; Hansen, Tilde J; Blicher, Lene H; Gotfredsen, Charlotte H; Larsen, Thomas O; Nielsen, Kristian F; Mortensen, Uffe H

    2013-01-02

    Biosynthetic pathways of secondary metabolites from fungi are currently subject to an intense effort to elucidate the genetic basis for these compounds due to their large potential within pharmaceutics and synthetic biochemistry. The preferred method is methodical gene deletions to identify supporting enzymes for key synthases one cluster at a time. In this study, we design and apply a DNA expression array for Aspergillus nidulans in combination with legacy data to form a comprehensive gene expression compendium. We apply a guilt-by-association-based analysis to predict the extent of the biosynthetic clusters for the 58 synthases active in our set of experimental conditions. A comparison with legacy data shows the method to be accurate in 13 of 16 known clusters and nearly accurate for the remaining 3 clusters. Furthermore, we apply a data clustering approach, which identifies cross-chemistry between physically separate gene clusters (superclusters), and validate this both with legacy data and experimentally by prediction and verification of a supercluster consisting of the synthase AN1242 and the prenyltransferase AN11080, as well as identification of the product compound nidulanin A. We have used A. nidulans for our method development and validation due to the wealth of available biochemical data, but the method can be applied to any fungus with a sequenced and assembled genome, thus supporting further secondary metabolite pathway elucidation in the fungal kingdom.
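
    The guilt-by-association step can be pictured as correlating the expression profile of each gene flanking a synthase with that of the synthase across conditions; the sketch below uses random data and an arbitrary threshold, and is not the authors' scoring scheme.

        import numpy as np

        rng = np.random.default_rng(3)

        # Hypothetical expression compendium: 40 conditions x 11 genes flanking a synthase.
        synthase = rng.normal(size=40)
        neighbors = rng.normal(size=(40, 11))
        neighbors[:, :4] += 0.9 * synthase[:, None]          # make the first four genes co-regulated

        # Pearson correlation of each flanking gene with the synthase across conditions.
        corr = np.array([np.corrcoef(synthase, neighbors[:, g])[0, 1]
                         for g in range(neighbors.shape[1])])

        cluster_members = np.flatnonzero(corr > 0.6)         # guilt-by-association call (threshold assumed)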

  9. Modest validity and fair reproducibility of dietary patterns derived by cluster analysis.

    PubMed

    Funtikova, Anna N; Benítez-Arciniega, Alejandra A; Fitó, Montserrat; Schröder, Helmut

    2015-03-01

    Cluster analysis is widely used to analyze dietary patterns. We aimed to analyze the validity and reproducibility of the dietary patterns defined by cluster analysis derived from a food frequency questionnaire (FFQ). We hypothesized that the dietary patterns derived by cluster analysis have fair to modest reproducibility and validity. Dietary data were collected from 107 individuals from a population-based survey, by an FFQ at baseline (FFQ1) and after 1 year (FFQ2), and by twelve 24-hour dietary recalls (24-HDR). Repeatability and validity were measured by comparing clusters obtained by the FFQ1 and FFQ2 and by the FFQ2 and 24-HDR (reference method), respectively. Cluster analysis identified a "fruits & vegetables" and a "meat" pattern in each dietary data source. Cluster membership was concordant for 66.7% of participants in FFQ1 and FFQ2 (reproducibility), and for 67.0% in FFQ2 and 24-HDR (validity). Spearman correlation analysis showed reasonable reproducibility, especially in the "fruits & vegetables" pattern, and lower validity, again especially in the "fruits & vegetables" pattern. The κ statistic revealed fair validity and reproducibility of the clusters. Our findings indicate a reasonable reproducibility and fair to modest validity of dietary patterns derived by cluster analysis. Copyright © 2015 Elsevier Inc. All rights reserved.
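
    The pattern-derivation step itself is typically a k-means clustering of food-group intakes; the scikit-learn sketch below uses synthetic intake data and k = 2, matching the two patterns reported, and is only a generic illustration.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(7)

        # Hypothetical FFQ-derived intakes (servings/day): 107 participants x 5 food groups.
        intakes = rng.gamma(shape=2.0, scale=1.0, size=(107, 5))

        z = StandardScaler().fit_transform(intakes)          # cluster on standardized intakes
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(z)

        # Cross-tabulating labels from two instruments (e.g. FFQ1 vs FFQ2) yields the
        # concordance percentages and kappa statistics reported above.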

  10. Exploratory results from a new rotary shear designed to reproduce the extreme deformation conditions of crustal earthquakes

    NASA Astrophysics Data System (ADS)

    Di Toro, G.; Nielsen, S. B.; Spagnuolo, E.; Smith, S.; Violay, M. E.; Niemeijer, A. R.; Di Felice, F.; Di Stefano, G.; Romeo, G.; Scarlato, P.

    2011-12-01

    A challenging goal in experimental rock deformation is to reproduce the extreme deformation conditions typical of coseismic slip in crustal earthquakes: large slip (up to 50 m), slip rates (0.1-10 m/s), accelerations (> 10 m/s2) and normal stress (> 50 MPa). Moreover, fault zones usually contain non-cohesive rocks (gouges) and fluids. The integration of all these deformation conditions is such a technical challenge that there is currently no apparatus in the world that can reproduce seismic slip. Yet, the determination of rock friction at seismic slip rates remains one of the main unknowns in earthquake physics, as it cannot be determined (or very approximately) by seismic wave inversion analysis. In the last thirty years, rotary shear apparatus were designed that combine large normal stresses and slip but low slip rates (high-pressure rotary shears first designed by Tullis) or low normal stresses but large slip rates and slip (rotary shears first designed by Shimamoto). Here we present the results of experiments using a newly-constructed Slow to HIgh Velocity Apparatus (SHIVA), installed at INGV in Rome, which extends the combination of normal stress, slip and slip rate achieved by previous apparatus and reproduces the conditions likely to occur during an earthquake in the shallow crust. SHIVA uses two brushless engines (max power 300 kW, max torque 930 Nm) and an air actuator (thrust 5 tons) in a rotary shear configuration (nominally infinite displacement) to slide hollow rock cylinders (30/50 mm int./ext. diameter) at slip rates ranging from 10 micron/s up to 6.5 m/s, accelerations up to 80 m/s2 and normal stresses up to 50 MPa. SHIVA can also perform experiments in which the torque on the sample (rather than the slip rate) is progressively increased until spontaneous failure occurs: this experimental capability should better reproduce natural conditions. The apparatus is equipped with a sample chamber to carry out experiments in the presence of fluids (up to 15

  11. Rigor, Reproducibility and in vitro CSF assays: The Devil in the Details

    PubMed Central

    Moody, Olivia A.; Talwar, Sahil; Jenkins, Meagan A.; Freeman, Amanda A.; Trotti, Lynn Marie; García, Paul S.; Bliwise, Donald; Lynch, Joseph W.; Cherson, Brad; Hernandez, Eric M; Feldman, Neil; Saini, Prabhjyot; Rye, David B.; Jenkins, Andrew

    2017-01-01

    Divergent results and misinterpretation of non-significant findings remain problematic in science – especially in retrospective, hypothesis generating, translational research [1]. When such divergence occurs, it is imperative that the cause of the divergence be established. In their recent paper in Annals of Neurology, Dauvilliers et al [2] challenged our earlier finding that cerebrospinal fluid (CSF) from some patients with unexplained excessive daytime sleepiness enhances the activation of GABAA receptors (GABAA-R) [3]. They present data from 15 subjects in which they were unable to find evidence of enhanced activation of GABAA receptors. Here we: 1) establish how flaws in Dauvilliers’ experimental design account for this difference; 2) present new data demonstrating the robustness and reproducibility of our methods; and 3) summarize the clinical promise of GABAA-R antagonism in treating IH and related disorders. PMID:28440033

  12. Method for reproducibly preparing a low-melting high-carbon yield precursor

    DOEpatents

    Smith, Wesley E.; Napier, Jr., Bradley

    1978-01-01

    The present invention is directed to a method for preparing a reproducible synthetic carbon precursor by the autoclave polymerization of indene (C₉H₈) at a temperature in the range of 470-485 °C, and at a pressure in the range of about 1000 to about 4300 psi. Volatiles in the resulting liquid indene polymer are removed by vacuum outgassing to form a solid carbon precursor characterized by having a relatively low melting temperature, high-carbon yield, and high reproducibility, which provide for the fabrication of carbon and graphite composites having strict requirements for reproducible properties.

  13. Reproducibility of abdominal fat assessment by ultrasound and computed tomography.

    PubMed

    Mauad, Fernando Marum; Chagas-Neto, Francisco Abaeté; Benedeti, Augusto César Garcia Saab; Nogueira-Barbosa, Marcello Henrique; Muglia, Valdair Francisco; Carneiro, Antonio Adilton Oliveira; Muller, Enrico Mattana; Elias Junior, Jorge

    2017-01-01

    To test the accuracy and reproducibility of ultrasound and computed tomography (CT) for the quantification of abdominal fat in correlation with the anthropometric, clinical, and biochemical assessments. Using ultrasound and CT, we determined the thickness of subcutaneous and intra-abdominal fat in 101 subjects (39 men, 38.6%, and 62 women, 61.4%) with a mean age of 66.3 years (range, 60-80 years). The ultrasound data were correlated with the anthropometric, clinical, and biochemical parameters, as well as with the areas measured by abdominal CT. Intra-abdominal thickness was the variable for which the correlation with the areas of abdominal fat was strongest (i.e., the correlation coefficient was highest). We also tested the reproducibility of ultrasound and CT for the assessment of abdominal fat and found that CT measurements of abdominal fat showed greater reproducibility, having higher intraobserver and interobserver reliability than the ultrasound measurements. There was a significant correlation between ultrasound and CT, with a correlation coefficient of 0.71. In the assessment of abdominal fat, the intraobserver and interobserver reliability were greater for CT than for ultrasound, although both methods showed high accuracy and good reproducibility.

  14. An empirical analysis of journal policy effectiveness for computational reproducibility.

    PubMed

    Stodden, Victoria; Seiler, Jennifer; Ma, Zhaokun

    2018-03-13

    A key component of scientific communication is sufficient information for other researchers in the field to reproduce published findings. For computational and data-enabled research, this has often been interpreted to mean making available the raw data from which results were generated, the computer code that generated the findings, and any additional information needed such as workflows and input parameters. Many journals are revising author guidelines to include data and code availability. This work evaluates the effectiveness of journal policy that requires the data and code necessary for reproducibility be made available postpublication by the authors upon request. We assess the effectiveness of such a policy by (i) requesting data and code from authors and (ii) attempting replication of the published findings. We chose a random sample of 204 scientific papers published in the journal Science after the implementation of their policy in February 2011. We found that we were able to obtain artifacts from 44% of our sample and were able to reproduce the findings for 26%. We find this policy (author remission of data and code postpublication upon request) an improvement over no policy, but currently insufficient for reproducibility.

  15. An empirical analysis of journal policy effectiveness for computational reproducibility

    PubMed Central

    Seiler, Jennifer; Ma, Zhaokun

    2018-01-01

    A key component of scientific communication is sufficient information for other researchers in the field to reproduce published findings. For computational and data-enabled research, this has often been interpreted to mean making available the raw data from which results were generated, the computer code that generated the findings, and any additional information needed such as workflows and input parameters. Many journals are revising author guidelines to include data and code availability. This work evaluates the effectiveness of journal policy that requires the data and code necessary for reproducibility be made available postpublication by the authors upon request. We assess the effectiveness of such a policy by (i) requesting data and code from authors and (ii) attempting replication of the published findings. We chose a random sample of 204 scientific papers published in the journal Science after the implementation of their policy in February 2011. We found that we were able to obtain artifacts from 44% of our sample and were able to reproduce the findings for 26%. We find this policy—author remission of data and code postpublication upon request—an improvement over no policy, but currently insufficient for reproducibility. PMID:29531050

  16. Accurate Construction of Photoactivated Localization Microscopy (PALM) Images for Quantitative Measurements

    PubMed Central

    Coltharp, Carla; Kessler, Rene P.; Xiao, Jie

    2012-01-01

    Localization-based superresolution microscopy techniques such as Photoactivated Localization Microscopy (PALM) and Stochastic Optical Reconstruction Microscopy (STORM) have allowed investigations of cellular structures with unprecedented optical resolutions. One major obstacle to interpreting superresolution images, however, is the overcounting of molecule numbers caused by fluorophore photoblinking. Using both experimental and simulated images, we determined the effects of photoblinking on the accurate reconstruction of superresolution images and on quantitative measurements of structural dimension and molecule density made from those images. We found that structural dimension and relative density measurements can be made reliably from images that contain photoblinking-related overcounting, but accurate absolute density measurements, and consequently faithful representations of molecule counts and positions in cellular structures, require the application of a clustering algorithm to group localizations that originate from the same molecule. We analyzed how applying a simple algorithm with different clustering thresholds (tThresh and dThresh) affects the accuracy of reconstructed images, and developed an easy method to select optimal thresholds. We also identified an empirical criterion to evaluate whether an imaging condition is appropriate for accurate superresolution image reconstruction with the clustering algorithm. Both the threshold selection method and imaging condition criterion are easy to implement within existing PALM clustering algorithms and experimental conditions. The main advantage of our method is that it generates a superresolution image and molecule position list that faithfully represents molecule counts and positions within a cellular structure, rather than only summarizing structural properties into ensemble parameters. This feature makes it particularly useful for cellular structures of heterogeneous densities and irregular geometries, and
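
    The clustering step described above can be illustrated with a minimal greedy sketch: a localization joins an existing molecule cluster if it lies within an assumed distance threshold (dThresh) of the cluster's running centroid and appears within an assumed frame gap (tThresh) of its last localization; otherwise it starts a new cluster. This is an illustration of the idea only, not the published algorithm, and the coordinates below are hypothetical.

        # Greedy grouping of PALM localizations so that reappearances of the
        # same photoblinking fluorophore are counted as one molecule.
        import numpy as np

        def group_blinks(locs, dThresh, tThresh):
            """locs: (frame, x, y) rows sorted by frame; returns cluster ids."""
            clusters = []            # per cluster: [sum_x, sum_y, count, last_frame]
            ids = np.empty(len(locs), dtype=int)
            for i, (frame, x, y) in enumerate(locs):
                best = -1
                for j, (sx, sy, n, last) in enumerate(clusters):
                    if frame - last > tThresh:
                        continue                        # dark for too long
                    if np.hypot(x - sx / n, y - sy / n) <= dThresh:
                        best = j
                        break
                if best < 0:                            # seed a new molecule
                    clusters.append([x, y, 1, frame])
                    best = len(clusters) - 1
                else:                                   # merge into existing one
                    c = clusters[best]
                    c[0] += x; c[1] += y; c[2] += 1; c[3] = frame
                ids[i] = best
            return ids

        # Hypothetical localizations: one molecule blinking twice, plus another
        locs = np.array([[1, 10.0, 10.0], [3, 10.2, 9.9], [4, 50.0, 52.0]])
        print(group_blinks(locs, dThresh=1.0, tThresh=5))    # -> [0 0 1]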

  17. Natural Disasters: Earth Science Readings. Reproducibles.

    ERIC Educational Resources Information Center

    Lobb, Nancy

    Natural Disasters is a reproducible teacher book that explains what scientists believe to be the causes of a variety of natural disasters and suggests steps that teachers and students can take to be better prepared in the event of a natural disaster. It contains both student and teacher sections. Teacher sections include vocabulary, an answer key,…

  18. Toward Accurate and Quantitative Comparative Metagenomics

    PubMed Central

    Nayfach, Stephen; Pollard, Katherine S.

    2016-01-01

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  19. Postprandial appetite ratings are reproducible and moderately related to total day energy intakes, but not ad libitum lunch energy intakes, in healthy young women.

    PubMed

    Tucker, Amy J; Heap, Sarah; Ingram, Jessica; Law, Marron; Wright, Amanda J

    2016-04-01

    Reproducibility and validity testing of appetite ratings and energy intakes are needed in experimental and natural settings. Eighteen healthy young women ate a standardized breakfast for 8 days. Days 1 and 8, they rated their appetite (Hunger, Fullness, Desire to Eat, Prospective Food Consumption (PFC)) over a 3.5 h period using visual analogue scales, consumed an ad libitum lunch, left the research center and recorded food intake for the remainder of the day. Days 2-7, participants rated their at-home Hunger at 0 and 30 min post-breakfast and recorded food intake for the day. Total area under the curve (AUC) over the 180 min period before lunch, and energy intakes were calculated. Reproducibility of satiety measures between days was evaluated using coefficients of repeatability (CR), coefficients of variation (CV) and intra-class correlation coefficients (ri). Correlation analysis was used to examine validity between satiety measures. AUCs for Hunger, Desire to Eat and PFC (ri = 0.73-0.78), ad libitum energy intakes (ri = 0.81) and total day energy intakes (ri = 0.48) were reproducible; fasted ratings were not. Average AUCs for Hunger, Desire to Eat and PFC, Desire to Eat at nadir and PFC at fasting, nadir and 180 min were correlated to total day energy intakes (r = 0.50-0.77, P < 0.05), but no ratings were correlated to lunch consumption. At-home Hunger ratings were weakly reproducible but not correlated to reported total energy intakes. Satiety ratings did not concur with next meal intake but PFC ratings may be useful predictors of intake. Overall, this study adds to the limited satiety research on women and challenges the accepted measures of satiety in an experimental setting. Copyright © 2016 Elsevier Ltd. All rights reserved.
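
    As a rough illustration of the outcome measures, the sketch below computes the total AUC by the trapezoidal rule and a simple within-subject coefficient of variation between two test days; the time grid, ratings and layout are assumptions for illustration, not the study's data or code.

        # Total area under the curve (AUC) of appetite ratings over 0-180 min
        # and a within-subject CV between day 1 and day 8 as one index of
        # reproducibility.
        import numpy as np

        times = np.array([0, 30, 60, 90, 120, 150, 180], dtype=float)   # minutes
        # Hypothetical Hunger ratings (mm VAS), 3 participants x 7 time points
        day1 = np.array([[60, 30, 35, 45, 55, 65, 75],
                         [50, 20, 25, 40, 50, 60, 70],
                         [70, 40, 45, 55, 60, 70, 80]], dtype=float)
        day8 = day1 + np.random.default_rng(0).normal(0, 5, day1.shape)

        def total_auc(ratings, t):
            dt = np.diff(t)
            return np.sum((ratings[:, :-1] + ratings[:, 1:]) / 2 * dt, axis=1)

        auc1, auc8 = total_auc(day1, times), total_auc(day8, times)      # mm*min
        pair_sd = np.abs(auc1 - auc8) / np.sqrt(2)       # SD of two repeated values
        within_cv = np.mean(pair_sd / ((auc1 + auc8) / 2)) * 100
        print(f"mean within-subject CV of Hunger AUC: {within_cv:.1f}%")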

  20. Reproducibility of clinical research in critical care: a scoping review.

    PubMed

    Niven, Daniel J; McCormick, T Jared; Straus, Sharon E; Hemmelgarn, Brenda R; Jeffs, Lianne; Barnes, Tavish R M; Stelfox, Henry T

    2018-02-21

    The ability to reproduce experiments is a defining principle of science. Reproducibility of clinical research has received relatively little scientific attention. However, it is important as it may inform clinical practice, research agendas, and the design of future studies. We used scoping review methods to examine reproducibility within a cohort of randomized trials examining clinical critical care research and published in the top general medical and critical care journals. To identify relevant clinical practices, we searched the New England Journal of Medicine, The Lancet, and JAMA for randomized trials published up to April 2016. To identify a comprehensive set of studies for these practices, included articles informed secondary searches within other high-impact medical and specialty journals. We included late-phase randomized controlled trials examining therapeutic clinical practices in adults admitted to general medical-surgical or specialty intensive care units (ICUs). Included articles were classified using a reproducibility framework. An original study was the first to evaluate a clinical practice. A reproduction attempt re-evaluated that practice in a new set of participants. Overall, 158 practices were examined in 275 included articles. A reproduction attempt was identified for 66 practices (42%, 95% CI 33-50%). Original studies reported larger effects than reproduction attempts (primary endpoint, risk difference 16.0%, 95% CI 11.6-20.5% vs. 8.4%, 95% CI 6.0-10.8%, P = 0.003). More than half of clinical practices with a reproduction attempt demonstrated effects that were inconsistent with the original study (56%, 95% CI 42-68%), among which a large number were reported to be efficacious in the original study and to lack efficacy in the reproduction attempt (34%, 95% CI 19-52%). Two practices reported to be efficacious in the original study were found to be harmful in the reproduction attempt. A minority of critical care practices with research published

  1. Fourier modeling of the BOLD response to a breath-hold task: Optimization and reproducibility.

    PubMed

    Pinto, Joana; Jorge, João; Sousa, Inês; Vilela, Pedro; Figueiredo, Patrícia

    2016-07-15

    Cerebrovascular reactivity (CVR) reflects the capacity of blood vessels to adjust their caliber in order to maintain a steady supply of brain perfusion, and it may provide a sensitive disease biomarker. Measurement of the blood oxygen level dependent (BOLD) response to a hypercapnia-inducing breath-hold (BH) task has been frequently used to map CVR noninvasively using functional magnetic resonance imaging (fMRI). However, the best modeling approach for the accurate quantification of CVR maps remains an open issue. Here, we compare and optimize Fourier models of the BOLD response to a BH task with a preparatory inspiration, and assess the test-retest reproducibility of the associated CVR measurements, in a group of 10 healthy volunteers studied over two fMRI sessions. Linear combinations of sine-cosine pairs at the BH task frequency and its successive harmonics were added sequentially in a nested models approach, and were compared in terms of the adjusted coefficient of determination and corresponding variance explained (VE) of the BOLD signal, as well as the number of voxels exhibiting significant BOLD responses, the estimated CVR values, and their test-retest reproducibility. The brain average VE increased significantly with the Fourier model order, up to the 3rd order. However, the number of responsive voxels increased significantly only up to the 2nd order, and started to decrease from the 3rd order onwards. Moreover, no significant relative underestimation of CVR values was observed beyond the 2nd order. Hence, the 2nd order model was concluded to be the optimal choice for the studied paradigm. This model also yielded the best test-retest reproducibility results, with intra-subject coefficients of variation of 12 and 16% and an intra-class correlation coefficient of 0.74. In conclusion, our results indicate that a Fourier series set consisting of a sine-cosine pair at the BH task frequency and its two harmonics is a suitable model for BOLD-fMRI CVR measurements
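
    The nested Fourier modeling can be sketched as follows: a design matrix of sine-cosine pairs at the task frequency and its harmonics is fit to a BOLD time course by least squares, and the adjusted coefficient of determination is compared across model orders. The repetition time, task period and signal below are hypothetical, not the study's data.

        # Nested Fourier models of a breath-hold BOLD response: compare
        # adjusted R^2 as harmonics are added.
        import numpy as np

        TR, n_vols, task_period = 3.0, 120, 60.0          # assumed values (s)
        t = np.arange(n_vols) * TR
        f0 = 1.0 / task_period
        rng = np.random.default_rng(1)
        bold = np.sin(2 * np.pi * f0 * t + 0.7) + 0.3 * rng.standard_normal(n_vols)

        def design(order):
            cols = [np.ones_like(t)]
            for k in range(1, order + 1):
                cols += [np.sin(2 * np.pi * k * f0 * t),
                         np.cos(2 * np.pi * k * f0 * t)]
            return np.column_stack(cols)

        for order in (1, 2, 3):
            X = design(order)
            beta, *_ = np.linalg.lstsq(X, bold, rcond=None)
            resid = bold - X @ beta
            n, p = X.shape
            ss_res, ss_tot = resid @ resid, np.sum((bold - bold.mean()) ** 2)
            adj_r2 = 1 - (ss_res / (n - p)) / (ss_tot / (n - 1))
            print(f"order {order}: adjusted R^2 = {adj_r2:.3f}")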

  2. Repeatability and Reproducibility of Retinal Nerve Fiber Layer Parameters Measured by Scanning Laser Polarimetry with Enhanced Corneal Compensation in Normal and Glaucomatous Eyes

    PubMed Central

    Ara, Mirian; Pajarin, Ana B.

    2015-01-01

    Objective. To assess the intrasession repeatability and intersession reproducibility of peripapillary retinal nerve fiber layer (RNFL) thickness parameters measured by scanning laser polarimetry (SLP) with enhanced corneal compensation (ECC) in healthy and glaucomatous eyes. Methods. One randomly selected eye of 82 healthy individuals and 60 glaucoma subjects was evaluated. Three scans were acquired during the first visit to evaluate intravisit repeatability. A different operator obtained two additional scans within 2 months after the first session to determine intervisit reproducibility. The intraclass correlation coefficient (ICC), coefficient of variation (COV), and test-retest variability (TRT) were calculated for all SLP parameters in both groups. Results. ICCs ranged from 0.920 to 0.982 for intravisit measurements and from 0.910 to 0.978 for intervisit measurements. The temporal-superior-nasal-inferior-temporal (TSNIT) average was the highest (0.967 and 0.946) in normal eyes, while nerve fiber indicator (NFI; 0.982) and inferior average (0.978) yielded the best ICC in glaucomatous eyes for intravisit and intervisit measurements, respectively. All COVs were under 10% in both groups, except NFI. TSNIT average had the lowest COV (2.43%) in either type of measurement. Intervisit TRT ranged from 6.48 to 12.84. Conclusions. The reproducibility of peripapillary RNFL measurements obtained with SLP-ECC was excellent, indicating that SLP-ECC is sufficiently accurate for monitoring glaucoma progression. PMID:26185762

  3. Repeatability and Reproducibility of Retinal Nerve Fiber Layer Parameters Measured by Scanning Laser Polarimetry with Enhanced Corneal Compensation in Normal and Glaucomatous Eyes.

    PubMed

    Ara, Mirian; Ferreras, Antonio; Pajarin, Ana B; Calvo, Pilar; Figus, Michele; Frezzotti, Paolo

    2015-01-01

    To assess the intrasession repeatability and intersession reproducibility of peripapillary retinal nerve fiber layer (RNFL) thickness parameters measured by scanning laser polarimetry (SLP) with enhanced corneal compensation (ECC) in healthy and glaucomatous eyes. One randomly selected eye of 82 healthy individuals and 60 glaucoma subjects was evaluated. Three scans were acquired during the first visit to evaluate intravisit repeatability. A different operator obtained two additional scans within 2 months after the first session to determine intervisit reproducibility. The intraclass correlation coefficient (ICC), coefficient of variation (COV), and test-retest variability (TRT) were calculated for all SLP parameters in both groups. ICCs ranged from 0.920 to 0.982 for intravisit measurements and from 0.910 to 0.978 for intervisit measurements. The temporal-superior-nasal-inferior-temporal (TSNIT) average was the highest (0.967 and 0.946) in normal eyes, while nerve fiber indicator (NFI; 0.982) and inferior average (0.978) yielded the best ICC in glaucomatous eyes for intravisit and intervisit measurements, respectively. All COVs were under 10% in both groups, except NFI. TSNIT average had the lowest COV (2.43%) in either type of measurement. Intervisit TRT ranged from 6.48 to 12.84. The reproducibility of peripapillary RNFL measurements obtained with SLP-ECC was excellent, indicating that SLP-ECC is sufficiently accurate for monitoring glaucoma progression.

  4. Reproducing kernel potential energy surfaces in biomolecular simulations: Nitric oxide binding to myoglobin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soloviov, Maksym; Meuwly, Markus, E-mail: m.meuwly@unibas.ch

    2015-09-14

    Multidimensional potential energy surfaces based on reproducing kernel-interpolation are employed to explore the energetics and dynamics of free and bound nitric oxide in myoglobin (Mb). Combining a force field description for the majority of degrees of freedom and the higher-accuracy representation for the NO ligand and the Fe out-of-plane motion allows for a simulation approach akin to a mixed quantum mechanics/molecular mechanics treatment. However, the kernel-representation can be evaluated at conventional force-field speed. With the explicit inclusion of the Fe-out-of-plane (Fe-oop) coordinate, the dynamics and structural equilibrium after photodissociation of the ligand are correctly described compared to experiment. Experimentally, the Fe-oop coordinate plays an important role for the ligand dynamics. This is also found here, where the isomerization dynamics between the Fe–ON and Fe–NO states is significantly affected by whether or not this coordinate is explicitly included. Although the Fe–ON conformation is metastable when considering only the bound ²A state, it may disappear once the ⁴A state is included. This explains the absence of the Fe–ON state in previous experimental investigations of MbNO.
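
    The kernel-interpolation idea can be illustrated with a minimal one-dimensional sketch. The published surfaces use reproducing-kernel forms tailored to molecular coordinates in several dimensions; the Gaussian kernel, coordinates and energies below are stand-ins chosen for illustration only.

        # Kernel interpolation of a 1-D potential energy curve from a handful
        # of ab initio-style samples; evaluation is a cheap matrix product.
        import numpy as np

        def kernel(r1, r2, sigma=0.3):
            return np.exp(-((r1[:, None] - r2[None, :]) ** 2) / (2 * sigma**2))

        # Hypothetical training data: a coordinate (Angstrom) vs. energy (kcal/mol)
        r_train = np.array([1.6, 1.8, 2.0, 2.4, 3.0, 4.0])
        e_train = np.array([25.0, 3.0, 0.0, 4.0, 9.0, 12.0])

        K = kernel(r_train, r_train) + 1e-10 * np.eye(len(r_train))
        alpha = np.linalg.solve(K, e_train)         # kernel coefficients

        def energy(r):
            """Interpolated energy at arbitrary geometries r."""
            return kernel(np.atleast_1d(r), r_train) @ alpha

        print(energy(np.array([1.9, 2.2, 2.7])))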

  5. Reproducibility of the dynamics of facial expressions in unilateral facial palsy.

    PubMed

    Alagha, M A; Ju, X; Morley, S; Ayoub, A

    2018-02-01

    The aim of this study was to assess the reproducibility of non-verbal facial expressions in unilateral facial paralysis using dynamic four-dimensional (4D) imaging. The Di4D system was used to record five facial expressions of 20 adult patients. The system captured 60 three-dimensional (3D) images per second; each facial expression took 3-4 seconds and was recorded in real time. Thus a set of 180 3D facial images was generated for each expression. The procedure was repeated after 30 min to assess the reproducibility of the expressions. A mathematical facial mesh consisting of thousands of quasi-point 'vertices' was conformed to the face in order to determine the morphological characteristics in a comprehensive manner. The vertices were tracked throughout the sequence of the 180 images. Five key 3D facial frames from each sequence of images were analyzed. Comparisons were made between the first and second capture of each facial expression to assess the reproducibility of facial movements. Corresponding images were aligned using partial Procrustes analysis, and the root mean square distance between them was calculated and analyzed statistically (paired Student t-test, P<0.05). Facial expressions of lip purse, cheek puff, and raising of eyebrows were reproducible. Facial expressions of maximum smile and forceful eye closure were not reproducible. The limited coordination of various groups of facial muscles contributed to the lack of reproducibility of these facial expressions. 4D imaging is a useful clinical tool for the assessment of facial expressions. Copyright © 2017 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
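
    A minimal sketch of the comparison step, under assumed inputs: two corresponding vertex sets are aligned by a rotation-and-translation (partial, no scaling) Procrustes fit, and the root mean square distance between them is reported. The meshes below are synthetic.

        # Partial Procrustes alignment (Kabsch rotation + translation) and RMS
        # distance between two corresponding 3-D vertex sets.
        import numpy as np

        def procrustes_rmsd(A, B):
            """A, B: (n_vertices, 3) arrays with one-to-one correspondence."""
            A0, B0 = A - A.mean(axis=0), B - B.mean(axis=0)
            U, _, Vt = np.linalg.svd(B0.T @ A0)
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:                 # avoid reflections
                Vt[-1, :] *= -1
                R = Vt.T @ U.T
            B_aligned = B0 @ R.T
            return np.sqrt(np.mean(np.sum((A0 - B_aligned) ** 2, axis=1)))

        rng = np.random.default_rng(2)
        capture1 = rng.normal(size=(500, 3))                 # hypothetical mesh
        theta = np.deg2rad(5)
        Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
                       [np.sin(theta),  np.cos(theta), 0],
                       [0, 0, 1]])
        capture2 = capture1 @ Rz.T + 0.01 * rng.normal(size=(500, 3)) + [1.0, 2.0, 0.5]
        print(f"RMS distance after alignment: {procrustes_rmsd(capture1, capture2):.4f}")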

  6. Reproducible computational biology experiments with SED-ML - The Simulation Experiment Description Markup Language

    PubMed Central

    2011-01-01

    Background The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. Results In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. Conclusions With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from

  7. Reproducible computational biology experiments with SED-ML--the Simulation Experiment Description Markup Language.

    PubMed

    Waltemath, Dagmar; Adams, Richard; Bergmann, Frank T; Hucka, Michael; Kolpakov, Fedor; Miller, Andrew K; Moraru, Ion I; Nickerson, David; Sahle, Sven; Snoep, Jacky L; Le Novère, Nicolas

    2011-12-15

    The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from different fields of research

  8. Electrical Detection of C-Reactive Protein Using a Single Free-Standing, Thermally Controlled Piezoresistive Microcantilever for Highly Reproducible and Accurate Measurements

    PubMed Central

    Yen, Yi-Kuang; Lai, Yu-Cheng; Hong, Wei-Ting; Pheanpanitporn, Yotsapoom; Chen, Chuin-Shan; Huang, Long-Sun

    2013-01-01

    This study demonstrates a novel method for electrical detection of C-reactive protein (CRP) as a means of identifying an infection in the body, or as a cardiovascular disease risk assay. The method uses a single free-standing, thermally controlled piezoresistive microcantilever biosensor. In a commonly used sensing arrangement of conventional dual cantilevers in the Wheatstone bridge circuit, reference and gold-coated sensing cantilevers that inherently have heterogeneous surface materials and different multilayer structures may yield independent responses to the liquid environmental changes of chemical substances, flow field and temperature, leading to unwanted signal disturbance for biosensing targets. In this study, the single free-standing microcantilever for biosensing applications is employed to resolve the dual-beam problem of individual responses in chemical solutions and, in a thermally controlled system, to maintain its sensor performance due to the sensitive temperature effect. With this type of single temperature-controlled microcantilever sensor, the electrical detection of various CRP concentrations from 1 μg/mL to 200 μg/mL was performed, which covers the clinically relevant range. Induced surface stresses were measured at between 0.25 N/m and 3.4 N/m with high reproducibility. Moreover, the binding affinity (KD) of CRP and anti-CRP interaction was found to be 18.83 ± 2.99 μg/mL, which agreed with results in previous reported studies. This biosensing technique thus proves valuable in detecting inflammation, and in cardiovascular disease risk assays. PMID:23899933

  9. Reproducibility, Controllability, and Optimization of Lenr Experiments

    NASA Astrophysics Data System (ADS)

    Nagel, David J.

    2006-02-01

    Low-energy nuclear reaction (LENR) measurements are significantly and increasingly reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments.

  10. A nondestructive, reproducible method of measuring joint reaction force at the distal radioulnar joint.

    PubMed

    Canham, Colin D; Schreck, Michael J; Maqsoodi, Noorullah; Doolittle, Madison; Olles, Mark; Elfar, John C

    2015-06-01

    To develop a nondestructive method of measuring distal radioulnar joint (DRUJ) joint reaction force (JRF) that preserves all periarticular soft tissues and more accurately reflects in vivo conditions. Eight fresh-frozen human cadaveric limbs were obtained. A threaded Steinmann pin was placed in the middle of the lateral side of the distal radius transverse to the DRUJ. A second pin was placed into the middle of the medial side of the distal ulna colinear to the distal radial pin. Specimens were mounted onto a tensile testing machine using a custom fixture. A uniaxial distracting force was applied across the DRUJ while force and displacement were simultaneously measured. Force-displacement curves were generated and a best-fit polynomial was solved to determine JRF. All force-displacement curves demonstrated an initial high slope where relatively large forces were required to distract the joint. This ended with an inflection point followed by a linear area with a low slope, where small increases in force generated larger amounts of distraction. Each sample was measured 3 times and there was high reproducibility between repeated measurements. The average baseline DRUJ JRF was 7.5 N (n = 8). This study describes a reproducible method of measuring DRUJ reaction forces that preserves all periarticular stabilizing structures. This technique of JRF measurement may also be suited for applications in the small joints of the wrist and hand. Changes in JRF can alter native joint mechanics and lead to pathology. Reliable methods of measuring these forces are important for determining how pathology and surgical interventions affect joint biomechanics. Copyright © 2015 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.
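
    One plausible reading of "a best-fit polynomial was solved" is sketched below on synthetic data: a cubic is fit to the force-displacement curve and the force at its inflection point, where the curve turns from the steep to the shallow regime, is taken as the JRF estimate. This is not the authors' code, and the curve and numbers are hypothetical.

        # Fit a cubic to a distraction force-displacement curve and evaluate the
        # force at the inflection point (second derivative = 0).
        import numpy as np

        disp = np.linspace(0, 3, 31)                          # displacement (mm)
        force = 7.5 * np.tanh(2.5 * disp) + 1.2 * disp        # synthetic force (N)

        p = np.poly1d(np.polyfit(disp, force, 3))             # cubic fit
        x_star = float(np.real(np.polyder(p, 2).roots[0]))    # solve p''(x) = 0
        print(f"estimated JRF ~ {p(x_star):.1f} N at {x_star:.2f} mm of distraction")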

  11. Cervical vertebrae maturation method morphologic criteria: poor reproducibility.

    PubMed

    Nestman, Trenton S; Marshall, Steven D; Qian, Fang; Holton, Nathan; Franciscus, Robert G; Southard, Thomas E

    2011-08-01

    The cervical vertebrae maturation (CVM) method has been advocated as a predictor of peak mandibular growth. A careful review of the literature showed potential methodologic errors that might influence the high reported reproducibility of the CVM method, and we recently established that the reproducibility of the CVM method was poor when these potential errors were eliminated. The purpose of this study was to further investigate the reproducibility of the individual vertebral patterns. In other words, the purpose was to determine which of the individual CVM vertebral patterns could be classified reliably and which could not. Ten practicing orthodontists, trained in the CVM method, evaluated the morphology of cervical vertebrae C2 through C4 from 30 cephalometric radiographs using questions based on the CVM method. The Fleiss kappa statistic was used to assess interobserver agreement when evaluating each cervical vertebrae morphology question for each subject. The Kendall coefficient of concordance was used to assess the level of interobserver agreement when determining a "derived CVM stage" for each subject. Interobserver agreement was high for assessment of the lower borders of C2, C3, and C4 that were either flat or curved in the CVM method, but interobserver agreement was low for assessment of the vertebral bodies of C3 and C4 when they were either trapezoidal, rectangular horizontal, square, or rectangular vertical; this led to the overall poor reproducibility of the CVM method. These findings were reflected in the Fleiss kappa statistic. Furthermore, nearly 30% of the time, individual morphologic criteria could not be combined to generate a final CVM stage because of incompatible responses to the 5 questions. Intraobserver agreement in this study was only 62%, on average, when the inconclusive stagings were excluded as disagreements. Intraobserver agreement was worse (44%) when the inconclusive stagings were included as disagreements. For the group of subjects
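
    For reference, the Fleiss kappa used above for interobserver agreement on categorical morphology codes can be computed as in the sketch below; the rating counts are hypothetical and this is not the study's analysis code.

        # Fleiss' kappa from a subjects x categories table of rater counts.
        import numpy as np

        def fleiss_kappa(counts):
            """counts[i, j] = number of raters assigning subject i to category j
            (same total number of raters for every subject)."""
            counts = np.asarray(counts, dtype=float)
            n_sub = counts.shape[0]
            n_rat = counts.sum(axis=1)[0]
            p_j = counts.sum(axis=0) / (n_sub * n_rat)          # category shares
            P_i = (np.sum(counts**2, axis=1) - n_rat) / (n_rat * (n_rat - 1))
            return (P_i.mean() - np.sum(p_j**2)) / (1 - np.sum(p_j**2))

        # Hypothetical: 5 radiographs, 10 raters, 4 vertebral-body shape categories
        counts = np.array([[7, 2, 1, 0],
                           [3, 3, 2, 2],
                           [0, 8, 1, 1],
                           [2, 2, 3, 3],
                           [9, 1, 0, 0]])
        print(f"Fleiss' kappa = {fleiss_kappa(counts):.2f}")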

  12. Convergence in parameters and predictions using computational experimental design.

    PubMed

    Hagen, David R; White, Jacob K; Tidor, Bruce

    2013-08-06

    Typically, biological models fitted to experimental data suffer from significant parameter uncertainty, which can lead to inaccurate or uncertain predictions. One school of thought holds that accurate estimation of the true parameters of a biological system is inherently problematic. Recent work, however, suggests that optimal experimental design techniques can select sets of experiments whose members probe complementary aspects of a biochemical network that together can account for its full behaviour. Here, we implemented an experimental design approach for selecting sets of experiments that constrain parameter uncertainty. We demonstrated with a model of the epidermal growth factor-nerve growth factor pathway that, after synthetically performing a handful of optimal experiments, the uncertainty in all 48 parameters converged below 10 per cent. Furthermore, the fitted parameters converged to their true values with a small error consistent with the residual uncertainty. When untested experimental conditions were simulated with the fitted models, the predicted species concentrations converged to their true values with errors that were consistent with the residual uncertainty. This paper suggests that accurate parameter estimation is achievable with complementary experiments specifically designed for the task, and that the resulting parametrized models are capable of accurate predictions.

  13. Reproducibility of abdominal fat assessment by ultrasound and computed tomography

    PubMed Central

    Mauad, Fernando Marum; Chagas-Neto, Francisco Abaeté; Benedeti, Augusto César Garcia Saab; Nogueira-Barbosa, Marcello Henrique; Muglia, Valdair Francisco; Carneiro, Antonio Adilton Oliveira; Muller, Enrico Mattana; Elias Junior, Jorge

    2017-01-01

    Objective: To test the accuracy and reproducibility of ultrasound and computed tomography (CT) for the quantification of abdominal fat in correlation with the anthropometric, clinical, and biochemical assessments. Materials and Methods: Using ultrasound and CT, we determined the thickness of subcutaneous and intra-abdominal fat in 101 subjects (39 men, 38.6%, and 62 women, 61.4%) with a mean age of 66.3 years (range, 60-80 years). The ultrasound data were correlated with the anthropometric, clinical, and biochemical parameters, as well as with the areas measured by abdominal CT. Results: Intra-abdominal thickness was the variable for which the correlation with the areas of abdominal fat was strongest (i.e., the correlation coefficient was highest). We also tested the reproducibility of ultrasound and CT for the assessment of abdominal fat and found that CT measurements of abdominal fat showed greater reproducibility, having higher intraobserver and interobserver reliability than the ultrasound measurements. There was a significant correlation between ultrasound and CT, with a correlation coefficient of 0.71. Conclusion: In the assessment of abdominal fat, the intraobserver and interobserver reliability were greater for CT than for ultrasound, although both methods showed high accuracy and good reproducibility. PMID:28670024

  14. Reproducible detection of disease-associated markers from gene expression data.

    PubMed

    Omae, Katsuhiro; Komori, Osamu; Eguchi, Shinto

    2016-08-18

    Detection of disease-associated markers plays a crucial role in gene screening for biological studies. Two-sample test statistics, such as the t-statistic, are widely used to rank genes based on gene expression data. However, the resultant gene ranking is often not reproducible among different data sets. Such irreproducibility may be caused by disease heterogeneity. When we divided data into two subsets, we found that the signs of the two t-statistics were often reversed. Focusing on such instability, we proposed a sign-sum statistic that counts the signs of the t-statistics for all possible subsets. The proposed method excludes genes affected by heterogeneity, thereby improving the reproducibility of gene ranking. We compared the sign-sum statistic with the t-statistic by a theoretical evaluation of the upper confidence limit. Through simulations and applications to real data sets, we show that the sign-sum statistic exhibits superior performance. We derive the sign-sum statistic to obtain a robust gene ranking, and it gives a more reproducible ranking than the t-statistic. Using simulated data sets, we show that the sign-sum statistic effectively excludes heterogeneity-affected genes. For the real data sets as well, the sign-sum statistic performs well in terms of ranking reproducibility.
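
    The sign-sum idea can be illustrated as follows. The exact subset enumeration is defined in the paper; the sketch below substitutes random subsamples of the two groups and is only meant to show why genes whose direction of change flips between subsets score near zero and drop in the ranking.

        # Sum the sign of Welch's t-statistic over random subsamples; a
        # heterogeneity-affected gene accumulates cancelling signs.
        import numpy as np

        def welch_t(x, y):
            return (x.mean() - y.mean()) / np.sqrt(x.var(ddof=1) / len(x)
                                                   + y.var(ddof=1) / len(y))

        def sign_sum(x, y, n_subsets=200, frac=0.7, seed=0):
            rng = np.random.default_rng(seed)
            total = 0.0
            for _ in range(n_subsets):
                xs = rng.choice(x, size=max(2, int(frac * len(x))), replace=False)
                ys = rng.choice(y, size=max(2, int(frac * len(y))), replace=False)
                total += np.sign(welch_t(xs, ys))
            return total

        rng = np.random.default_rng(1)
        stable = (rng.normal(1.0, 1, 20), rng.normal(0.0, 1, 20))   # consistent shift
        hetero = (np.r_[rng.normal(3, 1, 5), rng.normal(-1, 1, 15)],
                  rng.normal(0.0, 1, 20))                           # subtype-driven
        for name, (x, y) in (("stable", stable), ("hetero", hetero)):
            print(name, "sign-sum =", sign_sum(x, y))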

  15. Selection and testing of reference genes for accurate RT-qPCR in rice seedlings under iron toxicity.

    PubMed

    Santos, Fabiane Igansi de Castro Dos; Marini, Naciele; Santos, Railson Schreinert Dos; Hoffman, Bianca Silva Fernandes; Alves-Ferreira, Marcio; de Oliveira, Antonio Costa

    2018-01-01

    Reverse Transcription quantitative PCR (RT-qPCR) is a technique for gene expression profiling with high sensitivity and reproducibility. However, to obtain accurate results, it depends on data normalization by using endogenous reference genes whose expression is constitutive or invariable. Although the technique is widely used in plant stress analyses, the stability of reference genes for iron toxicity in rice (Oryza sativa L.) has not been thoroughly investigated. Here, we tested a set of candidate reference genes for use in rice under this stressful condition. The test was performed using four distinct methods: NormFinder, BestKeeper, geNorm and the comparative ΔCt. To achieve reproducible and reliable results, Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) guidelines were followed. Valid reference genes were found for shoot (P2, OsGAPDH and OsNABP), root (OsEF-1a, P8 and OsGAPDH) and root+shoot (OsNABP, OsGAPDH and P8), enabling us to perform further reliable studies for iron toxicity in both indica and japonica subspecies. The importance of testing genes other than the traditional endogenous genes for use as normalizers is also shown here.

  16. Selection and testing of reference genes for accurate RT-qPCR in rice seedlings under iron toxicity

    PubMed Central

    dos Santos, Fabiane Igansi de Castro; Marini, Naciele; dos Santos, Railson Schreinert; Hoffman, Bianca Silva Fernandes; Alves-Ferreira, Marcio

    2018-01-01

    Reverse Transcription quantitative PCR (RT-qPCR) is a technique for gene expression profiling with high sensitivity and reproducibility. However, to obtain accurate results, it depends on data normalization by using endogenous reference genes whose expression is constitutive or invariable. Although the technique is widely used in plant stress analyses, the stability of reference genes for iron toxicity in rice (Oryza sativa L.) has not been thoroughly investigated. Here, we tested a set of candidate reference genes for use in rice under this stressful condition. The test was performed using four distinct methods: NormFinder, BestKeeper, geNorm and the comparative ΔCt. To achieve reproducible and reliable results, Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) guidelines were followed. Valid reference genes were found for shoot (P2, OsGAPDH and OsNABP), root (OsEF-1a, P8 and OsGAPDH) and root+shoot (OsNABP, OsGAPDH and P8), enabling us to perform further reliable studies for iron toxicity in both indica and japonica subspecies. The importance of testing genes other than the traditional endogenous genes for use as normalizers is also shown here. PMID:29494624

  17. Empirical evaluation of cross-site reproducibility in radiomic features for characterizing prostate MRI

    NASA Astrophysics Data System (ADS)

    Chirra, Prathyush; Leo, Patrick; Yim, Michael; Bloch, B. Nicolas; Rastinehad, Ardeshir R.; Purysko, Andrei; Rosen, Mark; Madabhushi, Anant; Viswanath, Satish

    2018-02-01

    The recent advent of radiomics has enabled the development of prognostic and predictive tools which use routine imaging, but a key question that still remains is how reproducible these features may be across multiple sites and scanners. This is especially relevant in the context of MRI data, where signal intensity values lack tissue-specific, quantitative meaning and also depend on acquisition parameters (magnetic field strength, image resolution, type of receiver coil). In this paper we present the first empirical study of the reproducibility of 5 different radiomic feature families in a multi-site setting, specifically for characterizing prostate MRI appearance. Our cohort comprised 147 patient T2w MRI datasets from 4 different sites, all of which were first pre-processed to correct for acquisition-related artifacts such as bias field, differing voxel resolutions, and intensity drift (non-standardness). 406 3D voxel-wise radiomic features were extracted and evaluated in a cross-site setting to determine how reproducible they were within a relatively homogeneous non-tumor tissue region, using 2 different measures of reproducibility: the Multivariate Coefficient of Variation and the Instability Score. Our results demonstrated that Haralick features were the most reproducible across all 4 sites. By comparison, Laws features were among the least reproducible between sites and also performed highly variably across their entire parameter space. Similarly, the Gabor feature family demonstrated good cross-site reproducibility, but only for certain parameter combinations. These trends indicate that despite extensive pre-processing, only a subset of radiomic features and associated parameters may be reproducible enough for use within radiomics-based machine learning classifier schemes.
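
    As a simple illustration of cross-site reproducibility scoring (assumed inputs, not the authors' pipeline), the sketch below ranks features by a cross-site coefficient of variation computed from per-site mean feature values within a reference tissue region.

        # Cross-site coefficient of variation per feature; lower CV suggests a
        # feature is more reproducible across sites/scanners.
        import numpy as np

        rng = np.random.default_rng(3)
        n_sites, n_features = 4, 6
        # Hypothetical per-site mean feature values (rows: sites, cols: features)
        site_means = np.abs(rng.normal(loc=10, scale=1, size=(n_sites, n_features)))
        site_means[:, -1] *= rng.uniform(0.5, 2.0, size=n_sites)   # unstable feature

        cv = site_means.std(axis=0, ddof=1) / site_means.mean(axis=0)
        for idx in np.argsort(cv):
            print(f"feature {idx}: cross-site CV = {cv[idx]:.2%}")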

  18. Examining the Reproducibility of 6 Published Studies in Public Health Services and Systems Research.

    PubMed

    Harris, Jenine K; B Wondmeneh, Sarah; Zhao, Yiqiang; Leider, Jonathon P

    2018-02-23

    Research replication, or repeating a study de novo, is the scientific standard for building evidence and identifying spurious results. While replication is ideal, it is often expensive and time consuming. Reproducibility, or reanalysis of data to verify published findings, is one proposed minimum alternative standard. While a lack of research reproducibility has been identified as a serious and prevalent problem in biomedical research and a few other fields, little work has been done to examine the reproducibility of public health research. We examined reproducibility in 6 studies from the public health services and systems research subfield of public health research. Following the methods described in each of the 6 papers, we computed the descriptive and inferential statistics for each study. We compared our results with the original study results and examined the percentage differences in descriptive statistics and differences in effect size, significance, and precision of inferential statistics. All project work was completed in 2017. We found consistency between original and reproduced results for each paper in at least 1 of the 4 areas examined. However, we also found some inconsistency. We identified incorrect transcription of results and omitting detail about data management and analyses as the primary contributors to the inconsistencies. Increasing reproducibility, or reanalysis of data to verify published results, can improve the quality of science. Researchers, journals, employers, and funders can all play a role in improving the reproducibility of science through several strategies including publishing data and statistical code, using guidelines to write clear and complete methods sections, conducting reproducibility reviews, and incentivizing reproducible science.

  19. Reproducibility of electronic tooth colour measurements.

    PubMed

    Ratzmann, Anja; Klinke, Thomas; Schwahn, Christian; Treichel, Anja; Gedrange, Tomasz

    2008-10-01

    Clinical methods of investigation, such as tooth colour determination, should be simple, quick and reproducible. The determination of tooth colours usually relies upon manual comparison of a patient's tooth colour with a colour ring. After some days, however, measurement results frequently lack unequivocal reproducibility. This study aimed to examine an electronic method for reliable colour measurement. The colours of the teeth 14 to 24 were determined by three different examiners in 10 subjects using the colour measuring device Shade Inspector. In total, 12 measurements per tooth were taken. Two measurement time points were scheduled, namely at study onset (T(1)) and after 6 months (T(2)). At each time point, two measurement series per subject were taken by the different examiners at 2-week intervals. The inter-examiner and intra-examiner agreement of the measurement results was assessed. The concordance for lightness and colour intensity (saturation) was represented by the intra-class correlation coefficient. The categorical variable colour shade (hue) was assessed using the kappa statistic. The study results show that tooth colour can be measured independently of the examiner. Good agreement was found between the examiners.

  20. Region-specific protein misfolding cyclic amplification reproduces brain tropism of prion strains.

    PubMed

    Privat, Nicolas; Levavasseur, Etienne; Yildirim, Serfildan; Hannaoui, Samia; Brandel, Jean-Philippe; Laplanche, Jean-Louis; Béringue, Vincent; Seilhean, Danielle; Haïk, Stéphane

    2017-10-06

    Human prion diseases such as Creutzfeldt-Jakob disease are transmissible brain proteinopathies, characterized by the accumulation of a misfolded isoform of the host cellular prion protein (PrP) in the brain. According to the prion model, prions are defined as proteinaceous infectious particles composed solely of this abnormal isoform of PrP (PrPSc). Even in the absence of genetic material, various prion strains can be propagated in experimental models. They can be distinguished by the pattern of disease they produce and especially by the localization of PrPSc deposits within the brain and the spongiform lesions they induce. The mechanisms involved in this strain-specific targeting of distinct brain regions still are a fundamental, unresolved question in prion research. To address this question, we exploited a prion conversion in vitro assay, protein misfolding cyclic amplification (PMCA), by using experimental scrapie and human prion strains as seeds and specific brain regions from mice and humans as substrates. We show here that region-specific PMCA in part reproduces the specific brain targeting observed in experimental, acquired, and sporadic Creutzfeldt-Jakob diseases. Furthermore, we provide evidence that, in addition to cellular prion protein, other region- and species-specific molecular factors influence the strain-dependent prion conversion process. This important step toward understanding prion strain propagation in the human brain may impact research on the molecular factors involved in protein misfolding and the development of ultrasensitive methods for diagnosing prion disease. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.

  1. Fast and Accurate Circuit Design Automation through Hierarchical Model Switching.

    PubMed

    Huynh, Linh; Tagkopoulos, Ilias

    2015-08-21

    In computer-aided biological design, the trifecta of characterized part libraries, accurate models and optimal design parameters is crucial for producing reliable designs. As the number of parts and model complexity increase, however, it becomes exponentially more difficult for any optimization method to search the solution space, hence creating a trade-off that hampers efficient design. To address this issue, we present a hierarchical computer-aided design architecture that uses a two-step approach for biological design. First, a simple model of low computational complexity is used to predict circuit behavior and assess candidate circuit branches through branch-and-bound methods. Then, a complex, nonlinear circuit model is used for a fine-grained search of the reduced solution space, thus achieving more accurate results. Evaluation with a benchmark of 11 circuits and a library of 102 experimental designs with known characterization parameters demonstrates a speed-up of 3 orders of magnitude when compared to other design methods that provide optimality guarantees.

  2. Accurate color synthesis of three-dimensional objects in an image

    NASA Astrophysics Data System (ADS)

    Xin, John H.; Shen, Hui-Liang

    2004-05-01

    Our study deals with color synthesis of a three-dimensional object in an image; i.e., given a single image, a target color can be accurately mapped onto the object such that the color appearance of the synthesized object closely resembles that of the actual one. As it is almost impossible to acquire the complete geometric description of the surfaces of an object in an image, this study attempted to recover the implicit description of geometry for the color synthesis. The description was obtained from either a series of spectral reflectances or the RGB signals at different surface positions on the basis of the dichromatic reflection model. The experimental results showed that this implicit image-based representation is related to the object geometry and is sufficient for accurate color synthesis of three-dimensional objects in an image. The method established is applicable to the color synthesis of both rigid and deformable objects and should contribute to color fidelity in virtual design, manufacturing, and retailing.
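
    A minimal sketch of the dichromatic reflection model behind the synthesis, with assumed body and illuminant colours: each RGB pixel is modelled as I = m_b * c_body + m_s * c_illum, the per-pixel shading factors (m_b, m_s) are recovered by least squares, and the target body colour is re-applied with the same factors so shading and highlights are preserved.

        # Recolour an object under the dichromatic reflection model.
        import numpy as np

        c_body_old = np.array([0.7, 0.2, 0.2])      # assumed original body colour
        c_illum    = np.array([1.0, 1.0, 1.0])      # assumed white illuminant
        c_body_new = np.array([0.2, 0.3, 0.8])      # target body colour

        rng = np.random.default_rng(4)
        n_pixels = 5
        m_b = rng.uniform(0.2, 1.0, n_pixels)       # hypothetical diffuse shading
        m_s = rng.uniform(0.0, 0.3, n_pixels)       # hypothetical highlights
        pixels = np.outer(m_b, c_body_old) + np.outer(m_s, c_illum)

        # Recover (m_b, m_s) per pixel by least squares against the two basis colours
        B = np.column_stack([c_body_old, c_illum])              # 3 x 2
        m, *_ = np.linalg.lstsq(B, pixels.T, rcond=None)        # 2 x n_pixels
        recoloured = np.outer(m[0], c_body_new) + np.outer(m[1], c_illum)
        print(recoloured.round(3))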

  3. Spectral neighbor analysis method for automated generation of quantum-accurate interatomic potentials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, Aidan P.; Swiler, Laura P.; Trott, Christian R.

    2015-03-15

    Here, we present a new interatomic potential for solids and liquids called Spectral Neighbor Analysis Potential (SNAP). The SNAP potential has a very general form and uses machine-learning techniques to reproduce the energies, forces, and stress tensors of a large set of small configurations of atoms, which are obtained using high-accuracy quantum electronic structure (QM) calculations. The local environment of each atom is characterized by a set of bispectrum components of the local neighbor density projected onto a basis of hyperspherical harmonics in four dimensions. The bispectrum components are the same bond-orientational order parameters employed by the GAP potential [1]. The SNAP potential, unlike GAP, assumes a linear relationship between atom energy and bispectrum components. The linear SNAP coefficients are determined using weighted least-squares linear regression against the full QM training set. This allows the SNAP potential to be fit in a robust, automated manner to large QM data sets using many bispectrum components. The calculation of the bispectrum components and the SNAP potential are implemented in the LAMMPS parallel molecular dynamics code. We demonstrate that a previously unnoticed symmetry property can be exploited to reduce the computational cost of the force calculations by more than one order of magnitude. We present results for a SNAP potential for tantalum, showing that it accurately reproduces a range of commonly calculated properties of both the crystalline solid and the liquid phases. In addition, unlike simpler existing potentials, SNAP correctly predicts the energy barrier for screw dislocation migration in BCC tantalum.

  4. Spectral neighbor analysis method for automated generation of quantum-accurate interatomic potentials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, A.P., E-mail: athomps@sandia.gov; Swiler, L.P., E-mail: lpswile@sandia.gov; Trott, C.R., E-mail: crtrott@sandia.gov

    2015-03-15

    We present a new interatomic potential for solids and liquids called Spectral Neighbor Analysis Potential (SNAP). The SNAP potential has a very general form and uses machine-learning techniques to reproduce the energies, forces, and stress tensors of a large set of small configurations of atoms, which are obtained using high-accuracy quantum electronic structure (QM) calculations. The local environment of each atom is characterized by a set of bispectrum components of the local neighbor density projected onto a basis of hyperspherical harmonics in four dimensions. The bispectrum components are the same bond-orientational order parameters employed by the GAP potential [1]. The SNAP potential, unlike GAP, assumes a linear relationship between atom energy and bispectrum components. The linear SNAP coefficients are determined using weighted least-squares linear regression against the full QM training set. This allows the SNAP potential to be fit in a robust, automated manner to large QM data sets using many bispectrum components. The calculation of the bispectrum components and the SNAP potential are implemented in the LAMMPS parallel molecular dynamics code. We demonstrate that a previously unnoticed symmetry property can be exploited to reduce the computational cost of the force calculations by more than one order of magnitude. We present results for a SNAP potential for tantalum, showing that it accurately reproduces a range of commonly calculated properties of both the crystalline solid and the liquid phases. In addition, unlike simpler existing potentials, SNAP correctly predicts the energy barrier for screw dislocation migration in BCC tantalum.

  5. Spectral neighbor analysis method for automated generation of quantum-accurate interatomic potentials

    NASA Astrophysics Data System (ADS)

    Thompson, A. P.; Swiler, L. P.; Trott, C. R.; Foiles, S. M.; Tucker, G. J.

    2015-03-01

    We present a new interatomic potential for solids and liquids called Spectral Neighbor Analysis Potential (SNAP). The SNAP potential has a very general form and uses machine-learning techniques to reproduce the energies, forces, and stress tensors of a large set of small configurations of atoms, which are obtained using high-accuracy quantum electronic structure (QM) calculations. The local environment of each atom is characterized by a set of bispectrum components of the local neighbor density projected onto a basis of hyperspherical harmonics in four dimensions. The bispectrum components are the same bond-orientational order parameters employed by the GAP potential [1]. The SNAP potential, unlike GAP, assumes a linear relationship between atom energy and bispectrum components. The linear SNAP coefficients are determined using weighted least-squares linear regression against the full QM training set. This allows the SNAP potential to be fit in a robust, automated manner to large QM data sets using many bispectrum components. The calculation of the bispectrum components and the SNAP potential are implemented in the LAMMPS parallel molecular dynamics code. We demonstrate that a previously unnoticed symmetry property can be exploited to reduce the computational cost of the force calculations by more than one order of magnitude. We present results for a SNAP potential for tantalum, showing that it accurately reproduces a range of commonly calculated properties of both the crystalline solid and the liquid phases. In addition, unlike simpler existing potentials, SNAP correctly predicts the energy barrier for screw dislocation migration in BCC tantalum.
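
    For energies, the fitting step described in this record reduces to a weighted linear least-squares problem: each reference energy is written as a linear combination of bispectrum components, and the coefficients are obtained by regression. The minimal sketch below illustrates that idea with NumPy on synthetic data; the array shapes, weights, and random "bispectrum" descriptors are hypothetical stand-ins, not the production fitting pipeline used with LAMMPS.

      import numpy as np

      # Hypothetical training data: each row of B holds the bispectrum components
      # (per-atom descriptors summed over a configuration); E_qm holds the
      # corresponding quantum-mechanical reference energies.
      rng = np.random.default_rng(0)
      n_configs, n_bispectrum = 200, 30
      B = rng.normal(size=(n_configs, n_bispectrum))
      true_coeffs = rng.normal(size=n_bispectrum)
      E_qm = B @ true_coeffs + 0.01 * rng.normal(size=n_configs)

      # Per-configuration weights (e.g. to emphasize near-equilibrium structures).
      w = np.ones(n_configs)

      # Weighted least squares: scale rows by sqrt(w) and solve min ||A x - b||.
      A = B * np.sqrt(w)[:, None]
      b = E_qm * np.sqrt(w)
      snap_coeffs, residuals, rank, sv = np.linalg.lstsq(A, b, rcond=None)

      # The predicted SNAP energy of a configuration is then a dot product.
      E_snap = B @ snap_coeffs

    Because the model is linear in the coefficients, adding forces and stresses to the fit only appends further rows (derivatives of the descriptors) to the same design matrix.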

  6. Reproducibility of graph metrics of human brain structural networks.

    PubMed

    Duda, Jeffrey T; Cook, Philip A; Gee, James C

    2014-01-01

    Recent interest in human brain connectivity has led to the application of graph theoretical analysis to human brain structural networks, in particular white matter connectivity inferred from diffusion imaging and fiber tractography. While these methods have been used to study a variety of patient populations, there has been less examination of the reproducibility of these methods. A number of tractography algorithms exist and many of these are known to be sensitive to user-selected parameters. The methods used to derive a connectivity matrix from fiber tractography output may also influence the resulting graph metrics. Here we examine how these algorithm and parameter choices influence the reproducibility of proposed graph metrics on a publicly available test-retest dataset consisting of 21 healthy adults. The dice coefficient is used to examine topological similarity of constant density subgraphs both within and between subjects. Seven graph metrics are examined here: mean clustering coefficient, characteristic path length, largest connected component size, assortativity, global efficiency, local efficiency, and rich club coefficient. The reproducibility of these network summary measures is examined using the intraclass correlation coefficient (ICC). Graph curves are created by treating the graph metrics as functions of a parameter such as graph density. Functional data analysis techniques are used to examine differences in graph measures that result from the choice of fiber tracking algorithm. The graph metrics consistently showed good levels of reproducibility as measured with ICC, with the exception of some instability at low graph density levels. The global and local efficiency measures were the most robust to the choice of fiber tracking algorithm.
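
    The topological-similarity measure used in this record, the Dice coefficient between constant-density subgraphs, can be made concrete with a short sketch. The thresholding rule, matrix sizes, and synthetic test-retest matrices below are hypothetical illustrations, not the authors' tractography pipeline.

      import numpy as np

      def constant_density_subgraph(weights, density):
          # Keep the strongest edges so that the given fraction of possible edges remains.
          n = weights.shape[0]
          iu = np.triu_indices(n, k=1)
          k = int(round(density * len(iu[0])))          # number of edges to keep
          order = np.argsort(weights[iu])[::-1][:k]      # indices of the strongest edges
          mask = np.zeros_like(weights, dtype=bool)
          mask[iu[0][order], iu[1][order]] = True
          return mask | mask.T

      def dice(a, b):
          # Dice coefficient between two binary adjacency matrices.
          inter = np.logical_and(a, b).sum()
          return 2.0 * inter / (a.sum() + b.sum())

      # Hypothetical test-retest connectivity matrices for one subject.
      rng = np.random.default_rng(1)
      w1 = rng.random((90, 90))
      w1 = (w1 + w1.T) / 2
      w2 = (w1 + 0.05 * rng.random((90, 90)))
      w2 = (w2 + w2.T) / 2
      print(dice(constant_density_subgraph(w1, 0.1),
                 constant_density_subgraph(w2, 0.1)))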

  7. Accurate Determination of the Frequency Response Function of Submerged and Confined Structures by Using PZT-Patches †

    PubMed Central

    Presas, Alexandre; Valentin, David; Egusquiza, Eduard; Valero, Carme; Egusquiza, Mònica; Bossio, Matias

    2017-01-01

    Accurately determining the dynamic response of a structure is of great interest in many engineering applications. Particularly, it is of paramount importance to determine the Frequency Response Function (FRF) for structures subjected to dynamic loads in order to avoid resonance and fatigue problems that can drastically reduce their useful life. One challenging case is the experimental determination of the FRF of submerged and confined structures, such as hydraulic turbines, which are greatly affected by dynamic problems as reported in many cases in the past. The utilization of classical and calibrated exciters such as instrumented hammers or shakers to determine the FRF in such structures can be very complex due to the confinement of the structure and because their use can disturb the boundary conditions affecting the experimental results. For such cases, Piezoelectric Patches (PZTs), which are very light, thin and small, could be a very good option. Nevertheless, the main drawback of these exciters is that the calibration as dynamic force transducers (relationship voltage/force) has not been successfully obtained in the past. Therefore, in this paper, a method to accurately determine the FRF of submerged and confined structures by using PZTs is developed and validated. The method consists of experimentally determining some characteristic parameters that define the FRF, with an uncalibrated PZT exciting the structure. These parameters, which have been experimentally determined, are then introduced in a validated numerical model of the tested structure. In this way, the FRF of the structure can be estimated with good accuracy. With respect to previous studies, where only the natural frequencies and mode shapes were considered, this paper discusses and experimentally proves the best excitation characteristic to also obtain the damping ratios, and proposes a procedure to fully determine the FRF. The method proposed here has been validated for the structure vibrating

  8. On the Reproducibility of Label-Free Quantitative Cross-Linking/Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Müller, Fränze; Fischer, Lutz; Chen, Zhuo Angel; Auchynnikava, Tania; Rappsilber, Juri

    2018-02-01

    Quantitative cross-linking/mass spectrometry (QCLMS) is an emerging approach to study conformational changes of proteins and multi-subunit complexes. Distinguishing protein conformations requires reproducibly identifying and quantifying cross-linked peptides. Here we analyzed the variation between multiple cross-linking reactions using bis[sulfosuccinimidyl] suberate (BS3)-cross-linked human serum albumin (HSA) and evaluated how reproducible cross-linked peptides can be identified and quantified by LC-MS analysis. To make QCLMS accessible to a broader research community, we developed a workflow that integrates the established software tools MaxQuant for spectra preprocessing, Xi for cross-linked peptide identification, and finally Skyline for quantification (MS1 filtering). Out of the 221 unique residue pairs identified in our sample, 124 were subsequently quantified across 10 analyses with coefficient of variation (CV) values of 14% (injection replica) and 32% (reaction replica). Thus our results demonstrate that the reproducibility of QCLMS is in line with the reproducibility of general quantitative proteomics and we establish a robust workflow for MS1-based quantitation of cross-linked peptides.
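
    The 14% and 32% figures quoted in this record are coefficients of variation (CV) across replicate quantifications. A minimal sketch with made-up MS1 intensities (not the study's data):

      import numpy as np

      # Hypothetical MS1 intensities of one cross-linked peptide across replicates.
      injection_replicates = np.array([1.02e7, 1.10e7, 0.95e7, 1.05e7])
      reaction_replicates = np.array([1.30e7, 0.80e7, 1.10e7, 0.90e7])

      def cv_percent(x):
          # Coefficient of variation in percent: sample SD divided by the mean.
          return 100.0 * np.std(x, ddof=1) / np.mean(x)

      print(f"injection CV: {cv_percent(injection_replicates):.1f}%")
      print(f"reaction CV:  {cv_percent(reaction_replicates):.1f}%")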

  9. High- and Reproducible-Performance Graphene/II-VI Semiconductor Film Hybrid Photodetectors

    PubMed Central

    Huang, Fan; Jia, Feixiang; Cai, Caoyuan; Xu, Zhihao; Wu, Congjun; Ma, Yang; Fei, Guangtao; Wang, Min

    2016-01-01

    High-performance, reproducible photodetectors are critical to the development of many technologies; they mainly include one-dimensional (1D) nanostructure-based and film-based photodetectors. The former suffer from huge performance variation because their performance is quite sensitive to the synthesis microenvironment of the 1D nanostructures. Herein, we show that graphene/semiconductor film hybrid photodetectors not only possess high performance but also have reproducible performance. As a demonstration, the as-produced graphene/ZnS film hybrid photodetector shows a high responsivity of 1.7 × 10^7 A/W and a fast response speed of 50 ms, together with highly reproducible performance in terms of the narrow distribution of photocurrent (38–65 μA) and response speed (40–60 ms) across 20 devices. Graphene/ZnSe film and graphene/CdSe film hybrid photodetectors fabricated by this method also show high and reproducible performance. The general method is compatible with the conventional planar process, could be easily standardized, and thus paves the way for photodetector applications. PMID:27349692

  10. Reproducibility of the spectral components of the electroencephalogram during driver fatigue.

    PubMed

    Lal, Saroj K L; Craig, Ashley

    2005-02-01

    To date, no study has tested the reproducibility of EEG changes that occur during driver fatigue. For the EEG changes to be useful in the development of a fatigue countermeasure device the EEG response during each onset period of fatigue in individuals needs to be reproducible. It should be noted that fatigue during driving is not a continuous process but consists of successive episodes of 'microsleeps' where the subject may go in and out of a fatigue state. The aim of the present study was to investigate the reproducibility of fatigue during driving in both professional and non-professional drivers. Thirty five non-professional drivers and twenty professional drivers were tested during two separate sessions of a driver simulator task. EEG, EOG and behavioural measurements of fatigue were obtained during the driving task. The results showed high reproducibility for the delta and theta bands (r>0.95) in both groups of drivers. The results are discussed in light of implications for future studies and for the development of an EEG based fatigue countermeasure device.

  11. New tools for Content Innovation and data sharing: Enhancing reproducibility and rigor in biomechanics research.

    PubMed

    Guilak, Farshid

    2017-03-21

    We are currently in one of the most exciting times for science and engineering as we witness unprecedented growth in our computational and experimental capabilities to generate new data and models. To facilitate data and model sharing, and to enhance reproducibility and rigor in biomechanics research, the Journal of Biomechanics has introduced a number of tools for Content Innovation to allow presentation, sharing, and archiving of methods, models, and data in our articles. The tools include an Interactive Plot Viewer, 3D Geometric Shape and Model Viewer, Virtual Microscope, Interactive MATLAB Figure Viewer, and Audioslides. Authors are highly encouraged to make use of these in upcoming journal submissions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Toward Accurate and Quantitative Comparative Metagenomics.

    PubMed

    Nayfach, Stephen; Pollard, Katherine S

    2016-08-25

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Reproducibility of airway luminal size in asthma measured by HRCT.

    PubMed

    Brown, Robert H; Henderson, Robert J; Sugar, Elizabeth A; Holbrook, Janet T; Wise, Robert A

    2017-10-01

    Brown RH, Henderson RJ, Sugar EA, Holbrook JT, Wise RA, on behalf of the American Lung Association Airways Clinical Research Centers. Reproducibility of airway luminal size in asthma measured by HRCT. J Appl Physiol 123: 876-883, 2017. First published July 13, 2017; doi:10.1152/japplphysiol.00307.2017.-High-resolution CT (HRCT) is a well-established imaging technology used to measure lung and airway morphology in vivo. However, there is a surprising lack of studies examining HRCT reproducibility. The CPAP Trial was a multicenter, randomized, three-parallel-arm, sham-controlled 12-wk clinical trial to assess the use of a nocturnal continuous positive airway pressure (CPAP) device on airway reactivity to methacholine. The lack of a treatment effect of CPAP on clinical or HRCT measures provided an opportunity for the current analysis. We assessed the reproducibility of HRCT imaging over 12 wk. Intraclass correlation coefficients (ICCs) were calculated for individual airway segments, individual lung lobes, both lungs, and air trapping. The ICC [95% confidence interval (CI)] for airway luminal size at total lung capacity ranged from 0.95 (0.91, 0.97) to 0.47 (0.27, 0.69). The ICC (95% CI) for airway luminal size at functional residual capacity ranged from 0.91 (0.85, 0.95) to 0.32 (0.11, 0.65). The ICC measurements for airway distensibility index and wall thickness were lower, ranging from poor (0.08) to moderate (0.63) agreement. The ICC for air trapping at functional residual capacity was 0.89 (0.81, 0.94) and varied only modestly by lobe from 0.76 (0.61, 0.87) to 0.95 (0.92, 0.97). In stable well-controlled asthmatic subjects, it is possible to reproducibly image unstimulated airway luminal areas over time, by region, and by size at total lung capacity throughout the lungs. Therefore, any changes in luminal size on repeat CT imaging are more likely due to changes in disease state and less likely due to normal variability. NEW & NOTEWORTHY There is a surprising lack

  14. Reproducible, high performance patch antenna array apparatus and method of fabrication

    DOEpatents

    Strassner, II, Bernd H.

    2007-01-23

    A reproducible, high-performance patch antenna array apparatus includes a patch antenna array provided on a unitary dielectric substrate, and a feed network provided on the same unitary substrate and proximity coupled to the patch antenna array. The reproducibility is enhanced by using photolithographic patterning and etching to produce both the patch antenna array and the feed network.

  15. Experimental phytophotodermatitis.

    PubMed

    Gonçalves, N E L; de Almeida, H L; Hallal, E C; Amado, M

    2005-12-01

    Phytophotodermatitis (PPD) is defined as a phototoxic reaction of the skin after contact with substances derived from plants and subsequent exposure to sunlight. It is a frequent disease in our outpatient clinics during summer because of contact with Tahitian lemon. Our objectives were to experimentally reproduce PPD in rats; to identify whether PPD is induced by minimal exposure periods to sunlight; to find out which kinds of lemon and which parts of the lemon (the fruit juice or the peel juice) may trigger the disease; to determine whether the use of sunblock prevents the reaction; and to perform light microscopy of the lesions to describe their histology. Adult rats (Rattus norvegicus), three in each experiment, were used. After being painted with the fruit juice or the peel juice, the rats were exposed to sunlight for 2.5, 5, 7.5, and 10 min. Tahitian and Sicilian lemons were used in the experiments. Biopsies with 3-mm punches were performed at the different exposure times. The peel juice of both lemons reproduced PPD, which was clinically evident after 48 h. When the peel juice alone was applied there was no reaction; moreover, exposure to sunlight alone triggered no reaction. An exposure time of two and a half minutes was sufficient to induce a phototoxic reaction, which was time dependent (the longer the exposure, the more intense the reaction). Histopathological studies showed time-dependent epithelial vacuolar degeneration. The use of sunblock diminished the intensity of the reaction but did not prevent it. PPD can be reproduced in an animal model. It may be caused by the peel juice of Tahitian and Sicilian lemons. Because an extremely short exposure time (2.5 min) is sufficient to induce PPD, it is necessary to alert the population to the need for caution when handling lemons outdoors, even when using sunblock.

  16. Relevant principal factors affecting the reproducibility of insect primary culture.

    PubMed

    Ogata, Norichika; Iwabuchi, Kikuo

    2017-06-01

    The primary culture of insect cells often suffers from problems with poor reproducibility in the quality of the final cell preparations. The cellular composition of the explants (cell number and cell types), surgical methods (surgical duration and surgical isolation), and physiological and genetic differences between donors may be critical factors affecting the reproducibility of culture. However, little is known about where biological variation (interindividual differences between donors) ends and technical variation (variance in replication of culture conditions) begins. In this study, we cultured larval fat bodies from the Japanese rhinoceros beetle, Allomyrina dichotoma, and evaluated, using linear mixed models, the effect of interindividual variation between donors on the reproducibility of the culture. We also performed transcriptome analysis of the hemocyte-like cells mainly seen in the cultures using RNA sequencing and ultrastructural analyses of hemocytes using a transmission electron microscope, revealing that the cultured cells have many characteristics of insect hemocytes.

  17. Wire like link for cycle reproducible and cycle accurate hardware accelerator

    DOEpatents

    Asaad, Sameh; Kapur, Mohit; Parker, Benjamin D

    2015-04-07

    First and second field programmable gate arrays are provided which implement first and second blocks of a circuit design to be simulated. The field programmable gate arrays are operated at a first clock frequency and a wire like link is provided to send a plurality of signals between them. The wire like link includes a serializer, on the first field programmable gate array, to serialize the plurality of signals; a deserializer on the second field programmable gate array, to deserialize the plurality of signals; and a connection between the serializer and the deserializer. The serializer and the deserializer are operated at a second clock frequency, greater than the first clock frequency, and the second clock frequency is selected such that latency of transmission and reception of the plurality of signals is less than the period corresponding to the first clock frequency.

  18. Experimental determination of satellite bolted joints thermal resistance

    NASA Technical Reports Server (NTRS)

    Mantelli, Marcia Barbosa Henriques; Basto, Jose Edson

    1990-01-01

    The thermal resistance of the bolted joints of the first Brazilian satellite (SCD 01) was experimentally determined. These joints, used to connect the satellite structural panels, are reproduced in an experimental apparatus, keeping, as much as possible, the actual dimensions and materials. A controlled amount of heat is forced to pass through the joint and the temperature difference between the panels is measured. The tests are conducted in a vacuum chamber with liquid-nitrogen-cooled walls that simulates the space environment. Experimental procedures are used to minimize heat losses, which are carefully calculated. Important observations are made about how the joint thermal resistance varies with the mean temperature.
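
    In a steady-state test of this kind, the joint thermal resistance follows directly from the measured temperature difference and the heat actually crossing the joint once the calculated losses are subtracted. A minimal sketch with hypothetical measured values (not the SCD 01 data):

      # Hypothetical steady-state measurement of one bolted joint.
      q_supplied_w = 5.0               # electrical power supplied to the heater, W
      q_losses_w = 0.4                 # calculated radiative/lead losses, W
      t_hot_c, t_cold_c = 41.3, 32.8   # panel temperatures on either side, deg C

      q_joint_w = q_supplied_w - q_losses_w
      delta_t_k = t_hot_c - t_cold_c          # a temperature difference in deg C equals K
      r_joint = delta_t_k / q_joint_w         # joint thermal resistance, K/W
      print(f"joint thermal resistance: {r_joint:.2f} K/W")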

  19. The extraction of accurate coordinates of images on photographic plates by means of a scanning type measuring machine

    NASA Technical Reports Server (NTRS)

    Ross, B. E.

    1971-01-01

    The Moire method of experimental stress analysis faces a problem similar to one encountered in astrometry: it is necessary to extract accurate coordinates from images on photographic plates. The solution to this shared problem, found to be applicable to the field of experimental stress analysis, is presented, and the measurement problem is outlined. A discussion of the photo-reading device developed to make the measurements follows.

  20. Exploring the Role of Information Professionals in Improving Research Reproducibility:A Case Study in Geosciences

    NASA Astrophysics Data System (ADS)

    Yan, A.; West, J.

    2016-12-01

    The validity of geosciences research is of great significance to the general public and policy-makers. In an earlier study, we surveyed 136 faculty and graduate students in geosciences. The result indicated that nearly 80% of respondents who had ever attempted to reproduce a published study had failed at least once, suggesting a general lack of research reproducibility in geosciences. Although there is much enthusiasm for creating technologies such as workflow systems, literate programming, and cloud-based systems to facilitate reproducibility, much less emphasis has been placed on the information services essential for meaningful use of these tools. Library and Information Science (LIS) has a rich tradition of providing customized services for research communities. LIS professionals such as academic librarians have made strong contributions to locating resources, recommending software, data curation, metadata guidance, project management, submission review, and author training. In particular, university libraries have been actively developing tools and offering guidelines, consultations, and training on the Data Management Plans (DMPs) required by the National Science Foundation (NSF), and effective data management is a significant first step toward reproducible research. We therefore argue that LIS professionals may be well positioned to help researchers make their research reproducible. In this study, we aim to answer the question: how can LIS professionals assist geoscience researchers in making their research capable of being reproduced? We first synthesize different definitions of "reproducibility" and provide a conceptual framework of "reproducibility" in geosciences to resolve some of the misunderstandings around related terminology. Using a case study approach, we then examine 1) university librarians' technical skills, domain knowledge, professional activities, together with their awareness of, readiness for, and attitudes towards research reproducibility and

  1. Numerical and Experimental Studies of Particle Settling in Real Fracture Geometries

    NASA Astrophysics Data System (ADS)

    Roy, Pratanu; Du Frane, Wyatt L.; Kanarska, Yuliya; Walsh, Stuart D. C.

    2016-11-01

    Proppant is a vital component of hydraulic stimulation operations, improving conductivity by maintaining fracture aperture. While correct placement is a necessary part of ensuring that proppant performs efficiently, the transport behavior of proppant in natural rock fractures is poorly understood. In particular, as companies pursue new propping strategies involving new types of proppant, more accurate models of proppant behavior are needed to help guide their deployment. A major difficulty with simulating reservoir-scale proppant behavior is that continuum models traditionally used to represent large-scale slurry behavior lose applicability in fracture geometries. Particle transport models are often based on representative volumes that are at the same scale or larger than fractures found in hydraulic fracturing operations, making them inappropriate for modeling these types of flows. In the absence of a first-principles approach, empirical closure relations are needed. However, even such empirical closure relationships are difficult to derive without an accurate understanding of proppant behavior on the particle level. Thus, there is a need for experiments and simulations capable of probing phenomena at the sub-fracture scale. In this paper, we present results from experimental and numerical studies investigating proppant behavior at the sub-fracture level, in particular, the role of particle dispersion during proppant settling. In the experimental study, three-dimensional printing techniques are used to accurately reproduce the topology of a fractured Marcellus shale sample inside a particle-flow cell. By recreating the surface in clear plastic resin, proppant movement within the fracture can be tracked directly in real time without the need for X-ray imaging. Particle tracking is further enhanced through the use of mixtures of transparent and opaque proppant analogues. The accompanying numerical studies employ a high-fidelity three-dimensional particle-flow model

  2. Exhaled breath temperature in children: reproducibility and influencing factors.

    PubMed

    Vermeulen, S; Barreto, M; La Penna, F; Prete, A; Martella, S; Biagiarelli, F; Villa, M P

    2014-09-01

    This study investigated the reproducibility and influencing factors of exhaled breath temperature measured with the tidal breathing technique in asthmatic patients and healthy children. Exhaled breath temperature, fractional exhaled nitric oxide, and spirometry were assessed in 124 children (63 healthy and 61 asthmatic), aged 11.2 ± 2.5 years, M/F 73/51. A modified version of the American Thoracic Society questionnaire on the child's present and past respiratory history was obtained from parents. Parents were also asked to provide detailed information on their child's medication use during the previous 4 weeks. Ear temperature, ambient temperature, and relative ambient humidity were also recorded. Exhaled breath temperature measurements were highly reproducible; the second measurement was higher than the first measurement, consistent with a test-retest situation. In 13 subjects, between-session within-day reproducibility of exhaled breath temperature was still high. Exhaled breath temperature increased with age and relative ambient humidity. Exhaled breath temperature was comparable in healthy and asthmatic children; when adjusted for potential confounders (i.e. ambient conditions and subject characteristics), thermal values of asthmatic patients exceeded those of the healthy children by 1.1 °C. Exhaled breath temperature normalized by subtracting ambient temperature was lower in asthmatic patients treated with inhaled corticosteroids than in those who were corticosteroid-naive. Measurements of exhaled breath temperature are highly reproducible, yet influenced by several factors. Corrected values, i.e. normalized exhaled breath temperature, could help us to assess the effect of therapy with inhaled corticosteroids. More studies are needed to improve the usefulness of the exhaled breath temperature measured with the tidal breathing technique in children.

  3. Random sampling causes the low reproducibility of rare eukaryotic OTUs in Illumina COI metabarcoding.

    PubMed

    Leray, Matthieu; Knowlton, Nancy

    2017-01-01

    common β descriptors but will exclude positive records of taxa that are functionally important. Our results further reinforce the need for technical replicates (parallel PCR and sequencing from the same sample) in metabarcoding experimental designs. Data reproducibility should be determined empirically as it will depend upon the sequencing depth, the type of sample, the sequence analysis pipeline, and the number of replicates. Moreover, estimating relative biomasses or abundances based on read counts remains elusive at the OTU level.

  4. Accuracy and reproducibility of virtual edentulous casts created by laboratory impression scan protocols.

    PubMed

    Peng, Lingyan; Chen, Li; Harris, Bryan T; Bhandari, Bikash; Morton, Dean; Lin, Wei-Shao

    2018-04-24

    significantly different (P=.968), and the percentage of measurement data points within 1 SD of mean RMS values (90.1 ±1.1% versus 89.5 ±0.8%) were also not significantly different (P=.662). The numeric distance differences across 5 regions were affected by the scanning protocols (P<.001). The laboratory scanner and laboratory scanner-spray groups had significantly higher numeric distance differences at the apex of the denture border and crest of the ridge regions (P<.001). The CBCT scanner created more accurate and reproducible virtual edentulous casts, and the antiglare spray only significantly improved the accuracy and reproducibility of virtual edentulous casts created by the dental laboratory laser scanner. The accuracy of the virtual edentulous casts was different across 5 regions and was affected by the scanning protocols. Copyright © 2018 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  5. Can the Λ CDM model reproduce MOND-like behavior?

    NASA Astrophysics Data System (ADS)

    Dai, De-Chang; Lu, Chunyu

    2017-12-01

    It is usually believed that MOND can describe the galactic rotational curves with only baryonic matter and without any dark matter very well, while the Λ CDM model is expected to have difficulty in reproducing MOND-like behavior. Here, we use EAGLE's data to learn whether the Λ CDM model can reproduce MOND-like behavior. EAGLE's simulation result clearly reproduces the MOND-like behavior for a_b ⪆ 10^-12 m/s^2 at z = 0, although the acceleration constant, a_0, is a little larger than the observational data indicate. We find that a_0 increases with the redshift in a way different from what Milgrom proposed (a_0 ∝ H). Therefore, while galaxy rotation curves can be fitted by MOND's empirical function in the Λ CDM model, there is no clear connection between a_0 and the Hubble constant. We also find that a_0 at z ⪆ 1 is well separated from a_0 at z = 0. Once we have enough galaxies observed at high redshifts, we will be able to rule out the modified gravity model based on a MOND-like empirical function with a z-independent a_0.
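
    For readers unfamiliar with the empirical function referred to in this record, one commonly used MOND-like form (the radial-acceleration-relation fitting function) maps the baryonic, Newtonian acceleration g_b onto the observed acceleration through a single scale a_0. The sketch below evaluates that relation for the commonly quoted low-redshift value of a_0; it is purely illustrative and is not the fit to the EAGLE data discussed in the abstract.

      import numpy as np

      A0 = 1.2e-10  # m/s^2, roughly the commonly quoted acceleration scale at z = 0

      def g_observed(g_baryonic, a0=A0):
          # Empirical radial-acceleration-relation fitting function.
          return g_baryonic / (1.0 - np.exp(-np.sqrt(g_baryonic / a0)))

      g_b = np.logspace(-12, -9, 7)   # baryonic accelerations, m/s^2
      for gb, gobs in zip(g_b, g_observed(g_b)):
          # For g_b >> a0 the relation is Newtonian; for g_b << a0, g_obs -> sqrt(g_b * a0).
          print(f"g_b = {gb:.2e} -> g_obs = {gobs:.2e}")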

  6. Accurate simulation of backscattering spectra in the presence of sharp resonances

    NASA Astrophysics Data System (ADS)

    Barradas, N. P.; Alves, E.; Jeynes, C.; Tosaki, M.

    2006-06-01

    In elastic backscattering spectrometry, the shape of the observed spectrum due to resonances in the nuclear scattering cross-section is influenced by many factors. If the energy spread of the beam before interaction is larger than the resonance width, then a simple convolution with the energy spread on exit and with the detection system resolution will lead to a calculated spectrum with a resonance much sharper than the observed signal. Also, the yield from a thin layer will not be calculated accurately. We have developed an algorithm for the accurate simulation of backscattering spectra in the presence of sharp resonances. Albeit approximate, the algorithm leads to dramatic improvements in the quality and accuracy of the simulations. It is simple to implement and leads to only small increases of the calculation time, being thus suitable for routine data analysis. We show different experimental examples, including samples with roughness and porosity.

  7. Accurate structural and spectroscopic characterization of prebiotic molecules: The neutral and cationic acetyl cyanide and their related species.

    PubMed

    Bellili, A; Linguerri, R; Hochlaf, M; Puzzarini, C

    2015-11-14

    In an effort to provide an accurate structural and spectroscopic characterization of acetyl cyanide, its two enolic isomers and the corresponding cationic species, state-of-the-art computational methods and approaches have been employed. Coupled-cluster theory including single and double excitations together with a perturbative treatment of triples has been used as the starting point in composite schemes accounting for extrapolation to the complete basis-set limit as well as core-valence correlation effects to determine highly accurate molecular structures, fundamental vibrational frequencies, and rotational parameters. The available experimental data for acetyl cyanide allowed us to assess the reliability of our computations: structural, energetic, and spectroscopic properties have been obtained with an overall accuracy of about, or better than, 0.001 Å, 2 kcal/mol, 1-10 MHz, and 11 cm^-1 for bond distances, adiabatic ionization potentials, rotational constants, and fundamental vibrational frequencies, respectively. We are therefore confident that the highly accurate spectroscopic data provided herein can be useful for guiding future experimental investigations and/or astronomical observations.
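
    Composite schemes of the kind mentioned in this record typically extrapolate correlation energies computed with hierarchical basis sets to the complete-basis-set (CBS) limit. The sketch below implements the standard two-point X^-3 extrapolation as a generic illustration; the cardinal numbers and energies are hypothetical, and this is not necessarily the exact scheme used by the authors.

      def cbs_two_point(e_x, x, e_y, y):
          # Two-point extrapolation of the correlation energy, assuming
          # E(X) = E_CBS + A / X^3 and solving from cardinal numbers x and y.
          return (x**3 * e_x - y**3 * e_y) / (x**3 - y**3)

      # Hypothetical CCSD(T) correlation energies (hartree) with triple- and quadruple-zeta sets.
      e_tz, e_qz = -0.512345, -0.530123
      e_cbs = cbs_two_point(e_tz, 3, e_qz, 4)
      print(f"extrapolated correlation energy: {e_cbs:.6f} Eh")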

  8. EVALUATION OF THE REPRODUCIBILITY OF TWO TECHNIQUES USED TO DETERMINE AND RECORD CENTRIC RELATION IN ANGLE’S CLASS I PATIENTS

    PubMed Central

    Paixão, Fernanda; Silva, Wilkens Aurélio Buarque e; Silva, Frederico Andrade e; Ramos, Guilherme da Gama; Cruz, Mônica Vieira de Jesus

    2007-01-01

    The centric relation is a mandibular position that establishes a balanced relation among the temporomandibular joints, the masticatory muscles and the occlusion. This position makes it possible for the dentist to plan and execute oral rehabilitation respecting the physiological principles of the stomatognathic system. The aim of this study was to investigate the reproducibility of centric relation records obtained using two techniques: Dawson's Bilateral Manipulation and Gysi's Gothic Arch Tracing. Twenty volunteers (14 females and 6 males) with no dental loss, presenting occlusal contacts consistent with Angle's Class I classification and without signs and symptoms of temporomandibular disorders, were selected. All volunteers underwent Dawson's Bilateral Manipulation and Gysi's Gothic Arch Tracing (the latter with the aid of an intraoral apparatus) five times, at 1-week intervals and always at the same time of day. The average standard error of each technique was calculated (Bilateral Manipulation 0.94 and Gothic Arch Tracing 0.27). The Shapiro-Wilk test was applied and the results allowed application of Student's t-test (sampling error of 5%). The techniques showed different degrees of variability. Gysi's Gothic Arch Tracing was found to be more accurate than the Bilateral Manipulation in reproducing centric relation records. PMID:19089144

  9. Intra-observer reproducibility and diagnostic performance of breast shear-wave elastography in Asian women.

    PubMed

    Park, Hye Young; Han, Kyung Hwa; Yoon, Jung Hyun; Moon, Hee Jung; Kim, Min Jung; Kim, Eun-Kyung

    2014-06-01

    Our aim was to evaluate intra-observer reproducibility of shear-wave elastography (SWE) in Asian women. Sixty-four breast masses (24 malignant, 40 benign) were examined with SWE in 53 consecutive Asian women (mean age, 44.9 y old). Two SWE images were obtained for each of the lesions. The intra-observer reproducibility was assessed by intra-class correlation coefficients (ICC). We also evaluated various clinicoradiologic factors that can influence reproducibility in SWE. The ICC of intra-observer reproducibility was 0.789. In clinicoradiologic factor evaluation, masses surrounded by mixed fatty and glandular tissue (ICC: 0.619) showed lower intra-observer reproducibility compared with lesions that were surrounded by glandular tissue alone (ICC: 0.937; p < 0.05). Overall, the intra-observer reproducibility of breast SWE was excellent in Asian women. However, it may decrease when breast tissue is in a heterogeneous background. Therefore, SWE should be performed carefully in these cases. Copyright © 2014 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.

  10. Long-term reproducibility of relative sensitivity factors obtained with CAMECA Wf

    NASA Astrophysics Data System (ADS)

    Gui, D.; Xing, Z. X.; Huang, Y. H.; Mo, Z. Q.; Hua, Y. N.; Zhao, S. P.; Cha, L. Z.

    2008-12-01

    As the wafer size continues to increase and the feature size of integrated circuits (IC) continues to shrink, process control of IC manufacturing becomes ever more important to reduce the cost of failures caused by the drift of processes or equipment. Characterization tools with high precision and reproducibility are required to capture any abnormality of the process. Although secondary ion mass spectrometry (SIMS) has been widely used in dopant profile control, it was reported that magnetic sector SIMS, compared to quadrupole SIMS, has lower short-term repeatability and long-term reproducibility due to the high extraction field applied between the sample and the extraction lens. In this paper, we demonstrate that the CAMECA Wf can deliver high long-term reproducibility because of its high-level automation and improved design of the immersion lens. The relative standard deviations (R.S.D.) of the relative sensitivity factors (RSF) of three typical elements, i.e., boron (B), phosphorus (P) and nitrogen (N), over 3 years are 3.7%, 5.5% and 4.1%, respectively. These high-reproducibility results have the practical implication that deviations can be estimated without testing the standards.

  11. Accessing the reproducibility and specificity of pepsin and other aspartic proteases.

    PubMed

    Ahn, Joomi; Cao, Min-Jie; Yu, Ying Qing; Engen, John R

    2013-06-01

    The aspartic protease pepsin is less specific than other endoproteinases. Because aspartic proteases like pepsin are active at low pH, they are utilized in hydrogen deuterium exchange mass spectrometry (HDX MS) experiments for digestion under hydrogen exchange quench conditions. We investigated the reproducibility, both qualitatively and quantitatively, of online and offline pepsin digestion to understand the complement of reproducible pepsin fragments that can be expected during a typical pepsin digestion. The collection of reproducible peptides was identified from >30 replicate digestions of the same protein and it was found that the number of reproducible peptides produced during pepsin digestion becomes constant above 5-6 replicate digestions. We also investigated a new aspartic protease from the stomach of the rice field eel (Monopterus albus Zuiew) and compared digestion efficiency and specificity to porcine pepsin and aspergillopepsin. Unique cleavage specificity was found for rice field eel pepsin at arginine, asparagine, and glycine. Different peptides produced by the various proteases can enhance protein sequence coverage and improve the spatial resolution of HDX MS data. This article is part of a Special Issue entitled: Mass spectrometry in structural biology. Copyright © 2012 Elsevier B.V. All rights reserved.

  12. An Accurate Full-flexion Anterolateral Portal for Needle Placement in the Knee Joint With Dry Osteoarthritis.

    PubMed

    Hussein, Mohamed

    2017-07-01

    Accurate delivery of an injection into the intra-articular space of the knee is achieved in only two thirds of knees when using the standard anterolateral portal. The use of a modified full-flexion anterolateral portal provides a highly accurate, less painful, and more effective method for reproducible intra-articular injection without the need for ultrasonographic or fluoroscopic guidance in patients with dry osteoarthritis of the knee. The accuracy of needle placement was assessed in a prospective series of 140 consecutive injections in patients with symptomatic degenerative knee arthritis without clinical knee effusion. Procedural pain was determined using the Numerical Rating Scale. The accuracy rates of needle placement were confirmed with fluoroscopic imaging to document the dispersion pattern of injected contrast material. Using the standard anterolateral portal, 52 of 70 injections were confirmed to have been placed in the intra-articular space on the first attempt (accuracy rate, 74.2%). Using the modified full-flexion anterolateral portal, 68 of 70 injections were placed in the intra-articular space on the first attempt (accuracy rate, 97.1%; P = 0.000). This study revealed that using the modified full-flexion anterolateral portal for injections into the knee joint resulted in more accurate and less painful injections than those performed by the same orthopaedic surgeon using the standard anterolateral portal. In addition, the technique offered therapeutic delivery into the joint without the need for fluoroscopic confirmation. Therapeutic Level II.

  13. Scattering, absorption and transmittance of experimental graphene dental nanocomposites

    NASA Astrophysics Data System (ADS)

    Pérez, María. M.; Salas, Marianne; Moldovan, Marionara; Dudea, Diana; Yebra, Ana; Ghinea, Razvan

    2017-08-01

    Optical properties of experimental graphene dental nanocomposites were studied. Spectral reflectance was measured, and the S and K coefficients as well as the transmittance of the samples were calculated using the Kubelka-Munk equations. The spectral behavior of S, K and T of the experimental graphene nanocomposites exhibited different trends compared with the commercial nanocomposites, and the differences were statistically significant. The experimental nanocomposites show higher scattering and lower transmittance than the commercial nanocomposite, probably due to the shape, type and size of the filler. K at short wavelengths for the pre-polymerized experimental nanocomposites was very low. According to our results, the hydroxyapatite with graphene oxide used in dental nanocomposites needs to be improved to reproduce the esthetic properties of natural dental tissues and to have potential clinical applications.
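
    For context, the Kubelka-Munk analysis referred to in this record relates the measured diffuse reflectance of an optically thick sample to the ratio of the absorption (K) and scattering (S) coefficients. The sketch below evaluates that standard relation for a hypothetical reflectance spectrum; separating K and S individually, as in the study, additionally requires measurements at known thicknesses or over different backings.

      import numpy as np

      def kubelka_munk_ks(reflectance):
          # K/S ratio from the diffuse reflectance R_inf of an optically thick layer.
          r = np.asarray(reflectance, dtype=float)
          return (1.0 - r) ** 2 / (2.0 * r)

      # Hypothetical spectral reflectance of a nanocomposite sample (400-700 nm).
      wavelengths_nm = np.arange(400, 701, 50)
      reflectance = np.array([0.35, 0.42, 0.48, 0.52, 0.55, 0.57, 0.58])
      for wl, ks in zip(wavelengths_nm, kubelka_munk_ks(reflectance)):
          print(f"{wl} nm: K/S = {ks:.3f}")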

  14. Reproducibility of cervical range of motion in patients with neck pain

    PubMed Central

    Hoving, Jan Lucas; Pool, Jan JM; van Mameren, Henk; Devillé, Walter JLM; Assendelft, Willem JJ; de Vet, Henrica CW; de Winter, Andrea F; Koes, Bart W; Bouter, Lex M

    2005-01-01

    Background Reproducibility measurements of the range of motion are an important prerequisite for the interpretation of study results. The aim of the study is to assess the intra-rater and inter-rater reproducibility of the measurement of active Range of Motion (ROM) in patients with neck pain using the Cybex Electronic Digital Inclinometer-320 (EDI-320). Methods In an outpatient clinic in a primary care setting, 32 patients with at least 2 weeks of pain and/or stiffness in the neck were randomly assessed, in a test-retest design with blinded raters using a standardized measurement protocol. Cervical flexion-extension, lateral flexion and rotation were assessed. Results Reliability expressed by the Intraclass Correlation Coefficient (ICC) was 0.93 (lateral flexion) or higher for intra-rater reliability and 0.89 (lateral flexion) or higher for inter-rater reliability. The 95% limits of agreement for intra-rater agreement, expressing the range of the differences between two ratings, were -2.5 ± 11.1° for flexion-extension, -0.1 ± 10.4° for lateral flexion and -5.9 ± 13.5° for rotation. For inter-rater agreement the limits of agreement were 3.3 ± 17.0° for flexion-extension, 0.5 ± 17.0° for lateral flexion and -1.3 ± 24.6° for rotation. Conclusion In general, the intra-rater reproducibility and the inter-rater reproducibility were good. We recommend comparing the reproducibility and clinical applicability of the EDI-320 inclinometer with other cervical ROM measures in symptomatic patients. PMID:16351719
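
    Agreement statistics of the kind reported in this record can be computed from raw test-retest measurements in a few lines. The sketch below derives Bland-Altman 95% limits of agreement and a simple one-way ICC from hypothetical paired ROM ratings; it illustrates the statistics in general, not the study's own analysis or ICC variant.

      import numpy as np

      # Hypothetical flexion-extension ROM (degrees): two ratings of the same subjects.
      rating1 = np.array([101.0, 95.5, 110.2, 88.4, 120.1, 99.0, 105.3, 92.7])
      rating2 = np.array([103.5, 94.0, 112.0, 90.1, 118.7, 101.2, 104.0, 95.3])

      # Bland-Altman 95% limits of agreement: mean difference +/- 1.96 SD of the differences.
      diff = rating1 - rating2
      loa_low = diff.mean() - 1.96 * diff.std(ddof=1)
      loa_high = diff.mean() + 1.96 * diff.std(ddof=1)
      print(f"limits of agreement: {loa_low:.1f} to {loa_high:.1f} deg")

      # One-way random-effects ICC(1,1) from between- and within-subject mean squares.
      data = np.column_stack([rating1, rating2])
      n, k = data.shape
      grand = data.mean()
      msb = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)
      msw = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
      icc = (msb - msw) / (msb + (k - 1) * msw)
      print(f"ICC(1,1) = {icc:.3f}")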

  15. Accurate modeling of high-repetition rate ultrashort pulse amplification in optical fibers

    PubMed Central

    Lindberg, Robert; Zeil, Peter; Malmström, Mikael; Laurell, Fredrik; Pasiskevicius, Valdas

    2016-01-01

    A numerical model for amplification of ultrashort pulses with high repetition rates in fiber amplifiers is presented. The pulse propagation is modeled by jointly solving the steady-state rate equations and the generalized nonlinear Schrödinger equation, which allows accurate treatment of nonlinear and dispersive effects whilst considering arbitrary spatial and spectral gain dependencies. Data acquired using the developed model are in good agreement with experimental results. PMID:27713496
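
    Propagation models of this kind typically advance the field with a split-step Fourier scheme, applying dispersion in the frequency domain and nonlinearity (plus gain) in the time domain over each small step. The bare-bones scalar sketch below uses hypothetical fiber parameters and a fixed gain value; it omits the rate-equation gain coupling, higher-order dispersion, Raman and self-steepening terms of a full generalized-NLSE solver such as the one described above.

      import numpy as np

      # Hypothetical fiber/step parameters.
      beta2 = -20e-27      # group-velocity dispersion, s^2/m
      gamma = 1.5e-3       # nonlinear coefficient, 1/(W m)
      g = 0.5              # saturated power gain, 1/m (would come from the rate equations)
      dz = 1e-3            # step size, m

      n = 2 ** 12
      dt = 10e-15
      t = (np.arange(n) - n // 2) * dt
      omega = 2 * np.pi * np.fft.fftfreq(n, dt)
      field = np.sqrt(100.0) * np.exp(-(t / 200e-15) ** 2)   # 100 W Gaussian pulse

      def split_step(a, steps):
          half_lin = np.exp((1j * beta2 / 2 * omega ** 2 + g / 2) * dz / 2)
          for _ in range(steps):
              a = np.fft.ifft(half_lin * np.fft.fft(a))        # half linear step
              a = a * np.exp(1j * gamma * np.abs(a) ** 2 * dz)  # full nonlinear step
              a = np.fft.ifft(half_lin * np.fft.fft(a))        # half linear step
          return a

      out = split_step(field, steps=100)
      print(f"output peak power: {np.abs(out).max() ** 2:.1f} W")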

  16. Traveling waves in a magnetized Taylor-Couette flow.

    PubMed

    Liu, Wei; Goodman, Jeremy; Ji, Hantao

    2007-07-01

    We investigate numerically a traveling wave pattern observed in experimental magnetized Taylor-Couette flow at low magnetic Reynolds number. By accurately modeling viscous and magnetic boundaries in all directions, we reproduce the experimentally measured wave patterns and their amplitudes. Contrary to previous claims, the waves are shown to be transiently amplified disturbances launched by viscous boundary layers, rather than globally unstable magnetorotational modes.

  17. Reproducibility and Transparency in Ocean-Climate Modeling

    NASA Astrophysics Data System (ADS)

    Hannah, N.; Adcroft, A.; Hallberg, R.; Griffies, S. M.

    2015-12-01

    Reproducibility is a cornerstone of the scientific method. Within geophysical modeling and simulation, achieving reproducibility can be difficult, especially given the complexity of numerical codes, enormous and disparate data sets, and the variety of supercomputing technology. We have made progress on this problem in the context of a large project: the development of new ocean and sea ice models, MOM6 and SIS2. Here we present useful techniques and experience. We use version control not only for code but for the entire experiment working directory, including configuration (run-time parameters, component versions), input data and checksums on experiment output. This allows us to document when the solutions to experiments change, whether due to code updates or changes in input data. To avoid distributing large input datasets we provide the tools for generating these from the sources, rather than provide raw input data. Bugs can be a source of non-determinism and hence irreproducibility, e.g. reading from or branching on uninitialized memory. To expose these we routinely run system tests, using a memory debugger, multiple compilers and different machines. Additional confidence in the code comes from specialised tests, for example automated dimensional analysis and domain transformations. This has entailed adopting a code style where we deliberately restrict what a compiler can do when re-arranging mathematical expressions. In the spirit of open science, all development is in the public domain. This leads to a positive feedback, where increased transparency and reproducibility make using the model easier for external collaborators, who in turn provide valuable contributions. To facilitate users installing and running the model we provide (version controlled) digital notebooks that illustrate and record analysis of output. This has the dual role of providing a gross, platform-independent testing capability and a means to document model output and analysis.
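
    One practice described in this record, checksumming experiment output so that solution changes are detected under version control, can be implemented with a few lines of standard-library Python. This is a generic sketch, not the MOM6/SIS2 tooling itself, and the manifest layout is an assumption for illustration.

      import hashlib
      import json
      import sys
      from pathlib import Path

      def checksum_manifest(output_dir):
          # Return {relative file path: sha256 digest} for every file under output_dir.
          manifest = {}
          for path in sorted(Path(output_dir).rglob("*")):
              if path.is_file():
                  digest = hashlib.sha256(path.read_bytes()).hexdigest()
                  manifest[str(path.relative_to(output_dir))] = digest
          return manifest

      if __name__ == "__main__":
          # Write a manifest that can be committed alongside the experiment configuration;
          # a later run reproduces the solution only if the regenerated manifest matches.
          print(json.dumps(checksum_manifest(sys.argv[1]), indent=2))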

  18. Reproducibility Issues: Avoiding Pitfalls in Animal Inflammation Models.

    PubMed

    Laman, Jon D; Kooistra, Susanne M; Clausen, Björn E

    2017-01-01

    In light of an enhanced awareness of ethical questions and ever-increasing costs when working with animals in biomedical research, there is a dedicated and sometimes fierce debate concerning the (lack of) reproducibility of animal models and their relevance for human inflammatory diseases. Despite evident advancements in searching for alternatives, that is, replacing, reducing, and refining animal experiments (the three R's of Russell and Burch, 1959), understanding the complex interactions of the cells of the immune system, the nervous system and the affected tissue/organ during inflammation critically relies on in vivo models. Consequently, scientific advancement and ultimately novel therapeutic interventions depend on improving the reproducibility of animal inflammation models. As a prelude to the remaining hands-on protocols described in this volume, here, we summarize potential pitfalls of preclinical animal research and provide resources and background reading on how to avoid them.

  19. Properties of galaxies reproduced by a hydrodynamic simulation

    NASA Astrophysics Data System (ADS)

    Vogelsberger, M.; Genel, S.; Springel, V.; Torrey, P.; Sijacki, D.; Xu, D.; Snyder, G.; Bird, S.; Nelson, D.; Hernquist, L.

    2014-05-01

    Previous simulations of the growth of cosmic structures have broadly reproduced the `cosmic web' of galaxies that we see in the Universe, but failed to create a mixed population of elliptical and spiral galaxies, because of numerical inaccuracies and incomplete physical models. Moreover, they were unable to track the small-scale evolution of gas and stars to the present epoch within a representative portion of the Universe. Here we report a simulation that starts 12 million years after the Big Bang, and traces 13 billion years of cosmic evolution with 12 billion resolution elements in a cube of 106.5 megaparsecs a side. It yields a reasonable population of ellipticals and spirals, reproduces the observed distribution of galaxies in clusters and characteristics of hydrogen on large scales, and at the same time matches the `metal' and hydrogen content of galaxies on small scales.

  20. ChIP-seq Accurately Predicts Tissue-Specific Activity of Enhancers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Visel, Axel; Blow, Matthew J.; Li, Zirong

    2009-02-01

    A major yet unresolved quest in decoding the human genome is the identification of the regulatory sequences that control the spatial and temporal expression of genes. Distant-acting transcriptional enhancers are particularly challenging to uncover since they are scattered amongst the vast non-coding portion of the genome. Evolutionary sequence constraint can facilitate the discovery of enhancers, but fails to predict when and where they are active in vivo. Here, we performed chromatin immunoprecipitation with the enhancer-associated protein p300, followed by massively-parallel sequencing, to map several thousand in vivo binding sites of p300 in mouse embryonic forebrain, midbrain, and limb tissue. We tested 86 of these sequences in a transgenic mouse assay, which in nearly all cases revealed reproducible enhancer activity in those tissues predicted by p300 binding. Our results indicate that in vivo mapping of p300 binding is a highly accurate means for identifying enhancers and their associated activities and suggest that such datasets will be useful to study the role of tissue-specific enhancers in human biology and disease on a genome-wide scale.

  1. Federating heterogeneous datasets to enhance data sharing and experiment reproducibility

    NASA Astrophysics Data System (ADS)

    Prieto, Juan C.; Paniagua, Beatriz; Yatabe, Marilia S.; Ruellas, Antonio C. O.; Fattori, Liana; Muniz, Luciana; Styner, Martin; Cevidanes, Lucia

    2017-03-01

    Recent studies have demonstrated the difficulty of replicating scientific findings and/or experiments published in the past.1 The effects seen in the replicated experiments were smaller than previously reported. Some of the explanations for these findings include the complexity of the experimental design and the pressure on researchers to report positive findings. The International Committee of Medical Journal Editors (ICMJE) suggests that every study considered for publication must submit a plan to share the de-identified patient data no later than 6 months after publication. There is a growing demand to enhance the management of clinical data, facilitate data sharing across institutions and also to keep track of the data from previous experiments. The ultimate goal is to assure the reproducibility of experiments in the future. This paper describes Shiny-tooth, a web-based application created to improve clinical data acquisition during clinical trials and to federate such data together with morphological data derived from medical images. Currently, this application is being used to store clinical data from an osteoarthritis (OA) study. This work is submitted to the SPIE Biomedical Applications in Molecular, Structural, and Functional Imaging conference.

  2. Reproducible Computing: a new Technology for Statistics Education and Educational Research

    NASA Astrophysics Data System (ADS)

    Wessa, Patrick

    2009-05-01

    This paper explains how the R Framework (http://www.wessa.net) and a newly developed Compendium Platform (http://www.freestatistics.org) allow us to create, use, and maintain documents that contain empirical research results which can be recomputed and reused in derived work. It is illustrated that this technological innovation can be used to create educational applications that can be shown to support effective learning of statistics and associated analytical skills. It is explained how a Compendium can be created by anyone, without the need to understand the technicalities of scientific word processing (LaTeX) or statistical computing (R code). The proposed Reproducible Computing system allows educational researchers to objectively measure key aspects of the actual learning process based on individual and constructivist activities such as: peer review, collaboration in research, computational experimentation, etc. The system was implemented and tested in three statistics courses in which Compendia were used to create an interactive e-learning environment that simulated the real-world process of empirical scientific research.

  3. Reproducibility of the kinematics and kinetics of the lower extremity during normal stair-climbing.

    PubMed

    Yu, B; Kienbacher, T; Growney, E S; Johnson, M E; An, K N

    1997-05-01

    The purpose of this study was to examine the intrasubject reproducibility of the kinematic and kinetic measures of the lower extremity during normal stair-climbing. Three-dimensional video and force-plate data were collected for three trials per subject during each of three conditions: ascending, descending, and level walking. Three-dimensional angles and moments of the ankle, knee, and hip joints were calculated. The coefficient of multiple correlation was used to determine the intrasubject reproducibility of joint angles and resultant moments. Analysis of variance with repeated measures was conducted to compare the magnitudes of the coefficients between different steps, different joints, and different joint functions. The results showed that (a) generally, the kinematic and kinetic measures of normal subjects climbing stairs were reproducible; (b) the kinetic measures during the transition steps from level walking to ascending and from descending to level walking were significantly less reproducible than those during the other steps; (c) the data from the sagittal plane were more reproducible than those from the other two planes; and (d) the kinetic measures were more reproducible than the kinematic measures, especially for abduction-adduction and internal-external rotation.

  4. Reproducibility of The Abdominal and Chest Wall Position by Voluntary Breath-Hold Technique Using a Laser-Based Monitoring and Visual Feedback System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakamura, Katsumasa; Shioyama, Yoshiyuki; Nomoto, Satoru

    2007-05-01

    Purpose: The voluntary breath-hold (BH) technique is a simple method to control the respiration-related motion of a tumor during irradiation. However, the abdominal and chest wall position may not be accurately reproduced using the BH technique. The purpose of this study was to examine whether visual feedback can reduce the fluctuation in wall motion during BH using a new respiratory monitoring device. Methods and Materials: We developed a laser-based BH monitoring and visual feedback system. For this study, five healthy volunteers were enrolled. The volunteers, practicing abdominal breathing, performed shallow end-expiration BH (SEBH), shallow end-inspiration BH (SIBH), and deep end-inspiration BH (DIBH) with or without visual feedback. The abdominal and chest wall positions were measured at 80-ms intervals during BHs. Results: The fluctuation in the chest wall position was smaller than that of the abdominal wall position. The reproducibility of the wall position was improved by visual feedback. With a monitoring device, visual feedback reduced the mean deviation of the abdominal wall from 2.1 ± 1.3 mm to 1.5 ± 0.5 mm, 2.5 ± 1.9 mm to 1.1 ± 0.4 mm, and 6.6 ± 2.4 mm to 2.6 ± 1.4 mm in SEBH, SIBH, and DIBH, respectively. Conclusions: Volunteers can perform the BH maneuver in a highly reproducible fashion when informed about the position of the wall, although in the case of DIBH, the deviation in the wall position remained substantial.

  5. Comparison of methods for accurate end-point detection of potentiometric titrations

    NASA Astrophysics Data System (ADS)

    Villela, R. L. A.; Borges, P. P.; Vyskočil, L.

    2015-01-01

    Detection of the end point in potentiometric titrations has wide application in experiments that demand very low measurement uncertainties, mainly for certifying reference materials. Simulations of experimental coulometric titration data and consequential error analysis of the end-point values were conducted using a programming code. These simulations revealed that the Levenberg-Marquardt method is in general more accurate than the traditional second derivative technique currently used for end-point detection in potentiometric titrations. Performance of the methods will be compared and presented in this paper.
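    To make the comparison concrete, the sketch below contrasts the traditional second-derivative end point with a Levenberg-Marquardt fit of a full sigmoidal titration curve. The curve model, noise level, and end-point volume are illustrative assumptions; `scipy.optimize.curve_fit` is used because its default unbounded algorithm is Levenberg-Marquardt.

    ```python
    # Sketch: two end-point estimates for a simulated potentiometric titration curve.
    # The sigmoid model, noise level, and true end point (12.34 mL) are illustrative assumptions.
    import numpy as np
    from scipy.optimize import curve_fit

    def titration_curve(v, e0, de, veq, w):
        """Idealized S-shaped potential vs. titrant volume, with inflection at the end point veq."""
        return e0 + de / (1.0 + np.exp(-(v - veq) / w))

    rng = np.random.default_rng(0)
    v = np.linspace(0.0, 20.0, 201)                                 # titrant volume, mL
    e_obs = titration_curve(v, 150.0, 400.0, 12.34, 0.4) + rng.normal(0.0, 0.5, v.size)  # mV

    # 1) Traditional estimate: zero crossing of the second derivative near the steepest point.
    d1 = np.gradient(e_obs, v)
    d2 = np.gradient(d1, v)
    crossings = np.where(np.diff(np.sign(d2)) != 0)[0]
    i = crossings[np.argmin(np.abs(crossings - np.argmax(np.abs(d1))))]
    veq_second_derivative = 0.5 * (v[i] + v[i + 1])

    # 2) Levenberg-Marquardt estimate: nonlinear least-squares fit of the whole curve.
    popt, _ = curve_fit(titration_curve, v, e_obs, p0=[100.0, 300.0, 10.0, 1.0])
    veq_lm = popt[2]

    print(f"second derivative: {veq_second_derivative:.3f} mL, LM fit: {veq_lm:.3f} mL (true 12.340)")
    ```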

  6. A Viscoelastic Constitutive Model Can Accurately Represent Entire Creep Indentation Tests of Human Patella Cartilage

    PubMed Central

    Pal, Saikat; Lindsey, Derek P.; Besier, Thor F.; Beaupre, Gary S.

    2013-01-01

    Cartilage material properties provide important insights into joint health, and cartilage material models are used in whole-joint finite element models. Although the biphasic model representing experimental creep indentation tests is commonly used to characterize cartilage, cartilage short-term response to loading is generally not characterized using the biphasic model. The purpose of this study was to determine the short-term and equilibrium material properties of human patella cartilage using a viscoelastic model representation of creep indentation tests. We performed 24 experimental creep indentation tests from 14 human patellar specimens ranging in age from 20 to 90 years (median age 61 years). We used a finite element model to reproduce the experimental tests and determined cartilage material properties from viscoelastic and biphasic representations of cartilage. The viscoelastic model consistently provided excellent representation of the short-term and equilibrium creep displacements. We determined initial elastic modulus, equilibrium elastic modulus, and equilibrium Poisson’s ratio using the viscoelastic model. The viscoelastic model can represent the short-term and equilibrium response of cartilage and may easily be implemented in whole-joint finite element models. PMID:23027200

  7. The accurate assessment of small-angle X-ray scattering data

    DOE PAGES

    Grant, Thomas D.; Luft, Joseph R.; Carter, Lester G.; ...

    2015-01-23

    Small-angle X-ray scattering (SAXS) has grown in popularity in recent times with the advent of bright synchrotron X-ray sources, powerful computational resources and algorithms enabling the calculation of increasingly complex models. However, the lack of standardized data-quality metrics presents difficulties for the growing user community in accurately assessing the quality of experimental SAXS data. Here, a series of metrics to quantitatively describe SAXS data in an objective manner using statistical evaluations are defined. These metrics are applied to identify the effects of radiation damage, concentration dependence and interparticle interactions on SAXS data from a set of 27 previously described targets for which high-resolution structures have been determined via X-ray crystallography or nuclear magnetic resonance (NMR) spectroscopy. Studies show that these metrics are sufficient to characterize SAXS data quality on a small sample set with statistical rigor and sensitivity similar to or better than manual analysis. The development of data-quality analysis strategies such as these initial efforts is needed to enable the accurate and unbiased assessment of SAXS data quality.

  8. Reproducibility of Nordic Sleep Questionnaire in spinal cord injured.

    PubMed

    Biering-Sørensen, F; Biering-Sørensen, M; Hilden, J

    1994-11-01

    A recently proposed Nordic Sleep Questionnaire (NSQ) comprises 26 questions concerning qualitative and quantitative aspects of the respondent's sleep habits. Its reproducibility was evaluated in 32 spinal cord injured individuals (SCI), 24 men and eight women (23-72 years), and 79 normal subjects, 23 men and 56 women (19-77 years). They completed the NSQ twice at a median interval of 15 days (range 10-26) and 27 days (range 4-103) respectively. The group of normal subjects were evenly divided into group 26, i.e. those who completed the two NSQs within 26 days, and group 27 with 27 days or more between their replies. Generally, group 27 showed no worse test-retest agreement than group 26. In addition, the respondents' answers, with a few exceptions, were reasonably stable in terms of test-retest agreement or standard deviation. The SCI group exhibited the same level of reproducibility, although they had more 'pathology' to report and thus more scope for contradicting themselves. The questions in the NSQ generally were satisfactorily reproducible. However, answers to the ordered five-point questions about sleepiness in the morning and during the daytime ought to be interpreted with caution. The same may be said about the number of minutes required to fall asleep, and the duration of daytime naps.

  9. Using induced electroosmotic micromixer to enhance the reproducibility of chemiluminescence intensity.

    PubMed

    Chen, Hsiao-Ping; Yeh, Chun-Yi; Hung, Pei-Chin; Wang, Shau-Chun

    2014-02-01

    In this study, induced electroosmotic vortex flows were generated using an AC electric field by one pair of external electrodes to rapidly mix luminescence reagents in a 30 μL micromixer and enhance the reproducibility of chemiluminescence (CL) assays. A solution containing the catalyst reagent ferricyanide ions (4 μL) was pipetted into a reservoir containing luminol to produce CL in the presence of hydrogen peroxide. When the added ferricyanide aliquot contacted the reservoir solution, the CL began flashing, but rapidly diminished as the ferricyanide was consumed. In such a short illumination period, the solutes could not mix homogeneously. Therefore, the reproducibility of CL intensities collected using a CCD and multiple aliquot additions was determined to be inadequate. By contrast, when the solutes were efficiently mixed after adding a ferricyanide aliquot to a micromixer, the intensity reproducibility was significantly improved. When the CL temporal profile was analyzed using a PMT, a consistent improvement in reproducibility was observed between the CL intensity and estimated CL reaction rate. Replicating the proposed device would create a multiple well plate that contains a micromixer in each reservoir; this system is compatible with conventional CL instrumentation and requires no CL enhancer to slow a reaction. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Apparently low reproducibility of true differential expression discoveries in microarray studies.

    PubMed

    Zhang, Min; Yao, Chen; Guo, Zheng; Zou, Jinfeng; Zhang, Lin; Xiao, Hui; Wang, Dong; Yang, Da; Gong, Xue; Zhu, Jing; Li, Yanhui; Li, Xia

    2008-09-15

    Differentially expressed gene (DEG) lists detected from different microarray studies for the same disease are often highly inconsistent. Even in technical replicate tests using identical samples, DEG detection still shows very low reproducibility. It is often believed that current small microarray studies will largely introduce false discoveries. Based on a statistical model, we show that even in technical replicate tests using identical samples, it is highly likely that the selected DEG lists will be very inconsistent in the presence of small measurement variations. Therefore, the apparently low reproducibility of DEG detection from current technical replicate tests does not indicate low quality of microarray technology. We also demonstrate that heterogeneous biological variations existing in real cancer data will further reduce the overall reproducibility of DEG detection. Nevertheless, in small subsamples from both simulated and real data, the actual false discovery rate (FDR) for each DEG list tends to be low, suggesting that each separately determined list may comprise mostly true DEGs. Rather than simply counting the overlaps of the discovery lists from different studies for a complex disease, novel metrics are needed for evaluating the reproducibility of discoveries characterized with correlated molecular changes. Supplementary information: Supplementary data are available at Bioinformatics online.
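    The effect described above is easy to reproduce in a toy simulation: two technical replicates that share identical true effects can still produce top-k DEG lists with modest overlap, even though each list consists mostly of true positives. Gene counts, effect sizes, noise level, and list length below are illustrative assumptions, not values from the paper.

    ```python
    # Sketch: overlap of top-k "DEG" lists from two simulated technical replicates.
    # Gene count, effect sizes, noise level, and k are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(1)
    n_genes, n_true, k = 10_000, 1_000, 200
    true_effect = np.zeros(n_genes)
    true_effect[:n_true] = rng.normal(1.0, 0.5, n_true)              # truly differential genes

    def top_k_list(noise_sd=0.4):
        observed = true_effect + rng.normal(0.0, noise_sd, n_genes)  # measured log fold change
        return set(np.argsort(-np.abs(observed))[:k].tolist())       # rank by |effect|, keep top k

    list_a, list_b = top_k_list(), top_k_list()
    overlap = len(list_a & list_b) / k
    true_positive_fraction = sum(g < n_true for g in list_a) / k
    print(f"overlap between the two lists: {overlap:.2f}")
    print(f"fraction of list A that is truly differential: {true_positive_fraction:.2f}")
    ```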

  11. Inter-study reproducibility of cardiovascular magnetic resonance tagging

    PubMed Central

    2013-01-01

    Background The aim of this study is to determine the test-retest reliability of the measurement of regional myocardial function by cardiovascular magnetic resonance (CMR) tagging using spatial modulation of magnetization. Methods Twenty-five participants underwent CMR tagging twice over 12 ± 7 days. To assess the role of slice orientation on strain measurement, two healthy volunteers had a first exam, followed by image acquisition repeated with slices rotated ±15 degrees out of true short axis, followed by a second exam in the true short axis plane. To assess the role of slice location, two healthy volunteers had whole heart tagging. The harmonic phase (HARP) method was used to analyze the tagged images. Peak midwall circumferential strain (Ecc), radial strain (Err), Lambda 1, Lambda 2, and Angle α were determined in basal, mid and apical slices. LV torsion, systolic and early diastolic circumferential strain and torsion rates were also determined. Results LV Ecc and torsion had excellent intra-, interobserver, and inter-study intra-class correlation coefficients (ICC range, 0.7 to 0.9). Err, Lambda 1, Lambda 2 and Angle had better intra- and interobserver ICCs than inter-study ICCs. Angle had the least inter-study reproducibility. Torsion rates had superior intra-, interobserver, and inter-study reproducibility to strain rates. The measurements of LV Ecc were comparable in all three slices with different short axis orientations (standard deviation of mean Ecc was 0.09, 0.18 and 0.16 at basal, mid and apical slices, respectively). The mean difference in LV Ecc between slices was more pronounced in most of the basal slices compared to the rest of the heart. Conclusions Intraobserver and interobserver reproducibility of all strain and torsion parameters was excellent. Inter-study reproducibility of CMR tagging by SPAMM varied between different parameters as described in the results above and was superior for Ecc and LV torsion. The variation in LV Ecc
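    The inter-study agreement quoted here is an intraclass correlation; a minimal sketch of a two-way random-effects, absolute-agreement ICC(2,1) computed from a subjects-by-sessions table is shown below. The strain values are synthetic and only illustrate the calculation, not the study's data.

    ```python
    # Sketch: ICC(2,1) for test-retest agreement from an n-subjects x k-sessions matrix.
    # The strain values below are synthetic, for illustration only.
    import numpy as np

    def icc_2_1(x):
        """Two-way random effects, absolute agreement, single measurement (Shrout & Fleiss ICC(2,1))."""
        n, k = x.shape
        grand = x.mean()
        ms_rows = k * np.sum((x.mean(axis=1) - grand) ** 2) / (n - 1)   # between subjects
        ms_cols = n * np.sum((x.mean(axis=0) - grand) ** 2) / (k - 1)   # between sessions
        resid = x - x.mean(axis=1, keepdims=True) - x.mean(axis=0, keepdims=True) + grand
        ms_err = np.sum(resid ** 2) / ((n - 1) * (k - 1))
        return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

    # Peak circumferential strain (Ecc, %) for 6 subjects measured in 2 sessions (made-up values).
    ecc = np.array([[-18.1, -17.6], [-15.4, -16.0], [-20.2, -19.5],
                    [-17.0, -17.4], [-14.8, -15.5], [-19.0, -18.3]])
    print(f"inter-study ICC(2,1) = {icc_2_1(ecc):.2f}")
    ```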

  12. 46 CFR 56.97-38 - Initial service leak test (reproduces 137.7).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Initial service leak test (reproduces 137.7). 56.97-38... PIPING SYSTEMS AND APPURTENANCES Pressure Tests § 56.97-38 Initial service leak test (reproduces 137.7). (a) An initial service leak test and inspection is acceptable when other types of test are not...

  13. 46 CFR 54.10-5 - Maximum allowable working pressure (reproduces UG-98).

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 2 2013-10-01 2013-10-01 false Maximum allowable working pressure (reproduces UG-98). 54.10-5 Section 54.10-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PRESSURE VESSELS Inspection, Reports, and Stamping § 54.10-5 Maximum allowable working pressure (reproduces UG-98). (a) The maximum allowable...

  14. 46 CFR 54.10-5 - Maximum allowable working pressure (reproduces UG-98).

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Maximum allowable working pressure (reproduces UG-98). 54.10-5 Section 54.10-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PRESSURE VESSELS Inspection, Reports, and Stamping § 54.10-5 Maximum allowable working pressure (reproduces UG-98). (a) The maximum allowable...

  15. 46 CFR 54.10-5 - Maximum allowable working pressure (reproduces UG-98).

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 2 2014-10-01 2014-10-01 false Maximum allowable working pressure (reproduces UG-98). 54.10-5 Section 54.10-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PRESSURE VESSELS Inspection, Reports, and Stamping § 54.10-5 Maximum allowable working pressure (reproduces UG-98). (a) The maximum allowable...

  16. 46 CFR 54.10-5 - Maximum allowable working pressure (reproduces UG-98).

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 2 2012-10-01 2012-10-01 false Maximum allowable working pressure (reproduces UG-98). 54.10-5 Section 54.10-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PRESSURE VESSELS Inspection, Reports, and Stamping § 54.10-5 Maximum allowable working pressure (reproduces UG-98). (a) The maximum allowable...

  17. Experimental annotation of the human genome using microarray technology.

    PubMed

    Shoemaker, D D; Schadt, E E; Armour, C D; He, Y D; Garrett-Engele, P; McDonagh, P D; Loerch, P M; Leonardson, A; Lum, P Y; Cavet, G; Wu, L F; Altschuler, S J; Edwards, S; King, J; Tsang, J S; Schimmack, G; Schelter, J M; Koch, J; Ziman, M; Marton, M J; Li, B; Cundiff, P; Ward, T; Castle, J; Krolewski, M; Meyer, M R; Mao, M; Burchard, J; Kidd, M J; Dai, H; Phillips, J W; Linsley, P S; Stoughton, R; Scherer, S; Boguski, M S

    2001-02-15

    The most important product of the sequencing of a genome is a complete, accurate catalogue of genes and their products, primarily messenger RNA transcripts and their cognate proteins. Such a catalogue cannot be constructed by computational annotation alone; it requires experimental validation on a genome scale. Using 'exon' and 'tiling' arrays fabricated by ink-jet oligonucleotide synthesis, we devised an experimental approach to validate and refine computational gene predictions and define full-length transcripts on the basis of co-regulated expression of their exons. These methods can provide more accurate gene numbers and allow the detection of mRNA splice variants and identification of the tissue- and disease-specific conditions under which genes are expressed. We apply our technique to chromosome 22q under 69 experimental condition pairs, and to the entire human genome under two experimental conditions. We discuss implications for more comprehensive, consistent and reliable genome annotation, more efficient, full-length complementary DNA cloning strategies and application to complex diseases.

  18. A time-accurate high-resolution TVD scheme for solving the Navier-Stokes equations

    NASA Technical Reports Server (NTRS)

    Kim, Hyun Dae; Liu, Nan-Suey

    1992-01-01

    A total variation diminishing (TVD) scheme has been developed and incorporated into an existing time-accurate high-resolution Navier-Stokes code. The accuracy and the robustness of the resulting solution procedure have been assessed by performing many calculations in four different areas: shock tube flows, regular shock reflection, supersonic boundary layer, and shock boundary layer interactions. These numerical results compare well with corresponding exact solutions or experimental data.

  19. Rapid calculation of accurate atomic charges for proteins via the electronegativity equalization method.

    PubMed

    Ionescu, Crina-Maria; Geidl, Stanislav; Svobodová Vařeková, Radka; Koča, Jaroslav

    2013-10-28

    We focused on the parametrization and evaluation of empirical models for fast and accurate calculation of conformationally dependent atomic charges in proteins. The models were based on the electronegativity equalization method (EEM), and the parametrization procedure was tailored to proteins. We used large protein fragments as reference structures and fitted the EEM model parameters using atomic charges computed by three population analyses (Mulliken, Natural, iterative Hirshfeld), at the Hartree-Fock level with two basis sets (6-31G*, 6-31G**) and in two environments (gas phase, implicit solvation). We parametrized and successfully validated 24 EEM models. When tested on insulin and ubiquitin, all models reproduced quantum mechanics level charges well and were consistent with respect to population analysis and basis set. Specifically, the models showed on average a correlation of 0.961, RMSD 0.097 e, and average absolute error per atom 0.072 e. The EEM models can be used with the freely available EEM implementation EEM_SOLVER.
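    At its core, an EEM charge calculation is the solution of one small linear system per molecule: atomic electronegativities are equalized subject to a total-charge constraint. The sketch below illustrates that solve; the per-element parameters and the kappa constant are made-up placeholders, not the fitted parameters reported in the paper or shipped with EEM_SOLVER.

    ```python
    # Sketch: solving the EEM linear system for conformation-dependent atomic charges.
    # The chi/eta values and kappa are placeholders; real EEM models supply fitted parameters.
    import numpy as np

    def eem_charges(coords, chi, eta, total_charge=0.0, kappa=0.529):
        """Solve  2*eta_i*q_i + kappa*sum_j q_j/r_ij - chi_bar = -chi_i,  with  sum_i q_i = Q."""
        n = len(chi)
        a = np.zeros((n + 1, n + 1))
        b = np.zeros(n + 1)
        dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        for i in range(n):
            a[i, :n] = kappa / np.where(dist[i] > 0, dist[i], np.inf)  # off-diagonal Coulomb terms
            a[i, i] = 2.0 * eta[i]                                     # atomic hardness (diagonal)
            a[i, n] = -1.0                                             # equalized electronegativity
            b[i] = -chi[i]
        a[n, :n] = 1.0                                                 # total-charge constraint row
        b[n] = total_charge
        return np.linalg.solve(a, b)[:n]

    # Toy 3-atom "water-like" geometry (angstrom) with placeholder parameters.
    coords = np.array([[0.0, 0.0, 0.0], [0.96, 0.0, 0.0], [-0.24, 0.93, 0.0]])
    chi = np.array([3.5, 2.3, 2.3])   # O, H, H electronegativities (arbitrary units)
    eta = np.array([4.5, 6.4, 6.4])   # hardnesses (arbitrary units)
    print(eem_charges(coords, chi, eta))
    ```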

  20. Development of reactive force fields using ab initio molecular dynamics simulation minimally biased to experimental data

    NASA Astrophysics Data System (ADS)

    Chen, Chen; Arntsen, Christopher; Voth, Gregory A.

    2017-10-01

    Incorporation of quantum mechanical electronic structure data is necessary to properly capture the physics of many chemical processes. Proton hopping in water, which involves rearrangement of chemical and hydrogen bonds, is one such example of an inherently quantum mechanical process. Standard ab initio molecular dynamics (AIMD) methods, however, do not yet accurately predict the structure of water and are therefore less than optimal for developing force fields. We have instead utilized a recently developed method which minimally biases AIMD simulations to match limited experimental data to develop novel multiscale reactive molecular dynamics (MS-RMD) force fields by using relative entropy minimization. In this paper, we present two new MS-RMD models using such a parameterization: one which employs water with harmonic internal vibrations and another which uses anharmonic water. We show that the newly developed MS-RMD models very closely reproduce the solvation structure of the hydrated excess proton in the target AIMD data. We also find that the use of anharmonic water increases proton hopping, thereby increasing the proton diffusion constant.

  1. Accurate Modelling of Surface Currents and Internal Tides in a Semi-enclosed Coastal Sea

    NASA Astrophysics Data System (ADS)

    Allen, S. E.; Soontiens, N. K.; Dunn, M. B. H.; Liu, J.; Olson, E.; Halverson, M. J.; Pawlowicz, R.

    2016-02-01

    The Strait of Georgia is a deep (400 m), strongly stratified, semi-enclosed coastal sea on the west coast of North America. We have configured a baroclinic model of the Strait of Georgia and surrounding coastal waters using the NEMO ocean community model. We run daily nowcasts and forecasts and publish our sea-surface results (including storm surge warnings) to the web (salishsea.eos.ubc.ca/storm-surge). Tides in the Strait of Georgia are mixed and large. The baroclinic model and previous barotropic models accurately represent tidal sea-level variations and depth mean currents. The baroclinic model reproduces accurately the diurnal but not the semi-diurnal baroclinic tidal currents. In the Southern Strait of Georgia, strong internal tidal currents at the semi-diurnal frequency are observed. Strong semi-diurnal tides are also produced in the model, but are almost 180 degrees out of phase with the observations. In the model, in the surface, the barotropic and baroclinic tides reinforce, whereas the observations show that at the surface the baroclinic tides oppose the barotropic. As such the surface currents are very poorly modelled. Here we will present evidence of the internal tidal field from observations. We will discuss the generation regions of the tides, the necessary modifications to the model required to correct the phase, the resulting baroclinic tides and the improvements in the surface currents.

  2. Raising the bar for reproducible science at the U.S. Environmental Protection Agency Office of Research and Development.

    PubMed

    George, Barbara Jane; Sobus, Jon R; Phelps, Lara P; Rashleigh, Brenda; Simmons, Jane Ellen; Hines, Ronald N

    2015-05-01

    Considerable concern has been raised regarding research reproducibility both within and outside the scientific community. Several factors possibly contribute to a lack of reproducibility, including a failure to adequately employ statistical considerations during study design, bias in sample selection or subject recruitment, errors in developing data inclusion/exclusion criteria, and flawed statistical analysis. To address some of these issues, several publishers have developed checklists that authors must complete. Others have either enhanced statistical expertise on existing editorial boards, or formed distinct statistics editorial boards. Although the U.S. Environmental Protection Agency, Office of Research and Development, already has a strong Quality Assurance Program, an initiative was undertaken to further strengthen statistics consideration and other factors in study design and also to ensure these same factors are evaluated during the review and approval of study protocols. To raise awareness of the importance of statistical issues and provide a forum for robust discussion, a Community of Practice for Statistics was formed in January 2014. In addition, three working groups were established to develop a series of questions or criteria that should be considered when designing or reviewing experimental, observational, or modeling focused research. This article describes the process used to develop these study design guidance documents, their contents, how they are being employed by the Agency's research enterprise, and expected benefits to Agency science. The process and guidance documents presented here may be of utility for any research enterprise interested in enhancing the reproducibility of its science. © The Author 2015. Published by Oxford University Press on behalf of the Society of Toxicology.

  3. Potential utility of three-dimensional temperature and salinity fields estimated from satellite altimetry and Argo data for improving mesoscale reproducibility in regional ocean modeling

    NASA Astrophysics Data System (ADS)

    Kanki, R.; Uchiyama, Y.; Miyazaki, D.; Takano, A.; Miyazawa, Y.; Yamazaki, H.

    2014-12-01

    Mesoscale oceanic structure and variability need to be reproduced as accurately as possible in realistic regional ocean modeling. Uchiyama et al. (2012) demonstrated with a submesoscale eddy-resolving JCOPE2-ROMS downscaling oceanic modeling system that the mesoscale reproducibility of the Kuroshio meandering along Japan is significantly improved by introducing a simple restoration to data which we call "TS nudging" (a.k.a. robust diagnosis), where the prognostic temperature and salinity fields are weakly nudged four-dimensionally towards the assimilative JCOPE2 reanalysis (Miyazawa et al., 2009). However, a reliable reanalysis is not always available for oceanic downscaling in an arbitrary region and at an arbitrary time, and therefore an alternative dataset should be prepared. Takano et al. (2009) proposed an empirical method to estimate mesoscale 3-D thermal structure from the near real-time AVISO altimetry data along with the ARGO float data, based on the two-layer model of Goni et al. (1996). In the present study, we consider the TS data derived from this method as a candidate. We thus conduct a synoptic forward modeling of the Kuroshio using the JCOPE2-ROMS downscaling system to explore the potential utility of this empirical TS dataset (hereinafter TUM-TS) by carrying out two runs with the TS nudging towards 1) the JCOPE2-TS and 2) TUM-TS fields. An example of the comparison between the two ROMS test runs is shown in the attached figure showing the annually averaged surface EKE. Both TUM-TS and JCOPE2-TS are found to help reproduce the mesoscale variance of the Kuroshio and its extension, as well as its mean paths, surface KE and EKE, reasonably well. Therefore, the AVISO-ARGO derived empirical 3-D TS estimation is potentially exploitable as the dataset for the TS nudging to reproduce mesoscale oceanic structure.
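    The restoration term behind "TS nudging" is simply a weak linear relaxation of the prognostic tracers toward the reference fields at every time step. A minimal sketch is given below; the array shapes, time step, and 60-day relaxation time scale are illustrative assumptions rather than the settings used in the JCOPE2-ROMS system.

    ```python
    # Sketch: weak four-dimensional restoration ("TS nudging") of model tracers toward a reference.
    # Array shapes, the time step, and the 60-day relaxation scale are illustrative assumptions.
    import numpy as np

    def apply_ts_nudging(tracer, tracer_ref, dt_seconds, tau_days=60.0):
        """Relax model temperature or salinity toward a reference (e.g., reanalysis or altimetry/Argo-derived) field."""
        tau = tau_days * 86400.0
        return tracer - dt_seconds * (tracer - tracer_ref) / tau

    # Example: one step on a toy 3-D temperature field with a 90-second baroclinic time step.
    temp_model = np.full((40, 128, 128), 15.0)   # degC, dimensions (depth, y, x)
    temp_ref = temp_model + 0.5                  # reference is uniformly 0.5 degC warmer
    temp_model = apply_ts_nudging(temp_model, temp_ref, dt_seconds=90.0)
    print(temp_model[0, 0, 0])                   # nudged slightly toward the reference
    ```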

  4. On the accurate analysis of vibroacoustics in head insert gradient coils.

    PubMed

    Winkler, Simone A; Alejski, Andrew; Wade, Trevor; McKenzie, Charles A; Rutt, Brian K

    2017-10-01

    To accurately analyze vibroacoustics in MR head gradient coils. A detailed theoretical model for gradient coil vibroacoustics, including the first description and modeling of Lorentz damping, is introduced and implemented in a multiphysics software package. Numerical finite-element method simulations were used to establish a highly accurate vibroacoustic model in head gradient coils in detail, including the newly introduced Lorentz damping effect. Vibroacoustic coupling was examined through an additional modal analysis. Thorough experimental studies were used to validate simulations. Average experimental sound pressure levels (SPLs) and accelerations over the 0-3000 Hz frequency range were 97.6 dB, 98.7 dB, and 95.4 dB, as well as 20.6 g, 8.7 g, and 15.6 g for the X-, Y-, and Z-gradients, respectively. A reasonable agreement between simulations and measurements was achieved. Vibroacoustic coupling showed a coupled resonance at 2300 Hz for the Z-gradient that is responsible for a sharp peak and the highest SPL value in the acoustic spectrum. We have developed and used more realistic multiphysics simulation methods to gain novel insights into the underlying concepts for vibroacoustics in head gradient coils, which will permit improved analyses of existing gradient coils and novel SPL reduction strategies for future gradient coil designs. Magn Reson Med 78:1635-1645, 2017. © 2016 International Society for Magnetic Resonance in Medicine.

  5. Properties of galaxies reproduced by a hydrodynamic simulation.

    PubMed

    Vogelsberger, M; Genel, S; Springel, V; Torrey, P; Sijacki, D; Xu, D; Snyder, G; Bird, S; Nelson, D; Hernquist, L

    2014-05-08

    Previous simulations of the growth of cosmic structures have broadly reproduced the 'cosmic web' of galaxies that we see in the Universe, but failed to create a mixed population of elliptical and spiral galaxies, because of numerical inaccuracies and incomplete physical models. Moreover, they were unable to track the small-scale evolution of gas and stars to the present epoch within a representative portion of the Universe. Here we report a simulation that starts 12 million years after the Big Bang, and traces 13 billion years of cosmic evolution with 12 billion resolution elements in a cube of 106.5 megaparsecs a side. It yields a reasonable population of ellipticals and spirals, reproduces the observed distribution of galaxies in clusters and characteristics of hydrogen on large scales, and at the same time matches the 'metal' and hydrogen content of galaxies on small scales.

  6. Evaluation of guidewire path reproducibility

    PubMed Central

    Schafer, Sebastian; Hoffmann, Kenneth R.; Noël, Peter B.; Ionita, Ciprian N.; Dmochowski, Jacek

    2008-01-01

    The number of minimally invasive vascular interventions is increasing. In these interventions, a variety of devices are directed to and placed at the site of intervention. The device used in almost all of these interventions is the guidewire, acting as a monorail for all devices which are delivered to the intervention site. However, even with the guidewire in place, clinicians still experience difficulties during the interventions. As a first step toward understanding these difficulties and facilitating guidewire and device guidance, we have investigated the reproducibility of the final paths of the guidewire in vessel phantom models with respect to different factors: user, materials and geometry. Three vessel phantoms (vessel diameters ∼4 mm) were constructed from silicon tubing, with tortuosity similar to the internal carotid artery, and encased in Sylgard elastomer. Several trained users repeatedly passed two guidewires of different flexibility through the phantoms under pulsatile flow conditions. After the guidewire had been placed, rotational c-arm image sequences were acquired (9 in. II mode, 0.185 mm pixel size), and the phantom and guidewire were reconstructed (512³ voxels, 0.288 mm voxel size). The reconstructed volumes were aligned. The centerlines of the guidewire and the phantom vessel were then determined using region-growing techniques. Guidewire paths appear similar across users but not across materials. The average root mean square difference of the repeated placement was 0.17±0.02 mm (plastic-coated guidewire), 0.73±0.55 mm (steel guidewire) and 1.15±0.65 mm (steel versus plastic-coated). For a given guidewire, these results indicate that the guidewire path is relatively reproducible in shape and position. PMID:18561663
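    The path-reproducibility numbers above reduce to a root-mean-square distance between repeated centerlines after volume alignment. The sketch below assumes the centerlines have already been aligned and resampled at matched arclength positions, which is a simplification of the full registration pipeline.

    ```python
    # Sketch: RMS difference between two guidewire centerlines, assumed already aligned
    # and resampled at the same arclength positions (both assumptions are simplifications).
    import numpy as np

    def rms_path_difference(path_a, path_b):
        """Root-mean-square point-to-point distance between two (n, 3) centerlines, in mm."""
        return float(np.sqrt(np.mean(np.sum((path_a - path_b) ** 2, axis=1))))

    # Toy example: the second placement deviates from the first by ~0.2 mm of noise per axis.
    rng = np.random.default_rng(2)
    s = np.linspace(0.0, 100.0, 200)                             # arclength, mm
    path1 = np.column_stack([s, 5.0 * np.sin(s / 20.0), np.zeros_like(s)])
    path2 = path1 + rng.normal(0.0, 0.2, path1.shape)
    print(f"RMS difference: {rms_path_difference(path1, path2):.2f} mm")
    ```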

  7. Reconstruction of dynamic structures of experimental setups based on measurable experimental data only

    NASA Astrophysics Data System (ADS)

    Chen, Tian-Yu; Chen, Yang; Yang, Hu-Jiang; Xiao, Jing-Hua; Hu, Gang

    2018-03-01

    Nowadays, massive amounts of data have accumulated in a wide variety of fields, and it has become one of the central issues in interdisciplinary research to analyze existing data and extract as much useful information as possible from them. Often the output data of a system are measurable while the dynamic structure producing these data is hidden; studies that reveal system structures by analyzing available data, i.e., reconstructions of systems, have therefore become one of the most important tasks of information extraction. In the past, most of the work in this area was based on theoretical analyses and numerical verifications; direct analyses of experimental data are very rare. In physical science, most analyses of experimental setups have been based on the first principles of physics, i.e., so-called top-down analyses. In this paper, we conducted an experiment on the "Boer resonant instrument for forced vibration" (BRIFV) and inferred the dynamic structure of the experimental setup purely from the analysis of the measurable experimental data, i.e., by applying a bottom-up strategy. The dynamics of the experimental setup are strongly nonlinear and chaotic, and subject to inevitable noise. We propose using high-order correlation computations to treat the nonlinear dynamics and two-time correlations to treat noise effects. By applying these approaches, we have successfully reconstructed the structure of the experimental setup, and the dynamic system reconstructed from the measured data reproduces the experimental results well over a wide range of parameters.

  8. Highly accurate apparatus for electrochemical characterization of the felt electrodes used in redox flow batteries

    NASA Astrophysics Data System (ADS)

    Park, Jong Ho; Park, Jung Jin; Park, O. Ok; Jin, Chang-Soo; Yang, Jung Hoon

    2016-04-01

    Because of the rise in renewable energy use, the redox flow battery (RFB) has attracted extensive attention as an energy storage system. Thus, many studies have focused on improving the performance of the felt electrodes used in RFBs. However, existing analysis cells are unsuitable for characterizing felt electrodes because of their complex 3-dimensional structure. Analysis is also greatly affected by the measurement conditions, viz. compression ratio, contact area, and contact strength between the felt and current collector. To address the growing need for practical analytical apparatus, we report a new analysis cell for accurate electrochemical characterization of felt electrodes under various conditions, and compare it with previous ones. In this cell, the measurement conditions can be exhaustively controlled with a compression supporter. The cell showed excellent reproducibility in cyclic voltammetry analysis and the results agreed well with actual RFB charge-discharge performance.

  9. Accurate electron gun-positioning mechanism for electron beam-mapping of large cross-section magnetic surfaces

    NASA Astrophysics Data System (ADS)

    Anderson, F. S. B.; Middleton, F.; Colchin, R. J.; Million, D.

    1989-04-01

    A method of accurately supporting and positioning an electron source inside a large cross-sectional area magnetic field which provides very low electron beam occlusion is reported. The application of electrical discharge machining to the fabrication of a 1-m truss support structure has provided an extremely long, rigid and mechanically strong electron gun support. Reproducible electron gun positioning to within 1 mm has been achieved at any location within a 1×0.6-m² area. The extremely thin sections of the support truss (≤1.5 mm) have kept the electron beam occlusion to less than 3 mm. The support and drive mechanism have been designed and fabricated at the University of Wisconsin for application to the mapping of the magnetic surface structure of the Advanced Toroidal Facility torsatron at the Oak Ridge National Laboratory.

  10. Exchange-Hole Dipole Dispersion Model for Accurate Energy Ranking in Molecular Crystal Structure Prediction.

    PubMed

    Whittleton, Sarah R; Otero-de-la-Roza, A; Johnson, Erin R

    2017-02-14

    Accurate energy ranking is a key facet to the problem of first-principles crystal-structure prediction (CSP) of molecular crystals. This work presents a systematic assessment of B86bPBE-XDM, a semilocal density functional combined with the exchange-hole dipole moment (XDM) dispersion model, for energy ranking using 14 compounds from the first five CSP blind tests. Specifically, the set of crystals studied comprises 11 rigid, planar compounds and 3 co-crystals. The experimental structure was correctly identified as the lowest in lattice energy for 12 of the 14 total crystals. One of the exceptions is 4-hydroxythiophene-2-carbonitrile, for which the experimental structure was correctly identified once a quasi-harmonic estimate of the vibrational free-energy contribution was included, evidencing the occasional importance of thermal corrections for accurate energy ranking. The other exception is an organic salt, where charge-transfer error (also called delocalization error) is expected to cause the base density functional to be unreliable. Provided the choice of base density functional is appropriate and an estimate of temperature effects is used, XDM-corrected density-functional theory is highly reliable for the energetic ranking of competing crystal structures.

  11. Exploring the reproducibility of functional connectivity alterations in Parkinson’s disease

    PubMed Central

    Onu, Mihaela; Wu, Tao; Roceanu, Adina; Bajenaru, Ovidiu

    2017-01-01

    Since anatomic MRI is presently not able to directly discern neuronal loss in Parkinson’s Disease (PD), studying the associated functional connectivity (FC) changes seems a promising approach toward developing non-invasive and non-radioactive neuroimaging markers for this disease. While several groups have reported such FC changes in PD, there are also significant discrepancies between studies. Investigating the reproducibility of PD-related FC changes on independent datasets is therefore of crucial importance. We acquired resting-state fMRI scans for 43 subjects (27 patients and 16 normal controls, with 2 replicate scans per subject) and compared the observed FC changes with those obtained in two independent datasets, one made available by the PPMI consortium (91 patients, 18 controls) and a second one by the group of Tao Wu (20 patients, 20 controls). Unfortunately, PD-related functional connectivity changes turned out to be non-reproducible across datasets. This could be due to disease heterogeneity, but also to technical differences. To distinguish between the two, we devised a method to directly check for disease heterogeneity using random splits of a single dataset. Since we still observe non-reproducibility in a large fraction of random splits of the same dataset, we conclude that functional heterogeneity may be a dominating factor behind the lack of reproducibility of FC alterations in different rs-fMRI studies of PD. While global PD-related functional connectivity changes were non-reproducible across datasets, we identified a few individual brain region pairs with marginally consistent FC changes across all three datasets. However, training classifiers on each one of the three datasets to discriminate PD scans from controls produced only low accuracies on the remaining two test datasets. Moreover, classifiers trained and tested on random splits of the same dataset (which are technically homogeneous) also had low test accuracies, directly substantiating

  12. Reproducibility of 3D kinematics and surface electromyography measurements of mastication.

    PubMed

    Remijn, Lianne; Groen, Brenda E; Speyer, Renée; van Limbeek, Jacques; Nijhuis-van der Sanden, Maria W G

    2016-03-01

    The aim of this study was to determine the measurement reproducibility for a procedure evaluating the mastication process and to estimate the smallest detectable differences of 3D kinematic and surface electromyography (sEMG) variables. Kinematics of mandible movements and sEMG activity of the masticatory muscles were obtained over two sessions with four conditions: two food textures (biscuit and bread) of two sizes (small and large). Twelve healthy adults (mean age 29.1 years) completed the study. The second to the fifth chewing cycle of 5 bites were used for analyses. The reproducibility per outcome variable was calculated with an intraclass correlation coefficient (ICC), and a Bland-Altman analysis was applied to determine the standard error of measurement, relative error of measurement, and smallest detectable differences of all variables. ICCs ranged from 0.71 to 0.98 for all outcome variables. The outcome variables consisted of four bite and fourteen chewing cycle variables. The relative standard error of measurement of the bite variables was up to 17.3% for 'time-to-swallow', 'time-to-transport' and 'number of chewing cycles', but ranged from 31.5% to 57.0% for 'change of chewing side'. The relative standard error of measurement ranged from 4.1% to 24.7% for chewing cycle variables and was smaller for kinematic variables than for sEMG variables. In general, 3D kinematics and sEMG are reproducible techniques to assess the mastication process. The duration of the chewing cycle and the frequency of chewing were the most reproducible measurements. Change of chewing side could not be reproduced. The published measurement error and smallest detectable differences will aid the interpretation of the results of future clinical studies using the same study variables. Copyright © 2015 Elsevier Inc. All rights reserved.
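    The measurement error and smallest detectable difference reported here follow from the ICC and the between-subject spread. A minimal sketch is shown below; the chewing-cycle durations are synthetic, and the common conventions SEM = SD·sqrt(1 − ICC) and SDD = 1.96·sqrt(2)·SEM are assumed to match the paper's definitions.

    ```python
    # Sketch: standard error of measurement (SEM), relative SEM, and smallest detectable
    # difference (SDD) from test-retest data. Values are synthetic; the SEM and SDD
    # conventions used are assumptions about the paper's definitions.
    import numpy as np

    def sem_and_sdd(session1, session2, icc):
        pooled = np.concatenate([session1, session2])
        sem = pooled.std(ddof=1) * np.sqrt(1.0 - icc)
        sdd = 1.96 * np.sqrt(2.0) * sem
        relative_sem = 100.0 * sem / pooled.mean()
        return sem, relative_sem, sdd

    # Chewing-cycle duration (s) for 8 subjects, two sessions (made-up numbers).
    s1 = np.array([0.72, 0.65, 0.80, 0.70, 0.68, 0.75, 0.77, 0.66])
    s2 = np.array([0.70, 0.67, 0.78, 0.73, 0.66, 0.74, 0.79, 0.68])
    sem, rel_sem, sdd = sem_and_sdd(s1, s2, icc=0.90)
    print(f"SEM = {sem:.3f} s ({rel_sem:.1f}%), SDD = {sdd:.3f} s")
    ```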

  13. Time-Accurate Solutions of Incompressible Navier-Stokes Equations for Potential Turbopump Applications

    NASA Technical Reports Server (NTRS)

    Kiris, Cetin; Kwak, Dochan

    2001-01-01

    Two numerical procedures, one based on the artificial compressibility method and the other on the pressure projection method, are outlined for obtaining time-accurate solutions of the incompressible Navier-Stokes equations. The performance of the two methods is compared by obtaining unsteady solutions for the evolution of twin vortices behind a flat plate. Calculated results are compared with experimental and other numerical results. For an unsteady flow which requires a small physical time step, the pressure projection method was found to be computationally efficient since it does not require any subiteration procedure. It was observed that the artificial compressibility method requires a fast convergence scheme at each physical time step in order to satisfy the incompressibility condition. This was obtained by using a GMRES-ILU(0) solver in our computations. When a line-relaxation scheme was used, the time accuracy was degraded and time-accurate computations became very expensive.

  14. Towards Accurate Application Characterization for Exascale (APEX)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hammond, Simon David

    Sandia National Laboratories has been engaged in hardware and software codesign activities for a number of years; indeed, it might be argued that prototyping of clusters as far back as the CPLANT machines and many large capability resources including ASCI Red and RedStorm were examples of codesigned solutions. As the research supporting our codesign activities has moved closer to investigating on-node runtime behavior, a natural hunger has grown for detailed analysis of both hardware and algorithm performance from the perspective of low-level operations. The Application Characterization for Exascale (APEX) LDRD was a project conceived to address some of these concerns. Primarily, the research was intended to focus on generating accurate and reproducible low-level performance metrics using tools that could scale to production-class code bases. Alongside this research was an advocacy and analysis role associated with evaluating tools for production use, working with leading industry vendors to develop and refine solutions required by our code teams, and directly engaging with production code developers to form a context for the application analysis and a bridge to the research community within Sandia. On each of these accounts significant progress has been made, particularly, as this report will cover, in the low-level analysis of operations for important classes of algorithms. This report summarizes the development of a collection of tools under the APEX research program and leaves to other SAND and L2 milestone reports the description of codesign progress with Sandia's production users/developers.

  15. The Harm Done to Reproducibility by the Culture of Null Hypothesis Significance Testing.

    PubMed

    Lash, Timothy L

    2017-09-15

    In the last few years, stakeholders in the scientific community have raised alarms about a perceived lack of reproducibility of scientific results. In reaction, guidelines for journals have been promulgated and grant applicants have been asked to address the rigor and reproducibility of their proposed projects. Neither solution addresses a primary culprit, which is the culture of null hypothesis significance testing that dominates statistical analysis and inference. In an innovative research enterprise, selection of results for further evaluation based on null hypothesis significance testing is doomed to yield a low proportion of reproducible results and a high proportion of effects that are initially overestimated. In addition, the culture of null hypothesis significance testing discourages quantitative adjustments to account for systematic errors and quantitative incorporation of prior information. These strategies would otherwise improve reproducibility and have not been previously proposed in the widely cited literature on this topic. Without discarding the culture of null hypothesis significance testing and implementing these alternative methods for statistical analysis and inference, all other strategies for improving reproducibility will yield marginal gains at best. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
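    The core claim, that significance-based selection yields inflated initial effects and a low replication rate, can be made concrete with a small simulation. The true effect size, group size, and alpha level below are illustrative choices only.

    ```python
    # Sketch: selecting results by p < 0.05 inflates the initially reported effects and
    # lowers the chance that an identical replication is again significant.
    # Effect size, sample sizes, alpha, and study count are illustrative assumptions.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    true_effect, n, alpha, n_studies = 0.2, 25, 0.05, 5_000

    def one_study():
        a = rng.normal(true_effect, 1.0, n)   # treatment group
        b = rng.normal(0.0, 1.0, n)           # control group
        t, p = stats.ttest_ind(a, b)
        return a.mean() - b.mean(), p

    effects, pvals = np.array([one_study() for _ in range(n_studies)]).T
    selected = pvals < alpha
    replicated = np.array([one_study()[1] < alpha for _ in range(selected.sum())])

    print(f"power: {selected.mean():.2f}")
    print(f"mean effect among significant originals: {effects[selected].mean():.2f} (true {true_effect})")
    print(f"fraction of significant originals that replicate: {replicated.mean():.2f}")
    ```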

  16. Bottom-up coarse-grained models that accurately describe the structure, pressure, and compressibility of molecular liquids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunn, Nicholas J. H.; Noid, W. G., E-mail: wnoid@chem.psu.edu

    2015-12-28

    The present work investigates the capability of bottom-up coarse-graining (CG) methods for accurately modeling both structural and thermodynamic properties of all-atom (AA) models for molecular liquids. In particular, we consider 1, 2, and 3-site CG models for heptane, as well as 1 and 3-site CG models for toluene. For each model, we employ the multiscale coarse-graining method to determine interaction potentials that optimally approximate the configuration dependence of the many-body potential of mean force (PMF). We employ a previously developed “pressure-matching” variational principle to determine a volume-dependent contribution to the potential, U_V(V), that approximates the volume-dependence of the PMF. We demonstrate that the resulting CG models describe AA density fluctuations with qualitative, but not quantitative, accuracy. Accordingly, we develop a self-consistent approach for further optimizing U_V, such that the CG models accurately reproduce the equilibrium density, compressibility, and average pressure of the AA models, although the CG models still significantly underestimate the atomic pressure fluctuations. Additionally, by comparing this array of models that accurately describe the structure and thermodynamic pressure of heptane and toluene at a range of different resolutions, we investigate the impact of bottom-up coarse-graining upon thermodynamic properties. In particular, we demonstrate that U_V accounts for the reduced cohesion in the CG models. Finally, we observe that bottom-up coarse-graining introduces subtle correlations between the resolution, the cohesive energy density, and the “simplicity” of the model.

  17. Extreme Rainfall Events Over Southern Africa: Assessment of a Climate Model to Reproduce Daily Extremes

    NASA Astrophysics Data System (ADS)

    Williams, C.; Kniveton, D.; Layberry, R.

    2007-12-01

    It is increasingly accepted that any possible climate change will not only have an influence on mean climate but may also significantly alter climatic variability. This issue is of particular importance for environmentally vulnerable regions such as southern Africa. The subcontinent is considered especially vulnerable to extreme events, due to a number of factors including extensive poverty, disease and political instability. Rainfall variability and the identification of rainfall extremes is a function of scale, so high spatial and temporal resolution data are preferred to identify extreme events and accurately predict future variability. The majority of previous climate model verification studies have compared model output with observational data at monthly timescales. In this research, the assessment of a state-of-the-art climate model to simulate climate at daily timescales is carried out using satellite derived rainfall data from the Microwave Infra-Red Algorithm (MIRA). This dataset covers the period from 1993-2002 and the whole of southern Africa at a spatial resolution of 0.1 degree longitude/latitude. Once the model's ability to reproduce extremes has been assessed, idealised regions of SST anomalies are used to force the model, with the overall aim of investigating the ways in which SST anomalies influence rainfall extremes over southern Africa. In this paper, results from sensitivity testing of the UK Meteorological Office Hadley Centre's climate model's domain size are firstly presented. Then simulations of current climate from the model, operating in both regional and global mode, are compared to the MIRA dataset at daily timescales. Thirdly, the ability of the model to reproduce daily rainfall extremes will be assessed, again by a comparison with extremes from the MIRA dataset. Finally, the results from the idealised SST experiments are briefly presented, suggesting associations between rainfall extremes and both local and remote SST anomalies.

  18. Fabrication of reproducible, integration-compatible hybrid molecular/si electronics.

    PubMed

    Yu, Xi; Lovrinčić, Robert; Kraynis, Olga; Man, Gabriel; Ely, Tal; Zohar, Arava; Toledano, Tal; Cahen, David; Vilan, Ayelet

    2014-12-29

    Reproducible molecular junctions can be integrated within standard CMOS technology. Metal-molecule-semiconductor junctions are fabricated by direct Si-C binding of hexadecane or methyl-styrene onto oxide-free H-Si(111) surfaces, with the lateral size of the junctions defined by an etched SiO2 well and with evaporated Pb as the top contact. The current density, J, is highly reproducible with a standard deviation in log(J) of 0.2 over a junction diameter change from 3 to 100 μm. Reproducibility over such a large range indicates that transport is truly across the molecules and does not result from artifacts like edge effects or defects in the molecular monolayer. Device fabrication is tested for two n-Si doping levels. With highly doped Si, transport is dominated by tunneling and reveals sharp conductance onsets at room temperature. Using the temperature dependence of current across medium-doped n-Si, the molecular tunneling barrier can be separated from the Si-Schottky one, which is a 0.47 eV, in agreement with the molecular-modified surface dipole and quite different from the bare Si-H junction. This indicates that Pb evaporation does not cause significant chemical changes to the molecules. The ability to manufacture reliable devices constitutes important progress toward possible future hybrid Si-based molecular electronics. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Improved estimates of fixed reproducible tangible wealth, 1929-95

    DOT National Transportation Integrated Search

    1997-05-01

    This article presents revised estimates of the value of fixed reproducible tangible wealth in the United States for 1929-95; these estimates incorporate the definitional and statistical improvements introduced in last year's comprehensive revis...

  20. A new approach to compute accurate velocity of meteors

    NASA Astrophysics Data System (ADS)

    Egal, Auriane; Gural, Peter; Vaubaillon, Jeremie; Colas, Francois; Thuillot, William

    2016-10-01

    The CABERNET project was designed to push the limits of meteoroid orbit measurements by improving the determination of the meteors' velocities. Indeed, despite the development of camera networks dedicated to the observation of meteors, there is still an important discrepancy between the measured orbits of meteoroids and theoretical results. The gap between the observed and theoretical semi-major axes of the orbits is especially significant; an accurate determination of the orbits of meteoroids therefore largely depends on the computation of the pre-atmospheric velocities. It is then imperative to work out how to increase the precision of the velocity measurements. In this work, we perform an analysis of different methods currently used to compute the velocities and trajectories of meteors. They are based on the intersecting planes method developed by Ceplecha (1987), the least squares method of Borovicka (1990), and the multi-parameter fitting (MPF) method published by Gural (2012). In order to objectively compare the performances of these techniques, we have simulated realistic meteors ('fakeors') reproducing the different measurement errors of many camera networks. Some fakeors are built following the propagation models studied by Gural (2012), and others are created by numerical integrations using the Borovicka et al. 2007 model. Different optimization techniques have also been investigated in order to pick the most suitable one for solving the MPF, and the influence of the geometry of the trajectory on the result is also presented. We will present here the results of an improved implementation of the multi-parameter fitting that allows accurate orbit computation of meteors with CABERNET. The comparison of different velocity computations seems to show that while the MPF is by far the best method to solve for the trajectory and the velocity of a meteor, the ill-conditioning of the cost functions used can lead to large estimation errors for noisy

  1. An accurate density functional theory based estimation of pK(a) values of polar residues combined with experimental data: from amino acids to minimal proteins.

    PubMed

    Matsui, Toru; Baba, Takeshi; Kamiya, Katsumasa; Shigeta, Yasuteru

    2012-03-28

    We report a scheme for estimating the acid dissociation constant (pK(a)) based on quantum-chemical calculations combined with a polarizable continuum model, where a parameter is determined for small reference molecules. We calculated the pK(a) values of variously sized molecules ranging from an amino acid to a protein consisting of 300 atoms. This scheme enabled us to derive a semiquantitative pK(a) value of specific chemical groups and discuss the influence of the surroundings on the pK(a) values. As applications, we have derived the pK(a) value of the side chain of an amino acid and almost reproduced the experimental value. By using our computing schemes, we showed the influence of hydrogen bonds on the pK(a) values in the case of tripeptides, which decreases the pK(a) value by 3.0 units for serine in comparison with those of the corresponding monopeptides. Finally, with some assumptions, we derived the pK(a) values of tyrosines and serines in chignolin and a tryptophan cage. We obtained quite different pK(a) values of adjacent serines in the tryptophan cage; the pK(a) value of the OH group of Ser13 exposed to bulk water is 14.69, whereas that of Ser14 not exposed to bulk water is 20.80 because of the internal hydrogen bonds.
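    The link between a computed deprotonation free energy and a pK(a) is the standard relation pK(a) = ΔG/(RT ln 10); the scheme described above additionally applies a correction parameter fitted to small reference molecules. The sketch below shows only the bare conversion, with a placeholder ΔG value, not the paper's full protocol.

    ```python
    # Sketch: converting a deprotonation free energy into a pKa via pKa = dG / (RT ln 10).
    # The dG value below is a placeholder; the paper additionally applies a parameter
    # fitted to small reference molecules, which is not reproduced here.
    import math

    R = 8.314462618e-3   # gas constant, kJ/(mol K)
    T = 298.15           # temperature, K

    def pka_from_dg(dg_kj_per_mol):
        return dg_kj_per_mol / (R * T * math.log(10.0))

    print(f"dG = 60 kJ/mol  ->  pKa ~ {pka_from_dg(60.0):.1f}")
    ```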

  2. Accurate coarse-grained models for mixtures of colloids and linear polymers under good-solvent conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D’Adamo, Giuseppe, E-mail: giuseppe.dadamo@sissa.it; Pelissetto, Andrea, E-mail: andrea.pelissetto@roma1.infn.it; Pierleoni, Carlo, E-mail: carlo.pierleoni@aquila.infn.it

    2014-12-28

    A coarse-graining strategy, previously developed for polymer solutions, is extended here to mixtures of linear polymers and hard-sphere colloids. In this approach, groups of monomers are mapped onto a single pseudoatom (a blob) and the effective blob-blob interactions are obtained by requiring the model to reproduce some large-scale structural properties in the zero-density limit. We show that an accurate parametrization of the polymer-colloid interactions is obtained by simply introducing pair potentials between blobs and colloids. For the coarse-grained (CG) model in which polymers are modelled as four-blob chains (tetramers), the pair potentials are determined by means of the iterative Boltzmann inversion scheme, taking full-monomer (FM) pair correlation functions at zero-density as targets. For a larger number n of blobs, pair potentials are determined by using a simple transferability assumption based on the polymer self-similarity. We validate the model by comparing its predictions with full-monomer results for the interfacial properties of polymer solutions in the presence of a single colloid and for thermodynamic and structural properties in the homogeneous phase at finite polymer and colloid density. The tetramer model is quite accurate for q ≲ 1 (q = R̂_g/R_c, where R̂_g is the zero-density polymer radius of gyration and R_c is the colloid radius) and reasonably good also for q = 2. For q = 2, an accurate coarse-grained description is obtained by using the n = 10 blob model. We also compare our results with those obtained by using single-blob models with state-dependent potentials.

  3. RICO: A NEW APPROACH FOR FAST AND ACCURATE REPRESENTATION OF THE COSMOLOGICAL RECOMBINATION HISTORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fendt, W. A.; Wandelt, B. D.; Chluba, J.

    2009-04-15

    We present RICO, a code designed to compute the ionization fraction of the universe during the epoch of hydrogen and helium recombination with an unprecedented combination of speed and accuracy. This is accomplished by training the machine learning code PICO on the calculations of a multilevel cosmological recombination code which self-consistently includes several physical processes that were neglected previously. After training, RICO is used to fit the free electron fraction as a function of the cosmological parameters. While, for example, at low redshifts (z ≲ 900), much of the net change in the ionization fraction can be captured by lowering the hydrogen fudge factor in RECFAST by about 3%, RICO provides a means of effectively using the accurate ionization history of the full recombination code in the standard cosmological parameter estimation framework without the need to add new or refined fudge factors or functions to a simple recombination model. Within the new approach presented here, it is easy to update RICO whenever a more accurate full recombination code becomes available. Once trained, RICO computes the cosmological ionization history with negligible fitting error in ≈10 ms, a speedup of at least 10⁶ over the full recombination code that was used here. Also RICO is able to reproduce the ionization history of the full code to a level well below 0.1%, thereby ensuring that the theoretical power spectra of cosmic microwave background (CMB) fluctuations can be computed to sufficient accuracy and speed for analysis from upcoming CMB experiments like Planck. Furthermore, it will enable cross-checking different recombination codes across cosmological parameter space, a comparison that will be very important in order to assure the accurate interpretation of future CMB data.

  4. Accurate ab initio Quartic Force Fields of Cyclic and Bent HC2N Isomers

    NASA Technical Reports Server (NTRS)

    Inostroza, Natalia; Huang, Xinchuan; Lee, Timothy J.

    2012-01-01

    Highly correlated ab initio quartic force fields (QFFs) are used to calculate the equilibrium structures and predict the spectroscopic parameters of three HC2N isomers. Specifically, the ground state quasilinear triplet and the lowest cyclic and bent singlet isomers are included in the present study. Extensive treatment of correlation effects was included using the singles and doubles coupled-cluster method that includes a perturbational estimate of the effects of connected triple excitations, denoted CCSD(T). Dunning's correlation-consistent basis sets cc-pVXZ, X=3,4,5, were used, and a three-point formula for extrapolation to the one-particle basis set limit was used. Core-correlation and scalar relativistic corrections were also included to yield highly accurate QFFs. The QFFs were used together with second-order perturbation theory (with proper treatment of Fermi resonances) and variational methods to solve the nuclear Schrödinger equation. The quasilinear nature of the triplet isomer is problematic, and it is concluded that a QFF is not adequate to describe properly all of the fundamental vibrational frequencies and spectroscopic constants (though some constants not dependent on the bending motion are well reproduced by perturbation theory). On the other hand, this procedure (a QFF together with either perturbation theory or variational methods) leads to highly accurate fundamental vibrational frequencies and spectroscopic constants for the cyclic and bent singlet isomers of HC2N. All three isomers possess significant dipole moments, 3.05D, 3.06D, and 1.71D, for the quasilinear triplet, the cyclic singlet, and the bent singlet isomers, respectively. It is concluded that the spectroscopic constants determined for the cyclic and bent singlet isomers are the most accurate available, and it is hoped that these will be useful in the interpretation of high-resolution astronomical observations or laboratory experiments.
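
    The abstract refers to a three-point extrapolation of the X=3,4,5 basis set energies to the one-particle basis set limit. The record does not state which formula was used; the sketch below assumes one commonly used mixed exponential/Gaussian form, E(X) = E_CBS + A·exp(-(X-1)) + B·exp(-(X-1)^2), which is linear in the unknowns and therefore determined exactly by three energies. The example energies are hypothetical.

```python
import numpy as np

def extrapolate_cbs(energies, X=(3, 4, 5)):
    """Three-point extrapolation E(X) = E_CBS + A*exp(-(X-1)) + B*exp(-(X-1)**2).

    One commonly used mixed exponential/Gaussian form (an assumption here; the
    paper does not specify its formula). Linear in (E_CBS, A, B), so three
    energies determine the extrapolated limit by a direct 3x3 solve.
    """
    X = np.asarray(X, dtype=float)
    M = np.column_stack([np.ones_like(X),
                         np.exp(-(X - 1)),
                         np.exp(-(X - 1) ** 2)])
    e_cbs, a, b = np.linalg.solve(M, np.asarray(energies, dtype=float))
    return e_cbs

# Hypothetical CCSD(T)/cc-pVXZ total energies (hartree) for X = 3, 4, 5:
print(extrapolate_cbs([-131.052, -131.091, -131.103]))
```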

  5. Accurate Arabic Script Language/Dialect Classification

    DTIC Science & Technology

    2014-01-01

    Army Research Laboratory technical report ARL-TR-6761, January 2014: Accurate Arabic Script Language/Dialect Classification, by Stephen C. Tratz, Computational and Information Sciences Directorate. Approved for public release. (Only title-page and Standard Form 298 metadata are available for this record.)

  6. Evaluation of the reproducibility of lung motion probability distribution function (PDF) using dynamic MRI.

    PubMed

    Cai, Jing; Read, Paul W; Altes, Talissa A; Molloy, Janelle A; Brookeman, James R; Sheng, Ke

    2007-01-21

    Treatment planning based on the probability distribution function (PDF) of patient geometries has been shown to be a potential off-line strategy to incorporate organ motion, but the application of such an approach depends strongly upon the reproducibility of the PDF. In this paper, we investigated the dependence of PDF reproducibility on the imaging acquisition parameters, specifically the scan time and the frame rate. Three healthy subjects underwent a continuous 5 min magnetic resonance (MR) scan in the sagittal plane with a frame rate of approximately 10 frames s-1, and the experiments were repeated with an interval of 2 to 3 weeks. A total of nine pulmonary vessels from different lung regions (upper, middle and lower) were tracked, and the reproducibility of their displacement PDFs was evaluated as a function of scan time and frame rate. The PDF reproducibility error decreased with prolonged scans and appeared to approach an equilibrium state in subjects 2 and 3 within the 5 min scan. The PDF accuracy increased as a power function of frame rate; however, the PDF reproducibility showed less sensitivity to frame rate, presumably because the randomness of breathing dominates the effects. As the key component of PDF-based treatment planning, the reproducibility of the PDF affects the dosimetric accuracy substantially. This study provides a reference for acquiring MR-based PDFs of structures in the lung.
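
    As a concrete illustration of the study's central quantity, the sketch below builds a displacement PDF from a tracked vessel trajectory and quantifies the reproducibility error between two sessions as a function of scan time. The binning, the error metric (half the integrated absolute PDF difference), and the synthetic breathing traces are assumptions for illustration only.

```python
import numpy as np

def displacement_pdf(positions, bins):
    """Normalised histogram (PDF) of a tracked structure's displacement."""
    hist, _ = np.histogram(positions, bins=bins, density=True)
    return hist

def pdf_error(p, q, bin_width):
    """Half the integrated absolute difference between two PDFs (0 = identical)."""
    return 0.5 * np.sum(np.abs(p - q)) * bin_width

# Hypothetical data: vessel displacement (mm) sampled at ~10 frames/s for 5 min,
# in two sessions a few weeks apart.
rng = np.random.default_rng(1)
fps, minutes = 10, 5
t = np.arange(fps * 60 * minutes) / fps
session1 = 8 * np.abs(np.sin(2 * np.pi * t / 4.0)) + rng.normal(0, 0.5, t.size)
session2 = 8 * np.abs(np.sin(2 * np.pi * t / 4.2)) + rng.normal(0, 0.5, t.size)

bins = np.linspace(-2, 12, 40)
bw = bins[1] - bins[0]
# Reproducibility error as a function of scan time (1..5 minutes of data used).
for m in range(1, minutes + 1):
    n = fps * 60 * m
    err = pdf_error(displacement_pdf(session1[:n], bins),
                    displacement_pdf(session2[:n], bins), bw)
    print(f"{m} min: PDF reproducibility error = {err:.3f}")
```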

  7. Reproducibility of HTS-SQUID magnetocardiography in an unshielded clinical environment.

    PubMed

    Leder, U; Schrey, F; Haueisen, J; Dörrer, L; Schreiber, J; Liehr, M; Schwarz, G; Solbrig, O; Figulla, H R; Seidel, P

    2001-07-01

    A new technology has been developed which measures the magnetic field of the human heart (magnetocardiogram, MCG) by using high temperature superconducting (HTS) sensors. These sensors can be operated at the temperature of liquid nitrogen without electromagnetic shielding. We tested the reproducibility of HTS-MCG measurements in healthy volunteers. Unshielded HTS-MCG measurements were performed in 18 healthy volunteers in left precordial position in two separate sessions in a clinical environment. The heart cycles from 10 min of recording were averaged and smoothed, the baselines were adjusted, and the data were standardized to the respective areas under the curves (AUC) of the absolute values of the QRST amplitudes. The QRS complexes and the ST-T intervals were used to assess the reproducibility of the two measurements. Ratios (R(QRS), R(STT)) were calculated by dividing the AUC of the first measurement by those of the second measurement. The linear correlation coefficients (CORR(QRS), CORR(STT)) of the time intervals of the two measurements were also calculated. The HTS-MCG signal was completely concealed by the high noise level in the raw data. The averaging and smoothing algorithms unmasked the QRS complex and the ST segment. A high reproducibility was found for the QRS complex (R(QRS)=1.2+/-0.3, CORR(QRS)=0.96+/-0.06). Similar to the shape of the ECG, it was characterized by three bends: the Q, R, and S waves. In the ST-T interval, the reproducibility was considerably lower (R(STT)=0.9+/-0.2, CORR(STT)=0.66+/-0.28). In contrast to the shape of the ECG, a baseline deflection after the T wave, which may belong to U-wave activity, was found in a number of volunteers. HTS-MCG devices can be operated in a clinical environment without shielding. Whereas the reproducibility was found to be high for the depolarization interval, it was considerably lower for the ST segment and for the T wave. Therefore, before clinically applying HTS-MCG systems to the detection of repolarization
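
    The reported reproducibility metrics are an AUC ratio and a linear correlation computed over a time window (QRS or ST-T) of the two averaged, standardized beats. The sketch below shows how such metrics might be computed; the synthetic beats, sampling rate, and window indices are illustrative assumptions.

```python
import numpy as np

def standardize_to_auc(signal):
    """Standardise an averaged beat to the area under its absolute values."""
    return signal / np.trapz(np.abs(signal))

def reproducibility_metrics(meas1, meas2, window):
    """AUC ratio and Pearson correlation over a time window (e.g. QRS or ST-T)."""
    s1, s2 = meas1[window], meas2[window]
    ratio = np.trapz(np.abs(s1)) / np.trapz(np.abs(s2))
    corr = np.corrcoef(s1, s2)[0, 1]
    return ratio, corr

# Hypothetical averaged MCG beats from two sessions (arbitrary units, 1 kHz sampling).
t = np.linspace(0, 0.8, 800)
beat1 = np.exp(-((t - 0.25) / 0.020) ** 2) + 0.30 * np.exp(-((t - 0.55) / 0.06) ** 2)
beat2 = 1.05 * np.exp(-((t - 0.25) / 0.021) ** 2) + 0.28 * np.exp(-((t - 0.56) / 0.06) ** 2)
m1, m2 = standardize_to_auc(beat1), standardize_to_auc(beat2)

qrs = slice(200, 320)   # illustrative QRS window
stt = slice(400, 700)   # illustrative ST-T window
print("QRS :", reproducibility_metrics(m1, m2, qrs))
print("ST-T:", reproducibility_metrics(m1, m2, stt))
```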

  8. Ensuring reproducibility and ethics in animal experiments reporting in Korea using the ARRIVE guideline

    PubMed Central

    Nam, Mi-Hyun; Chun, Myung-Sun

    2018-01-01

    The aim of this study is to evaluate the reporting quality of animal experiments in Korea using the Animals in Research: Reporting In Vivo Experiments (ARRIVE) guideline developed in 2010 to overcome the reproducibility problem and to encourage compliance with replacement, refinement and reduction of animals in research (3R's principle). We reviewed 50 papers published by a Korean research group from 2013 to 2016 and scored conformity with the 20-item ARRIVE guideline. The median conformity score was 39.50%. For more precise evaluation, the 20 items were subdivided into 57 sub-items. Among the sub-items, the status of experimental animals and housing and husbandry were reported at below-average levels. Microenvironment sub-items, such as enrichment, bedding material, cage type, and number of companions, scored under 10%. Although the statistical methods used for the studies were given in most publications (84%), sample size calculations and statistical assumptions were rarely described. Most publications mentioned IACUC approval, but only 8% mentioned welfare-related assessments and interventions, and only 4% mentioned any implications of experimental methods or findings for the 3Rs. We recommend revising the present IACUC proposal form to collect more detailed information and improving educational programs for animal researchers in line with the ARRIVE guideline. PMID:29628972
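
    A checklist-based conformity score of the kind used here is simply the fraction of sub-items that a publication reports. The minimal sketch below illustrates that scoring; the sub-item names are illustrative placeholders, not the study's actual 57-item breakdown.

```python
# Minimal sketch of checklist-based conformity scoring (sub-items are
# illustrative, not the actual 57-item breakdown used in the study).
checklist = {
    "species_strain_sex_reported": True,
    "housing_and_husbandry_reported": False,
    "cage_enrichment_reported": False,
    "sample_size_calculation_reported": False,
    "statistical_methods_reported": True,
    "iacuc_approval_reported": True,
    "welfare_assessment_reported": False,
}

def conformity_score(items):
    """Percentage of checklist sub-items reported in a publication."""
    return 100.0 * sum(items.values()) / len(items)

print(f"Conformity: {conformity_score(checklist):.1f}%")
```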

  9. Simple Mathematical Models Do Not Accurately Predict Early SIV Dynamics

    PubMed Central

    Noecker, Cecilia; Schaefer, Krista; Zaccheo, Kelly; Yang, Yiding; Day, Judy; Ganusov, Vitaly V.

    2015-01-01

    Upon infection of a new host, human immunodeficiency virus (HIV) replicates in the mucosal tissues and is generally undetectable in circulation for 1–2 weeks post-infection. Several interventions against HIV including vaccines and antiretroviral prophylaxis target virus replication at this earliest stage of infection. Mathematical models have been used to understand how HIV spreads from mucosal tissues systemically and what impact vaccination and/or antiretroviral prophylaxis has on viral eradication. Because predictions of such models have been rarely compared to experimental data, it remains unclear which processes included in these models are critical for predicting early HIV dynamics. Here we modified the “standard” mathematical model of HIV infection to include two populations of infected cells: cells that are actively producing the virus and cells that are transitioning into virus production mode. We evaluated the effects of several poorly known parameters on infection outcomes in this model and compared model predictions to experimental data on infection of non-human primates with variable doses of simian immunodeficiency virus (SIV). First, we found that the mode of virus production by infected cells (budding vs. bursting) has a minimal impact on the early virus dynamics for a wide range of model parameters, as long as the parameters are constrained to provide the observed rate of SIV load increase in the blood of infected animals. Interestingly and in contrast with previous results, we found that the bursting mode of virus production generally results in a higher probability of viral extinction than the budding mode of virus production. Second, this mathematical model was not able to accurately describe the change in experimentally determined probability of host infection with increasing viral doses. Third and finally, the model was unable to accurately explain the decline in the time to virus detection with increasing viral dose. These results
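
    In the spirit of the two infected-cell populations described above, the sketch below integrates a standard target-cell-limited model extended with an eclipse (not-yet-producing) infected-cell compartment. The equations are the widely used eclipse-phase model, not necessarily the authors' exact formulation, and all parameter values and initial conditions are illustrative, not fitted.

```python
import numpy as np
from scipy.integrate import solve_ivp

def siv_model(t, y, beta, k, delta, p, c):
    """Target-cell model with an eclipse phase: T -> E (not yet producing) -> I."""
    T, E, I, V = y
    dT = -beta * T * V
    dE = beta * T * V - k * E
    dI = k * E - delta * I
    dV = p * I - c * V
    return [dT, dE, dI, dV]

# Illustrative parameters (per day) and initial conditions; not fitted values.
params = dict(beta=1e-7, k=4.0, delta=1.0, p=2e3, c=20.0)
y0 = [1e6, 0.0, 0.0, 1e-3]   # target cells, eclipse cells, productive cells, virus
sol = solve_ivp(siv_model, (0, 30), y0, args=tuple(params.values()),
                dense_output=True, rtol=1e-8)

t = np.linspace(0, 30, 301)
viral_load = sol.sol(t)[3]
print("peak log10 viral load:", np.log10(viral_load.max()))
```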

  10. Reproducibility in science: improving the standard for basic and preclinical research.

    PubMed

    Begley, C Glenn; Ioannidis, John P A

    2015-01-02

    Medical and scientific advances are predicated on new knowledge that is robust and reliable and that serves as a solid foundation on which further advances can be built. In biomedical research, we are in the midst of a revolution with the generation of new data and scientific publications at an unprecedented rate. However, unfortunately, there is compelling evidence that the majority of these discoveries will not stand the test of time. To a large extent, this reproducibility crisis in basic and preclinical research may be a result of failure to adhere to good scientific practice and the desperation to publish or perish. This is a multifaceted, multistakeholder problem. No single party is solely responsible, and no single solution will suffice. Here we review the reproducibility problems in basic and preclinical biomedical research, highlight some of the complexities, and discuss potential solutions that may help improve research quality and reproducibility. © 2015 American Heart Association, Inc.

  11. An accurate automated technique for quasi-optics measurement of the microwave diagnostics for fusion plasma

    NASA Astrophysics Data System (ADS)

    Hu, Jianqiang; Liu, Ahdi; Zhou, Chu; Zhang, Xiaohui; Wang, Mingyuan; Zhang, Jin; Feng, Xi; Li, Hong; Xie, Jinlin; Liu, Wandong; Yu, Changxuan

    2017-08-01

    A new integrated technique for fast and accurate measurement of the quasi-optics, especially for the microwave/millimeter wave diagnostic systems of fusion plasma, has been developed. Using the LabVIEW-based comprehensive scanning system, we can realize automatic as well as fast and accurate measurement, which helps to eliminate the effects of temperature drift and standing waves/multi-reflections. With the Matlab-based asymmetric two-dimensional Gaussian fitting method, all the desired parameters of the microwave beam can be obtained. This technique can be used in the design and testing of microwave diagnostic systems such as reflectometers and the electron cyclotron emission imaging diagnostic systems of the Experimental Advanced Superconducting Tokamak.
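
    The paper's beam characterisation relies on an asymmetric (elliptical) two-dimensional Gaussian fit, implemented there in Matlab. The sketch below shows an analogous fit in Python with scipy's curve_fit; the model parametrisation (independent widths plus a rotation angle), the synthetic intensity map, and the initial guess are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def asym_gauss2d(coords, A, x0, y0, sx, sy, theta, offset):
    """Elliptical 2D Gaussian with independent widths sx, sy and rotation theta."""
    x, y = coords
    ct, st = np.cos(theta), np.sin(theta)
    xr = ct * (x - x0) + st * (y - y0)
    yr = -st * (x - x0) + ct * (y - y0)
    return (A * np.exp(-0.5 * ((xr / sx) ** 2 + (yr / sy) ** 2)) + offset).ravel()

# Hypothetical scanned beam intensity map (arbitrary units).
x = np.linspace(-50, 50, 101)
y = np.linspace(-50, 50, 101)
X, Y = np.meshgrid(x, y)
true = asym_gauss2d((X, Y), 1.0, 3.0, -2.0, 12.0, 8.0, 0.3, 0.02)
data = true + np.random.default_rng(2).normal(0, 0.01, true.shape)

p0 = [1.0, 0.0, 0.0, 10.0, 10.0, 0.0, 0.0]     # rough initial guess
popt, _ = curve_fit(asym_gauss2d, (X, Y), data, p0=p0)
print("fitted beam widths (sx, sy):", popt[3], popt[4])
```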

  12. Aveiro method in reproducing kernel Hilbert spaces under complete dictionary

    NASA Astrophysics Data System (ADS)

    Mai, Weixiong; Qian, Tao

    2017-12-01

    Aveiro Method is a sparse representation method in reproducing kernel Hilbert spaces (RKHS) that gives orthogonal projections in linear combinations of reproducing kernels over uniqueness sets. It, however, suffers from the difficulty of determining uniqueness sets in the underlying RKHS. In fact, in general spaces uniqueness sets are not easy to identify, let alone the convergence speed of the Aveiro Method. To avoid those difficulties we propose a new Aveiro Method based on a dictionary and the matching pursuit idea. In fact, we do more: the new Aveiro Method builds on the recently proposed Pre-Orthogonal Greedy Algorithm (P-OGA), which involves completion of a given dictionary. The new method is called Aveiro Method Under Complete Dictionary (AMUCD). The complete dictionary consists of all directional derivatives of the underlying reproducing kernels. We show that, under the boundary vanishing condition, which holds for the classical Hardy and Paley-Wiener spaces, the complete dictionary enables an efficient expansion of any given element in the Hilbert space. The proposed method reveals new and advanced aspects in both the Aveiro Method and the greedy algorithm.
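
    To make the greedy idea concrete, the sketch below runs a plain matching pursuit over a finite dictionary of sampled functions: at each step it picks the atom most correlated with the residual and subtracts its projection. This illustrates only the greedy selection; AMUCD/P-OGA additionally orthogonalise and use reproducing kernels and their directional derivatives as atoms. The dictionary and target function are made up for the example.

```python
import numpy as np

def matching_pursuit(f, dictionary, n_iter=10):
    """Plain matching pursuit: greedily pick the dictionary atom most correlated
    with the current residual and subtract its projection.
    """
    atoms = [d / np.linalg.norm(d) for d in dictionary]
    residual = f.astype(float).copy()
    expansion = np.zeros_like(residual)
    for _ in range(n_iter):
        coeffs = [residual @ a for a in atoms]
        k = int(np.argmax(np.abs(coeffs)))
        expansion += coeffs[k] * atoms[k]
        residual -= coeffs[k] * atoms[k]
    return expansion, residual

# Hypothetical dictionary of sampled trigonometric functions on [0, 1].
t = np.linspace(0, 1, 200)
dictionary = [np.sin(2 * np.pi * k * t) for k in range(1, 20)] + \
             [np.cos(2 * np.pi * k * t) for k in range(0, 20)]
f = 0.7 * np.sin(2 * np.pi * 3 * t) + 0.2 * np.cos(2 * np.pi * 7 * t)
approx, res = matching_pursuit(f, dictionary, n_iter=5)
print("relative residual norm:", np.linalg.norm(res) / np.linalg.norm(f))
```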

  13. On the origin of reproducible sequential activity in neural circuits

    NASA Astrophysics Data System (ADS)

    Afraimovich, V. S.; Zhigulin, V. P.; Rabinovich, M. I.

    2004-12-01

    Robustness and reproducibility of sequential spatio-temporal responses is an essential feature of many neural circuits in sensory and motor systems of animals. The most common mathematical images of dynamical regimes in neural systems are fixed points, limit cycles, chaotic attractors, and continuous attractors (attractive manifolds of neutrally stable fixed points). These are not suitable for the description of reproducible transient sequential neural dynamics. In this paper we present the concept of a stable heteroclinic sequence (SHS), which is not an attractor. SHS opens the way for understanding and modeling of transient sequential activity in neural circuits. We show that this new mathematical object can be used to describe robust and reproducible sequential neural dynamics. Using the framework of a generalized high-dimensional Lotka-Volterra model that describes the dynamics of firing rates in an inhibitory network, we present analytical results on the existence of the SHS in the phase space of the network. With the help of numerical simulations we confirm its robustness in the presence of noise, in spite of the transient nature of the corresponding trajectories. Finally, by referring to several recent neurobiological experiments, we discuss possible applications of this new concept to several problems in neuroscience.
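
    As a small numerical illustration of the framework described above, the sketch below integrates a generalized Lotka-Volterra rate model of three mutually inhibiting units; an asymmetric inhibition matrix of the May-Leonard type produces winnerless, sequential switching of the dominant unit. The connectivity values and initial condition are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Generalized Lotka-Volterra rate model: da_i/dt = a_i * (sigma_i - sum_j rho_ij a_j).
N = 3
sigma = np.ones(N)
rho = np.array([[1.0, 0.5, 2.0],
                [2.0, 1.0, 0.5],
                [0.5, 2.0, 1.0]])   # illustrative asymmetric inhibition matrix

def glv(t, a):
    return a * (sigma - rho @ a)

a0 = np.array([0.6, 0.2, 0.2]) + 1e-3 * np.random.default_rng(3).random(N)
sol = solve_ivp(glv, (0, 200), a0, max_step=0.1)

# The index of the currently dominant unit traces out the reproducible sequence.
dominant = np.argmax(sol.y, axis=0)
switches = np.r_[0, np.flatnonzero(np.diff(dominant)) + 1]
print("visited sequence of dominant units:", [int(d) for d in dominant[switches]])
```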

  14. On the origin of reproducible sequential activity in neural circuits.

    PubMed

    Afraimovich, V S; Zhigulin, V P; Rabinovich, M I

    2004-12-01

    Robustness and reproducibility of sequential spatio-temporal responses is an essential feature of many neural circuits in sensory and motor systems of animals. The most common mathematical images of dynamical regimes in neural systems are fixed points, limit cycles, chaotic attractors, and continuous attractors (attractive manifolds of neutrally stable fixed points). These are not suitable for the description of reproducible transient sequential neural dynamics. In this paper we present the concept of a stable heteroclinic sequence (SHS), which is not an attractor. SHS opens the way for understanding and modeling of transient sequential activity in neural circuits. We show that this new mathematical object can be used to describe robust and reproducible sequential neural dynamics. Using the framework of a generalized high-dimensional Lotka-Volterra model that describes the dynamics of firing rates in an inhibitory network, we present analytical results on the existence of the SHS in the phase space of the network. With the help of numerical simulations we confirm its robustness in the presence of noise, in spite of the transient nature of the corresponding trajectories. Finally, by referring to several recent neurobiological experiments, we discuss possible applications of this new concept to several problems in neuroscience.

  15. Reproducibility and validity of the Shanghai Women's Health Study physical activity questionnaire.

    PubMed

    Matthews, Charles E; Shu, Xiao-Ou; Yang, Gong; Jin, Fan; Ainsworth, Barbara E; Liu, Dake; Gao, Yu-Tang; Zheng, Wei

    2003-12-01

    In this investigation, the authors evaluated the reproducibility and validity of the Shanghai Women's Health Study (SWHS) physical activity questionnaire (PAQ), which was administered in a cohort study of approximately 75,000 Chinese women aged 40-70 years. Reproducibility (2-year test-retest) was evaluated using kappa statistics and intraclass correlation coefficients (ICCs). Validity was evaluated by comparing Spearman correlations (r) for the SWHS PAQ with two criterion measures administered over a period of 12 months: four 7-day physical activity logs and up to 28 7-day PAQs. Women were recruited from the SWHS cohort (n = 200). Results indicated that the reproducibility of adolescent and adult exercise participation (kappa = 0.85 and kappa = 0.64, respectively) and years of adolescent exercise and adult exercise energy expenditure (ICC = 0.83 and ICC = 0.70, respectively) was reasonable. Reproducibility values for adult lifestyle activities were lower (ICC = 0.14-0.54). Significant correlations between the PAQ and criterion measures of adult exercise were observed for the first PAQ administration (physical activity log, r = 0.50; 7-day PAQ, r = 0.62) and the second PAQ administration (physical activity log, r = 0.74; 7-day PAQ, r = 0.80). Significant correlations between PAQ lifestyle activities and the 7-day PAQ were also noted (r = 0.33-0.88). These data indicate that the SWHS PAQ is a reproducible and valid measure of exercise behaviors and that it demonstrates utility in stratifying women by levels of important lifestyle activities (e.g., housework, walking, cycling).
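
    The reproducibility and validity statistics reported above are standard: Cohen's kappa for categorical exercise participation, an intraclass correlation coefficient for continuous quantities, and rank correlations against criterion measures. The sketch below computes these for a hypothetical test-retest dataset; the one-way ICC(1,1) form is an assumption (the paper does not state which ICC variant it used), and the data are made up.

```python
import numpy as np
from scipy.stats import spearmanr

def cohen_kappa(x, y):
    """Cohen's kappa for two binary (0/1) ratings of the same subjects."""
    x, y = np.asarray(x), np.asarray(y)
    po = np.mean(x == y)
    pe = np.mean(x) * np.mean(y) + np.mean(1 - x) * np.mean(1 - y)
    return (po - pe) / (1 - pe)

def icc_oneway(x, y):
    """One-way random-effects ICC(1,1) for two repeated continuous measurements."""
    data = np.column_stack([x, y]).astype(float)
    n, k = data.shape
    subj_means = data.mean(axis=1)
    grand = data.mean()
    ms_between = k * np.sum((subj_means - grand) ** 2) / (n - 1)
    ms_within = np.sum((data - subj_means[:, None]) ** 2) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical test-retest data for 10 subjects.
exercise_t1 = [1, 0, 1, 1, 0, 0, 1, 1, 0, 1]
exercise_t2 = [1, 0, 1, 0, 0, 0, 1, 1, 0, 1]
met_hours_t1 = [10.2, 3.1, 8.5, 12.0, 2.2, 4.0, 9.1, 11.5, 3.8, 7.7]
met_hours_t2 = [9.8, 3.5, 8.0, 10.5, 2.9, 4.4, 8.7, 12.1, 4.2, 7.0]

print("kappa:", round(cohen_kappa(exercise_t1, exercise_t2), 2))
print("ICC:", round(icc_oneway(met_hours_t1, met_hours_t2), 2))
print("Spearman r:", round(spearmanr(met_hours_t1, met_hours_t2).correlation, 2))
```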

  16. Tools for Accurate and Efficient Analysis of Complex Evolutionary Mechanisms in Microbial Genomes. Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakhleh, Luay

    I proposed to develop computationally efficient tools for accurate detection and reconstruction of microbes' complex evolutionary mechanisms, thus enabling rapid and accurate annotation, analysis and understanding of their genomes. To achieve this goal, I proposed to address three aspects. (1) Mathematical modeling. A major challenge facing the accurate detection of HGT is that of distinguishing between these two events on the one hand and other events that have similar "effects." I proposed to develop a novel mathematical approach for distinguishing among these events. Further, I proposed to develop a set of novel optimization criteria for the evolutionary analysis of microbial genomes in the presence of these complex evolutionary events. (2) Algorithm design. In this aspect of the project, I proposed to develop an array of efficient and accurate algorithms for analyzing microbial genomes based on the formulated optimization criteria. Further, I proposed to test the viability of the criteria and the accuracy of the algorithms in an experimental setting using both synthetic as well as biological data. (3) Software development. I proposed the final outcome to be a suite of software tools which implements the mathematical models as well as the algorithms developed.

  17. Enriched reproducing kernel particle method for fractional advection-diffusion equation

    NASA Astrophysics Data System (ADS)

    Ying, Yuping; Lian, Yanping; Tang, Shaoqiang; Liu, Wing Kam

    2018-06-01

    The reproducing kernel particle method (RKPM) has been efficiently applied to problems with large deformations, high gradients and high modal density. In this paper, it is extended to solve a nonlocal problem modeled by a fractional advection-diffusion equation (FADE), which exhibits a boundary layer with low regularity. We formulate this method based on a moving least-squares approach. By enriching the traditional integer-order basis for RKPM with fractional-order power functions, the leading terms of the solution to the FADE can be exactly reproduced, which guarantees a good approximation to the boundary layer. Numerical tests are performed to verify the proposed approach.
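
    The essential idea of the enrichment is that adding a fractional power x^alpha to the approximation basis lets the method reproduce the low-regularity leading terms of the solution exactly. The sketch below shows this for a one-dimensional moving least-squares approximation with an enriched basis [1, x, x^alpha]; the cubic-spline weight, support size, node layout, and exponent are illustrative assumptions rather than the paper's discretisation.

```python
import numpy as np

ALPHA = 0.5   # fractional enrichment exponent (illustrative)

def basis(x):
    """Enriched basis: integer-order terms plus a fractional power x**ALPHA."""
    x = np.maximum(x, 0.0)
    return np.array([1.0, x, x ** ALPHA])

def weight(r):
    """Cubic spline weight with compact support |r| <= 1."""
    r = abs(r)
    if r <= 0.5:
        return 2 / 3 - 4 * r**2 + 4 * r**3
    if r <= 1.0:
        return 4 / 3 - 4 * r + 4 * r**2 - 4 * r**3 / 3
    return 0.0

def mls_approx(x_eval, nodes, values, support=0.3):
    """Moving least-squares fit of nodal values using the enriched basis."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for xi, ui in zip(nodes, values):
        w = weight((x_eval - xi) / support)
        if w == 0.0:
            continue
        p = basis(xi)
        A += w * np.outer(p, p)
        b += w * p * ui
    return basis(x_eval) @ np.linalg.solve(A, b)

# Hypothetical target with boundary-layer-like x**0.5 behaviour near x = 0.
nodes = np.linspace(0, 1, 21)
values = np.sqrt(nodes) + 0.2 * nodes
# Reproduces sqrt(0.05) + 0.2*0.05 since the target lies in the enriched span.
print(mls_approx(0.05, nodes, values))
```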

  18. An Automated, Experimenter-Free Method for the Standardised, Operant Cognitive Testing of Rats

    PubMed Central

    Rivalan, Marion; Munawar, Humaira; Fuchs, Anna; Winter, York

    2017-01-01

    Animal models of human pathology are essential for biomedical research. However, a recurring issue in the use of animal models is the poor reproducibility of behavioural and physiological findings within and between laboratories. The most critical factor influencing this issue remains the experimenters themselves. One solution is the use of procedures devoid of human intervention. We present a novel approach to experimenter-free testing of cognitive abilities in rats, combining undisturbed group housing with automated, standardized and individual operant testing. This experimenter-free system consisted of an automated-operant system (Bussey-Saksida rat touch screen) connected to a home cage containing group living rats via an automated animal sorter (PhenoSys). The automated animal sorter, which is based on radio-frequency identification (RFID) technology, functioned as a mechanical replacement of the experimenter. Rats learnt to regularly and individually enter the operant chamber and remained there for the duration of the experimental session only. Self-motivated rats acquired the complex touch screen task of trial-unique non-matching to location (TUNL) in half the time reported for animals that were manually placed into the operant chamber. Rat performance was similar between the two groups within our laboratory, and comparable to previously published results obtained elsewhere. This reproducibility, both within and between laboratories, confirms the validity of this approach. In addition, automation reduced daily experimental time by 80%, eliminated animal handling, and reduced equipment cost. This automated, experimenter-free setup is a promising tool of great potential for testing a large variety of functions with full automation in future studies. PMID:28060883

  19. Parameter optimization for reproducible cardiac 1H-MR spectroscopy at 3 Tesla.

    PubMed

    de Heer, Paul; Bizino, Maurice B; Lamb, Hildo J; Webb, Andrew G

    2016-11-01

    To optimize data acquisition parameters in cardiac proton MR spectroscopy, and to evaluate the intra- and intersession variability in myocardial triglyceride content. Data acquisition parameters at 3 Tesla (T) were optimized and reproducibility measured using, in total, 49 healthy subjects. The signal-to-noise-ratio (SNR) and the variance in metabolite amplitude between averages were measured for: (i) global versus local power optimization; (ii) static magnetic field (B0) shimming performed during free-breathing or within breathholds; (iii) post R-wave peak measurement times between 50 and 900 ms; (iv) without respiratory compensation, with breathholds and with navigator triggering; and (v) frequency selective excitation, Chemical Shift Selective (CHESS) and Multiply Optimized Insensitive Suppression Train (MOIST) water suppression techniques. Using the optimized parameters, intra- and intersession myocardial triglyceride content reproducibility was measured. Two cardiac proton spectra were acquired with the same parameters and compared (intrasession reproducibility), after which the subject was removed from and placed back in the scanner, and a third spectrum was acquired and compared with the first measurement (intersession reproducibility). Local power optimization increased SNR on average by 22% compared with global power optimization (P = 0.0002). The average linewidth was not significantly different for pencil beam B0 shimming using free-breathing or breathholds (19.1 Hz versus 17.5 Hz; P = 0.15). The highest signal stability occurred at a cardiac trigger delay around 240 ms. The mean amplitude variation was significantly lower for breathholds versus free-breathing (P = 0.03) and for navigator triggering versus free-breathing (P = 0.03) as well as for navigator triggering versus breathhold (P = 0.02). The mean residual water signal using CHESS (1.1%, P = 0.01) or MOIST (0.7%, P = 0.01) water suppression was significantly lower than using

  20. An accurate coarse-grained model for chitosan polysaccharides in aqueous solution.

    PubMed

    Tsereteli, Levan; Grafmüller, Andrea

    2017-01-01

    Computational models can provide detailed information about molecular conformations and interactions in solution, which is currently inaccessible by other means in many cases. Here we describe an efficient and precise coarse-grained model for long polysaccharides in aqueous solution at different physico-chemical conditions such as pH and ionic strength. The model is carefully constructed based on all-atom simulations of small saccharides and metadynamics sampling of the dihedral angles in the glycosidic links, which represent the most flexible degrees of freedom of the polysaccharides. The model is validated against experimental data for chitosan molecules in solution with various degrees of deacetylation, and is shown to closely reproduce the available experimental data. For long polymers, subtle differences of the free energy maps of the glycosidic links are found to significantly affect the measurable polymer properties. Therefore, for titratable monomers the free energy maps of the corresponding links are updated according to the current charge of the monomers. We then characterize the microscopic and mesoscopic structural properties of large chitosan polysaccharides in solution for a wide range of solvent pH and ionic strength, and investigate the effect of polymer length and degree and pattern of deacetylation on the polymer properties.