Science.gov

Sample records for accurately reproduce observed

  1. Accurate and reproducible determination of lignin molar mass by acetobromination.

    PubMed

    Asikkala, Janne; Tamminen, Tarja; Argyropoulos, Dimitris S

    2012-09-12

    The accurate and reproducible determination of lignin molar mass using size exclusion chromatography (SEC) is challenging. The lignin association effects, known to dominate underivatized lignins, have been thoroughly addressed by reaction with acetyl bromide in an excess of glacial acetic acid. The combination of a concerted acetylation with the introduction of bromine within the lignin alkyl side chains is thought to be responsible for the observed excellent solubilization characteristics acetobromination imparts to a variety of lignin samples. The proposed methodology was compared and contrasted to traditional lignin derivatization methods. In addition, side reactions that could possibly be induced under the acetobromination conditions were explored with native softwood (milled wood lignin, MWL) and technical (kraft) lignin. These efforts support the use of room-temperature acetobromination as a facile, effective, and universal lignin derivatization method to be employed prior to SEC measurements. PMID:22870925

  2. Accurate measurements of dynamics and reproducibility in small genetic networks

    PubMed Central

    Dubuis, Julien O; Samanta, Reba; Gregor, Thomas

    2013-01-01

    Quantification of gene expression has become a central tool for understanding genetic networks. In many systems, the only viable way to measure protein levels is by immunofluorescence, which is notorious for its limited accuracy. Using the early Drosophila embryo as an example, we show that careful identification and control of experimental error allows for highly accurate gene expression measurements. We generated antibodies in different host species, allowing for simultaneous staining of four Drosophila gap genes in individual embryos. Careful error analysis of hundreds of expression profiles reveals that less than ∼20% of the observed embryo-to-embryo fluctuations stem from experimental error. These measurements make it possible to extract not only very accurate mean gene expression profiles but also their naturally occurring fluctuations of biological origin and corresponding cross-correlations. We use this analysis to extract gap gene profile dynamics with ∼1 min accuracy. The combination of these new measurements and analysis techniques reveals a twofold increase in profile reproducibility owing to a collective network dynamics that relays positional accuracy from the maternal gradients to the pair-rule genes. PMID:23340845

  3. Cycle accurate and cycle reproducible memory for an FPGA based hardware accelerator

    DOEpatents

    Asaad, Sameh W.; Kapur, Mohit

    2016-03-15

    A method, system and computer program product are disclosed for using a Field Programmable Gate Array (FPGA) to simulate operations of a device under test (DUT). The DUT includes a device memory having a first number of input ports, and the FPGA is associated with a target memory having a second number of input ports, the second number being less than the first number. In one embodiment, a given set of inputs is applied to the device memory at a frequency Fd and in a defined cycle of time, and the given set of inputs is applied to the target memory at a frequency Ft. Ft is greater than Fd and cycle accuracy is maintained between the device memory and the target memory. In an embodiment, a cycle accurate model of the DUT memory is created by separating the DUT memory interface protocol from the target memory storage array.
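
    The cycle-accuracy idea above (clocking the target memory at a higher frequency Ft so it can stand in for a wider device memory within each device clock cycle) can be illustrated with a small, purely hypothetical sketch; the class and function names below are invented for the example and are not taken from the patent.

```python
# Toy illustration (not the patented design): emulate a P-port device memory
# with a single-port "target" memory that runs P times faster, so that all P
# accesses issued in one device cycle complete before the next device cycle.

class SinglePortTargetMemory:
    """Toy single-port memory standing in for the FPGA target memory."""

    def __init__(self, size):
        self.data = [0] * size

    def access(self, op, addr, value=None):
        if op == "write":
            self.data[addr] = value
        return self.data[addr]

def run_device_cycle(target_mem, port_requests):
    """Service every port request of one DUT-memory cycle, one per target cycle.

    If the device memory has P ports clocked at Fd, running the target memory
    at Ft = P * Fd lets all P accesses complete inside one device clock period,
    so the simulation stays cycle accurate at the device-clock level.
    """
    return [target_mem.access(*req) for req in port_requests]

mem = SinglePortTargetMemory(size=16)
# two "ports" active in the same device cycle: a write to address 3, a read of 5
print(run_device_cycle(mem, [("write", 3, 42), ("read", 5)]))   # -> [42, 0]
```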

  4. Generating clock signals for a cycle accurate, cycle reproducible FPGA based hardware accelerator

    DOEpatents

    Asaad, Sameh W.; Kapur, Mohit

    2016-01-05

    A method, system and computer program product are disclosed for generating clock signals for a cycle accurate FPGA based hardware accelerator used to simulate operations of a device-under-test (DUT). In one embodiment, the DUT includes multiple device clocks generating multiple device clock signals at multiple frequencies and at a defined frequency ratio; and the FPGA hardware accelerator includes multiple accelerator clocks generating multiple accelerator clock signals to operate the FPGA hardware accelerator to simulate the operations of the DUT. In one embodiment, operations of the DUT are mapped to the FPGA hardware accelerator, and the accelerator clock signals are generated at multiple frequencies and at the defined frequency ratio of the frequencies of the multiple device clocks, to maintain cycle accuracy between the DUT and the FPGA hardware accelerator. In an embodiment, the FPGA hardware accelerator may be used to control the frequencies of the multiple device clocks.

  5. A cricoid cartilage compression device for the accurate and reproducible application of cricoid pressure.

    PubMed

    Taylor, R J; Smurthwaite, G; Mehmood, I; Kitchen, G B; Baker, R D

    2015-01-01

    We describe the development and laboratory assessment of a refined prototype tactile feedback device for the safe and accurate application of cricoid pressure. We recruited 20 operating department practitioners and compared their performance of cricoid pressure on a training simulator using both the device and a manual unaided technique. The device significantly reduced the spread of the applied force: average (SE) root mean squared error decreased from 8.23 (0.48) N to 5.23 (0.32) N (p < 0.001). The average (SE) upwards bias in applied force also decreased, from 2.30 (0.74) N to 0.88 (0.48) N (p < 0.01). Most importantly, the percentage of force applications that deviated from target by more than 10 N decreased from 18% to 7% (p < 0.01). The device requires no prior training, is cheap to manufacture, is single-use and requires no power to operate, whilst ensuring that the correct force is always consistently applied. PMID:25267415
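
    For reference, the summary statistics quoted above (root mean squared error and bias of the applied force) are straightforward to compute; the sketch below uses made-up force samples and an assumed 30 N target rather than the study's data.

```python
# Minimal sketch with synthetic forces and an assumed 30 N target (hypothetical).
import numpy as np

target_n = 30.0                                     # assumed target force (newtons)
applied_n = np.array([28.5, 31.2, 33.0, 27.8, 30.4, 29.1, 35.6, 30.9])  # made up

errors = applied_n - target_n
rmse = np.sqrt(np.mean(errors ** 2))                # root mean squared error
bias = errors.mean()                                # mean (signed) error
pct_large_dev = 100.0 * np.mean(np.abs(errors) > 10)   # % of applications >10 N off
print(f"RMSE = {rmse:.2f} N, bias = {bias:+.2f} N, >10 N deviations = {pct_large_dev:.0f}%")
```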

  6. Susceptibility testing: accurate and reproducible minimum inhibitory concentration (MIC) and non-inhibitory concentration (NIC) values.

    PubMed

    Lambert, R J; Pearson, J

    2000-05-01

    Measuring the minimum inhibitory concentration (MIC) of a substance by current methods is straightforward, whereas obtaining useful comparative information from the tests can be more difficult. A simple technique and a method of data analysis are reported which give the experimentalist more useful information from susceptibility testing. This method makes use of a 100-well microtitre plate and the analysis uses all the growth information, obtained by turbidometry, from each and every well of the microtitre plate. A modified Gompertz function is used to fit the data, from which a more exact value can be obtained for the MIC. The technique also showed that at certain concentrations of inhibitor, there was no effect on growth relative to a control well (zero inhibitor). Above a threshold value, termed the non-inhibitory concentration (NIC), growth becomes progressively limited until the MIC is reached, above which no growth relative to the control is observed.
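
    A minimal sketch of the analysis idea, assuming synthetic turbidometric data and illustrative read-off thresholds (the paper derives NIC and MIC analytically from the fitted Gompertz parameters, which is not reproduced here):

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(x, A, C, B, M):
    # A: lower asymptote, A + C: upper asymptote, B: slope, M: position of decline
    return A + C * np.exp(-np.exp(B * (x - M)))

# hypothetical relative-growth data (control = 1) versus log10(concentration)
log_conc = np.linspace(-2, 1, 12)
rel_growth = np.array([1.00, 1.01, 0.99, 0.98, 0.95, 0.85,
                       0.60, 0.30, 0.10, 0.03, 0.00, 0.00])

popt, _ = curve_fit(gompertz, log_conc, rel_growth, p0=[0.0, 1.0, 3.0, 0.0])
xs = np.linspace(log_conc.min(), log_conc.max(), 1000)
fit = gompertz(xs, *popt)

# illustrative read-off: NIC where growth first becomes limited, MIC where it is
# effectively abolished relative to the control
nic = 10 ** xs[np.argmax(fit < 0.95)]
mic = 10 ** xs[np.argmax(fit < 0.05)]
print(f"NIC ≈ {nic:.3g}, MIC ≈ {mic:.3g}")
```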

  7. Do climate models reproduce complexity of observed sea level changes?

    NASA Astrophysics Data System (ADS)

    Becker, M.; Karpytchev, M.; Marcos, M.; Jevrejeva, S.; Lennartz-Sassinek, S.

    2016-05-01

    The ability of Atmosphere-Ocean General Circulation Models (AOGCMs) to capture the statistical behavior of sea level (SL) fluctuations has been assessed at the local scale. To do so, we have compared the scaling behavior of the SL fluctuations simulated in the historical runs of 36 CMIP5 AOGCMs to that in the longest (>100 years) SL records from 23 tide gauges around the globe. The observed SL fluctuations are known to manifest a power law scaling. We have checked if the SL changes simulated by the AOGCMs exhibit the same scaling properties and the long-term correlations as observed in the tide gauge records. We find that the majority of AOGCMs overestimates the scaling of SL fluctuations, particularly in the North Atlantic. Consequently, AOGCMs, routinely used to project regional SL rise, may underestimate the part of the externally driven SL rise, in particular the anthropogenic footprint, in the projections for the 21st century.
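
    One standard way to quantify the power-law scaling and long-term correlations referred to above is detrended fluctuation analysis (DFA); the sketch below applies DFA to a synthetic series and is only an illustration of the technique, not the authors' exact procedure.

```python
# Hedged sketch: estimate a scaling exponent with first-order DFA. A real
# analysis would use tide-gauge or model sea-level anomaly series instead of
# the synthetic white noise used here (for which the exponent should be ~0.5).
import numpy as np

def dfa_exponent(x, scales):
    y = np.cumsum(x - np.mean(x))                  # integrated profile
    flucts = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        rms = []
        for seg in segs:                           # linear detrend per segment
            coef = np.polyfit(t, seg, 1)
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        flucts.append(np.mean(rms))
    # F(s) ~ s**alpha  =>  slope of log F vs log s is the scaling exponent
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return alpha

rng = np.random.default_rng(0)
series = rng.standard_normal(2048)
print(dfa_exponent(series, scales=[8, 16, 32, 64, 128, 256]))
```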

  8. SNe Ia: Can Chandrasekhar mass explosions reproduce the observed zoo?

    NASA Astrophysics Data System (ADS)

    Baron, E.

    2014-08-01

    The question of the nature of the progenitor of Type Ia supernovae (SNe Ia) is important both for our detailed understanding of stellar evolution and for their use as cosmological probes of the dark energy. Many of the basic features of SNe Ia can be understood directly from the nuclear physics, a fact which Gerry would have appreciated. We present an overview of the current observational and theoretical situation and show that it is not incompatible with most SNe Ia being the result of thermonuclear explosions near the Chandrasekhar mass.

  9. ACCURATE CHARACTERIZATION OF HIGH-DEGREE MODES USING MDI OBSERVATIONS

    SciTech Connect

    Korzennik, S. G.; Rabello-Soares, M. C.; Schou, J.; Larson, T. P.

    2013-08-01

    We present the first accurate characterization of high-degree modes, derived using the best Michelson Doppler Imager (MDI) full-disk full-resolution data set available. A 90 day long time series of full-disk 2 arcsec pixel⁻¹ resolution Dopplergrams was acquired in 2001, thanks to the high rate telemetry provided by the Deep Space Network. These Dopplergrams were spatially decomposed using our best estimate of the image scale and the known components of MDI's image distortion. A multi-taper power spectrum estimator was used to generate power spectra for all degrees and all azimuthal orders, up to l = 1000. We used a large number of tapers to reduce the realization noise, since at high degrees the individual modes blend into ridges and thus there is no reason to preserve a high spectral resolution. These power spectra were fitted for all degrees and all azimuthal orders, between l = 100 and l = 1000, and for all the orders with substantial amplitude. This fitting generated in excess of 5.2 × 10⁶ individual estimates of ridge frequencies, line widths, amplitudes, and asymmetries (singlets), corresponding to some 5700 multiplets (l, n). Fitting at high degrees yields ridge characteristics, which do not correspond to the underlying mode characteristics. We used sophisticated forward modeling to recover the best possible estimate of the underlying mode characteristics (mode frequencies, as well as line widths, amplitudes, and asymmetries). We describe in detail this modeling and its validation. The modeling has been extensively reviewed and refined, by including an iterative process to improve its input parameters to better match the observations. Also, the contribution of the leakage matrix to the accuracy of the procedure has been carefully assessed. We present the derived set of corrected mode characteristics, which includes not only frequencies, but line widths, asymmetries, and amplitudes. We present and discuss
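
    As an illustration of the multi-taper estimator mentioned above, the sketch below averages DPSS-tapered periodograms of a synthetic signal; the taper count and time-bandwidth product are arbitrary choices for the example, not those of the MDI analysis.

```python
# Hedged sketch of a multi-taper power-spectrum estimate: average periodograms
# computed with several DPSS (Slepian) tapers, trading spectral resolution for
# lower realization noise.
import numpy as np
from scipy.signal.windows import dpss

def multitaper_psd(x, n_tapers=8, nw=5.0):
    n = len(x)
    tapers = dpss(n, NW=nw, Kmax=n_tapers)           # shape (n_tapers, n)
    spectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
    return spectra.mean(axis=0)                       # average over tapers

rng = np.random.default_rng(1)
t = np.arange(4096)
signal = np.sin(2 * np.pi * 0.05 * t) + rng.standard_normal(t.size)
psd = multitaper_psd(signal)
print(np.argmax(psd) / t.size)                        # peak near 0.05 cycles/sample
```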

  10. Panel-based Genetic Diagnostic Testing for Inherited Eye Diseases is Highly Accurate and Reproducible and More Sensitive for Variant Detection Than Exome Sequencing

    PubMed Central

    Bujakowska, Kinga M.; Sousa, Maria E.; Fonseca-Kelly, Zoë D.; Taub, Daniel G.; Janessian, Maria; Wang, Dan Yi; Au, Elizabeth D.; Sims, Katherine B.; Sweetser, David A.; Fulton, Anne B.; Liu, Qin; Wiggs, Janey L.; Gai, Xiaowu; Pierce, Eric A.

    2015-01-01

    Purpose Next-generation sequencing (NGS) based methods are being adopted broadly for genetic diagnostic testing, but the performance characteristics of these techniques have not been fully defined with regard to test accuracy and reproducibility. Methods We developed a targeted enrichment and NGS approach for genetic diagnostic testing of patients with inherited eye disorders, including inherited retinal degenerations (IRDs), optic atrophy, and glaucoma. In preparation for providing this Genetic Eye Disease (GEDi) test on a CLIA-certified basis, we performed experiments to measure the sensitivity, specificity, and reproducibility, as well as the clinical sensitivity, of the test. Results The GEDi test is highly reproducible and accurate, with sensitivity and specificity for single nucleotide variant detection of 97.9% and 100%, respectively. The sensitivity for variant detection was notably better than the 88.3% achieved by whole exome sequencing (WES) using the same metrics, due to better coverage of targeted genes in the GEDi test compared to commercially available exome capture sets. Prospective testing of 192 patients with IRDs indicated that the clinical sensitivity of the GEDi test is high, with a diagnostic rate of 51%. Conclusion The data suggest that, based on quantified performance metrics, selective targeted enrichment is preferable to WES for genetic diagnostic testing. PMID:25412400
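
    The headline metrics above are simple ratios of call counts against a reference set; the numbers below are hypothetical and only illustrate the arithmetic.

```python
# Worked example of sensitivity/specificity from a variant-call comparison
# against a reference ("truth") set; the counts are invented, not GEDi data.
true_positives, false_negatives = 940, 20      # known variants called / missed
true_negatives, false_positives = 50000, 0     # reference-matching positions

sensitivity = true_positives / (true_positives + false_negatives)
specificity = true_negatives / (true_negatives + false_positives)
print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")
```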

  11. Reproducibility and observer variability of tissue phase mapping for the quantification of regional myocardial velocities.

    PubMed

    Lin, Kai; Chowdhary, Varun; Benzuly, Keith H; Yancy, Clyde W; Lomasney, Jon W; Rigolin, Vera H; Anderson, Allen S; Wilcox, Jane; Carr, James; Markl, Michael

    2016-08-01

    To systematically investigate the reproducibility of global and segmental left ventricular (LV) velocities derived from tissue phase mapping (TPM). Breath-held and ECG-synchronized TPM data (spatial/temporal resolution = 2 × 2 mm²/20.8 ms) were acquired in 18 healthy volunteers. To analyze scan-rescan variability, TPM was repeated in all subjects during a second visit separated by 16 ± 5 days. Data analysis included LV segmentation, and quantification of global and regional (AHA 16-segment model) metrics of LV function [velocity-time curves, systolic and diastolic peak and time-to-peak (TTP) velocities] for radial (Vr), long-axis (Vz) and circumferential (VΦ) LV velocities. Mean velocity time curves in basal, mid-ventricular, and apical locations showed highly similar LV motion patterns for all three velocity components (Vr, VΦ, Vz) for scan and rescan. No significant differences in either systolic or diastolic peak and TTP myocardial velocities were observed. Segmental analysis revealed similar regional peak Vr and Vz during both systole and diastole except for three LV segments (p = 0.045, p = 0.033, and p = 0.009). Excellent (p < 0.001) correlations between scan and rescan for peak Vr (R² = 0.92), peak Vz (R² = 0.90), radial TTP (R² = 0.91) and long-axis TTP (R² = 0.88) confirmed good agreement. Bland-Altman analysis demonstrated excellent intra-observer and good inter-observer analysis agreement but increased variability for long-axis peak velocities. TPM-based analysis of global and regional myocardial velocities can be performed with good reproducibility. Robustness of regional quantification of long-axis velocities was limited but spatial velocity distributions across the LV could reliably be replicated. PMID:27116238

  12. Towards accurate observation and modelling of Antarctic glacial isostatic adjustment

    NASA Astrophysics Data System (ADS)

    King, M.

    2012-04-01

    The response of the solid Earth to glacial mass changes, known as glacial isostatic adjustment (GIA), has received renewed attention over the past decade thanks to the Gravity Recovery and Climate Experiment (GRACE) satellite mission. GRACE measures Earth's gravity field every 30 days, but cannot partition surface mass changes, such as present-day cryospheric or hydrological change, from changes within the solid Earth, notably due to GIA. If GIA cannot be accurately modelled in a particular region the accuracy of GRACE estimates of ice mass balance for that region is compromised. This lecture will focus on Antarctica, where models of GIA are hugely uncertain due to weak constraints on ice loading history and Earth structure. In recent years, however, there has been a step-change in our ability to measure GIA uplift with the Global Positioning System (GPS), including widespread deployments of permanent GPS receivers as part of the International Polar Year (IPY) POLENET project. I will particularly focus on the Antarctic GPS velocity field and the confounding effect of elastic rebound due to present-day ice mass changes, and then describe the construction and calibration of a new Antarctic GIA model for application to GRACE data, as well as highlighting areas where further critical developments are required.

  13. Transparency and Reproducibility of Observational Cohort Studies Using Large Healthcare Databases.

    PubMed

    Wang, S V; Verpillat, P; Rassen, J A; Patrick, A; Garry, E M; Bartels, D B

    2016-03-01

    The scientific community and decision-makers are increasingly concerned about transparency and reproducibility of epidemiologic studies using longitudinal healthcare databases. We explored the extent to which published pharmacoepidemiologic studies using commercially available databases could be reproduced by other investigators. We identified a nonsystematic sample of 38 descriptive or comparative safety/effectiveness cohort studies. Seven studies were excluded from reproduction, five because of violation of fundamental design principles, and two because of grossly inadequate reporting. In the remaining studies, >1,000 patient characteristics and measures of association were reproduced with a high degree of accuracy (median differences between original and reproduction <2% and <0.1). An essential component of transparent and reproducible research with healthcare databases is more complete reporting of study implementation. Once reproducibility is achieved, the conversation can be elevated to assess whether suboptimal design choices led to avoidable bias and whether findings are replicable in other data sources. PMID:26690726

  14. An electrostatic mechanism closely reproducing observed behavior in the bacterial flagellar motor.

    PubMed

    Walz, D; Caplan, S R

    2000-02-01

    A mechanism coupling the transmembrane flow of protons to the rotation of the bacterial flagellum is studied. The coupling is accomplished by means of an array of tilted rows of positive and negative charges around the circumference of the rotor, which interacts with a linear array of proton binding sites in channels. We present a rigorous treatment of the electrostatic interactions using minimal assumptions. Interactions with the transition states are included, as well as proton-proton interactions in and between channels. In assigning values to the parameters of the model, experimentally determined structural characteristics of the motor have been used. According to the model, switching and pausing occur as a consequence of modest conformational changes in the rotor. In contrast to similar approaches developed earlier, this model closely reproduces a large number of experimental findings from different laboratories, including the nonlinear behavior of the torque-frequency relation in Escherichia coli, the stoichiometry of the system in Streptococcus, and the pH-dependence of swimming speed in Bacillus subtilis. PMID:10653777

  15. Excellent Intra- and Inter-Observer Reproducibility of Wrist Circumference Measurements in Obese Children and Adolescents.

    PubMed

    Campagna, Giuseppe; Zampetti, Simona; Gallozzi, Alessia; Giansanti, Sara; Chiesa, Claudio; Pacifico, Lucia; Buzzetti, Raffaella

    2016-01-01

    In a previous study, we found that wrist circumference, in particular its bone component, was associated with insulin resistance in a population of overweight/obese children. The aim of the present study was to evaluate the intra- and inter-operator variability in wrist circumference measurement in a population of obese children and adolescents. One hundred and two (54 male and 48 female) obese children and adolescents were consecutively enrolled. In all subjects wrist circumference was measured twice by two different operators to assess intra- and inter-operator variability. Statistical analysis was performed using SAS v.9.4 and JMP v.12. Measurements of wrist circumference showed excellent inter-operator reliability, with Intraclass Correlation Coefficients (ICC) of 0.96 and 0.97 for the first and the second measurement, respectively. The intra-operator reliability was also very strong, with a Concordance Correlation Coefficient (CCC) of 0.98 for both operators. The high reproducibility demonstrated in our results suggests that wrist circumference measurement, being safe, non-invasive, and repeatable, can be easily used in out-patient settings to identify youths with increased risk of insulin resistance. This can avoid testing the entire population of overweight/obese children for insulin resistance parameters. PMID:27294398
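
    Lin's concordance correlation coefficient quoted above can be computed directly from paired repeated measurements; the values below are invented for illustration.

```python
# Hedged sketch of the CCC (intra-operator agreement statistic) from two
# repeated wrist-circumference measurements; data are made up.
import numpy as np

def concordance_ccc(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    cov_xy = np.cov(x, y, bias=True)[0, 1]
    return 2 * cov_xy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

first  = [14.2, 15.0, 13.8, 16.1, 15.5, 14.9]   # cm, operator's first pass
second = [14.3, 14.9, 13.9, 16.0, 15.6, 15.0]   # cm, operator's second pass
print(f"CCC = {concordance_ccc(first, second):.3f}")
```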

  16. Can model observers be developed to reproduce radiologists' diagnostic performances? Our study says not so fast!

    NASA Astrophysics Data System (ADS)

    Lee, Juhun; Nishikawa, Robert M.; Reiser, Ingrid; Boone, John M.

    2016-03-01

    The purpose of this study was to determine radiologists' diagnostic performances on different image reconstruction algorithms that could be used to optimize image-based model observers. We included a total of 102 pathology-proven breast computed tomography (CT) cases (62 malignant). An iterative image reconstruction (IIR) algorithm was used to obtain 24 reconstructions with different image appearance for each image. Using quantitative image feature analysis, three IIRs and one clinical reconstruction of 50 lesions (25 malignant) were selected for a reader study. The reconstructions spanned a range of smooth-low noise to sharp-high noise image appearance. The trained classifiers' AUCs on the above reconstructions ranged from 0.61 (for smooth reconstruction) to 0.95 (for sharp reconstruction). Six experienced MQSA radiologists read 200 cases (50 lesions × 4 reconstructions) and provided the likelihood of malignancy of each lesion. Radiologists' diagnostic performances (AUC) ranged from 0.7 to 0.89. However, there was no agreement among the six radiologists on which image appearance yielded the highest diagnostic performance. Specifically, two radiologists indicated sharper image appearance was diagnostically superior, another two indicated smoother image appearance was diagnostically superior, and the remaining two indicated all image appearances were diagnostically similar to each other. Due to the poor agreement among radiologists on the diagnostic ranking of images, it may not be possible to develop a model observer for this particular imaging task.

  17. Inter-observer reproducibility of measurements of range of motion in patients with shoulder pain using a digital inclinometer

    PubMed Central

    de Winter, Andrea F; Heemskerk, Monique AMB; Terwee, Caroline B; Jans, Marielle P; Devillé, Walter; van Schaardenburg, Dirk-Jan; Scholten, Rob JPM; Bouter, Lex M

    2004-01-01

    Background Reproducible measurements of the range of motion are an important prerequisite for the interpretation of study results. The digital inclinometer is considered to be a useful instrument because it is inexpensive and easy to use. No previous study assessed inter-observer reproducibility of range of motion measurements with a digital inclinometer by physical therapists in a large sample of patients. Methods Two physical therapists independently measured the passive range of motion of the glenohumeral abduction and the external rotation in 155 patients with shoulder pain. Agreement was quantified by calculation of the mean differences between the observers and the standard deviation (SD) of this difference and the limits of agreement, defined as the mean difference ± 1.96*SD of this difference. Reliability was quantified by means of the intraclass correlation coefficient (ICC). Results The limits of agreement were 0.8 ± 19.6° for glenohumeral abduction and -4.6 ± 18.8° for external rotation (affected side) and quite similar for the contralateral side and the differences between sides. The percentage agreement within 10° for these measurements was 72% and 70%, respectively. The ICC ranged from 0.28 to 0.90 (0.83 and 0.90 for the affected side). Conclusions The inter-observer agreement was found to be poor. If individual patients are assessed by two different observers, differences in range of motion of less than 20–25 degrees cannot be distinguished from measurement error. In contrast, acceptable reliability was found for the inclinometric measurements of the affected side and the differences between the sides, indicating that the inclinometer can be used in studies in which groups are compared. PMID:15196309
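
    A minimal sketch of the agreement statistics used above (mean difference, its SD, and the 95% limits of agreement), computed on made-up paired measurements:

```python
import numpy as np

observer_a = np.array([95, 110, 80, 120, 100, 90, 105])   # degrees (hypothetical)
observer_b = np.array([100, 105, 92, 118, 96, 101, 99])

diff = observer_a - observer_b
mean_diff, sd_diff = diff.mean(), diff.std(ddof=1)
loa = (mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff)  # limits of agreement
within_10 = 100.0 * np.mean(np.abs(diff) <= 10)
print(f"mean difference = {mean_diff:.1f} deg, limits of agreement = "
      f"({loa[0]:.1f}, {loa[1]:.1f}) deg, within 10 deg: {within_10:.0f}%")
```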

  18. Reproducibility blues.

    PubMed

    Pulverer, Bernd

    2015-11-12

    Research findings advance science only if they are significant, reliable and reproducible. Scientists and journals must publish robust data in a way that renders it optimally reproducible. Reproducibility has to be incentivized and supported by the research infrastructure but without dampening innovation. PMID:26538323

  19. Do CMIP5 Climate Models Reproduce Observed Historical Trends in Temperature and Precipitation over the Continental United States?

    NASA Astrophysics Data System (ADS)

    Lee, J.; Loikith, P. C.; Waliser, D. E.; Kunkel, K.

    2015-12-01

    Monitoring trends in key climate variables, such as surface temperature and precipitation, is an integral part of the ongoing efforts of the United States National Climate Assessment (NCA). Positive trends in both temperature and precipitation have been observed over the 20th century over much of the Continental United States (CONUS); however, projections of future trends rely on climate model simulations. In order to have confidence in future projections of temperature and precipitation, it is crucial to evaluate the ability of current state-of-the-art climate models to reproduce observed historical trends. Towards this goal, trends in surface temperature and precipitation obtained from the NOAA nClimDiv 5 km gridded station observation-based product are compared to the suite of CMIP5 historical simulations over the CONUS region. The Regional Climate Model Evaluation System (RCMES), an analysis tool which supports the NCA by providing access to data and tools for regional climate model validation, is used to provide the comparisons between the models and observations. NASA TRMM precipitation data and MERRA surface temperature data are included in part of the analysis to assess how well satellite data and reanalysis compare to the nClimDiv station observation data.
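
    The core operation behind such an evaluation is comparing least-squares trends in observed and simulated series; the sketch below uses synthetic annual-mean series as stand-ins for the gridded observations and a CMIP5 historical run.

```python
# Hedged sketch: decadal trend from a least-squares linear fit, computed for
# an "observed" and a "modeled" series; both series are synthetic examples.
import numpy as np

def decadal_trend(years, values):
    slope = np.polyfit(years, values, 1)[0]        # units per year
    return 10.0 * slope                            # units per decade

years = np.arange(1901, 2001)
rng = np.random.default_rng(2)
observed = 0.007 * (years - years[0]) + rng.normal(0, 0.2, years.size)
modeled = 0.010 * (years - years[0]) + rng.normal(0, 0.2, years.size)

print(f"observed trend: {decadal_trend(years, observed):+.3f} K/decade")
print(f"model trend:    {decadal_trend(years, modeled):+.3f} K/decade")
```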

  20. Continuous nanoflow-scanning electrochemical microscopy: voltammetric characterization and application for accurate and reproducible imaging of enzyme-labeled protein microarrays.

    PubMed

    Kai, Tianhan; Chen, Shu; Monterroso, Estuardo; Zhou, Feimeng

    2015-04-21

    The coupling of scanning electrochemical microscopy (SECM) to a continuous nanoflow (CNF) system is accomplished with the use of a microconcentric ring electrode/injector probe. The gold microring electrode encapsulated by a glass sheath is robust and can be beveled and polished. The CNF system, comprising a precision gas displacement pump and a rotary valve, is capable of delivering solution to the center of the SECM probe in the range of 1-150 nL/min. Major advantages of the CNF-SECM imaging mode over the conventional SECM generation/collection (G/C) mode include higher imaging resolution, immunity from interferences by species in the bulk solution or at other sites of the substrate, elimination of the feedback current that could interfere with the G/C data interpretation, and versatility of initiating surface reactions/processes via introducing different reactants into the flowing stream. Parameters such as flow rates, probe/substrate separations, and collection efficiencies are examined and optimized. Higher resolution, reproducibility, and accuracy are demonstrated through the application of CNF-SECM to horseradish peroxidase (HRP)-amplified imaging of protein microarrays. By flowing H₂O₂ and ferrocenemethanol through the injector and detecting the surface-generated ferriceniummethanol, human IgG spots covered with HRP-labeled anti-human IgG can be detected in the range of 13 nM to 1.333 μM with a detection limit of 3.0 nM. In addition, consistent images of microarray spots for selective and high-density detection of analytes can be attained. PMID:25831146

  21. The NHLBI-Sponsored Consortium for preclinicAl assESsment of cARdioprotective Therapies (CAESAR): A New Paradigm for Rigorous, Accurate, and Reproducible Evaluation of Putative Infarct-Sparing Interventions in Mice, Rabbits, and Pigs

    PubMed Central

    Jones, Steven P.; Tang, Xian-Liang; Guo, Yiru; Steenbergen, Charles; Lefer, David J.; Kukreja, Rakesh C.; Kong, Maiying; Li, Qianhong; Bhushan, Shashi; Zhu, Xiaoping; Du, Junjie; Nong, Yibing; Stowers, Heather L.; Kondo, Kazuhisa; Hunt, Gregory N.; Goodchild, Traci T.; Orr, Adam; Chang, Carlos C.; Ockaili, Ramzi; Salloum, Fadi N.; Bolli, Roberto

    2014-01-01

    rigorous, accurate, and reproducible. PMID:25499773

  22. Reproducible Science

    PubMed Central

    Casadevall, Arturo; Fang, Ferric C.

    2010-01-01

    The reproducibility of an experimental result is a fundamental assumption in science. Yet, results that are merely confirmatory of previous findings are given low priority and can be difficult to publish. Furthermore, the complex and chaotic nature of biological systems imposes limitations on the replicability of scientific experiments. This essay explores the importance and limits of reproducibility in scientific manuscripts. PMID:20876290

  23. All-atom molecular dynamics analysis of multi-peptide systems reproduces peptide solubility in line with experimental observations

    PubMed Central

    Kuroda, Yutaka; Suenaga, Atsushi; Sato, Yuji; Kosuda, Satoshi; Taiji, Makoto

    2016-01-01

    In order to investigate the contribution of individual amino acids to protein and peptide solubility, we carried out 100 ns molecular dynamics (MD) simulations of 10⁶ Å³ cubic boxes containing ~3 × 10⁴ water molecules and 27 tetra-peptides regularly positioned at 23 Å from each other and composed of a single amino acid type for all natural amino acids but cysteine and glycine. The calculations were performed using Amber with a standard force field on a special purpose MDGRAPE-3 computer, without introducing any “artificial” hydrophobic interactions. Tetra-peptides composed of I, V, L, M, N, Q, F, W, Y, and H formed large amorphous clusters, and those containing A, P, S, and T formed smaller ones. Tetra-peptides made of D, E, K, and R did not cluster at all. These observations correlated well with experimental solubility tendencies as well as hydrophobicity scales with correlation coefficients of 0.5 to > 0.9. Repulsive Coulomb interactions were dominant in ensuring high solubility, whereas both Coulomb and van der Waals (vdW) energies contributed to the aggregations of low solubility amino acids. Overall, this very first all-atom molecular dynamics simulation of a multi-peptide system appears to reproduce the basic properties of peptide solubility, essentially in line with experimental observations. PMID:26817663

  24. Southern Hemisphere Observations Towards the Accurate Alignment of the VLBI Frame and the Future Gaia Frame

    NASA Astrophysics Data System (ADS)

    de Witt, Aletha; Quick, Jonathan; Bertarini, Alessandra; Ploetz, Christian; Bourda, Géraldine; Charlot, Patrick

    2014-04-01

    The Gaia space astrometry mission to be launched on 19 December 2013 will construct for the first time a dense and highly-accurate extragalactic reference frame directly at optical wavelengths based on positions of thousands of QSOs. For consistency with the present International Celestial Reference Frame (ICRF) built from VLBI data, it will be essential that the Gaia frame be aligned onto the ICRF with the highest possible accuracy. To this end, a VLBI observing program dedicated to identifying the most suitable radio sources for this alignment has been initiated using the VLBA and the EVN. In addition, VLBI observations of suitable ICRF2 sources are being strengthened using the IVS network, leading to a total of 314 link sources. The purpose of this proposal is to extend such observing programs to the southern hemisphere since the distribution of the present link sources is very sparse south of -30 degrees declination due to the geographical location of the VLBI arrays used for this project. As a first stage, we propose to observe 48 optically-bright radio sources in the far south using the LBA supplemented with the antennas of Warkworth (New Zealand) and O'Higgins (Antarctica). Our goal is to image these potential link sources and determine those that are the most point-like on VLBI scales and therefore suitable for the Gaia-ICRF alignment. We anticipate that further observations may be necessary in the future to extend the sample and refine the astrometry of these sources.

  25. New accurate ephemerides for the Galilean satellites of Jupiter. II. Fitting the observations

    NASA Astrophysics Data System (ADS)

    Lainey, V.; Arlot, J. E.; Vienne, A.

    2004-11-01

    We present a new model of the four Galilean satellites Io, Europa, Ganymede and Callisto, able to deliver accurate ephemerides over a very long time span (several centuries). In the first paper (Lainey et al. 2004, A&A, 420, 1171) we gave the equations of the dynamical model. Here we present the fit of this model to the observations, covering more than one century starting from 1891. Our ephemerides, based on this first fit called L1, are available on the web page of the IMCCE at the URL http://www.imcce.fr/ephemeride_eng.html. Tables 4-7 are only available in electronic form at the CDS via anonymous ftp to cdsarc.u-strasbg.fr (130.79.128.5) or via http://cdsweb.u-strasbg.fr/cgi-bin/qcat?J/A+A/427/371

  26. Can quasiclassical trajectory calculations reproduce the extreme kinetic isotope effect observed in the muonic isotopologues of the H + H2 reaction?

    NASA Astrophysics Data System (ADS)

    Jambrina, P. G.; García, Ernesto; Herrero, Víctor J.; Sáez-Rábanos, Vicente; Aoiz, F. J.

    2011-07-01

    Rate coefficients for the mass extreme isotopologues of the H + H2 reaction, namely, Mu + H2, where Mu is muonium, and Heμ + H2, where Heμ is a He atom in which one of the electrons has been replaced by a negative muon, have been calculated in the 200-1000 K temperature range by means of accurate quantum mechanical (QM) and quasiclassical trajectory (QCT) calculations and compared with the experimental and theoretical results recently reported by Fleming et al. [Science 331, 448 (2011)], 10.1126/science.1199421. The QCT calculations can reproduce the experimental and QM rate coefficients and kinetic isotope effect (KIE), kMu(T)/kHeμ(T), if the Gaussian binning procedure (QCT-GB) - weighting the trajectories according to their proximity to the right quantal vibrational action - is applied. The analysis of the results shows that the large zero point energy of the MuH product is the key factor for the large KIE observed.
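
    A rough sketch of the Gaussian binning idea described above: each trajectory is weighted by a narrow Gaussian in the distance between its final classical vibrational action and the nearest integer quantum number, rather than counted with unit weight. The actions and Gaussian width below are illustrative only.

```python
import numpy as np

def gaussian_binning_weights(actions, width=0.1):
    # weight trajectories by proximity of their final vibrational action to an
    # integer quantum number (width is an illustrative choice)
    nearest = np.rint(actions)
    return np.exp(-((actions - nearest) ** 2) / (2 * width ** 2)), nearest

actions = np.array([0.05, 0.48, 0.93, 1.10, 1.52, 2.02])   # made-up trajectory batch
weights, states = gaussian_binning_weights(actions)
for v in np.unique(states):
    sel = states == v
    print(f"v = {int(v)}: standard-binning count = {sel.sum()}, "
          f"GB-weighted count = {weights[sel].sum():.2f}")
```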

  27. Accurate CT-MR image registration for deep brain stimulation: a multi-observer evaluation study

    NASA Astrophysics Data System (ADS)

    Rühaak, Jan; Derksen, Alexander; Heldmann, Stefan; Hallmann, Marc; Meine, Hans

    2015-03-01

    Since the first clinical interventions in the late 1980s, Deep Brain Stimulation (DBS) of the subthalamic nucleus has evolved into a very effective treatment option for patients with severe Parkinson's disease. DBS entails the implantation of an electrode that performs high frequency stimulations to a target area deep inside the brain. A very accurate placement of the electrode is a prerequisite for positive therapy outcome. The assessment of the intervention result is of central importance in DBS treatment and involves the registration of pre- and postinterventional scans. In this paper, we present an image processing pipeline for highly accurate registration of postoperative CT to preoperative MR. Our method consists of two steps: a fully automatic pre-alignment using a detection of the skull tip in the CT based on fuzzy connectedness, and an intensity-based rigid registration. The registration uses the Normalized Gradient Fields distance measure in a multilevel Gauss-Newton optimization framework and focuses on a region around the subthalamic nucleus in the MR. The accuracy of our method was extensively evaluated on 20 DBS datasets from clinical routine and compared with manual expert registrations. For each dataset, three independent registrations were available, thus allowing to relate algorithmic with expert performance. Our method achieved an average registration error of 0.95mm in the target region around the subthalamic nucleus as compared to an inter-observer variability of 1.12 mm. Together with the short registration time of about five seconds on average, our method forms a very attractive package that can be considered ready for clinical use.
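
    A hedged sketch of the Normalized Gradient Fields idea mentioned above: the measure rewards locations where the two images have parallel or anti-parallel intensity gradients, which is what makes it usable across CT and MR contrasts. The edge parameter and the toy images below are illustrative, not the paper's implementation.

```python
import numpy as np

def ngf_distance(fixed, moving, eps=1e-2):
    gf = np.gradient(fixed)                        # per-axis gradient images
    gm = np.gradient(moving)
    dot = sum(a * b for a, b in zip(gf, gm))
    nf = np.sqrt(sum(a * a for a in gf) + eps ** 2)
    nm = np.sqrt(sum(b * b for b in gm) + eps ** 2)
    # per-pixel term is ~0 where gradients are (anti-)parallel, ~1 elsewhere
    return np.mean(1.0 - (dot / (nf * nm)) ** 2)

x, y = np.meshgrid(np.linspace(-1, 1, 64), np.linspace(-1, 1, 64))
disk = (x ** 2 + y ** 2 < 0.5).astype(float)       # toy "CT"
inverted = 1.0 - disk                              # toy "MR": contrast flipped
shifted = np.roll(disk, 6, axis=0)                 # misaligned copy

print(ngf_distance(disk, inverted))                # lower: edges still line up
print(ngf_distance(disk, shifted))                 # higher: edges displaced
```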

  28. Elusive reproducibility.

    PubMed

    Gori, Gio Batta

    2014-08-01

    Reproducibility remains a mirage for many biomedical studies because inherent experimental uncertainties generate idiosyncratic outcomes. The authentication and error rates of primary empirical data are often elusive, while multifactorial confounders beset experimental setups. Substantive methodological remedies are difficult to conceive, signifying that many biomedical studies yield more or less plausible results, depending on the attending uncertainties. Real life applications of those results remain problematic, with important exceptions for counterfactual field validations of strong experimental signals, notably for some vaccines and drugs, and for certain safety and occupational measures. It is argued that industrial, commercial and public policies and regulations could not ethically rely on unreliable biomedical results; rather, they should be rationally grounded on transparent cost-benefit tradeoffs. PMID:24882687

  29. Applying an accurate spherical model to gamma-ray burst afterglow observations

    NASA Astrophysics Data System (ADS)

    Leventis, K.; van der Horst, A. J.; van Eerten, H. J.; Wijers, R. A. M. J.

    2013-05-01

    We present results of model fits to afterglow data sets of GRB 970508, GRB 980703 and GRB 070125, characterized by long and broad-band coverage. The model assumes synchrotron radiation (including self-absorption) from a spherical adiabatic blast wave and consists of analytic flux prescriptions based on numerical results. For the first time it combines the accuracy of hydrodynamic simulations through different stages of the outflow dynamics with the flexibility of simple heuristic formulas. The prescriptions are especially geared towards accurate description of the dynamical transition of the outflow from relativistic to Newtonian velocities in an arbitrary power-law density environment. We show that the spherical model can accurately describe the data only in the case of GRB 970508, for which we find a circumburst medium density n ∝ r⁻². We investigate in detail the implied spectra and physical parameters of that burst. For the microphysics we show evidence for equipartition between the fraction of energy density carried by relativistic electrons and magnetic field. We also find that for the blast wave to be adiabatic, the fraction of electrons accelerated at the shock has to be smaller than 1. We present best-fitting parameters for the afterglows of all three bursts, including uncertainties in the parameters of GRB 970508, and compare the inferred values to those obtained by different authors.

  30. Fabrication of reproducible sub-5 nm nanogaps by a focused ion beam and observation of Fowler-Nordheim tunneling

    SciTech Connect

    Li, Hu; Wani, Ishtiaq H.; Hayat, Aqib; Leifer, Klaus; Jafri, S. Hassan M.

    2015-09-07

    Creating a stable, high-resistance sub-5 nm nanogap between conductive electrodes is one of the major challenges in the device fabrication of nano-objects. Gap sizes of 20 nm and above can be fabricated reproducibly by the precise focusing of the ion beam and careful milling of the metallic lines. Here, by tuning ion dosages starting from 4.6 × 10¹⁰ ions/cm and above, reproducible nanogaps with sub-5 nm sizes are milled with a focused ion beam. The resistance as a function of gap dimension shows an exponential behavior, and a Fowler-Nordheim tunneling effect was observed in nanoelectrodes with sub-5 nm nanogaps. The application of the Simmons model to the milled nanogaps and the electrical analysis indicates that the minimum nanogap size approaches 2.3 nm.
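
    The standard check implied above is that in the Fowler-Nordheim regime the current follows I ∝ V²·exp(-b/V), so ln(I/V²) versus 1/V is linear; the sketch below runs that analysis on synthetic I-V points with an arbitrary slope constant.

```python
# Hedged sketch of a Fowler-Nordheim plot check on made-up I-V data; b is an
# illustrative constant, not a value extracted from the devices above.
import numpy as np

b = 12.0                                           # illustrative slope constant (V)
voltage = np.linspace(1.0, 5.0, 20)
current = 1e-9 * voltage ** 2 * np.exp(-b / voltage)

fn_y = np.log(current / voltage ** 2)              # FN coordinates
fn_x = 1.0 / voltage
slope, intercept = np.polyfit(fn_x, fn_y, 1)
print(f"FN-plot slope = {slope:.2f} V (linear plot => tunneling-dominated transport)")
```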

  31. How well does WWLLN reproduce the satellite-observed distribution of flashes during the 2007-2012 time period?

    NASA Astrophysics Data System (ADS)

    Allen, D. J.; Pickering, K. E.; Ring, A.; Holzworth, R. H.

    2013-12-01

    Lightning is the dominant source of nitrogen oxides (NOx) involved in the production of ozone in the middle and upper troposphere in the tropics and in summer in the midlatitudes. Therefore it is imperative that the lightning NOx (LNOx) source strength per flash be better constrained. This process requires accurate information on the location and timing of lightning flashes. In the past fifteen years satellite-based lightning monitoring by the Optical Transient Detector (OTD) and Lightning Imaging Sensor (LIS) has greatly increased our understanding of the global distribution of lightning as a function of season and time-of-day. However, detailed information at higher temporal resolutions is only available for limited regions where ground-based networks such as the United States National Lightning Detection Network (NLDN) exist. In 2004, the ground-based World Wide Lightning Location Network (WWLLN) was formed with the goal of providing continuous flash rate information over the entire globe. It detects very low frequency (VLF) radio waves emitted by lightning with a detection efficiency (DE) that varies with stroke energy, time-of-day, surface type, and network coverage. This study evaluated the DE of WWLLN strokes relative to climatological OTD/LIS flashes using data from the 2007 to 2012 time period, a period during which the mean number of working sensors increased from 28 to 53. The analysis revealed that the mean global DE increased from 5% in 2007 to 13% in 2012. Regional variations were substantial with mean 2012 DEs of 5-10% over much of Argentina, Africa, and Asia and 15-30% over much of the Atlantic, Pacific, and Indian Oceans, the United States and the Maritime Continent. Detection-efficiency adjusted WWLLN flash rates were then compared to NLDN-based flash rates. Spatial correlations for individual summer months ranged from 0.66 to 0.93. Temporal correlations are currently being examined for regions of the U.S. and will also be shown.

  32. Reproducing cloud and boundary layer structure observed in MAGIC campaign using ship-following large-eddy simulations

    NASA Astrophysics Data System (ADS)

    McGibbon, J.; Bretherton, C. S.

    2015-12-01

    The 2012-2013 MAGIC shipborne deployment of the ARM mobile facility sampled a broad range of subtropical marine stratocumulus (Sc), cumulus (Cu), and transition regimes during cruises between Long Beach, CA, and Honolulu, HI. Ship-following large-eddy simulations (LES) of selected cruise legs of 4-5 days are compared with a broad suite of observations of cloud structure and radiative properties taken on the Horizon Spirit ship. This quantitative comparison across a realistic range of conditions assesses the suitability of LES for simulating the sensitivity of such cloud regimes to climate perturbations, and for guiding the development of cloud and boundary layer parameterizations in global climate and weather forecast models. The System for Atmospheric Modeling (SAM) LES is used with a small, doubly-periodic domain and variable vertical resolution, initialized using thermodynamic radiosonde profiles near the start of each cruise leg. Sea-surface temperatures are prescribed from observations, and ECMWF analyses are used to derive time-varying geostrophic wind, ship-relative large-scale advective forcing, and large-scale vertical velocity. ECMWF vertical velocities are adjusted to keep the temperature profile close to radiosonde profiles with a relaxation timescale of 1 day. The ship-measured accumulation-mode aerosol concentration is assumed throughout the boundary layer for nucleation of cloud droplets. The ship-following approach allows efficient comparison of model output with a broad suite of ship-based observations. The simulations cannot be expected to match the observations on timescales less than three hours because of cloud-scale and mesoscale sampling variability. Nevertheless, a preliminary sample of eleven 2D runs of different legs predicts daily mean cloud fraction and surface longwave radiation with negligible systematic bias and correlation coefficients of 0.33 and 0.53, respectively. Full-leg 3D simulations will also be evaluated and presented.

  33. Can All Cosmological Observations Be Accurately Interpreted with a Unique Geometry?

    NASA Astrophysics Data System (ADS)

    Fleury, Pierre; Dupuy, Hélène; Uzan, Jean-Philippe

    2013-08-01

    The recent analysis of the Planck results reveals a tension between the best fits for (Ωm0, H0) derived from the cosmic microwave background or baryonic acoustic oscillations on the one hand, and the Hubble diagram on the other hand. These observations probe the Universe on very different scales since they involve light beams of very different angular sizes; hence, the tension between them may indicate that they should not be interpreted the same way. More precisely, this Letter questions the accuracy of using only the (perturbed) Friedmann-Lemaître geometry to interpret all the cosmological observations, regardless of their angular or spatial resolution. We show that using an inhomogeneous “Swiss-cheese” model to interpret the Hubble diagram allows us to reconcile the inferred value of Ωm0 with the Planck results. Such an approach does not require us to invoke new physics nor to violate the Copernican principle.

  34. Accurate stellar masses for SB2 components: Interferometric observations for Gaia validation

    NASA Astrophysics Data System (ADS)

    Halbwachs, J.-L.; Boffin, H. M. J.; Le Bouquin, J.-B.; Famaey, B.; Salomon, J.-B.; Arenou, F.; Pourbaix, D.; Anthonioz, F.; Grellmann, R.; Guieu, S.; Guillout, P.; Jorissen, A.; Kiefer, F.; Lebreton, Y.; Mazeh, T.; Nebot Gómez-Morán, A.; Sana, H.; Tal-Or, L.

    2015-12-01

    A sample of about 70 double-lined spectroscopic binaries (SB2s) is followed with radial velocity (RV) measurements, in order to derive the masses of their components when the astrometric measurements of Gaia become available. A subset of 6 SB2s was observed interferometrically with VLTI/PIONIER, and the components were separated for each binary. The RV measurements already obtained were combined with the interferometric observations and the masses of the components were derived. The accuracies of the 12 masses are presently between 0.4 and 7%, but will be further improved in the future. These masses will be used to validate the masses which will be obtained from Gaia. In addition, the parallaxes derived from the combined visual+spectroscopic orbits are compared to those of Hipparcos, and a mass-luminosity relation is derived in the infrared H band.
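
    The mass determination combines the spectroscopic orbit with an inclination from the visual (interferometric) orbit via the textbook Keplerian relations; the orbital elements below are made up for illustration.

```python
# Hedged sketch using standard SB2 relations; K1, K2, P, e, and i are invented.
import numpy as np
from astropy import units as u, constants as const

P = 120.0 * u.day
e = 0.25
K1, K2 = 22.0 * u.km / u.s, 28.0 * u.km / u.s     # spectroscopic semi-amplitudes
i = 67.0 * u.deg                                   # inclination from the visual orbit

sin3i = np.sin(i) ** 3
common = P * (K1 + K2) ** 2 * (1 - e ** 2) ** 1.5 / (2 * np.pi * const.G)
m1 = (common * K2 / sin3i).to(u.Msun)              # M1 from K2, M2 from K1
m2 = (common * K1 / sin3i).to(u.Msun)
print(f"M1 = {m1:.2f}, M2 = {m2:.2f}")
```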

  35. How accurately do drivers evaluate their own driving behavior? An on-road observational study.

    PubMed

    Amado, Sonia; Arıkan, Elvan; Kaça, Gülin; Koyuncu, Mehmet; Turkan, B Nilay

    2014-02-01

    Self-assessment of driving skills has become a noteworthy research subject in traffic psychology, since by knowing one's strengths and weaknesses, drivers can take efficient compensatory action to moderate risk and to ensure safety in hazardous environments. The current study aims to investigate drivers' self-conception of their own driving skills and behavior in relation to expert evaluations of their actual driving, using a naturalistic and systematic observation method during an actual on-road driving session, and to assess different aspects of driving via comprehensive scales sensitive to specific aspects of driving. Male participants aged 19-63 years (N=158) attended an on-road driving session lasting approximately 80 min (45 km). During the driving session, drivers' errors and violations were recorded by an expert observer. At the end of the driving session, observers completed the driver evaluation questionnaire, while drivers completed the driving self-evaluation questionnaire and the Driver Behavior Questionnaire (DBQ). Low to moderate correlations were found between driver and observer evaluations of driving skills and behavior, mainly on errors and violations of speed and traffic lights. Furthermore, the robust finding that drivers evaluate their driving performance as better than the expert does was replicated. Over-positive appraisal was higher among drivers with higher error/violation scores and among those evaluated by the expert as "unsafe". We suggest that the traffic environment might be regulated by increasing feedback indicators of errors and violations, which in turn might improve drivers' insight into their own driving performance. Improving self-awareness by training and feedback sessions might play a key role in reducing the probability of risk in their driving activity.

  36. Extracting Accurate and Precise Topography from LROC Narrow Angle Camera Stereo Observations

    NASA Astrophysics Data System (ADS)

    Henriksen, M. R.; Manheim, M. R.; Speyerer, E. J.; Robinson, M. S.; LROC Team

    2016-06-01

    The Lunar Reconnaissance Orbiter Camera (LROC) includes two identical Narrow Angle Cameras (NAC) that acquire meter-scale imaging. Stereo observations are acquired by imaging from two or more orbits, including at least one off-nadir slew. Digital terrain models (DTMs) generated from the stereo observations are controlled to Lunar Orbiter Laser Altimeter (LOLA) elevation profiles. With current processing methods, the DTMs have absolute accuracies commensurate with the uncertainties of the LOLA profiles (~10 m horizontally and ~1 m vertically) and relative horizontal and vertical precisions better than the pixel scale of the DTMs (2 to 5 m). The NAC stereo pairs and derived DTMs represent an invaluable tool for science and exploration purposes. We computed slope statistics from 81 highland and 31 mare DTMs across a range of baselines. Overlapping DTMs of single stereo sets were also combined to form larger area DTM mosaics, enabling detailed characterization of large geomorphic features and providing a key resource for future exploration planning. Currently, two percent of the lunar surface is imaged in NAC stereo, and continued acquisition of stereo observations will serve to strengthen our knowledge of the Moon and the geologic processes that occur on all the terrestrial planets.
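
    Slope statistics over a chosen baseline, as computed above, amount to differencing the DTM over that horizontal distance; the sketch below does this for a synthetic grid with an assumed 2 m pixel scale.

```python
# Hedged sketch: baseline slope statistics from a gridded DTM; terrain and
# pixel scale are illustrative stand-ins, not NAC DTM data.
import numpy as np

def slope_deg(dtm, pixel_m, baseline_px=1):
    dz_x = dtm[:, baseline_px:] - dtm[:, :-baseline_px]
    dz_y = dtm[baseline_px:, :] - dtm[:-baseline_px, :]
    run = baseline_px * pixel_m
    rise = np.hypot(dz_x[: dz_y.shape[0], :], dz_y[:, : dz_x.shape[1]])
    return np.degrees(np.arctan(rise / run))

rng = np.random.default_rng(3)
dtm = np.cumsum(rng.normal(0, 0.5, (256, 256)), axis=0)    # synthetic terrain (m)
s = slope_deg(dtm, pixel_m=2.0, baseline_px=10)            # ~20 m baseline
print(f"median slope = {np.median(s):.1f} deg, "
      f"RMS slope = {np.sqrt(np.mean(s ** 2)):.1f} deg")
```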

  2. REPRODUCING THE STELLAR MASS/HALO MASS RELATION IN SIMULATED ΛCDM GALAXIES: THEORY VERSUS OBSERVATIONAL ESTIMATES

    SciTech Connect

    Munshi, Ferah; Governato, F.; Loebman, S.; Quinn, T.; Brooks, A. M.; Christensen, C.; Shen, S.; Moster, B.; Wadsley, J.

    2013-03-20

    We examine the present-day total stellar-to-halo mass (SHM) ratio as a function of halo mass for a new sample of simulated field galaxies using fully cosmological, ΛCDM, high-resolution SPH + N-body simulations. These simulations include an explicit treatment of metal-line cooling, dust and self-shielding, H2-based star formation (SF), and supernova-driven gas outflows. The 18 simulated halos have masses ranging from a few times 10^8 to nearly 10^12 M_Sun. At z = 0, our simulated galaxies have a baryon content and morphology typical of field galaxies. Over a stellar mass range of 2.2 × 10^3 to 4.5 × 10^10 M_Sun we find extremely good agreement between the SHM ratio in simulations and the present-day predictions from the statistical abundance matching technique presented in Moster et al. This improvement over past simulations is due to a number of systematic factors, each decreasing the SHM ratios: (1) gas outflows that reduce the overall SF efficiency but allow for the formation of a cold gas component; (2) estimating the stellar masses of simulated galaxies using artificial observations and photometric techniques similar to those used in observations; and (3) accounting for a systematic, up to 30% overestimate in total halo masses in DM-only simulations, due to the neglect of baryon loss over cosmic times. Our analysis suggests that stellar mass estimates based on photometric magnitudes can underestimate the contribution of old stellar populations to the total stellar mass, leading to stellar mass errors of up to 50% for individual galaxies. These results highlight that implementing a realistic high density threshold for SF considerably reduces the overall SF efficiency due to more effective feedback. However, we show that in order to reduce the perceived tension between the SF efficiency in galaxy formation models and in real galaxies, it is very important to use proper techniques to

  3. OBSERVING SIMULATED PROTOSTARS WITH OUTFLOWS: HOW ACCURATE ARE PROTOSTELLAR PROPERTIES INFERRED FROM SEDs?

    SciTech Connect

    Offner, Stella S. R.; Robitaille, Thomas P.; Hansen, Charles E.; Klein, Richard I.; McKee, Christopher F.

    2012-07-10

    The properties of unresolved protostars and their local environment are frequently inferred from spectral energy distributions (SEDs) using radiative transfer modeling. In this paper, we use synthetic observations of realistic star formation simulations to evaluate the accuracy of properties inferred from fitting model SEDs to observations. We use ORION, an adaptive mesh refinement (AMR) three-dimensional gravito-radiation-hydrodynamics code, to simulate low-mass star formation in a turbulent molecular cloud including the effects of protostellar outflows. To obtain the dust temperature distribution and SEDs of the forming protostars, we post-process the simulations using HYPERION, a state-of-the-art Monte Carlo radiative transfer code. We find that the ORION and HYPERION dust temperatures typically agree within a factor of two. We compare synthetic SEDs of embedded protostars for a range of evolutionary times, simulation resolutions, aperture sizes, and viewing angles. We demonstrate that complex, asymmetric gas morphology leads to a variety of classifications for individual objects as a function of viewing angle. We derive best-fit source parameters for each SED through comparison with a pre-computed grid of radiative transfer models. While the SED models correctly identify the evolutionary stage of the synthetic sources as embedded protostars, we show that the disk and stellar parameters can be very discrepant from the simulated values, which is expected since the disk and central source are obscured by the protostellar envelope. Parameters such as the stellar accretion rate, stellar mass, and disk mass show better agreement, but can still deviate significantly, and the agreement may in some cases be artificially good due to the limited range of parameters in the set of model SEDs. Lack of correlation between the model and simulation properties in many individual instances cautions against overinterpreting properties inferred from SEDs for unresolved protostellar

  4. Observation-driven adaptive differential evolution and its application to accurate and smooth bronchoscope three-dimensional motion tracking.

    PubMed

    Luo, Xiongbiao; Wan, Ying; He, Xiangjian; Mori, Kensaku

    2015-08-01

    This paper proposes an observation-driven adaptive differential evolution algorithm that fuses bronchoscopic video sequences, electromagnetic sensor measurements, and computed tomography images for accurate and smooth bronchoscope three-dimensional motion tracking. Currently, an electromagnetic tracker with a position sensor fixed at the bronchoscope tip is commonly used to estimate bronchoscope movements. The large tracking error from directly using sensor measurements, which may be heavily degraded by patient respiratory motion and the magnetic field distortion of the tracker, limits clinical applications. How to effectively use sensor measurements for precise and stable bronchoscope electromagnetic tracking remains challenging. We here exploit an observation-driven adaptive differential evolution framework to address this challenge and boost the tracking accuracy and smoothness. Two points distinguish our framework from other adaptive differential evolution methods: (1) the current observation, including sensor measurements and bronchoscopic video images, is used in the mutation equation and the fitness computation, respectively; and (2) the mutation factor and the crossover rate are determined adaptively on the basis of the current image observation. The experimental results demonstrate that our framework provides much more accurate and smooth bronchoscope tracking than the state-of-the-art methods. Our approach reduces the tracking error from 3.96 to 2.89 mm, improves the tracking smoothness from 4.08 to 1.62 mm, and increases the visual quality from 0.707 to 0.741. PMID:25660001
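
    As an illustration of the general idea only (not the authors' implementation), the sketch below shows a minimal differential-evolution loop in which the mutation factor F and crossover rate CR are modulated each generation by an observation score; the `fitness` and `obs_score` callables are hypothetical stand-ins for the paper's image/sensor terms.

```python
import numpy as np

def adaptive_de(fitness, bounds, obs_score, pop_size=30, n_gen=100, seed=0):
    """Minimal differential-evolution sketch with observation-driven F and CR.

    fitness   : callable mapping a parameter vector to a scalar cost
    bounds    : (dim, 2) array of lower/upper parameter bounds
    obs_score : callable returning a value in [0, 1] rating how well the
                current best state explains the latest observation
                (hypothetical stand-in for the paper's video/sensor terms)
    """
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)
    dim = len(bounds)
    pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(pop_size, dim))
    cost = np.array([fitness(x) for x in pop])

    for _ in range(n_gen):
        best = pop[cost.argmin()]
        s = obs_score(best)            # 1.0 = observation well explained
        F = 0.4 + 0.5 * (1.0 - s)      # explore more when the fit is poor
        CR = 0.5 + 0.4 * s             # exploit more when the fit is good
        for i in range(pop_size):
            idx = rng.choice(np.delete(np.arange(pop_size), i), 3, replace=False)
            a, b, c = pop[idx]
            mutant = np.clip(a + F * (b - c), bounds[:, 0], bounds[:, 1])
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True          # keep at least one mutant gene
            trial = np.where(cross, mutant, pop[i])
            c_trial = fitness(trial)
            if c_trial < cost[i]:
                pop[i], cost[i] = trial, c_trial
    return pop[cost.argmin()], cost.min()
```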

  5. Reproducing the observed energy-dependent structure of Earth's electron radiation belts during storm recovery with an event-specific diffusion model

    NASA Astrophysics Data System (ADS)

    Ripoll, J.-F.; Reeves, G. D.; Cunningham, G. S.; Loridan, V.; Denton, M.; Santolík, O.; Kurth, W. S.; Kletzing, C. A.; Turner, D. L.; Henderson, M. G.; Ukhorskiy, A. Y.

    2016-06-01

    We present dynamic simulations of energy-dependent losses in the radiation belt "slot region" and the formation of the two-belt structure for the quiet days after the 1 March storm. The simulations combine radial diffusion with a realistic scattering model, based on data-driven, spatially and temporally resolved whistler-mode hiss wave observations from the Van Allen Probes satellites. The simulations reproduce Van Allen Probes observations for all energies and L shells (2-6), including (a) the strong energy dependence of the radiation belt dynamics, (b) an energy-dependent outer boundary to the inner zone that extends to higher L shells at lower energies, and (c) an "S-shaped" energy-dependent inner boundary to the outer zone that results from the competition between diffusive radial transport and losses. We find that the characteristic energy-dependent structure of the radiation belts and slot region is dynamic and can be formed gradually in ~15 days, although the "S shape" can also be reproduced by assuming equilibrium conditions. The highest-energy electrons (E > 300 keV) of the inner region of the outer belt (L ~ 4-5) also constantly decay, demonstrating that hiss wave scattering affects the outer belt during times of extended plasmasphere. Through these simulations, we explain the full structure in energy and L shell of the belts and the slot formation by hiss scattering during storm recovery. We show the power and complexity of looking dynamically at the effects over all energies and L shells and the need for using data-driven and event-specific conditions.

  6. Evaluation of NASA's MERRA Precipitation Product in Reproducing the Observed Trend and Distribution of Extreme Precipitation Events in the United States

    NASA Technical Reports Server (NTRS)

    Ashouri, Hamed; Sorooshian, Soroosh; Hsu, Kuo-Lin; Bosilovich, Michael G.; Lee, Jaechoul; Wehner, Michael F.; Collow, Allison

    2016-01-01

    This study evaluates the performance of NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) precipitation product in reproducing the trend and distribution of extreme precipitation events. Utilizing extreme value theory, time-invariant and time-variant extreme value distributions are developed to model the trends and changes in the patterns of extreme precipitation events over the contiguous United States during 1979-2010. The Climate Prediction Center (CPC) U.S. Unified gridded observation data are used as the observational dataset. The CPC analysis shows that the eastern and western parts of the United States are experiencing positive and negative trends in annual maxima, respectively. The continental-scale patterns of change found in MERRA seem to reasonably mirror the observed patterns of change found in CPC. This was not expected a priori, given the difficulty in constraining precipitation in reanalysis products. MERRA tends to overestimate the frequency at which the 99th percentile of precipitation is exceeded because this threshold tends to be lower in MERRA, making it easier to exceed. This feature is dominant during the summer months. MERRA tends to reproduce the spatial patterns of the scale and location parameters of the generalized extreme value and generalized Pareto distributions. However, MERRA underestimates these parameters, particularly over the Gulf Coast states, leading to lower magnitudes in extreme precipitation events. Two issues in MERRA are identified: 1) MERRA shows a spurious negative trend in Nebraska and Kansas, which is most likely related to changes in the satellite observing system over time that have apparently affected the water cycle in the central United States, and 2) the patterns of positive trend over the Gulf Coast states and along the East Coast seem to be correlated with the tropical cyclones in these regions. The analysis of the trends in the seasonal precipitation extremes indicates that
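
    For readers unfamiliar with the extreme-value machinery mentioned above, the following sketch (with synthetic stand-in data, not CPC or MERRA values) shows how a time-invariant GEV can be fitted to annual maxima with scipy and how fitted location/scale parameters and return levels from two products might be compared.

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic stand-ins for 32 years (1979-2010) of annual maximum daily
# precipitation (mm) at one grid cell from an "observed" and a "reanalysis"
# product; the distribution parameters below are arbitrary placeholders.
obs_annmax = genextreme.rvs(c=-0.1, loc=60.0, scale=15.0, size=32, random_state=1)
rea_annmax = genextreme.rvs(c=-0.1, loc=50.0, scale=12.0, size=32, random_state=2)

for name, x in [("observed", obs_annmax), ("reanalysis", rea_annmax)]:
    # Time-invariant GEV fit; note scipy's shape c is the negative of the usual xi.
    c, loc, scale = genextreme.fit(x)
    rl20 = genextreme.isf(1.0 / 20.0, c, loc=loc, scale=scale)  # 20-year return level
    print(f"{name:10s} location={loc:5.1f} mm  scale={scale:4.1f} mm  20-yr level={rl20:5.1f} mm")
```

    A lower fitted location and scale in the reanalysis series, as in this toy example, translates directly into the weaker extreme-precipitation magnitudes discussed in the abstract.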

  7. When continuous observations just won't do: developing accurate and efficient sampling strategies for the laying hen.

    PubMed

    Daigle, Courtney L; Siegford, Janice M

    2014-03-01

    Continuous observation is the most accurate way to determine animals' actual time budget and can provide a 'gold standard' representation of resource use, behavior frequency, and duration. Continuous observation is useful for capturing behaviors that are of short duration or occur infrequently. However, collecting continuous data is labor intensive and time consuming, making multiple individual or long-term data collection difficult. Six non-cage laying hens were video recorded for 15 h and behavioral data collected every 2 s were compared with data collected using scan sampling intervals of 5, 10, 15, 30, and 60 min and subsamples of 2 second observations performed for 10 min every 30 min, 15 min every 1 h, 30 min every 1.5 h, and 15 min every 2 h. Three statistical approaches were used to provide a comprehensive analysis to examine the quality of the data obtained via different sampling methods. General linear mixed models identified how the time budget from the sampling techniques differed from continuous observation. Correlation analysis identified how strongly results from the sampling techniques were associated with those from continuous observation. Regression analysis identified how well the results from the sampling techniques were associated with those from continuous observation, changes in magnitude, and whether a sampling technique had bias. Static behaviors were well represented with scan and time sampling techniques, while dynamic behaviors were best represented with time sampling techniques. Methods for identifying an appropriate sampling strategy based upon the type of behavior of interest are outlined and results for non-caged laying hens are presented.
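
    A toy sketch of the comparison idea, assuming a behaviour label recorded every 2 s (the synthetic record below has no real temporal structure and is only illustrative): build the time budget from the continuous record, then rebuild it from scan samples taken every 5-60 min and compare.

```python
import numpy as np

def time_budget(labels):
    """Proportion of records falling in each behaviour category."""
    cats, counts = np.unique(labels, return_counts=True)
    return dict(zip(cats, np.round(counts / counts.sum(), 3)))

def scan_sample(record, interval_min, record_step_s=2):
    """Keep one observation every interval_min minutes from a 2-s record."""
    step = int(interval_min * 60 / record_step_s)
    return record[::step]

# Hypothetical continuous record: one behaviour code every 2 s for 15 h.
rng = np.random.default_rng(0)
behaviours = np.array(["rest", "feed", "preen", "dustbathe"])
continuous = rng.choice(behaviours, size=15 * 3600 // 2, p=[0.6, 0.25, 0.1, 0.05])

print("continuous:", time_budget(continuous))
for interval in (5, 15, 30, 60):
    print(f"scan every {interval:2d} min:", time_budget(scan_sample(continuous, interval)))
```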

  8. When continuous observations just won't do: developing accurate and efficient sampling strategies for the laying hen.

    PubMed

    Daigle, Courtney L; Siegford, Janice M

    2014-03-01

    Continuous observation is the most accurate way to determine animals' actual time budget and can provide a 'gold standard' representation of resource use, behavior frequency, and duration. Continuous observation is useful for capturing behaviors that are of short duration or occur infrequently. However, collecting continuous data is labor intensive and time consuming, making multiple individual or long-term data collection difficult. Six non-cage laying hens were video recorded for 15 h and behavioral data collected every 2 s were compared with data collected using scan sampling intervals of 5, 10, 15, 30, and 60 min and subsamples of 2 second observations performed for 10 min every 30 min, 15 min every 1 h, 30 min every 1.5 h, and 15 min every 2 h. Three statistical approaches were used to provide a comprehensive analysis to examine the quality of the data obtained via different sampling methods. General linear mixed models identified how the time budget from the sampling techniques differed from continuous observation. Correlation analysis identified how strongly results from the sampling techniques were associated with those from continuous observation. Regression analysis identified how well the results from the sampling techniques were associated with those from continuous observation, changes in magnitude, and whether a sampling technique had bias. Static behaviors were well represented with scan and time sampling techniques, while dynamic behaviors were best represented with time sampling techniques. Methods for identifying an appropriate sampling strategy based upon the type of behavior of interest are outlined and results for non-caged laying hens are presented. PMID:24269639

  9. X-ray and microwave emissions from the July 19, 2012 solar flare: Highly accurate observations and kinetic models

    NASA Astrophysics Data System (ADS)

    Gritsyk, P. A.; Somov, B. V.

    2016-08-01

    The M7.7 solar flare of July 19, 2012, at 05:58 UT was observed with high spatial, temporal, and spectral resolutions in the hard X-ray and optical ranges. The flare occurred at the solar limb, which allowed us to see the relative positions of the coronal and chromospheric X-ray sources and to determine their spectra. To explain the observations of the coronal source and the chromospheric one unocculted by the solar limb, we apply an accurate analytical model for the kinetic behavior of accelerated electrons in a flare. We interpret the chromospheric hard X-ray source in the thick-target approximation with a reverse current and the coronal one in the thin-target approximation. Our estimates of the slopes of the hard X-ray spectra for both sources are consistent with the observations. However, the calculated intensity of the coronal source is several times lower than the observed one. Allowing for the acceleration of fast electrons in a collapsing magnetic trap has enabled us to remove this contradiction. As a result of our modeling, we have estimated the flux density of the energy transferred by electrons with energies above 15 keV to be ~5 × 10^10 erg cm^-2 s^-1, which exceeds the values typical of the thick-target model without a reverse current by a factor of ~5. To independently test the model, we have calculated the microwave spectrum in the range 1-50 GHz that corresponds to the available radio observations.

  10. Automatic, accurate, and reproducible segmentation of the brain and cerebro-spinal fluid in T1-weighted volume MRI scans and its application to serial cerebral and intracranial volumetry

    NASA Astrophysics Data System (ADS)

    Lemieux, Louis

    2001-07-01

    A new fully automatic algorithm for the segmentation of the brain and cerebro-spinal fluid (CSF) from T1-weighted volume MRI scans of the head was developed specifically in the context of serial intra-cranial volumetry. The method is an extension of a previously published brain extraction algorithm. The brain mask is used as a basis for CSF segmentation based on morphological operations, automatic histogram analysis and thresholding. Brain segmentation is then obtained by iterative tracking of the brain-CSF interface. Grey matter (GM), white matter (WM) and CSF volumes are calculated based on a model of intensity probability distribution that includes partial volume effects. Accuracy was assessed using a digital phantom scan. Reproducibility was assessed by segmenting pairs of scans from 20 normal subjects scanned 8 months apart and 11 patients with epilepsy scanned 3.5 years apart. Segmentation accuracy as measured by overlap was 98% for the brain and 96% for the intra-cranial tissues. The volume errors were: total brain (TBV): -1.0%, intra-cranial (ICV): 0.1%, CSF: +4.8%. For repeated scans, matching resulted in improved reproducibility. In the controls, the coefficient of reliability (CR) was 1.5% for the TBV and 1.0% for the ICV. In the patients, the CR for the ICV was 1.2%.

  11. Inter-observer reproducibility before and after web-based education in the Gleason grading of the prostate adenocarcinoma among the Iranian pathologists.

    PubMed

    Abdollahi, Alireza; Sheikhbahaei, Sara; Meysamie, Alipasha; Bakhshandeh, Mohammadreza; Hosseinzadeh, Hasan

    2014-01-01

    This study was aimed at determining intra- and inter-observer concordance rates in the Gleason scoring of prostatic adenocarcinoma, before and after a web-based educational course. In this self-controlled study, 150 tissue samples of prostatic adenocarcinoma were re-examined and scored according to the Gleason scoring system. All pathologists then attended a free web-based course. Afterwards, the same 150 samples [with different codes compared to the previous ones] were redistributed among the pathologists to be assigned Gleason scores. After gathering the data, the concordance rates for the pathologists' first and second reports were determined. Before web-based education, the mean kappa value for inter-observer agreement was 0.25 [fair agreement]. After web-based education it improved significantly, with a mean kappa value of 0.52 [moderate agreement]. Using weighted kappa values, significant improvement was observed in inter-observer agreement for higher Gleason scores; for score 10, the mean kappa value after web-based education was 0.68 [substantial agreement] compared with 0.25 [fair agreement] before. Web-based training courses are attractive to pathologists as they do not need to spend much time and money. Therefore, such training courses are strongly recommended for significant pathological issues including the grading of prostate adenocarcinoma. Through web-based education, pathologists can exchange views and contribute to improved reproducibility. Such programs should be included in post-graduation curricula. PMID:24902017
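
    The agreement statistics quoted above are unweighted and weighted Cohen's kappa values. A minimal sketch of how such values can be computed with scikit-learn is shown below; the Gleason scores are hypothetical examples, not the study's data.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical Gleason scores (range 2-10) assigned to the same ten cases by
# two pathologists; purely illustrative values.
rater_a = [6, 7, 7, 8, 9, 6, 7, 10, 8, 9]
rater_b = [6, 6, 8, 8, 9, 7, 7, 9, 8, 10]

print("unweighted kappa:", round(cohen_kappa_score(rater_a, rater_b), 2))
# Weighted kappa penalizes large disagreements more than near-misses.
print("weighted kappa  :", round(cohen_kappa_score(rater_a, rater_b, weights="quadratic"), 2))
```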

  12. Observing Volcanic Thermal Anomalies from Space: How Accurate is the Estimation of the Hotspot's Size and Temperature?

    NASA Astrophysics Data System (ADS)

    Zaksek, K.; Pick, L.; Lombardo, V.; Hort, M. K.

    2015-12-01

    Measuring the heat emission from active volcanic features on the basis of infrared satellite images contributes to a volcano's hazard assessment. Because these thermal anomalies occupy only a small fraction (< 1%) of a typically resolved target pixel (e.g. from Landsat 7 or MODIS), the accurate determination of the hotspot's size and temperature is problematic. Conventionally this is overcome by comparing observations in at least two separate infrared spectral wavebands (the Dual-Band method). We investigate the resolution limits of this thermal un-mixing technique by means of a uniquely designed indoor analog experiment, in which the volcanic feature is simulated by an electrical heating alloy of 0.5 mm diameter installed on a plywood panel of high emissivity. Two thermographic cameras (VarioCam high resolution and ImageIR 8300 by Infratec) record images of the artificial heat source in wavebands comparable to those available from satellite data, ranging from the short-wave infrared (1.4-3 µm) through the mid-wave infrared (3-8 µm) to the thermal infrared (8-15 µm). In the experiment the pixel fraction of the hotspot was successively reduced by increasing the camera-to-target distance from 3 m to 35 m. For an individual target pixel, the expected decrease of the hotspot pixel area with distance was confirmed at a relatively constant wire temperature of around 600 °C. The deviation of the hotspot pixel fraction yielded by the Dual-Band method from the theoretically calculated one was found to be within 20% up to a target distance of 25 m. This means that a reliable estimation of the hotspot size is only possible if the hotspot is larger than about 3% of the pixel area, a resolution boundary that most remotely sensed volcanic hotspots fall below. Future efforts will focus on the investigation of a resolution limit for the hotspot's temperature by varying the alloy's amperage. Moreover, the un-mixing results for more realistic multi
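
    A minimal sketch of the two-band mixed-pixel inversion underlying such Dual-Band analyses, under the assumptions that the background temperature is known and that the two wavebands below (free choices here) are representative: the pixel radiance in each band is modelled as a fraction p of Planck radiance at the hotspot temperature plus (1 - p) at the background temperature, and the two equations are solved for p and the hotspot temperature.

```python
import numpy as np
from scipy.optimize import fsolve

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23

def planck(lam, T):
    """Blackbody spectral radiance (W m^-2 sr^-1 m^-1) at wavelength lam (m)."""
    return 2 * H * C**2 / lam**5 / (np.exp(H * C / (lam * KB * T)) - 1)

def dual_band(L_mir, L_tir, lam_mir=3.9e-6, lam_tir=11.0e-6, T_bg=300.0):
    """Solve the two mixed-pixel equations for hotspot fraction p and temperature T_h."""
    def residuals(x):
        p, T_h = x
        return [p * planck(lam_mir, T_h) + (1 - p) * planck(lam_mir, T_bg) - L_mir,
                p * planck(lam_tir, T_h) + (1 - p) * planck(lam_tir, T_bg) - L_tir]
    return fsolve(residuals, x0=[0.01, 800.0])

# Forward-simulate a pixel with a 1% hotspot at 873 K (~600 °C), then invert it.
p_true, T_true, T_bg = 0.01, 873.0, 300.0
L_mir = p_true * planck(3.9e-6, T_true) + (1 - p_true) * planck(3.9e-6, T_bg)
L_tir = p_true * planck(11.0e-6, T_true) + (1 - p_true) * planck(11.0e-6, T_bg)
print(dual_band(L_mir, L_tir))   # expected to return roughly [0.01, 873.0]
```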

  13. Reproducibility of airway wall thickness measurements

    NASA Astrophysics Data System (ADS)

    Schmidt, Michael; Kuhnigk, Jan-Martin; Krass, Stefan; Owsijewitsch, Michael; de Hoop, Bartjan; Peitgen, Heinz-Otto

    2010-03-01

    Airway remodeling and the accompanying changes in wall thickness are known to be a major symptom of chronic obstructive pulmonary disease (COPD), associated with reduced lung function in diseased individuals. Further investigation of this disease, as well as monitoring of disease progression and treatment effect, demands accurate and reproducible assessment of airway wall thickness in CT datasets. With wall thicknesses in the sub-millimeter range, this task remains challenging even with today's high-resolution CT datasets. To provide accurate measurements, taking partial volume effects into account is mandatory. The Full-Width-at-Half-Maximum (FWHM) method has been shown to be inappropriate for small airways [1,2], and several improved algorithms for objective quantification of airway wall thickness have been proposed [1-8]. In this paper, we describe an algorithm based on a closed-form solution proposed by Weinheimer et al. [7]. We locally estimate the lung density parameter required for the closed-form solution to account for possible variations of parenchyma density between different lung regions, inspiration states and contrast agent concentrations. The general accuracy of the algorithm is evaluated using basic tubular software and hardware phantoms. Furthermore, we present results on the reproducibility of the algorithm with respect to clinical CT scans, varying reconstruction kernels, and repeated acquisitions, which is crucial for longitudinal observations.

  14. Towards Reproducibility in Computational Hydrology

    NASA Astrophysics Data System (ADS)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei

    2016-04-01

    The ability to reproduce published scientific findings is a foundational principle of scientific research. Independent observation helps to verify the legitimacy of individual findings; build upon sound observations so that we can evolve hypotheses (and models) of how catchments function; and move them from specific circumstances to more general theory. The rise of computational research has brought increased focus on the issue of reproducibility across the broader scientific literature. This is because publications based on computational research typically do not contain sufficient information to enable the results to be reproduced, and therefore verified. Given the rise of computational analysis in hydrology over the past 30 years, to what extent is reproducibility, or a lack thereof, a problem in hydrology? Whilst much hydrological code is accessible, the actual code and workflow that produced, and therefore document the provenance of, published scientific findings are rarely available. We argue that in order to advance, and make more robust, the process of hypothesis testing and knowledge creation within the computational hydrological community, we need to build on existing open data initiatives and adopt common standards and infrastructures to: first, make code re-usable and easy to find through consistent use of metadata; second, publish well-documented workflows that combine re-usable code with data to enable published scientific findings to be reproduced; finally, use unique persistent identifiers (e.g. DOIs) to reference re-usable and reproducible code, thereby clearly showing the provenance of published scientific findings. Whilst extra effort is required to make work reproducible, there are benefits to both the individual and the broader community in doing so, which will improve the credibility of the science in the face of the need for societies to adapt to changing hydrological environments.

  15. Tuning The Sea-Ice Seasonal Cycle Of HadCM3: Can It Reproduce Observed Trends In Sea-Ice?

    NASA Astrophysics Data System (ADS)

    Tett, S. F.; Roach, L.; Rae, C.; Cartis, C.; Mineter, M.; Steig, E. J.; Yamazaki, K.; Schurer, A. P.

    2015-12-01

    Since high-quality satellite observations of sea ice began in 1979, Arctic sea-ice extent has declined. Observed losses in Arctic sea ice during September are greater than in the majority of models in the CMIP5 archive and in the multi-model average. Antarctic sea ice, in contrast, has increased despite an expected decline. We have carried out a set of perturbations to the HadCM3 model in which we changed the maximum ice area (a proxy for ice leads), albedo parameterizations, ice thermal conductivity and ocean diffusion. Changes in these parameters affected ice extent in both the Arctic and Antarctic. We used these simulations to identify the four parameters that had the most impact on minimum and maximum sea-ice extent in both hemispheres. To tune the model we used a Gauss-Newton algorithm to adjust those four parameters to minimize differences between simulated and observed sea-ice extents. With this new parameter set we then simulated the period 1940 to 2015 and compared with the default configuration of HadCM3. Compared to the default configuration, the perturbed model had greater summer sea-ice loss in the Arctic, consistent with observed loss estimates. However, in the Antarctic neither the perturbed nor the default simulation shows an increase in sea-ice extent, in contrast to the observations, which do show an increase.
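
    A generic Gauss-Newton parameter-tuning loop of the kind described above can be sketched as follows; `toy_model` is a hypothetical, cheap stand-in for the climate model (which in reality is far too expensive to call this way), and the four parameters and diagnostics are placeholders.

```python
import numpy as np

def gauss_newton(simulate, theta0, targets, n_iter=10, eps=1e-4):
    """Adjust parameters so that simulate(theta) matches the target diagnostics."""
    theta = np.asarray(theta0, dtype=float)
    targets = np.asarray(targets, dtype=float)
    for _ in range(n_iter):
        f0 = simulate(theta)
        residual = f0 - targets
        # Finite-difference Jacobian: one extra model run per parameter.
        J = np.column_stack([(simulate(theta + eps * e) - f0) / eps
                             for e in np.eye(len(theta))])
        step, *_ = np.linalg.lstsq(J, -residual, rcond=None)
        theta = theta + step
    return theta

# Hypothetical stand-in for the climate model: four "sea-ice" diagnostics
# (e.g. min/max extent in each hemisphere) as a smooth function of four
# tunable parameters.
def toy_model(theta):
    a, b, c, d = theta
    return np.array([a * b, a + c**2, np.exp(0.1 * d), b * c + d])

targets = toy_model(np.array([0.6, 0.15, 2.0, 1.0]))   # pretend "observations"
print(gauss_newton(toy_model, [0.5, 0.10, 1.5, 0.5], targets))
```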

  16. Ultrasonographic evaluation of the adrenal glands in healthy dogs: repeatability, reproducibility, observer-dependent variability, and the effect of bodyweight, age and sex.

    PubMed

    Mogicato, G; Layssol-Lamour, C; Conchou, F; Diquelou, A; Raharison, F; Sautet, J; Concordet, D

    2011-02-01

    Adrenal length and width were determined from two-dimensional ultrasound longitudinal images. In study 1, 540 measurements of adrenal glands were attempted from five healthy beagle dogs by three observers with different levels of expertise in ultrasonography, to determine the variability of adrenal gland measurements. Of these, 484 measurements were included in the statistical analysis, since 16 measurements of the left adrenal gland and 40 of the right could not be obtained because the gland could not be visualised by the observer. In study 2, a single measurement of both adrenal glands was taken from each of 146 dogs by the most trained observer from study 1, and the effects of health status (healthy dogs v dogs with non-adrenal diseases), bodyweight, age and sex were assessed. A total of 267 measurements were included in the statistical analysis. The lowest intra- and inter-day coefficients of variation were observed for the left adrenal gland and by the most trained observer. Health status had no statistically significant effect on adrenal gland length or width, whereas age had a significant effect only for the left adrenal gland (the greater the age, the greater the width or length) and sex had a significant effect only for the right adrenal gland (the width was larger in males and the length larger in females). Bodyweight had a significant effect on the length of both adrenal glands (the greater the bodyweight, the greater the length), but not on the width. The differences in SD and coefficient of variation values for the width of the left adrenal gland were not statistically significant between the three observers, whereas they were statistically significant for the right adrenal gland.

  17. Selecting and optimizing eco-physiological parameters of Biome-BGC to reproduce observed woody and leaf biomass growth of Eucommia ulmoides plantation in China using Dakota optimizer

    NASA Astrophysics Data System (ADS)

    Miyauchi, T.; Machimura, T.

    2013-12-01

    In simulations using an ecosystem process model, the adjustment of parameters is indispensable for improving the accuracy of prediction. This procedure, however, requires much time and effort to bring the simulation results close to the measurements for models consisting of various ecosystem processes. In this study, we applied a general-purpose optimization tool to the parameter optimization of an ecosystem model, and examined its validity by comparing the simulated and measured biomass growth of a woody plantation. A biometric survey of tree biomass growth was performed in 2009 in an 11-year-old Eucommia ulmoides plantation in Henan Province, China. The climate of the site was dry temperate. Leaf, above- and below-ground woody biomass were measured from three cut trees and converted into carbon mass per area using measured carbon contents and stem density. Yearly woody biomass growth of the plantation was calculated according to allometric relationships determined by tree-ring analysis of seven cut trees. We used Biome-BGC (Thornton, 2002) to reproduce the biomass growth of the plantation. Air temperature and humidity from 1981 to 2010 were used as the input climate conditions. The plant functional type was deciduous broadleaf, and non-optimized parameters were left at their defaults. 11-year normal simulations were performed following a spin-up run. In order to select the parameters to optimize, we analyzed the sensitivity of leaf, above- and below-ground woody biomass to the eco-physiological parameters. Following the selection, parameter optimization was performed using the Dakota optimizer. Dakota is an optimizer developed by Sandia National Laboratories to provide a systematic and rapid means of obtaining optimal designs using simulation-based models. As the objective function, we calculated the sum of relative errors between simulated and measured leaf, above- and below-ground woody carbon at each of eleven years. In an alternative run, errors at the last year (at the
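
    The objective function described above is simple to express; a sketch of it (assuming absolute relative errors, arrays shaped eleven years by three carbon pools, and placeholder values) that a Dakota driver script could evaluate after each Biome-BGC run might look like this:

```python
import numpy as np

def objective(simulated, measured):
    """Sum over all years and pools of |simulated - measured| / measured."""
    simulated = np.asarray(simulated, dtype=float)
    measured = np.asarray(measured, dtype=float)
    return float(np.sum(np.abs(simulated - measured) / measured))

# Hypothetical carbon pools (kgC m^-2): rows = 11 years, columns = leaf,
# above-ground woody, below-ground woody. Values are placeholders only.
years = np.arange(1, 12)
measured = np.column_stack([0.02 * years, 0.5 * years, 0.15 * years])
simulated = measured * (1.0 + 0.1 * np.sin(years))   # a model run that is ~10% off

print(objective(simulated, measured))                 # scalar Dakota would minimize
```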

  18. Reproducible research in palaeomagnetism

    NASA Astrophysics Data System (ADS)

    Lurcock, Pontus; Florindo, Fabio

    2015-04-01

    The reproducibility of research findings is attracting increasing attention across all scientific disciplines. In palaeomagnetism as elsewhere, computer-based analysis techniques are becoming more commonplace, complex, and diverse. Analyses can often be difficult to reproduce from scratch, both for the original researchers and for others seeking to build on the work. We present a palaeomagnetic plotting and analysis program designed to make reproducibility easier. Part of the problem is the divide between interactive and scripted (batch) analysis programs. An interactive desktop program with a graphical interface is a powerful tool for exploring data and iteratively refining analyses, but usually cannot operate without human interaction. This makes it impossible to re-run an analysis automatically, or to integrate it into a larger automated scientific workflow - for example, a script to generate figures and tables for a paper. In some cases the parameters of the analysis process itself are not saved explicitly, making it hard to repeat or improve the analysis even with human interaction. Conversely, non-interactive batch tools can be controlled by pre-written scripts and configuration files, allowing an analysis to be 'replayed' automatically from the raw data. However, this advantage comes at the expense of exploratory capability: iteratively improving an analysis entails a time-consuming cycle of editing scripts, running them, and viewing the output. Batch tools also tend to require more computer expertise from their users. PuffinPlot is a palaeomagnetic plotting and analysis program which aims to bridge this gap. First released in 2012, it offers both an interactive, user-friendly desktop interface and a batch scripting interface, both making use of the same core library of palaeomagnetic functions. We present new improvements to the program that help to integrate the interactive and batch approaches, allowing an analysis to be interactively explored and refined

  19. Magnetogastrography (MGG) Reproducibility Assessments

    NASA Astrophysics Data System (ADS)

    de la Roca-Chiapas, J. M.; Córdova, T.; Hernández, E.; Solorio, S.; Solís Ortiz, S.; Sosa, M.

    2006-09-01

    Seven healthy subjects underwent a magnetic pulse of 32 mT for 17 ms, seven times in 90 minutes. The procedure was repeated one and two weeks later. Gastric emptying was assessed for each of the measurements, and an ANOVA was performed for every group of data. The gastric emptying time was 19.22 ± 5 min. The reproducibility estimate was above 85%. Therefore, magnetogastrography seems to be an excellent technique to implement in routine clinical trials.

  20. Opening Reproducible Research

    NASA Astrophysics Data System (ADS)

    Nüst, Daniel; Konkol, Markus; Pebesma, Edzer; Kray, Christian; Klötgen, Stephanie; Schutzeichel, Marc; Lorenz, Jörg; Przibytzin, Holger; Kussmann, Dirk

    2016-04-01

    Open access is not only a form of publishing such that research papers become available to the general public free of charge; it also refers to a trend in science toward doing research more openly and transparently. When science transforms to open access we mean not only access to papers and to the research data being collected or generated, but also access to the data used and the procedures carried out in the research paper. Increasingly, scientific results are generated by numerical manipulation of data that were already collected, and may involve simulation experiments that are carried out entirely computationally. Reproducibility of research findings, the ability to repeat experimental procedures and confirm previously found results, is at the heart of the scientific method (Pebesma, Nüst and Bivand, 2012). As opposed to the collection of experimental data in labs or nature, computational experiments lend themselves very well to reproduction. Some of the reasons why scientists do not publish data and computational procedures that allow reproduction will be hard to change, e.g. privacy concerns in the data, fear of embarrassment or of losing a competitive advantage. Other reasons, however, involve technical aspects, and include the lack of standard procedures to publish such information and the lack of benefits after publishing them. We aim to resolve these two technical aspects. We propose a system that supports the evolution of scientific publications from static papers into dynamic, executable research documents. The DFG-funded experimental project Opening Reproducible Research (ORR) aims for the main aspects of open access by improving the exchange of, facilitating productive access to, and simplifying reuse of research results that are published over the Internet. Central to the project is a new form for creating and providing research results, the executable research compendium (ERC), which not only enables third parties to

  1. CLARREO Cornerstone of the Earth Observing System: Measuring Decadal Change Through Accurate Emitted Infrared and Reflected Solar Spectra and Radio Occultation

    NASA Technical Reports Server (NTRS)

    Sandford, Stephen P.

    2010-01-01

    The Climate Absolute Radiance and Refractivity Observatory (CLARREO) is one of four Tier 1 missions recommended by the recent NRC Decadal Survey report on Earth Science and Applications from Space (NRC, 2007). The CLARREO mission addresses the need to provide accurate, broadly acknowledged climate records that are used to enable validated long-term climate projections that become the foundation for informed decisions on mitigation and adaptation policies that address the effects of climate change on society. The CLARREO mission accomplishes this critical objective through rigorous SI traceable decadal change observations that are sensitive to many of the key uncertainties in climate radiative forcings, responses, and feedbacks that in turn drive uncertainty in current climate model projections. These same uncertainties also lead to uncertainty in attribution of climate change to anthropogenic forcing. For the first time CLARREO will make highly accurate, global, SI-traceable decadal change observations sensitive to the most critical, but least understood, climate forcings, responses, and feedbacks. The CLARREO breakthrough is to achieve the required levels of accuracy and traceability to SI standards for a set of observations sensitive to a wide range of key decadal change variables. The required accuracy levels are determined so that climate trend signals can be detected against a background of naturally occurring variability. Climate system natural variability therefore determines what level of accuracy is overkill, and what level is critical to obtain. In this sense, the CLARREO mission requirements are considered optimal from a science value perspective. The accuracy for decadal change traceability to SI standards includes uncertainties associated with instrument calibration, satellite orbit sampling, and analysis methods. Unlike most space missions, the CLARREO requirements are driven not by the instantaneous accuracy of the measurements, but by accuracy in

  2. Reproducible measurements of MPI performance characteristics.

    SciTech Connect

    Gropp, W.; Lusk, E.

    1999-06-25

    In this paper we describe the difficulties inherent in making accurate, reproducible measurements of message-passing performance. We describe some of the mistakes often made in attempting such measurements and the consequences of such mistakes. We describe mpptest, a suite of performance measurement programs developed at Argonne National Laboratory, that attempts to avoid such mistakes and obtain reproducible measures of MPI performance that can be useful to both MPI implementers and MPI application writers. We include a number of illustrative examples of its use.
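
    Not mpptest itself, but a minimal illustration of the kind of ping-pong measurement such tools perform, written here with mpi4py; taking the minimum over many repetitions is one common tactic for suppressing noise in these benchmarks, and the message sizes below are arbitrary.

```python
# Minimal ping-pong sketch (run with: mpiexec -n 2 python pingpong.py).
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
assert comm.Get_size() == 2, "this sketch assumes exactly two ranks"

def round_trip(nbytes, reps=1000):
    """Best (minimum) observed round-trip time for a message of nbytes."""
    buf = np.zeros(nbytes, dtype=np.uint8)
    best = float("inf")
    for _ in range(reps):
        comm.Barrier()                       # align both ranks before timing
        t0 = MPI.Wtime()
        if rank == 0:
            comm.Send([buf, MPI.BYTE], dest=1, tag=0)
            comm.Recv([buf, MPI.BYTE], source=1, tag=0)
        else:
            comm.Recv([buf, MPI.BYTE], source=0, tag=0)
            comm.Send([buf, MPI.BYTE], dest=0, tag=0)
        best = min(best, MPI.Wtime() - t0)
    return best

for n in (8, 1024, 131072):
    t = round_trip(n)
    if rank == 0:
        print(f"{n:7d} bytes: best round trip {t * 1e6:8.1f} microseconds")
```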

  3. Reproducing in cities.

    PubMed

    Mace, Ruth

    2008-02-01

    Reproducing in cities has always been costly, leading to lower fertility (that is, lower birth rates) in urban than in rural areas. Historically, although cities provided job opportunities, initially residents incurred the penalty of higher infant mortality, but as mortality rates fell at the end of the 19th century, European birth rates began to plummet. Fertility decline in Africa only started recently and has been dramatic in some cities. Here it is argued that both historical and evolutionary demographers are interpreting fertility declines across the globe in terms of the relative costs of child rearing, which increase to allow children to outcompete their peers. Now largely free from the fear of early death, postindustrial societies may create an environment that generates runaway parental investment, which will continue to drive fertility ever lower.

  4. Accurate detection of spatio-temporal variability of plant phenology by using satellite-observed daily green-red vegetation index (GRVI) in Japan

    NASA Astrophysics Data System (ADS)

    Nagai, S.; Saitoh, T. M.; Nasahara, K. N.; Inoue, T.; Suzuki, R.

    2015-12-01

    To evaluate the spatio-temporal variability of biodiversity and of ecosystem functioning and services in deciduous forests, accurate detection of the timing of plant phenology such as leaf flushing, coloring, and fall is important from plot to continental scales. Here, (1) we detected the spatio-temporal variability in the timing of the start (SGS) and end (EGS) of the growing season in Japan from 2001 to 2014 by analyzing the Terra and Aqua/MODIS satellite-observed daily green-red vegetation index (GRVI) with a 500-m spatial resolution. (2) We examined the characteristics of the timing of SGS and EGS in deciduous forests along vertical (altitude) and horizontal (latitude) gradients and their sensitivity to air temperature. (3) We evaluated the relationship between the spatial distribution of leaf-coloring phenology derived from Landsat-8/OLI satellite-observed GRVI with a 30-m spatial resolution on 23 November 2014 and leaf-coloring information published on web sites in Kanagawa Prefecture, Japan. We found that (1) changes along the vertical and horizontal gradients in the timing of SGS tended to be larger than those of EGS; (2) the sensitivity of the timing of SGS to air temperature was much greater than that of EGS; and (3) leaf-coloring information published on web sites covering multiple points was useful for verification of leaf-coloring phenology derived from satellite-observed GRVI in relation to the altitude gradient in mountainous regions.
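
    A toy sketch of GRVI-based phenology detection, assuming the common definition GRVI = (green − red)/(green + red) and a simple zero-crossing convention for deciduous canopies (SGS when GRVI rises above zero, EGS when it later falls below); the reflectance curves are synthetic and the thresholding is only one of several published conventions.

```python
import numpy as np

def grvi(green, red):
    """Green-red vegetation index from green and red reflectances."""
    green, red = np.asarray(green, float), np.asarray(red, float)
    return (green - red) / (green + red)

def sgs_egs(doy, grvi_series):
    """First day GRVI rises above zero (SGS) and first later day it drops below (EGS)."""
    above = grvi_series > 0.0
    if not above.any():
        return None, None
    sgs = doy[np.argmax(above)]
    after = (doy > sgs) & ~above
    egs = doy[np.argmax(after)] if after.any() else None
    return sgs, egs

# Synthetic seasonal cycle: reflectances for a deciduous pixel sampled every 8 days.
doy = np.arange(1, 366, 8)
green = 0.05 + 0.05 * np.exp(-((doy - 200) / 60.0) ** 2)
red = 0.08 - 0.04 * np.exp(-((doy - 200) / 60.0) ** 2)

print(sgs_egs(doy, grvi(green, red)))
```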

  5. Reproducible Experiment Platform

    NASA Astrophysics Data System (ADS)

    Likhomanenko, Tatiana; Rogozhnikov, Alex; Baranov, Alexander; Khairullin, Egor; Ustyuzhanin, Andrey

    2015-12-01

    Data analysis in fundamental sciences nowadays is an essential process that pushes the frontiers of our knowledge and leads to new discoveries. At the same time the complexity of those analyses is increasing fast due to (a) the enormous volumes of the datasets being analyzed, (b) the variety of techniques and algorithms one has to check inside a single analysis, and (c) the distributed nature of research teams, which requires special communication media for knowledge and information exchange between individual researchers. There is a lot of resemblance between the techniques and problems arising in the areas of industrial information retrieval and particle physics. To address those problems we propose the Reproducible Experiment Platform (REP), a software infrastructure to support a collaborative ecosystem for computational science. It is a Python-based solution for research teams that allows running computational experiments on shared datasets, obtaining repeatable results, and making consistent comparisons of the obtained results. We present some key features of REP based on case studies, which include trigger optimization and physics analysis studies at the LHCb experiment.

  6. NNLOPS accurate associated HW production

    NASA Astrophysics Data System (ADS)

    Astill, William; Bizon, Wojciech; Re, Emanuele; Zanderighi, Giulia

    2016-06-01

    We present a next-to-next-to-leading order accurate description of associated HW production consistently matched to a parton shower. The method is based on reweighting events obtained with the HW plus one jet NLO accurate calculation implemented in POWHEG, extended with the MiNLO procedure, to reproduce NNLO accurate Born distributions. Since the Born kinematics is more complex than the cases treated before, we use a parametrization of the Collins-Soper angles to reduce the number of variables required for the reweighting. We present phenomenological results at 13 TeV, with cuts suggested by the Higgs Cross section Working Group.

  7. Assessing the reproducibility of discriminant function analyses.

    PubMed

    Andrew, Rose L; Albert, Arianne Y K; Renaut, Sebastien; Rennison, Diana J; Bock, Dan G; Vines, Tim

    2015-01-01

    Data are the foundation of empirical research, yet all too often the datasets underlying published papers are unavailable, incorrect, or poorly curated. This is a serious issue, because future researchers are then unable to validate published results or reuse data to explore new ideas and hypotheses. Even if data files are securely stored and accessible, they must also be accompanied by accurate labels and identifiers. To assess how often problems with metadata or data curation affect the reproducibility of published results, we attempted to reproduce Discriminant Function Analyses (DFAs) from the field of organismal biology. DFA is a commonly used statistical analysis that has changed little since its inception almost eight decades ago, and therefore provides an opportunity to test reproducibility among datasets of varying ages. Out of 100 papers we initially surveyed, fourteen were excluded because they did not present the common types of quantitative result from their DFA or gave insufficient details of their DFA. Of the remaining 86 datasets, there were 15 cases for which we were unable to confidently relate the dataset we received to the one used in the published analysis. The reasons ranged from incomprehensible or absent variable labels, the DFA being performed on an unspecified subset of the data, or the dataset we received being incomplete. We focused on reproducing three common summary statistics from DFAs: the percent variance explained, the percentage correctly assigned and the largest discriminant function coefficient. The reproducibility of the first two was fairly high (20 of 26, and 44 of 60 datasets, respectively), whereas our success rate with the discriminant function coefficients was lower (15 of 26 datasets). When considering all three summary statistics, we were able to completely reproduce 46 (65%) of 71 datasets. While our results show that a majority of studies are reproducible, they highlight the fact that many studies still are not the
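
    A reproducibility check of this kind amounts to recomputing the three summary statistics from an archived dataset and comparing them with the published values. The sketch below shows one reasonable mapping onto scikit-learn's linear discriminant analysis (the iris data stand in for an archived dataset, and the raw scalings are used for the "largest coefficient"; the original papers may have used standardized coefficients or other software).

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Stand-in dataset; in a real check this would be the data file archived
# alongside the published paper.
X, y = load_iris(return_X_y=True)

lda = LinearDiscriminantAnalysis().fit(X, y)
pct_variance = 100 * lda.explained_variance_ratio_[0]   # variance on the first DF axis
pct_correct = 100 * np.mean(lda.predict(X) == y)         # resubstitution accuracy
largest_coef = np.abs(lda.scalings_).max()               # largest raw DF coefficient

print(f"variance explained by DF1: {pct_variance:.1f}%")
print(f"correctly assigned:        {pct_correct:.1f}%")
print(f"largest DF coefficient:    {largest_coef:.2f}")
```

    Each printed value would then be compared with the corresponding figure reported in the original paper to decide whether that result could be reproduced.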

  8. Assessing the reproducibility of discriminant function analyses

    PubMed Central

    Andrew, Rose L.; Albert, Arianne Y.K.; Renaut, Sebastien; Rennison, Diana J.; Bock, Dan G.

    2015-01-01

    Data are the foundation of empirical research, yet all too often the datasets underlying published papers are unavailable, incorrect, or poorly curated. This is a serious issue, because future researchers are then unable to validate published results or reuse data to explore new ideas and hypotheses. Even if data files are securely stored and accessible, they must also be accompanied by accurate labels and identifiers. To assess how often problems with metadata or data curation affect the reproducibility of published results, we attempted to reproduce Discriminant Function Analyses (DFAs) from the field of organismal biology. DFA is a commonly used statistical analysis that has changed little since its inception almost eight decades ago, and therefore provides an opportunity to test reproducibility among datasets of varying ages. Out of 100 papers we initially surveyed, fourteen were excluded because they did not present the common types of quantitative result from their DFA or gave insufficient details of their DFA. Of the remaining 86 datasets, there were 15 cases for which we were unable to confidently relate the dataset we received to the one used in the published analysis. The reasons ranged from incomprehensible or absent variable labels, the DFA being performed on an unspecified subset of the data, or the dataset we received being incomplete. We focused on reproducing three common summary statistics from DFAs: the percent variance explained, the percentage correctly assigned and the largest discriminant function coefficient. The reproducibility of the first two was fairly high (20 of 26, and 44 of 60 datasets, respectively), whereas our success rate with the discriminant function coefficients was lower (15 of 26 datasets). When considering all three summary statistics, we were able to completely reproduce 46 (65%) of 71 datasets. While our results show that a majority of studies are reproducible, they highlight the fact that many studies still are not the

  9. Reproducibility in a multiprocessor system

    DOEpatents

    Bellofatto, Ralph A; Chen, Dong; Coteus, Paul W; Eisley, Noel A; Gara, Alan; Gooding, Thomas M; Haring, Rudolf A; Heidelberger, Philip; Kopcsay, Gerard V; Liebsch, Thomas A; Ohmacht, Martin; Reed, Don D; Senger, Robert M; Steinmacher-Burow, Burkhard; Sugawara, Yutaka

    2013-11-26

    Fixing a problem is usually greatly aided if the problem is reproducible. To ensure reproducibility of a multiprocessor system, the following aspects are proposed: a deterministic system start state, a single system clock, phase alignment of clocks in the system, system-wide synchronization events, reproducible execution of system components, deterministic chip interfaces, zero-impact communication with the system, precise stop of the system, and a scan of the system state.

  10. REPRODUCING THE OBSERVED ABUNDANCES IN RCB AND HdC STARS WITH POST-DOUBLE-DEGENERATE MERGER MODELS-CONSTRAINTS ON MERGER AND POST-MERGER SIMULATIONS AND PHYSICS PROCESSES

    SciTech Connect

    Menon, Athira; Herwig, Falk; Denissenkov, Pavel A.; Clayton, Geoffrey C.; Staff, Jan; Pignatari, Marco; Paxton, Bill

    2013-07-20

    The R Coronae Borealis (RCB) stars are hydrogen-deficient, variable stars that are most likely the result of He-CO WD mergers. They display extremely low oxygen isotopic ratios, ^16O/^18O ≈ 1-10, ^12C/^13C ≥ 100, and enhancements up to 2.6 dex in F and in s-process elements from Zn to La, compared to solar. These abundances provide stringent constraints on the physical processes during and after the double-degenerate merger. As shown previously, O-isotopic ratios observed in RCB stars cannot result from the dynamic double-degenerate merger phase, and we now investigate the role of the long-term one-dimensional spherical post-merger evolution and nucleosynthesis based on realistic hydrodynamic merger progenitor models. We adopt a model for extra envelope mixing to represent processes driven by rotation originating in the dynamical merger. Comprehensive nucleosynthesis post-processing simulations for these stellar evolution models reproduce, for the first time, the full range of the observed abundances for almost all the elements measured in RCB stars: ^16O/^18O ratios between 9 and 15, C-isotopic ratios above 100, and ~1.4-2.35 dex F enhancements, along with enrichments in s-process elements. The nucleosynthesis processes in our models constrain the length and temperature in the dynamic merger shell-of-fire feature as well as the envelope mixing in the post-merger phase. s-process elements originate either in the shell-of-fire merger feature or during the post-merger evolution, but the contribution from the asymptotic giant branch progenitors is negligible. The post-merger envelope mixing must eventually cease ~10^6 yr after the dynamic merger phase before the star enters the RCB phase.

  11. Relevance relations for the concept of reproducibility

    PubMed Central

    Atmanspacher, H.; Bezzola Lambert, L.; Folkers, G.; Schubiger, P. A.

    2014-01-01

    The concept of reproducibility is widely considered a cornerstone of scientific methodology. However, recent problems with the reproducibility of empirical results in large-scale systems and in biomedical research have cast doubts on its universal and rigid applicability beyond the so-called basic sciences. Reproducibility is a particularly difficult issue in interdisciplinary work where the results to be reproduced typically refer to different levels of description of the system considered. In such cases, it is mandatory to distinguish between more and less relevant features, attributes or observables of the system, depending on the level at which they are described. For this reason, we propose a scheme for a general ‘relation of relevance’ between the level of complexity at which a system is considered and the granularity of its description. This relation implies relevance criteria for particular selected aspects of a system and its description, which can be operationally implemented by an interlevel relation called ‘contextual emergence’. It yields a formally sound and empirically applicable procedure to translate between descriptive levels and thus construct level-specific criteria for reproducibility in an overall consistent fashion. Relevance relations merged with contextual emergence challenge the old idea of one fundamental ontology from which everything else derives. At the same time, our proposal is specific enough to resist the backlash into a relativist patchwork of unconnected model fragments. PMID:24554574

  12. Open Science and Research Reproducibility

    PubMed Central

    Munafò, Marcus

    2016-01-01

    Many scientists, journals and funders are concerned about the low reproducibility of many scientific findings. One approach that may serve to improve the reliability and robustness of research is open science. Here I argue that the process of pre-registering study protocols, sharing study materials and data, and posting preprints of manuscripts may serve to improve quality control procedures at every stage of the research pipeline, and in turn improve the reproducibility of published work. PMID:27350794

  13. CC/DFT Route toward Accurate Structures and Spectroscopic Features for Observed and Elusive Conformers of Flexible Molecules: Pyruvic Acid as a Case Study.

    PubMed

    Barone, Vincenzo; Biczysko, Malgorzata; Bloino, Julien; Cimino, Paola; Penocchio, Emanuele; Puzzarini, Cristina

    2015-09-01

    The structures and relative stabilities as well as the rotational and vibrational spectra of the three low-energy conformers of pyruvic acid (PA) have been characterized using a state-of-the-art quantum-mechanical approach designed for flexible molecules. By making use of the available experimental rotational constants for several isotopologues of the most stable PA conformer, Tc-PA, the semiexperimental equilibrium structure has been derived. The latter provides a reference for the pure theoretical determination of the equilibrium geometries for all conformers, thus confirming for these structures an accuracy of 0.001 Å and 0.1 deg for bond lengths and angles, respectively. Highly accurate relative energies of all conformers (Tc-, Tt-, and Ct-PA) and of the transition states connecting them are provided along with the thermodynamic properties at low and high temperatures, thus leading to conformational enthalpies accurate to 1 kJ mol(-1). Concerning microwave spectroscopy, rotational constants accurate to about 20 MHz are provided for the Tt- and Ct-PA conformers, together with the computed centrifugal-distortion constants and dipole moments required to simulate their rotational spectra. For Ct-PA, vibrational frequencies in the mid-infrared region accurate to 10 cm(-1) are reported along with theoretical estimates for the transitions in the near-infrared range, and the corresponding infrared spectrum including fundamental transitions, overtones, and combination bands has been simulated. In addition to the new data described above, theoretical results for the Tc- and Tt-PA conformers are compared with all available experimental data to further confirm the accuracy of the hybrid coupled-cluster/density functional theory (CC/DFT) protocol applied in the present study. Finally, we discuss in detail the accuracy of computational models fully based on double-hybrid DFT functionals (mainly at the B2PLYP/aug-cc-pVTZ level) that avoid the use of very expensive CC

  15. Reproducibility and reliability of fetal cardiac time intervals using magnetocardiography.

    PubMed

    van Leeuwen, P; Lange, S; Klein, A; Geue, D; Zhang, Y; Krause, H J; Grönemeyer, D

    2004-04-01

    We investigated several factors which may affect the accuracy of fetal cardiac time intervals (CTI) determined in magnetocardiographic (MCG) recordings: observer differences, the number of available recording sites and the type of sensor used in acquisition. In 253 fetal MCG recordings, acquired using different biomagnetometer devices between the 15th and 42nd weeks of gestation, P-wave, QRS complex and T-wave onsets and ends were identified in signal averaged data sets independently by different observers. Using a defined procedure for setting signal events, interobserver reliability was high. Increasing the number of registration sites led to more accurate identification of the events. The differences in wave morphology between magnetometer and gradiometer configurations led to deviations in timing whereas the differences between low and high temperature devices seemed to be primarily due to noise. Signal-to-noise ratio played an important overall role in the accurate determination of CTI and changes in signal amplitude associated with fetal maturation may largely explain the effects of gestational age on reproducibility. As fetal CTI may be of value in the identification of pathologies such as intrauterine growth retardation or fetal cardiac hypertrophy, their reliable estimation will be enhanced by strategies which take these factors into account.

  16. Latent fingermark pore area reproducibility.

    PubMed

    Gupta, A; Buckley, K; Sutton, R

    2008-08-01

    The study of the reproducibility of friction ridge pore detail in fingermarks is a measure of their usefulness in personal identification. Pore area in latent prints developed using cyanoacrylate and ninhydrin were examined and measured by photomicrography using appropriate software tools. The data were analysed statistically and the results showed that pore area is not reproducible in developed latent prints, using either of the development techniques. The results add further support to the lack of reliability of pore area in personal identification. PMID:18617339

  17. Rotary head type reproducing apparatus

    DOEpatents

    Takayama, Nobutoshi; Edakubo, Hiroo; Kozuki, Susumu; Takei, Masahiro; Nagasawa, Kenichi

    1986-01-01

    In an apparatus arranged to reproduce an information signal, with a plurality of rotary heads, from a record-bearing medium having many mutually parallel recording tracks in which the information signal is recorded and in which pilot signals of different frequencies are also recorded, one per track, a plurality of reference signals of different frequencies are simultaneously generated. A tracking error is detected by using these reference signals together with the pilot signals included in the signals reproduced by the plurality of rotary heads.

  18. Combined NMR-observation of cold denaturation in supercooled water and heat denaturation enables accurate measurement of ΔCp of protein unfolding.

    PubMed

    Szyperski, Thomas; Mills, Jeffrey L; Perl, Dieter; Balbach, Jochen

    2006-04-01

    Cold and heat denaturation of the double mutant Arg 3→Glu/Leu 66→Glu of cold shock protein Csp of Bacillus caldolyticus was monitored using 1D ¹H NMR spectroscopy in the temperature range from −12 °C in supercooled water up to +70 °C. The fraction of unfolded protein, f(u), was determined as a function of temperature. The data characterizing the unfolding transitions could be consistently interpreted in the framework of two-state models: cold and heat denaturation temperatures were determined to be −11 °C and 39 °C, respectively. A joint fit to both cold and heat transition data enabled the accurate spectroscopic determination of the heat capacity difference between the native and denatured states, ΔCp of unfolding. The approach described in this letter, or a variant thereof, is generally applicable and promises to be of value for routine studies of protein folding.
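
    The joint two-state fit described above can be sketched with the standard Gibbs-Helmholtz expression for the unfolding free energy. The snippet below is a generic illustration on synthetic data, not the authors' analysis; the parameter values and names (dH_m, T_m, dCp) are assumptions made for the example.

```python
# Minimal sketch (synthetic data, not the study's measurements) of a joint
# two-state fit to cold- and heat-denaturation data using the standard
# Gibbs-Helmholtz form of the unfolding free energy.
import numpy as np
from scipy.optimize import curve_fit

R = 8.314e-3  # gas constant in kJ mol^-1 K^-1

def fraction_unfolded(T, dH_m, T_m, dCp):
    """Two-state fraction unfolded at absolute temperature T.

    dH_m : unfolding enthalpy at the heat-denaturation midpoint T_m (kJ/mol)
    dCp  : heat-capacity change of unfolding (kJ/mol/K), assumed constant
    """
    dG = dH_m * (1.0 - T / T_m) + dCp * (T - T_m - T * np.log(T / T_m))
    return 1.0 / (1.0 + np.exp(dG / (R * T)))

# Synthetic "measured" fractions between -12 and +70 degrees C.
T_data = np.linspace(261.0, 343.0, 30)
rng = np.random.default_rng(0)
f_data = fraction_unfolded(T_data, 130.0, 312.0, 4.9) + rng.normal(0, 0.02, T_data.size)

popt, _ = curve_fit(fraction_unfolded, T_data, f_data, p0=(100.0, 310.0, 4.0))
print("fitted dH_m, T_m, dCp:", popt)
```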

  19. Reproducible Bioinformatics Research for Biologists

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This book chapter describes the current Big Data problem in Bioinformatics and the resulting issues with performing reproducible computational research. The core of the chapter provides guidelines and summaries of current tools/techniques that a noncomputational researcher would need to learn to pe...

  20. Reproducible Research in Computational Science

    PubMed Central

    Peng, Roger D.

    2012-01-01

    Computational science has led to exciting new developments, but the nature of the work has exposed limitations in our ability to evaluate published findings. Reproducibility has the potential to serve as a minimum standard for judging scientific claims when full independent replication of a study is not possible. PMID:22144613

  1. Reproducible research in computational science.

    PubMed

    Peng, Roger D

    2011-12-01

    Computational science has led to exciting new developments, but the nature of the work has exposed limitations in our ability to evaluate published findings. Reproducibility has the potential to serve as a minimum standard for judging scientific claims when full independent replication of a study is not possible.

  2. Performance reproducibility index for classification

    PubMed Central

    Yousefi, Mohammadmahdi R.; Dougherty, Edward R.

    2012-01-01

    Motivation: A common practice in biomarker discovery is to decide whether a large laboratory experiment should be carried out based on the results of a preliminary study on a small set of specimens. Consideration of the efficacy of this approach motivates the introduction of a probabilistic measure of whether a classifier showing promising results in a small-sample preliminary study will perform similarly on a large independent sample. Given the error estimate from the preliminary study, if the probability of reproducible error is low, then there is really no purpose in allocating substantially more resources to a large follow-on study. Indeed, if the probability of the preliminary study providing likely reproducible results is small, then why even perform the preliminary study? Results: This article introduces a reproducibility index for classification, measuring the probability that a sufficiently small error estimate on a small sample will motivate a large follow-on study. We provide a simulation study based on synthetic distribution models that possess known intrinsic classification difficulties and emulate real-world scenarios. We also set up similar simulations on four real datasets to show the consistency of results. The reproducibility indices for different distributional models, real datasets and classification schemes are empirically calculated. The effects of reporting and multiple-rule biases on the reproducibility index are also analyzed. Availability: We have implemented in C code the synthetic data distribution model, classification rules, feature selection routine and error estimation methods. The source code is available at http://gsp.tamu.edu/Publications/supplementary/yousefi12a/. Supplementary simulation results are also included. Contact: edward@ece.tamu.edu Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:22954625
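
    The idea of a reproducibility index can be illustrated with a small Monte Carlo experiment: estimate how often a promising small-sample error estimate is followed by an acceptable error on a large independent sample. The sketch below is not the authors' C implementation; the Gaussian data model, the classifier and both thresholds are arbitrary choices made for illustration.

```python
# Simplified Monte Carlo illustration (not the authors' implementation) of a
# reproducibility index for classification: the fraction of "promising"
# small-sample studies whose classifier also performs acceptably on a large
# independent sample drawn from the same synthetic Gaussian model.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
d, n_small, n_large = 5, 30, 2000
delta = 1.0 * np.ones(d)                 # class-mean separation (synthetic)

def sample(n):
    y = np.tile([0, 1], n // 2)          # balanced labels
    X = rng.normal(0.0, 1.0, (n, d)) + np.outer(y, delta)
    return X, y

promising, reproduced = 0, 0
for _ in range(500):
    Xs, ys = sample(n_small)
    clf = LinearDiscriminantAnalysis()
    err_small = 1.0 - cross_val_score(clf, Xs, ys, cv=5).mean()
    if err_small <= 0.20:                # "promising" preliminary study
        promising += 1
        Xl, yl = sample(n_large)         # large follow-on sample
        err_large = 1.0 - clf.fit(Xs, ys).score(Xl, yl)
        reproduced += err_large <= 0.25  # acceptable follow-on error
print("estimated reproducibility index:", reproduced / max(promising, 1))
```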

  3. ITK: enabling reproducible research and open science

    PubMed Central

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46. PMID:24600387

  4. ITK: enabling reproducible research and open science.

    PubMed

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46.

  5. Some properties of negative cloud-to-ground flashes from observations of a local thunderstorm based on accurate-stroke-count studies

    NASA Astrophysics Data System (ADS)

    Zhu, Baoyou; Ma, Ming; Xu, Weiwei; Ma, Dong

    2015-12-01

    Properties of negative cloud-to-ground (CG) lightning flashes, in terms of number of strokes per flash, inter-stroke intervals and the relative intensity of subsequent and first strokes, were presented by accurate-stroke-count studies based on all 1085 negative flashes from a local thunderstorm. The percentage of single-stroke flashes and stroke multiplicity evolved significantly during the whole life cycle of the study thunderstorm. The occurrence probability of negative CG flashes decreased exponentially with the increasing number of strokes per flash. About 30.5% of negative CG flashes contained only one stroke and number of strokes per flash averaged 3.3. In a subset of 753 negative multiple-stroke flashes, about 41.4% contained at least one subsequent stroke stronger than the corresponding first stroke. Subsequent strokes tended to decrease in strength with their orders and the ratio of subsequent to first stroke peaks presented a geometric mean value of 0.52. Interestingly, negative CG flashes of higher multiplicity tended to have stronger initial strokes. 2525 inter-stroke intervals showed a more or less log-normal distribution and gave a geometric mean value of 62 ms. For CG flashes of particular multiplicity geometric mean inter-stroke intervals tended to decrease with the increasing number of strokes per flash, while those intervals associated with higher order strokes tended to be larger than those associated with low order strokes.
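
    The geometric-mean statistics quoted above come from averaging in log space, consistent with the roughly log-normal distribution of inter-stroke intervals; a minimal sketch on made-up interval values follows.

```python
# Minimal sketch of the geometric-mean summary used for inter-stroke
# intervals; the interval values (in milliseconds) are made up.
import numpy as np

intervals_ms = np.array([45.0, 62.0, 80.0, 55.0, 120.0, 38.0, 70.0])
log_iv = np.log(intervals_ms)
geo_mean = np.exp(log_iv.mean())        # geometric mean
geo_sd = np.exp(log_iv.std(ddof=1))     # geometric standard deviation
print(f"geometric mean = {geo_mean:.1f} ms, geometric SD = {geo_sd:.2f}")
```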

  6. Reproducibility of NIF hohlraum measurements

    NASA Astrophysics Data System (ADS)

    Moody, J. D.; Ralph, J. E.; Turnbull, D. P.; Casey, D. T.; Albert, F.; Bachmann, B. L.; Doeppner, T.; Divol, L.; Grim, G. P.; Hoover, M.; Landen, O. L.; MacGowan, B. J.; Michel, P. A.; Moore, A. S.; Pino, J. E.; Schneider, M. B.; Tipton, R. E.; Smalyuk, V. A.; Strozzi, D. J.; Widmann, K.; Hohenberger, M.

    2015-11-01

    The strategy of experimentally "tuning" the implosion in a NIF hohlraum ignition target towards increasing hot-spot pressure, areal density of compressed fuel, and neutron yield relies on a level of experimental reproducibility. We examine the reproducibility of experimental measurements for a collection of 15 identical NIF hohlraum experiments. The measurements include incident laser power, backscattered optical power, x-ray measurements, hot-electron fraction and energy, and target characteristics. We use exact statistics to set 1-sigma confidence levels on the variations in each of the measurements. Of particular interest are the backscatter and laser-induced hot-spot locations on the hohlraum wall. Hohlraum implosion designs typically include variability specifications [S. W. Haan et al., Phys. Plasmas 18, 051001 (2011)]. We describe our findings and compare with the specifications. This work was performed under the auspices of the U.S. Department of Energy by University of California, Lawrence Livermore National Laboratory under Contract W-7405-Eng-48.

  7. Cloning to reproduce desired genotypes.

    PubMed

    Westhusin, M E; Long, C R; Shin, T; Hill, J R; Looney, C R; Pryor, J H; Piedrahita, J A

    2001-01-01

    Cloned sheep, cattle, goats, pigs and mice have now been produced using somatic cells for nuclear transplantation. Animal cloning is still very inefficient with on average less than 10% of the cloned embryos transferred resulting in a live offspring. However successful cloning of a variety of different species and by a number of different laboratory groups has generated tremendous interest in reproducing desired genotypes. Some of these specific genotypes represent animal cell lines that have been genetically modified. In other cases there is a significant demand for cloning animals characterized by their inherent genetic value, for example prize livestock, household pets and rare or endangered species. A number of different variables may influence the ability to reproduce a specific genotype by cloning. These include species, source of recipient ova, cell type of nuclei donor, treatment of donor cells prior to nuclear transfer, and the techniques employed for nuclear transfer. At present, there is no solid evidence that suggests cloning will be limited to only a few specific animals, and in fact, most data collected to date suggests cloning will be applicable to a wide variety of different animals. The ability to reproduce any desired genotype by cloning will ultimately depend on the amount of time and resources invested in research.

  8. Masses of the components of SB2 binaries observed with Gaia - III. Accurate SB2 orbits for 10 binaries and masses of HIP 87895

    NASA Astrophysics Data System (ADS)

    Kiefer, F.; Halbwachs, J.-L.; Arenou, F.; Pourbaix, D.; Famaey, B.; Guillout, P.; Lebreton, Y.; Nebot Gómez-Morán, A.; Mazeh, T.; Salomon, J.-B.; Soubiran, C.; Tal-Or, L.

    2016-05-01

    In anticipation of the Gaia astrometric mission, a large sample of spectroscopic binaries has been observed since 2010 with the Spectrographe pour l'Observation des PHénomènes des Intérieurs Stellaires et des Exoplanètes spectrograph at the Haute-Provence Observatory. Our aim is to derive the orbital elements of double-lined spectroscopic binaries (SB2s) with an accuracy sufficient to finally obtain the masses of the components with relative errors as small as 1 per cent when the astrometric measurements of Gaia are taken into account. In this paper, we present the results from five years of observations of 10 SB2 systems with periods ranging from 37 to 881 d. Using the TODMOR algorithm, we computed radial velocities from the spectra, and then derived the orbital elements of these binary systems. The minimum masses of the components are then obtained with an accuracy better than 1.2 per cent for the 10 binaries. Combining the radial velocities with existing interferometric measurements, we derived the masses of the primary and secondary components of HIP 87895 with an accuracy of 0.98 and 1.2 per cent, respectively.

  9. Accomplishments of the MUSICA project to provide accurate, long-term, global and high-resolution observations of tropospheric {H2O,δD} pairs - a review

    NASA Astrophysics Data System (ADS)

    Schneider, Matthias; Wiegele, Andreas; Barthlott, Sabine; González, Yenny; Christner, Emanuel; Dyroff, Christoph; García, Omaira E.; Hase, Frank; Blumenstock, Thomas; Sepúlveda, Eliezer; Mengistu Tsidu, Gizaw; Takele Kenea, Samuel; Rodríguez, Sergio; Andrey, Javier

    2016-07-01

    In the lower/middle troposphere, {H2O,δD} pairs are good proxies for moisture pathways; however, their observation, in particular when using remote sensing techniques, is challenging. The project MUSICA (MUlti-platform remote Sensing of Isotopologues for investigating the Cycle of Atmospheric water) addresses this challenge by integrating the remote sensing with in situ measurement techniques. The aim is to retrieve calibrated tropospheric {H2O,δD} pairs from the middle infrared spectra measured from ground by FTIR (Fourier transform infrared) spectrometers of the NDACC (Network for the Detection of Atmospheric Composition Change) and the thermal nadir spectra measured by IASI (Infrared Atmospheric Sounding Interferometer) aboard the MetOp satellites. In this paper, we present the final MUSICA products, and discuss the characteristics and potential of the NDACC/FTIR and MetOp/IASI {H2O,δD} data pairs. First, we briefly resume the particularities of an {H2O,δD} pair retrieval. Second, we show that the remote sensing data of the final product version are absolutely calibrated with respect to H2O and δD in situ profile references measured in the subtropics, between 0 and 7 km. Third, we reveal that the {H2O,δD} pair distributions obtained from the different remote sensors are consistent and allow distinct lower/middle tropospheric moisture pathways to be identified in agreement with multi-year in situ references. Fourth, we document the possibilities of the NDACC/FTIR instruments for climatological studies (due to long-term monitoring) and of the MetOp/IASI sensors for observing diurnal signals on a quasi-global scale and with high horizontal resolution. Fifth, we discuss the risk of misinterpreting {H2O,δD} pair distributions due to incomplete processing of the remote sensing products.

  10. Is Grannum grading of the placenta reproducible?

    NASA Astrophysics Data System (ADS)

    Moran, Mary; Ryan, John; Brennan, Patrick C.; Higgins, Mary; McAuliffe, Fionnuala M.

    2009-02-01

    Current ultrasound assessment of placental calcification relies on Grannum grading. The aim of this study was to assess if this method is reproducible by measuring inter- and intra-observer variation in grading placental images, under strictly controlled viewing conditions. Thirty placental images were acquired and digitally saved. Five experienced sonographers independently graded the images on two separate occasions. In order to eliminate any technological factors which could affect data reliability and consistency all observers reviewed images at the same time. To optimise viewing conditions ambient lighting was maintained between 25-40 lux, with monitors calibrated to the GSDF standard to ensure consistent brightness and contrast. Kappa (κ) analysis of the grades assigned was used to measure inter- and intra-observer reliability. Intra-observer agreement had a moderate mean κ-value of 0.55, with individual comparisons ranging from 0.30 to 0.86. Two images saved from the same patient, during the same scan, were each graded as I, II and III by the same observer. A mean κ-value of 0.30 (range from 0.13 to 0.55) indicated fair inter-observer agreement over the two occasions and only one image was graded consistently the same by all five observers. The study findings confirmed the lack of reproducibility associated with Grannum grading of the placenta despite optimal viewing conditions and highlight the need for new methods of assessing placental health in order to improve neonatal outcomes. Alternative methods for quantifying placental calcification such as a software based technique and 3D ultrasound assessment need to be explored.
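
    The kappa analysis used above can be reproduced with standard tools; the sketch below computes Cohen's kappa for two observers on hypothetical Grannum grades (0 to III), not the study's data.

```python
# Minimal sketch of inter-observer agreement via Cohen's kappa on
# hypothetical Grannum grades from two observers (not the study data).
from sklearn.metrics import cohen_kappa_score

observer_a = [0, 1, 1, 2, 3, 2, 1, 0, 2, 3, 1, 2]
observer_b = [0, 1, 2, 2, 3, 1, 1, 0, 2, 2, 1, 3]
print(f"Cohen's kappa = {cohen_kappa_score(observer_a, observer_b):.2f}")
```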

  11. Reproducibility of sterilized rubber impressions.

    PubMed

    Abdelaziz, Khalid M; Hassan, Ahmed M; Hodges, J S

    2004-01-01

    Impressions, dentures and other dental appliances may be contaminated with oral micro-flora or other organisms of varying pathogenicity from patient's saliva and blood. Several approaches have been tried to control the transmission of infectious organisms via dental impressions and because disinfection is less effective and has several drawbacks for impression characterization, several sterilization methods have been suggested. This study evaluated the reproducibility of rubber impressions after sterilization by different methods. Dimensional accuracy and wettability of two rubber impression materials (vinyl polysiloxane and polyether) were evaluated after sterilization by each of three well-known methods (immersion in 2% glutaraldehyde for 10 h, autoclaving and microwave radiation). Non-sterilized impressions served as control. The effect of the tray material on impression accuracy and the effect of topical surfactant on the wettability were also evaluated. One-way ANOVA with Dunnett's method was used for statistical analysis. All sterilizing methods reduced the reproducibility of rubber impressions, although not always significantly. Microwave sterilization had a small effect on both accuracy and wettability. The greater effects of the other methods could usually be overcome by using ceramic trays and by spraying impression surfaces with surfactant before pouring the gypsum mix. There was one exception: glutaraldehyde still degraded dimensional accuracy even with ceramic trays and surfactant. We conclude that a) sterilization of rubber impressions made on acrylic trays was usually associated with a degree of dimensional change; b) microwave energy seems to be a suitable technique for sterilizing rubber impressions; c) topical surfactant application helped restore wettability of sterilized impressions. PMID:15798825

  12. Evaluation of guidewire path reproducibility.

    PubMed

    Schafer, Sebastian; Hoffmann, Kenneth R; Noël, Peter B; Ionita, Ciprian N; Dmochowski, Jacek

    2008-05-01

    The number of minimally invasive vascular interventions is increasing. In these interventions, a variety of devices are directed to and placed at the site of intervention. The device used in almost all of these interventions is the guidewire, acting as a monorail for all devices which are delivered to the intervention site. However, even with the guidewire in place, clinicians still experience difficulties during the interventions. As a first step toward understanding these difficulties and facilitating guidewire and device guidance, we have investigated the reproducibility of the final paths of the guidewire in vessel phantom models with respect to different factors: user, materials and geometry. Three vessel phantoms (vessel diameters approximately 4 mm) were constructed from silicone tubing, with tortuosity similar to the internal carotid artery, and encased in Sylgard elastomer. Several trained users repeatedly passed two guidewires of different flexibility through the phantoms under pulsatile flow conditions. After the guidewire had been placed, rotational c-arm image sequences were acquired (9 in. II mode, 0.185 mm pixel size), and the phantom and guidewire were reconstructed (512³, 0.288 mm voxel size). The reconstructed volumes were aligned. The centerlines of the guidewire and the phantom vessel were then determined using region-growing techniques. Guidewire paths appear similar across users but not across materials. The average root mean square difference of the repeated placement was 0.17 ± 0.02 mm (plastic-coated guidewire), 0.73 ± 0.55 mm (steel guidewire) and 1.15 ± 0.65 mm (steel versus plastic-coated). For a given guidewire, these results indicate that the guidewire path is relatively reproducible in shape and position.

  13. Reproducible analyses of microbial food for advanced life support systems

    NASA Technical Reports Server (NTRS)

    Petersen, Gene R.

    1988-01-01

    The use of yeasts in controlled ecological life support systems (CELSS) for microbial food regeneration in space required the accurate and reproducible analysis of intracellular carbohydrate and protein levels. The reproducible analysis of glycogen was a key element in estimating the overall content of edibles in candidate yeast strains. Typical analytical methods for estimating glycogen in Saccharomyces were not found to be entirely applicable to other candidate strains. Rigorous cell lysis coupled with acid/base fractionation followed by specific enzymatic glycogen analyses was required to obtain accurate results in two strains of Candida. A profile of edible fractions of these strains was then determined. The suitability of yeasts as food sources in CELSS food production processes is discussed.

  14. The reproducible radio outbursts of SS Cygni

    NASA Astrophysics Data System (ADS)

    Russell, T. D.; Miller-Jones, J. C. A.; Sivakoff, G. R.; Altamirano, D.; O'Brien, T. J.; Page, K. L.; Templeton, M. R.; Körding, E. G.; Knigge, C.; Rupen, M. P.; Fender, R. P.; Heinz, S.; Maitra, D.; Markoff, S.; Migliari, S.; Remillard, R. A.; Russell, D. M.; Sarazin, C. L.; Waagen, E. O.

    2016-08-01

    We present the results of our intensive radio observing campaign of the dwarf nova SS Cyg during its 2010 April outburst. We argue that the observed radio emission was produced by synchrotron emission from a transient radio jet. Comparing the radio light curves from previous and subsequent outbursts of this system (including high-resolution observations from outbursts in 2011 and 2012) shows that the typical long and short outbursts of this system exhibit reproducible radio outbursts that do not vary significantly between outbursts, which is consistent with the similarity of the observed optical, ultraviolet and X-ray light curves. Contemporaneous optical and X-ray observations show that the radio emission appears to have been triggered at the same time as the initial X-ray flare, which occurs as disc material first reaches the boundary layer. This raises the possibility that the boundary region may be involved in jet production in accreting white dwarf systems. Our high spatial resolution monitoring shows that the compact jet remained active throughout the outburst with no radio quenching.

  15. Wire like link for cycle reproducible and cycle accurate hardware accelerator

    SciTech Connect

    Asaad, Sameh; Kapur, Mohit; Parker, Benjamin D

    2015-04-07

    First and second field programmable gate arrays are provided which implement first and second blocks of a circuit design to be simulated. The field programmable gate arrays are operated at a first clock frequency and a wire like link is provided to send a plurality of signals between them. The wire like link includes a serializer, on the first field programmable gate array, to serialize the plurality of signals; a deserializer on the second field programmable gate array, to deserialize the plurality of signals; and a connection between the serializer and the deserializer. The serializer and the deserializer are operated at a second clock frequency, greater than the first clock frequency, and the second clock frequency is selected such that latency of transmission and reception of the plurality of signals is less than the period corresponding to the first clock frequency.

  16. Applicability of density functional theory in reproducing accurate vibrational spectra of surface bound species.

    PubMed

    Matanović, Ivana; Atanassov, Plamen; Kiefer, Boris; Garzon, Fernando H; Henson, Neil J

    2014-10-01

    The structural equilibrium parameters, the adsorption energies, and the vibrational frequencies of the nitrogen molecule and the hydrogen atom adsorbed on the (111) surface of rhodium have been investigated using different generalized-gradient approximation (GGA), nonlocal correlation, meta-GGA, and hybrid functionals, namely, Perdew, Burke, and Ernzerhof (PBE), Revised-RPBE, vdW-DF, Tao, Perdew, Staroverov, and Scuseria functional (TPSS), and Heyd, Scuseria, and Ernzerhof (HSE06) functional in the plane wave formalism. Among the five tested functionals, nonlocal vdW-DF and meta-GGA TPSS functionals are most successful in describing the energetics of dinitrogen physisorption to the Rh(111) surface, while the PBE functional provides the correct chemisorption energy for the hydrogen atom. It was also found that the TPSS functional produces the best vibrational spectra of the nitrogen molecule and the hydrogen atom on rhodium within the harmonic formalism, with errors of −2.62% and −1.1% for the N-N and Rh-H stretching frequencies, respectively. Thus, the TPSS functional was proposed as the method of choice for obtaining vibrational spectra of low-weight adsorbates on metallic surfaces within the harmonic approximation. At the anharmonic level, by decoupling the Rh-H and N-N stretching modes from the bulk phonons and by solving the one- and two-dimensional Schrödinger equations associated with the Rh-H, Rh-N, and N-N potential energies, we calculated the anharmonic corrections for the N-N and Rh-H stretching modes as −31 cm⁻¹ and −77 cm⁻¹ at the PBE level. Anharmonic vibrational frequencies calculated with the hybrid HSE06 functional are in best agreement with available experiments. PMID:25164265

  17. Applicability of Density Functional Theory in Reproducing Accurate Vibrational Spectra of Surface Bound Species

    SciTech Connect

    Matanovic, Ivana; Atanassov, Plamen; Kiefer, Boris; Garzon, Fernando; Henson, Neil J.

    2014-10-05

    The structural equilibrium parameters, the adsorption energies, and the vibrational frequencies of the nitrogen molecule and the hydrogen atom adsorbed on the (111) surface of rhodium have been investigated using different generalized-gradient approximation (GGA), nonlocal correlation, meta-GGA, and hybrid functionals, namely, Perdew, Burke, and Ernzerhof (PBE), Revised-RPBE, vdW-DF, Tao, Perdew, Staroverov, and Scuseria functional (TPSS), and Heyd, Scuseria, and Ernzerhof (HSE06) functional in the plane wave formalism. Among the five tested functionals, nonlocal vdW-DF and meta-GGA TPSS functionals are most successful in describing the energetics of dinitrogen physisorption to the Rh(111) surface, while the PBE functional provides the correct chemisorption energy for the hydrogen atom. It was also found that the TPSS functional produces the best vibrational spectra of the nitrogen molecule and the hydrogen atom on rhodium within the harmonic formalism, with errors of −2.62% and −1.1% for the N-N and Rh-H stretching frequencies, respectively. Thus, the TPSS functional was proposed as the method of choice for obtaining vibrational spectra of low-weight adsorbates on metallic surfaces within the harmonic approximation. At the anharmonic level, by decoupling the Rh-H and N-N stretching modes from the bulk phonons and by solving the one- and two-dimensional Schrödinger equations associated with the Rh-H, Rh-N, and N-N potential energies, we calculated the anharmonic corrections for the N-N and Rh-H stretching modes as −31 cm⁻¹ and −77 cm⁻¹ at the PBE level. Anharmonic vibrational frequencies calculated with the hybrid HSE06 functional are in best agreement with available experiments.

  18. Accurate Optical Reference Catalogs

    NASA Astrophysics Data System (ADS)

    Zacharias, N.

    2006-08-01

    Current and near future all-sky astrometric catalogs on the ICRF are reviewed with the emphasis on reference star data at optical wavelengths for user applications. The standard error of a Hipparcos Catalogue star position is now about 15 mas per coordinate. For the Tycho-2 data it is typically 20 to 100 mas, depending on magnitude. The USNO CCD Astrograph Catalog (UCAC) observing program was completed in 2004 and reductions toward the final UCAC3 release are in progress. This all-sky reference catalogue will have positional errors of 15 to 70 mas for stars in the 10 to 16 mag range, with a high degree of completeness. Proper motions for the about 60 million UCAC stars will be derived by combining UCAC astrometry with available early epoch data, including yet unpublished scans of the complete set of AGK2, Hamburg Zone astrograph and USNO Black Birch programs. Accurate positional and proper motion data are combined in the Naval Observatory Merged Astrometric Dataset (NOMAD) which includes Hipparcos, Tycho-2, UCAC2, USNO-B1, NPM+SPM plate scan data for astrometry, and is supplemented by multi-band optical photometry as well as 2MASS near infrared photometry. The Milli-Arcsecond Pathfinder Survey (MAPS) mission is currently being planned at USNO. This is a micro-satellite to obtain 1 mas positions, parallaxes, and 1 mas/yr proper motions for all bright stars down to about 15th magnitude. This program will be supplemented by a ground-based program to reach 18th magnitude on the 5 mas level.

  19. On the use of ICE/SAT Lidar Space-Born Observations to Evaluate the Ability of MM5 Meso-Scale Model to Reproduce High Altitude Clouds Over Europe in Fall.

    NASA Astrophysics Data System (ADS)

    Krotkov, N. A.; Bhartia, P.; Yang, K.; Carn, S. A.; Krueger, A. J.; Dickerson, R. R.; Hains, J.; Li, C.; Li, Z.; Marufu, L.; Stehr, J.; Levelt, P. F.

    2005-05-01

    The Ozone Monitoring Instrument (OMI) on EOS/Aura offers unprecedented spatial and spectral resolution, coupled with global coverage, for space-based UV measurements of sulfur dioxide (SO2). Publicly released SO2 pollution data are processed with the Band Residual Difference (BRD) algorithm that uses calibrated residuals at SO2 absorption band centers produced by the NASA operational ozone algorithm (OMTO3). By using optimum wavelengths for retrieval of SO2, the retrieval sensitivity is improved over NASA predecessor Total Ozone Mapping Spectrometer (TOMS) by factors of 10 to 20, depending on location. The ground footprint of OMI is 8 times smaller than TOMS. These factors produce a two orders of magnitude improvement in the minimum detectable mass of SO2. The improved sensitivity now permits daily global measurement of heavy anthropogenic SO2 pollution. Anthropogenic SO2 emissions have been measured by OMI over known sources of air pollution, such as eastern China, Eastern Europe, and from individual copper smelters in South America and elsewhere. Here we present data from a case study conducted over Shenyang in NE China as part of EAST-AIRE in April 2005. SO2 observations from instrumented aircraft flights are compared with OMI SO2 maps. The OMI SO2 algorithm was improved to account for the known altitude profile of SO2, and the comparison demonstrates that this algorithm can distinguish between background SO2 conditions and heavy pollution on a daily basis. Between 5 and 7 April 2005 a cold front traveled from continental China, over Korea and on to the Sea of Japan. The satellite-derived measurements of SO2 confirm the in situ aircraft observations of high concentrations of SO2 (ca 4 DU) ahead of the front and lower concentrations behind it and provide evidence for a large-scale impact of pollutant emissions. The BRD algorithm sensitivity does not represent the maximum sensitivity theoretically achievable with OMI, and hence future improvements in instrument

  20. Can atmospheric reanalysis datasets be used to reproduce flood characteristics?

    NASA Astrophysics Data System (ADS)

    Andreadis, K.; Schumann, G.; Stampoulis, D.

    2014-12-01

    Floods are one of the costliest natural disasters and the ability to understand their characteristics and their interactions with population, land cover and climate changes is of paramount importance. In order to accurately reproduce flood characteristics such as water inundation and heights both in the river channels and floodplains, hydrodynamic models are required. Most of these models operate at very high resolutions and are computationally very expensive, making their application over large areas very difficult. However, a need exists for such models to be applied at regional to global scales so that the effects of climate change with regards to flood risk can be examined. We use the LISFLOOD-FP hydrodynamic model to simulate a 40-year history of flood characteristics at the continental scale, particularly over Australia. LISFLOOD-FP is a 2-D hydrodynamic model that solves the approximate Saint-Venant equations at large scales (on the order of 1 km) using a sub-grid representation of the river channel. This implementation is part of an effort towards a global 1-km flood modeling framework that will allow the reconstruction of a long-term flood climatology. The components of this framework include a hydrologic model (the widely-used Variable Infiltration Capacity model) and a meteorological dataset that forces it. In order to extend the simulated flood climatology to 50-100 years in a consistent manner, reanalysis datasets have to be used. The objective of this study is the evaluation of multiple atmospheric reanalysis datasets (ERA, NCEP, MERRA, JRA) as inputs to the VIC/LISFLOOD-FP model. Comparisons of the simulated flood characteristics are made with both satellite observations of inundation and a benchmark simulation of LISFLOOD-FP being forced by observed flows. Finally, the implications of the availability of a global flood modeling framework for producing flood hazard maps and disseminating disaster information are discussed.

  1. Enhancing reproducibility of ultrasonic measurements by new users

    NASA Astrophysics Data System (ADS)

    Pramanik, Manojit; Gupta, Madhumita; Krishnan, Kajoli Banerjee

    2013-03-01

    Perception of operator influences ultrasound image acquisition and processing. Lower costs are attracting new users to medical ultrasound. Anticipating an increase in this trend, we conducted a study to quantify the variability in ultrasonic measurements made by novice users and identify methods to reduce it. We designed a protocol with four presets and trained four new users to scan and manually measure the head circumference of a fetal phantom with an ultrasound scanner. In the first phase, the users followed this protocol in seven distinct sessions. They then received feedback on the quality of the scans from an expert. In the second phase, two of the users repeated the entire protocol aided by visual cues provided to them during scanning. We performed off-line measurements on all the images using a fully automated algorithm capable of measuring the head circumference from fetal phantom images. The ground truth (198.1±1.6 mm) was based on sixteen scans and measurements made by an expert. Our analysis shows that: (1) the inter-observer variability of manual measurements was 5.5 mm, whereas the inter-observer variability of automated measurements was only 0.6 mm in the first phase (2) consistency of image appearance improved and mean manual measurements was 4-5 mm closer to the ground truth in the second phase (3) automated measurements were more precise, accurate and less sensitive to different presets compared to manual measurements in both phases. Our results show that visual aids and automation can bring more reproducibility to ultrasonic measurements made by new users.

  2. Meteorite Atmospheric Entry Reproduced in Plasmatron

    NASA Astrophysics Data System (ADS)

    Pittarello, L.; McKibbin, S.; Goderis, S.; Soens, B.; Bariselli, F.; Barros Dias, B. R.; Zavalan, F. L.; Magin, T.; Claeys, Ph.

    2016-08-01

    Plasmatron facility allows experimental conditions that reproduce atmospheric entry of meteorites. Tests on basalt, as meteorite analogue, have been performed. Preliminary results have highlighted melting and evaporation effects.

  3. Reproducibility of MRI-Determined Proton Density Fat Fraction Across Two Different MR Scanner Platforms

    PubMed Central

    Kang, Geraldine H.; Cruite, Irene; Shiehmorteza, Masoud; Wolfson, Tanya; Gamst, Anthony C.; Hamilton, Gavin; Bydder, Mark; Middleton, Michael S.; Sirlin, Claude B.

    2016-01-01

    Purpose To evaluate magnetic resonance imaging (MRI)-determined proton density fat fraction (PDFF) reproducibility across two MR scanner platforms and, using MR spectroscopy (MRS)-determined PDFF as reference standard, to confirm MRI-determined PDFF estimation accuracy. Materials and Methods This prospective, cross-sectional, crossover, observational pilot study was approved by an Institutional Review Board. Twenty-one subjects gave written informed consent and underwent liver MRI and MRS at both 1.5T (Siemens Symphony scanner) and 3T (GE Signa Excite HD scanner). MRI-determined PDFF was estimated using an axial 2D spoiled gradient-recalled echo sequence with low flip-angle to minimize T1 bias and six echo-times to permit correction of T2* and fat-water signal interference effects. MRS-determined PDFF was estimated using a stimulated-echo acquisition mode sequence with long repetition time to minimize T1 bias and five echo times to permit T2 correction. Interscanner reproducibility of MRI determined PDFF was assessed by correlation analysis; accuracy was assessed separately at each field strength by linear regression analysis using MRS-determined PDFF as reference standard. Results 1.5T and 3T MRI-determined PDFF estimates were highly correlated (r = 0.992). MRI-determined PDFF estimates were accurate at both 1.5T (regression slope/intercept = 0.958/−0.48) and 3T (slope/intercept = 1.020/0.925) against the MRS-determined PDFF reference. Conclusion MRI-determined PDFF estimation is reproducible and, using MRS-determined PDFF as reference standard, accurate across two MR scanner platforms at 1.5T and 3T. PMID:21769986
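
    The two analyses described above (inter-scanner correlation and linear regression against the MRS reference) can be sketched in a few lines; the numbers below are synthetic stand-ins, not the study data.

```python
# Sketch (synthetic numbers, not the study data) of the reproducibility and
# accuracy analyses described above: Pearson correlation between 1.5T and 3T
# MRI-PDFF estimates, and linear regression of each against MRS-PDFF.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
pdff_mrs = rng.uniform(0.0, 40.0, 21)                    # reference standard (%)
pdff_15t = 0.96 * pdff_mrs - 0.5 + rng.normal(0, 1, 21)  # simulated 1.5T estimates
pdff_3t = 1.02 * pdff_mrs + 0.9 + rng.normal(0, 1, 21)   # simulated 3T estimates

r, _ = stats.pearsonr(pdff_15t, pdff_3t)                 # inter-scanner agreement
fit_15t = stats.linregress(pdff_mrs, pdff_15t)           # accuracy vs. MRS
fit_3t = stats.linregress(pdff_mrs, pdff_3t)
print(f"1.5T vs 3T correlation r = {r:.3f}")
print(f"1.5T: slope {fit_15t.slope:.3f}, intercept {fit_15t.intercept:.2f}")
print(f"3T:   slope {fit_3t.slope:.3f}, intercept {fit_3t.intercept:.2f}")
```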

  4. ReproPhylo: An Environment for Reproducible Phylogenomics

    PubMed Central

    Szitenberg, Amir; John, Max; Blaxter, Mark L.; Lunt, David H.

    2015-01-01

    The reproducibility of experiments is key to the scientific process, and particularly necessary for accurate reporting of analyses in data-rich fields such as phylogenomics. We present ReproPhylo, a phylogenomic analysis environment developed to ensure experimental reproducibility, to facilitate the handling of large-scale data, and to assist methodological experimentation. Reproducibility, and instantaneous repeatability, is built in to the ReproPhylo system and does not require user intervention or configuration because it stores the experimental workflow as a single, serialized Python object containing explicit provenance and environment information. This ‘single file’ approach ensures the persistence of provenance across iterations of the analysis, with changes automatically managed by the version control program Git. This file, along with a Git repository, are the primary reproducibility outputs of the program. In addition, ReproPhylo produces an extensive human-readable report and generates a comprehensive experimental archive file, both of which are suitable for submission with publications. The system facilitates thorough experimental exploration of both parameters and data. ReproPhylo is a platform independent CC0 Python module and is easily installed as a Docker image or a WinPython self-sufficient package, with a Jupyter Notebook GUI, or as a slimmer version in a Galaxy distribution. PMID:26335558

  5. ReproPhylo: An Environment for Reproducible Phylogenomics.

    PubMed

    Szitenberg, Amir; John, Max; Blaxter, Mark L; Lunt, David H

    2015-09-01

    The reproducibility of experiments is key to the scientific process, and particularly necessary for accurate reporting of analyses in data-rich fields such as phylogenomics. We present ReproPhylo, a phylogenomic analysis environment developed to ensure experimental reproducibility, to facilitate the handling of large-scale data, and to assist methodological experimentation. Reproducibility, and instantaneous repeatability, is built in to the ReproPhylo system and does not require user intervention or configuration because it stores the experimental workflow as a single, serialized Python object containing explicit provenance and environment information. This 'single file' approach ensures the persistence of provenance across iterations of the analysis, with changes automatically managed by the version control program Git. This file, along with a Git repository, are the primary reproducibility outputs of the program. In addition, ReproPhylo produces an extensive human-readable report and generates a comprehensive experimental archive file, both of which are suitable for submission with publications. The system facilitates thorough experimental exploration of both parameters and data. ReproPhylo is a platform independent CC0 Python module and is easily installed as a Docker image or a WinPython self-sufficient package, with a Jupyter Notebook GUI, or as a slimmer version in a Galaxy distribution. PMID:26335558
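
    The 'single serialized object under version control' pattern described in the two ReproPhylo records above can be sketched generically. The snippet below is not ReproPhylo's API; the class, file name and commit message are illustrative, and an initialized Git repository is assumed in the working directory.

```python
# Generic sketch of serializing an entire analysis state, with provenance
# metadata, to one file tracked by Git.  This is NOT ReproPhylo's API; all
# names are illustrative and a git repository is assumed to already exist.
import pickle
import platform
import subprocess
import time

class Workflow:
    """Toy container holding parameters, results and provenance metadata."""
    def __init__(self, parameters):
        self.parameters = parameters
        self.results = {}
        self.provenance = {
            "created": time.strftime("%Y-%m-%dT%H:%M:%S"),
            "python": platform.python_version(),
            "platform": platform.platform(),
        }

def checkpoint(workflow, path="workflow.pkl", message="analysis checkpoint"):
    """Serialize the whole workflow object and record the change in git."""
    with open(path, "wb") as fh:
        pickle.dump(workflow, fh)
    subprocess.run(["git", "add", path], check=True)
    subprocess.run(["git", "commit", "-m", message], check=True)

wf = Workflow({"alignment": "mafft", "tree": "raxml", "bootstrap": 100})
wf.results["n_loci"] = 42
# checkpoint(wf)  # uncomment inside an initialized git repository
```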

  6. A simple and reproducible breast cancer prognostic test

    PubMed Central

    2013-01-01

    Background A small number of prognostic and predictive tests based on gene expression are currently offered as reference laboratory tests. In contrast to such success stories, a number of flaws and errors have recently been identified in other genomic-based predictors and the success rate for developing clinically useful genomic signatures is low. These errors have led to widespread concerns about the protocols for conducting and reporting of computational research. As a result, a need has emerged for a template for reproducible development of genomic signatures that incorporates full transparency, data sharing and statistical robustness. Results Here we present the first fully reproducible analysis of the data used to train and test MammaPrint, an FDA-cleared prognostic test for breast cancer based on a 70-gene expression signature. We provide all the software and documentation necessary for researchers to build and evaluate genomic classifiers based on these data. As an example of the utility of this reproducible research resource, we develop a simple prognostic classifier that uses only 16 genes from the MammaPrint signature and is equally accurate in predicting 5-year disease free survival. Conclusions Our study provides a prototypic example for reproducible development of computational algorithms for learning prognostic biomarkers in the era of personalized medicine. PMID:23682826
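
    As an illustration of the kind of simple gene-expression classifier described above, the sketch below fits a logistic regression to a synthetic 16-gene expression matrix. It does not use the MammaPrint data or gene list; the data, labels and choice of model are assumptions made for the example.

```python
# Illustrative sketch only: a simple prognostic classifier on a synthetic
# 16-gene expression matrix (not the MammaPrint cohorts or gene list).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n_samples, n_genes = 200, 16
X = rng.normal(0.0, 1.0, (n_samples, n_genes))            # log-expression values
weights = rng.normal(0.0, 1.0, n_genes)                   # synthetic risk weights
p_event = 1.0 / (1.0 + np.exp(-(X @ weights)))
y = (rng.uniform(size=n_samples) < p_event).astype(int)   # 5-year event yes/no

clf = LogisticRegression(max_iter=1000)
acc = cross_val_score(clf, X, y, cv=5).mean()
print(f"cross-validated accuracy on synthetic data: {acc:.2f}")
```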

  7. Skill, reproducibility and potential predictability of the West African monsoon in coupled GCMs

    NASA Astrophysics Data System (ADS)

    Philippon, N.; Doblas-Reyes, F. J.; Ruti, P. M.

    2010-07-01

    In the framework of the ENSEMBLES FP6 project, an ensemble prediction system based on five different state-of-the-art European coupled models has been developed. This study evaluates the performance of these models for forecasting the West African monsoon (WAM) at the monthly time scale. From simulations started the 1 May of each year and covering the period 1991-2001, the reproducibility and potential predictability (PP) of key parameters of the WAM—rainfall, zonal and meridional wind at four levels from the surface to 200 hPa, and specific humidity, from July to September—are assessed. The Sahelian rainfall mode of variability is not accurately reproduced contrary to the Guinean rainfall one: the correlation between observations (from CMAP) and the multi-model ensemble mean is 0.17 and 0.55, respectively. For the Sahelian mode, the correlation is consistent with a low PP of about ~6%. The PP of the Guinean mode is higher, ~44% suggesting a stronger forcing of the sea surface temperature on rainfall variability over this region. Parameters relative to the atmospheric dynamics are on average much more skillful and reproducible than rainfall. Among them, the first mode of variability of the zonal wind at 200 hPa that depicts the Tropical Easterly Jet, is correlated at 0.79 with its “observed” counterpart (from the NCEP/DOE2 reanalyses) and has a PP of 39%. Moreover, models reproduce the correlations between all the atmospheric dynamics parameters and the Sahelian rainfall in a satisfactory way. In that context, a statistical adaptation of the atmospheric dynamic forecasts, using a linear regression model with the leading principal components of the atmospheric dynamical parameters studied, leads to moderate receiver operating characteristic area under the curve and correlation skill scores for the Sahelian rainfall. These scores are however much higher than those obtained using the modelled rainfall.

  8. Explorations in statistics: statistical facets of reproducibility.

    PubMed

    Curran-Everett, Douglas

    2016-06-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eleventh installment of Explorations in Statistics explores statistical facets of reproducibility. If we obtain an experimental result that is scientifically meaningful and statistically unusual, we would like to know that our result reflects a general biological phenomenon that another researcher could reproduce if (s)he repeated our experiment. But more often than not, we may learn this researcher cannot replicate our result. The National Institutes of Health and the Federation of American Societies for Experimental Biology have created training modules and outlined strategies to help improve the reproducibility of research. These particular approaches are necessary, but they are not sufficient. The principles of hypothesis testing and estimation are inherent to the notion of reproducibility in science. If we want to improve the reproducibility of our research, then we need to rethink how we apply fundamental concepts of statistics to our science.

  9. An index of parameter reproducibility accounting for estimation uncertainty: theory and case study on β-cell responsivity and insulin sensitivity.

    PubMed

    Dalla Man, Chiara; Pillonetto, Gianluigi; Riz, Michela; Cobelli, Claudio

    2015-06-01

    Parameter reproducibility is necessary to perform longitudinal studies where parameters are assessed to monitor disease progression or effect of therapy but are also useful in powering the study, i.e., to define how many subjects should be studied to observe a given effect. The assessment of parameter reproducibility is usually accomplished by methods that do not take into account the fact that these parameters are estimated with uncertainty. This is particularly relevant in physiological and clinical studies where usually reproducibility cannot be assessed by multiple testing and is usually assessed from a single replication of the test. Working in a suitable stochastic framework, here we propose a new index (S) to measure reproducibility that takes into account parameter uncertainty and is particularly suited to handle the normal testing conditions of physiological and clinical investigations. Simulation results prove that S, by properly taking into account parameter uncertainty, is more accurate and robust than the methods available in the literature. The new metric is applied to assess reproducibility of insulin sensitivity and β-cell responsivity of a mixed-meal tolerance test from data obtained in the same subjects retested 1 wk apart. Results show that the indices of insulin sensitivity and β-cell responsivity to glucose are well reproducible. We conclude that the oral minimal models provide useful indices that can be used safely in prospective studies or to assess the efficacy of a given therapy.
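
    The core point above, that estimation uncertainty should enter any judgement of test-retest reproducibility, can be illustrated very simply. The sketch below is not the authors' S index; it only shows a z-score of the test-retest difference that uses both point estimates and their standard errors, with made-up numbers.

```python
# Simplified illustration (not the authors' S index): judging a test-retest
# difference relative to the combined estimation uncertainty of the two
# parameter estimates.  All numbers are made up.
import math

def retest_z(est1, se1, est2, se2):
    """z-score of the difference between two uncertain estimates."""
    return (est1 - est2) / math.sqrt(se1 ** 2 + se2 ** 2)

# Hypothetical insulin-sensitivity estimates from two visits one week apart.
z = retest_z(9.8, 1.4, 11.1, 1.6)
print(f"z = {z:.2f} (|z| well below 2 suggests the difference is within uncertainty)")
```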

  10. An evaluation of WRF's ability to reproduce the surface wind over complex terrain based on typical circulation patterns

    NASA Astrophysics Data System (ADS)

    Jiménez, P. A.; Dudhia, J.; González-Rouco, J. F.; Montávez, J. P.; García-Bustamante, E.; Navarro, J.; Vilà-Guerau de Arellano, J.; Muñoz-Roldán, A.

    2013-07-01

    The performance of the Weather Research and Forecasting (WRF) model in reproducing surface wind circulations over complex terrain is examined. The atmospheric evolution is simulated using two versions of the WRF model over a period of more than 13 years (1992 to 2005) for a complex terrain region located in the northeast of the Iberian Peninsula. A high horizontal resolution of 2 km is used to provide an accurate representation of the terrain features. The multiyear evaluation focuses on the accuracy with which the WRF simulations reproduce the wind field of the six typical wind patterns (WPs) identified over the area in a previous observational work. Each pattern contains a large number of days, which allows solid conclusions to be reached regarding model performance. The accuracy of the simulations in reproducing the wind field under representative synoptic situations, or pressure patterns (PPs), of the Iberian Peninsula is also inspected in order to diagnose errors as a function of the large-scale situation. The evaluation is accomplished using daily averages in order to inspect the ability of WRF to reproduce the surface flow as a result of the interaction between the synoptic scale and the regional topography. Results indicate that model errors can originate from problems in the initial and lateral boundary conditions, misrepresentations at the synoptic scale, or the realism of the topographic features.

  11. Accurate monotone cubic interpolation

    NASA Technical Reports Server (NTRS)

    Huynh, Hung T.

    1991-01-01

    Monotone piecewise cubic interpolants are simple and effective. They are generally third-order accurate, except near strict local extrema, where accuracy degenerates to second order owing to the monotonicity constraint. Algorithms for piecewise cubic interpolants that preserve monotonicity as well as uniform third- and fourth-order accuracy are presented. The gain in accuracy is obtained by relaxing the monotonicity constraint in a geometric framework in which the median function plays a crucial role.
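
    The paper's median-based, higher-order algorithms are not reproduced here, but the baseline behavior of monotone cubic interpolation can be illustrated with SciPy's standard Fritsch-Carlson implementation (PCHIP), which avoids the overshoot of an unconstrained cubic spline:

    ```python
    # Sketch: monotone cubic interpolation (PCHIP) vs. an unconstrained spline
    # on monotone data with a flat section. Data are illustrative.
    import numpy as np
    from scipy.interpolate import PchipInterpolator, CubicSpline

    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
    y = np.array([0.0, 0.0, 1.0, 1.0, 1.0])   # monotone data with a plateau

    xs = np.linspace(0, 4, 401)
    pchip = PchipInterpolator(x, y)(xs)        # stays monotone, no overshoot
    spline = CubicSpline(x, y)(xs)             # may overshoot near the step

    print("PCHIP range:  [%.3f, %.3f]" % (pchip.min(), pchip.max()))
    print("Spline range: [%.3f, %.3f]" % (spline.min(), spline.max()))
    ```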

  12. Accurate Finite Difference Algorithms

    NASA Technical Reports Server (NTRS)

    Goodrich, John W.

    1996-01-01

    Two families of finite difference algorithms for computational aeroacoustics are presented and compared. All of the algorithms are single-step explicit methods; they have the same order of accuracy in both space and time, with examples up to eleventh order, and they have multidimensional extensions. One of the algorithm families has spectral-like high resolution. Propagation with high-order and high-resolution algorithms can produce accurate results after O(10(exp 6)) periods of propagation with eight grid points per wavelength.
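
    The specific single-step schemes of the paper are not shown here, but the notion of formal order of accuracy can be illustrated by empirically checking a familiar fourth-order central difference on a smooth function (a sketch, not the paper's algorithms):

    ```python
    # Sketch: empirical order-of-accuracy check for a 4th-order central
    # difference approximation of d/dx sin(x).
    import numpy as np

    def d4(f, x, h):
        # (-f(x+2h) + 8 f(x+h) - 8 f(x-h) + f(x-2h)) / (12 h)
        return (-f(x + 2*h) + 8*f(x + h) - 8*f(x - h) + f(x - 2*h)) / (12.0 * h)

    x0 = 1.0
    exact = np.cos(x0)
    for h in [0.1, 0.05, 0.025, 0.0125]:
        err = abs(d4(np.sin, x0, h) - exact)
        print(f"h={h:<7} error={err:.3e}")
    # Halving h should reduce the error by roughly 2**4 = 16.
    ```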

  13. Reproducibility of neuroimaging analyses across operating systems

    PubMed Central

    Glatard, Tristan; Lewis, Lindsay B.; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C.

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed. PMID:25964757
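
    For reference, the Dice coefficient quoted above is a simple overlap measure between two binary label volumes. A minimal sketch with synthetic masks (not the study's data or pipelines):

    ```python
    # Sketch: Dice coefficient between two binary segmentations, as used to
    # compare classifications produced on different operating systems.
    import numpy as np

    def dice(a, b):
        a, b = a.astype(bool), b.astype(bool)
        inter = np.logical_and(a, b).sum()
        denom = a.sum() + b.sum()
        return 2.0 * inter / denom if denom else 1.0

    # Two hypothetical label masks for the same structure:
    rng = np.random.default_rng(1)
    mask1 = rng.random((64, 64, 64)) > 0.7
    mask2 = mask1.copy()
    flip = rng.random(mask1.shape) > 0.98       # perturb ~2% of voxels
    mask2[flip] = ~mask2[flip]
    print(f"Dice = {dice(mask1, mask2):.3f}")
    ```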

  14. Reproducibility of the Structural Connectome Reconstruction across Diffusion Methods.

    PubMed

    Prčkovska, Vesna; Rodrigues, Paulo; Puigdellivol Sanchez, Ana; Ramos, Marc; Andorra, Magi; Martinez-Heras, Eloy; Falcon, Carles; Prats-Galino, Albert; Villoslada, Pablo

    2016-01-01

    Analysis of structural connectomes can lead to powerful insights about the brain's organization and damage. However, the accuracy and reproducibility of connectomes constructed with different acquisition and reconstruction techniques are not well defined. In this work, we evaluated the reproducibility of structural connectome techniques by performing test-retest (same day) and longitudinal studies (after 1 month), as well as analyzing graph-based measures, on data acquired from 22 healthy volunteers (6 subjects were used for the longitudinal study). We compared connectivity matrices and tract reconstructions obtained with the most typical acquisition schemes used in clinical application: diffusion tensor imaging (DTI), high angular resolution diffusion imaging (HARDI), and diffusion spectrum imaging (DSI). We observed that all techniques showed high reproducibility in the test-retest analysis (correlation >.9). However, HARDI was the only technique with low variability (2%) in the longitudinal assessment (1-month interval). The intraclass coefficient analysis showed the highest reproducibility for the DTI connectome, however, with sparser connections than HARDI and DSI. Qualitative (neuroanatomical) assessment of selected tracts confirmed the quantitative results, showing that HARDI managed to detect most of the analyzed fiber groups and fanning fibers. In conclusion, we found that HARDI acquisition showed the most balanced trade-off between high reproducibility of the connectome, a higher rate of detection of paths and fanning fibers, and intermediate acquisition times (10-15 minutes), although at the cost of a higher incidence of aberrant fibers. PMID:26464179
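
    The test-retest correlation reported above can be summarized, for example, as the Pearson correlation between the upper-triangular entries of the two connectivity matrices. The sketch below uses synthetic symmetric matrices and only illustrates the metric, not the study's processing:

    ```python
    # Sketch: test-retest similarity of two connectivity matrices, taken as the
    # Pearson correlation of their upper-triangular (off-diagonal) entries.
    import numpy as np

    def connectome_correlation(c1, c2):
        iu = np.triu_indices_from(c1, k=1)       # ignore diagonal self-connections
        return np.corrcoef(c1[iu], c2[iu])[0, 1]

    rng = np.random.default_rng(2)
    test = rng.random((90, 90)); test = (test + test.T) / 2              # symmetric
    retest = test + rng.normal(0, 0.05, test.shape)
    retest = (retest + retest.T) / 2
    print(f"test-retest r = {connectome_correlation(test, retest):.3f}")
    ```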

  15. Reproducible research in vadose zone sciences

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A significant portion of present-day soil and Earth science research is computational, involving complex data analysis pipelines, advanced mathematical and statistical models, and sophisticated computer codes. Opportunities for scientific progress are greatly diminished if reproducing and building o...

  16. Thou Shalt Be Reproducible! A Technology Perspective

    PubMed Central

    Mair, Patrick

    2016-01-01

    This article elaborates on reproducibility in psychology from a technological viewpoint. Modern open source computational environments that foster reproducibility throughout the whole research life cycle, and to which emerging psychology researchers should be sensitized, are shown and explained. First, data archiving platforms that make datasets publicly available are presented. Second, R is advocated as the data-analytic lingua franca in psychology for achieving reproducible statistical analysis. Third, dynamic report generation environments for writing reproducible manuscripts that integrate text, data analysis, and statistical outputs such as figures and tables in a single document are described. Supplementary materials are provided in order to get the reader started with these technologies. PMID:27471486

  17. Reproducibility and uncertainty of wastewater turbidity measurements.

    PubMed

    Joannis, C; Ruban, G; Gromaire, M-C; Chebbo, G; Bertrand-Krajewski, J-L

    2008-01-01

    Turbidity monitoring is a valuable tool for operating sewer systems, but it is often considered a somewhat tricky parameter for assessing water quality, because measured values depend on the sensor model and even on the operator. This paper details the main components of the uncertainty in turbidity measurements, with a special focus on reproducibility, and provides guidelines for improving the reproducibility of measurements in wastewater relying on proper calibration procedures. Calibration appears to be the main source of uncertainty, and proper procedures must account for uncertainties in standard solutions as well as non-linearity of the calibration curve. With such procedures, the uncertainty and reproducibility of field measurements can be kept below 5% or 25 FAU. On the other hand, reproducibility has no meaning if different measuring principles (attenuation vs. nephelometry) or very different wavelengths are used.
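
    As an illustration of why calibration dominates the uncertainty budget, the sketch below fits a slightly non-linear calibration curve to hypothetical formazin standards and propagates the fit covariance to a field reading; the numbers and the quadratic form are assumptions, not the paper's procedure:

    ```python
    # Sketch: turbidity calibration fit with first-order propagation of the
    # fit covariance to a field reading (all values are invented).
    import numpy as np

    standards = np.array([0.0, 50.0, 100.0, 200.0, 400.0, 600.0, 800.0])   # FAU
    readings  = np.array([1.0, 52.0, 103.0, 198.0, 410.0, 585.0, 790.0])   # sensor output

    # A quadratic captures mild non-linearity of the calibration curve.
    coeffs, cov = np.polyfit(readings, standards, deg=2, cov=True)

    def calibrated(value):
        estimate = np.polyval(coeffs, value)
        jac = np.array([value**2, value, 1.0])      # d(estimate)/d(coeffs)
        sigma = float(np.sqrt(jac @ cov @ jac))     # first-order error propagation
        return estimate, sigma

    val, sig = calibrated(350.0)
    print(f"turbidity = {val:.1f} +/- {sig:.1f} FAU (calibration contribution only)")
    ```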

  18. Thou Shalt Be Reproducible! A Technology Perspective.

    PubMed

    Mair, Patrick

    2016-01-01

    This article elaborates on reproducibility in psychology from a technological viewpoint. Modern open source computational environments that foster reproducibility throughout the whole research life cycle, and to which emerging psychology researchers should be sensitized, are shown and explained. First, data archiving platforms that make datasets publicly available are presented. Second, R is advocated as the data-analytic lingua franca in psychology for achieving reproducible statistical analysis. Third, dynamic report generation environments for writing reproducible manuscripts that integrate text, data analysis, and statistical outputs such as figures and tables in a single document are described. Supplementary materials are provided in order to get the reader started with these technologies. PMID:27471486

  19. On the Possibility to Combine the Order Effect with Sequential Reproducibility for Quantum Measurements

    NASA Astrophysics Data System (ADS)

    Basieva, Irina; Khrennikov, Andrei

    2015-10-01

    In this paper we study the possibility of using quantum observables to describe a combination of the order effect with sequential reproducibility for quantum measurements. By the order effect we mean a dependence of probability distributions (of measurement results) on the order of measurements. We consider two types of sequential reproducibility: adjacent reproducibility (A-A) (the standard perfect repeatability) and separated reproducibility (A-B-A). The first one is reproducibility with probability 1 of a result of measurement of some observable A measured twice, one A measurement after the other. The second one, A-B-A, is reproducibility with probability 1 of a result of A measurement when another quantum observable B is measured between the two A's. Heuristically, it is clear that the second type of reproducibility is complementary to the order effect. We show that, surprisingly, this may not be the case. The order effect can coexist with separated reproducibility as well as adjacent reproducibility for both observables A and B. However, the additional constraint in the form of separated reproducibility of the B-A-B type makes this coexistence impossible. The problem under consideration was motivated by attempts to apply the quantum formalism outside of physics, especially in cognitive psychology and psychophysics. However, it is also important for the foundations of quantum physics as a part of the problem about the structure of sequential quantum measurements.
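
    The order effect itself is easy to reproduce numerically for projective measurements. The sketch below builds two non-commuting qubit observables and compares sequential outcome probabilities; the specific observables are illustrative and are not the constructions analyzed in the paper:

    ```python
    # Sketch: order effect and adjacent (A-A) reproducibility for projective
    # qubit measurements, computed with projectors.
    import numpy as np

    ket0 = np.array([1.0, 0.0])
    plus = np.array([1.0, 1.0]) / np.sqrt(2)

    PA = {0: np.outer(ket0, ket0), 1: np.eye(2) - np.outer(ket0, ket0)}   # A ~ sigma_z
    PB = {0: np.outer(plus, plus), 1: np.eye(2) - np.outer(plus, plus)}   # B ~ sigma_x

    psi = np.array([np.cos(0.3), np.sin(0.3)])

    def p_sequence(projectors, outcomes, state):
        """Probability of obtaining `outcomes` when measuring in the given order."""
        v = state.copy()
        prob = 1.0
        for P, o in zip(projectors, outcomes):
            w = P[o] @ v
            p = np.vdot(w, w).real
            prob *= p
            if p == 0:
                return 0.0
            v = w / np.sqrt(p)
        return prob

    print("p(A=0 then B=0):", p_sequence([PA, PB], [0, 0], psi))
    print("p(B=0 then A=0):", p_sequence([PB, PA], [0, 0], psi))   # differs: order effect
    # Adjacent reproducibility A-A holds trivially for projective measurements:
    print("p(A=0 then A=0):", p_sequence([PA, PA], [0, 0], psi),
          "= p(A=0):", p_sequence([PA], [0], psi))
    ```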

  20. Quantifying reproducibility in computational biology: the case of the tuberculosis drugome.

    PubMed

    Garijo, Daniel; Kinnings, Sarah; Xie, Li; Xie, Lei; Zhang, Yinliang; Bourne, Philip E; Gil, Yolanda

    2013-01-01

    How easy is it to reproduce the results found in a typical computational biology paper? Either through experience or intuition, the reader will already know that the answer is "with difficulty" or "not at all." In this paper we attempt to quantify this difficulty by reproducing a previously published paper for different classes of users (ranging from users with little expertise to domain experts) and suggest ways in which the situation might be improved. Quantification is achieved by estimating the time required to reproduce each of the steps in the method described in the original paper and to make them part of an explicit workflow that reproduces the original results. Reproducing the method took several months of effort and required using new versions and new software that posed challenges to reconstructing and validating the results. The quantification leads to "reproducibility maps" that reveal that novice researchers would only be able to reproduce a few of the steps in the method, and that only expert researchers with advance knowledge of the domain would be able to reproduce the method in its entirety. The workflow itself is published as an online resource together with supporting software and data. The paper concludes with a brief discussion of the complexities of requiring reproducibility in terms of cost versus benefit, and a set of desiderata with our observations and guidelines for improving reproducibility. This has implications not only for reproducing the work of others from published papers, but also for reproducing work from one's own laboratory.

  1. Interrater reproducibility of clinical tests for rotator cuff lesions

    PubMed Central

    Ostor, A; Richards, C; Prevost, A; Hazleman, B; Speed, C

    2004-01-01

    Background: Rotator cuff lesions are common in the community but reproducibility of tests for shoulder assessment has not been adequately appraised and there is no uniform approach to their use. Objective: To study interrater reproducibility of standard tests for shoulder evaluation among a rheumatology specialist, rheumatology trainee, and research nurse. Methods: 136 patients were reviewed over 12 months at a major teaching hospital. The three assessors examined each patient in random order and were unaware of each other's evaluation. Each shoulder was examined in a standard manner by recognised tests for specific lesions and a diagnostic algorithm was used. Between-observer agreement was determined by calculating Cohen's κ coefficients (measuring agreement beyond that expected by chance). Results: Fair to substantial agreement was obtained for the observations of tenderness, painful arc, and external rotation. Tests for supraspinatus and subscapularis also showed at least fair agreement between observers. 40/55 (73%) κ coefficient assessments were rated at >0.2, indicating at least fair concordance between observers; 21/55 (38%) were rated at >0.4, indicating at least moderate concordance between observers. Conclusion: The reproducibility of certain tests, employed by observers of varying experience, in the assessment of the rotator cuff and general shoulder disease was determined. This has implications for delegation of shoulder assessment to nurse specialists, the development of a simplified evaluation schedule for general practitioners, and uniformity in epidemiological research studies. PMID:15361389
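
    Cohen's κ compares observed agreement with the agreement expected by chance from the marginal frequencies. A minimal sketch with a hypothetical 2x2 agreement table (not the study's data):

    ```python
    # Sketch: Cohen's kappa for agreement between two assessors on a
    # positive/negative shoulder test (counts are hypothetical).
    import numpy as np

    # Rows: assessor 1 (pos, neg); columns: assessor 2 (pos, neg).
    table = np.array([[45, 10],
                      [ 8, 73]], dtype=float)

    n = table.sum()
    p_observed = np.trace(table) / n
    p_expected = (table.sum(axis=1) * table.sum(axis=0)).sum() / n**2
    kappa = (p_observed - p_expected) / (1.0 - p_expected)
    print(f"observed agreement = {p_observed:.2f}, kappa = {kappa:.2f}")
    ```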

  2. Accurate quantum chemical calculations

    NASA Technical Reports Server (NTRS)

    Bauschlicher, Charles W., Jr.; Langhoff, Stephen R.; Taylor, Peter R.

    1989-01-01

    An important goal of quantum chemical calculations is to provide an understanding of chemical bonding and molecular electronic structure. A second goal, the prediction of energy differences to chemical accuracy, has been much harder to attain. First, the computational resources required to achieve such accuracy are very large, and second, it is not straightforward to demonstrate that an apparently accurate result, in terms of agreement with experiment, does not result from a cancellation of errors. Recent advances in electronic structure methodology, coupled with the power of vector supercomputers, have made it possible to solve a number of electronic structure problems exactly using the full configuration interaction (FCI) method within a subspace of the complete Hilbert space. These exact results can be used to benchmark approximate techniques that are applicable to a wider range of chemical and physical problems. The methodology of many-electron quantum chemistry is reviewed. Methods are considered in detail for performing FCI calculations. The application of FCI methods to several three-electron problems in molecular physics are discussed. A number of benchmark applications of FCI wave functions are described. Atomic basis sets and the development of improved methods for handling very large basis sets are discussed: these are then applied to a number of chemical and spectroscopic problems; to transition metals; and to problems involving potential energy surfaces. Although the experiences described give considerable grounds for optimism about the general ability to perform accurate calculations, there are several problems that have proved less tractable, at least with current computer resources, and these and possible solutions are discussed.

  3. 3D models of slow motions in the Earth's crust and upper mantle in the source zones of seismically active regions and their comparison with highly accurate observational data: II. Results of numerical calculations

    NASA Astrophysics Data System (ADS)

    Molodenskii, S. M.; Molodenskii, M. S.; Begitova, T. A.

    2016-09-01

    In the first part of the paper, a new method was developed for solving the inverse problem of coseismic and postseismic deformations in the real (imperfectly elastic, radially and horizontally heterogeneous, self-gravitating) Earth with hydrostatic initial stresses from highly accurate modern satellite data. The method is based on the decomposition of the sought parameters in the orthogonalized basis. The method was suggested for estimating the ambiguity of the solution of the inverse problem for coseismic and postseismic deformations. For obtaining this estimate, the orthogonal complement is constructed to the n-dimensional space spanned by the system of functional derivatives of the residuals in the system of n observed and model data on the coseismic and postseismic displacements at a variety of sites on the ground surface with small variations in the models. Below, we present the results of the numerical modeling of the elastic displacements of the ground surface, which were based on calculating Green's functions of the real Earth for the plane dislocation surface and different orientations of the displacement vector as described in part I of the paper. The calculations were conducted for the model of a horizontally homogeneous but radially heterogeneous self-gravitating Earth with hydrostatic initial stresses and the mantle rheology described by the Lomnitz logarithmic creep function according to (M. Molodenskii, 2014). We compare our results with the previous numerical calculations (Okada, 1985; 1992) for the simplest model of a perfectly elastic nongravitating homogeneous Earth. It is shown that with the source depths starting from the first hundreds of kilometers and with magnitudes of about 8.0 and higher, the discrepancies significantly exceed the errors of the observations and should therefore be taken into account. We present the examples of the numerical calculations of the creep function of the crust and upper mantle for the coseismic deformations. We

  4. Reproducibility responsibilities in the HPC arena

    SciTech Connect

    Fahey, Mark R; McLay, Robert

    2014-01-01

    Expecting bit-for-bit reproducibility in the HPC arena is not feasible because of the ever changing hardware and software. No user's application is an island; it lives in an HPC eco-system that changes over time. Old hardware stops working and even old software won't run on new hardware. Further, software libraries change over time, either by changing the internals or even the interfaces. So bit-for-bit reproducibility should not be expected. Rather, a reasonable expectation is that results are reproducible within error bounds, or that the answers are close (which is its own debate). To expect a researcher to reproduce their own results or the results of others within some error bounds, there must be enough information to recreate all the details of the experiment. This requires complete documentation of all phases of the researcher's workflow, from code to versioning to programming and runtime environments to publishing of data. This argument is the core statement of the Yale 2009 Declaration on Reproducible Research [1]. Although the HPC ecosystem is often outside the researchers' control, the application code could be built almost identically and there is a chance for very similar results with only round-off error differences. To achieve complete documentation at every step, the researcher, the computing center, and the funding agencies all have a role. In this thesis, the role of the researcher is expanded upon as compared to the Yale report and the role of the computing centers is described.

  5. Submicroscopic malaria parasite carriage: how reproducible are polymerase chain reaction-based methods?

    PubMed

    Costa, Daniela Camargos; Madureira, Ana Paula; Amaral, Lara Cotta; Sanchez, Bruno Antônio Marinho; Gomes, Luciano Teixeira; Fontes, Cor Jésus Fernandes; Limongi, Jean Ezequiel; Brito, Cristiana Ferreira Alves de; Carvalho, Luzia Helena

    2014-02-01

    The polymerase chain reaction (PCR)-based methods for the diagnosis of malaria infection are expected to accurately identify submicroscopic parasite carriers. Although a significant number of PCR protocols have been described, few studies have addressed the performance of PCR amplification in cases of field samples with submicroscopic malaria infection. Here, the reproducibility of two well-established PCR protocols (nested PCR and real-time PCR for the Plasmodium 18S small subunit rRNA gene) was evaluated in a panel of 34 blood field samples from individuals that are potential reservoirs of malaria infection, but were negative for malaria by optical microscopy. Regardless of the PCR protocol, a large variation between the PCR replicates was observed, leading to alternating positive and negative results in 38% (13 out of 34) of the samples. These findings were quite different from those obtained from the microscopy-positive patients or the unexposed individuals; the diagnosis of these individuals could be confirmed based on the high reproducibility and specificity of the PCR-based protocols. The limitation of PCR amplification was restricted to the field samples with very low levels of parasitaemia, because titrations of the DNA templates were able to detect < 3 parasites/µL in the blood. In conclusion, conventional PCR protocols require careful interpretation in cases of submicroscopic malaria infection, as inconsistent and false-negative results can occur.

  6. Quantitative real-time PCR for rapid and accurate titration of recombinant baculovirus particles.

    PubMed

    Hitchman, Richard B; Siaterli, Evangelia A; Nixon, Clare P; King, Linda A

    2007-03-01

    We describe the use of quantitative PCR (QPCR) to titer recombinant baculoviruses. Custom primers and probe were designed to gp64 and used to calculate a standard curve of QPCR derived titers from dilutions of a previously titrated baculovirus stock. Each dilution was titrated by both plaque assay and QPCR, producing a consistent and reproducible inverse relationship between C(T) and plaque forming units per milliliter. No significant difference was observed between titers produced by QPCR and plaque assay for 12 recombinant viruses, confirming the validity of this technique as a rapid and accurate method of baculovirus titration.
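
    In practice such a titration reduces to a standard curve of Ct against log10 titer. The sketch below fits such a curve and back-calculates an unknown; the dilution series and Ct values are invented for illustration:

    ```python
    # Sketch: a qPCR standard curve relating Ct to log10(titer) and its use to
    # estimate the titer of an unknown stock (dilutions and Ct values are invented).
    import numpy as np

    log10_pfu = np.array([8.0, 7.0, 6.0, 5.0, 4.0, 3.0])       # serial 10-fold dilutions
    ct        = np.array([14.2, 17.6, 21.1, 24.5, 27.9, 31.3])

    slope, intercept = np.polyfit(ct, log10_pfu, deg=1)
    efficiency = 10.0 ** (-slope) - 1.0        # slope here is d(log10 N)/d(Ct)

    def titer_from_ct(ct_value):
        return 10.0 ** (slope * ct_value + intercept)          # pfu-equivalents per mL

    print(f"amplification efficiency ~ {efficiency:.0%}")
    print(f"unknown at Ct 19.0 -> {titer_from_ct(19.0):.2e} pfu/mL")
    ```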

  7. The Economics of Reproducibility in Preclinical Research

    PubMed Central

    Freedman, Leonard P.; Cockburn, Iain M.; Simcoe, Timothy S.

    2015-01-01

    Low reproducibility rates within life science research undermine cumulative knowledge production and contribute to both delays and costs of therapeutic drug development. An analysis of past studies indicates that the cumulative (total) prevalence of irreproducible preclinical research exceeds 50%, resulting in approximately US$28,000,000,000 (US$28B)/year spent on preclinical research that is not reproducible—in the United States alone. We outline a framework for solutions and a plan for long-term improvements in reproducibility rates that will help to accelerate the discovery of life-saving therapies and cures. PMID:26057340

  8. Reproducibility, Controllability, and Optimization of Lenr Experiments

    NASA Astrophysics Data System (ADS)

    Nagel, David J.

    2006-02-01

    Low-energy nuclear reaction (LENR) measurements are significantly and increasingly reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments.

  9. Natural Disasters: Earth Science Readings. Reproducibles.

    ERIC Educational Resources Information Center

    Lobb, Nancy

    Natural Disasters is a reproducible teacher book that explains what scientists believe to be the causes of a variety of natural disasters and suggests steps that teachers and students can take to be better prepared in the event of a natural disaster. It contains both student and teacher sections. Teacher sections include vocabulary, an answer key,…

  10. Making Early Modern Medicine: Reproducing Swedish Bitters.

    PubMed

    Ahnfelt, Nils-Otto; Fors, Hjalmar

    2016-05-01

    Historians of science and medicine have rarely applied themselves to reproducing the experiments and practices of medicine and pharmacy. This paper delineates our efforts to reproduce "Swedish Bitters," an early modern composite medicine in wide European use from the 1730s to the present. In its original formulation, it was made from seven medicinal simples: aloe, rhubarb, saffron, myrrh, gentian, zedoary and agarikon. These were mixed in alcohol together with some theriac, a composite medicine of classical origin. The paper delineates the compositional history of Swedish Bitters and the medical rationale underlying its composition. It also describes how we went about reproducing the medicine in a laboratory using early modern pharmaceutical methods and analysing it using contemporary methods of pharmaceutical chemistry. Our aim is twofold: first, to show how reproducing medicines may provide a path towards a deeper understanding of the role of sensual and practical knowledge in the wider context of early modern medical culture; and second, how it may yield interesting results from the point of view of contemporary pharmaceutical science.

  11. A Simple and Accurate Method for Measuring Enzyme Activity.

    ERIC Educational Resources Information Center

    Yip, Din-Yan

    1997-01-01

    Presents methods commonly used for investigating enzyme activity using catalase and presents a new method for measuring catalase activity that is more reliable and accurate. Provides results that are readily reproduced and quantified. Can also be used for investigations of enzyme properties such as the effects of temperature, pH, inhibitors,…

  12. SPLASH: Accurate OH maser positions

    NASA Astrophysics Data System (ADS)

    Walsh, Andrew; Gomez, Jose F.; Jones, Paul; Cunningham, Maria; Green, James; Dawson, Joanne; Ellingsen, Simon; Breen, Shari; Imai, Hiroshi; Lowe, Vicki; Jones, Courtney

    2013-10-01

    The hydroxyl (OH) 18 cm lines are powerful and versatile probes of diffuse molecular gas, that may trace a largely unstudied component of the Galactic ISM. SPLASH (the Southern Parkes Large Area Survey in Hydroxyl) is a large, unbiased and fully-sampled survey of OH emission, absorption and masers in the Galactic Plane that will achieve sensitivities an order of magnitude better than previous work. In this proposal, we request ATCA time to follow up OH maser candidates. This will give us accurate (~10") positions of the masers, which can be compared to other maser positions from HOPS, MMB and MALT-45 and will provide full polarisation measurements towards a sample of OH masers that have not been observed in MAGMO.

  13. A Physical Activity Questionnaire: Reproducibility and Validity

    PubMed Central

    Barbosa, Nicolas; Sanchez, Carlos E.; Vera, Jose A.; Perez, Wilson; Thalabard, Jean-Christophe; Rieu, Michel

    2007-01-01

    This study evaluates the reproducibility and validity of the Quantification de L’Activite Physique en Altitude chez les Enfants (QAPACE) supervised self-administered questionnaire for estimating the mean daily energy expenditure (DEE) in Bogotá's schoolchildren. Comprehension was assessed on 324 students, whereas reproducibility was studied on a different random sample of 162 who were exposed twice to the questionnaire. Reproducibility was assessed using both the Bland-Altman plot and the intra-class correlation coefficient (ICC). Validity was studied in a randomly selected sample of 18 girls and 18 boys, who completed the test - re-test study. The DEE derived from the questionnaire was compared with the laboratory measurement results of the peak oxygen uptake (Peak VO2) from ergo-spirometry and the Leger Test. The reproducibility ICC was 0.96 (95% C.I. 0.95-0.97); by age categories: 8-10, 0.94 (0.89-0.97); 11-13, 0.98 (0.96-0.99); 14-16, 0.95 (0.91-0.98). The ICC between mean TEE as estimated by the questionnaire and the direct and indirect Peak VO2 was 0.76 (0.66) (p<0.01); by age categories 8-10, 11-13, and 14-16 it was 0.89 (0.87), 0.76 (0.78) and 0.88 (0.80), respectively. The QAPACE questionnaire is reproducible and valid for estimating PA and showed a high correlation with Peak VO2 uptake. Key points: (1) The presence of a supervisor and the limited size of the group, with the possibility of answering their questions, could explain the high reproducibility of this questionnaire. (2) No study in the literature had directly addressed the issue of estimating a yearly average PA including school and vacation periods. (3) A two-step procedure, in the population of schoolchildren of Bogotá, gives confidence in the use of the QAPACE questionnaire in a large epidemiological survey in related populations. PMID:24149485
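
    Of the two reproducibility measures mentioned, the Bland-Altman limits of agreement are the simplest to compute: the bias is the mean test-retest difference and the limits are +/- 1.96 standard deviations of that difference. A sketch with synthetic DEE values (not the study's data):

    ```python
    # Sketch: Bland-Altman limits of agreement for test-retest DEE estimates
    # from a questionnaire administered twice (values are synthetic).
    import numpy as np

    rng = np.random.default_rng(3)
    dee_test = rng.normal(2200, 350, 162)              # kcal/day, first administration
    dee_retest = dee_test + rng.normal(0, 90, 162)     # second administration, same subjects

    diff = dee_retest - dee_test
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)
    print(f"bias = {bias:.1f} kcal/day, 95% limits of agreement = +/- {loa:.1f}")
    ```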

  14. Venusian Polar Vortex reproduced by a general circulation model

    NASA Astrophysics Data System (ADS)

    Ando, Hiroki; Sugimoto, Norihiko; Takagi, Masahiro

    2016-10-01

    Unlike the polar vortices observed in the Earth, Mars and Titan atmospheres, the observed Venus polar vortex is warmer than the mid-latitudes at cloud-top levels (~65 km). This warm polar vortex is zonally surrounded by a cold latitude band located at ~60 degrees latitude, a unique feature called the 'cold collar' in the Venus atmosphere [e.g. Taylor et al. 1980; Piccioni et al. 2007]. Although these structures have been seen in numerous previous observations, the formation mechanism is still unknown. In addition, an axi-asymmetric feature is always seen in the warm polar vortex. It changes temporally and sometimes shows a hot polar dipole or S-shaped structure, as shown by many infrared measurements [e.g. Garate-Lopez et al. 2013; 2015]. However, its vertical structure has not been investigated. To address these problems, we performed a numerical simulation of the Venus atmospheric circulation using a general circulation model named AFES for Venus [Sugimoto et al. 2014] and reproduced these puzzling features. The reproduced structures of the atmosphere and the axi-asymmetric feature are compared with previous observational results. In addition, a quasi-periodical zonal-mean zonal wind fluctuation is also seen in the Venus polar vortex reproduced in our model. This might explain some observational results [e.g. Luz et al. 2007] and implies that polar vacillation might also occur in the Venus atmosphere, similar to that in the Earth's polar atmosphere. We will also show some initial results on this point in this presentation.

  15. Reproducibility of electroretinograms recorded with DTL electrodes.

    PubMed

    Hébert, M; Lachapelle, P; Dumont, M

    The purpose of this study was to examine whether the use of the DTL fiber electrode yields stable and reproducible electroretinographic recordings. To do so, luminance response function, derived from dark-adapted electroretinograms, was obtained from both eyes of 10 normal subjects at two recording sessions spaced by 7-14 days. The data thus generated was used to calculate Naka-Rushton Vmax and k parameters and values obtained at the two recording sessions were compared. Our results showed that there was no significant difference in the values of Vmax and k calculated from the data generated at the two recording sessions. The above clearly demonstrate that the use of the DTL fiber electrode does not jeopardize, in any way, the stability and reproducibility of ERG responses.
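
    The Naka-Rushton fit referred to above is a standard saturating luminance-response model, V(I) = Vmax·I^n/(I^n + k^n). The sketch below fits it to synthetic b-wave amplitudes; the parameter values and noise level are assumptions, not the study's recordings:

    ```python
    # Sketch: fitting the Naka-Rushton luminance-response function to
    # synthetic dark-adapted b-wave amplitudes.
    import numpy as np
    from scipy.optimize import curve_fit

    def naka_rushton(I, Vmax, k, n):
        return Vmax * I**n / (I**n + k**n)

    intensity = np.logspace(-3, 1, 9)                   # stimulus strength, cd.s/m^2
    rng = np.random.default_rng(4)
    amplitude = naka_rushton(intensity, 450.0, 0.05, 1.0) + rng.normal(0, 10, 9)

    popt, pcov = curve_fit(naka_rushton, intensity, amplitude,
                           p0=[400.0, 0.1, 1.0], maxfev=5000)
    perr = np.sqrt(np.diag(pcov))
    print("Vmax = %.0f +/- %.0f uV, k = %.3f +/- %.3f" %
          (popt[0], perr[0], popt[1], perr[1]))
    ```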

  16. In Spite of Indeterminacy Many Common Factor Score Estimates Yield an Identical Reproduced Covariance Matrix

    ERIC Educational Resources Information Center

    Beauducel, Andre

    2007-01-01

    It was investigated whether commonly used factor score estimates lead to the same reproduced covariance matrix of observed variables. This was achieved by means of Schonemann and Steiger's (1976) regression component analysis, since it is possible to compute the reproduced covariance matrices of the regression components corresponding to different…

  17. Reproducing Actual Morphology of Planetary Lava Flows

    NASA Astrophysics Data System (ADS)

    Miyamoto, H.; Sasaki, S.

    1996-03-01

    Assuming that lava flows behave as non-isothermal laminar Bingham fluids, we developed a numerical code for lava flows. We take self-gravity effects and cooling mechanisms into account. The calculation method is a kind of cellular automaton using a reduced random space method, which can eliminate the mesh-shape dependence. We can calculate large-scale lava flows precisely without numerical instability and reproduce the morphology of actual lava flows.

  18. Reproducibility of liquid oxygen impact test results

    NASA Technical Reports Server (NTRS)

    Gayle, J. B.

    1975-01-01

    Results for 12,000 impacts on a wide range of materials were studied to determine the reproducibility of the liquid oxygen impact test method. Standard deviations representing the overall variability of results were in close agreement with the expected values for a binomial process. This indicates that the major source of variability is the go/no-go nature of the test method and that variations due to sampling and testing operations were not significant.
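
    For a go/no-go test, the expected binomial variability follows directly from the reaction probability and the number of impacts, as in this small sketch (the batch size and probabilities are illustrative):

    ```python
    # Sketch: expected binomial variability of a go/no-go impact test.
    # With n impacts and reaction probability p, the standard deviation of the
    # observed number of reactions is sqrt(n * p * (1 - p)).
    import numpy as np

    n = 20                                    # impacts per material batch
    for p in [0.05, 0.10, 0.25, 0.50]:
        sd_count = np.sqrt(n * p * (1 - p))
        sd_rate = sd_count / n
        print(f"p = {p:.2f}: sd of reaction count = {sd_count:.2f} "
              f"({sd_rate:.1%} of the batch)")
    ```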

  19. Data Identifiers and Citations Enable Reproducible Science

    NASA Astrophysics Data System (ADS)

    Tilmes, C.

    2011-12-01

    Modern science often involves data processing with tremendous volumes of data. Keeping track of that data has been a growing challenge for data centers. Researchers who access and use that data don't always reference and cite their data sources adequately for consumers of their research to follow their methodology or reproduce their analyses or experiments. Recent research has led to recommendations for good identifiers and citations that can help address this problem. This paper will describe some of the best practices in data identifiers, reference and citation. Using a simplified example scenario based on a long-term remote sensing satellite mission, it will explore issues in identifying dynamic data sets and the importance of good data citations for reproducibility. It will describe the difference between granule- and collection-level identifiers, using UUIDs and DOIs to illustrate some recommendations for developing identifiers and assigning them during data processing. As data processors create data products, the provenance of the input products and the precise steps that led to their creation are recorded and published for users of the data to see. As researchers access the data from an archive, they can use the provenance to help understand the genesis of the data, which could have effects on their usage of the data. By citing the data when publishing their research, researchers allow others to retrieve the precise data used and reproduce the analyses and experiments to confirm the results. Describing the experiment to a sufficient extent to reproduce the research enforces a formal approach that lends credibility to the results and, ultimately, to the policies of decision makers depending on that research.
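
    A minimal sketch of the identifier scheme discussed: a per-granule UUID minted at processing time plus a collection-level DOI carried into the citation string. The DOI and citation format here are invented placeholders, not a recommendation from the paper:

    ```python
    # Sketch: granule-level UUID plus a (hypothetical) collection-level DOI
    # combined into a citation string.
    import uuid
    import datetime

    granule_id = uuid.uuid4()                              # unique per data product
    collection_doi = "10.5067/EXAMPLE/COLLECTION.V001"     # hypothetical DOI

    citation = (f"Example Mission Science Team ({datetime.date.today().year}). "
                f"Level-2 Product, Version 001 [granule {granule_id}]. "
                f"doi:{collection_doi}")
    print(citation)
    ```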

  20. Properties of galaxies reproduced by a hydrodynamic simulation.

    PubMed

    Vogelsberger, M; Genel, S; Springel, V; Torrey, P; Sijacki, D; Xu, D; Snyder, G; Bird, S; Nelson, D; Hernquist, L

    2014-05-01

    Previous simulations of the growth of cosmic structures have broadly reproduced the 'cosmic web' of galaxies that we see in the Universe, but failed to create a mixed population of elliptical and spiral galaxies, because of numerical inaccuracies and incomplete physical models. Moreover, they were unable to track the small-scale evolution of gas and stars to the present epoch within a representative portion of the Universe. Here we report a simulation that starts 12 million years after the Big Bang, and traces 13 billion years of cosmic evolution with 12 billion resolution elements in a cube of 106.5 megaparsecs a side. It yields a reasonable population of ellipticals and spirals, reproduces the observed distribution of galaxies in clusters and characteristics of hydrogen on large scales, and at the same time matches the 'metal' and hydrogen content of galaxies on small scales.

  1. Properties of galaxies reproduced by a hydrodynamic simulation.

    PubMed

    Vogelsberger, M; Genel, S; Springel, V; Torrey, P; Sijacki, D; Xu, D; Snyder, G; Bird, S; Nelson, D; Hernquist, L

    2014-05-01

    Previous simulations of the growth of cosmic structures have broadly reproduced the 'cosmic web' of galaxies that we see in the Universe, but failed to create a mixed population of elliptical and spiral galaxies, because of numerical inaccuracies and incomplete physical models. Moreover, they were unable to track the small-scale evolution of gas and stars to the present epoch within a representative portion of the Universe. Here we report a simulation that starts 12 million years after the Big Bang, and traces 13 billion years of cosmic evolution with 12 billion resolution elements in a cube of 106.5 megaparsecs a side. It yields a reasonable population of ellipticals and spirals, reproduces the observed distribution of galaxies in clusters and characteristics of hydrogen on large scales, and at the same time matches the 'metal' and hydrogen content of galaxies on small scales. PMID:24805343

  2. Robust tissue classification for reproducible wound assessment in telemedicine environments

    NASA Astrophysics Data System (ADS)

    Wannous, Hazem; Treuillet, Sylvie; Lucas, Yves

    2010-04-01

    In telemedicine environments, a standardized and reproducible assessment of wounds, using a simple free-handled digital camera, is an essential requirement. However, to ensure robust tissue classification, particular attention must be paid to the complete design of the color processing chain. We introduce the key steps, including color correction, merging of expert labeling, and segmentation-driven classification based on support vector machines. The tool thus developed ensures stability under lighting condition, viewpoint, and camera changes, to achieve accurate and robust classification of skin tissues. Clinical tests demonstrate that such an advanced tool, which forms part of a complete 3-D and color wound assessment system, significantly improves the monitoring of the healing process. It achieves an overlap score of 79.3% against 69.1% for a single expert, after mapping onto the medical reference developed from the image labeling by a college of experts.
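
    Segmentation-driven classification of this kind can be sketched with a support vector machine on per-region color features. The features, class labels, and hyperparameters below are synthetic and illustrative only; this is not the authors' trained classifier:

    ```python
    # Sketch: SVM tissue classification from mean color per segmented region,
    # evaluated with cross-validation on synthetic data.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(5)
    # Mean RGB per region for three hypothetical tissue classes
    # (granulation, slough, necrosis) -- purely illustrative cluster centers.
    centers = np.array([[180, 60, 60], [200, 190, 120], [40, 35, 30]])
    X = np.vstack([c + rng.normal(0, 15, (100, 3)) for c in centers])
    y = np.repeat([0, 1, 2], 100)

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
    ```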

  3. The Vienna LTE simulators - Enabling reproducibility in wireless communications research

    NASA Astrophysics Data System (ADS)

    Mehlführer, Christian; Colom Ikuno, Josep; Šimko, Michal; Schwarz, Stefan; Wrulich, Martin; Rupp, Markus

    2011-12-01

    In this article, we introduce MATLAB-based link and system level simulation environments for UMTS Long-Term Evolution (LTE). The source codes of both simulators are available under an academic non-commercial use license, allowing researchers full access to standard-compliant simulation environments. Owing to the open source availability, the simulators enable reproducible research in wireless communications and comparison of novel algorithms. In this study, we explain how link and system level simulations are connected and show how the link level simulator serves as a reference to design the system level simulator. We compare the accuracy of the PHY modeling at system level by means of simulations performed both with bit-accurate link level simulations and PHY-model-based system level simulations. We highlight some of the currently most interesting research questions for LTE, and explain by some research examples how our simulators can be applied.

  4. A Framework for Reproducible Latent Fingerprint Enhancements

    PubMed Central

    Carasso, Alfred S.

    2014-01-01

    Photoshop processing of latent fingerprints is the preferred methodology among law enforcement forensic experts, but that approach is not fully reproducible and may lead to questionable enhancements. Alternative, independent, fully reproducible enhancements, using IDL Histogram Equalization and IDL Adaptive Histogram Equalization, can produce better-defined ridge structures, along with considerable background information. Applying a systematic slow motion smoothing procedure to such IDL enhancements, based on the rapid FFT solution of a Lévy stable fractional diffusion equation, can attenuate background detail while preserving ridge information. The resulting smoothed latent print enhancements are comparable to, but distinct from, forensic Photoshop images suitable for input into automated fingerprint identification systems (AFIS). In addition, this progressive smoothing procedure can be reexamined by displaying the suite of progressively smoother IDL images. That suite can be stored, providing an audit trail that allows monitoring for possible loss of useful information in transit to the user-selected optimal image. Such independent and fully reproducible enhancements provide a valuable frame of reference that may be helpful in informing, complementing, and possibly validating the forensic Photoshop methodology. PMID:26601028
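
    The abstract names IDL Histogram Equalization and IDL Adaptive Histogram Equalization; an open analogue of those two operations (not the authors' pipeline, and without the fractional-diffusion smoothing step) can be sketched with scikit-image:

    ```python
    # Sketch: global and adaptive histogram equalization of a grayscale image,
    # analogous in spirit to the IDL routines named in the abstract.
    from skimage import data, exposure

    image = data.camera() / 255.0                       # any grayscale image in [0, 1]

    global_eq = exposure.equalize_hist(image)           # histogram equalization
    adaptive_eq = exposure.equalize_adapthist(image, clip_limit=0.02)

    for name, img in [("original", image), ("global", global_eq),
                      ("adaptive", adaptive_eq)]:
        print(f"{name:9s} contrast (std): {img.std():.3f}")
    ```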

  5. A Framework for Reproducible Latent Fingerprint Enhancements.

    PubMed

    Carasso, Alfred S

    2014-01-01

    Photoshop processing of latent fingerprints is the preferred methodology among law enforcement forensic experts, but that approach is not fully reproducible and may lead to questionable enhancements. Alternative, independent, fully reproducible enhancements, using IDL Histogram Equalization and IDL Adaptive Histogram Equalization, can produce better-defined ridge structures, along with considerable background information. Applying a systematic slow motion smoothing procedure to such IDL enhancements, based on the rapid FFT solution of a Lévy stable fractional diffusion equation, can attenuate background detail while preserving ridge information. The resulting smoothed latent print enhancements are comparable to, but distinct from, forensic Photoshop images suitable for input into automated fingerprint identification systems (AFIS). In addition, this progressive smoothing procedure can be reexamined by displaying the suite of progressively smoother IDL images. That suite can be stored, providing an audit trail that allows monitoring for possible loss of useful information in transit to the user-selected optimal image. Such independent and fully reproducible enhancements provide a valuable frame of reference that may be helpful in informing, complementing, and possibly validating the forensic Photoshop methodology.

  6. Tools and techniques for computational reproducibility.

    PubMed

    Piccolo, Stephen R; Frampton, Michael B

    2016-01-01

    When reporting research findings, scientists document the steps they followed so that others can verify and build upon the research. When those steps have been described in sufficient detail that others can retrace them and obtain similar results, the research is said to be reproducible. Computers play a vital role in many research disciplines and present both opportunities and challenges for reproducibility. Computers can be programmed to execute analysis tasks, and those programs can be repeated and shared with others. The deterministic nature of most computer programs means that the same analysis tasks, applied to the same data, will often produce the same outputs. However, in practice, computational findings often cannot be reproduced because of complexities in how software is packaged, installed, and executed, and because of limitations associated with how scientists document analysis steps. Many tools and techniques are available to help overcome these challenges; here we describe seven such strategies. With a broad scientific audience in mind, we describe the strengths and limitations of each approach, as well as the circumstances under which each might be applied. No single strategy is sufficient for every scenario; thus we emphasize that it is often useful to combine approaches. PMID:27401684
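
    One of the simplest habits compatible with the strategies described is to record the software environment and random seed next to each result. A minimal sketch (the file name and fields are arbitrary choices, not a prescription from the paper):

    ```python
    # Sketch: recording environment details and the random seed alongside an
    # analysis result, as a lightweight provenance record.
    import json
    import platform
    import sys

    import numpy as np

    SEED = 20160711
    rng = np.random.default_rng(SEED)
    result = float(rng.normal(size=1000).mean())          # stand-in for an analysis

    provenance = {
        "python": sys.version.split()[0],
        "platform": platform.platform(),
        "numpy": np.__version__,
        "seed": SEED,
        "result": result,
    }
    with open("analysis_provenance.json", "w") as fh:
        json.dump(provenance, fh, indent=2)
    print(json.dumps(provenance, indent=2))
    ```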

  7. Tools and techniques for computational reproducibility.

    PubMed

    Piccolo, Stephen R; Frampton, Michael B

    2016-07-11

    When reporting research findings, scientists document the steps they followed so that others can verify and build upon the research. When those steps have been described in sufficient detail that others can retrace them and obtain similar results, the research is said to be reproducible. Computers play a vital role in many research disciplines and present both opportunities and challenges for reproducibility. Computers can be programmed to execute analysis tasks, and those programs can be repeated and shared with others. The deterministic nature of most computer programs means that the same analysis tasks, applied to the same data, will often produce the same outputs. However, in practice, computational findings often cannot be reproduced because of complexities in how software is packaged, installed, and executed, and because of limitations associated with how scientists document analysis steps. Many tools and techniques are available to help overcome these challenges; here we describe seven such strategies. With a broad scientific audience in mind, we describe the strengths and limitations of each approach, as well as the circumstances under which each might be applied. No single strategy is sufficient for every scenario; thus we emphasize that it is often useful to combine approaches.

  8. A Framework for Reproducible Latent Fingerprint Enhancements.

    PubMed

    Carasso, Alfred S

    2014-01-01

    Photoshop processing of latent fingerprints is the preferred methodology among law enforcement forensic experts, but that approach is not fully reproducible and may lead to questionable enhancements. Alternative, independent, fully reproducible enhancements, using IDL Histogram Equalization and IDL Adaptive Histogram Equalization, can produce better-defined ridge structures, along with considerable background information. Applying a systematic slow motion smoothing procedure to such IDL enhancements, based on the rapid FFT solution of a Lévy stable fractional diffusion equation, can attenuate background detail while preserving ridge information. The resulting smoothed latent print enhancements are comparable to, but distinct from, forensic Photoshop images suitable for input into automated fingerprint identification systems (AFIS). In addition, this progressive smoothing procedure can be reexamined by displaying the suite of progressively smoother IDL images. That suite can be stored, providing an audit trail that allows monitoring for possible loss of useful information in transit to the user-selected optimal image. Such independent and fully reproducible enhancements provide a valuable frame of reference that may be helpful in informing, complementing, and possibly validating the forensic Photoshop methodology. PMID:26601028

  9. Accurate Molecular Polarizabilities Based on Continuum Electrostatics

    PubMed Central

    Truchon, Jean-François; Nicholls, Anthony; Iftimie, Radu I.; Roux, Benoît; Bayly, Christopher I.

    2013-01-01

    A novel approach for representing the intramolecular polarizability as a continuum dielectric is introduced to account for molecular electronic polarization. It is shown, using a finite-difference solution to the Poisson equation, that the Electronic Polarization from Internal Continuum (EPIC) model yields accurate gas-phase molecular polarizability tensors for a test set of 98 challenging molecules composed of heteroaromatics, alkanes and diatomics. The electronic polarization originates from a high intramolecular dielectric that produces polarizabilities consistent with B3LYP/aug-cc-pVTZ and experimental values when surrounded by vacuum dielectric. In contrast to other approaches to model electronic polarization, this simple model avoids the polarizability catastrophe and accurately calculates molecular anisotropy with the use of very few fitted parameters and without resorting to auxiliary sites or anisotropic atomic centers. On average, the unsigned errors in the average polarizability and anisotropy compared to B3LYP are 2% and 5%, respectively. The correlation between the polarizability components from B3LYP and this approach leads to an R2 of 0.990 and a slope of 0.999. Even the F2 anisotropy, shown to be a difficult case for existing polarizability models, can be reproduced within 2% error. In addition to providing new parameters for a rapid method directly applicable to the calculation of polarizabilities, this work extends the widely used Poisson equation to areas where accurate molecular polarizabilities matter. PMID:23646034

  10. Reproducible quantitative proteotype data matrices for systems biology

    PubMed Central

    Röst, Hannes L.; Malmström, Lars; Aebersold, Ruedi

    2015-01-01

    Historically, many mass spectrometry–based proteomic studies have aimed at compiling an inventory of protein compounds present in a biological sample, with the long-term objective of creating a proteome map of a species. However, to answer fundamental questions about the behavior of biological systems at the protein level, accurate and unbiased quantitative data are required in addition to a list of all protein components. Fueled by advances in mass spectrometry, the proteomics field has thus recently shifted focus toward the reproducible quantification of proteins across a large number of biological samples. This provides the foundation to move away from pure enumeration of identified proteins toward quantitative matrices of many proteins measured across multiple samples. It is argued here that data matrices consisting of highly reproducible, quantitative, and unbiased proteomic measurements across a high number of conditions, referred to here as quantitative proteotype maps, will become the fundamental currency in the field and provide the starting point for downstream biological analysis. Such proteotype data matrices, for example, are generated by the measurement of large patient cohorts, time series, or multiple experimental perturbations. They are expected to have a large effect on systems biology and personalized medicine approaches that investigate the dynamic behavior of biological systems across multiple perturbations, time points, and individuals. PMID:26543201

  11. Reproducible quantitative proteotype data matrices for systems biology.

    PubMed

    Röst, Hannes L; Malmström, Lars; Aebersold, Ruedi

    2015-11-01

    Historically, many mass spectrometry-based proteomic studies have aimed at compiling an inventory of protein compounds present in a biological sample, with the long-term objective of creating a proteome map of a species. However, to answer fundamental questions about the behavior of biological systems at the protein level, accurate and unbiased quantitative data are required in addition to a list of all protein components. Fueled by advances in mass spectrometry, the proteomics field has thus recently shifted focus toward the reproducible quantification of proteins across a large number of biological samples. This provides the foundation to move away from pure enumeration of identified proteins toward quantitative matrices of many proteins measured across multiple samples. It is argued here that data matrices consisting of highly reproducible, quantitative, and unbiased proteomic measurements across a high number of conditions, referred to here as quantitative proteotype maps, will become the fundamental currency in the field and provide the starting point for downstream biological analysis. Such proteotype data matrices, for example, are generated by the measurement of large patient cohorts, time series, or multiple experimental perturbations. They are expected to have a large effect on systems biology and personalized medicine approaches that investigate the dynamic behavior of biological systems across multiple perturbations, time points, and individuals.

  12. On the reproducibility of protein crystal structures: five atomic resolution structures of trypsin

    SciTech Connect

    Liebschner, Dorothee; Dauter, Miroslawa; Brzuszkiewicz, Anna; Dauter, Zbigniew

    2013-08-01

    Details of five very high-resolution accurate structures of bovine trypsin are compared in the context of the reproducibility of models obtained from crystals grown under identical conditions. Structural studies of proteins usually rely on a model obtained from one crystal. By investigating the details of this model, crystallographers seek to obtain insight into the function of the macromolecule. It is therefore important to know which details of a protein structure are reproducible or to what extent they might differ. To address this question, the high-resolution structures of five crystals of bovine trypsin obtained under analogous conditions were compared. Global parameters and structural details were investigated. All of the models were of similar quality and the pairwise merged intensities had large correlation coefficients. The Cα and backbone atoms of the structures superposed very well. The occupancy of ligands in regions of low thermal motion was reproducible, whereas solvent molecules containing heavier atoms (such as sulfur) or those located on the surface could differ significantly. The coordination lengths of the calcium ion were conserved. A large proportion of the multiple conformations refined to similar occupancies and the residues adopted similar orientations. More than three quarters of the water-molecule sites were conserved within 0.5 Å and more than one third were conserved within 0.1 Å. An investigation of the protonation states of histidine residues and carboxylate moieties was consistent for all of the models. Radiation-damage effects to disulfide bridges were observed for the same residues and to similar extents. Main-chain bond lengths and angles averaged to similar values and were in agreement with the Engh and Huber targets. Other features, such as peptide flips and the double conformation of the inhibitor molecule, were also reproducible in all of the trypsin structures. Therefore, many details are similar in models obtained

  13. Composting in small laboratory pilots: Performance and reproducibility

    SciTech Connect

    Lashermes, G.; Barriuso, E.; Le Villio-Poitrenaud, M.; Houot, S.

    2012-02-15

    compost. The TOM losses, compost stabilisation and evolution of the biochemical fractions were similar to those observed in large reactors or in on-site experiments, except for lignin degradation, which was less important than in full-scale systems. The reproducibility of the process and the quality of the final compost make it possible to propose the use of this experimental device for research requiring a mass reduction of the initial composted waste mixtures.

  14. A new approach to compute accurate velocity of meteors

    NASA Astrophysics Data System (ADS)

    Egal, Auriane; Gural, Peter; Vaubaillon, Jeremie; Colas, Francois; Thuillot, William

    2016-10-01

    The CABERNET project was designed to push the limits of meteoroid orbit measurements by improving the determination of meteor velocities. Indeed, despite the development of camera networks dedicated to the observation of meteors, there is still an important discrepancy between the measured orbits of meteoroids and theoretical predictions. The gap between the observed and theoretical semi-major axes of the orbits is especially significant; an accurate determination of the orbits of meteoroids therefore largely depends on the computation of the pre-atmospheric velocities. It is thus imperative to determine how to increase the precision of the velocity measurements. In this work, we perform an analysis of different methods currently used to compute the velocities and trajectories of meteors. They are based on the intersecting planes method developed by Ceplecha (1987), the least squares method of Borovicka (1990), and the multi-parameter fitting (MPF) method published by Gural (2012). In order to objectively compare the performances of these techniques, we have simulated realistic meteors ('fakeors') reproducing the measurement errors of many camera networks. Some fakeors are built following the propagation models studied by Gural (2012), and others are created by numerical integration using the Borovicka et al. 2007 model. Different optimization techniques have also been investigated in order to pick the most suitable one for solving the MPF, and the influence of the geometry of the trajectory on the result is also presented. We will present here the results of an improved implementation of the multi-parameter fitting that allows accurate orbit computation of meteors with CABERNET. The comparison of the different velocity computations suggests that, although the MPF is by far the best method for solving the trajectory and the velocity of a meteor, the ill-conditioning of the cost functions used can lead to large estimate errors for noisy
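
    As a toy version of the velocity-fitting problem, the sketch below recovers an initial speed by least-squares fitting a constant-deceleration model to noisy along-track positions; the published multi-parameter fit uses more elaborate propagation models, so this is only illustrative:

    ```python
    # Sketch: least-squares recovery of a meteor's initial speed from noisy
    # along-track positions, using a simple constant-deceleration model.
    import numpy as np
    from scipy.optimize import least_squares

    t = np.linspace(0.0, 0.6, 31)                        # seconds
    v0_true, a_true = 35.0, -8.0                         # km/s, km/s^2
    rng = np.random.default_rng(6)
    s_obs = v0_true * t + 0.5 * a_true * t**2 + rng.normal(0, 0.02, t.size)  # km

    def residuals(params):
        s0, v0, a = params
        return s0 + v0 * t + 0.5 * a * t**2 - s_obs

    fit = least_squares(residuals, x0=[0.0, 30.0, 0.0])
    s0, v0, a = fit.x
    print(f"fitted initial speed: {v0:.2f} km/s (true {v0_true} km/s)")
    ```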

  15. Mechanostructure and composition of highly reproducible decellularized liver matrices.

    PubMed

    Mattei, G; Di Patria, V; Tirella, A; Alaimo, A; Elia, G; Corti, A; Paolicchi, A; Ahluwalia, A

    2014-02-01

    Despite the increasing number of papers on decellularized scaffolds, there is little consensus on the optimum method of decellularizing biological tissue such that the micro-architecture and protein content of the matrix are conserved as far as possible. Focusing on the liver, the aim of this study was therefore to develop a method for the production of well-characterized and reproducible matrices that best preserves the structure and composition of the native extracellular matrix (ECM). Given the importance of matrix stiffness in regulating cell response, the mechanical properties of the decellularized tissue were also considered. The testing and analysis framework is based on the characterization of decellularized and untreated samples in the same reproducible initial state (i.e., the equilibrium swollen state). Decellularized ECM (dECM) samples were characterized using biochemical, histological, mechanical and structural analyses to identify the best procedure to ensure complete cell removal while preserving most of the native ECM structure and composition. Using this method, sterile decellularized porcine ECM with highly conserved intra-lobular micro-structure and protein content were obtained in a consistent and reproducible manner using the equilibrium swollen state of tissue or matrix as a reference. A significant reduction in the compressive elastic modulus was observed for liver dECM with respect to native tissue, suggesting a re-examination of design parameters for ECM-mimicking scaffolds for engineering tissues in vitro.

  16. An evaluation of RAPD fragment reproducibility and nature.

    PubMed

    Pérez, T; Albornoz, J; Domínguez, A

    1998-10-01

    Random amplified polymorphic DNA (RAPD) fragment reproducibility was assayed in three animal species: red deer (Cervus elaphus), wild boar (Sus scrofa) and fruit fly (Drosophila melanogaster). Ten 10-mer primers (Operon) were tested in two replicate reactions per individual under different stringency conditions (annealing temperatures of 35 degrees C or 45 degrees C). Two estimates were generated from the data: autosimilarity, which tests the reproducibility of overall banding patterns, and band repeatability, which tests the reproducibility of specific bands. Autosimilarity (the similarity of individuals with themselves) was lower than 1 for all three species ranging between values of 0.66 for Drosophila at 45 degrees C and 0.88 for wild boar at 35 degrees C. Band repeatability was estimated as the proportion of individuals showing homologous bands in both replicates. The fraction of repeatable bands was 23% for deer, 36% for boar and 26% for fruit fly, all at an annealing temperature of 35 degrees C. Raising the annealing temperature did not improve repeatability. Phage lambda DNA was subjected to amplification and the pattern of bands compared with theoretical expectations based on nucleotide sequence. Observed fragments could not be related to expected ones, even if a 2 bp mismatch is allowed. Therefore, the nature of genetic variation uncovered by the RAPD method is unclear. These data demonstrate that prudence should guide inferences about population structure and nucleotide divergence based on RAPD markers. PMID:9787445
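
    The two estimates defined here reduce to simple comparisons of presence/absence banding patterns between replicate reactions. The sketch below computes a Dice-type autosimilarity and a per-individual repeatable-band fraction on toy data; the exact coefficients used by the authors may differ, so treat the scoring choices as assumptions.

```python
# Sketch: autosimilarity (band-sharing between two replicate reactions of the same
# individual) and band repeatability (fraction of an individual's bands seen in both
# replicates). Dice-type coefficient and the toy data are assumptions.
import numpy as np

# rows = individuals, columns = scored band positions, entries = band present (1) / absent (0)
rep1 = np.array([[1, 1, 0, 1, 0],
                 [1, 0, 1, 1, 0],
                 [0, 1, 1, 0, 1]])
rep2 = np.array([[1, 0, 0, 1, 0],
                 [1, 0, 1, 0, 0],
                 [0, 1, 1, 0, 0]])

def autosimilarity(a, b):
    """Dice band-sharing coefficient 2*shared / (bands_a + bands_b) per individual."""
    shared = np.sum((a == 1) & (b == 1), axis=1)
    return 2.0 * shared / (a.sum(axis=1) + b.sum(axis=1))

def band_repeatability(a, b):
    """For each individual, fraction of its bands (in either replicate) seen in both."""
    both = ((a == 1) & (b == 1)).sum(axis=1)
    either = ((a == 1) | (b == 1)).sum(axis=1)
    return both / either

print("autosimilarity per individual:", autosimilarity(rep1, rep2))
print("repeatable band fraction per individual:", band_repeatability(rep1, rep2))
```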

  17. An evaluation of RAPD fragment reproducibility and nature.

    PubMed

    Pérez, T; Albornoz, J; Domínguez, A

    1998-10-01

    Random amplified polymorphic DNA (RAPD) fragment reproducibility was assayed in three animal species: red deer (Cervus elaphus), wild boar (Sus scrofa) and fruit fly (Drosophila melanogaster). Ten 10-mer primers (Operon) were tested in two replicate reactions per individual under different stringency conditions (annealing temperatures of 35 degrees C or 45 degrees C). Two estimates were generated from the data: autosimilarity, which tests the reproducibility of overall banding patterns, and band repeatability, which tests the reproducibility of specific bands. Autosimilarity (the similarity of individuals with themselves) was lower than 1 for all three species ranging between values of 0.66 for Drosophila at 45 degrees C and 0.88 for wild boar at 35 degrees C. Band repeatability was estimated as the proportion of individuals showing homologous bands in both replicates. The fraction of repeatable bands was 23% for deer, 36% for boar and 26% for fruit fly, all at an annealing temperature of 35 degrees C. Raising the annealing temperature did not improve repeatability. Phage lambda DNA was subjected to amplification and the pattern of bands compared with theoretical expectations based on nucleotide sequence. Observed fragments could not be related to expected ones, even if a 2 bp mismatch is allowed. Therefore, the nature of genetic variation uncovered by the RAPD method is unclear. These data demonstrate that prudence should guide inferences about population structure and nucleotide divergence based on RAPD markers.

  18. Towards reproducible, scalable lateral molecular electronic devices

    SciTech Connect

    Durkan, Colm Zhang, Qian

    2014-08-25

    An approach to reproducibly fabricate molecular electronic devices is presented. Lateral nanometer-scale gaps with high yield are formed in Au/Pd nanowires by a combination of electromigration and Joule-heating-induced thermomechanical stress. The resulting nanogap devices are used to measure the electrical properties of small numbers of two different molecular species with different end-groups, namely 1,4-butane dithiol and 1,5-diamino-2-methylpentane. Fluctuations in the current reveal that in the case of the dithiol molecule devices, individual molecules conduct intermittently, with the fluctuations becoming more pronounced at larger biases.

  19. Nonlinear sequential laminates reproducing hollow sphere assemblages

    NASA Astrophysics Data System (ADS)

    Idiart, Martín I.

    2007-07-01

    A special class of nonlinear porous materials with isotropic 'sequentially laminated' microstructures is found to reproduce exactly the hydrostatic behavior of 'hollow sphere assemblages'. It is then argued that this result supports the conjecture that Gurson's approximate criterion for plastic porous materials, and its viscoplastic extension of Leblond et al. (1994), may actually yield rigorous upper bounds for the hydrostatic flow stress of porous materials containing an isotropic, but otherwise arbitrary, distribution of porosity. To cite this article: M.I. Idiart, C. R. Mecanique 335 (2007).
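
    For context, Gurson's approximate criterion referred to here, and the hydrostatic flow stress it implies (the quantity for which the sequential-laminate result is argued to provide an upper bound), take the standard form below; this is textbook material quoted for orientation, not reproduced from the article, and the viscoplastic extension of Leblond et al. is not shown.

```latex
% Gurson's approximate yield criterion for a rigid-plastic solid with porosity f
% (matrix flow stress \sigma_0, von Mises stress \sigma_eq, mean stress \sigma_m);
% standard form, quoted for context.
\[
  \Phi(\boldsymbol{\sigma}; f)
  = \frac{\sigma_{\mathrm{eq}}^{2}}{\sigma_{0}^{2}}
  + 2 f \cosh\!\left(\frac{3\sigma_{m}}{2\sigma_{0}}\right)
  - 1 - f^{2} = 0 .
\]
% Purely hydrostatic loading (\sigma_eq = 0) then gives the hydrostatic flow stress
\[
  \sigma_{m}^{\mathrm{hyd}}
  = \frac{2\sigma_{0}}{3}\,\operatorname{arcosh}\!\left(\frac{1+f^{2}}{2f}\right)
  \;\approx\; \frac{2\sigma_{0}}{3}\,\ln\frac{1}{f}
  \qquad (f \ll 1).
\]
```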

  20. Open and reproducible global land use classification

    NASA Astrophysics Data System (ADS)

    Nüst, Daniel; Václavík, Tomáš; Pross, Benjamin

    2015-04-01

    Researchers led by the Helmholtz Centre for Environmental research (UFZ) developed a new world map of land use systems based on over 30 diverse indicators (http://geoportal.glues.geo.tu-dresden.de/stories/landsystemarchetypes.html) of land use intensity, climate and environmental and socioeconomic factors. They identified twelve land system archetypes (LSA) using a data-driven classification algorithm (self-organizing maps) to assess global impacts of land use on the environment, and found unexpected similarities across global regions. We present how the algorithm behind this analysis can be published as an executable web process using 52°North WPS4R (https://wiki.52north.org/bin/view/Geostatistics/WPS4R) within the GLUES project (http://modul-a.nachhaltiges-landmanagement.de/en/scientific-coordination-glues/). WPS4R is an open source collaboration platform for researchers, analysts and software developers to publish R scripts (http://www.r-project.org/) as a geo-enabled OGC Web Processing Service (WPS) process. The interoperable interface to call the geoprocess allows both reproducibility of the analysis and integration of user data without knowledge about web services or classification algorithms. The open platform allows everybody to replicate the analysis in their own environments. The LSA WPS process has several input parameters, which can be changed via a simple web interface. The input parameters are used to configure both the WPS environment and the LSA algorithm itself. The encapsulation as a web process allows integration of non-public datasets, while at the same time the publication requires a well-defined documentation of the analysis. We demonstrate this platform specifically to domain scientists and show how reproducibility and open source publication of analyses can be enhanced. We also discuss future extensions of the reproducible land use classification, such as the possibility for users to enter their own areas of interest to the system and

  1. Queer nuclear families? Reproducing and transgressing heteronormativity.

    PubMed

    Folgerø, Tor

    2008-01-01

    During the past decade the public debate on gay and lesbian adoptive rights has been extensive in the Norwegian media. The debate illustrates how women and men planning to raise children in homosexual family constellations challenge prevailing cultural norms and existing concepts of kinship and family. The article discusses how lesbian mothers and gay fathers understand and redefine their own family practices. An essential point in this article is the fundamental ambiguity in these families' accounts of themselves-how they simultaneously transgress and reproduce heteronormative assumptions about childhood, fatherhood, motherhood, family and kinship.

  2. Composting in small laboratory pilots: performance and reproducibility.

    PubMed

    Lashermes, G; Barriuso, E; Le Villio-Poitrenaud, M; Houot, S

    2012-02-01

    Small-scale reactors (<10 l) have been employed in composting research, but few attempts have assessed the performance of composting considering the transformations of organic matter. Moreover, composting at small scales is often performed by imposing a fixed temperature, thus creating artificial conditions, and the reproducibility of composting has rarely been reported. The objectives of this study are to design an innovative small-scale composting device safeguarding self-heating to drive the composting process and to assess the performance and reproducibility of composting in small-scale pilots. The experimental setup included six 4-l reactors used for composting a mixture of sewage sludge and green wastes. The performance of the process was assessed by monitoring the temperature, O(2) consumption and CO(2) emissions, and characterising the biochemical evolution of organic matter. A good reproducibility was found for the six replicates with coefficients of variation for all parameters generally lower than 19%. An intense self-heating ensured the existence of a spontaneous thermophilic phase in all reactors. The average loss of total organic matter (TOM) was 46% of the initial content. Compared to the initial mixture, the hot water soluble fraction decreased by 62%, the hemicellulose-like fraction by 68%, the cellulose-like fraction by 50% and the lignin-like fractions by 12% in the final compost. The TOM losses, compost stabilisation and evolution of the biochemical fractions were similar to observed in large reactors or on-site experiments, excluding the lignin degradation, which was less important than in full-scale systems. The reproducibility of the process and the quality of the final compost make it possible to propose the use of this experimental device for research requiring a mass reduction of the initial composted waste mixtures. PMID:21982279

  3. Response to Comment on "Estimating the reproducibility of psychological science".

    PubMed

    Anderson, Christopher J; Bahník, Štěpán; Barnett-Cowan, Michael; Bosco, Frank A; Chandler, Jesse; Chartier, Christopher R; Cheung, Felix; Christopherson, Cody D; Cordes, Andreas; Cremata, Edward J; Della Penna, Nicolas; Estel, Vivien; Fedor, Anna; Fitneva, Stanka A; Frank, Michael C; Grange, James A; Hartshorne, Joshua K; Hasselman, Fred; Henninger, Felix; van der Hulst, Marije; Jonas, Kai J; Lai, Calvin K; Levitan, Carmel A; Miller, Jeremy K; Moore, Katherine S; Meixner, Johannes M; Munafò, Marcus R; Neijenhuijs, Koen I; Nilsonne, Gustav; Nosek, Brian A; Plessow, Franziska; Prenoveau, Jason M; Ricker, Ashley A; Schmidt, Kathleen; Spies, Jeffrey R; Stieger, Stefan; Strohminger, Nina; Sullivan, Gavin B; van Aert, Robbie C M; van Assen, Marcel A L M; Vanpaemel, Wolf; Vianello, Michelangelo; Voracek, Martin; Zuni, Kellylynn

    2016-03-01

    Gilbert et al. conclude that evidence from the Open Science Collaboration's Reproducibility Project: Psychology indicates high reproducibility, given the study methodology. Their very optimistic assessment is limited by statistical misconceptions and by causal inferences from selectively interpreted, correlational data. Using the Reproducibility Project: Psychology data, both optimistic and pessimistic conclusions about reproducibility are possible, and neither are yet warranted.

  4. Response to Comment on "Estimating the reproducibility of psychological science".

    PubMed

    Anderson, Christopher J; Bahník, Štěpán; Barnett-Cowan, Michael; Bosco, Frank A; Chandler, Jesse; Chartier, Christopher R; Cheung, Felix; Christopherson, Cody D; Cordes, Andreas; Cremata, Edward J; Della Penna, Nicolas; Estel, Vivien; Fedor, Anna; Fitneva, Stanka A; Frank, Michael C; Grange, James A; Hartshorne, Joshua K; Hasselman, Fred; Henninger, Felix; van der Hulst, Marije; Jonas, Kai J; Lai, Calvin K; Levitan, Carmel A; Miller, Jeremy K; Moore, Katherine S; Meixner, Johannes M; Munafò, Marcus R; Neijenhuijs, Koen I; Nilsonne, Gustav; Nosek, Brian A; Plessow, Franziska; Prenoveau, Jason M; Ricker, Ashley A; Schmidt, Kathleen; Spies, Jeffrey R; Stieger, Stefan; Strohminger, Nina; Sullivan, Gavin B; van Aert, Robbie C M; van Assen, Marcel A L M; Vanpaemel, Wolf; Vianello, Michelangelo; Voracek, Martin; Zuni, Kellylynn

    2016-03-01

    Gilbert et al. conclude that evidence from the Open Science Collaboration's Reproducibility Project: Psychology indicates high reproducibility, given the study methodology. Their very optimistic assessment is limited by statistical misconceptions and by causal inferences from selectively interpreted, correlational data. Using the Reproducibility Project: Psychology data, both optimistic and pessimistic conclusions about reproducibility are possible, and neither are yet warranted. PMID:26941312

  5. Reproducibility Data on SUMMiT

    SciTech Connect

    Irwin, Lloyd; Jakubczak, Jay; Limary, Siv; McBrayer, John; Montague, Stephen; Smith, James; Sniegowski, Jeffry; Stewart, Harold; de Boer, Maarten

    1999-07-16

    SUMMiT (Sandia Ultra-planar Multi-level MEMS Technology) at the Sandia National Laboratories' MDL (Microelectronics Development Laboratory) is a standardized MEMS (Microelectromechanical Systems) technology that allows designers to fabricate concept prototypes. This technology provides four polysilicon layers plus three sacrificial oxide layers (with the third oxide layer being planarized) to enable fabrication of complex mechanical systems-on-a-chip. Quantified reproducibility of the SUMMiT process is important for process engineers as well as designers. Summary statistics for critical MEMS technology parameters such as film thickness, line width, and sheet resistance will be reported for the SUMMiT process. Additionally, data from Van der Pauw test structures will be presented. Data on film thickness, film uniformity and critical dimensions of etched line widths are collected from both process and monitor wafers during manufacturing using film thickness metrology tools and SEM tools. A standardized diagnostic module is included in each SUMMiT run to obtain post-processing parametric data to monitor run-to-run reproducibility, such as Van der Pauw structures for measuring sheet resistance. This characterization of the SUMMiT process enables design for manufacturability in the SUMMiT technology.
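
    The sheet resistance reported from Van der Pauw structures is obtained by solving the Van der Pauw equation for the two measured four-terminal resistances. A minimal numerical sketch follows; the resistance values are illustrative, not SUMMiT process data.

```python
# Sketch: solve the Van der Pauw equation
#   exp(-pi*R_A/R_s) + exp(-pi*R_B/R_s) = 1
# for the sheet resistance R_s, given two four-terminal resistances R_A and R_B.
# Input values are illustrative, not SUMMiT process data.
import numpy as np
from scipy.optimize import brentq

def sheet_resistance(r_a, r_b):
    f = lambda r_s: np.exp(-np.pi * r_a / r_s) + np.exp(-np.pi * r_b / r_s) - 1.0
    lo = 0.1 * (r_a + r_b)            # bracketing interval for the root
    hi = 100.0 * (r_a + r_b)
    return brentq(f, lo, hi)

print(f"R_s = {sheet_resistance(4.2, 4.6):.2f} ohm/sq")
```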

  6. Extreme Rainfall Events Over Southern Africa: Assessment of a Climate Model to Reproduce Daily Extremes

    NASA Astrophysics Data System (ADS)

    Williams, C.; Kniveton, D.; Layberry, R.

    2007-12-01

    It is increasingly accepted that any possible climate change will not only have an influence on mean climate but may also significantly alter climatic variability. This issue is of particular importance for environmentally vulnerable regions such as southern Africa. The subcontinent is considered especially vulnerable to extreme events, due to a number of factors including extensive poverty, disease and political instability. Rainfall variability and the identification of rainfall extremes are a function of scale, so high spatial and temporal resolution data are preferred to identify extreme events and accurately predict future variability. The majority of previous climate model verification studies have compared model output with observational data at monthly timescales. In this research, the assessment of a state-of-the-art climate model to simulate climate at daily timescales is carried out using satellite derived rainfall data from the Microwave Infra-Red Algorithm (MIRA). This dataset covers the period 1993-2002 and the whole of southern Africa at a spatial resolution of 0.1 degree longitude/latitude. Once the model's ability to reproduce extremes has been assessed, idealised regions of SST anomalies are used to force the model, with the overall aim of investigating the ways in which SST anomalies influence rainfall extremes over southern Africa. In this paper, results from sensitivity testing of the domain size of the UK Meteorological Office Hadley Centre's climate model are presented first. Then simulations of current climate from the model, operating in both regional and global mode, are compared to the MIRA dataset at daily timescales. Thirdly, the ability of the model to reproduce daily rainfall extremes will be assessed, again by a comparison with extremes from the MIRA dataset. Finally, the results from the idealised SST experiments are briefly presented, suggesting associations between rainfall extremes and both local and remote SST anomalies.

  7. PSYCHOLOGY. Estimating the reproducibility of psychological science.

    PubMed

    2015-08-28

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams. PMID:26315443

  8. PSYCHOLOGY. Estimating the reproducibility of psychological science.

    PubMed

    2015-08-28

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams.

  9. Datathons and Software to Promote Reproducible Research

    PubMed Central

    2016-01-01

    Background Datathons facilitate collaboration between clinicians, statisticians, and data scientists in order to answer important clinical questions. Previous datathons have resulted in numerous publications of interest to the critical care community and serve as a viable model for interdisciplinary collaboration. Objective We report on an open-source software called Chatto that was created by members of our group, in the context of the second international Critical Care Datathon, held in September 2015. Methods Datathon participants formed teams to discuss potential research questions and the methods required to address them. They were provided with the Chatto suite of tools to facilitate their teamwork. Each multidisciplinary team spent the next 2 days with clinicians working alongside data scientists to write code, extract and analyze data, and reformulate their queries in real time as needed. All projects were then presented on the last day of the datathon to a panel of judges that consisted of clinicians and scientists. Results Use of Chatto was particularly effective in the datathon setting, enabling teams to reduce the time spent configuring their research environments to just a few minutes—a process that would normally take hours to days. Chatto continued to serve as a useful research tool after the conclusion of the datathon. Conclusions This suite of tools fulfills two purposes: (1) facilitation of interdisciplinary teamwork through archiving and version control of datasets, analytical code, and team discussions, and (2) advancement of research reproducibility by functioning postpublication as an online environment in which independent investigators can rerun or modify analyses with relative ease. With the introduction of Chatto, we hope to solve a variety of challenges presented by collaborative data mining projects while improving research reproducibility. PMID:27558834

  10. Reproducibility of techniques using Archimedes' principle in measuring cancellous bone volume.

    PubMed

    Zou, L; Bloebaum, R D; Bachus, K N

    1997-01-01

    Researchers have been interested in developing techniques to accurately and reproducibly measure the volume fraction of cancellous bone. Historically bone researchers have used Archimedes' principle with water to measure the volume fraction of cancellous bone. Preliminary results in our lab suggested that the calibrated water technique did not provide reproducible results. Because of this difficulty, it was decided to compare the conventional water method with a water-with-surfactant method and a helium method using a micropycnometer. The water/surfactant and the helium methods were attempts to improve the fluid penetration into the small voids present in the cancellous bone structure. In order to compare the reproducibility of the new methods with the conventional water method, 16 cancellous bone specimens were obtained from femoral condyles of human and greyhound dog femora. The volume fraction measurements on each specimen were repeated three times with all three techniques. The results showed that the helium displacement method was more than an order of magnitude more reproducible than the two other water methods (p < 0.05). Statistical analysis also showed that the conventional water method produced the lowest reproducibility (p < 0.05). The data from this study indicate that the helium displacement technique is a very useful, rapid and reproducible tool for quantitatively characterizing anisotropic porous tissue structures such as cancellous bone.
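
    Volume determination by Archimedes' principle reduces to a few mass and density relations, from which a cancellous bone volume fraction can be formed. The sketch below shows one such calculation with illustrative numbers; it is not the authors' protocol (in particular, the helium micropycnometer derives volume from gas-law pressure ratios rather than buoyancy).

```python
# Sketch: bone volume fraction from Archimedes' principle.
# Bulk (envelope) volume from buoyancy of the sealed specimen; material volume from
# buoyancy of the fully fluid-penetrated specimen. All numbers are illustrative.
RHO_WATER = 0.9982          # g/cm^3 at ~20 degC

def submerged_volume(mass_air_g, mass_submerged_g, rho_fluid=RHO_WATER):
    """Volume of displaced fluid (cm^3) from the apparent mass loss in the fluid."""
    return (mass_air_g - mass_submerged_g) / rho_fluid

# Bulk volume: specimen sealed so fluid cannot enter the pores.
bulk_volume = submerged_volume(2.10, 0.95)
# Material volume: specimen saturated so fluid fills the pore space.
material_volume = submerged_volume(2.10, 1.72)

volume_fraction = material_volume / bulk_volume      # bone volume / total volume
print(f"bulk {bulk_volume:.2f} cm^3, material {material_volume:.2f} cm^3, "
      f"volume fraction = {volume_fraction:.2f}")
```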

  11. Reproducibility of techniques using Archimedes' principle in measuring cancellous bone volume.

    PubMed

    Zou, L; Bloebaum, R D; Bachus, K N

    1997-01-01

    Researchers have been interested in developing techniques to accurately and reproducibly measure the volume fraction of cancellous bone. Historically bone researchers have used Archimedes' principle with water to measure the volume fraction of cancellous bone. Preliminary results in our lab suggested that the calibrated water technique did not provide reproducible results. Because of this difficulty, it was decided to compare the conventional water method with a water-with-surfactant method and a helium method using a micropycnometer. The water/surfactant and the helium methods were attempts to improve the fluid penetration into the small voids present in the cancellous bone structure. In order to compare the reproducibility of the new methods with the conventional water method, 16 cancellous bone specimens were obtained from femoral condyles of human and greyhound dog femora. The volume fraction measurements on each specimen were repeated three times with all three techniques. The results showed that the helium displacement method was more than an order of magnitude more reproducible than the two other water methods (p < 0.05). Statistical analysis also showed that the conventional water method produced the lowest reproducibility (p < 0.05). The data from this study indicate that the helium displacement technique is a very useful, rapid and reproducible tool for quantitatively characterizing anisotropic porous tissue structures such as cancellous bone. PMID:9140874

  12. Fourier modeling of the BOLD response to a breath-hold task: Optimization and reproducibility.

    PubMed

    Pinto, Joana; Jorge, João; Sousa, Inês; Vilela, Pedro; Figueiredo, Patrícia

    2016-07-15

    Cerebrovascular reactivity (CVR) reflects the capacity of blood vessels to adjust their caliber in order to maintain a steady supply of brain perfusion, and it may provide a sensitive disease biomarker. Measurement of the blood oxygen level dependent (BOLD) response to a hypercapnia-inducing breath-hold (BH) task has been frequently used to map CVR noninvasively using functional magnetic resonance imaging (fMRI). However, the best modeling approach for the accurate quantification of CVR maps remains an open issue. Here, we compare and optimize Fourier models of the BOLD response to a BH task with a preparatory inspiration, and assess the test-retest reproducibility of the associated CVR measurements, in a group of 10 healthy volunteers studied over two fMRI sessions. Linear combinations of sine-cosine pairs at the BH task frequency and its successive harmonics were added sequentially in a nested models approach, and were compared in terms of the adjusted coefficient of determination and corresponding variance explained (VE) of the BOLD signal, as well as the number of voxels exhibiting significant BOLD responses, the estimated CVR values, and their test-retest reproducibility. The brain average VE increased significantly with the Fourier model order, up to the 3rd order. However, the number of responsive voxels increased significantly only up to the 2nd order, and started to decrease from the 3rd order onwards. Moreover, no significant relative underestimation of CVR values was observed beyond the 2nd order. Hence, the 2nd order model was concluded to be the optimal choice for the studied paradigm. This model also yielded the best test-retest reproducibility results, with intra-subject coefficients of variation of 12 and 16% and an intra-class correlation coefficient of 0.74. In conclusion, our results indicate that a Fourier series set consisting of a sine-cosine pair at the BH task frequency and its two harmonics is a suitable model for BOLD-fMRI CVR measurements
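
    The nested Fourier models compared in this study are linear regressions on sine-cosine pairs at the task frequency and its harmonics, ranked by the adjusted coefficient of determination. A minimal sketch with a synthetic BOLD-like signal follows; the repetition time, task period, noise level and model orders are illustrative assumptions.

```python
# Sketch: nested Fourier models of a breath-hold BOLD response, compared by
# adjusted R^2. Task period, TR, orders and noise level are illustrative.
import numpy as np

TR, n_vol, period = 2.0, 150, 60.0            # s, volumes, breath-hold cycle length (s)
t = np.arange(n_vol) * TR
rng = np.random.default_rng(2)
signal = 1.5 * np.sin(2 * np.pi * t / period - 1.0) \
         + 0.4 * np.sin(4 * np.pi * t / period) + rng.normal(0.0, 0.8, n_vol)

def fourier_design(t, period, order):
    cols = [np.ones_like(t)]
    for k in range(1, order + 1):
        cols += [np.sin(2 * np.pi * k * t / period), np.cos(2 * np.pi * k * t / period)]
    return np.column_stack(cols)

def adjusted_r2(y, X):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    n, p = X.shape                            # p includes the intercept column
    r2 = 1.0 - resid.var() / y.var()
    return 1.0 - (1.0 - r2) * (n - 1) / (n - p)

for order in (1, 2, 3, 4):
    X = fourier_design(t, period, order)
    print(f"order {order}: adjusted R^2 = {adjusted_r2(signal, X):.3f}")
```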

  13. Reproducibility and quantitation of amplicon sequencing-based detection.

    PubMed

    Zhou, Jizhong; Wu, Liyou; Deng, Ye; Zhi, Xiaoyang; Jiang, Yi-Huei; Tu, Qichao; Xie, Jianping; Van Nostrand, Joy D; He, Zhili; Yang, Yunfeng

    2011-08-01

    To determine the reproducibility and quantitation of the amplicon sequencing-based detection approach for analyzing microbial community structure, a total of 24 microbial communities from a long-term global change experimental site were examined. Genomic DNA obtained from each community was used to amplify 16S rRNA genes with two or three barcode tags as technical replicates in the presence of a small quantity (0.1% wt/wt) of genomic DNA from Shewanella oneidensis MR-1 as the control. The technical reproducibility of the amplicon sequencing-based detection approach is quite low, with an average operational taxonomic unit (OTU) overlap of 17.2%±2.3% between two technical replicates, and 8.2%±2.3% among three technical replicates, which is most likely due to problems associated with random sampling processes. Such variations in technical replicates could have substantial effects on estimating β-diversity but less on α-diversity. A high variation was also observed in the control across different samples (for example, 66.7-fold for the forward primer), suggesting that the amplicon sequencing-based detection approach could not be quantitative. In addition, various strategies were examined to improve the comparability of amplicon sequencing data, such as increasing biological replicates, and removing singleton sequences and less-representative OTUs across biological replicates. Finally, as expected, various statistical analyses with preprocessed experimental data revealed clear differences in the composition and structure of microbial communities between warming and non-warming, or between clipping and non-clipping. Taken together, these results suggest that amplicon sequencing-based detection is useful in analyzing microbial community structure even though it is not reproducible and quantitative. However, great caution should be taken in experimental design and data interpretation when the amplicon sequencing-based detection approach is used for quantitative
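
    The technical-reproducibility figure quoted here is a pairwise OTU overlap between replicate libraries of the same sample. A minimal sketch of such an overlap computation on toy OTU sets follows; the overlap is defined here as shared OTUs over the union of detected OTUs, which may not match the authors' exact definition.

```python
# Sketch: pairwise OTU overlap between technical replicates of the same sample.
# Overlap defined here as shared OTUs / OTUs detected in either replicate
# (the published definition may differ); the OTU sets are toy data.
def otu_overlap(rep_a, rep_b):
    shared = rep_a & rep_b
    detected = rep_a | rep_b
    return 100.0 * len(shared) / len(detected)

replicate_1 = {f"OTU{i}" for i in range(0, 120)}
replicate_2 = {f"OTU{i}" for i in range(90, 210)}
print(f"overlap: {otu_overlap(replicate_1, replicate_2):.1f}%")
```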

  14. REPRODUCIBLE AND SHAREABLE QUANTIFICATIONS OF PATHOGENICITY

    PubMed Central

    Manrai, Arjun K; Wang, Brice L; Patel, Chirag J; Kohane, Isaac S

    2016-01-01

    There are now hundreds of thousands of pathogenicity assertions that relate genetic variation to disease, but most of this clinically utilized variation has no accepted quantitative disease risk estimate. Recent disease-specific studies have used control sequence data to reclassify large amounts of prior pathogenic variation, but there is a critical need to scale up both the pace and feasibility of such pathogenicity reassessments across human disease. In this manuscript we develop a shareable computational framework to quantify pathogenicity assertions. We release a reproducible “digital notebook” that integrates executable code, text annotations, and mathematical expressions in a freely accessible statistical environment. We extend previous disease-specific pathogenicity assessments to over 6,000 diseases and 160,000 assertions in the ClinVar database. Investigators can use this platform to prioritize variants for reassessment and tailor genetic model parameters (such as prevalence and heterogeneity) to expose the uncertainty underlying pathogenicity-based risk assessments. Finally, we release a website that links users to pathogenic variation for a queried disease, supporting literature, and implied disease risk calculations subject to user-defined and disease-specific genetic risk models in order to facilitate variant reassessments. PMID:26776189
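
    Quantifying the disease risk implied by a pathogenicity assertion typically amounts to a Bayesian update of the disease prevalence by the variant's frequencies in cases and controls. The sketch below shows such a calculation with illustrative parameter values; it is not the authors' released notebook, and the parameter names and numbers are assumptions.

```python
# Sketch: implied disease risk for a variant carrier via Bayes' rule,
#   P(disease | variant) = P(variant | disease) * P(disease) / P(variant).
# Prevalence and allele frequencies are illustrative assumptions.
def carrier_risk(prevalence, freq_in_cases, freq_in_controls):
    p_variant = freq_in_cases * prevalence + freq_in_controls * (1.0 - prevalence)
    return freq_in_cases * prevalence / p_variant

risk = carrier_risk(prevalence=0.002,        # disease prevalence
                    freq_in_cases=0.01,      # variant frequency among cases
                    freq_in_controls=0.0005) # variant frequency among controls
print(f"implied risk for a carrier: {risk:.1%}")
```

    Even a strong enrichment of the variant among cases translates into a modest absolute risk when the disease is rare, which is the kind of uncertainty such a framework is meant to expose.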

  15. Laboratory 20-km cycle time trial reproducibility.

    PubMed

    Zavorsky, G S; Murias, J M; Gow, J; Kim, D J; Poulin-Harnois, C; Kubow, S; Lands, L C

    2007-09-01

    This study evaluated the reproducibility of laboratory based 20-km time trials in well trained versus recreational cyclists. Eighteen cyclists (age = 34 +/- 8 yrs; body mass index = 23.1 +/- 2.2 kg/m (2); VO(2max) = 4.19 +/- 0.65 L/min) completed three 20-km time trials over a month on a Velotron cycle ergometer. Average power output (PO) (W), speed, and heart rate (HR) were significantly lower in the first time trial compared to the second and third time trial. The coefficients of variation (CV) between the second and third trial of the top eight performers for average PO, time to completion, and speed were 1.2 %, 0.6 %, 0.5 %, respectively, compared to 4.8 %, 2.0 %, and 2.3 % for the bottom ten. In addition, the average HR, VO(2), and percentage of VO(2max) were similar between trials. This study demonstrated that (1) a familiarization session improves the reliability of the measurements (i.e., average PO, time to completion and speed), and (2) the CV was much smaller for the best performers.

  16. Modelling soil erosion at European scale: towards harmonization and reproducibility

    NASA Astrophysics Data System (ADS)

    Bosco, C.; de Rigo, D.; Dewitte, O.; Poesen, J.; Panagos, P.

    2015-02-01

    Soil erosion by water is one of the most widespread forms of soil degradation. The loss of soil as a result of erosion can lead to decline in organic matter and nutrient contents, breakdown of soil structure and reduction of the water-holding capacity. Measuring soil loss across the whole landscape is impractical and thus research is needed to improve methods of estimating soil erosion with computational modelling, upon which integrated assessment and mitigation strategies may be based. Despite the efforts, the prediction value of existing models is still limited, especially at regional and continental scale, because a systematic knowledge of local climatological and soil parameters is often unavailable. A new approach for modelling soil erosion at regional scale is here proposed. It is based on the joint use of low-data-demanding models and innovative techniques for better estimating model inputs. The proposed modelling architecture has at its basis the semantic array programming paradigm and a strong effort towards computational reproducibility. An extended version of the Revised Universal Soil Loss Equation (RUSLE) has been implemented merging different empirical rainfall-erosivity equations within a climatic ensemble model and adding a new factor for a better consideration of soil stoniness within the model. Pan-European soil erosion rates by water have been estimated through the use of publicly available data sets and locally reliable empirical relationships. The accuracy of the results is corroborated by a visual plausibility check (63% of a random sample of grid cells are accurate, 83% at least moderately accurate, bootstrap p ≤ 0.05). A comparison with country-level statistics of pre-existing European soil erosion maps is also provided.
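
    The extended RUSLE described here keeps the multiplicative RUSLE structure, with rainfall erosivity taken from an ensemble of empirical equations and an additional stoniness correction. The sketch below illustrates that structure only; the ensemble members, the stoniness factor and all numerical values are placeholders, not the actual e-RUSLE parameterization.

```python
# Sketch: RUSLE-style soil loss A = R * K * LS * C * P, with the rainfall erosivity R
# taken as the mean of an ensemble of empirical estimates and an extra stoniness
# correction St. Ensemble members and all numbers are placeholders.
import numpy as np

def ensemble_erosivity(estimates):
    """Combine erosivity values from several empirical equations (MJ mm ha^-1 h^-1 yr^-1)."""
    return float(np.mean(estimates))

def soil_loss(R, K, LS, C, P, St=1.0):
    """Annual soil loss in t ha^-1 yr^-1, with St a [0, 1] stoniness reduction factor."""
    return R * K * LS * C * P * St

R = ensemble_erosivity([850.0, 910.0, 790.0])   # three hypothetical erosivity equations
A = soil_loss(R, K=0.032, LS=1.8, C=0.12, P=1.0, St=0.85)
print(f"estimated soil loss: {A:.1f} t/ha/yr")
```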

  17. Modelling soil erosion at European scale: towards harmonization and reproducibility

    NASA Astrophysics Data System (ADS)

    Bosco, C.; de Rigo, D.; Dewitte, O.; Poesen, J.; Panagos, P.

    2014-04-01

    Soil erosion by water is one of the most widespread forms of soil degradation. The loss of soil as a result of erosion can lead to decline in organic matter and nutrient contents, breakdown of soil structure and reduction of the water holding capacity. Measuring soil loss across the whole landscape is impractical and thus research is needed to improve methods of estimating soil erosion with computational modelling, upon which integrated assessment and mitigation strategies may be based. Despite the efforts, the prediction value of existing models is still limited, especially at regional and continental scale. A new approach for modelling soil erosion at large spatial scale is here proposed. It is based on the joint use of low data demanding models and innovative techniques for better estimating model inputs. The proposed modelling architecture has at its basis the semantic array programming paradigm and a strong effort towards computational reproducibility. An extended version of the Revised Universal Soil Loss Equation (RUSLE) has been implemented merging different empirical rainfall-erosivity equations within a climatic ensemble model and adding a new factor for a better consideration of soil stoniness within the model. Pan-European soil erosion rates by water have been estimated through the use of publicly available datasets and locally reliable empirical relationships. The accuracy of the results is corroborated by a visual plausibility check (63% of a random sample of grid cells are accurate, 83% at least moderately accurate, bootstrap p ≤ 0.05). A comparison with country level statistics of pre-existing European maps of soil erosion by water is also provided.

  18. Accurate comparison of antibody expression levels by reproducible transgene targeting in engineered recombination-competent CHO cells.

    PubMed

    Mayrhofer, Patrick; Kratzer, Bernhard; Sommeregger, Wolfgang; Steinfellner, Willibald; Reinhart, David; Mader, Alexander; Turan, Soeren; Qiao, Junhua; Bode, Juergen; Kunert, Renate

    2014-12-01

    Over the years, Chinese hamster ovary (CHO) cells have emerged as the major host for expressing biotherapeutic proteins. Traditional methods to generate high-producer cell lines rely on random integration(s) of the gene of interest but have thereby left the identification of bottlenecks as a challenging task. For comparison of different producer cell lines derived from various transfections, a system that provides control over transgene expression behavior is highly needed. This motivated us to develop a novel "DUKX-B11 F3/F" cell line to target different single-chain antibody fragments into the same chromosomal target site by recombinase-mediated cassette exchange (RMCE) using the flippase (FLP)/FLP recognition target (FRT) system. The RMCE-competent cell line contains a gfp reporter fused to a positive/negative selection system flanked by heterospecific FRT (F) variants under control of an external CMV promoter, constructed as "promoter trap". The expression stability and FLP accessibility of the tagged locus was demonstrated by successive rounds of RMCE. As a proof of concept, we performed RMCE using cassettes encoding two different anti-HIV single-chain Fc fragments, 3D6scFv-Fc and 2F5scFv-Fc. Both targeted integrations yielded homogenous cell populations with comparable intracellular product contents and messenger RNA (mRNA) levels but product related differences in specific productivities. These studies confirm the potential of the newly available "DUKX-B11 F3/F" cell line to guide different transgenes into identical transcriptional control regions by RMCE and thereby generate clones with comparable amounts of transgene mRNA. This new host is a prerequisite for cell biology studies of independent transfections and transgenes.

  19. Towards interoperable and reproducible QSAR analyses: Exchange of datasets

    PubMed Central

    2010-01-01

    Background QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses, and drastically constraining collaboration and re-use of data. Results We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Conclusions Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusions regarding descriptors by defining them crisply. This makes it easy to join, extend, and combine datasets.

  20. Experimental challenges to reproduce seismic fault motion

    NASA Astrophysics Data System (ADS)

    Shimamoto, T.

    2011-12-01

    This presentation briefly reviews scientific and technical development in the studies of intermediate to high-velocity frictional properties of faults and summarizes remaining technical challenges to reproduce nucleation to growth processes of large earthquakes in the laboratory. Nearly 10 high-velocity or low to high-velocity friction apparatuses have been built in the last several years in the world, and it has now become possible to produce slip rates ranging from sub-plate velocities to seismic slip rates in a single machine. Despite the spread of high-velocity friction studies, reproducing seismic fault motion at high P and T conditions to cover the entire seismogenic zone is still a big challenge. Previous studies focused on (1) frictional melting, (2) thermal pressurization, and (3) high-velocity gouge behavior without frictional melting. The frictional melting process was solved as a Stefan problem, with very good agreement with experimental results. Thermal pressurization has been solved theoretically based on measured transport properties and has been included successfully in the modeling of earthquake generation. High-velocity gouge experiments in the last several years have revealed that a wide variety of gouges exhibit dramatic weakening at high velocities (e.g., Di Toro et al., 2011, Nature). Most gouge experiments were done under dry conditions partly to separate gouge friction from the involvement of thermal pressurization. However, recent studies demonstrated that dehydration or degassing due to mineral decomposition can occur during seismic fault motion. Those results not only provided a new way of looking at natural fault zones in search of geological evidence of seismic fault motion, but also indicated that thermal pressurization and gouge weakening can occur simultaneously even in initially dry gouge. Thus experiments with controlled pore pressure are needed. I have struggled to make a pressure vessel for wet high-velocity experiments in the last several years. A technical

  1. High Reproducibility of ELISPOT Counts from Nine Different Laboratories.

    PubMed

    Sundararaman, Srividya; Karulin, Alexey Y; Ansari, Tameem; BenHamouda, Nadine; Gottwein, Judith; Laxmanan, Sreenivas; Levine, Steven M; Loffredo, John T; McArdle, Stephanie; Neudoerfl, Christine; Roen, Diana; Silina, Karina; Welch, Mackenzie; Lehmann, Paul V

    2015-01-01

    The primary goal of immune monitoring with ELISPOT is to measure the number of T cells, specific for any antigen, accurately and reproducibly between different laboratories. In ELISPOT assays, antigen-specific T cells secrete cytokines, forming spots of different sizes on a membrane with variable background intensities. Due to the subjective nature of judging maximal and minimal spot sizes, different investigators come up with different numbers. This study aims to determine whether statistics-based, automated size-gating can harmonize the number of spot counts calculated between different laboratories. We plated PBMC at four different concentrations, 24 replicates each, in an IFN-γ ELISPOT assay with HCMV pp65 antigen. The ELISPOT plate, and an image file of the plate was counted in nine different laboratories using ImmunoSpot® Analyzers by (A) Basic Count™ relying on subjective counting parameters set by the respective investigators and (B) SmartCount™, an automated counting protocol by the ImmunoSpot® Software that uses statistics-based spot size auto-gating with spot intensity auto-thresholding. The average coefficient of variation (CV) for the mean values between independent laboratories was 26.7% when counting with Basic Count™, and 6.7% when counting with SmartCount™. Our data indicates that SmartCount™ allows harmonization of counting ELISPOT results between different laboratories and investigators. PMID:25585297
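
    The harmonization metric used here is the coefficient of variation of the mean spot counts reported by the independent laboratories. A minimal sketch of that CV computation follows; the per-laboratory counts are made-up numbers chosen only to illustrate the contrast between subjective and automated gating.

```python
# Sketch: coefficient of variation (CV) of mean ELISPOT counts across laboratories.
# The per-laboratory mean counts below are made-up numbers for illustration.
import numpy as np

def cv_percent(values):
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

basic_count = [182, 121, 140, 167, 201, 154, 133, 190, 148]   # subjective gating
smart_count = [158, 151, 163, 149, 166, 155, 160, 146, 152]   # automated gating
print(f"CV with subjective gating: {cv_percent(basic_count):.1f}%")
print(f"CV with automated gating:  {cv_percent(smart_count):.1f}%")
```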

  2. Reproducibility and utility of dune luminescence chronologies

    NASA Astrophysics Data System (ADS)

    Leighton, Carly L.; Thomas, David S. G.; Bailey, Richard M.

    2014-02-01

    Optically stimulated luminescence (OSL) dating of dune deposits has increasingly been used as a tool to investigate the response of aeolian systems to environmental change. Amalgamation of individual dune accumulation chronologies has been employed in order to distinguish regional from local geomorphic responses to change. However, advances in dating have produced chronologies of increasing complexity. In particular, questions regarding the interpretation of dune ages have been raised, including over the most appropriate method to evaluate the significance of suites of OSL ages when local 'noisy' and discontinuous records are combined. In this paper, these issues are reviewed and the reproducibility of dune chronologies is assessed. OSL ages from two cores sampled from the same dune in the northeast Rub' al Khali, United Arab Emirates, are presented and compared, alongside an analysis of previously published dune ages dated to within the last 30 ka. Distinct periods of aeolian activity and preservation are identified, which can be tied to regional climatic and environmental changes. This case study is used to address fundamental questions that are persistently asked of dune dating studies, including the appropriate spatial scale over which to infer environmental and climatic change based on dune chronologies, whether chronological hiatuses can be interpreted, how to most appropriately combine and display datasets, and the relationship between geomorphic and palaeoclimatic signals. Chronological profiles reflect localised responses to environmental variability and climatic forcing, and amalgamation of datasets, with consideration of sampling resolution, is required; otherwise local factors are always likely to dominate. Using net accumulation rates to display ages may provide an informative approach of analysing and presenting dune OSL chronologies less susceptible to biases resulting from insufficient sampling resolution.

  3. 3D models of slow motions in the Earth's crust and upper mantle in the source zones of seismically active regions and their comparison with highly accurate observational data: I. Main relationships

    NASA Astrophysics Data System (ADS)

    Molodenskii, S. M.; Molodenskii, M. S.; Begitova, T. A.

    2016-09-01

    Constructing detailed models for postseismic and coseismic deformations of the Earth's surface has become particularly important because of the recently established possibility to continuously monitor the tectonic stresses in the source zones based on the data on the time variations in the tidal tilt amplitudes. Below, a new method is suggested for solving the inverse problem for the coseismic and postseismic deformations in the real non-ideally elastic, radially and horizontally heterogeneous, self-gravitating Earth with a hydrostatic distribution of the initial stresses from the satellite data on the ground surface displacements. The solution of this problem is based on decomposing the parameters determining the geometry of the fault surface and the distribution of the dislocation vector on this surface and elastic moduli in the source in the orthogonal bases. The suggested approach includes four steps: 1. Calculating (by the perturbation method) the variations in Green's function for the radial and tangential ground surface displacements with small 3D variations in the mechanical parameters and geometry of the source area (i.e., calculating the functional derivatives of the three components of Green's function on the surface from the distributions of the elastic moduli and creep function within the volume of the source area and Burgers' vector on the surface of the dislocations); 2. Successive orthogonalization of the functional derivatives; 3. Passing from the decompositions of the residuals between the observed and modeled surface displacements in the system of nonorthogonalized functional derivatives to their decomposition in the system of orthogonalized derivatives; finding the corrections to the distributions of the sought parameters from the coefficients of their decompositions in the orthogonalized basis; and 4. Analyzing the ambiguity of the inverse problem solution by constructing the orthogonal complement to the obtained basis. The described
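
    Steps 2 and 3 of this scheme, orthogonalizing the functional derivatives and re-expanding the data residuals in the orthogonalized system, correspond in a discretized setting to Gram-Schmidt orthogonalization followed by projection. The finite-dimensional sketch below illustrates only that linear algebra; the Green's-function calculations themselves are not reproduced.

```python
# Sketch: Gram-Schmidt orthogonalization of a set of (discretized) functional
# derivatives, then expansion of a data residual in the orthogonalized basis.
# Finite-dimensional toy illustration only; no geophysics here.
import numpy as np

def gram_schmidt(vectors, tol=1e-12):
    basis = []
    for v in vectors:
        w = v - sum(np.dot(v, q) * q for q in basis)
        norm = np.linalg.norm(w)
        if norm > tol:                      # drop directions already spanned
            basis.append(w / norm)
    return np.array(basis)

rng = np.random.default_rng(3)
derivatives = rng.normal(size=(4, 50))      # 4 discretized functional derivatives
residual = rng.normal(size=50)              # observed-minus-modelled displacements

Q = gram_schmidt(derivatives)
coeffs = Q @ residual                       # expansion coefficients in the new basis
explained = Q.T @ coeffs                    # part of the residual the basis can explain
unexplained = residual - explained          # lies in the orthogonal complement (step 4)
print("coefficients:", np.round(coeffs, 3))
print("unexplained norm:", round(float(np.linalg.norm(unexplained)), 3))
```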

  4. Research Reproducibility in Geosciences: Current Landscape, Practices and Perspectives

    NASA Astrophysics Data System (ADS)

    Yan, An

    2016-04-01

    Reproducibility of research can gauge the validity of its findings. Yet we currently lack an understanding of how much of a problem research reproducibility is in the geosciences. We developed an online survey of faculty and graduate students in geosciences, and received 136 responses from research institutions and universities in the Americas, Asia, Europe and other parts of the world. This survey examined (1) the current state of research reproducibility in geosciences, by asking about researchers' experiences with unsuccessful replication work and the obstacles that led to their replication failures; (2) current reproducibility practices in the community, by asking what efforts researchers made to reproduce others' work and to make their own work reproducible, and what underlying factors contribute to irreproducibility; and (3) perspectives on reproducibility, by collecting researchers' thoughts and opinions on this issue. The survey results indicated that nearly 80% of respondents who had ever attempted to reproduce a published study had failed at least once. Only one third of the respondents received helpful feedback when they contacted the authors of a published study for data, code, or other information. The primary factors that lead to unsuccessful replication attempts are insufficient detail in the instructions given in the published literature, and inaccessibility of the data, code and tools needed for the study. Our findings suggest a remarkable lack of research reproducibility in geoscience. Changing the incentive mechanisms in academia, as well as developing policies and tools that facilitate open data and code sharing, are promising ways for the geosciences community to alleviate this reproducibility problem.

  5. Semiautomated, Reproducible Batch Processing of Soy

    NASA Technical Reports Server (NTRS)

    Thoerne, Mary; Byford, Ivan W.; Chastain, Jack W.; Swango, Beverly E.

    2005-01-01

    A computer-controlled apparatus processes batches of soybeans into one or more of a variety of food products, under conditions that can be chosen by the user and reproduced from batch to batch. Examples of products include soy milk, tofu, okara (an insoluble protein and fiber byproduct of soy milk), and whey. Most processing steps take place without intervention by the user. This apparatus was developed for use in research on processing of soy. It is also a prototype of other soy-processing apparatuses for research, industrial, and home use. Prior soy-processing equipment includes household devices that automatically produce soy milk but do not automatically produce tofu. The designs of prior soy-processing equipment require users to manually transfer intermediate solid soy products and to press them manually and, hence, under conditions that are not consistent from batch to batch. Prior designs do not afford choices of processing conditions: Users cannot use previously developed soy-processing equipment to investigate the effects of variations of techniques used to produce soy milk (e.g., cold grinding, hot grinding, and pre-cook blanching) and of such process parameters as cooking times and temperatures, grinding times, soaking times and temperatures, rinsing conditions, and sizes of particles generated by grinding. In contrast, the present apparatus is amenable to such investigations. The apparatus (see figure) includes a processing tank and a jacketed holding or coagulation tank. The processing tank can be capped by either of two different heads and can contain either of two different insertable mesh baskets. The first head includes a grinding blade and heating elements. The second head includes an automated press piston. One mesh basket, designated the okara basket, has oblong holes with a size equivalent to about 40 mesh [40 openings per inch (≈16 openings per centimeter)]. The second mesh basket, designated the tofu basket, has holes of 70 mesh [70 openings

  6. A Mechanical System to Reproduce Cardiovascular Flows

    NASA Astrophysics Data System (ADS)

    Lindsey, Thomas; Valsecchi, Pietro

    2010-11-01

    Within the framework of the "Pumps&Pipes" collaboration between ExxonMobil Upstream Research Company and The DeBakey Heart and Vascular Center in Houston, a hydraulic control system was developed to accurately simulate general cardiovascular flows. The final goal of the development of the apparatus was the reproduction of the periodic flow of blood through the heart cavity with the capability of varying frequency and amplitude, as well as designing the systolic/diastolic volumetric profile over one period. The system consists of a computer-controlled linear actuator that drives hydraulic fluid in a closed loop to a secondary hydraulic cylinder. The test section of the apparatus is located inside a MRI machine, and the closed loop serves to physically separate all metal moving parts (control system and actuator cylinder) from the MRI-compatible pieces. The secondary cylinder is composed of nonmetallic elements and directly drives the test section circulatory flow loop. The circulatory loop consists of nonmetallic parts and several types of Newtonian and non-Newtonian fluids, which model the behavior of blood. This design allows for a periodic flow of blood-like fluid pushed through a modeled heart cavity capable of replicating any healthy heart condition as well as simulating anomalous conditions. The behavior of the flow inside the heart can thus be visualized by MRI techniques.

  7. Reproducing the kinematics of damped Lyman α systems

    NASA Astrophysics Data System (ADS)

    Bird, Simeon; Haehnelt, Martin; Neeleman, Marcel; Genel, Shy; Vogelsberger, Mark; Hernquist, Lars

    2015-02-01

    We examine the kinematic structure of damped Lyman α systems (DLAs) in a series of cosmological hydrodynamic simulations using the AREPO code. We are able to match the distribution of velocity widths of associated low-ionization metal absorbers substantially better than earlier work. Our simulations produce a population of DLAs dominated by haloes with virial velocities around 70 km s-1, consistent with a picture of relatively small, faint objects. In addition, we reproduce the observed correlation between velocity width and metallicity and the equivalent width distribution of Si II. Some discrepancies of moderate statistical significance remain; too many of our spectra show absorption concentrated at the edge of the profile and there are slight differences in the exact shape of the velocity width distribution. We show that the improvement over previous work is mostly due to our strong feedback from star formation and our detailed modelling of the metal ionization state.

  8. Reproducing kernel hilbert space based single infrared image super resolution

    NASA Astrophysics Data System (ADS)

    Chen, Liangliang; Deng, Liangjian; Shen, Wei; Xi, Ning; Zhou, Zhanxin; Song, Bo; Yang, Yongliang; Cheng, Yu; Dong, Lixin

    2016-07-01

    The spatial resolution of Infrared (IR) images is limited by lens optical diffraction, sensor array pitch size and pixel dimension. In this work, a robust model is proposed to reconstruct a high resolution infrared image from a single low resolution sampling, where the image features are discussed and classified as reflective, cooled emissive and uncooled emissive based on the infrared irradiation source. A spline based reproducing kernel Hilbert space and an approximate Heaviside function are deployed to model the smooth part and the edge component of the image, respectively. By adjusting the parameters of the Heaviside function, the proposed model can enhance distinct parts of the image. The experimental results show that the model is applicable to both reflective and emissive low resolution infrared images and improves thermal contrast. The overall outcome is a high resolution IR image, which gives the IR camera better measurement accuracy and reveals more detail at long distance.
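
    To illustrate the idea of splitting a signal into a smooth component and a step-like edge component, the following one-dimensional sketch fits a smoothing spline to the smooth part and a smoothed (arctangent) approximation of the Heaviside function to the edge. It is a simplified analogue under stated assumptions (a single edge at a known location, least-squares amplitude fit), not the authors' two-dimensional reproducing-kernel formulation.

        import numpy as np
        from scipy.interpolate import UnivariateSpline

        def approx_heaviside(x, x0, eps=0.05):
            """Smoothed step centred at x0; eps controls how sharp the edge is."""
            return 0.5 + (1.0 / np.pi) * np.arctan((x - x0) / eps)

        # Synthetic 1-D "scanline": smooth background plus one sharp edge at x = 0.6
        rng = np.random.default_rng(0)
        x = np.linspace(0.0, 1.0, 400)
        signal = np.sin(2.0 * np.pi * x) + 2.0 * (x > 0.6) + 0.05 * rng.standard_normal(x.size)

        # Edge component: least-squares amplitude of a smoothed Heaviside at the (assumed known) edge
        h = approx_heaviside(x, 0.6)
        hc = h - h.mean()
        amplitude = np.dot(hc, signal - signal.mean()) / np.dot(hc, hc)
        edge_part = amplitude * h

        # Smooth component: smoothing spline fitted to what remains after removing the edge
        smooth_part = UnivariateSpline(x, signal - edge_part, s=1.0)(x)

        reconstruction = smooth_part + edge_part
        print("RMS residual:", np.sqrt(np.mean((reconstruction - signal) ** 2)))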

  9. Spectroscopically Accurate Line Lists for Application in Sulphur Chemistry

    NASA Astrophysics Data System (ADS)

    Underwood, D. S.; Azzam, A. A. A.; Yurchenko, S. N.; Tennyson, J.

    2013-09-01

    for inclusion in standard atmospheric and planetary spectroscopic databases. The methods involved in computing the ab initio potential energy and dipole moment surfaces included minor corrections to the equilibrium S-O distance, which produced good agreement with experimentally determined rotational energies. However, the purely ab initio method was not able to reproduce an equally spectroscopically accurate representation of vibrational motion. We therefore present an empirical refinement of this original ab initio potential surface, based on the available experimental data. This will not only be used to reproduce the room-temperature spectrum to a greater degree of accuracy, but is essential in the production of a larger, accurate line list necessary for the simulation of higher temperature spectra: we aim for coverage suitable for T ≤ 800 K. Our preliminary studies on SO3 have also shown it to exhibit an interesting "forbidden" rotational spectrum and "clustering" of rotational states; to our knowledge this phenomenon has not been observed in other examples of trigonal planar molecules and is also an investigative avenue we wish to pursue. Finally, the IR absorption bands for SO2 and SO3 exhibit a strong overlap, and the inclusion of SO2 as a complement to our studies is something that we will be interested in doing in the near future.

  10. Fast and Accurate Exhaled Breath Ammonia Measurement

    PubMed Central

    Solga, Steven F.; Mudalel, Matthew L.; Spacek, Lisa A.; Risby, Terence H.

    2014-01-01

    This exhaled breath ammonia method uses a fast and highly sensitive spectroscopic technique known as quartz-enhanced photoacoustic spectroscopy (QEPAS), which uses a quantum cascade laser. The monitor is coupled to a sampler that measures mouth pressure and carbon dioxide. The system is temperature controlled and specifically designed to address the reactivity of this compound. The sampler provides immediate feedback to the subject and the technician on the quality of the breath effort. Together with the quick response time of the monitor, this system is capable of accurately measuring exhaled breath ammonia representative of deep lung systemic levels. Because the system is easy to use and produces real-time results, it has enabled experiments to identify factors that influence measurements. For example, mouth rinse and oral pH reproducibly and significantly affect results and therefore must be controlled. Temperature and mode of breathing are other examples. As our understanding of these factors evolves, error is reduced, and clinical studies become more meaningful. This system is very reliable and individual measurements are inexpensive. The sampler is relatively inexpensive and quite portable, but the monitor is neither. This limits options for some clinical studies and provides rationale for future innovations. PMID:24962141

  11. Reproducibility of urinary bisphenol A concentrations measured during pregnancy in the Generation R Study.

    PubMed

    Jusko, Todd A; Shaw, Pamela A; Snijder, Claudia A; Pierik, Frank H; Koch, Holger M; Hauser, Russ; Jaddoe, Vincent W V; Burdorf, Alex; Hofman, Albert; Tiemeier, Henning; Longnecker, Matthew P

    2014-01-01

    The potential human health effects of bisphenol A (BPA) exposure are a public health concern. In order to design adequately powered epidemiological studies to address potential health effects, data on the reproducibility of BPA concentration in serial urine specimens taken during pregnancy are needed. To provide additional data on the reproducibility of maternal urine specimens, 80 women in the Generation R Study (Rotterdam, The Netherlands) contributed a spot urine specimen at <18, 18-25, and >25 weeks of pregnancy. Reproducibility, estimated by the intraclass correlation coefficient (ICC), was 0.32 (95% confidence interval: 0.18-0.46), and, on a creatinine basis, 0.31 (95% confidence interval: 0.16-0.47). Although the ICC observed in the Generation R Study is slightly higher than previous reproducibility studies of BPA, it nevertheless indicates a high degree of within-person variability that presents challenges for designing well-powered epidemiologic studies. PMID:24736100
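
    The intraclass correlation coefficient used here quantifies the share of total variability attributable to between-woman differences rather than visit-to-visit fluctuation. A minimal one-way random-effects ICC calculation is sketched below with invented numbers; the study's actual estimator and confidence intervals (e.g., from a mixed model) may differ.

        import numpy as np

        def icc_oneway(measurements):
            """One-way random-effects ICC for n subjects x k repeated measurements."""
            data = np.asarray(measurements, dtype=float)
            n, k = data.shape
            subject_means = data.mean(axis=1)
            grand_mean = data.mean()
            ms_between = k * np.sum((subject_means - grand_mean) ** 2) / (n - 1)
            ms_within = np.sum((data - subject_means[:, None]) ** 2) / (n * (k - 1))
            return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

        # Invented (log-transformed) BPA concentrations: 5 women, 3 visits each
        urine = [[1.2, 0.8, 1.5],
                 [2.1, 1.9, 2.4],
                 [0.5, 1.1, 0.7],
                 [1.8, 1.3, 1.6],
                 [0.9, 1.4, 1.0]]
        print(f"ICC = {icc_oneway(urine):.2f}")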

  12. Profitable capitation requires accurate costing.

    PubMed

    West, D A; Hicks, L L; Balas, E A; West, T D

    1996-01-01

    In the name of costing accuracy, nurses are asked to track inventory use on a per-treatment basis, when more significant costs, such as general overhead and nursing salaries, are usually allocated to patients or treatments on an average cost basis. Accurate treatment costing and financial viability require analysis of all resources actually consumed in treatment delivery, including nursing services and inventory. More precise costing information enables more profitable decisions, as is demonstrated by comparing the ratio-of-cost-to-treatment method (aggregate costing) with alternative activity-based costing (ABC) methods. Nurses must participate in this costing process to assure that capitation bids are based upon accurate costs rather than simple averages. PMID:8788799

  13. A comprehensive assessment of RNA-seq accuracy, reproducibility and information content by the Sequencing Quality Control consortium

    PubMed Central

    2014-01-01

    We present primary results from the Sequencing Quality Control (SEQC) project, coordinated by the United States Food and Drug Administration. Examining Illumina HiSeq, Life Technologies SOLiD and Roche 454 platforms at multiple laboratory sites using reference RNA samples with built-in controls, we assess RNA sequencing (RNA-seq) performance for junction discovery and differential expression profiling and compare it to microarray and quantitative PCR (qPCR) data using complementary metrics. At all sequencing depths, we discover unannotated exon-exon junctions, with >80% validated by qPCR. We find that measurements of relative expression are accurate and reproducible across sites and platforms if specific filters are used. In contrast, RNA-seq and microarrays do not provide accurate absolute measurements, and gene-specific biases are observed for all examined platforms, including qPCR. Measurement performance depends on the platform and data analysis pipeline, and variation is large for transcript-level profiling. The complete SEQC data sets, comprising >100 billion reads (10 Tb), provide unique resources for evaluating RNA-seq analyses for clinical and regulatory settings. PMID:25150838

  14. A comprehensive assessment of RNA-seq accuracy, reproducibility and information content by the Sequencing Quality Control Consortium.

    PubMed

    2014-09-01

    We present primary results from the Sequencing Quality Control (SEQC) project, coordinated by the US Food and Drug Administration. Examining Illumina HiSeq, Life Technologies SOLiD and Roche 454 platforms at multiple laboratory sites using reference RNA samples with built-in controls, we assess RNA sequencing (RNA-seq) performance for junction discovery and differential expression profiling and compare it to microarray and quantitative PCR (qPCR) data using complementary metrics. At all sequencing depths, we discover unannotated exon-exon junctions, with >80% validated by qPCR. We find that measurements of relative expression are accurate and reproducible across sites and platforms if specific filters are used. In contrast, RNA-seq and microarrays do not provide accurate absolute measurements, and gene-specific biases are observed for all examined platforms, including qPCR. Measurement performance depends on the platform and data analysis pipeline, and variation is large for transcript-level profiling. The complete SEQC data sets, comprising >100 billion reads (10Tb), provide unique resources for evaluating RNA-seq analyses for clinical and regulatory settings.

  16. Reproducibility of Facial Soft Tissue Thickness Measurements Using Cone-Beam CT Images According to the Measurement Methods.

    PubMed

    Hwang, Hyeon-Shik; Choe, Seon-Yeong; Hwang, Ji-Sup; Moon, Da-Nal; Hou, Yanan; Lee, Won-Joon; Wilkinson, Caroline

    2015-07-01

    The purpose of this study was to establish the reproducibility of facial soft tissue (ST) thickness measurements by comparing three different measurement methods applied at 32 landmarks on three-dimensional cone-beam computed tomography (CBCT) images. Two observers carried out the measurements of facial ST thickness of 20 adult subjects using CBCT scan data, and inter- and intra-observer reproducibilities were evaluated. The measurement method of "perpendicular to bone" resulted in high inter- and intra-observer reproducibility at all 32 landmarks. In contrast, the "perpendicular to skin" method and "direct" method, which measures a distance between one point on bone and the other point on skin, presented low reproducibility. The results indicate that reproducibility could be increased by identifying the landmarks on hard tissue images, rather than on ST images, and the landmark description used in this study can be used in the establishment of reliable tissue depth data using CBCT images. PMID:25845397

  17. Evaluation of Statistical Downscaling Skill at Reproducing Extreme Events

    NASA Astrophysics Data System (ADS)

    McGinnis, S. A.; Tye, M. R.; Nychka, D. W.; Mearns, L. O.

    2015-12-01

    Climate model outputs usually have much coarser spatial resolution than is needed by impacts models. Although higher resolution can be achieved using regional climate models for dynamical downscaling, further downscaling is often required. The final resolution gap is often closed with a combination of spatial interpolation and bias correction, which constitutes a form of statistical downscaling. We use this technique to downscale regional climate model data and evaluate its skill in reproducing extreme events. We downscale output from the North American Regional Climate Change Assessment Program (NARCCAP) dataset from its native 50-km spatial resolution to the 4-km resolution of the University of Idaho's METDATA gridded surface meteorological dataset, which derives from the PRISM and NLDAS-2 observational datasets. We operate on the major variables used in impacts analysis at a daily timescale: daily minimum and maximum temperature, precipitation, humidity, pressure, solar radiation, and winds. To interpolate the data, we use the patch recovery method from the Earth System Modeling Framework (ESMF) regridding package. We then bias correct the data using Kernel Density Distribution Mapping (KDDM), which has been shown to exhibit superior overall performance across multiple metrics. Finally, we evaluate the skill of this technique in reproducing extreme events by comparing raw and downscaled output with meteorological station data in different bioclimatic regions according to the skill scores defined by Perkins et al. (2013) for evaluation of AR4 climate models. We also investigate techniques for improving bias correction of values in the tails of the distributions. These techniques include binned kernel density estimation, logspline kernel density estimation, and transfer functions constructed by fitting the tails with a generalized Pareto distribution.
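
    Kernel Density Distribution Mapping corrects model output by matching its distribution to the observed one. The sketch below shows the general distribution-mapping idea using Gaussian kernel density estimates of the two cumulative distributions and simulated gamma-distributed "precipitation"; it is a generic illustration under those assumptions, not the KDDM implementation evaluated in the study.

        import numpy as np
        from scipy.stats import gaussian_kde

        def kde_cdf(samples, grid):
            """Smooth CDF on `grid` from a Gaussian kernel density estimate of `samples`."""
            pdf = gaussian_kde(samples)(grid)
            cdf = np.cumsum(pdf)
            return cdf / cdf[-1]

        def distribution_map(model_hist, obs_hist, model_values):
            """Map model values onto the observed distribution (quantile-mapping style)."""
            lo = min(model_hist.min(), obs_hist.min(), model_values.min())
            hi = max(model_hist.max(), obs_hist.max(), model_values.max())
            grid = np.linspace(lo, hi, 512)
            cdf_model = kde_cdf(model_hist, grid)
            cdf_obs = kde_cdf(obs_hist, grid)
            # model value -> non-exceedance probability -> observed value at that probability
            probs = np.interp(model_values, grid, cdf_model)
            return np.interp(probs, cdf_obs, grid)

        rng = np.random.default_rng(0)
        obs = rng.gamma(shape=2.0, scale=3.0, size=2000)    # "observed" daily rainfall
        model = rng.gamma(shape=2.0, scale=4.0, size=2000)  # biased model output
        corrected = distribution_map(model, obs, model)
        print("means (obs, model, corrected):", obs.mean(), model.mean(), corrected.mean())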

  18. Crowdsourcing reproducible seizure forecasting in human and canine epilepsy.

    PubMed

    Brinkmann, Benjamin H; Wagenaar, Joost; Abbot, Drew; Adkins, Phillip; Bosshard, Simone C; Chen, Min; Tieng, Quang M; He, Jialune; Muñoz-Almaraz, F J; Botella-Rocamora, Paloma; Pardo, Juan; Zamora-Martinez, Francisco; Hills, Michael; Wu, Wei; Korshunova, Iryna; Cukierski, Will; Vite, Charles; Patterson, Edward E; Litt, Brian; Worrell, Gregory A

    2016-06-01

    See Mormann and Andrzejak (doi:10.1093/brain/aww091) for a scientific commentary on this article. Accurate forecasting of epileptic seizures has the potential to transform clinical epilepsy care. However, progress toward reliable seizure forecasting has been hampered by lack of open access to long duration recordings with an adequate number of seizures for investigators to rigorously compare algorithms and results. A seizure forecasting competition was conducted on kaggle.com using open access chronic ambulatory intracranial electroencephalography from five canines with naturally occurring epilepsy and two humans undergoing prolonged wide bandwidth intracranial electroencephalographic monitoring. Data were provided to participants as 10-min interictal and preictal clips, with approximately half of the 60 GB data bundle labelled (interictal/preictal) for algorithm training and half unlabelled for evaluation. The contestants developed custom algorithms and uploaded their classifications (interictal/preictal) for the unknown testing data, and a randomly selected 40% of data segments were scored and results broadcast on a public leader board. The contest ran from August to November 2014, and 654 participants submitted 17 856 classifications of the unlabelled test data. The top performing entry scored 0.84 area under the classification curve. Following the contest, additional held-out unlabelled data clips were provided to the top 10 participants and they submitted classifications for the new unseen data. The resulting areas under the classification curves were well above chance forecasting, but did show a mean 6.54 ± 2.45% (min, max: 0.30, 20.2) decline in performance. The kaggle.com model using open access data and algorithms generated reproducible research that advanced seizure forecasting. The overall performance from multiple contestants on unseen data was better than a random predictor, and demonstrates the feasibility of seizure forecasting in canine and human epilepsy.
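
    Contest entries were ranked by the area under the classification (ROC) curve computed from the submitted preictal/interictal scores. A minimal illustration with invented labels and scores is shown below; scikit-learn's roc_auc_score is used purely as an example scorer, not as the contest's official evaluation code.

        import numpy as np
        from sklearn.metrics import roc_auc_score

        # Invented per-clip scores: 1 = preictal, 0 = interictal
        true_labels = np.array([0, 0, 0, 0, 1, 1, 0, 1, 0, 1, 1, 0])
        predicted_scores = np.array([0.10, 0.35, 0.20, 0.48, 0.80, 0.40,
                                     0.30, 0.91, 0.22, 0.62, 0.74, 0.41])

        auc = roc_auc_score(true_labels, predicted_scores)
        print(f"Area under the ROC curve: {auc:.2f}")  # 1.0 = perfect, 0.5 = chance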

  19. Reproducibility of urinary phthalate metabolites in first morning urine samples.

    PubMed Central

    Hoppin, Jane A; Brock, John W; Davis, Barbara J; Baird, Donna D

    2002-01-01

    Phthalates are ubiquitous in our modern environment because of their use in plastics and cosmetic products. Phthalate monoesters--primarily monoethylhexyl phthalate and monobutyl phthalate--are reproductive and developmental toxicants in animals. Accurate measures of phthalate exposure are needed to assess their human health effects. Phthalate monoesters have a biologic half-life of approximately 12 hr, and little is known about the temporal variability and daily reproducibility of urinary measures in humans. To explore these aspects, we measured seven phthalate monoesters and creatinine concentration in two consecutive first-morning urine specimens from 46 African-American women, ages 35-49 years, residing in the Washington, DC, area in 1996-1997. We measured phthalate monoesters using high-pressure liquid chromatography followed by tandem mass spectrometry on a triple quadrupole instrument using atmospheric pressure chemical ionization. We detected four phthalate monoesters in all subjects, with median levels of 31 ng/mL for monobenzyl phthalate (mBzP), 53 ng/mL for monobutyl phthalate (mBP), 211 ng/mL for monoethyl phthalate (mEP), and 7.3 ng/mL for monoethylhexyl phthalate (mEHP). These were similar to concentrations reported for other populations using spot urine specimens. Phthalate levels did not differ between the two sampling days. The Pearson correlation coefficient between the concentrations on the 2 days was 0.8 for mBP, 0.7 for mEHP, 0.6 for mEP, and 0.5 for mBzP. These results suggest that even with the short half-lives of phthalates, women's patterns of exposure may be sufficiently stable to assign an exposure level based on a single first morning void urine measurement. PMID:12003755

  1. Crowdsourcing reproducible seizure forecasting in human and canine epilepsy

    PubMed Central

    Wagenaar, Joost; Abbot, Drew; Adkins, Phillip; Bosshard, Simone C.; Chen, Min; Tieng, Quang M.; He, Jialune; Muñoz-Almaraz, F. J.; Botella-Rocamora, Paloma; Pardo, Juan; Zamora-Martinez, Francisco; Hills, Michael; Wu, Wei; Korshunova, Iryna; Cukierski, Will; Vite, Charles; Patterson, Edward E.; Litt, Brian; Worrell, Gregory A.

    2016-01-01

    See Mormann and Andrzejak (doi:10.1093/brain/aww091) for a scientific commentary on this article.   Accurate forecasting of epileptic seizures has the potential to transform clinical epilepsy care. However, progress toward reliable seizure forecasting has been hampered by lack of open access to long duration recordings with an adequate number of seizures for investigators to rigorously compare algorithms and results. A seizure forecasting competition was conducted on kaggle.com using open access chronic ambulatory intracranial electroencephalography from five canines with naturally occurring epilepsy and two humans undergoing prolonged wide bandwidth intracranial electroencephalographic monitoring. Data were provided to participants as 10-min interictal and preictal clips, with approximately half of the 60 GB data bundle labelled (interictal/preictal) for algorithm training and half unlabelled for evaluation. The contestants developed custom algorithms and uploaded their classifications (interictal/preictal) for the unknown testing data, and a randomly selected 40% of data segments were scored and results broadcasted on a public leader board. The contest ran from August to November 2014, and 654 participants submitted 17 856 classifications of the unlabelled test data. The top performing entry scored 0.84 area under the classification curve. Following the contest, additional held-out unlabelled data clips were provided to the top 10 participants and they submitted classifications for the new unseen data. The resulting area under the classification curves were well above chance forecasting, but did show a mean 6.54 ± 2.45% (min, max: 0.30, 20.2) decline in performance. The kaggle.com model using open access data and algorithms generated reproducible research that advanced seizure forecasting. The overall performance from multiple contestants on unseen data was better than a random predictor, and demonstrates the feasibility of seizure forecasting in canine and

  2. Accurate Weather Forecasting for Radio Astronomy

    NASA Astrophysics Data System (ADS)

    Maddalena, Ronald J.

    2010-01-01

    The NRAO Green Bank Telescope routinely observes at wavelengths from 3 mm to 1 m. As with all mm-wave telescopes, observing conditions depend upon the variable atmospheric water content. The site provides over 100 days/yr when opacities are low enough for good observing at 3 mm, but winds on the open-air structure reduce the time suitable for 3-mm observing where pointing is critical. Thus, to maximize productivity, the observing wavelength needs to match weather conditions. For 6 years the telescope has used a dynamic scheduling system (recently upgraded; www.gb.nrao.edu/DSS) that requires accurate multi-day forecasts for winds and opacities. Since opacity forecasts are not provided by the National Weather Service (NWS), I have developed an automated system that takes available forecasts, derives forecasted opacities, and deploys the results on the web in user-friendly graphical overviews (www.gb.nrao.edu/~rmaddale/Weather). The system relies on the "North American Mesoscale" models, which are updated by the NWS every 6 hrs, have a 12 km horizontal resolution, 1 hr temporal resolution, run to 84 hrs, and have 60 vertical layers that extend to 20 km. Each forecast consists of a time series of ground conditions, cloud coverage, etc., and, most importantly, temperature, pressure, and humidity as a function of height. I use Liebe's MWP model (Radio Science, 20, 1069, 1985) to determine the absorption in each layer for each hour for 30 observing wavelengths. Radiative transfer provides, for each hour and wavelength, the total opacity and the radio brightness of the atmosphere, which contributes substantially at some wavelengths to Tsys and the observational noise. Comparisons of measured and forecasted Tsys at 22.2 and 44 GHz imply that the forecasted opacities are good to about 0.01 nepers, which is sufficient for forecasting and accurate calibration. Reliability is high out to 2 days and degrades slowly for longer-range forecasts.
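
    For each hour and wavelength, the forecasting chain described above reduces to summing layer opacities and attenuating each layer's emission by the opacity between that layer and the telescope. A much-simplified zenith-only sketch is given below with invented layer values; the operational system instead derives absorption coefficients from Liebe's model and the full forecast profiles.

        import numpy as np

        def zenith_opacity_and_tsky(absorption, thickness, temperature):
            """Total zenith opacity (nepers) and sky brightness temperature (K) from
            per-layer absorption coefficients (nepers/km), thicknesses (km) and
            physical temperatures (K), with layers ordered from the ground upward."""
            tau_layers = np.asarray(absorption) * np.asarray(thickness)
            total_tau = tau_layers.sum()
            t_sky = 0.0
            tau_below = 0.0  # opacity between the current layer and the telescope
            for tau_i, temp_i in zip(tau_layers, temperature):
                t_sky += temp_i * (1.0 - np.exp(-tau_i)) * np.exp(-tau_below)
                tau_below += tau_i
            return total_tau, t_sky

        # Invented 5-layer atmosphere at a single frequency
        absorption = [0.010, 0.008, 0.005, 0.002, 0.001]    # nepers per km
        thickness = [1.0, 2.0, 3.0, 5.0, 9.0]               # km
        temperature = [285.0, 275.0, 260.0, 240.0, 220.0]   # K

        tau, t_sky = zenith_opacity_and_tsky(absorption, thickness, temperature)
        print(f"zenith opacity = {tau:.3f} nepers, sky brightness = {t_sky:.1f} K")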

  3. Accurate documentation and wound measurement.

    PubMed

    Hampton, Sylvie

    This article, part 4 in a series on wound management, addresses the sometimes routine yet crucial task of documentation. Clear and accurate records of a wound enable its progress to be determined so the appropriate treatment can be applied. Thorough records mean any practitioner picking up a patient's notes will know when the wound was last checked, how it looked and what dressing and/or treatment was applied, ensuring continuity of care. Documenting every assessment also has legal implications, demonstrating due consideration and care of the patient and the rationale for any treatment carried out. Part 5 in the series discusses wound dressing characteristics and selection.

  4. A neural coding scheme reproducing foraging trajectories

    PubMed Central

    Gutiérrez, Esther D.; Cabrera, Juan Luis

    2015-01-01

    The movement of many animals may follow Lévy patterns. The underlying generating neuronal dynamics of such a behavior is unknown. In this paper we show that a novel discovery of multifractality in winnerless competition (WLC) systems reveals a potential encoding mechanism that is translatable into two dimensional superdiffusive Lévy movements. The validity of our approach is tested on a conductance based neuronal model showing WLC and through the extraction of Lévy flights inducing fractals from recordings of rat hippocampus during open field foraging. Further insights are gained analyzing mice motor cortex neurons and non motor cell signals. The proposed mechanism provides a plausible explanation for the neuro-dynamical fundamentals of spatial searching patterns observed in animals (including humans) and illustrates an until now unknown way to encode information in neuronal temporal series. PMID:26648311

  5. A neural coding scheme reproducing foraging trajectories

    NASA Astrophysics Data System (ADS)

    Gutiérrez, Esther D.; Cabrera, Juan Luis

    2015-12-01

    The movement of many animals may follow Lévy patterns. The underlying generating neuronal dynamics of such a behavior is unknown. In this paper we show that a novel discovery of multifractality in winnerless competition (WLC) systems reveals a potential encoding mechanism that is translatable into two dimensional superdiffusive Lévy movements. The validity of our approach is tested on a conductance based neuronal model showing WLC and through the extraction of Lévy flights inducing fractals from recordings of rat hippocampus during open field foraging. Further insights are gained analyzing mice motor cortex neurons and non motor cell signals. The proposed mechanism provides a plausible explanation for the neuro-dynamical fundamentals of spatial searching patterns observed in animals (including humans) and illustrates an until now unknown way to encode information in neuronal temporal series.

  6. A neural coding scheme reproducing foraging trajectories.

    PubMed

    Gutiérrez, Esther D; Cabrera, Juan Luis

    2015-12-09

    The movement of many animals may follow Lévy patterns. The underlying generating neuronal dynamics of such a behavior is unknown. In this paper we show that a novel discovery of multifractality in winnerless competition (WLC) systems reveals a potential encoding mechanism that is translatable into two dimensional superdiffusive Lévy movements. The validity of our approach is tested on a conductance based neuronal model showing WLC and through the extraction of Lévy flights inducing fractals from recordings of rat hippocampus during open field foraging. Further insights are gained analyzing mice motor cortex neurons and non motor cell signals. The proposed mechanism provides a plausible explanation for the neuro-dynamical fundamentals of spatial searching patterns observed in animals (including humans) and illustrates an until now unknown way to encode information in neuronal temporal series.

  7. Towards Reproducible Descriptions of Neuronal Network Models

    PubMed Central

    Nordlie, Eilen; Gewaltig, Marc-Oliver; Plesser, Hans Ekkehard

    2009-01-01

    Progress in science depends on the effective exchange of ideas among scientists. New ideas can be assessed and criticized in a meaningful manner only if they are formulated precisely. This applies to simulation studies as well as to experiments and theories. But after more than 50 years of neuronal network simulations, we still lack a clear and common understanding of the role of computational models in neuroscience as well as established practices for describing network models in publications. This hinders the critical evaluation of network models as well as their re-use. We analyze here 14 research papers proposing neuronal network models of different complexity and find widely varying approaches to model descriptions, with regard to both the means of description and the ordering and placement of material. We further observe great variation in the graphical representation of networks and the notation used in equations. Based on our observations, we propose a good model description practice, composed of guidelines for the organization of publications, a checklist for model descriptions, templates for tables presenting model structure, and guidelines for diagrams of networks. The main purpose of this good practice is to trigger a debate about the communication of neuronal network models in a manner comprehensible to humans, as opposed to machine-readable model description languages. We believe that the good model description practice proposed here, together with a number of other recent initiatives on data-, model-, and software-sharing, may lead to a deeper and more fruitful exchange of ideas among computational neuroscientists in years to come. We further hope that work on standardized ways of describing—and thinking about—complex neuronal networks will lead the scientific community to a clearer understanding of high-level concepts in network dynamics, and will thus lead to deeper insights into the function of the brain. PMID:19662159

  8. Assessment of a climate model to reproduce rainfall variability and extremes over Southern Africa

    NASA Astrophysics Data System (ADS)

    Williams, C. J. R.; Kniveton, D. R.; Layberry, R.

    2010-01-01

    It is increasingly accepted that any possible climate change will not only have an influence on mean climate but may also significantly alter climatic variability. A change in the distribution and magnitude of extreme rainfall events (associated with changing variability), such as droughts or flooding, may have a far greater impact on human and natural systems than a changing mean. This issue is of particular importance for environmentally vulnerable regions such as southern Africa. The sub-continent is considered especially vulnerable to and ill-equipped (in terms of adaptation) for extreme events, due to a number of factors including extensive poverty, famine, disease and political instability. Rainfall variability and the identification of rainfall extremes is a function of scale, so high spatial and temporal resolution data are preferred to identify extreme events and accurately predict future variability. The majority of previous climate model verification studies have compared model output with observational data at monthly timescales. In this research, the ability of a state-of-the-art climate model to simulate climate at daily timescales is assessed using satellite-derived rainfall data from the Microwave Infrared Rainfall Algorithm (MIRA). This dataset covers the period from 1993 to 2002 and the whole of southern Africa at a spatial resolution of 0.1° longitude/latitude. This paper concentrates primarily on the ability of the model to simulate the spatial and temporal patterns of present-day rainfall variability over southern Africa and is not intended to discuss possible future changes in climate as these have been documented elsewhere. Simulations of current climate from the UK Meteorological Office Hadley Centre's climate model, in both regional and global mode, are firstly compared to the MIRA dataset at daily timescales. Secondly, the ability of the model to reproduce daily rainfall extremes is assessed, again by a comparison with

  9. Inter-platform reproducibility of liver and spleen stiffness measured with MR Elastography

    PubMed Central

    Yasar, Temel Kaya; Wagner, Mathilde; Bane, Octavia; Besa, Cecilia; Babb, James S; Kannengiesser, Stephan; Fung, Maggie; Ehman, Richard L.; Taouli, Bachir

    2016-01-01

    Purpose To assess inter-platform reproducibility of liver stiffness (LS) and spleen stiffness (SS) measured with MR elastography (MRE) based on a 2D GRE sequence. Materials and Methods This prospective HIPAA-compliant and IRB-approved study involved 12 subjects (5 healthy volunteers and 7 patients with liver disease). A multi-slice 2D-GRE-based MRE sequence was performed using two systems from different vendors (3.0T GE and 1.5T Siemens) on the same day. Two independent observers measured LS and SS on confidence maps. Bland-Altman analysis (with coefficient of reproducibility, CR), coefficient of variability (CV) and intraclass correlation (ICC) were used to analyze inter-platform, intra- and inter-observer variability. Human data was validated using a gelatin-based phantom. Results There was excellent reproducibility of phantom stiffness measurement (CV 4.4%). Mean LS values were 3.44–3.48 kPa and 3.62–3.63 kPa, and mean SS values were 7.54–7.91 kPa and 8.40–8.85 kPa at 3.0T and 1.5T for observers 1 and 2, respectively. The mean CVs between platforms were 9.2%–11.5% and 13.1%–14.4% for LS and SS, respectively for observers 1 and 2. There was excellent inter-platform reproducibility (ICC >0.88 and CR <36.2%) for both LS and SS, and excellent intra- and inter-observer reproducibility (intra-observer: ICC >0.99, CV <2.1%, CR <6.6%; inter-observer: ICC >0.97, CV and CR <16%). Conclusion This study demonstrates that 2D-GRE MRE provides platform- and observer-independent LS and SS measurements. PMID:26469708
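
    For readers unfamiliar with the agreement statistics quoted above, the sketch below computes a Bland-Altman bias with 95% limits of agreement and a mean within-pair coefficient of variation for hypothetical paired liver-stiffness values; the exact definitions of CR and CV used in the study may differ slightly from this minimal version.

        import numpy as np

        def agreement_stats(stiffness_a, stiffness_b):
            """Bland-Altman bias and 95% limits of agreement, plus a mean within-pair
            coefficient of variation, for paired stiffness measurements (kPa)."""
            a = np.asarray(stiffness_a, dtype=float)
            b = np.asarray(stiffness_b, dtype=float)
            diff = a - b
            bias = diff.mean()
            half_width = 1.96 * diff.std(ddof=1)
            pair_means = (a + b) / 2.0
            pair_sds = np.abs(diff) / np.sqrt(2.0)  # SD of each pair of two values
            cv_percent = np.mean(pair_sds / pair_means) * 100.0
            return bias, (bias - half_width, bias + half_width), cv_percent

        # Invented liver stiffness (kPa) for six subjects scanned on two platforms
        platform_a = [3.1, 3.5, 2.9, 4.2, 3.8, 3.3]
        platform_b = [3.4, 3.6, 3.2, 4.6, 3.9, 3.5]
        bias, limits, cv = agreement_stats(platform_a, platform_b)
        print(f"bias = {bias:.2f} kPa, 95% limits = ({limits[0]:.2f}, {limits[1]:.2f}) kPa, CV = {cv:.1f}%")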

  10. Assessing the accuracy and reproducibility of modality independent elastography in a murine model of breast cancer

    PubMed Central

    Weis, Jared A.; Flint, Katelyn M.; Sanchez, Violeta; Yankeelov, Thomas E.; Miga, Michael I.

    2015-01-01

    Abstract. Cancer progression has been linked to mechanics. Therefore, there has been recent interest in developing noninvasive imaging tools for cancer assessment that are sensitive to changes in tissue mechanical properties. We have developed one such method, modality independent elastography (MIE), that estimates the relative elastic properties of tissue by fitting anatomical image volumes acquired before and after the application of compression to biomechanical models. The aim of this study was to assess the accuracy and reproducibility of the method using phantoms and a murine breast cancer model. Magnetic resonance imaging data were acquired, and the MIE method was used to estimate relative volumetric stiffness. Accuracy was assessed using phantom data by comparing to gold-standard mechanical testing of elasticity ratios. Validation error was <12%. Reproducibility analysis was performed on animal data, and within-subject coefficients of variation ranged from 2 to 13% at the bulk level and 32% at the voxel level. To our knowledge, this is the first study to assess the reproducibility of an elasticity imaging metric in a preclinical cancer model. Our results suggest that the MIE method can reproducibly generate accurate estimates of the relative mechanical stiffness and provide guidance on the degree of change needed in order to declare biological changes rather than experimental error in future therapeutic studies. PMID:26158120

  11. Reproducibility study of TLD-100 micro-cubes at radiotherapy dose level.

    PubMed

    da Rosa, L A; Regulla, D F; Fill, U A

    1999-03-01

    The precision of the thermoluminescent response of Harshaw micro-cube dosimeters (TLD-100), evaluated in both Harshaw thermoluminescent readers 5500 and 3500, for 1 Gy dose value, was investigated. The mean reproducibility for micro-cubes, pre-readout annealed at 100 degrees C for 15 min, evaluated with the manual planchet reader 3500, is 0.61% (1 standard deviation). When micro-cubes are evaluated with the automated hot-gas reader 5500, reproducibility values are undoubtedly worse, mean reproducibility for numerically stabilised dosimeters being equal to 3.27% (1 standard deviation). These results indicate that the reader model 5500, or, at least, the instrument used for the present measurements, is not adequate for micro-cube evaluation, if precise and accurate dosimetry is required. The difference in precision is apparently due to geometry inconsistencies in the orientation of the imperfect micro-cube faces during readout, requiring careful and manual reproducible arrangement of the selected micro-cube faces in contact with the manual reader planchet.

  13. Epigenetic variation in asexually reproducing organisms.

    PubMed

    Verhoeven, Koen J F; Preite, Veronica

    2014-03-01

    The role that epigenetic inheritance can play in adaptation may differ between sexuals and asexuals because (1) the dynamics of adaptation differ under sexual and asexual reproduction and the opportunities offered by epigenetic inheritance may affect these dynamics differently; and (2) in asexual reproduction epigenetic reprogramming mechanisms that are associated with meiosis can be bypassed, which could promote the buildup of epigenetic variation in asexuals. Here, we evaluate current evidence for an epigenetic contribution to adaptation in asexuals. We argue that two aspects of epigenetic variation should have particular relevance for asexuals, namely epigenetics-mediated phenotypic plasticity within and between generations, and heritable variation via stochastic epimutations. An evaluation of epigenetic reprogramming mechanisms suggests that some, but not all, forms of asexual reproduction enhance the likelihood of stable transmission of epigenetic marks across generations compared to sexual reproduction. However, direct tests of these predicted sexual-asexual differences are virtually lacking. Stable transmission of DNA methylation, transcriptomes, and phenotypes from parent to clonal offspring are demonstrated in various asexual species, and clonal genotypes from natural populations show habitat-specific DNA methylation. We discuss how these initial observations can be extended to demonstrate an epigenetic contribution to adaptation. PMID:24274255

  14. Banff Initiative for Quality Assurance in Transplantation (BIFQUIT): Reproducibility of Polyomavirus Immunohistochemistry in Kidney Allografts

    PubMed Central

    Adam, Benjamin; Randhawa, Parmjeet; Chan, Samantha; Zeng, Gang; Regele, Heinz; Kushner, Yael B.; Colvin, Robert B.; Reeve, Jeff; Mengel, Michael

    2014-01-01

    Immunohistochemistry is the gold standard for diagnosing (positive versus negative) polyomavirus BK (BKV) nephropathy and has the potential for disease staging based on staining intensity and quantification of infected cells. This multicenter trial evaluated the reproducibility of BKV immunohistochemistry among 81 pathologists at 60 institutions. Participants stained tissue microarray slides and scored them for staining intensity and percentage of positive nuclei. Staining protocol details and evaluation scores were collected online. Slides were returned for centralized panel re-evaluation and kappa statistics were calculated. Individual assessment of staining intensity and percentage was more reproducible than combined scoring. Inter-institutional reproducibility was moderate for staining intensity (κ=0.49) and percentage (κ=0.42), fair for combined (κ=0.25), and best for simple positive/negative scoring (κ=0.63). Inter-observer reproducibility was substantial for intensity (κ=0.74), percentage (κ=0.66), and positive/negative (κ=0.67), and moderate for combined scoring (κ=0.43). Inter-laboratory reproducibility was fair for intensity (κ=0.37), percentage (κ=0.40), and combined (κ=0.24), but substantial for positive/negative scoring (κ=0.78). BKV RNA copies/cell correlated with staining intensity (r=0.56) and percentage (r=0.62). These results indicate that BKV immunohistochemistry is reproducible between observers but scoring should be simplified to a single-feature schema. Standardization of tissue processing and staining protocols would further improve inter-laboratory reproducibility. PMID:25091177
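
    The kappa statistics reported above measure chance-corrected agreement. As a minimal example, the snippet below computes Cohen's kappa for two hypothetical observers scoring the same cores positive/negative; the trial itself pooled many raters and laboratories, so its kappa calculations are more involved than this pairwise case.

        from sklearn.metrics import cohen_kappa_score

        # Invented positive/negative BKV calls by two observers on the same twelve cores
        observer_1 = ["pos", "pos", "neg", "neg", "pos", "neg", "neg", "pos", "neg", "neg", "pos", "neg"]
        observer_2 = ["pos", "pos", "neg", "pos", "pos", "neg", "neg", "pos", "neg", "neg", "neg", "neg"]

        kappa = cohen_kappa_score(observer_1, observer_2)
        print(f"Cohen's kappa = {kappa:.2f}")  # 1 = perfect agreement, 0 = chance-level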

  15. An Open Science and Reproducible Research Primer for Landscape Ecologists

    EPA Science Inventory

    In recent years many funding agencies, some publishers, and even the United States government have enacted policies that encourage open science and strive for reproducibility; however, the knowledge and skills to implement open science and enable reproducible research are not yet...

  16. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 2 2012-10-01 2012-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3... APPURTENANCES Selection and Limitations of Piping Joints § 56.30-3 Piping joints (reproduces 110). The type of piping joint used shall be suitable for the design conditions and shall be selected with consideration...

  17. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3... APPURTENANCES Selection and Limitations of Piping Joints § 56.30-3 Piping joints (reproduces 110). The type of piping joint used shall be suitable for the design conditions and shall be selected with consideration...

  18. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 2 2014-10-01 2014-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3... APPURTENANCES Selection and Limitations of Piping Joints § 56.30-3 Piping joints (reproduces 110). The type of piping joint used shall be suitable for the design conditions and shall be selected with consideration...

  19. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 2 2013-10-01 2013-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3... APPURTENANCES Selection and Limitations of Piping Joints § 56.30-3 Piping joints (reproduces 110). The type of piping joint used shall be suitable for the design conditions and shall be selected with consideration...

  20. 10 CFR 95.43 - Authority to reproduce.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Authority to reproduce. 95.43 Section 95.43 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) FACILITY SECURITY CLEARANCE AND SAFEGUARDING OF NATIONAL SECURITY INFORMATION AND RESTRICTED DATA Control of Information § 95.43 Authority to reproduce. (a) Each...

  1. An open investigation of the reproducibility of cancer biology research.

    PubMed

    Errington, Timothy M; Iorns, Elizabeth; Gunn, William; Tan, Fraser Elisabeth; Lomax, Joelle; Nosek, Brian A

    2014-12-10

    It is widely believed that research that builds upon previously published findings has reproduced the original work. However, it is rare for researchers to perform or publish direct replications of existing results. The Reproducibility Project: Cancer Biology is an open investigation of reproducibility in preclinical cancer biology research. We have identified 50 high impact cancer biology articles published in the period 2010-2012, and plan to replicate a subset of experimental results from each article. A Registered Report detailing the proposed experimental designs and protocols for each subset of experiments will be peer reviewed and published prior to data collection. The results of these experiments will then be published in a Replication Study. The resulting open methodology and dataset will provide evidence about the reproducibility of high-impact results, and an opportunity to identify predictors of reproducibility.

  2. Towards Accurate Application Characterization for Exascale (APEX)

    SciTech Connect

    Hammond, Simon David

    2015-09-01

    Sandia National Laboratories has been engaged in hardware and software codesign activities for a number of years; indeed, it might be argued that prototyping of clusters as far back as the CPLANT machines and many large capability resources including ASCI Red and RedStorm were examples of codesigned solutions. As the research supporting our codesign activities has moved closer to investigating on-node runtime behavior, a natural hunger has grown for detailed analysis of both hardware and algorithm performance from the perspective of low-level operations. The Application Characterization for Exascale (APEX) LDRD was a project conceived to address some of these concerns. Primarily, the research was intended to focus on generating accurate and reproducible low-level performance metrics using tools that could scale to production-class code bases. Alongside this research was an advocacy and analysis role associated with evaluating tools for production use, working with leading industry vendors to develop and refine solutions required by our code teams, and directly engaging with production code developers to form a context for the application analysis and a bridge to the research community within Sandia. On each of these accounts significant progress has been made, particularly, as this report will cover, in the low-level analysis of operations for important classes of algorithms. This report summarizes the development of a collection of tools under the APEX research program and leaves to other SAND and L2 milestone reports the description of codesign progress with Sandia’s production users/developers.

  3. Accurate thickness measurement of graphene

    NASA Astrophysics Data System (ADS)

    Shearer, Cameron J.; Slattery, Ashley D.; Stapleton, Andrew J.; Shapter, Joseph G.; Gibson, Christopher T.

    2016-03-01

    Graphene has emerged as a material with a vast variety of applications. The electronic, optical and mechanical properties of graphene are strongly influenced by the number of layers present in a sample. As a result, the dimensional characterization of graphene films is crucial, especially with the continued development of new synthesis methods and applications. A number of techniques exist to determine the thickness of graphene films including optical contrast, Raman scattering and scanning probe microscopy techniques. Atomic force microscopy (AFM), in particular, is used extensively since it provides three-dimensional images that enable the measurement of the lateral dimensions of graphene films as well as the thickness, and by extension the number of layers present. However, in the literature AFM has proven to be inaccurate with a wide range of measured values for single layer graphene thickness reported (between 0.4 and 1.7 nm). This discrepancy has been attributed to tip-surface interactions, image feedback settings and surface chemistry. In this work, we use standard and carbon nanotube modified AFM probes and a relatively new AFM imaging mode known as PeakForce tapping mode to establish a protocol that will allow users to accurately determine the thickness of graphene films. In particular, the error in measuring the first layer is reduced from 0.1-1.3 nm to 0.1-0.3 nm. Furthermore, in the process we establish that the graphene-substrate adsorbate layer and imaging force, in particular the pressure the tip exerts on the surface, are crucial components in the accurate measurement of graphene using AFM. These findings can be applied to other 2D materials.

  4. Accurate thickness measurement of graphene.

    PubMed

    Shearer, Cameron J; Slattery, Ashley D; Stapleton, Andrew J; Shapter, Joseph G; Gibson, Christopher T

    2016-03-29

    Graphene has emerged as a material with a vast variety of applications. The electronic, optical and mechanical properties of graphene are strongly influenced by the number of layers present in a sample. As a result, the dimensional characterization of graphene films is crucial, especially with the continued development of new synthesis methods and applications. A number of techniques exist to determine the thickness of graphene films including optical contrast, Raman scattering and scanning probe microscopy techniques. Atomic force microscopy (AFM), in particular, is used extensively since it provides three-dimensional images that enable the measurement of the lateral dimensions of graphene films as well as the thickness, and by extension the number of layers present. However, in the literature AFM has proven to be inaccurate with a wide range of measured values for single layer graphene thickness reported (between 0.4 and 1.7 nm). This discrepancy has been attributed to tip-surface interactions, image feedback settings and surface chemistry. In this work, we use standard and carbon nanotube modified AFM probes and a relatively new AFM imaging mode known as PeakForce tapping mode to establish a protocol that will allow users to accurately determine the thickness of graphene films. In particular, the error in measuring the first layer is reduced from 0.1-1.3 nm to 0.1-0.3 nm. Furthermore, in the process we establish that the graphene-substrate adsorbate layer and imaging force, in particular the pressure the tip exerts on the surface, are crucial components in the accurate measurement of graphene using AFM. These findings can be applied to other 2D materials.

  5. Rainfall variability and extremes over southern Africa: assessment of a climate model to reproduce daily extremes

    NASA Astrophysics Data System (ADS)

    Williams, C.; Kniveton, D.; Layberry, R.

    2009-04-01

    It is increasingly accepted that any possible climate change will not only have an influence on mean climate but may also significantly alter climatic variability. A change in the distribution and magnitude of extreme rainfall events (associated with changing variability), such as droughts or flooding, may have a far greater impact on human and natural systems than a changing mean. This issue is of particular importance for environmentally vulnerable regions such as southern Africa. The subcontinent is considered especially vulnerable to and ill-equipped (in terms of adaptation) for extreme events, due to a number of factors including extensive poverty, famine, disease and political instability. Rainfall variability and the identification of rainfall extremes is a function of scale, so high spatial and temporal resolution data are preferred to identify extreme events and accurately predict future variability. The majority of previous climate model verification studies have compared model output with observational data at monthly timescales. In this research, the ability of a state-of-the-art climate model to simulate climate at daily timescales is assessed using satellite-derived rainfall data from the Microwave Infra-Red Algorithm (MIRA). This dataset covers the period from 1993-2002 and the whole of southern Africa at a spatial resolution of 0.1 degree longitude/latitude. The ability of a climate model to simulate current climate provides some indication of how much confidence can be applied to its future predictions. In this paper, simulations of current climate from the UK Meteorological Office Hadley Centre's climate model, in both regional and global mode, are firstly compared to the MIRA dataset at daily timescales. This concentrates primarily on the ability of the model to simulate the spatial and temporal patterns of rainfall variability over southern Africa. Secondly, the ability of the model to reproduce daily rainfall extremes will

  6. Analytical strategies for improving the robustness and reproducibility of bioluminescent microbial bioreporters.

    PubMed

    Roda, Aldo; Roda, Barbara; Cevenini, Luca; Michelini, Elisa; Mezzanotte, Laura; Reschiglian, Pierluigi; Hakkila, Kaisa; Virta, Marko

    2011-07-01

    Whole-cell bioluminescent (BL) bioreporter technology is a useful analytical tool for developing biosensors for environmental toxicology and preclinical studies. However, when applied to real samples, several methodological problems prevent it from being widely used. Here, we propose a methodological approach for improving its analytical performance with complex matrices. We developed bioluminescent Escherichia coli and Saccharomyces cerevisiae bioreporters for copper ion detection. In the same cell, we introduced two firefly luciferases requiring the same luciferin substrate but emitting at different wavelengths. The expression of one was copper ion specific. The other, constitutively expressed, was used as a cell viability internal control. Engineered BL cells were characterized using the noninvasive gravitational field-flow fractionation (GrFFF) technique, and a homogeneous cell population was isolated. Cells were then immobilized in a polymeric matrix, improving cell responsiveness. The bioassay was performed in 384-well black polystyrene microtiter plates directly on the sample. After 2 h of incubation at 37 °C and the addition of the luciferin, we measured the emitted light. These dual-color bioreporters showed more robustness and a wider dynamic range than bioassays based on the same strains with a single reporter gene that use a separate cell strain as a BL control. The internal correction allowed accurate evaluation of the copper content even in simulated toxic samples, where reduced cell viability was observed. Homogeneous cells isolated by GrFFF showed improvement in method reproducibility, particularly for yeast cells. The applicability of these bioreporters to real samples was demonstrated in tap water and wastewater treatment plant effluent samples spiked with copper and other metal ions. PMID:21603915
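
    The core of the dual-color internal correction is a ratio of the analyte-inducible luciferase signal to the constitutively expressed viability signal. A minimal sketch with invented relative-light-unit values is shown below; real quantification would additionally rely on a calibration (dose-response) curve rather than a single ratio.

        def viability_corrected_response(specific_signal, control_signal,
                                         specific_blank, control_blank):
            """Ratio of the analyte-induced luminescence to the constitutive viability
            signal, each blank-subtracted, so that toxicity-driven loss of viable cells
            does not mimic a lower analyte response."""
            specific = specific_signal - specific_blank
            control = control_signal - control_blank
            if control <= 0:
                raise ValueError("no viable-cell signal: sample appears fully toxic")
            return specific / control

        # Invented relative light units from the two luciferase colour channels
        print(viability_corrected_response(specific_signal=5200, control_signal=800,
                                           specific_blank=150, control_blank=60))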

  7. Technological Basis and Scientific Returns for Absolutely Accurate Measurements

    NASA Astrophysics Data System (ADS)

    Dykema, J. A.; Anderson, J.

    2011-12-01

    The 2006 NRC Decadal Survey fostered a new appreciation for societal objectives as a driving motivation for Earth science. Many high-priority societal objectives are dependent on predictions of weather and climate. These predictions are based on numerical models, which derive from approximate representations of well-founded physics and chemistry on space and timescales appropriate to global and regional prediction. These laws of chemistry and physics in turn have a well-defined quantitative relationship with physical measurement units, provided these measurement units are linked to international measurement standards that are the foundation of contemporary measurement science and standards for engineering and commerce. Without this linkage, measurements have an ambiguous relationship to scientific principles that introduces avoidable uncertainty in analyses, predictions, and improved understanding of the Earth system. Since the improvement of climate and weather prediction is fundamentally dependent on the improvement of the representation of physical processes, measurement systems that reduce the ambiguity between physical truth and observations represent an essential component of a national strategy for understanding and living with the Earth system. This paper examines the technological basis and potential science returns of sensors that make measurements that are quantitatively tied on-orbit to international measurement standards, and thus testable to systematic errors. This measurement strategy provides several distinct benefits. First, because of the quantitative relationship between these international measurement standards and fundamental physical constants, measurements of this type accurately capture the true physical and chemical behavior of the climate system and are not subject to adjustment due to excluded measurement physics or instrumental artifacts. In addition, such measurements can be reproduced by scientists anywhere in the world, at any time

  8. Accurate Stellar Parameters for Exoplanet Host Stars

    NASA Astrophysics Data System (ADS)

    Brewer, John Michael; Fischer, Debra; Basu, Sarbani; Valenti, Jeff A.

    2015-01-01

    A large impediment to our understanding of planet formation is obtaining a clear picture of planet radii and densities. Although determining precise ratios between planet and stellar host is relatively easy, determining accurate stellar parameters is still a difficult and costly undertaking. High resolution spectral analysis has traditionally yielded precise values for some stellar parameters, but stars in common between catalogs from different authors or analyzed using different techniques often show offsets far in excess of their uncertainties. Most analyses now use some external constraint, when available, to break observed degeneracies between surface gravity, effective temperature, and metallicity, which can otherwise lead to correlated errors in results. However, these external constraints are impossible to obtain for all stars and can require more costly observations than the initial high resolution spectra. We demonstrate that these discrepancies can be mitigated by use of a larger line list that has carefully tuned atomic line data. We use an iterative modeling technique that does not require external constraints. We compare the surface gravity obtained with our spectral synthesis modeling to asteroseismically determined values for 42 Kepler stars. Our analysis agrees well, with only a 0.048 dex offset and an rms scatter of 0.05 dex. Such accurate stellar gravities can reduce the primary source of uncertainty in radii by almost an order of magnitude over unconstrained spectral analysis.
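
    The quoted 0.048 dex offset and 0.05 dex rms scatter are simple summary statistics of the spectroscopic-minus-asteroseismic log g differences. A minimal sketch of that comparison is given below; the five log g values are invented for illustration and are not the paper's 42-star Kepler sample.

        # Illustrative only: mean offset and rms scatter between spectroscopic and
        # asteroseismically determined surface gravities (log g, in dex).
        import math

        spec_logg = [4.38, 4.02, 3.85, 4.51, 4.10]   # hypothetical spectroscopic values
        seis_logg = [4.33, 3.98, 3.82, 4.44, 4.05]   # hypothetical asteroseismic values

        diffs = [s - a for s, a in zip(spec_logg, seis_logg)]
        offset = sum(diffs) / len(diffs)                                      # mean offset
        rms = math.sqrt(sum((d - offset) ** 2 for d in diffs) / len(diffs))   # scatter about the offset

        print(f"offset = {offset:.3f} dex, rms scatter = {rms:.3f} dex")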

  9. Periotest values: Its reproducibility, accuracy, and variability with hormonal influence

    PubMed Central

    Chakrapani, Swarna; Goutham, Madireddy; Krishnamohan, Thota; Anuparthy, Sujitha; Tadiboina, Nagarjuna; Rambha, Somasekhar

    2015-01-01

    Tooth mobility can be assessed by both subjective and objective means. The use of subjective measures may lead to bias, and hence it becomes imperative to use objective means to assess tooth mobility. It has also been observed that hormonal fluctuations may significantly influence tooth mobility. Aims: The study was undertaken to assess the reproducibility of the Periotest in the assessment of tooth mobility and to unravel the obscurity associated with the hormonal influence on tooth mobility. Materials and Methods: 100 subjects were included in the study and were divided equally into two groups based on their age: group I (11-14 years) and group II (16-22 years). Results: There was no statistically significant difference between the periotest values (PTV) taken at two different time periods with a time difference of 20 minutes. Group I was found to have a significantly greater PTV than group II. Conclusion: The Periotest can reliably measure tooth mobility. Tooth mobility is greater during puberty as compared to adolescence, and during adolescence mobility was slightly greater in males. PMID:25684904

  10. Reproducible and controllable induction voltage adder for scaled beam experiments

    NASA Astrophysics Data System (ADS)

    Sakai, Yasuo; Nakajima, Mitsuo; Horioka, Kazuhiko

    2016-08-01

    A reproducible and controllable induction adder was developed using solid-state switching devices and Finemet cores for scaled beam compression experiments. A gate controlled MOSFET circuit was developed for the controllable voltage driver. The MOSFET circuit drove the induction adder at low magnetization levels of the cores which enabled us to form reproducible modulation voltages with jitter less than 0.3 ns. Preliminary beam compression experiments indicated that the induction adder can improve the reproducibility of modulation voltages and advance the beam physics experiments.

  11. Reproducible and controllable induction voltage adder for scaled beam experiments.

    PubMed

    Sakai, Yasuo; Nakajima, Mitsuo; Horioka, Kazuhiko

    2016-08-01

    A reproducible and controllable induction adder was developed using solid-state switching devices and Finemet cores for scaled beam compression experiments. A gate controlled MOSFET circuit was developed for the controllable voltage driver. The MOSFET circuit drove the induction adder at low magnetization levels of the cores which enabled us to form reproducible modulation voltages with jitter less than 0.3 ns. Preliminary beam compression experiments indicated that the induction adder can improve the reproducibility of modulation voltages and advance the beam physics experiments. PMID:27587112

  12. Innovative Flow Cytometry Allows Accurate Identification of Rare Circulating Cells Involved in Endothelial Dysfunction

    PubMed Central

    Boraldi, Federica; Bartolomeo, Angelica; De Biasi, Sara; Orlando, Stefania; Costa, Sonia; Cossarizza, Andrea; Quaglino, Daniela

    2016-01-01

    Introduction Although rare, circulating endothelial and progenitor cells could be considered as markers of endothelial damage and repair potential, possibly predicting the severity of cardiovascular manifestations. A number of studies highlighted the role of these cells in age-related diseases, including those characterized by ectopic calcification. Nevertheless, their use in clinical practice is still controversial, mainly due to difficulties in finding reproducible and accurate methods for their determination. Methods Circulating mature cells (CMC, CD45-, CD34+, CD133-) and circulating progenitor cells (CPC, CD45dim, CD34bright, CD133+) were investigated by polychromatic high-speed flow cytometry to detect the expression of endothelial (CD309+) or osteogenic (BAP+) differentiation markers in healthy subjects and in patients affected by peripheral vascular manifestations associated with ectopic calcification. Results This study shows that: 1) polychromatic flow cytometry represents a valuable tool to accurately identify rare cells; 2) the balance of CD309+ on CMC/CD309+ on CPC is altered in patients affected by peripheral vascular manifestations, suggesting the occurrence of vascular damage and low repair potential; 3) the increase of circulating cells exhibiting a shift towards an osteoblast-like phenotype (BAP+) is observed in the presence of ectopic calcification. Conclusion Differences between healthy subjects and patients with ectopic calcification indicate that this approach may be useful to better evaluate endothelial dysfunction in a clinical context. PMID:27560136

  13. A universal and accurate replica technique for scanning electron microscope study in clinical dentistry.

    PubMed

    Lambrechts, P; Vanherle, G; Davidson, C

    1981-09-01

    One of the main concerns of dental research is the observation of the oral tissues and the materials applied to the dentition. The changes in composition and structure of the outer surfaces and of the materials deposited on these surfaces are of special interest. In the literature, a variety of replica techniques for these purposes has been described (Grundy in 1971 [12]; Saxton in 1973 [25]). The use of these techniques is limited because of artifacts in the samples and a restricted resolution power, resulting from useful magnifications in the order of 800x. An accurate and universal replica technique for the examination of specimens to be viewed under the SEM has been developed. The first impression is made with a light-body silicone elastomer (President Coltene). The positive replica is made by electrodeposition of copper in an electroplating bath (Acru plat 5 electronic, Dr. Th. Wieland, D-7530 Pforzheim). The reliability and accuracy of this replica technique were verified by a scanning electron microscopic comparison of the replicas and the actual structures of etched enamel. To illustrate the applicability of the replica technique to structures of much lower hardness, high-resolution images of dental plaque were also produced. The copper surface offers a faithful and properly electroconductive medium that withstands the bombardment of electrons and the relatively severe conditions in the scanning electron microscope. Reproduction was accurate, as judged by the duplication in position, size, and shape of the fine detail at magnifications of 7500x, offering a resolution of 25 nm.

  14. Reproducibility of transcutaneous oximetry and laser Doppler flowmetry in facial skin and gingival tissue.

    PubMed

    Svalestad, J; Hellem, S; Vaagbø, G; Irgens, A; Thorsen, E

    2010-01-01

    Laser Doppler flowmetry (LDF) and transcutaneous oximetry (TcPO(2)) are non-invasive techniques, widely used in the clinical setting, for assessing microvascular blood flow and tissue oxygen tension, e.g. recording vascular changes after radiotherapy and hyperbaric oxygen therapy. With standardized procedures and improved reproducibility, these methods might also be applicable in longitudinal studies. The aim of this study was to evaluate the reproducibility of facial skin and gingival LDF and facial skin TcPO(2). The subjects comprised ten healthy volunteers, 5 men, aged 31-68 years. Gingival perfusion was recorded with the LDF probe fixed to a custom made, tooth-supported acrylic splint. Skin perfusion was recorded on the cheek. TcPO(2) was recorded on the forehead and cheek and in the second intercostal space. The reproducibility of LDF measurements taken after vasodilation by heat provocation was greater than for basal flow in both facial skin and mandibular gingiva. Pronounced intraday variations were observed. Interweek reproducibility assessed by intraclass correlation coefficient ranged from 0.74 to 0.96 for LDF and from 0.44 to 0.75 for TcPO(2). The results confirm acceptable reproducibility of LDF and TcPO(2) in longitudinal studies in a vascular laboratory where subjects serve as their own controls. The use of thermoprobes is recommended. Repeat measurements should be taken at the same time of day.

  15. Accurately Mapping M31's Microlensing Population

    NASA Astrophysics Data System (ADS)

    Crotts, Arlin

    2004-07-01

    We propose to augment an existing microlensing survey of M31 with source identifications provided by a modest amount of ACS {and WFPC2 parallel} observations to yield an accurate measurement of the masses responsible for microlensing in M31, and presumably much of its dark matter. The main benefit of these data is the determination of the physical {or "Einstein"} timescale of each microlensing event, rather than an effective {"FWHM"} timescale, allowing masses to be determined more than twice as accurately as without HST data. The Einstein timescale is the ratio of the lensing cross-sectional radius to the relative velocity. Velocities are known from kinematics, and the cross-section is directly proportional to the {unknown} lensing mass. We cannot easily measure these quantities without knowing the amplification, hence the baseline magnitude, which requires the resolution of HST to find the source star. This makes a crucial difference because M31 lens mass determinations can be more accurate than those towards the Magellanic Clouds through our Galaxy's halo {for the same number of microlensing events} due to the better constrained geometry in the M31 microlensing situation. Furthermore, our larger survey, just completed, should yield at least 100 M31 microlensing events, more than any Magellanic survey. A small amount of ACS+WFPC2 imaging will deliver the potential of this large database {about 350 nights}. For the whole survey {and a delta-function mass distribution} the mass error should approach only about 15%, or about 6% error in slope for a power-law distribution. These results will better allow us to pinpoint the lens halo fraction, and the shape of the halo lens spatial distribution, and allow generalization/comparison of the nature of halo dark matter in spiral galaxies. In addition, we will be able to establish the baseline magnitude for about 50,000 variable stars, as well as measure an unprecedentedly detailed color-magnitude diagram and luminosity
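
    For reference, the Einstein timescale invoked above is, in standard microlensing notation (this is textbook material, not a formula quoted from the proposal),

        \[
          t_{\mathrm{E}} \;=\; \frac{R_{\mathrm{E}}}{v_{\perp}},
          \qquad
          R_{\mathrm{E}} \;=\; \sqrt{\frac{4 G M}{c^{2}}\,\frac{D_{\mathrm{L}} D_{\mathrm{LS}}}{D_{\mathrm{S}}}},
        \]

    where M is the lens mass, v_perp the relative transverse velocity, and D_L, D_S, D_LS the observer-lens, observer-source and lens-source distances. With the kinematics (v_perp) and geometry known, a measured t_E constrains M, which is why recovering the physical rather than the FWHM timescale improves the mass determination.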

  16. Reproducibility of ERG responses obtained with the DTL electrode.

    PubMed

    Hébert, M; Vaegan; Lachapelle, P

    1999-03-01

    Previous investigators have suggested that the DTL fibre electrode might not be suitable for the recording of replicable electroretinograms. We present experimental evidence that when used adequately, this electrode does permit the recording of highly reproducible retinal potentials.

  17. 15. REPRODUCED FROM 'GRIST WIND MILLS AT EAST HAMPTON,' PICTURESQUE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. REPRODUCED FROM 'GRIST WIND MILLS AT EAST HAMPTON,' PICTURESQUE AMERICA NEW YORK, 1872. THE HOOD WINDMILL IS IN THE FOREGROUND AND THE PANTIGO WINDMILL IS IN THE BACKGROUND - Pantigo Windmill, James Lane, East Hampton, Suffolk County, NY

  18. 8. Historic American Buildings Survey Reproduced from the collections of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. Historic American Buildings Survey Reproduced from the collections of the Library of Congress, Accession No. 45041 Geographical File ('Nantucket, Mass.') Division of Prints and Photographs c. 1880 - Jethro Coffin House, Sunset Hill, Nantucket, Nantucket County, MA

  19. 223. FREQUENTLY REPRODUCED VIEW OF GWMP SHOWING VARIABLE WIDTH MEDIANS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    223. FREQUENTLY REPRODUCED VIEW OF GWMP SHOWING VARIABLE WIDTH MEDIANS WITH INDEPENDENT ALIGNMENTS FROM KEY BRIDGE LOOKING NORTHWEST, 1953. - George Washington Memorial Parkway, Along Potomac River from McLean to Mount Vernon, VA, Mount Vernon, Fairfax County, VA

  20. Photographic copy of reproduced photograph dated 1942. Exterior view, west ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Photographic copy of reproduced photograph dated 1942. Exterior view, west elevation. Building camouflaged during World War II. - Grand Central Air Terminal, 1310 Air Way, Glendale, Los Angeles County, CA

  1. Global climate modeling of Saturn's atmosphere: fast and accurate radiative transfer and exploration of seasonal variability

    NASA Astrophysics Data System (ADS)

    Guerlet, Sandrine; Spiga, A.; Sylvestre, M.; Fouchet, T.; Millour, E.; Wordsworth, R.; Leconte, J.; Forget, F.

    2013-10-01

    Recent observations of Saturn’s stratospheric thermal structure and composition revealed new phenomena: an equatorial oscillation in temperature, reminiscent of the Earth's Quasi-Biennial Oscillation; strong meridional contrasts of hydrocarbons; and a warm “beacon” associated with the powerful 2010 storm. Those signatures cannot be reproduced by 1D photochemical and radiative models and suggest that atmospheric dynamics plays a key role. This motivated us to develop a complete 3D General Circulation Model (GCM) for Saturn, based on the LMDz hydrodynamical core, to explore the circulation, seasonal variability, and wave activity in Saturn's atmosphere. In order to closely reproduce Saturn's radiative forcing, particular emphasis was put on obtaining fast and accurate radiative transfer calculations. Our radiative model uses correlated-k distributions and a spectral discretization tailored for Saturn's atmosphere. We include internal heat flux, ring shadowing and aerosols. We will report on the sensitivity of the model to spectral discretization, spectroscopic databases, and aerosol scenarios (varying particle sizes, opacities and vertical structures). We will also discuss the radiative effect of the ring shadowing on Saturn's atmosphere. We will present a comparison of temperature fields obtained with this new radiative equilibrium model to those inferred from Cassini/CIRS observations. In the troposphere, our model reproduces the observed temperature knee caused by heating at the top of the tropospheric aerosol layer. In the lower stratosphere (around 20 mbar), modeled temperatures agree with observations except in the equatorial region, where the temperature structure is governed by the dynamical equatorial oscillation. In the upper stratosphere (p<0.1 mbar), our modeled temperature is 5-10 K too low compared to measurements. This suggests that processes other than radiative heating/cooling by trace
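
    The band-averaging step at the heart of a correlated-k scheme can be illustrated with a toy example: within one spectral band, line-by-line absorption coefficients are sorted into a cumulative distribution k(g) and the band transmission is evaluated with a few quadrature points in g instead of thousands of monochromatic points. The sketch below shows only this single-layer k-distribution step, with synthetic coefficients; it is not the Saturn GCM's radiative code, which must additionally handle vertical inhomogeneity, scattering and overlapping gases.

        # Toy single-layer k-distribution band average (not the Saturn GCM code).
        # k_grid: monochromatic absorption coefficients across one band (synthetic);
        # u: absorber column amount. All numbers are arbitrary.
        import numpy as np

        def band_transmission_ck(k_grid, u, n_g=8):
            """Approximate the band-mean transmission with an n_g-point quadrature in g-space."""
            k_sorted = np.sort(k_grid)                        # k(g): coefficients sorted ascending
            g = (np.arange(k_grid.size) + 0.5) / k_grid.size  # cumulative probability coordinate
            x, w = np.polynomial.legendre.leggauss(n_g)       # Gauss-Legendre nodes on [-1, 1]
            g_nodes, g_weights = 0.5 * (x + 1.0), 0.5 * w     # mapped to g in [0, 1]
            k_nodes = np.interp(g_nodes, g, k_sorted)         # k at the quadrature nodes
            return np.sum(g_weights * np.exp(-k_nodes * u))   # sum_i w_i exp(-k_i u)

        rng = np.random.default_rng(0)
        k_grid = 10.0 ** rng.normal(-2.0, 1.0, 10000)         # synthetic "line-by-line" spectrum
        print(band_transmission_ck(k_grid, u=5.0))            # compare: np.mean(np.exp(-k_grid * 5.0))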

  2. Comment on "Estimating the reproducibility of psychological science".

    PubMed

    Gilbert, Daniel T; King, Gary; Pettigrew, Stephen; Wilson, Timothy D

    2016-03-01

    A paper from the Open Science Collaboration (Research Articles, 28 August 2015, aac4716) attempting to replicate 100 published studies suggests that the reproducibility of psychological science is surprisingly low. We show that this article contains three statistical errors and provides no support for such a conclusion. Indeed, the data are consistent with the opposite conclusion, namely, that the reproducibility of psychological science is quite high.

  3. On The Reproducibility of Seasonal Land-surface Climate

    SciTech Connect

    Phillips, T J

    2004-10-22

    The sensitivity of the continental seasonal climate to initial conditions is estimated from an ensemble of decadal simulations of an atmospheric general circulation model with the same specifications of radiative forcings and monthly ocean boundary conditions, but with different initial states of atmosphere and land. As measures of the "reproducibility" of continental climate for different initial conditions, spatio-temporal correlations are computed across paired realizations of eleven model land-surface variables in which the seasonal cycle is either included or excluded, the former case being pertinent to climate simulation and the latter to seasonal anomaly prediction. It is found that the land-surface variables which include the seasonal cycle are impacted only marginally by changes in initial conditions; moreover, their seasonal climatologies exhibit high spatial reproducibility. In contrast, the reproducibility of a seasonal land-surface anomaly is generally low, although it is substantially higher in the Tropics; its spatial reproducibility also markedly fluctuates in tandem with warm and cold phases of the El Niño/Southern Oscillation. However, the overall degree of reproducibility depends strongly on the particular land-surface anomaly considered. It is also shown that the predictability of a land-surface anomaly implied by its reproducibility statistics is consistent with what is inferred from more conventional predictability metrics. Implications of these results for climate model intercomparison projects and for operational forecasts of seasonal continental climate are also elaborated.
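
    The "reproducibility" measure described above is essentially a correlation between paired realizations, computed either on the full fields (seasonal cycle included) or on seasonal anomalies (cycle removed). The sketch below illustrates that distinction on synthetic monthly fields; the array shapes, anomaly definition and noise level are illustrative assumptions, not the study's configuration.

        # Sketch of a paired-realization correlation with and without the seasonal
        # cycle, on synthetic (time, lat, lon) monthly fields. Illustrative only.
        import numpy as np

        def seasonal_anomalies(x):
            """Remove each calendar month's climatology from a (time, lat, lon) array."""
            x = x.copy()
            months = np.arange(x.shape[0]) % 12
            for m in range(12):
                x[months == m] -= x[months == m].mean(axis=0)
            return x

        def reproducibility_corr(run_a, run_b, remove_seasonal_cycle=False):
            """Spatio-temporal Pearson correlation across two paired realizations."""
            if remove_seasonal_cycle:
                run_a, run_b = seasonal_anomalies(run_a), seasonal_anomalies(run_b)
            return np.corrcoef(run_a.ravel(), run_b.ravel())[0, 1]

        # Two synthetic 10-year (120-month) realizations sharing the same seasonal cycle
        rng = np.random.default_rng(1)
        cycle = np.sin(2 * np.pi * np.arange(120) / 12)[:, None, None]
        run_a = cycle + 0.3 * rng.normal(size=(120, 10, 10))
        run_b = cycle + 0.3 * rng.normal(size=(120, 10, 10))
        print(reproducibility_corr(run_a, run_b))                               # high: cycle dominates
        print(reproducibility_corr(run_a, run_b, remove_seasonal_cycle=True))   # near zero: anomalies differ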

  4. Standardization of Hemagglutination Inhibition Assay for Influenza Serology Allows for High Reproducibility between Laboratories.

    PubMed

    Zacour, Mary; Ward, Brian J; Brewer, Angela; Tang, Patrick; Boivin, Guy; Li, Yan; Warhuus, Michelle; McNeil, Shelly A; LeBlanc, Jason J; Hatchette, Todd F

    2016-03-01

    Standardization of the hemagglutination inhibition (HAI) assay for influenza serology is challenging. Poor reproducibility of HAI results from one laboratory to another is widely cited, limiting comparisons between candidate vaccines in different clinical trials and posing challenges for licensing authorities. In this study, we standardized HAI assay materials, methods, and interpretive criteria across five geographically dispersed laboratories of a multidisciplinary influenza research network and then evaluated intralaboratory and interlaboratory variations in HAI titers by repeatedly testing standardized panels of human serum samples. Duplicate precision and reproducibility from comparisons between assays within laboratories were 99.8% (99.2% to 100%) and 98.0% (93.3% to 100%), respectively. The results for 98.9% (95% to 100%) of the samples were within 2-fold of all-laboratory consensus titers, and the results for 94.3% (85% to 100%) of the samples were within 2-fold of our reference laboratory data. Low-titer samples showed the greatest variability in comparisons between assays and between sites. Classification of seroprotection (titer ≥ 40) was accurate in 93.6% or 89.5% of cases in comparison to the consensus or reference laboratory classification, respectively. This study showed that with carefully chosen standardization processes, high reproducibility of HAI results between laboratories is indeed achievable. PMID:26818953
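
    The "within 2-fold" agreement criterion used above reduces to a one-line check in log2 space. The snippet below is a generic illustration with invented titers, not part of the study's protocol.

        # Flag HAI titers that are within 2-fold of a consensus titer, i.e.
        # |log2(titer) - log2(consensus)| <= 1. Example titers are invented.
        import math

        def within_twofold(titer, consensus):
            return abs(math.log2(titer) - math.log2(consensus)) <= 1.0

        for titer in (40, 80, 160, 20):
            print(titer, within_twofold(titer, consensus=40))
        # 160 vs. 40 is a 4-fold difference and fails; the other titers pass.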

  5. Accurate ab initio Quartic Force Fields of Cyclic and Bent HC2N Isomers

    NASA Technical Reports Server (NTRS)

    Inostroza, Natalia; Huang, Xinchuan; Lee, Timothy J.

    2012-01-01

    Highly correlated ab initio quartic force fields (QFFs) are used to calculate the equilibrium structures and predict the spectroscopic parameters of three HC2N isomers. Specifically, the ground-state quasilinear triplet and the lowest cyclic and bent singlet isomers are included in the present study. Extensive treatment of correlation effects was included using the singles and doubles coupled-cluster method that includes a perturbational estimate of the effects of connected triple excitations, denoted CCSD(T). Dunning's correlation-consistent basis sets cc-pVXZ, X = 3, 4, 5, were used, together with a three-point formula for extrapolation to the one-particle basis set limit. Core-correlation and scalar relativistic corrections were also included to yield highly accurate QFFs. The QFFs were used together with second-order perturbation theory (with proper treatment of Fermi resonances) and variational methods to solve the nuclear Schrödinger equation. The quasilinear nature of the triplet isomer is problematic, and it is concluded that a QFF is not adequate to describe properly all of the fundamental vibrational frequencies and spectroscopic constants (though some constants not dependent on the bending motion are well reproduced by perturbation theory). On the other hand, this procedure (a QFF together with either perturbation theory or variational methods) leads to highly accurate fundamental vibrational frequencies and spectroscopic constants for the cyclic and bent singlet isomers of HC2N. All three isomers possess significant dipole moments: 3.05 D, 3.06 D, and 1.71 D for the quasilinear triplet, the cyclic singlet, and the bent singlet isomers, respectively. It is concluded that the spectroscopic constants determined for the cyclic and bent singlet isomers are the most accurate available, and it is hoped that these will be useful in the interpretation of high-resolution astronomical observations or laboratory experiments.
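
    The abstract does not state which three-point extrapolation was applied; one widely used choice for cc-pVTZ/QZ/5Z energies is the mixed exponential/Gaussian form, given here only as a representative example:

        \[
          E(X) \;=\; E_{\mathrm{CBS}} \;+\; B\,e^{-(X-1)} \;+\; C\,e^{-(X-1)^{2}},
          \qquad X = 3,\,4,\,5,
        \]

    where the three computed energies fix the three unknowns E_CBS, B and C, and E_CBS is taken as the one-particle basis set limit.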

  6. Direct Pressure Monitoring Accurately Predicts Pulmonary Vein Occlusion During Cryoballoon Ablation

    PubMed Central

    Kosmidou, Ioanna; Wooden, Shannnon; Jones, Brian; Deering, Thomas; Wickliffe, Andrew; Dan, Dan

    2013-01-01

    Cryoballoon ablation (CBA) is an established therapy for atrial fibrillation (AF). Pulmonary vein (PV) occlusion is essential for achieving antral contact and PV isolation and is typically assessed by contrast injection. We present a novel method of direct pressure monitoring for assessment of PV occlusion. Transcatheter pressure is monitored during balloon advancement to the PV antrum. Pressure is recorded via a single pressure transducer connected to the inner lumen of the cryoballoon. Pressure curve characteristics are used to assess occlusion in conjunction with fluoroscopic or intracardiac echocardiography (ICE) guidance. PV occlusion is confirmed when loss of typical left atrial (LA) pressure waveform is observed with recordings of PA pressure characteristics (no A wave and rapid V wave upstroke). Complete pulmonary vein occlusion as assessed with this technique has been confirmed with concurrent contrast utilization during the initial testing of the technique and has been shown to be highly accurate and readily reproducible. We evaluated the efficacy of this novel technique in 35 patients. A total of 128 veins were assessed for occlusion with the cryoballoon utilizing the pressure monitoring technique; occlusive pressure was demonstrated in 113 veins with resultant successful pulmonary vein isolation in 111 veins (98.2%). Occlusion was confirmed with subsequent contrast injection during the initial ten procedures, after which contrast utilization was rapidly reduced or eliminated given the highly accurate identification of occlusive pressure waveform with limited initial training. Verification of PV occlusive pressure during CBA is a novel approach to assessing effective PV occlusion and it accurately predicts electrical isolation. Utilization of this method results in significant decrease in fluoroscopy time and volume of contrast. PMID:23485956

  7. Reproducibility and intraindividual variability of the pattern electroretinogram.

    PubMed

    Jacobi, P C; Walter, P; Brunner, R; Krieglstein, G K

    1994-08-01

    The human pattern electroretinogram (PERG) is a contrast-specific potential presumably reflecting the functional integrity of ganglion cells. Many studies have devised criteria that enable PERG measurements to distinguish established glaucomatous (hypertonic) eyes from normal controls. As there are relatively few reports concerning the reproducibility and reliability of the PERG, we studied the intraindividual variability of the PERG in 20 healthy subjects. Both transient and steady-state responses were recorded using a high-contrast (98%), black-and-white, counterphasing checkerboard pattern (average luminance, 80 cd/m2) generated by a television monitor (subtending angle, 13.8 degrees x 10.8 degrees) using three different check sizes (15', 30', and 60'). Recordings were performed in both eyes simultaneously at a 7-day interval under test-retest conditions. Responses at 30' spatial frequency were most consistent and resulted in a mean amplitude (+/- SD) of 2.18 +/- 0.95 microV (P50) and 4.00 +/- 1.69 microV (N95) for transient patterns and 1.84 +/- 1.25 microV for steady-state patterns. No statistically significant difference was observed between right and left eyes, test and retest conditions, or 1st- and 7th-day recording sessions for PERG parameters. In linear correlation analysis there was an adequate, positive correlation between the right and left eyes (r = 0.78); a weak correlation between test and retest conditions (r = 0.58); and no correlation between measurements made at a 7-day interval. As a consequence, we conclude that the follow-up of patients (e.g., glaucoma, ocular hypertension) by means of PERG is critical, especially when therapeutic consequences may be based on the physiological variability of a weak retinal signal.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:7804106

  8. A reproducible method to determine the meteoroid mass index

    NASA Astrophysics Data System (ADS)

    Pokorný, P.; Brown, P. G.

    2016-08-01

    Context. The determination of meteoroid mass indices is central to flux measurements and evolutionary studies of meteoroid populations. However, different authors use different approaches to fit observed data, making results difficult to reproduce and the resulting uncertainties difficult to justify. The real, physical uncertainties are usually an order of magnitude higher than the reported values. Aims: We aim to develop a fully automated method that will measure meteoroid mass indices and the associated uncertainty. We validate our method on large radar and optical datasets and compare results to obtain a best estimate of the true meteoroid mass index. Methods: Using MultiNest, a Bayesian inference tool that calculates the evidence and explores the parameter space, we search for the best fit of cumulative number vs. mass distributions in a four-dimensional space of variables (a,b,X1,X2). We explore biases in meteor echo distributions using optical meteor data as a calibration dataset to establish the systematic offset in measured mass index values. Results: Our best estimate for the average de-biased mass index for the sporadic meteoroid complex, as measured by radar and appropriate to the mass range 10^-3 g > m > 10^-5 g, was s = -2.10 ± 0.08. Optical data in the 10^-1 g > m > 10^-3 g range, with the shower meteors removed, produced s = -2.08 ± 0.08. We find the mass index used by Grün et al. (1985) is substantially larger than we measure in the 10^-4 g < m < 10^-1 g range. Our code, with a simple manual and a sample dataset, can be found at ftp://aquarid.physics.uwo.ca/pub/peter/MassIndexCode/
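
    As a much-simplified stand-in for the fitting problem (not the Bayesian MultiNest analysis described above), the slope of a cumulative number-mass distribution can be estimated by a straight-line fit in log-log space. In the toy example below the data are synthetic, the mass interval is arbitrary, and the sign convention of the classical positive index s differs from the negative values quoted in the abstract.

        # Simplified illustration (NOT the authors' MultiNest pipeline): fit the
        # log-log slope of a cumulative number vs. mass distribution. Synthetic data.
        import numpy as np

        rng = np.random.default_rng(2)
        s = 2.1                                      # differential index used to generate the sample
        u = rng.uniform(size=20000)
        masses = 1e-5 * u ** (1.0 / (1.0 - s))       # inverse-CDF sampling of dN/dm ~ m**(-s), m > 1e-5 g

        m_sorted = np.sort(masses)
        cum_n = np.arange(m_sorted.size, 0, -1)      # N(>m) at each sorted mass

        sel = (m_sorted > 2e-5) & (m_sorted < 2e-4)  # fit well inside the sampled range
        slope, _ = np.polyfit(np.log10(m_sorted[sel]), np.log10(cum_n[sel]), 1)
        print(f"cumulative slope = {slope:.2f}  (expected 1 - s = {1.0 - s:.2f})")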

  9. Engineering preliminaries to obtain reproducible mixtures of atelocollagen and polysaccharides.

    PubMed

    Lefter, Cristina-Mihaela; Maier, Stelian Sergiu; Maier, Vasilica; Popa, Marcel; Desbrieres, Jacques

    2013-05-01

    The critical stage in producing blends of biomacromolecules consists of mixing the component solutions to generate homogeneous diluted colloidal systems. Simple experimental investigations allow the establishment of design rules for recipes and procedures for preparing homogeneous and compositionally reproducible mixtures. Starting from purified solutions of atelocollagen, hyaluronan and native gellan, having as low an inorganic salt content as possible, initial binary and ternary mixtures can be prepared up to a total dry matter content of 0.150 g/dL under non-co-precipitating conditions. Two pH manipulation approaches are feasible for homogeneous mixing: (i) unbuffered prior correction at pH 5.5, and (ii) "rigid" buffering at pH 9.0, using organic species. Atelocollagen-containing co-precipitates can be obtained in the presence of one or both polysaccharides, preferably in pH domains far from the isoelectric point of the scleroprotein. A critical behavior has been observed in mixtures containing gellan, due to its macromolecular dissimilarities compared with atelocollagen. In optimal binary mixtures, the coordinates of the threshold points on the phase diagrams are 0.028% w/w atelocollagen/0.025% w/w hyaluronan, and 0.022% w/w atelocollagen/0.020% w/w gellan. Uni- or bi-phasic ternary systems having equilibrated ratios of co-precipitated components can be prepared starting from initial mixtures containing up to 0.032 g/dL atelocollagen, associated with, for example, 0.040 g/dL hyaluronan and 0.008 g/dL gellan, following the first pH manipulation approach.

  10. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2010-07-01 2010-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  11. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2013-07-01 2013-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  12. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2011-07-01 2011-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  13. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2014-07-01 2014-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  14. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2012-07-01 2012-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  15. Reproducibility of multiple breath washout indices in the unsedated preterm neonate.

    PubMed

    Sinhal, Sanjay; Galati, John; Baldwin, David N; Stocks, Janet; Pillow, J Jane

    2010-01-01

    Multiple breath inert gas washout (MBW) is gaining popularity for measurements of resting lung volume and ventilation inhomogeneity. Test reproducibility is an important determinant of the clinical applicability of diagnostic tests. The between-test reproducibility of variables derived from MBW tests in newborn infants is unknown. We aimed to determine the within-test repeatability and short-term between-test reproducibility of MBW variables in unsedated preterm infants. We hypothesized that measurements obtained within a 3-day interval in clinically stable preterm infants would be reproducible and suitable for use as an objective clinical outcome measurement. In this cross-sectional observational study, clinically stable hospitalized preterm infants whose parents had given informed consent for MBW studies were tested twice within 72 hr during quiet, unsedated sleep. Functional residual capacity (FRC), lung clearance index (LCI), and the first and second to zeroth moment ratios (M(1):M(0); M(2):M(0)) were computed from MBW traces obtained using a mainstream ultrasonic flowmeter and 4% sulphur hexafluoride (MBW(SF6)). Within-test repeatability and between-test reproducibility were determined. Within-test repeatability (expressed as a coefficient of variability (C(v))) for differences between two and four replicate measurements on the same test occasion was 9.3% (FRC), 9.0% (LCI), 7.6% (M(1):M(0)), and 15.6% (M(2):M(0)), respectively. The within-test C(v)'s were not statistically different from the between-test C(v)'s, which were 7.7% (FRC), 10.3% (LCI), 6.1% (M(1):M(0)), and 13.0% (M(2):M(0)), respectively. Among unsedated preterm infants, between-test reproducibility over a 3-day interval was similar to within-test repeatability. The wide limits of agreement may limit the application of these measures to detect a clinically significant change in condition in small preterm infants. PMID:20025050
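
    The coefficient of variability quoted above is presumably the replicate standard deviation expressed as a percentage of the replicate mean; in the usual notation (an assumption, since the abstract does not spell out the formula),

        \[
          C_{\mathrm{v}} \;=\; \frac{s}{\bar{x}} \times 100\%,
        \]

    where s and \bar{x} are the standard deviation and mean of the two to four replicate measurements obtained on one test occasion.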

  16. Important Nearby Galaxies without Accurate Distances

    NASA Astrophysics Data System (ADS)

    McQuinn, Kristen

    2014-10-01

    The Spitzer Infrared Nearby Galaxies Survey (SINGS) and its offspring programs (e.g., THINGS, HERACLES, KINGFISH) have resulted in a fundamental change in our view of star formation and the ISM in galaxies, and together they represent the most complete multi-wavelength data set yet assembled for a large sample of nearby galaxies. These great investments of observing time have been dedicated to the goal of understanding the interstellar medium, the star formation process, and, more generally, galactic evolution at the present epoch. Nearby galaxies provide the basis on which we interpret the distant universe, and the SINGS sample represents the best studied nearby galaxies. Accurate distances are fundamental to interpreting observations of galaxies. Surprisingly, many of the SINGS spiral galaxies have numerous distance estimates, resulting in confusion. We can rectify this situation for 8 of the SINGS spiral galaxies within 10 Mpc at a very low cost through measurements of the tip of the red giant branch. The proposed observations will provide an accuracy of better than 0.1 in distance modulus. Our sample includes such well known galaxies as M51 (the Whirlpool), M63 (the Sunflower), M104 (the Sombrero), and M74 (the archetypal grand design spiral). We are also proposing coordinated parallel WFC3 UV observations of the central regions of the galaxies, rich with high-mass UV-bright stars. As a secondary science goal we will compare the resolved UV stellar populations with integrated UV emission measurements used in calibrating star formation rates. Our observations will complement the growing HST UV atlas of high resolution images of nearby galaxies.

  17. Using prediction markets to estimate the reproducibility of scientific research.

    PubMed

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A; Johannesson, Magnus

    2015-12-15

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants' individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a "statistically significant" finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications.
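
    The link drawn above between a low prior probability (median 9%) and the weak evidential value of a single "statistically significant" finding can be made concrete with a simple Bayes update. The power and significance level in the snippet are assumptions chosen for illustration, not figures from the paper.

        # Why a low prior makes one significant result weak evidence. The assumed
        # power (0.8) and alpha (0.05) are illustrative, not taken from the paper.
        def posterior_true(prior, power=0.8, alpha=0.05):
            """P(hypothesis true | significant result) by Bayes' rule."""
            p_significant = power * prior + alpha * (1.0 - prior)
            return power * prior / p_significant

        print(posterior_true(prior=0.09))   # about 0.61: far from settled
        print(posterior_true(prior=0.50))   # about 0.94: a strong prior helps a lot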

  18. Reproducibility of regional brain metabolic responses to lorazepam

    SciTech Connect

    Wang, G.J.; Volkow, N.D.; Overall, J.

    1996-10-01

    Changes in regional brain glucose metabolism in response to benzodiazepine agonists have been used as indicators of benzodiazepine-GABA receptor function. The purpose of this study was to assess the reproducibility of these responses. Sixteen healthy right-handed men underwent scanning with PET and [18F]fluorodeoxyglucose (FDG) twice: before placebo and before lorazepam (30 μg/kg). The same double FDG procedure was repeated 6-8 wk later on the men to assess test-retest reproducibility. The regional absolute brain metabolic values obtained during the second evaluation were significantly lower than those obtained from the first evaluation regardless of condition (p ≤ 0.001). Lorazepam significantly and consistently decreased whole-brain metabolism; the magnitude and regional pattern of the changes were comparable for both studies (12.3% ± 6.9% and 13.7% ± 7.4%). Lorazepam effects were the largest in the thalamus (22.2% ± 8.6% and 22.4% ± 6.9%) and occipital cortex (19% ± 8.9% and 21.8% ± 8.9%). Relative metabolic measures were highly reproducible for both the pharmacologic and the replication condition. This study measured the test-retest reproducibility of regional brain metabolic responses, and although the global and regional metabolic values were significantly lower for the repeated evaluation, the response to lorazepam was highly reproducible. 1613 refs., 3 figs., 3 tabs.

  19. Using prediction markets to estimate the reproducibility of scientific research

    PubMed Central

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A.; Johannesson, Magnus

    2015-01-01

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants’ individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a “statistically significant” finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications. PMID:26553988

  20. Self-reproducing systems: structure, niche relations and evolution.

    PubMed

    Sharov, A A

    1991-01-01

    A formal definition of a self-reproducing system is proposed using Petri nets. A potential self-reproducing system is a set of places in the Petri net such that the number of tokens in each place increases due to some sequence of internal transitions (a transition is called internal to the marked subset of places if at least one of its starting places and one of its terminating places belongs to that subset). An actual self-reproducing system is a system that compensates the outflow of its components by reproduction. In a suitable environment every potential self-reproducing system becomes an actual one. Each Petri net can be considered as an ecosystem with the web of ecological niches bound together with trophic and other relations. The stationary dynamics of the ecosystem is characterized by the set of filled niches. The process of evolution is described in terms of niche composition change. Perspectives of the theory of self-reproducing systems in biology are discussed.
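
    The two definitions above (an "internal" transition, and a potential self-reproducing subset of places) translate almost directly into code. The sketch below uses a deliberately naive bounded search over firing sequences and a toy two-transition net; the representation, the search bound and the example net are illustrative simplifications, not the paper's formalism.

        # Naive illustration of the Petri-net definitions above (not the paper's code).
        # A transition maps each input/output place to a multiplicity.
        from itertools import product

        TRANSITIONS = {
            "grow": ({"A": 1}, {"A": 2}),             # A turns one token of itself into two
            "feed": ({"food": 1, "A": 1}, {"A": 2}),  # A reproduces by consuming food
        }

        def is_internal(t, subset):
            """Internal: at least one input place and one output place lie in the subset."""
            ins, outs = TRANSITIONS[t]
            return any(p in subset for p in ins) and any(p in subset for p in outs)

        def fire(marking, t):
            """Return the marking after firing t, or None if t is not enabled."""
            ins, outs = TRANSITIONS[t]
            if any(marking.get(p, 0) < n for p, n in ins.items()):
                return None
            new = dict(marking)
            for p, n in ins.items():
                new[p] -= n
            for p, n in outs.items():
                new[p] = new.get(p, 0) + n
            return new

        def potentially_self_reproducing(subset, marking, max_len=4):
            """Bounded check: does some sequence of internal transitions (length <= max_len)
            strictly increase the token count of every place in the subset?"""
            internal = [t for t in TRANSITIONS if is_internal(t, subset)]
            for length in range(1, max_len + 1):
                for seq in product(internal, repeat=length):
                    m = marking
                    for t in seq:
                        m = fire(m, t)
                        if m is None:
                            break
                    if m is not None and all(m.get(p, 0) > marking.get(p, 0) for p in subset):
                        return True
            return False

        print(potentially_self_reproducing({"A"}, {"A": 1, "food": 3}))      # True: {A} can grow
        print(potentially_self_reproducing({"food"}, {"A": 1, "food": 3}))   # False: food is only consumed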

  1. Voxel size dependency, reproducibility and sensitivity of an in vivo bone loading estimation algorithm.

    PubMed

    Christen, Patrik; Schulte, Friederike A; Zwahlen, Alexander; van Rietbergen, Bert; Boutroy, Stephanie; Melton, L Joseph; Amin, Shreyasee; Khosla, Sundeep; Goldhahn, Jörg; Müller, Ralph

    2016-01-01

    A bone loading estimation algorithm was previously developed that provides in vivo loading conditions required for in vivo bone remodelling simulations. The algorithm derives a bone's loading history from its microstructure as assessed by high-resolution (HR) computed tomography (CT). This reverse engineering approach showed accurate and realistic results based on micro-CT and HR-peripheral quantitative CT images. However, its voxel size dependency, reproducibility and sensitivity still need to be investigated, which is the purpose of this study. Voxel size dependency was tested on cadaveric distal radii with micro-CT images scanned at 25 µm and downscaled to 50, 61, 75, 82, 100, 125 and 150 µm. Reproducibility was calculated with repeated in vitro as well as in vivo HR-pQCT measurements at 82 µm. Sensitivity was defined using HR-pQCT images from women with fracture versus non-fracture, and low versus high bone volume fraction, expecting similar and different loading histories, respectively. Our results indicate that the algorithm is voxel size independent within an average (maximum) error of 8.2% (32.9%) at 61 µm, but that the dependency increases considerably at voxel sizes bigger than 82 µm. In vitro and in vivo reproducibility are up to 4.5% and 10.2%, respectively, which is comparable to other in vitro studies and slightly higher than in other in vivo studies. Subjects with different bone volume fraction were clearly distinguished but not subjects with and without fracture. This is in agreement with bone adapting to customary loading but not to fall loads. We conclude that the in vivo bone loading estimation algorithm provides reproducible, sensitive and fairly voxel size independent results at up to 82 µm, but that smaller voxel sizes would be advantageous.

  2. Benchmarking contactless acquisition sensor reproducibility for latent fingerprint trace evidence

    NASA Astrophysics Data System (ADS)

    Hildebrandt, Mario; Dittmann, Jana

    2015-03-01

    Optical, nano-meter range, contactless, non-destructive sensor devices are promising acquisition techniques in crime scene trace forensics, e.g. for digitizing latent fingerprint traces. Before new approaches are introduced in crime investigations, innovations need to be positively tested and quality ensured. In this paper we investigate sensor reproducibility by studying different scans from four sensors: two chromatic white light sensors (CWL600/CWL1mm), one confocal laser scanning microscope, and one NIR/VIS/UV reflection spectrometer. First, we perform intra-sensor reproducibility testing for the CWL600 with a privacy-conform test set of artificial-sweat printed, computer-generated fingerprints. We use 24 different fingerprint patterns as original samples (printing samples/templates) for printing with artificial sweat (physical trace samples) and their acquisition with the contactless sensors, resulting in 96 sensor images, called scans or acquired samples. The second test set, for inter-sensor reproducibility assessment, consists of the first three patterns from the first test set, acquired in two consecutive scans using each device. We suggest using a simple set of spatial- and frequency-domain features known from signal processing and test its suitability with six different classifiers that classify scan data into small differences (reproducible) and large differences (non-reproducible). Furthermore, we suggest comparing the classification results with biometric verification scores (calculated with NBIS, with a threshold of 40) as a biometric reproducibility score. The Bagging classifier is in nearly all cases the most reliable classifier in our experiments, and the results are also confirmed by the biometric matching rates.

  3. Language-Agnostic Reproducible Data Analysis Using Literate Programming

    PubMed Central

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps, and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to presenting computer programs to human readers. The code is rearranged to follow the logic of the program, and that logic is explained in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators in the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is open-source software available at github.com/borisvassilev/lir. PMID:27711123

  4. Accurate Evaluation of Ion Conductivity of the Gramicidin A Channel Using a Polarizable Force Field without Any Corrections.

    PubMed

    Peng, Xiangda; Zhang, Yuebin; Chu, Huiying; Li, Yan; Zhang, Dinglin; Cao, Liaoran; Li, Guohui

    2016-06-14

    Classical molecular dynamics (MD) simulation of membrane proteins faces significant challenges in accurately reproducing and predicting experimental observables such as ion conductance and permeability, due to its inability to precisely describe the electronic interactions in heterogeneous systems. In this work, the free energy profiles of K(+) and Na(+) permeating through the gramicidin A channel are characterized by using the AMOEBA polarizable force field with a total sampling time of 1 μs. Our results indicated that by explicitly introducing the multipole terms and polarization into the electrostatic potentials, the permeation free energy barrier of K(+) through the gA channel is considerably reduced compared to the overestimated results obtained from the fixed-charge model. Moreover, the estimated maximum conductances, without any corrections, for both K(+) and Na(+) passing through the gA channel are much closer to the experimental results than those from any classical MD simulations, demonstrating the power of AMOEBA in investigating membrane proteins. PMID:27171823

  5. Next-generation sequencing data interpretation: enhancing reproducibility and accessibility.

    PubMed

    Nekrutenko, Anton; Taylor, James

    2012-09-01

    Areas of life sciences research that were previously distant from each other in ideology, analysis practices and toolkits, such as microbial ecology and personalized medicine, have all embraced techniques that rely on next-generation sequencing instruments. Yet the capacity to generate the data greatly outpaces our ability to analyse it. Existing sequencing technologies are more mature and accessible than the methodologies that are available for individual researchers to move, store, analyse and present data in a fashion that is transparent and reproducible. Here we discuss currently pressing issues with analysis, interpretation, reproducibility and accessibility of these data, and we present promising solutions and venture into potential future developments.

  6. Respiratory effort correction strategies to improve the reproducibility of lung expansion measurements

    SciTech Connect

    Du, Kaifang; Reinhardt, Joseph M.; Christensen, Gary E.; Ding, Kai; Bayouth, John E.

    2013-12-15

    correlated with respiratory effort difference (R = 0.744 for ELV in the cohort with tidal volume difference greater than 100 cc). In general for all subjects, global normalization, ETV and ELV significantly improved reproducibility compared to no effort correction (p = 0.009, 0.002, 0.005 respectively). When tidal volume difference was small (less than 100 cc), none of the three effort correction strategies improved reproducibility significantly (p = 0.52, 0.46, 0.46 respectively). For the cohort (N = 13) with tidal volume difference greater than 100 cc, the average gamma pass rate improves from 57.3% before correction to 66.3% after global normalization, and 76.3% after ELV. ELV was found to be significantly better than global normalization (p = 0.04 for all subjects, and p = 0.003 for the cohort with tidal volume difference greater than 100 cc).Conclusions: All effort correction strategies improve the reproducibility of the authors' pulmonary ventilation measures, and the improvement of reproducibility is highly correlated with the changes in respiratory effort. ELV gives better results as effort difference increase, followed by ETV, then global. However, based on the spatial and temporal heterogeneity in the lung expansion rate, a single scaling factor (e.g., global normalization) appears to be less accurate to correct the ventilation map when changes in respiratory effort are large.

  7. Accurate and Efficient Resolution of Overlapping Isotopic Envelopes in Protein Tandem Mass Spectra

    PubMed Central

    Xiao, Kaijie; Yu, Fan; Fang, Houqin; Xue, Bingbing; Liu, Yan; Tian, Zhixin

    2015-01-01

    It has long been an analytical challenge to accurately and efficiently resolve extremely dense overlapping isotopic envelopes (OIEs) in protein tandem mass spectra to confidently identify proteins. Here, we report a computationally efficient method, called OIE_CARE, to resolve OIEs by calculating the relative deviation between the ideal and observed experimental abundance. In the OIE_CARE method, the ideal experimental abundance of a particular overlapping isotopic peak (OIP) is first calculated for all the OIEs sharing this OIP. The relative deviation (RD) of the overall observed experimental abundance of this OIP relative to the summed ideal value is then calculated. The final individual abundance of the OIP for each OIE is the individual ideal experimental abundance multiplied by 1 + RD. Initial studies were performed using higher-energy collisional dissociation tandem mass spectra on myoglobin (with direct infusion) and the intact E. coli proteome (with liquid chromatographic separation). Comprehensive data at the protein and proteome levels, high confidence and good reproducibility were achieved. The resolving method reported here can, in principle, be extended to resolve any envelope-type overlapping data for which the corresponding theoretical reference values are available. PMID:26439836
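
    The redistribution rule described above can be transcribed directly: for one overlapping isotopic peak (OIP) the relative deviation of the observed abundance from the summed ideal abundances is computed once, and each envelope's share is its ideal abundance scaled by 1 + RD. The snippet below is that transcription with invented numbers; it is not the OIE_CARE implementation itself.

        # Direct transcription of the abundance-redistribution rule in the abstract,
        # applied to a single overlapping isotopic peak. Numbers are invented.
        def resolve_overlapping_peak(ideal_abundances, observed_abundance):
            """Split an observed OIP abundance among the envelopes that share it.

            ideal_abundances   -- ideal experimental abundance of this OIP for each envelope
            observed_abundance -- overall observed experimental abundance of the OIP
            """
            summed_ideal = sum(ideal_abundances)
            rd = (observed_abundance - summed_ideal) / summed_ideal   # relative deviation
            return [a * (1.0 + rd) for a in ideal_abundances]

        ideal = [120.0, 60.0, 20.0]   # three envelopes sharing the same peak
        observed = 220.0              # 10% above the summed ideal abundance
        print(resolve_overlapping_peak(ideal, observed))   # each share scaled by 1.10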

  8. Rainfall variability and extremes over southern Africa: Assessment of a climate model to reproduce daily extremes

    NASA Astrophysics Data System (ADS)

    Williams, C. J. R.; Kniveton, D. R.; Layberry, R.

    2009-04-01

    It is increasingly accepted that any possible climate change will not only have an influence on mean climate but may also significantly alter climatic variability. A change in the distribution and magnitude of extreme rainfall events (associated with changing variability), such as droughts or flooding, may have a far greater impact on human and natural systems than a changing mean. This issue is of particular importance for environmentally vulnerable regions such as southern Africa. The subcontinent is considered especially vulnerable to and ill-equipped (in terms of adaptation) for extreme events, due to a number of factors including extensive poverty, famine, disease and political instability. Rainfall variability and the identification of rainfall extremes are functions of scale, so high spatial and temporal resolution data are preferred to identify extreme events and accurately predict future variability. The majority of previous climate model verification studies have compared model output with observational data at monthly timescales. In this research, the ability of a state-of-the-art climate model to simulate climate at daily timescales is assessed using satellite-derived rainfall data from the Microwave Infra-Red Algorithm (MIRA). This dataset covers the period 1993-2002 and the whole of southern Africa at a spatial resolution of 0.1 degree longitude/latitude. The ability of a climate model to simulate current climate provides some indication of how much confidence can be placed in its future predictions. In this paper, simulations of current climate from the UK Meteorological Office Hadley Centre's climate model, in both regional and global mode, are firstly compared to the MIRA dataset at daily timescales. This comparison concentrates primarily on the ability of the model to simulate the spatial and temporal patterns of rainfall variability over southern Africa. Secondly, the ability of the model to reproduce daily rainfall extremes will
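
    One simple way to compare daily rainfall extremes between a model and a gridded satellite product is to compute, at every grid cell, a high wet-day percentile for each dataset and map the difference. The sketch below does this on synthetic fields; the 1 mm/day wet-day threshold, the 95th percentile and the array shapes are common but arbitrary choices, not the verification protocol of this study.

        # Illustrative gridpoint comparison of daily rainfall extremes between a model
        # and a satellite product on a shared (time, lat, lon) grid. Synthetic data.
        import numpy as np

        def wet_day_p95(daily_rain, wet_threshold=1.0):
            """95th percentile of wet-day rainfall at each grid cell."""
            wet = np.where(daily_rain >= wet_threshold, daily_rain, np.nan)
            return np.nanpercentile(wet, 95, axis=0)

        rng = np.random.default_rng(3)
        model = rng.gamma(shape=0.4, scale=8.0, size=(3650, 20, 20))      # ten years of synthetic daily rain
        satellite = rng.gamma(shape=0.5, scale=6.0, size=(3650, 20, 20))  # stand-in for the satellite grid

        bias = wet_day_p95(model) - wet_day_p95(satellite)
        print(f"domain-mean extreme-rainfall bias: {np.nanmean(bias):+.1f} mm/day")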

  9. Ruggedness and reproducibility of the MBEC biofilm disinfectant efficacy test.

    PubMed

    Parker, A E; Walker, D K; Goeres, D M; Allan, N; Olson, M E; Omar, A

    2014-07-01

    The MBEC™ Physiology & Genetics Assay recently became the first approved ASTM standardized biofilm disinfectant efficacy test method. This report summarizes the results of the standardization process using Pseudomonas aeruginosa biofilms. Initial ruggedness testing of the MBEC method suggests that the assay is rugged (i.e., insensitive) to small changes to the protocol with respect to 4 factors: incubation time of the bacteria (when varied from 16 to 18h), treatment temperature (20-24°C), sonication duration (25-35min), and sonication power (130-480W). In order to assess the repeatability of MBEC results across multiple tests in the same laboratory and the reproducibility across multiple labs, an 8-lab study was conducted in which 8 concentrations of each of 3 disinfectants (a non-chlorine oxidizer, a phenolic, and a quaternary ammonium compound) were applied to biofilms using the MBEC method. The repeatability and reproducibility of the untreated control biofilms were acceptable, as indicated by small repeatability and reproducibility standard deviations (SD) (0.33 and 0.67 log10(CFU/mm(2)), respectively). The repeatability SDs of the biofilm log reductions after application of the 24 concentration and disinfectant combinations ranged from 0.22 to 1.61, and the reproducibility SDs ranged from 0.27 to 1.70. In addition, for each of the 3 disinfectant types considered, the assay was statistically significantly responsive to the increasing treatment concentrations.
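
    For readers who want to reproduce this kind of analysis, repeatability and reproducibility SDs of the sort quoted above are conventionally obtained from one-way variance components across laboratories (the ISO 5725 approach). The sketch below is a generic illustration under that assumption; the data layout and function name are hypothetical, and this is not the study's actual analysis code.

      import numpy as np

      def repeatability_reproducibility_sd(results_by_lab):
          """One-way random-effects variance components for a multi-lab study.
          results_by_lab: one array per lab of repeated responses, e.g.
          log10(CFU/mm^2) of control biofilms or log reductions."""
          k = len(results_by_lab)
          n = np.array([len(x) for x in results_by_lab], dtype=float)
          means = np.array([np.mean(x) for x in results_by_lab])
          grand = np.sum(n * means) / np.sum(n)
          ss_within = sum(np.sum((np.asarray(x) - m) ** 2)
                          for x, m in zip(results_by_lab, means))
          ms_within = ss_within / (np.sum(n) - k)                 # within-lab variance
          ms_between = np.sum(n * (means - grand) ** 2) / (k - 1)
          n0 = (np.sum(n) - np.sum(n ** 2) / np.sum(n)) / (k - 1) # effective group size
          var_between = max((ms_between - ms_within) / n0, 0.0)
          s_r = np.sqrt(ms_within)                 # repeatability SD
          s_R = np.sqrt(ms_within + var_between)   # reproducibility SD
          return s_r, s_R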

  10. The reproducibility of dipping status: beyond the cutoff points.

    PubMed

    Chaves, Hilton; Campello de Souza, Fernando Menezes; Krieger, Eduardo Moacyr

    2005-08-01

    A limited reproducibility has been ascribed to 24-h ambulatory blood pressure monitoring, especially in relation to the dipper and nondipper phenomena. This study examined the reproducibility of 24-h ambulatory blood pressure monitoring in three recordings of pressure at intervals of 8-15 days in 101 study participants (73% treated hypertensive patients) residing in the city of Recife, Pernambuco, Brazil. SpaceLabs 90207 monitors were used, and the minimum number of valid measurements was 80. No significant differences were found between the mean systolic and diastolic pressures, between the second and third recordings when the normotensive and hypertensive patients were assessed jointly (P=0.44). Likewise, no significant differences were present when the normotensive patients were analyzed separately (P=0.96). In the hypertensive group, a significant difference existed between only the first and second ambulatory blood pressure readings (135.1 vs. 132.9 mmHg, respectively; P=0.0005). Regarding declines in pressure during sleep, no significant differences occurred when continuous percentage values were considered (P=0.27). The values obtained from 24-h ambulatory blood pressure monitoring are reproducible when tested at intervals of 8-15 days. Small differences, when significantly present, always involved the first ambulatory blood pressure monitoring. The reproducibility of the dipper and nondipper patterns is of greater complexity because it considers cutoff points rather than continuous ones to characterize these states.

  11. Artificially reproduced image of earth photographed by UV camera

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A reproduction of a color enhancement of a picture photographed in far-ultraviolet light by Astronaut John W. Young, Apollo 16 commander, showing the Earth. Note this is an artificially reproduced image. The three auroral belts, the sunlit atmosphere and background stars are visible.

  12. Latin America Today: An Atlas of Reproducible Pages. Revised Edition.

    ERIC Educational Resources Information Center

    World Eagle, Inc., Wellesley, MA.

    This document contains reproducible maps, charts and graphs of Latin America for use by teachers and students. The maps are divided into five categories (1) the land; (2) peoples, countries, cities, and governments; (3) the national economies, product, trade, agriculture, and resources; (4) energy, education, employment, illicit drugs, consumer…

  13. Reproducibility of polycarbonate reference material in toxicity evaluation

    NASA Technical Reports Server (NTRS)

    Hilado, C. J.; Huttlinger, P. A.

    1981-01-01

    A specific lot of bisphenol A polycarbonate has been used for almost four years as the reference material for the NASA-USF-PSC toxicity screening test method. The reproducibility of the test results over this period of time indicates that certain plastics may be more suitable reference materials than the more traditional cellulosic materials.

  14. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    46 CFR 56.30-3, Piping joints (reproduces 110): Shipping; Coast Guard, Department of Homeland Security (continued); Marine Engineering, Piping Systems. The excerpted text concerns joint tightness, mechanical strength and the nature of the fluid handled.

  15. Slide rule-type color chart predicts reproduced photo tones

    NASA Technical Reports Server (NTRS)

    Griffin, J. D.

    1966-01-01

    Slide rule-type color chart determines the final reproduced gray tones in the production of briefing charts that are photographed in black and white. The chart shows both the color, by drafting paint manufacturer's name and mixture number, and the gray tone resulting from black and white photographic reproduction.

  17. Reproducibility of topographic measures of the glaucomatous optic nerve head

    PubMed Central

    Geyer, O; Michaeli-Cohen, A; Silver, D; Versano, D; Neudorfer, M; Dzhanov, R; Lazar, M

    1998-01-01

    AIMS/BACKGROUND—Laser scanning tomography provides an assessment of three dimensional optic disc topography. For the clinical purpose of follow up of glaucoma patients, the repeatability of the various measured variables is essential. In the present study, the reproducibility of morphometric variables calculated by the topographic scanning system, TopSS (Laser Diagnostic Technology, San Diego, CA) was investigated.
METHODS—Two independent measurements (30 minutes apart) each consisting of three complete images of the optic disc were performed on 16 eyes of 16 glaucoma patients using a TopSS. The instrument calculates 14 morphometric variables for the characterisation of the optic disc.
RESULTS—From the two-tailed paired tests, all variables were seen to have good reproducibility. However, correlation and regression analyses showed that only the three variables, volume below, half depth area, and average cup depth, are acceptably reproducible.
CONCLUSION—The TopSS provides three variables which describe the physiological shape of the optic disc that have high reproducibility. These three variables might be useful for following the progression of optic disc changes in glaucoma patients.

 Keywords: optic nerve head; scanning laser; glaucoma; tomography PMID:9536873

  18. The United States Today: An Atlas of Reproducible Pages.

    ERIC Educational Resources Information Center

    World Eagle, Inc., Wellesley, MA.

    Black and white maps, graphs and tables that may be reproduced are presented in this volume focusing on the United States. Some of the features of the United States depicted are: size, population, agriculture and resources, manufactures, trade, citizenship, employment, income, poverty, the federal budget, energy, health, education, crime, and the…

  19. Reproducibility of Tactile Assessments for Children with Unilateral Cerebral Palsy

    ERIC Educational Resources Information Center

    Auld, Megan Louise; Ware, Robert S.; Boyd, Roslyn Nancy; Moseley, G. Lorimer; Johnston, Leanne Marie

    2012-01-01

    A systematic review identified tactile assessments used in children with cerebral palsy (CP), but their reproducibility is unknown. Sixteen children with unilateral CP and 31 typically developing children (TDC) were assessed 2-4 weeks apart. Test-retest percent agreements within one point for children with unilateral CP (and TDC) were…

  20. Direct computation of parameters for accurate polarizable force fields

    SciTech Connect

    Verstraelen, Toon; Vandenbrande, Steven; Ayers, Paul W.

    2014-11-21

    We present an improved electronic linear response model to incorporate polarization and charge-transfer effects in polarizable force fields. This model is a generalization of the Atom-Condensed Kohn-Sham Density Functional Theory (DFT), approximated to second order (ACKS2): it can now be defined with any underlying variational theory (in addition to KS-DFT) and it can include atomic multipoles and off-center basis functions. Parameters in this model are computed efficiently as expectation values of an electronic wavefunction, obviating the need for their calibration, regularization, and manual tuning. In the limit of a complete density and potential basis set in the ACKS2 model, the linear response properties of the underlying theory for a given molecular geometry are reproduced exactly. A numerical validation with a test set of 110 molecules shows that very accurate models can already be obtained with fluctuating charges and dipoles. These features greatly facilitate the development of polarizable force fields.

  1. Accurate Determination of Conformational Transitions in Oligomeric Membrane Proteins

    PubMed Central

    Sanz-Hernández, Máximo; Vostrikov, Vitaly V.; Veglia, Gianluigi; De Simone, Alfonso

    2016-01-01

    The structural dynamics governing collective motions in oligomeric membrane proteins play key roles in vital biomolecular processes at cellular membranes. In this study, we present a structural refinement approach that combines solid-state NMR experiments and molecular simulations to accurately describe concerted conformational transitions identifying the overall structural, dynamical, and topological states of oligomeric membrane proteins. The accuracy of the structural ensembles generated with this method is shown to reach the statistical error limit, and is further demonstrated by correctly reproducing orthogonal NMR data. We demonstrate the accuracy of this approach by characterising the pentameric state of phospholamban, a key player in the regulation of calcium uptake in the sarcoplasmic reticulum, and by probing its dynamical activation upon phosphorylation. Our results underline the importance of using an ensemble approach to characterise the conformational transitions that are often responsible for the biological function of oligomeric membrane protein states. PMID:26975211

  2. Calculation of Accurate Hexagonal Discontinuity Factors for PARCS

    SciTech Connect

    Pounders, J.; Bandini, B. R.; Xu, Y.; Downar, T. J.

    2007-11-01

    In this study we derive a methodology for calculating discontinuity factors consistent with the Triangle-based Polynomial Expansion Nodal (TPEN) method implemented in PARCS for hexagonal reactor geometries. The accuracy of coarse-mesh nodal methods is greatly enhanced by permitting flux discontinuities at node boundaries, but the practice of calculating discontinuity factors from infinite-medium (zero-current) single bundle calculations may not be sufficiently accurate for more challenging problems in which there is a large amount of internodal neutron streaming. The authors therefore derive a TPEN-based method for calculating discontinuity factors that are exact with respect to generalized equivalence theory. The method is validated by reproducing the reference solution for a small hexagonal core.

  3. Reproducibility of Acetabular Landmarks and a Standardized Coordinate System Obtained from 3D Hip Ultrasound.

    PubMed

    Mabee, Myles; Dulai, Sukhdeep; Thompson, Richard B; Jaremko, Jacob L

    2015-10-01

    Two-dimensional (2D) ultrasound detection of developmental dysplasia of the hip (DDH) is limited by variation in acetabular appearance and alpha angle measurements, which change with position of the ultrasound probe. Three-dimensional (3D) ultrasound captures the entire acetabular shape, and a reproducible "standard central plane" may be generated, from two landmarks located on opposite ends of the acetabulum, for measurement of alpha angle and other indices. Two users identified landmarks on 51 3D ultrasounds spanning a range of disease severity, and inter- and intra-observer reproducibility of landmark and "standard plane" locations was compared; landmarks were chosen within 2 mm, and the "standard plane" rotation was reproducible within 10° between observers. We observed no difference in variability between alpha angles measured on the "standard plane" in comparison with 2D ultrasound. Applications of the standardized 3D ultrasound central plane will be to fuse serial ultrasounds for follow-up and development of new indices of 3D deformity. PMID:25394808

  4. Accurate masses for dispersion-supported galaxies

    NASA Astrophysics Data System (ADS)

    Wolf, Joe; Martinez, Gregory D.; Bullock, James S.; Kaplinghat, Manoj; Geha, Marla; Muñoz, Ricardo R.; Simon, Joshua D.; Avedo, Frank F.

    2010-08-01

    We derive an accurate mass estimator for dispersion-supported stellar systems and demonstrate its validity by analysing resolved line-of-sight velocity data for globular clusters, dwarf galaxies and elliptical galaxies. Specifically, by manipulating the spherical Jeans equation we show that the mass enclosed within the 3D deprojected half-light radius r_1/2 can be determined with only mild assumptions about the spatial variation of the stellar velocity dispersion anisotropy as long as the projected velocity dispersion profile is fairly flat near the half-light radius, as is typically observed. We find M_1/2 = 3 G^-1 ⟨σ_los^2⟩ r_1/2 ≈ 4 G^-1 ⟨σ_los^2⟩ R_e, where ⟨σ_los^2⟩ is the luminosity-weighted square of the line-of-sight velocity dispersion and R_e is the 2D projected half-light radius. While deceptively familiar in form, this formula is not the virial theorem, which cannot be used to determine accurate masses unless the radial profile of the total mass is known a priori. We utilize this finding to show that all of the Milky Way dwarf spheroidal galaxies (MW dSphs) are consistent with having formed within a halo of a mass of approximately 3 × 10^9 M_solar, assuming a Λ cold dark matter cosmology. The faintest MW dSphs seem to have formed in dark matter haloes that are at least as massive as those of the brightest MW dSphs, despite the almost five orders of magnitude spread in luminosity between them. We expand our analysis to the full range of observed dispersion-supported stellar systems and examine their dynamical I-band mass-to-light ratios Υ^I_1/2. The Υ^I_1/2 versus M_1/2 relation for dispersion-supported galaxies follows a U shape, with a broad minimum near Υ^I_1/2 ≈ 3 that spans dwarf elliptical galaxies to normal ellipticals, a steep rise to Υ^I_1/2 ≈ 3200 for ultra-faint dSphs and a more shallow rise to Υ^I_1/2 ≈ 800 for galaxy cluster spheroids.
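
    As a quick numerical illustration of the estimator quoted above, the sketch below evaluates M_1/2 = 3 G^-1 ⟨σ_los^2⟩ r_1/2 in solar masses. The example dispersion and radius are rough, hypothetical values for a classical dwarf spheroidal, not figures taken from the paper.

      G = 6.674e-11        # m^3 kg^-1 s^-2
      M_SUN = 1.989e30     # kg
      PC = 3.0857e16       # m

      def wolf_mass(sigma_los_km_s, r_half_3d_pc):
          """M_1/2 = 3 <sigma_los^2> r_1/2 / G, returned in solar masses."""
          sigma = sigma_los_km_s * 1e3            # m/s
          r = r_half_3d_pc * PC                   # m
          return 3.0 * sigma ** 2 * r / G / M_SUN

      # e.g. sigma_los ~ 9 km/s and r_1/2 ~ 300 pc gives roughly 2e7 Msun
      print(f"M_1/2 ~ {wolf_mass(9.0, 300.0):.2e} Msun")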

  5. Tract Specific Reproducibility of Tractography Based Morphology and Diffusion Metrics

    PubMed Central

    Besseling, René M. H.; Jansen, Jacobus F. A.; Overvliet, Geke M.; Vaessen, Maarten J.; Braakman, Hilde M. H.; Hofman, Paul A. M.; Aldenkamp, Albert P.; Backes, Walter H.

    2012-01-01

    Introduction: The reproducibility of tractography is important to determine its sensitivity to pathological abnormalities. The reproducibility of tract morphology has not yet been systematically studied and the recently developed tractography contrast Tract Density Imaging (TDI) has not yet been assessed at the tract-specific level. Materials and Methods: Diffusion tensor imaging (DTI) and probabilistic constrained spherical deconvolution (CSD) tractography are performed twice in 9 healthy subjects. Tractography is based on common space seed and target regions and performed for several major white matter tracts. Tractograms are converted to tract segmentations and inter-session reproducibility of tract morphology is assessed using the Dice similarity coefficient (DSC). The coefficient of variation (COV) and intraclass correlation coefficient (ICC) are calculated for the following tract metrics: fractional anisotropy (FA), apparent diffusion coefficient (ADC), volume, and TDI. Analyses are performed both for proximal (deep white matter) and extended (including subcortical white matter) tract segmentations. Results: Proximal DSC values were 0.70–0.92. DSC values were 5–10% lower in extended compared to proximal segmentations. COV/ICC values of FA, ADC, volume and TDI were 1–4%/0.65–0.94, 2–4%/0.62–0.94, 3–22%/0.53–0.96 and 8–31%/0.48–0.70, respectively, with the lower COV and higher ICC values found in the proximal segmentations. Conclusion: For all investigated metrics, reproducibility depended on the segmented tract. FA and ADC had relatively low COV and relatively high ICC, indicating clinical potential. Volume had higher COV but its moderate to high ICC values in most tracts still suggest subject-differentiating power. Tract TDI had high COV and relatively low ICC, which reflects unfavorable reproducibility. PMID:22485157
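
    The Dice similarity coefficient used above to compare tract segmentations is easy to compute from two binary masks; here is a minimal sketch (the array layout is assumed, and this is not the authors' pipeline).

      import numpy as np

      def dice_coefficient(mask_a, mask_b):
          """DSC = 2*|A ∩ B| / (|A| + |B|); 1 means identical masks, 0 no overlap."""
          a = np.asarray(mask_a, dtype=bool)
          b = np.asarray(mask_b, dtype=bool)
          denom = a.sum() + b.sum()
          return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

      # Two hypothetical session segmentations of the same tract.
      session1 = np.array([1, 1, 1, 0, 0, 1])
      session2 = np.array([1, 1, 0, 0, 1, 1])
      print(dice_coefficient(session1, session2))  # 0.75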

  6. Reproducibility of dual-photon absorptiometry using a clinical phantom

    SciTech Connect

    DaCosta, M.; DeLaney, M.; Goldsmith, S.J.

    1985-05-01

    The use of dual-photon absorptiometry (DPA) bone mineral density (BMD) measurements for the diagnosis of osteoporosis and for monitoring therapy has been established. The objective of this study is to determine the reproducibility of DPA measurements. A phantom was constructed using a section of human bony pelvis and lumbo-sacral spine. Provisions were made to mimic changes in patient girth. To evaluate the DPA reproducibility within a single day, 12 consecutive studies were performed on the phantom using standard acquisition and processing procedures. The mean BMD ± 1 SD in g/cm^2 of lumbar vertebrae 2-4 was 0.771 ± 0.007, with a 0.97% coefficient of variation (1 SD) (CV). This evaluation was repeated 7 times over the next 4 months with the performance of 3 to 6 studies each time; the maximum CV found was 1.93%. In order to evaluate the DPA reproducibility with time, phantom studies were performed over a 7 month period which included a 153-Gd source change. The mean BMD was 0.770 ± 0.017 with a 2.15% CV. DPA reproducibility with patient girth changes was evaluated by performing the phantom studies at water depths of 12.5, 17.0 and 20.0 cm. Five studies at each depth were performed using standard acquisition and processing procedures. The mean BMD was 0.779 ± 0.012 with a 1.151% CV. Based on these results, BMD measurements by DPA are reproducible within 2%. This reliability is maintained for studies performed over an extended period of time and is independent of changes in patient girth.

  7. A mechanistic approach for accurate simulation of village scale malaria transmission

    PubMed Central

    Bomblies, Arne; Duchemin, Jean-Bernard; Eltahir, Elfatih AB

    2009-01-01

    Background: Malaria transmission models commonly incorporate spatial environmental and climate variability for making regional predictions of disease risk. However, a mismatch of these models' typical spatial resolutions and the characteristic scale of malaria vector population dynamics may confound disease risk predictions in areas of high spatial hydrological variability such as the Sahel region of Africa. Methods: Field observations spanning two years from two Niger villages are compared. The two villages are separated by only 30 km but exhibit a ten-fold difference in anopheles mosquito density. These two villages would be covered by a single grid cell in many malaria models, yet their entomological activity differs greatly. Environmental conditions and associated entomological activity are simulated at high spatial and temporal resolution using a mechanistic approach that couples a distributed hydrology scheme and an entomological model. Model results are compared to regular field observations of Anopheles gambiae sensu lato mosquito populations and local hydrology. The model resolves the formation and persistence of individual pools that facilitate mosquito breeding and predicts spatio-temporal mosquito population variability at high resolution using an agent-based modeling approach. Results: Observations of soil moisture, pool size, and pool persistence are reproduced by the model. The resulting breeding of mosquitoes in the simulated pools yields time-integrated seasonal mosquito population dynamics that closely follow observations from captured mosquito abundance. Interannual difference in mosquito abundance is simulated, and the inter-village difference in mosquito population is reproduced for two years of observations. These modeling results emulate the known focal nature of malaria in Niger Sahel villages. Conclusion: Hydrological variability must be represented at high spatial and temporal resolution to achieve accurate predictive ability of malaria risk

  8. Assessments of endothelial function and arterial stiffness are reproducible in patients with COPD

    PubMed Central

    Rodriguez-Miguelez, Paula; Seigler, Nichole; Bass, Leon; Dillard, Thomas A; Harris, Ryan A

    2015-01-01

    Background: Elevated cardiovascular disease risk is observed in patients with COPD. Non-invasive assessments of endothelial dysfunction and arterial stiffness have recently emerged to provide mechanistic insight into cardiovascular disease risk in COPD; however, the reproducibility of endothelial function and arterial stiffness has yet to be investigated in this patient population. Objectives: This study sought to examine the within-day and between-day reproducibility of endothelial function and arterial stiffness in patients with COPD. Methods: Baseline diameter, peak diameter, flow-mediated dilation, augmentation index, augmentation index at 75 beats per minute, and pulse wave velocity were assessed three times in 17 patients with COPD (six males, eleven females, age range 47–75 years old; forced expiratory volume in 1 second = 51.5% predicted). Sessions A and B were separated by 3 hours (within-day), whereas session C was conducted at least 7 days following session B (between-day). Reproducibility was assessed by: 1) paired t-tests, 2) coefficients of variation, 3) coefficients of variation prime, 4) intra-class correlation coefficient, 5) Pearson’s correlations (r), and 6) Bland–Altman plots. Five acceptable assessments were required to confirm reproducibility. Results: Six out of six within-day criteria were met for endothelial function and arterial stiffness outcomes. Six out of six between-day criteria were met for baseline and peak diameter, augmentation index and pulse wave velocity, whereas five out of six criteria were met for flow-mediated dilation. Conclusion: The present study provides evidence for within-day and between-day reproducibility of endothelial function and arterial stiffness in patients with COPD. PMID:26396509

  9. Repeatability, Reproducibility, Separative Power and Subjectivity of Different Fish Morphometric Analysis Methods

    PubMed Central

    Takács, Péter

    2016-01-01

    We compared the repeatability, reproducibility (intra- and inter-measurer similarity), separative power and subjectivity (measurer effect on results) of four morphometric methods frequently used in ichthyological research, the “traditional” caliper-based (TRA) and truss-network (TRU) distance methods and two geometric methods that compare landmark coordinates on the body (GMB) and scales (GMS). In each case, measurements were performed three times by three measurers on the same specimen of three common cyprinid species (roach Rutilus rutilus (Linnaeus, 1758), bleak Alburnus alburnus (Linnaeus, 1758) and Prussian carp Carassius gibelio (Bloch, 1782)) collected from three closely-situated sites in the Lake Balaton catchment (Hungary) in 2014. TRA measurements were made on conserved specimens using a digital caliper, while TRU, GMB and GMS measurements were undertaken on digital images of the bodies and scales. In most cases, intra-measurer repeatability was similar. While all four methods were able to differentiate the source populations, significant differences were observed in their repeatability, reproducibility and subjectivity. GMB displayed highest overall repeatability and reproducibility and was least burdened by measurer effect. While GMS showed similar repeatability to GMB when fish scales had a characteristic shape, it showed significantly lower reproducibility (compared with its repeatability) for each species than the other methods. TRU showed similar repeatability to the GMS. TRA was the least applicable method as measurements were obtained from the fish itself, resulting in poor repeatability and reproducibility. Although all four methods showed some degree of subjectivity, TRA was the only method where population-level detachment was entirely overwritten by measurer effect. Based on these results, we recommend a) avoidance of aggregating different measurers’ datasets when using TRA and GMS methods; and b) use of image-based methods for

  10. Reproducibility and validity of the Shanghai Women's Health Study physical activity questionnaire.

    PubMed

    Matthews, Charles E; Shu, Xiao-Ou; Yang, Gong; Jin, Fan; Ainsworth, Barbara E; Liu, Dake; Gao, Yu-Tang; Zheng, Wei

    2003-12-01

    In this investigation, the authors evaluated the reproducibility and validity of the Shanghai Women's Health Study (SWHS) physical activity questionnaire (PAQ), which was administered in a cohort study of approximately 75,000 Chinese women aged 40-70 years. Reproducibility (2-year test-retest) was evaluated using kappa statistics and intraclass correlation coefficients (ICCs). Validity was evaluated by comparing Spearman correlations (r) for the SWHS PAQ with two criterion measures administered over a period of 12 months: four 7-day physical activity logs and up to 28 7-day PAQs. Women were recruited from the SWHS cohort (n = 200). Results indicated that the reproducibility of adolescent and adult exercise participation (kappa = 0.85 and kappa = 0.64, respectively) and years of adolescent exercise and adult exercise energy expenditure (ICC = 0.83 and ICC = 0.70, respectively) was reasonable. Reproducibility values for adult lifestyle activities were lower (ICC = 0.14-0.54). Significant correlations between the PAQ and criterion measures of adult exercise were observed for the first PAQ administration (physical activity log, r = 0.50; 7-day PAQ, r = 0.62) and the second PAQ administration (physical activity log, r = 0.74; 7-day PAQ, r = 0.80). Significant correlations between PAQ lifestyle activities and the 7-day PAQ were also noted (r = 0.33-0.88). These data indicate that the SWHS PAQ is a reproducible and valid measure of exercise behaviors and that it demonstrates utility in stratifying women by levels of important lifestyle activities (e.g., housework, walking, cycling).
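
    As an illustration of the 2-year test-retest agreement statistic reported above, the following is a minimal Cohen's kappa sketch for a categorical item such as exercise participation; the yes/no responses are hypothetical and this is not the SWHS analysis code.

      import numpy as np

      def cohens_kappa(ratings1, ratings2):
          """Chance-corrected agreement between two administrations of a
          categorical item: kappa = (p_o - p_e) / (1 - p_e)."""
          r1, r2 = np.asarray(ratings1), np.asarray(ratings2)
          cats = np.union1d(r1, r2)
          p_o = np.mean(r1 == r2)                          # observed agreement
          p_e = sum(np.mean(r1 == c) * np.mean(r2 == c)    # chance agreement
                    for c in cats)
          return (p_o - p_e) / (1.0 - p_e)

      # Hypothetical yes/no answers at baseline and at the 2-year retest.
      print(cohens_kappa([1, 1, 0, 1, 0, 0, 1, 1], [1, 0, 0, 1, 0, 1, 1, 1]))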

  11. Assessment of tumor motion reproducibility with audio-visual coaching through successive 4D CT sessions.

    PubMed

    Goossens, Samuel; Senny, Frédéric; Lee, John A; Janssens, Guillaume; Geets, Xavier

    2014-01-04

    This study aimed to compare combined audio-visual coaching with audio coaching alone and assess their respective impact on the reproducibility of external breathing motion and, one step further, on the internal lung tumor motion itself, through successive sessions. Thirteen patients with NSCLC were enrolled in this study. The tumor motion was assessed by three to four successive 4D CT sessions, while the breathing signal was measured from magnetic sensors positioned on the epigastric region. For all sessions, the breathing was regularized with either audio coaching alone (AC, n = 5) or combined with real-time visual feedback (A/VC, n = 8) when tolerated by the patients. Peak-to-peak amplitude, period and signal shape of both breathing and tumor motions were first measured. Then, the correlation between the respiratory signal and internal tumor motion over time was evaluated, as well as the residual tumor motion for a gated strategy. Although breathing and tumor motions were comparable between the AC and A/VC groups, the A/VC approach achieved better reproducibility through sessions than AC alone (mean tumor motion of 7.2 mm ± 1 mm vs. 8.6 mm ± 1.8 mm, and mean breathing motion of 14.9 mm ± 1.2 mm vs. 13.3 mm ± 3.7 mm, respectively). High internal/external correlation reproducibility was achieved in the superior-inferior tumor motion direction for all patients. For the anterior-posterior tumor motion direction, better correlation reproducibility was observed when visual feedback was used. For a displacement-based gating approach, A/VC might also be recommended, since it led to smaller residual tumor motion within clinically relevant duty cycles. This study suggests that combining real-time visual feedback with audio coaching might improve the reproducibility of key characteristics of the breathing pattern, and might thus be considered in the implementation of lung tumor radiotherapy.

  12. Repeatability, Reproducibility, Separative Power and Subjectivity of Different Fish Morphometric Analysis Methods.

    PubMed

    Takács, Péter; Vitál, Zoltán; Ferincz, Árpád; Staszny, Ádám

    2016-01-01

    We compared the repeatability, reproducibility (intra- and inter-measurer similarity), separative power and subjectivity (measurer effect on results) of four morphometric methods frequently used in ichthyological research, the "traditional" caliper-based (TRA) and truss-network (TRU) distance methods and two geometric methods that compare landmark coordinates on the body (GMB) and scales (GMS). In each case, measurements were performed three times by three measurers on the same specimen of three common cyprinid species (roach Rutilus rutilus (Linnaeus, 1758), bleak Alburnus alburnus (Linnaeus, 1758) and Prussian carp Carassius gibelio (Bloch, 1782)) collected from three closely-situated sites in the Lake Balaton catchment (Hungary) in 2014. TRA measurements were made on conserved specimens using a digital caliper, while TRU, GMB and GMS measurements were undertaken on digital images of the bodies and scales. In most cases, intra-measurer repeatability was similar. While all four methods were able to differentiate the source populations, significant differences were observed in their repeatability, reproducibility and subjectivity. GMB displayed highest overall repeatability and reproducibility and was least burdened by measurer effect. While GMS showed similar repeatability to GMB when fish scales had a characteristic shape, it showed significantly lower reproducibility (compared with its repeatability) for each species than the other methods. TRU showed similar repeatability to the GMS. TRA was the least applicable method as measurements were obtained from the fish itself, resulting in poor repeatability and reproducibility. Although all four methods showed some degree of subjectivity, TRA was the only method where population-level detachment was entirely overwritten by measurer effect. Based on these results, we recommend a) avoidance of aggregating different measurers' datasets when using TRA and GMS methods; and b) use of image-based methods for morphometric

  13. Reproducibility of a silicone-based test food to masticatory performance evaluation by different sieve methods.

    PubMed

    Sánchez-Ayala, Alfonso; Vilanova, Larissa Soares Reis; Costa, Marina Abrantes; Farias-Neto, Arcelino

    2014-01-01

    The aim of this study was to evaluate the reproducibility of the condensation silicone Optosil Comfort® as an artificial test food for masticatory performance evaluation. Twenty dentate subjects with a mean age of 23.3±0.7 years were selected. Masticatory performance was evaluated using the simple (MPI), the double (IME) and the multiple sieve methods. Trials were carried out five times by three examiners: three times by the first, and once by each of the second and third examiners. Friedman's test was used to find differences among time trials. Reproducibility was determined by the intra-class correlation (ICC) test (α=0.05). No differences among time trials were found, except for MPI-4 mm (p=0.022) from the first examiner's results. The intra-examiner reproducibility (ICC) of almost all data was high (ICC≥0.92, p<0.001), being moderate only for MPI-0.50 mm (ICC=0.89, p<0.001). The inter-examiner reproducibility was high (ICC>0.93, p<0.001) for all results. For the multiple sieve method, the mean absolute difference between repeated measurements was lower than 1 mm. This trend was observed only from MPI-0.50 to MPI-1.4 for the single sieve method, and from IME-0.71/0.50 to IME-1.40/1.00 for the double sieve method. The results suggest that, regardless of the method used, the reproducibility of Optosil Comfort® is high. PMID:24918363

  14. Assessing the relative effectiveness of statistical downscaling and distribution mapping in reproducing rainfall statistics based on climate model results

    NASA Astrophysics Data System (ADS)

    Langousis, Andreas; Mamalakis, Antonios; Deidda, Roberto; Marrocu, Marino

    2016-01-01

    To improve the skill of climate models (CMs) in reproducing the statistics of daily rainfall at a basin level, two types of statistical approaches have been suggested. One is statistical correction of CM rainfall outputs based on historical series of precipitation. The other, usually referred to as statistical rainfall downscaling, is the use of stochastic models to conditionally simulate rainfall series, based on large-scale atmospheric forcing from CMs. While promising, the latter approach has attracted less attention in recent years, since the developed downscaling schemes involved complex weather identification procedures, while demonstrating limited success in reproducing several statistical features of rainfall. In a recent effort, Langousis and Kaleris () developed a statistical framework for simulation of daily rainfall intensities conditional on upper-air variables, which is simpler to implement and more accurately reproduces several statistical properties of actual rainfall records. Here we study the relative performance of: (a) direct statistical correction of CM rainfall outputs using nonparametric distribution mapping, and (b) the statistical downscaling scheme of Langousis and Kaleris (), in reproducing the historical rainfall statistics, including rainfall extremes, at a regional level. This is done for an intermediate-sized catchment in Italy, i.e., the Flumendosa catchment, using rainfall and atmospheric data from four CMs of the ENSEMBLES project. The obtained results are promising, since the proposed downscaling scheme is more accurate and robust in reproducing a number of historical rainfall statistics, independent of the CM used and the characteristics of the calibration period. This is particularly the case for yearly rainfall maxima.
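
    The first approach compared above, nonparametric distribution mapping, is essentially empirical quantile mapping: each model value is replaced by the observed value at the same empirical quantile of the calibration period. A minimal sketch follows; the gamma-distributed example series are synthetic stand-ins, not ENSEMBLES data.

      import numpy as np

      def quantile_map(model_cal, obs_cal, model_values):
          """Empirical quantile mapping of model_values using the calibration-
          period model (model_cal) and observed (obs_cal) daily rainfall series."""
          model_cal = np.sort(np.asarray(model_cal, dtype=float))
          obs_cal = np.asarray(obs_cal, dtype=float)
          # empirical non-exceedance probability of each model value
          q = np.searchsorted(model_cal, model_values, side="right") / len(model_cal)
          # invert the observed empirical distribution at those probabilities
          return np.quantile(obs_cal, np.clip(q, 0.0, 1.0))

      rng = np.random.default_rng(0)
      cm_cal = rng.gamma(0.6, 8.0, 3000)    # synthetic model daily rainfall (mm)
      obs_cal = rng.gamma(0.8, 10.0, 3000)  # synthetic observed daily rainfall (mm)
      print(quantile_map(cm_cal, obs_cal, cm_cal[:5]))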

  15. Accurate lineshape spectroscopy and the Boltzmann constant

    PubMed Central

    Truong, G.-W.; Anstie, J. D.; May, E. F.; Stace, T. M.; Luiten, A. N.

    2015-01-01

    Spectroscopy has an illustrious history delivering serendipitous discoveries and providing a stringent testbed for new physical predictions, including applications from trace materials detection, to understanding the atmospheres of stars and planets, and even constraining cosmological models. Reaching fundamental-noise limits permits optimal extraction of spectroscopic information from an absorption measurement. Here, we demonstrate a quantum-limited spectrometer that delivers high-precision measurements of the absorption lineshape. These measurements yield a very accurate determination of the excited-state (6P1/2) hyperfine splitting in Cs, and reveal a breakdown in the well-known Voigt spectral profile. We develop a theoretical model that accounts for this breakdown, explaining the observations to within the shot-noise limit. Our model enables us to infer the thermal velocity dispersion of the Cs vapour with an uncertainty of 35 p.p.m. within an hour. This allows us to determine a value for Boltzmann's constant with a precision of 6 p.p.m., and an uncertainty of 71 p.p.m. PMID:26465085
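
    For reference, the Voigt profile whose breakdown is reported above is conventionally evaluated through the Faddeeva function; the sketch below shows that standard form together with a comment on how a fitted Doppler width relates to the Boltzmann constant. Parameter names are assumptions, and this is not the authors' lineshape model (which goes beyond the Voigt profile).

      import numpy as np
      from scipy.special import wofz

      def voigt_profile(nu, nu0, sigma, gamma):
          """Voigt lineshape: Gaussian (Doppler, std sigma) convolved with a
          Lorentzian (HWHM gamma), via the Faddeeva function w(z)."""
          z = ((nu - nu0) + 1j * gamma) / (sigma * np.sqrt(2.0))
          return np.real(wofz(z)) / (sigma * np.sqrt(2.0 * np.pi))

      # Doppler-broadening link to k_B: sigma = (nu0 / c) * sqrt(k_B * T / m),
      # so a fitted sigma at known temperature T and atomic mass m gives
      # k_B = m * (sigma * c / nu0)**2 / T.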

  16. MEMS accelerometers in accurate mount positioning systems

    NASA Astrophysics Data System (ADS)

    Mészáros, László; Pál, András.; Jaskó, Attila

    2014-07-01

    In order to attain precise, accurate and stateless positioning of telescope mounts we apply microelectromechanical accelerometer systems (also known as MEMS accelerometers). In common practice, feedback from the mount position is provided by electronic, optical or magneto-mechanical systems or via real-time astrometric solution based on the acquired images. Hence, MEMS-based systems are completely independent from these mechanisms. Our goal is to investigate the advantages and challenges of applying such devices and to reach the sub-arcminute range, which is well below the field-of-view of conventional imaging telescope systems. We present how this sub-arcminute accuracy can be achieved with very cheap MEMS sensors. Basically, these sensors yield raw output within an accuracy of a few degrees. We show what kind of calibration procedures could exploit spherical and cylindrical constraints between accelerometer output channels in order to achieve the previously mentioned accuracy level. We also demonstrate how our implementation can be inserted into a telescope control system. Although this attainable precision is less than both the resolution of telescope mount drive mechanics and the accuracy of astrometric solutions, the independent nature of attitude determination could significantly increase the reliability of autonomous or remotely operated astronomical observations.
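
    As a generic illustration of attitude sensing with a MEMS accelerometer (not the authors' actual procedure), the sketch below converts a calibrated gravity-vector reading into pitch and roll; the simple linear sensor model stands in for the spherical/cylindrical-constraint calibration mentioned above, whose details are not given here.

      import numpy as np

      def apply_calibration(raw, offset, scale, misalignment):
          """Assumed linear sensor model: a = M @ (scale * (raw - offset)).
          offset/scale/misalignment would come from a calibration that enforces
          |a| = g for all static orientations (the spherical constraint)."""
          return misalignment @ (scale * (np.asarray(raw, dtype=float) - offset))

      def tilt_from_gravity(ax, ay, az):
          """Pitch and roll (degrees) of a static mount from the measured
          gravity vector, assuming conventional device-frame axes."""
          pitch = np.degrees(np.arctan2(-ax, np.hypot(ay, az)))
          roll = np.degrees(np.arctan2(ay, az))
          return pitch, roll

      a = apply_calibration([0.02, -0.51, 0.86], offset=0.0, scale=1.0,
                            misalignment=np.eye(3))
      print(tilt_from_gravity(*a))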

  17. Measuring Age-Dependent Myocardial Stiffness across the Cardiac Cycle using MR Elastography: A Reproducibility Study

    PubMed Central

    Wassenaar, Peter A; Eleswarpu, Chethanya N; Schroeder, Samuel A; Mo, Xiaokui; Raterman, Brian D; White, Richard D; Kolipaka, Arunark

    2015-01-01

    Purpose: To assess reproducibility in measuring left ventricular (LV) myocardial stiffness in volunteers throughout the cardiac cycle using magnetic resonance elastography (MRE) and to determine its correlation with age. Methods: Cardiac MRE (CMRE) was performed on 29 normal volunteers, with ages ranging from 21 to 73 years. For assessing reproducibility of CMRE-derived stiffness measurements, scans were repeated per volunteer. Wave images were acquired throughout the LV myocardium, and were analyzed to obtain mean stiffness during the cardiac cycle. CMRE-derived stiffness values were correlated to age. Results: The concordance correlation coefficient revealed good inter-scan agreement, with rc of 0.77 and p-value < 0.0001. Significantly higher myocardial stiffness was observed during end-systole (ES) compared to end-diastole (ED) across all subjects. Additionally, increased deviation between ES and ED stiffness was observed with increased age. Conclusion: CMRE-derived stiffness is reproducible, with myocardial stiffness changing cyclically across the cardiac cycle. Stiffness is significantly higher during ES compared to ED. With age, ES myocardial stiffness increases more than ED, giving rise to an increased deviation between the two. PMID:26010456

  18. Reliability and reproducibility of interapical distance assessment of the lateral deviation of vertebrae in scoliosis.

    PubMed

    Lim, Jeong Hoon; Lee, Jongmin; Koh, Seong-Eun; Lee, In-Sik

    2015-04-01

    [Purpose] The purpose of this study was to investigate the interobserver reliability and intraobserver reproducibility of the interapical distance (IAD) and to analyze its correlation with the Cobb angle (CA). [Subjects and Methods] IAD, a handy tool for assessment of the lateral deviation of vertebrae with a metric scale, was defined as the horizontal distance between one apical vertebra and its counterpart, the opposite apical vertebra in the case of a double curve and the farthest vertebra in the case of a single curve. Fifty full-length, standing anteroposterior radiographs of "idiopathic scoliosis" were reviewed. Three investigators independently measured the CA and IAD at the same time and remeasured the IAD on the same radiograph a week later. [Results] There was no interobserver difference (reliability) in the measurement of IAD or statistical difference in intraobserver reproducibility for each observer. IAD was well correlated with the CA for each observer (r=0.765, r=0.737, and r=0.764). [Conclusion] IAD is useful when assessing lateral deviation in scoliosis and may be a reliable and reproducible index that is well correlated with the CA, and it can be used as a supplementary measure to describe the overall derangement of scoliosis in the coronal plane. PMID:25995588

  19. The effect of sample holder material on ion mobility spectrometry reproducibility

    NASA Technical Reports Server (NTRS)

    Jadamec, J. Richard; Su, Chih-Wu; Rigdon, Stephen; Norwood, Lavan

    1995-01-01

    When a positive detection of a narcotic occurs during the search of a vessel, a decision has to be made whether a further intensive search is warranted. This decision is based in part on the results of a second sample collected from the same area. Therefore, the reproducibility of both sampling and instrumental analysis is critical in terms of justifying an in-depth search. As reported at the 2nd Annual IMS Conference in Quebec City, the U.S. Coast Guard has determined that when paper is utilized as the sample desorption medium for the Barringer IONSCAN, the analytical results using standard reference samples are reproducible. A study was conducted utilizing papers of varying pore sizes and comparing their performance as a desorption material relative to the standard Barringer 50 micron Teflon. Nominal pore sizes ranged from 30 microns down to 2 microns. Results indicate that there is some peak instability in the first two to three windows during the analysis. The severity of the instability was observed to increase as the pore size of the paper decreased. However, the observed peak instability does not create a situation that results in decreased reliability or reproducibility of the analytical result.

  20. A New Surgical Model of Skeletal Muscle Injuries in Rats Reproduces Human Sports Lesions.

    PubMed

    Contreras-Muñoz, P; Fernández-Martín, A; Torrella, R; Serres, X; De la Varga, M; Viscor, G; Järvinen, T A H; Martínez-Ibáñez, V; Peiró, J L; Rodas, G; Marotta, M

    2016-03-01

    Skeletal muscle injuries are the most common sports-related injuries in sports medicine. In this work, we have generated a new surgically-induced skeletal muscle injury in rats, by using a biopsy needle, which can be easily reproduced and closely mimics the skeletal muscle lesions detected in human athletes. By means of histology, immunofluorescence and MRI, we corroborated that our model reproduced the necrosis, inflammation and regeneration processes observed in dystrophic mdx-mice, a model of spontaneous muscle injury, and realistically mimicked the muscle lesions observed in professional athletes. Surgically-injured rat skeletal muscles demonstrated the longitudinal process of muscle regeneration and fibrogenesis as indicated by developmental Myosin Heavy Chain (MHCd) and collagen-I protein expression. MRI analysis demonstrated that our muscle injury model reproduces the grade I-II type lesions detected in professional soccer players, including edema around the central tendon and the typical high-signal feather shape along muscle fibers. A significant reduction of 30% in maximum tetanus force was also registered after 2 weeks of muscle injury. This new model represents an excellent approach to the study of the mechanisms of muscle injury and repair, and could open new avenues for developing innovative therapeutic approaches to skeletal muscle regeneration in sports medicine.

  1. An exploration of graph metric reproducibility in complex brain networks

    PubMed Central

    Telesford, Qawi K.; Burdette, Jonathan H.; Laurienti, Paul J.

    2013-01-01

    The application of graph theory to brain networks has become increasingly popular in the neuroimaging community. These investigations and analyses have led to a greater understanding of the brain's complex organization. More importantly, it has become a useful tool for studying the brain under various states and conditions. With the ever-expanding popularity of network science in the neuroimaging community, there is increasing interest in validating the measurements and calculations derived from brain networks. Underpinning these studies is the desire to use brain networks in longitudinal studies or as clinical biomarkers to understand changes in the brain. A highly reproducible tool for brain imaging could potentially prove useful as a clinical tool. In this review, we examine recent studies in network reproducibility and their implications for analysis of brain networks. PMID:23717257

  2. Implementation of a portable and reproducible parallel pseudorandom number generator

    SciTech Connect

    Pryor, D.V.; Cuccaro, S.A.; Mascagni, M.; Robinson, M.L.

    1994-12-31

    The authors describe in detail the parallel implementation of a family of additive lagged-Fibonacci pseudorandom number generators. The theoretical structure of these generators is exploited to preserve their well-known randomness properties and to provide a parallel system of distinct cycles. The algorithm presented here solves the reproducibility problem for a far larger class of parallel Monte Carlo applications than has been previously possible. In particular, Monte Carlo applications that undergo "splitting" can be coded to be reproducible, independent both of the number of processors and the execution order of the parallel processes. A library of portable C routines (available from the authors) that implements these ideas is also described.
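
    To make the recurrence concrete, here is a minimal, generic additive lagged-Fibonacci generator, x[n] = (x[n-j] + x[n-k]) mod 2^m; the lags, modulus and seed fill are illustrative, and the parallel cycle-assignment machinery of the library described above is only alluded to in the comments.

      class AdditiveLaggedFibonacci:
          """Generic ALFG with lags k > j. Parallel versions assign each process
          a seed/initial fill from a distinct cycle so that streams are
          reproducible regardless of processor count and execution order."""

          def __init__(self, fill, j=5, k=17, m=32):
              assert len(fill) == k and k > j > 0
              self.mask = (1 << m) - 1
              self.state = [v & self.mask for v in fill]   # last k values
              self.j, self.k = j, k
              self.pos = 0                                 # slot holding x[n-k]

          def next(self):
              j, k = self.j, self.k
              value = (self.state[(self.pos - j) % k] + self.state[self.pos]) & self.mask
              self.state[self.pos] = value                 # overwrite the oldest entry
              self.pos = (self.pos + 1) % k
              return value

      gen = AdditiveLaggedFibonacci(fill=list(range(1, 18)))  # at least one odd seed value
      print([gen.next() for _ in range(5)])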

  3. Reproducibility of Mammography Units, Film Processing and Quality Imaging

    NASA Astrophysics Data System (ADS)

    Gaona, Enrique

    2003-09-01

    The purpose of this study was to carry out an exploratory survey of the problems of quality control in mammography and processor units as a diagnosis of the current situation of mammography facilities. Measurements of reproducibility, optical density, optical difference and gamma index are included. Breast cancer is the most frequently diagnosed cancer and is the second leading cause of cancer death among women in the Mexican Republic. Mammography is a radiographic examination specially designed for detecting breast pathology. We found that the problems of reproducibility of AEC are smaller than the problems of processor units, because almost all processors fall outside of the acceptable variation limits and they can affect the mammography image quality and the dose to the breast. Only four mammography units agree with the minimum score established by the ACR and FDA for the phantom image.

  4. Reproducing kernel particle method for free and forced vibration analysis

    NASA Astrophysics Data System (ADS)

    Zhou, J. X.; Zhang, H. Y.; Zhang, L.

    2005-01-01

    A reproducing kernel particle method (RKPM) is presented to analyze the natural frequencies of Euler-Bernoulli beams as well as Kirchhoff plates. In addition, RKPM is also used to predict the forced vibration responses of buried pipelines due to longitudinal travelling waves. Two different approaches, Lagrange multipliers and the transformation method, are employed to enforce essential boundary conditions. Based on the reproducing kernel approximation, the domain of interest is discretized by a set of particles without the employment of a structured mesh, which constitutes an advantage over the finite element method. Meanwhile, RKPM also exhibits advantages over the classical Rayleigh-Ritz method and its counterparts. Numerical results presented here demonstrate the effectiveness of this novel approach for both free and forced vibration analysis.
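
    To make the reproducing kernel approximation itself concrete, the sketch below builds 1D RKPM shape functions with a linear basis and a cubic B-spline window; the window choice and support size are assumptions, not the paper's settings. By construction the shape functions reproduce constant and linear fields exactly, which the final line checks.

      import numpy as np

      def rkpm_shape_functions(x, nodes, support):
          """Psi_I(x) = H(0)^T M(x)^{-1} H(x - x_I) w((x - x_I)/a) with a linear
          basis H(d) = [1, d]^T and moment matrix M(x) = sum_I w_I H_I H_I^T."""
          def window(s):
              s = np.abs(s)
              return np.where(s < 0.5, 2/3 - 4*s**2 + 4*s**3,
                     np.where(s < 1.0, 4/3 - 4*s + 4*s**2 - 4*s**3/3, 0.0))

          d = x - nodes                           # shifted coordinates x - x_I
          w = window(d / support)
          H = np.vstack([np.ones_like(d), d])     # linear basis evaluated at d
          M = (H * w) @ H.T                       # 2x2 moment matrix
          b = np.linalg.solve(M, np.array([1.0, 0.0]))
          return (b @ H) * w

      nodes = np.linspace(0.0, 1.0, 11)
      psi = rkpm_shape_functions(0.37, nodes, support=0.25)
      print(psi.sum(), psi @ nodes)               # ~1.0 and ~0.37: linear reproduction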

  5. MASSIVE DATA, THE DIGITIZATION OF SCIENCE, AND REPRODUCIBILITY OF RESULTS

    SciTech Connect

    2010-07-02

    As the scientific enterprise becomes increasingly computational and data-driven, the nature of the information communicated must change. Without inclusion of the code and data with published computational results, we are engendering a credibility crisis in science. Controversies such as ClimateGate, the microarray-based drug sensitivity clinical trials under investigation at Duke University, and retractions from prominent journals due to unverified code suggest the need for greater transparency in our computational science. In this talk I argue that the scientific method be restored to (1) a focus on error control as central to scientific communication and (2) complete communication of the underlying methodology producing the results, i.e., reproducibility. I outline barriers to these goals based on recent survey work (Stodden 2010), and suggest solutions such as the “Reproducible Research Standard” (Stodden 2009), giving open licensing options designed to create an intellectual property framework for scientists consonant with longstanding scientific norms.

  6. Pressure Stabilizer for Reproducible Picoinjection in Droplet Microfluidic Systems

    PubMed Central

    Rhee, Minsoung; Light, Yooli K.; Yilmaz, Suzan; Adams, Paul D.; Saxena, Deepak

    2014-01-01

    Picoinjection is a promising technique to add reagents into pre-formed emulsion droplets on chip; however, it is sensitive to pressure fluctuation, making stable operation of the picoinjector challenging. We present a chip architecture using a simple pressure stabilizer for consistent and highly reproducible picoinjection in multi-step biochemical assays with droplets. Incorporation of the stabilizer immediately upstream of a picoinjector or a combination of injectors greatly reduces pressure fluctuations enabling reproducible and effective picoinjection in systems where the pressure varies actively during operation. We demonstrate the effectiveness of the pressure stabilizer for an integrated platform for on-demand encapsulation of bacterial cells followed by picoinjection of reagents for lysing the encapsulated cells. The pressure stabilizer was also used for picoinjection of multiple displacement amplification (MDA) reagents to achieve genomic DNA amplification of lysed bacterial cells. PMID:25270338

  7. MASSIVE DATA, THE DIGITIZATION OF SCIENCE, AND REPRODUCIBILITY OF RESULTS

    ScienceCinema

    None

    2016-07-12

    As the scientific enterprise becomes increasingly computational and data-driven, the nature of the information communicated must change. Without inclusion of the code and data with published computational results, we are engendering a credibility crisis in science. Controversies such as ClimateGate, the microarray-based drug sensitivity clinical trials under investigation at Duke University, and retractions from prominent journals due to unverified code suggest the need for greater transparency in our computational science. In this talk I argue that the scientific method be restored to (1) a focus on error control as central to scientific communication and (2) complete communication of the underlying methodology producing the results, i.e., reproducibility. I outline barriers to these goals based on recent survey work (Stodden 2010), and suggest solutions such as the “Reproducible Research Standard” (Stodden 2009), giving open licensing options designed to create an intellectual property framework for scientists consonant with longstanding scientific norms.

  8. Intersubject variability and reproducibility of 15O PET studies.

    PubMed

    Coles, Jonathan P; Fryer, Tim D; Bradley, Peter G; Nortje, Jurgens; Smielewski, Peter; Rice, Kenneth; Clark, John C; Pickard, John D; Menon, David K

    2006-01-01

    Oxygen-15 positron emission tomography (15O PET) can provide important data regarding patients with head injury. We provide reference data on intersubject variability and reproducibility of cerebral blood flow (CBF), cerebral blood volume (CBV), cerebral metabolism (CMRO2) and oxygen extraction fraction (OEF) in patients and healthy controls, and explored alternative ways of assessing reproducibility within the context of a single PET study. In addition, we used independent measurements of CBF and CMRO2 to investigate the effect of mathematical correlation on the relationship between flow and metabolism. In patients, intersubject coefficients of variation (CoV) for CBF, CMRO2 and OEF were larger than in controls (32.9%+/-2.2%, 23.2%+/-2.0% and 22.5%+/-3.4% versus 13.5%+/-1.4%, 12.8%+/-1.1% and 7.3%+/-1.2%), while CoV for CBV were lower (15.2%+/-2.1% versus 22.5%+/-2.8%) (P<0.001). The CoV for the test-retest reproducibility of CBF, CBV, CMRO2 and OEF in patients were 2.1%+/-1.5%, 3.8%+/-3.0%, 3.7%+/-3.0% and 4.6%+/-3.5%, respectively. These were much lower than the intersubject CoV figures, and were similar to alternative measures of reproducibility obtained by fractionating data from a single study. The physiological relationship between flow and metabolism was preserved even when mathematically independent measures were used for analysis. These data provide a context for the design and interpretation of interventional PET studies. While ideally each centre should develop its own bank of such data, the figures provided will allow initial generic approximations of sample size for such studies.

  9. Regulating Ultrasound Cavitation in order to Induce Reproducible Sonoporation

    NASA Astrophysics Data System (ADS)

    Mestas, J.-L.; Alberti, L.; El Maalouf, J.; Béra, J.-C.; Gilles, B.

    2010-03-01

    Sonoporation is thought to be linked to cavitation, which generally appears to be a non-reproducible and unstationary phenomenon. In order to obtain an acceptable trade-off between cell mortality and transfection, a regulated cavitation generator based on an acoustical cavitation measurement was developed and tested. The medium to be sonicated is placed in a sample tray. This tray is immersed in degassed water and positioned above the face of a flat ultrasonic transducer (frequency: 445 kHz; intensity range: 0.08-1.09 W/cm2). This technical configuration was assumed to be conducive to standing-wave generation through reflection at the air/medium interface in the well, thus enhancing the cavitation phenomenon. Laterally to the transducer, a homemade hydrophone was oriented to receive the acoustical signal from the bubbles. From this spectral signal recorded at intervals of 5 ms, a cavitation index was calculated as the mean of the cavitation spectrum integration on a logarithmic scale, and the excitation power is automatically corrected. The device generates a stable and reproducible cavitation level for a wide range of cavitation setpoints, from stable cavitation conditions up to fully developed inertial cavitation. For the ultrasound intensity range used, the time delay of the response is lower than 200 ms. The cavitation regulation device was evaluated in terms of chemical bubble collapse effect. Hydroxyl radical production was measured on terephthalic acid solutions. In open loop, the results show great variability whatever the excitation power. In contrast, the closed loop allows great reproducibility. This device was implemented for the study of sonodynamic effects. The regulation provides more reproducible results independent of cell medium and experimental conditions (temperature, pressure). Other applications of this regulated cavitation device concern internalization of different particles (Quantum Dot), molecules (SiRNA) or plasmids (GFP, DsRed) into different

  10. Highly reproducible Bragg grating acousto-ultrasonic contact transducers

    NASA Astrophysics Data System (ADS)

    Saxena, Indu Fiesler; Guzman, Narciso; Lieberman, Robert A.

    2014-09-01

    Fiber optic acousto-ultrasonic transducers offer numerous uses as embedded sensors for impact and damage detection in industrial and aerospace settings, as well as for non-destructive evaluation. Superficial contact transducers with a sheet of fiber optic Bragg gratings have been demonstrated for guided-wave ultrasound measurements. This method of measurement is reported here to provide highly reproducible guided ultrasound data for the test composite component, even though the optical fiber transducers are not permanently embedded in it.

  11. Reproducing the assembly of massive galaxies within the hierarchical cosmogony

    NASA Astrophysics Data System (ADS)

    Fontanot, Fabio; Monaco, Pierluigi; Silva, Laura; Grazian, Andrea

    2007-12-01

    In order to gain insight into the physical mechanisms leading to the formation of stars and their assembly in galaxies, we compare the predictions of the MOdel for the Rise of GAlaxies aNd Active nuclei (MORGANA) to the properties of K- and 850-μm-selected galaxies (such as number counts, redshift distributions and luminosity functions) by combining MORGANA with the spectrophotometric model GRASIL. We find that it is possible to reproduce the K- and 850-μm-band data sets at the same time and with a standard Salpeter initial mass function, and ascribe this success to our improved modelling of cooling in DM haloes. We then predict that massively star-forming discs are common at z ~ 2 and dominate the star formation rate, but most of them merge with other galaxies within ~100 Myr. Our preferred model produces an overabundance of bright galaxies at z < 1; this overabundance might be connected to the build-up of the diffuse stellar component in galaxy clusters, as suggested by Monaco et al., but a naive implementation of the mechanism suggested in that paper does not produce a sufficient slowdown of the evolution of these objects. Moreover, our model overpredicts the number of 10^10-10^11 Msolar galaxies at z ~ 1; this is a common behaviour of theoretical models, as shown by Fontana et al. These findings show that, while the overall build-up of the stellar mass is correctly reproduced by galaxy formation models, the 'downsizing' trend of galaxies is not fully reproduced yet. This hints at some missing feedback mechanism needed to reproduce the formation of both the massive and the small galaxies at the same time.

  12. Reproducibility of graph metrics of human brain structural networks.

    PubMed

    Duda, Jeffrey T; Cook, Philip A; Gee, James C

    2014-01-01

    Recent interest in human brain connectivity has led to the application of graph theoretical analysis to human brain structural networks, in particular white matter connectivity inferred from diffusion imaging and fiber tractography. While these methods have been used to study a variety of patient populations, there has been less examination of the reproducibility of these methods. A number of tractography algorithms exist and many of these are known to be sensitive to user-selected parameters. The methods used to derive a connectivity matrix from fiber tractography output may also influence the resulting graph metrics. Here we examine how these algorithm and parameter choices influence the reproducibility of proposed graph metrics on a publicly available test-retest dataset consisting of 21 healthy adults. The dice coefficient is used to examine topological similarity of constant density subgraphs both within and between subjects. Seven graph metrics are examined here: mean clustering coefficient, characteristic path length, largest connected component size, assortativity, global efficiency, local efficiency, and rich club coefficient. The reproducibility of these network summary measures is examined using the intraclass correlation coefficient (ICC). Graph curves are created by treating the graph metrics as functions of a parameter such as graph density. Functional data analysis techniques are used to examine differences in graph measures that result from the choice of fiber tracking algorithm. The graph metrics consistently showed good levels of reproducibility as measured with ICC, with the exception of some instability at low graph density levels. The global and local efficiency measures were the most robust to the choice of fiber tracking algorithm.
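
    The headline reliability statistic here, the intraclass correlation coefficient, can be computed directly from a subjects-by-sessions matrix of any one graph metric. A minimal sketch of ICC(2,1) (two-way random effects, absolute agreement, single measurement); the toy data are invented:

        import numpy as np

        def icc_2_1(data):
            """ICC(2,1) for an (n_subjects, k_sessions) array of one graph metric."""
            x = np.asarray(data, dtype=float)
            n, k = x.shape
            grand = x.mean()
            row_means = x.mean(axis=1, keepdims=True)   # per-subject means
            col_means = x.mean(axis=0, keepdims=True)   # per-session means
            msr = k * np.sum((row_means - grand) ** 2) / (n - 1)
            msc = n * np.sum((col_means - grand) ** 2) / (k - 1)
            mse = np.sum((x - row_means - col_means + grand) ** 2) / ((n - 1) * (k - 1))
            return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

        # e.g. mean clustering coefficient for 21 subjects scanned twice (fake data)
        rng = np.random.default_rng(0)
        truth = rng.normal(0.45, 0.05, size=(21, 1))
        scans = truth + rng.normal(0.0, 0.01, size=(21, 2))
        print(round(icc_2_1(scans), 2))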

  13. Study of the reproducibility of the 2004 World Health Organization classification of urothelial neoplasms.

    PubMed

    Sharma, Pallavi; Kini, Hema; Pai, Radha R; Sahu, Kaushalya K; Kini, Jyoti

    2015-01-01

    The aim of the study was to evaluate urinary bladder biopsies showing papillary urothelial neoplastic lesions according to the 2004 WHO/ISUP classification of urothelial neoplasms of the urinary bladder, in order to assess the reproducibility of the bladder carcinoma grade. Fifty consecutive transurethral tumor resection biopsies were evaluated independently by four pathologists. The final diagnoses of each pathologist were subjected to statistical analysis to assess the degree of interobserver variability and the reproducibility of this classification. Significant interobserver variation was found in the reporting of urothelial neoplasms. In 22 instances there was a difference of opinion between PUNLMP and low-grade carcinoma, and in 59 instances between low- and high-grade carcinoma. The four observers never unanimously agreed on the diagnosis of PUNLMP.

  14. Macroscopic locality with equal bias reproduces with high fidelity a quantum distribution achieving the Tsirelson's bound

    NASA Astrophysics Data System (ADS)

    Gazi, Md. Rajjak; Banik, Manik; Das, Subhadipa; Rai, Ashutosh; Kunkri, Samir

    2013-11-01

    Two physical principles, macroscopic locality (ML) and information causality (IC), so far have been most successful in distinguishing quantum correlations from post-quantum correlations. However, there are also some post-quantum probability distributions which cannot be distinguished with the help of these principles. Thus, it is interesting to see whether consideration of these two principles, separately, along with some additional physically plausible constraints, can explain some interesting quantum features which are otherwise hard to reproduce. In this paper we show that in a Bell-Clauser-Horne-Shimony-Holt scenario, ML along with the constraint of equal bias for the concerned observables, almost reproduces the quantum joint probability distribution corresponding to a maximal quantum Bell violation, which is unique up to relabeling. From this example and earlier work of Cavalcanti, Salles, and Scarani, we conclude that IC and ML are inequivalent physical principles; satisfying one does not imply that the other is satisfied.
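
    For reference, the Bell-CHSH quantity and the two bounds at stake are standard results (not taken from the paper). With E(x,y) the outcome correlation for measurement settings x and y, in LaTeX notation:

        S = E(a,b) + E(a,b') + E(a',b) - E(a',b'), \qquad
        |S| \le 2 \ \text{(local hidden-variable models)}, \qquad
        |S| \le 2\sqrt{2} \ \text{(Tsirelson bound)}

    The maximal quantum violation |S| = 2*sqrt(2) is attained by a joint distribution that is unique up to relabeling; this is the distribution the authors recover, to high fidelity, from macroscopic locality plus the equal-bias constraint.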

  15. Reproducibility and variability of the cost functions reconstructed from experimental recordings in multifinger prehension.

    PubMed

    Niu, Xun; Latash, Mark L; Zatsiorsky, Vladimir M

    2012-01-01

    The study examines whether the cost functions reconstructed from experimental recordings are reproducible over time. Participants repeated the trials on three days. By following Analytical Inverse Optimization procedures, the cost functions of finger forces were reconstructed for each day. The cost functions were found to be reproducible over time: application of a cost function C(i) to the data of Day j (i≠j) resulted in smaller deviations from the experimental observations than using other commonly used cost functions. Other findings are: (a) the 2nd order coefficients of the cost function showed negative linear relations with finger force magnitudes; (b) the finger forces were distributed on a 2-dimensional plane in the 4-dimensional finger force space for all subjects and all testing sessions; (c) the data agreed well with the principle of superposition, i.e. the action of object prehension can be decoupled into the control of rotational equilibrium and slipping prevention. PMID:22364441

  16. High-Reproducibility and High-Accuracy Method for Automated Topic Classification

    NASA Astrophysics Data System (ADS)

    Lancichinetti, Andrea; Sirer, M. Irmak; Wang, Jane X.; Acuna, Daniel; Körding, Konrad; Amaral, Luís A. Nunes

    2015-01-01

    Much of human knowledge sits in large databases of unstructured text. Leveraging this knowledge requires algorithms that extract and record metadata on unstructured text documents. Assigning topics to documents will enable intelligent searching, statistical characterization, and meaningful classification. Latent Dirichlet allocation (LDA) is the state of the art in topic modeling. Here, we perform a systematic theoretical and numerical analysis that demonstrates that current optimization techniques for LDA often yield results that are not accurate in inferring the most suitable model parameters. Adapting approaches from community detection in networks, we propose a new algorithm that displays high reproducibility and high accuracy and also has high computational efficiency. We apply it to a large set of documents in the English Wikipedia and reveal its hierarchical structure.
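
    To make the object of study concrete: a topic model assigns each document a mixture over latent topics, and the reproducibility question is whether repeated fits recover the same topics. A minimal sketch using the standard variational LDA in scikit-learn (not the authors' network-based algorithm); the toy corpus is invented:

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation

        docs = [
            "the cell genome encodes proteins",
            "galaxies and dark matter form large scale structure",
            "protein expression in the cell nucleus",
            "stellar populations trace galaxy formation",
        ]
        counts = CountVectorizer(stop_words="english").fit_transform(docs)

        # Two independent fits with different seeds; comparing the recovered
        # topic-word distributions across runs is one way to probe reproducibility.
        for seed in (0, 1):
            lda = LatentDirichletAllocation(n_components=2, random_state=seed)
            doc_topics = lda.fit_transform(counts)   # documents x topics
            topic_words = lda.components_            # topics x vocabulary terms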

  17. Combining Dissimilarities in a Hyper Reproducing Kernel Hilbert Space for Complex Human Cancer Prediction

    PubMed Central

    Martín-Merino, Manuel; Blanco, Ángela; De Las Rivas, Javier

    2009-01-01

    DNA microarrays provide rich profiles that are used in cancer prediction considering the gene expression levels across a collection of related samples. Support Vector Machines (SVM) have been applied to the classification of cancer samples with encouraging results. However, they rely on Euclidean distances that fail to accurately reflect the proximities among sample profiles. Non-Euclidean dissimilarities therefore provide additional information that should be considered to reduce misclassification errors. In this paper, we incorporate into the ν-SVM algorithm a linear combination of non-Euclidean dissimilarities. The weights of the combination are learnt in a Hyper Reproducing Kernel Hilbert Space (HRKHS) using a Semidefinite Programming algorithm. This approach allows us to incorporate a smoothing term that penalizes the complexity of the family of distances and avoids overfitting. The experimental results suggest that the proposed method helps to reduce misclassification errors in several human cancer problems. PMID:19584909
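
    The combination step can be sketched with a precomputed kernel: each dissimilarity matrix is converted to a similarity kernel and the kernels are mixed with weights. In this simplified sketch the weights are fixed rather than learned by semidefinite programming, a plain C-SVM stands in for the ν-SVM, and the Gaussian conversion from dissimilarity to kernel is an assumption:

        import numpy as np
        from sklearn.svm import SVC

        def kernel_from_dissimilarity(D, sigma=1.0):
            """Turn a dissimilarity matrix into a Gaussian similarity kernel."""
            return np.exp(-np.asarray(D) ** 2 / (2.0 * sigma ** 2))

        def combined_kernel(dissimilarities, weights):
            """Weighted sum of kernels derived from several dissimilarity measures."""
            return sum(w * kernel_from_dissimilarity(D)
                       for w, D in zip(weights, dissimilarities))

        # Fake expression profiles; D1 and D2 stand in for two dissimilarity measures.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(30, 5))
        y = (X[:, 0] > 0).astype(int)
        D1 = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # Euclidean
        D2 = 1.0 - np.corrcoef(X)                                    # correlation-based

        K = combined_kernel([D1, D2], weights=[0.6, 0.4])
        clf = SVC(kernel="precomputed").fit(K, y)   # in practice: train/test kernel blocks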

  18. Representativity and reproducibility of DNA malignancy grading in different carcinomas.

    PubMed

    Böcking, A; Chatelain, R; Homge, M; Daniel, R; Gillissen, A; Wohltmann, D

    1989-04-01

    The reproducibility of the determination of the "DNA malignancy grade" (DNA-MG) was tested in 56 carcinomas of the colon, breast and lung while its representativity was tested on 195 slides from 65 tumors of the colon, breast and lung. DNA measurements were performed on Feulgen-stained smears with the TAS Plus TV-based image analysis system combined with an automated microscope. The variance of the DNA values of tumor cells around the 2c peak, the "2c deviation index" (2cDI), was taken as a basis for the computation of the DNA-MG, which ranges on a continuous scale from 0.01 to 3.00. The representativity, analyzed by comparison of the DNA-MGs measured in three different areas of the same tumor greater than or equal to 1.5 cm apart from each other, yielded an 81% agreement. No significant differences between DNA-MGs of these areas were found. The intraobserver and interobserver reproducibilities of the DNA grading system, investigated by repeated DNA measurements, were 83.9% and 82.2%, respectively. In comparison, histopathologic grading of the 27 breast cancers studied yielded 65% intraobserver and 57% interobserver reproducibilities and 66% representativity.

  19. Reproducibility of LCA models of crude oil production.

    PubMed

    Vafi, Kourosh; Brandt, Adam R

    2014-11-01

    Scientific models are ideally reproducible, with results that converge despite varying methods. In practice, divergence between models often remains due to varied assumptions, incompleteness, or simply avoidable flaws. We examine LCA greenhouse gas (GHG) emissions models to test the reproducibility of their estimates for well-to-refinery inlet gate (WTR) GHG emissions. We use the Oil Production Greenhouse gas Emissions Estimator (OPGEE), an open source engineering-based life cycle assessment (LCA) model, as the reference model for this analysis. We compare seven previous studies based on six models. We examine the reproducibility of prior results by successive experiments that align model assumptions and boundaries. The root-mean-square error (RMSE) between results varies between ∼1 and 8 g CO2 eq/MJ LHV when model inputs are not aligned. After model alignment, RMSE generally decreases only slightly. The proprietary nature of some of the models hinders explanations for divergence between the results. Because verification of LCA GHG emissions results is often not possible by direct measurement, we recommend the development of open source models for use in energy policy. Such practice will lead to iterative scientific review, improvement of models, and a more reliable understanding of emissions.
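
    The divergence statistic used above is an ordinary root-mean-square error between paired per-field estimates from two models. A minimal sketch (the field values are invented, not results from the cited models):

        import numpy as np

        def rmse(a, b):
            """Root-mean-square error between two sets of paired estimates."""
            a, b = np.asarray(a, float), np.asarray(b, float)
            return float(np.sqrt(np.mean((a - b) ** 2)))

        # WTR GHG intensities (g CO2 eq/MJ LHV) for the same fields from two models
        reference_model = [6.2, 9.8, 12.4, 7.1]
        compared_model = [5.5, 11.0, 13.9, 6.0]
        print(rmse(reference_model, compared_model))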

  20. Git can facilitate greater reproducibility and increased transparency in science

    PubMed Central

    2013-01-01

    Background Reproducibility is the hallmark of good science. Maintaining a high degree of transparency in scientific reporting is essential not just for gaining trust and credibility within the scientific community but also for facilitating the development of new ideas. Sharing data and computer code associated with publications is becoming increasingly common, motivated partly by data deposition requirements from journals and mandates from funders. Despite this increase in transparency, it is still difficult to reproduce or build upon the findings of most scientific publications without access to a more complete workflow. Findings Version control systems (VCS), which have long been used to maintain code repositories in the software industry, are now finding new applications in science. One such open source VCS, Git, provides a lightweight yet robust framework that is ideal for managing the full suite of research outputs such as datasets, statistical code, figures, lab notes, and manuscripts. For individual researchers, Git provides a powerful way to track and compare versions, retrace errors, and explore new approaches in a structured manner, all while maintaining a full audit trail. For larger collaborative efforts, Git and Git hosting services make it possible for everyone to work asynchronously and merge their contributions at any time, all the while maintaining a complete authorship trail. In this paper I provide an overview of Git along with use-cases that highlight how this tool can be leveraged to make science more reproducible and transparent, foster new collaborations, and support novel uses. PMID:23448176

  1. Dosimetric algorithm to reproduce isodose curves obtained from a LINAC.

    PubMed

    Estrada Espinosa, Julio Cesar; Martínez Ovalle, Segundo Agustín; Pereira Benavides, Cinthia Kotzian

    2014-01-01

    In this work, isodose curves are obtained with a new dosimetric algorithm that uses numerical data from percentage depth dose (PDD) curves and the maximum-absorbed-dose profile, calculated by Monte Carlo for an 18 MV LINAC. The software reproduces the absorbed dose percentage in the whole irradiated volume quickly and with a good approximation. To validate the results, the full geometry of an 18 MV LINAC and a water phantom were constructed, and the simulations were run with the MCNPX code to obtain the PDD and profiles at all depths of the radiation beam. These data were then used by the code to produce the dose percentages at any point of the irradiated volume. The absorbed dose was reproduced at any point of the irradiated volume for any voxel size, even when the voxels are as small as a single pixel. The dosimetric algorithm is able to reproduce the absorbed dose induced by a radiation beam in a water phantom, considering PDD and profiles, with the maximum percentage value in the build-up region. Calculation time for the algorithm is only a few seconds, compared with the days required when the calculation is carried out by Monte Carlo. PMID:25045398
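
    A common way to turn tabulated PDD and off-axis profile data into a dose estimate at an arbitrary point is to interpolate both curves and take their product. The sketch below illustrates that generic idea only; it is not the authors' algorithm, and the tabulated values are invented:

        import numpy as np

        # Illustrative Monte Carlo results for an 18 MV beam (not real data)
        depths_cm = np.array([0.0, 1.0, 3.5, 10.0, 20.0, 30.0])
        pdd_pct = np.array([30.0, 70.0, 100.0, 78.0, 55.0, 38.0])   # % of D_max
        offaxis_cm = np.array([-10.0, -5.0, 0.0, 5.0, 10.0])
        profile_rel = np.array([0.05, 0.97, 1.00, 0.97, 0.05])      # relative off-axis ratio

        def dose_percent(depth_cm, x_cm):
            """Approximate % of the maximum absorbed dose at (depth, off-axis distance)
            as PDD(depth) times the off-axis ratio at x."""
            return (np.interp(depth_cm, depths_cm, pdd_pct)
                    * np.interp(x_cm, offaxis_cm, profile_rel))

        print(dose_percent(10.0, 3.0))   # dose % at 10 cm depth, 3 cm off-axis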

  2. Dosimetric Algorithm to Reproduce Isodose Curves Obtained from a LINAC

    PubMed Central

    Estrada Espinosa, Julio Cesar; Martínez Ovalle, Segundo Agustín; Pereira Benavides, Cinthia Kotzian

    2014-01-01

    In this work, isodose curves are obtained with a new dosimetric algorithm that uses numerical data from percentage depth dose (PDD) curves and the maximum-absorbed-dose profile, calculated by Monte Carlo for an 18 MV LINAC. The software reproduces the absorbed dose percentage in the whole irradiated volume quickly and with a good approximation. To validate the results, the full geometry of an 18 MV LINAC and a water phantom were constructed, and the simulations were run with the MCNPX code to obtain the PDD and profiles at all depths of the radiation beam. These data were then used by the code to produce the dose percentages at any point of the irradiated volume. The absorbed dose was reproduced at any point of the irradiated volume for any voxel size, even when the voxels are as small as a single pixel. The dosimetric algorithm is able to reproduce the absorbed dose induced by a radiation beam in a water phantom, considering PDD and profiles, with the maximum percentage value in the build-up region. Calculation time for the algorithm is only a few seconds, compared with the days required when the calculation is carried out by Monte Carlo. PMID:25045398

  3. Planar heterojunction perovskite solar cells with superior reproducibility.

    PubMed

    Jeon, Ye-Jin; Lee, Sehyun; Kang, Rira; Kim, Jueng-Eun; Yeo, Jun-Seok; Lee, Seung-Hoon; Kim, Seok-Soon; Yun, Jin-Mun; Kim, Dong-Yu

    2014-01-01

    Perovskite solar cells (PeSCs) are considered one of the most competitive next-generation power sources. To date, light-to-electric conversion efficiencies have rapidly increased to over 10%, and further improvements are expected. However, the poor device reproducibility of PeSCs, ascribed to their inhomogeneously covered film morphology, has hindered their practical application. Here, we demonstrate high-performance PeSCs with superior reproducibility by introducing small amounts of N-cyclohexyl-2-pyrrolidone (CHP) as a morphology controller into N,N-dimethylformamide (DMF). As a result, a highly homogeneous film morphology, similar to that achieved by vacuum-deposition methods, was obtained, along with a high PCE of 10% and an extremely small performance deviation within 0.14%. This study presents a method for realizing efficient and reproducible planar heterojunction (PHJ) PeSCs through morphology control, a major step toward the low-cost and rapid production of PeSCs that addresses one of the biggest problems of PHJ perovskite photovoltaic technology with a facile method. PMID:25377945

  4. Planar heterojunction perovskite solar cells with superior reproducibility

    PubMed Central

    Jeon, Ye-Jin; Lee, Sehyun; Kang, Rira; Kim, Jueng-Eun; Yeo, Jun-Seok; Lee, Seung-Hoon; Kim, Seok-Soon; Yun, Jin-Mun; Kim, Dong-Yu

    2014-01-01

    Perovskite solar cells (PeSCs) are considered one of the most competitive next-generation power sources. To date, light-to-electric conversion efficiencies have rapidly increased to over 10%, and further improvements are expected. However, the poor device reproducibility of PeSCs, ascribed to their inhomogeneously covered film morphology, has hindered their practical application. Here, we demonstrate high-performance PeSCs with superior reproducibility by introducing small amounts of N-cyclohexyl-2-pyrrolidone (CHP) as a morphology controller into N,N-dimethylformamide (DMF). As a result, a highly homogeneous film morphology, similar to that achieved by vacuum-deposition methods, was obtained, along with a high PCE of 10% and an extremely small performance deviation within 0.14%. This study presents a method for realizing efficient and reproducible planar heterojunction (PHJ) PeSCs through morphology control, a major step toward the low-cost and rapid production of PeSCs that addresses one of the biggest problems of PHJ perovskite photovoltaic technology with a facile method. PMID:25377945

  5. Feasibility and Reproducibility of Echo Planar Spectroscopic Imaging on the Quantification of Hepatic Fat

    PubMed Central

    Lin, Yi-Ru; Chiu, Jian-Jia; Tsai, Shang-Yueh

    2014-01-01

    Objectives 1H-MRS is widely regarded as the most accurate noninvasive method to quantify hepatic fat content (HFC). When a practical breath-hold duration and the acquisition of HFC over multiple liver areas are considered, a fast MR spectroscopic imaging technique is desirable. The aim of this study was to examine the feasibility and reproducibility of echo planar spectroscopic imaging (EPSI) for the quantification of HFC in subjects with various HFCs. Methods Twenty-two volunteers were examined in a 3T MR system. The acquisition time of the proposed EPSI protocol was 18 seconds. The EPSI scans were repeated 8 times for each subject to test reproducibility. The water peak and the individual fat peaks, including the methyl, methylene, and allylic peaks at 0.9, 1.3, and 2.0 ppm, were fitted. The calculated water and fat contents were corrected for T2 relaxation. The total HFC was defined as the combination of the individual fat peaks. The standard deviation (SD), coefficient of variance (COV) and fitting reliability of HFC quantified by LCModel were calculated. Results Our results show that the SDs of total HFC for all subjects are less than 2.5%. Fitting reliability is mostly under 10% and positively correlates with COV. When subjects were separated into three subgroups according to total HFC, fitting reliability and reproducibility improved in subjects with higher total HFC. Conclusions We have demonstrated the feasibility of the proposed EPSI protocol for quantifying HFC over a whole liver slice with a scan time within a single breath hold. PMID:25514348
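
    In spectroscopic fat quantification, the fat fraction is typically formed from the fitted peak areas after each signal is corrected back to TE = 0 for T2 decay. The sketch below shows only that arithmetic; the T2 values, echo time, and peak areas are assumptions for illustration, not the study's parameters:

        import numpy as np

        def t2_corrected(signal, te_ms, t2_ms):
            """Extrapolate a fitted peak area back to TE = 0 via S0 = S * exp(TE/T2)."""
            return signal * np.exp(te_ms / t2_ms)

        def hepatic_fat_content(water, fat_peaks, te_ms, t2_water_ms=25.0, t2_fat_ms=60.0):
            """Fat fraction (%) from the water peak and the summed fat peaks, T2-corrected."""
            w = t2_corrected(water, te_ms, t2_water_ms)
            f = sum(t2_corrected(p, te_ms, t2_fat_ms) for p in fat_peaks)
            return 100.0 * f / (f + w)

        # Illustrative fitted areas for water and the 0.9 / 1.3 / 2.0 ppm fat peaks
        print(hepatic_fat_content(1000.0, [20.0, 150.0, 15.0], te_ms=15.0))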

  6. Reproducible and Consistent Quantification of the Saccharomyces cerevisiae Proteome by SWATH-mass spectrometry*

    PubMed Central

    Selevsek, Nathalie; Chang, Ching-Yun; Gillet, Ludovic C.; Navarro, Pedro; Bernhardt, Oliver M.; Reiter, Lukas; Cheng, Lin-Yang; Vitek, Olga; Aebersold, Ruedi

    2015-01-01

    Targeted mass spectrometry by selected reaction monitoring (S/MRM) has proven to be a suitable technique for the consistent and reproducible quantification of proteins across multiple biological samples and a wide dynamic range. This performance profile is an important prerequisite for systems biology and biomedical research. However, the method is limited to the measurements of a few hundred peptides per LC-MS analysis. Recently, we introduced SWATH-MS, a combination of data independent acquisition and targeted data analysis that vastly extends the number of peptides/proteins quantified per sample, while maintaining the favorable performance profile of S/MRM. Here we applied the SWATH-MS technique to quantify changes over time in a large fraction of the proteome expressed in Saccharomyces cerevisiae in response to osmotic stress. We sampled cell cultures in biological triplicates at six time points following the application of osmotic stress and acquired single injection data independent acquisition data sets on a high-resolution 5600 tripleTOF instrument operated in SWATH mode. Proteins were quantified by the targeted extraction and integration of transition signal groups from the SWATH-MS datasets for peptides that are proteotypic for specific yeast proteins. We consistently identified and quantified more than 15,000 peptides and 2500 proteins across the 18 samples. We demonstrate high reproducibility between technical and biological replicates across all time points and protein abundances. In addition, we show that the abundance of hundreds of proteins was significantly regulated upon osmotic shock, and pathway enrichment analysis revealed that the proteins reacting to osmotic shock are mainly involved in the carbohydrate and amino acid metabolism. Overall, this study demonstrates the ability of SWATH-MS to efficiently generate reproducible, consistent, and quantitatively accurate measurements of a large fraction of a proteome across multiple samples. PMID

  7. The validity and reproducibility of clinical assessment of nutritional status in the elderly.

    PubMed

    Duerksen, D R; Yeo, T A; Siemens, J L; O'Connor, M P

    2000-09-01

    Malnutrition is an important predictor of morbidity and mortality. In the non-elderly, a subjective global assessment (SGA) has been developed; it has high inter-rater agreement, correlates with other measures of nutritional status, and predicts subsequent morbidity. The purpose of this study was to determine the validity and reproducibility of the SGA in a group of patients older than 70 years of age. Consecutive patients from four geriatric/rehabilitation units were considered for the study. Each patient underwent independent nutritional assessments by a geriatrician and a senior medical resident. At the completion of the assessment, skinfold caliper measurements were obtained and the patient was reclassified according to the results, which were then compared with objective measures of nutritional status. Six-month follow-up was obtained on all patients. The agreement between the two clinicians was 0.48 +/- 0.17 (unweighted kappa), which represents moderate agreement and is less than the agreement reported in non-elderly subjects. Skinfold calipers improved the agreement between clinicians but did not improve the correlation with other nutritional markers or the prediction of morbidity and mortality. There was a correlation between a severely malnourished state and mortality. In addition, patients with a body mass index (BMI) of <75% or >150% of age/sex-standardized norms had increased mortality. The SGA is a reproducible and valid tool for determining nutritional status in the elderly. Its reproducibility is lower than in the non-elderly, which may relate to changes in body composition or to the ability to obtain an accurate nutritional history.

  8. REPRODUCIBILITY OF INTRA-ABDOMINAL PRESSURE MEASURED DURING PHYSICAL ACTIVITIES VIA A WIRELESS VAGINAL TRANSDUCER

    PubMed Central

    Egger, Marlene J.; Hamad, Nadia M.; Hitchcock, Robert W.; Coleman, Tanner J.; Shaw, Janet M.; Hsu, Yvonne; Nygaard, Ingrid E.

    2014-01-01

    Aims In the urodynamics laboratory setting, a wireless pressure transducer, developed to facilitate research exploring intra-abdominal pressure (IAP) and pelvic floor disorders, was highly accurate. We aimed to study the reproducibility of IAP measured using this transducer in women during activities performed in an exercise science laboratory. Methods Fifty-seven women (mean ± SD: age 30.4 ± 9.3 years; body mass index 22.4 ± 2.68 kg/m2) completed two standardized activity sessions using the same transducer at least three days apart. Pressure data for 31 activities were transmitted wirelessly to a base station and analyzed for mean net maximal IAP, area under the curve (AUC) and first moment of the area (FMA). Activities included typical exercises, lifting 13.6 to 18.2 kg, and simulated household tasks. Analysis of test-retest reliability included Bland-Altman plots with absolute limits of agreement (ALOA), Wilcoxon signed rank tests to assess significant differences between sessions, intraclass correlations, and kappa statistics to assess inter-session agreement in the highest vs. other quintiles of maximal IAP. Results Few activities exhibited significant differences between sessions in maximal IAP, or in AUC and FMA values. For 13 activities, the agreement between repeat measures of maximal IAP was better than ± 10 cm H2O; for 20 activities, better than ± 15 cm H2O. ALOA increased with mean IAP. The highest quintile of IAP demonstrated fair to substantial agreement between sessions in 25 of 30 activities. Conclusion Reproducibility of IAP depends on the activity undertaken. Interventions geared towards lowering IAP should account for this and maximize efforts to improve IAP reproducibility. PMID:25730430
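
    The Bland-Altman absolute limits of agreement reported above reduce to the mean and standard deviation of the paired between-session differences. A minimal sketch (the pressure values are invented):

        import numpy as np

        def limits_of_agreement(session1, session2):
            """Bland-Altman bias and 95% limits of agreement (bias +/- 1.96 SD)."""
            d = np.asarray(session1, float) - np.asarray(session2, float)
            bias, sd = d.mean(), d.std(ddof=1)
            return bias, bias - 1.96 * sd, bias + 1.96 * sd

        # Maximal IAP (cm H2O) for one activity, measured in two sessions
        s1 = [42.0, 55.0, 61.0, 38.0, 47.0]
        s2 = [45.0, 52.0, 66.0, 36.0, 49.0]
        print(limits_of_agreement(s1, s2))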

  9. Vapor Pressure of Aqueous Solutions of Electrolytes Reproduced with Coarse-Grained Models without Electrostatics.

    PubMed

    Perez Sirkin, Yamila A; Factorovich, Matías H; Molinero, Valeria; Scherlis, Damian A

    2016-06-14

    The vapor pressure of water is a key property in a large class of applications from the design of membranes for fuel cells and separations to the prediction of the mixing state of atmospheric aerosols. Molecular simulations have been used to compute vapor pressures, and a few studies on liquid mixtures and solutions have been reported on the basis of the Gibbs Ensemble Monte Carlo method in combination with atomistic force fields. These simulations are costly, making them impractical for the prediction of the vapor pressure of complex materials. The goal of the present work is twofold: (1) to demonstrate the use of the grand canonical screening approach (Factorovich, M. H. J. Chem. Phys. 2014, 140, 064111) to compute the vapor pressure of solutions and to extend the methodology for the treatment of systems without a liquid-vapor interface and (2) to investigate the ability of computationally efficient high-resolution coarse-grained models based on the mW monatomic water potential and ions described exclusively with short-range interactions to reproduce the relative vapor pressure of aqueous solutions. We find that coarse-grained models of LiCl and NaCl solutions faithfully reproduce the experimental relative pressures up to high salt concentrations, despite the inability of these models to predict cohesive energies of the solutions or the salts. A thermodynamic analysis reveals that the coarse-grained models achieve the experimental activity coefficients of water in solution through a compensation of severely underestimated hydration and vaporization free energies of the salts. Our results suggest that coarse-grained models developed to replicate the hydration structure and the effective ion-ion attraction in solution may lead to this compensation. Moreover, they suggest an avenue for the design of coarse-grained models that accurately reproduce the activity coefficients of solutions.

  10. Vapor Pressure of Aqueous Solutions of Electrolytes Reproduced with Coarse-Grained Models without Electrostatics.

    PubMed

    Perez Sirkin, Yamila A; Factorovich, Matías H; Molinero, Valeria; Scherlis, Damian A

    2016-06-14

    The vapor pressure of water is a key property in a large class of applications from the design of membranes for fuel cells and separations to the prediction of the mixing state of atmospheric aerosols. Molecular simulations have been used to compute vapor pressures, and a few studies on liquid mixtures and solutions have been reported on the basis of the Gibbs Ensemble Monte Carlo method in combination with atomistic force fields. These simulations are costly, making them impractical for the prediction of the vapor pressure of complex materials. The goal of the present work is twofold: (1) to demonstrate the use of the grand canonical screening approach (Factorovich, M. H. J. Chem. Phys. 2014, 140, 064111) to compute the vapor pressure of solutions and to extend the methodology for the treatment of systems without a liquid-vapor interface and (2) to investigate the ability of computationally efficient high-resolution coarse-grained models based on the mW monatomic water potential and ions described exclusively with short-range interactions to reproduce the relative vapor pressure of aqueous solutions. We find that coarse-grained models of LiCl and NaCl solutions faithfully reproduce the experimental relative pressures up to high salt concentrations, despite the inability of these models to predict cohesive energies of the solutions or the salts. A thermodynamic analysis reveals that the coarse-grained models achieve the experimental activity coefficients of water in solution through a compensation of severely underestimated hydration and vaporization free energies of the salts. Our results suggest that coarse-grained models developed to replicate the hydration structure and the effective ion-ion attraction in solution may lead to this compensation. Moreover, they suggest an avenue for the design of coarse-grained models that accurately reproduce the activity coefficients of solutions. PMID:27196963

  11. Accurate Low-mass Stellar Models of KOI-126

    NASA Astrophysics Data System (ADS)

    Feiden, Gregory A.; Chaboyer, Brian; Dotter, Aaron

    2011-10-01

    The recent discovery of an eclipsing hierarchical triple system with two low-mass stars in a close orbit (KOI-126) by Carter et al. appeared to reinforce the evidence that theoretical stellar evolution models are not able to reproduce the observational mass-radius relation for low-mass stars. We present a set of stellar models for the three stars in the KOI-126 system that show excellent agreement with the observed radii. This agreement appears to be due to the equation of state implemented by our code. A significant dispersion in the observed mass-radius relation for fully convective stars is demonstrated, indicative of the influence of physics not currently incorporated in standard stellar evolution models. We also predict apsidal motion constants for the two M dwarf companions. These values should be observationally determined to within 1% by the end of the Kepler mission.

  12. Reproducibility of transcranial magnetic stimulation metrics in the study of proximal upper limb muscles

    PubMed Central

    Sankarasubramanian, Vishwanath; Roelle, Sarah; Bonnett, Corin E; Janini, Daniel; Varnerin, Nicole; Cunningham, David A; Sharma, Jennifer S; Potter-Baker, Kelsey A; Wang, Xiaofeng; Yue, Guang H; Plow, Ela B

    2015-01-01

    Objective Reproducibility of transcranial magnetic stimulation (TMS) metrics is essential for accurately tracking recovery and disease. However, the majority of evidence pertains to the reproducibility of metrics for distal upper limb muscles. We investigate, for the first time, the reliability of corticospinal physiology for a large proximal muscle, the biceps brachii, and relate how varying statistical analyses can influence interpretations. Methods Fourteen young, right-handed, healthy participants completed two sessions assessing resting motor threshold (RMT), motor evoked potentials (MEPs), motor map and intra-cortical inhibition (ICI) from the left biceps brachii. Analyses included paired t-tests, Pearson's, intra-class (ICC) and concordance correlation coefficients (CCC) and Bland-Altman plots. Results Unlike paired t-tests, ICC, CCC and Pearson's coefficients were >0.6, indicating good reliability for RMTs, MEP intensities and map locations; however, values were <0.3 for MEP responses and ICI. Conclusions Corticospinal physiology, defining excitability and output in terms of the intensity of the TMS device, and spatial loci are the most reliable metrics for the biceps. MEPs and variables based on MEPs are less reliable, since the biceps receives fewer cortico-motor-neuronal projections. Statistical tests of agreement and association are more powerful reliability indices than inferential tests. Significance Reliable metrics of proximal muscles, when translated to a larger number of participants, would serve to sensitively track and prognosticate function in neurological disorders such as stroke, where proximal recovery precedes distal. PMID:26111434

  13. A novel methodology to reproduce previously recorded six-degree of freedom kinematics on the same diarthrodial joint.

    PubMed

    Moore, Susan M; Thomas, Maribeth; Woo, Savio L-Y; Gabriel, Mary T; Kilger, Robert; Debski, Richard E

    2006-01-01

    The objective of this study was to develop a novel method to more accurately reproduce previously recorded 6-DOF kinematics of the tibia with respect to the femur using robotic technology. Furthermore, the effects of performing a single versus multiple registrations and of the robot joint configuration were investigated. A single registration consisted of registering the tibia and femur with respect to the robot at full extension and reproducing all kinematics, while multiple registrations consisted of registering the bones at each flexion angle and reproducing only the kinematics of the corresponding flexion angle. Kinematics of the knee in response to anterior (134 N), combined internal/external rotation (+/-10 N m) and varus/valgus (+/-5 N m) loads were collected at 0, 15, 30, 60, and 90 degrees of flexion. A six-axis, serial-articulated robotic manipulator (PUMA Model 762) was calibrated and its working volume was reduced to improve the robot's accuracy. The effect of the robot joint configuration was determined by performing single and multiple registrations for three selected configurations. For each robot joint configuration, the accuracy in position of the reproduced kinematics improved after multiple registrations (0.7+/-0.3, 1.2+/-0.5, and 0.9+/-0.2 mm, respectively) when compared to only a single registration (1.3+/-0.9, 2.0+/-1.0, and 1.5+/-0.7 mm, respectively) (p<0.05). The accuracy in position of each robot joint configuration was unique, as significant differences were detected between each of the configurations. These data demonstrate that the number of registrations and the robot joint configuration both affect the accuracy of the reproduced kinematics. Therefore, when using robotic technology to reproduce previously recorded kinematics, it may be necessary to perform these analyses for each individual robotic system and for each diarthrodial joint, as different joints will require the robot to be placed in

  14. Reproducibility and validity of a food frequency questionnaire among pregnant women in a Mediterranean area

    PubMed Central

    2013-01-01

    Background Studies exploring the role of diet during pregnancy are still scarce, in part due to the complexity of measuring diet and to the lack of valid instruments. The aim of this study was to examine the reproducibility and validity (against biochemical biomarkers) of a semi-quantitative food frequency questionnaire (FFQ) in pregnant women. Methods Participants were 740 pregnant women from a population-based birth cohort study in Valencia (INMA Study). We compared nutrient and food intakes from FFQs estimated for two periods of pregnancy (reproducibility), and compared energy-adjusted intakes of several carotenoids, folate, vitamin B12, vitamin C and α-tocopherol from the first-trimester FFQ with their concentrations in blood specimens (validity). Results Significant correlations for reproducibility were found for major food groups and nutrients but not for lycopene (r=0.06); the average correlation coefficients for daily intake were 0.51 for food groups and 0.61 for nutrients. For validity, statistically significant correlations were observed for vitamin C (0.18), α-carotene (0.32), β-carotene (0.22), lutein-zeaxanthin (0.29) and β-cryptoxanthin (0.26); non-significant correlations were observed for retinol, lycopene, α-tocopherol, vitamin B12 and folate (r≤0.12). When dietary supplement use was considered, correlations improved substantially for folate (0.53) and to a lesser extent for vitamin B12 (0.12) and vitamin C (0.20). Conclusion This study indicates that the FFQ has good reproducibility for nutrient and food intake and can provide a valid estimate of the intake of several important nutrients during pregnancy. PMID:23421854

  15. Accuracy and reproducibility of low dose insulin administration using pen-injectors and syringes

    PubMed Central

    Gnanalingham, M; Newland, P; Smith, C

    1998-01-01

    Many children with diabetes require small doses of insulin administered with syringes or pen-injector devices (at the Booth Hall Paediatric Diabetic Clinic, 20% of children aged 0-5 years receive 1-2 U insulin doses). To determine how accurately and reproducibly small doses are delivered, 1, 2, 5, and 10 U doses of soluble insulin (100 U/ml) were dispensed in random order 15 times from five new NovoPens (1.5 ml), five BD-Pens (1.5 ml), and by five nurses using 30 U syringes. Each dose was weighed, and intended and actual doses compared. The two pen-injectors delivered less insulin than syringes, with differences inversely proportional to dose. For 1 U (mean (SD)): 0.89 (0.04) U (NovoPen), 0.92 (0.03) U (BD-Pen), 1.23 (0.09) U (syringe); and for 10 U: 9.8 (0.1) U (NovoPen), 9.9 (0.1) U (BD-Pen), 10.1 (0.1) U (syringe). The two pen-injectors had similar accuracy (percentage error) and were more accurate than syringes when delivering 1, 2, and 5 U of insulin. Errors for 1 U: 11(4)% (NovoPen), 8(3)% (BD-Pen), 23(9)% (syringe). The reproducibility (coefficient of variation) of actual doses was similar (< 7%) for all three devices, which were equally consistent at underdosing (pen-injectors) or overdosing (syringes) insulin. All three devices, especially syringes, are unacceptably inaccurate when delivering 1 U doses of insulin. Patients on low doses need to be educated that their dose may alter when they transfer from one device to another. PMID:9771255

  16. Fast and accurate generation of ab initio quality atomic charges using nonparametric statistical regression.

    PubMed

    Rai, Brajesh K; Bakken, Gregory A

    2013-07-15

    We introduce a class of partial atomic charge assignment methods that provides an ab initio quality description of the electrostatics of bioorganic molecules. The method uses a set of models that neither have a fixed functional form nor require a fixed set of parameters, and are therefore capable of capturing the complexities of the charge distribution in great detail. Random Forest regression is used to build separate charge models for the elements H, C, N, O, F, S, and Cl, using training data consisting of partial charges along with a description of their surrounding chemical environments; training set charges are generated by fitting to the b3lyp/6-31G* electrostatic potential (ESP) and are subsequently refined to improve the consistency and transferability of the charge assignments. Using a set of 210 neutral, small organic molecules, the absolute hydration free energy calculated using these charges in conjunction with a Generalized Born solvation model shows a low mean unsigned error, close to 1 kcal/mol, from the experimental data. Using another large and independent test set of chemically diverse organic molecules, the method is shown to accurately reproduce charge-dependent observables (ESP and dipole moment) from ab initio calculations. The method presented here automatically provides an estimate of potential errors in the charge assignment, enabling systematic improvement of these models using additional data. This work has implications not only for the future development of charge models but also for developing methods to describe many other chemical properties that require an accurate representation of the electronic structure of the system.
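
    The core regression step can be sketched as fitting one Random Forest per element on numeric descriptors of each atom's chemical environment. The descriptor layout and the training data below are placeholders; only the modeling pattern follows the abstract:

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(0)

        # Hypothetical training set for one element: each row encodes an atom's
        # environment (neighbor counts, distances, ring membership, ...);
        # targets are ESP-fitted partial charges.
        X_env = rng.normal(size=(5000, 32))
        q_esp = rng.normal(loc=-0.1, scale=0.2, size=5000)

        model = RandomForestRegressor(n_estimators=200, random_state=0)
        model.fit(X_env, q_esp)

        # Predict charges for new atoms; the spread across trees gives a rough
        # per-atom uncertainty, in the spirit of the error estimate mentioned above.
        X_new = rng.normal(size=(10, 32))
        q_pred = model.predict(X_new)
        q_std = np.stack([t.predict(X_new) for t in model.estimators_]).std(axis=0)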

  17. Sensitivity studies of spin cut-off models on fission fragment observables

    NASA Astrophysics Data System (ADS)

    Thulliez, L.; Litaize, O.; Serot, O.

    2016-03-01

    A fission fragment de-excitation code, FIFRELIN, is being developed at CEA Cadarache. It allows probing the characteristics of the promptly emitted particles (neutrons and gammas) during the de-excitation of fully accelerated fission fragments. Knowledge of the initial states of the fragments is important for accurately reproducing the fission fragment observables. In this paper, a sensitivity study of various spin cut-off models, which completely define the initial fission fragment angular momentum distribution, has been performed. This study shows that the choice of model has a significant impact on gamma observables such as the spectrum and multiplicity, and almost none on the neutron observables.

  18. A colorimetric-based accurate method for the determination of enterovirus 71 titer.

    PubMed

    Pourianfar, Hamid Reza; Javadi, Arman; Grollo, Lara

    2012-12-01

    The 50% tissue culture infectious dose (TCID50) is still one of the most commonly used techniques for estimating virus titers. However, the traditional TCID50 assay is time consuming, susceptible to subjective errors and generates only quantal data. Here, we describe a colorimetric approach for the titration of Enterovirus 71 (EV71) using a modified method for making virus dilutions. In summary, the titration of EV71 using MTT or MTS staining with a modified virus dilution method decreased the time of the assay and eliminated the subjectivity of observational results, improving the accuracy, reproducibility and reliability of virus titration in comparison with the conventional TCID50 approach (p < 0.01). In addition, the results showed that the plaque assay correlated better with our approach than with the traditional TCID50 approach. This increased accuracy also improved the ability to predict the number of virus plaque forming units present in a solution. These improvements could be of use in any virological experiment in which a quick, accurate titration of a cytopathic virus is required or a reasonable estimate of the number of viral plaques based on a TCID50 value is desired.
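
    A colorimetric titration like this ends with an endpoint calculation: wells are scored as infected when viability (absorbance relative to uninfected controls) falls below a cutoff, and the 50% endpoint is interpolated across the dilution series. The sketch below uses a simple linear interpolation in log-dilution space; the cutoff, dilution scheme, and readings are assumptions, not the published protocol:

        import numpy as np

        def score_wells(absorbance, control_mean, cutoff=0.5):
            """A well counts as infected when viability A/control falls below cutoff."""
            return np.asarray(absorbance, float) / control_mean < cutoff

        def tcid50_log10(log10_dilutions, infected_fraction):
            """log10 dilution at which 50% of wells are infected (linear interpolation)."""
            d = np.asarray(log10_dilutions, float)
            f = np.asarray(infected_fraction, float)
            # infected fraction decreases with dilution, so interpolate on reversed arrays
            return float(np.interp(0.5, f[::-1], d[::-1]))

        # Fraction of wells scored infected at each ten-fold dilution (invented data)
        frac_one_dilution = score_wells([0.2, 0.3, 1.1, 0.25], control_mean=1.0).mean()
        log10_dil = [-1, -2, -3, -4, -5, -6]
        infected = [1.0, 1.0, 0.875, 0.5, 0.125, 0.0]
        print(tcid50_log10(log10_dil, infected))   # -> -4.0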

  19. The puzzling Venusian polar atmospheric structure reproduced by a general circulation model.

    PubMed

    Ando, Hiroki; Sugimoto, Norihiko; Takagi, Masahiro; Kashimura, Hiroki; Imamura, Takeshi; Matsuda, Yoshihisa

    2016-01-01

    Unlike the polar vortices observed in the Earth, Mars and Titan atmospheres, the observed Venus polar vortex is warmer than the midlatitudes at cloud-top levels (∼65 km). This warm polar vortex is zonally surrounded by a cold latitude band located at ∼60° latitude, which is a unique feature called 'cold collar' in the Venus atmosphere. Although these structures have been observed in numerous previous observations, the formation mechanism is still unknown. Here we perform numerical simulations of the Venus atmospheric circulation using a general circulation model, and succeed in reproducing these puzzling features in close agreement with the observations. The cold collar and warm polar region are attributed to the residual mean meridional circulation enhanced by the thermal tide. The present results strongly suggest that the thermal tide is crucial for the structure of the Venus upper polar atmosphere at and above cloud levels. PMID:26832195

  20. The puzzling Venusian polar atmospheric structure reproduced by a general circulation model

    PubMed Central

    Ando, Hiroki; Sugimoto, Norihiko; Takagi, Masahiro; Kashimura, Hiroki; Imamura, Takeshi; Matsuda, Yoshihisa

    2016-01-01

    Unlike the polar vortices observed in the Earth, Mars and Titan atmospheres, the observed Venus polar vortex is warmer than the midlatitudes at cloud-top levels (∼65 km). This warm polar vortex is zonally surrounded by a cold latitude band located at ∼60° latitude, which is a unique feature called 'cold collar' in the Venus atmosphere. Although these structures have been observed in numerous previous observations, the formation mechanism is still unknown. Here we perform numerical simulations of the Venus atmospheric circulation using a general circulation model, and succeed in reproducing these puzzling features in close agreement with the observations. The cold collar and warm polar region are attributed to the residual mean meridional circulation enhanced by the thermal tide. The present results strongly suggest that the thermal tide is crucial for the structure of the Venus upper polar atmosphere at and above cloud levels. PMID:26832195

  1. Quantum theory as the most robust description of reproducible experiments

    NASA Astrophysics Data System (ADS)

    De Raedt, Hans; Katsnelson, Mikhail I.; Michielsen, Kristel

    2014-08-01

    It is shown that the basic equations of quantum theory can be obtained from a straightforward application of logical inference to experiments for which there is uncertainty about individual events and for which the frequencies of the observed events are robust with respect to small changes in the conditions under which the experiments are carried out. There is no quantum world. There is only an abstract physical description. It is wrong to think that the task of physics is to find out how nature is. Physics concerns what we can say about nature [45]. Physics is to be regarded not so much as the study of something a priori given, but rather as the development of methods of ordering and surveying human experience. In this respect our task must be to account for such experience in a manner independent of individual subjective judgment and therefore objective in the sense that it can be unambiguously communicated in ordinary human language [46]. The physical content of quantum mechanics is exhausted by its power to formulate statistical laws governing observations under conditions specified in plain language [46]. The first two sentences of the first quote may be read as a suggestion to dispose of, in Mermin's words [47], the "bad habit" to take mathematical abstractions as the reality of the events (in the everyday sense of the word) that we experience through our senses. Although widely circulated, these sentences are reported by Petersen [45] and there is doubt that Bohr actually used this wording [48]. The last two sentences of the first quote and the second quote suggest that we should try to describe human experiences (confined to the realm of scientific inquiry) in a manner and language which is unambiguous and independent of the individual subjective judgment. Of course, the latter should not be construed to imply that the observed phenomena are independent of the choices made by the individual(s) in performing the scientific experiment [49].The third quote

  2. D-BRAIN: Anatomically Accurate Simulated Diffusion MRI Brain Data.

    PubMed

    Perrone, Daniele; Jeurissen, Ben; Aelterman, Jan; Roine, Timo; Sijbers, Jan; Pizurica, Aleksandra; Leemans, Alexander; Philips, Wilfried

    2016-01-01

    Diffusion Weighted (DW) MRI allows for the non-invasive study of water diffusion inside living tissues. As such, it is useful for the investigation of human brain white matter (WM) connectivity in vivo through fiber tractography (FT) algorithms. Many DW-MRI tailored restoration techniques and FT algorithms have been developed. However, it is not clear how accurately these methods reproduce the WM bundle characteristics in real-world conditions, such as in the presence of noise, partial volume effect, and a limited spatial and angular resolution. The difficulty lies in the lack of a realistic brain phantom on the one hand, and a sufficiently accurate way of modeling the acquisition-related degradation on the other. This paper proposes a software phantom that approximates a human brain to a high degree of realism and that can incorporate complex brain-like structural features. We refer to it as a Diffusion BRAIN (D-BRAIN) phantom. Also, we propose an accurate model of a (DW) MRI acquisition protocol to allow for validation of methods in realistic conditions with data imperfections. The phantom model simulates anatomical and diffusion properties for multiple brain tissue components, and can serve as a ground-truth to evaluate FT algorithms, among others. The simulation of the acquisition process allows one to include noise, partial volume effects, and limited spatial and angular resolution in the images. In this way, the effect of image artifacts on, for instance, fiber tractography can be investigated with great detail. The proposed framework enables reliable and quantitative evaluation of DW-MR image processing and FT algorithms at the level of large-scale WM structures. The effect of noise levels and other data characteristics on cortico-cortical connectivity and tractography-based grey matter parcellation can be investigated as well. PMID:26930054

  3. Radiative properties of astrophysical matter : a quest to reproduce astrophysical conditions on earth.

    SciTech Connect

    Bailey, James E.

    2010-10-01

    Experiments in terrestrial laboratories can be used to evaluate the physical models that interpret astronomical observations. The properties of matter in astrophysical objects are essential components of these models, but terrestrial laboratories struggle to reproduce the extreme conditions that often exist. Megajoule-class DOE/NNSA facilities such as the National Ignition Facility and Z can create unprecedented amounts of matter at extreme conditions, providing new capabilities to test astrophysical models with high accuracy. Experiments at these large facilities are challenging, and access is very competitive. However, the cylindrically symmetric Z source emits radiation in all directions, enabling multiple physics experiments to be driven with a single Z discharge, which helps ameliorate access limitations. This article describes research efforts under way at the Sandia National Laboratories Z facility investigating radiation transport through stellar interior matter, the population kinetics of atoms exposed to the intense radiation emitted by accretion-powered objects, and spectral line formation in white dwarf (WD) photospheres. Opacity quantifies the absorption of radiation by matter and strongly influences stellar structure and evolution, since radiation dominates energy transport deep inside stars. Opacity models have become highly sophisticated, but laboratory tests at the conditions existing inside stars have not been possible - until now. Z research is presently focused on measuring iron absorption at conditions relevant to the base of the solar convection zone, where the electron temperature and density are 190 eV and 9 x 10^22 e/cc, respectively. Creating these conditions in a sample that is sufficiently large, long-lived, and uniform is extraordinarily challenging. A source of radiation that streams through the relatively large samples can produce volumetric heating and thus uniform conditions, but to achieve high temperatures a strong source is required

  4. ACCURATE ORBITAL INTEGRATION OF THE GENERAL THREE-BODY PROBLEM BASED ON THE D'ALEMBERT-TYPE SCHEME

    SciTech Connect

    Minesaki, Yukitaka

    2013-03-15

    We propose an accurate orbital integration scheme for the general three-body problem that retains all conserved quantities except angular momentum. The scheme is provided by an extension of the d'Alembert-type scheme for constrained autonomous Hamiltonian systems. Although the proposed scheme is merely second-order accurate, it can precisely reproduce some periodic, quasiperiodic, and escape orbits. The Levi-Civita transformation plays a role in designing the scheme.

  5. Accurate Orbital Integration of the General Three-body Problem Based on the d'Alembert-type Scheme

    NASA Astrophysics Data System (ADS)

    Minesaki, Yukitaka

    2013-03-01

    We propose an accurate orbital integration scheme for the general three-body problem that retains all conserved quantities except angular momentum. The scheme is provided by an extension of the d'Alembert-type scheme for constrained autonomous Hamiltonian systems. Although the proposed scheme is merely second-order accurate, it can precisely reproduce some periodic, quasiperiodic, and escape orbits. The Levi-Civita transformation plays a role in designing the scheme.

  6. Data assimilation experiment for reproducing localized delay signals derived from InSAR

    NASA Astrophysics Data System (ADS)

    Kinoshita, Y.; Furuya, M.

    2014-12-01

    InSAR phase signals are affected by the Earth's atmosphere, like those of GNSS. Therefore, InSAR can detect water vapor distribution with unprecedented spatial resolution if there are neither surface deformation signals nor other errors, and it is thus potentially useful for meteorological applications. However, there have been only a few studies using InSAR as a water vapor sensor (e.g. Hanssen et al., 1999; Kinoshita et al., 2013). We reported six case studies that detected localized water vapor signals with InSAR based on ALOS/PALSAR data (Kinoshita et al., JpGU 2013), some of which reached over 20 cm in the LOS direction within 10 km2. Each signal was located exactly where the weather radar data showed high rainfall intensity. Such localized signals strongly suggest the existence of developed convective systems at the SAR observation time. To investigate the mechanism of the localized delay signals in the interferogram, we performed a WRF simulation for the Niigata case of 25 August 2010. We used the JMA MSM data and the NCEP high-resolution SST data as initial values. Two nested domains were used, with horizontal resolutions of 3 km and 1 km, respectively. The WRF simulation reproduced a convective system that extended east and west, and the shape of the reproduced convective system was similar to the localized signals in the interferogram, whereas its location was about 30 km north of that of the observed signals. To improve the location of the reproduced convective system, we performed a 4DVAR experiment implemented in the WRF data assimilation model. In this study, we assimilated zenith total delay data derived from the Japanese GNSS network, GEONET. Due to limited computational resources, we performed the 4DVAR data assimilation in the coarser domain (10 km) and then downscaled the assimilated initial values to the finer domain (2.5 km). The simulation with the assimilation could

  7. Mill profiler machines soft materials accurately

    NASA Technical Reports Server (NTRS)

    Rauschl, J. A.

    1966-01-01

    Mill profiler machines bevels, slots, and grooves in soft materials, such as styrofoam phenolic-filled cores, to any desired thickness. A single operator can accurately control cutting depths in contour or straight line work.

  8. Reproducibility of Gadolinium Enhancement Patterns and Wall Thickness in Hypertrophic Cardiomyopathy

    PubMed Central

    Rodriguez-Granillo, Gaston A.; Deviggiano, Alejandro; Capunay, Carlos; Zan, Macarena C. De; Carrascosa, Patricia

    2016-01-01

    Background Reproducibility data of the extent and patterns of late gadolinium enhancement (LGE) in hypertrophic cardiomyopathy (HCM) is limited. Objective To explore the reproducibility of regional wall thickness (WT), LGE extent, and LGE patterns in patients with HCM assessed with cardiac magnetic resonance (CMR). Methods The extent of LGE was assessed by the number of segments with LGE, and by the total LV mass with LGE (% LGE); and the pattern of LGE-CMR was defined for each segment. Results A total of 42 patients (672 segments) with HCM constituted the study population. The mean WT measurements showed a mean difference between observers of -0.62 ± 1.0 mm (6.1%), with limits of agreement of 1.36 mm; -2.60 mm and intraclass correlation coefficient (ICC) of 0.95 (95% CI 0.93-0.96). Maximum WT measurements showed a mean difference between observers of -0.19 ± 0.8 mm (0.9%), with limits of agreement of 1.32 mm; -1.70 mm, and an ICC of 0.95 (95% CI 0.91-0.98). The % LGE showed a mean difference between observers of -1.17 ± 1.2 % (21%), with limits of agreement of 1.16%; -3.49%, and an ICC of 0.94 (95% CI 0.88-0.97). The mean difference between observers regarding the number of segments with LGE was -0.40 ± 0.45 segments (11%), with limits of agreement of 0.50 segments; -1.31 segments, and an ICC of 0.97 (95% CI 0.94-0.99). Conclusions The number of segments with LGE might be more reproducible than the percent of the LV mass with LGE. PMID:27305110
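
    The agreement statistics quoted above (mean inter-observer difference with limits of agreement) follow the usual Bland-Altman recipe; the sketch below shows a generic computation on fabricated wall-thickness readings, purely to make the quantities concrete. The ICCs reported in the study would additionally require a two-way mixed-effects model, which is not reproduced here.

      import numpy as np

      def bland_altman(obs_a, obs_b):
          """Mean inter-observer difference (bias) and 95% limits of agreement."""
          a, b = np.asarray(obs_a, float), np.asarray(obs_b, float)
          diff = a - b
          bias = diff.mean()
          sd = diff.std(ddof=1)
          return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

      # Fabricated wall-thickness readings (mm) from two observers
      obs_a = [14.2, 17.8, 12.5, 21.0, 15.3]
      obs_b = [14.9, 18.1, 13.4, 21.5, 16.0]
      bias, (lo, hi) = bland_altman(obs_a, obs_b)
      print(f"bias = {bias:.2f} mm, limits of agreement = [{lo:.2f}, {hi:.2f}] mm")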

  9. Validity and reproducibility of a food frequency questionnaire to assess food group intake in adolescents.

    PubMed

    Martinez, Marcelle Flores; Philippi, Sonia Tucunduva; Estima, Camilla; Leal, Greisse

    2013-09-01

    The objective of this study was to assess the validity and reproducibility of a food frequency questionnaire to assess intake of the food groups included in the food guide pyramid for adolescents (FFQ-FP). The final version of the FFQ-FP consisted of 50 food items. The study was carried out with a sample of 109 adolescents over a period of four months. A 24hr recall (24hr) was conducted four times and the FFQ-FP was conducted twice. Validity was determined by comparing the second FFQ-FP and the mean of the four 24hrs, while reproducibility was verified by comparing the results of the two FFQ-FPs. Statistical analysis was carried out using medians, standard deviations, Pearson and intraclass correlations and Kappa statistics to assess concordance. Best results were achieved for the rice (including bread, grains and starches), meats and sugars groups. Weakest correlation was observed for the variable vitamin C. The validity and reproducibility of the FFQ-FP was satisfactory for most variables. PMID:24068225

  10. Reproducibility of the World Health Organization 2008 criteria for myelodysplastic syndromes

    PubMed Central

    Senent, Leonor; Arenillas, Leonor; Luño, Elisa; Ruiz, Juan C.; Sanz, Guillermo; Florensa, Lourdes

    2013-01-01

    The reproducibility of the World Health Organization 2008 classification for myelodysplastic syndromes is uncertain and its assessment was the major aim of this study. The different peripheral blood and bone marrow variables required for an adequate morphological classification were blindly evaluated by four cytomorphologists in samples from 50 patients with myelodysplastic syndromes. The degree of agreement among observers was calculated using intraclass correlation coefficient and the generalized kappa statistic for multiple raters. The degree of agreement for the percentages of blasts in bone marrow and peripheral blood, ring sideroblasts in bone marrow, and erythroid, granulocytic and megakaryocytic dysplastic cells was strong (P<0.001 in all instances). After stratifying the percentages according to the categories required for the assignment of World Health Organization subtypes, the degree of agreement was not statistically significant for cases with 5-9% blasts in bone marrow (P=0.07), 0.1-1% blasts in peripheral blood (P=0.47), or percentage of erythroid dysplastic cells (P=0.49). Finally, the interobserver concordance for World Health Organization-defined subtypes showed a moderate overall agreement (P<0.001), the reproducibility being lower for cases with refractory anemia with excess of blasts type 1 (P=0.05) and refractory anemia with ring sideroblasts (P=0.09). In conclusion, the reproducibility of the World Health Organization 2008 classification for myelodysplastic syndromes is acceptable but the defining criteria for blast cells and features of erythroid dysplasia need to be refined. PMID:23065505
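
    The "generalized kappa statistic for multiple raters" mentioned above is commonly computed as Fleiss' kappa. The sketch below is a minimal generic implementation, assuming every case is scored by the same number of raters; the 5-case, 4-rater, 3-category table is fabricated for illustration and has nothing to do with the study's data.

      import numpy as np

      def fleiss_kappa(counts):
          """Fleiss' kappa.

          counts[i, j] = number of raters who assigned case i to category j;
          every row must sum to the same number of raters.
          """
          counts = np.asarray(counts, float)
          n_cases, _ = counts.shape
          n_raters = counts[0].sum()
          p_j = counts.sum(axis=0) / (n_cases * n_raters)          # category proportions
          p_i = (np.square(counts).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
          p_bar, p_e = p_i.mean(), np.square(p_j).sum()
          return (p_bar - p_e) / (1.0 - p_e)

      # Toy example: 5 cases, 4 raters, 3 diagnostic categories (hypothetical data)
      counts = np.array([[4, 0, 0],
                         [3, 1, 0],
                         [0, 4, 0],
                         [1, 1, 2],
                         [0, 0, 4]])
      print(round(fleiss_kappa(counts), 3))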

  11. Validity and reproducibility of a food frequency questionnaire to assess food group intake in adolescents.

    PubMed

    Martinez, Marcelle Flores; Philippi, Sonia Tucunduva; Estima, Camilla; Leal, Greisse

    2013-09-01

    The objective of this study was to assess the validity and reproducibility of a food frequency questionnaire to assess intake of the food groups included in the food guide pyramid for adolescents (FFQ-FP). The final version of the FFQ-FP consisted of 50 food items. The study was carried out with a sample of 109 adolescents over a period of four months. A 24hr recall (24hr) was conducted four times and the FFQ-FP was conducted twice. Validity was determined by comparing the second FFQ-FP and the mean of the four 24hrs, while reproducibility was verified by comparing the results of the two FFQ-FPs. Statistical analysis was carried out using medians, standard deviations, Pearson and intraclass correlations and Kappa statistics to assess concordance. Best results were achieved for the rice (including bread, grains and starches), meats and sugars groups. Weakest correlation was observed for the variable vitamin C. The validity and reproducibility of the FFQ-FP was satisfactory for most variables.

  12. Reproducibility of a web-based FFQ for 13- to 15-year-old Danish adolescents.

    PubMed

    Bjerregaard, Anne A; Tetens, Inge; Olsen, Sjurdur F; Halldorsson, Thorhallur I

    2016-01-01

    FFQs are widely used in large-scale studies to assess dietary intake. To aid interpretation of diet-disease associations, assessment of validity must be performed. Reproducibility is one aspect of validity, focusing on the stability of repeated assessment with the same method, which may also reveal problems in instrument design or participant instructions. The aim of the present study was to evaluate the reproducibility of a web-based FFQ targeting Danish adolescents within the Danish National Birth Cohort (DNBC). Data for the present study were obtained from a prospective design nested within the DNBC. Adolescents aged 13 to 15 years (n 48, 60 % girls) completed the FFQ twice 4 weeks apart. The proportion of adolescents consistently classified into the same tertile according to amount of food intake ranged from 45 % (fish) to 77 % (vegetables), whereas classification into opposite tertiles ranged from 0 % (fruit, oils and dressing) to 15 % (beverages). Overall, no significant differences were observed in intake of food groups or nutrients between the two completions of the FFQ. Mean crude Spearman correlation for all food groups was 0·56 and mean intra-class correlation for all food groups was 0·61. In conclusion, the reproducibility of the FFQ for Danish adolescents was acceptable. The study revealed that adolescents aged 13-15 years seemed capable of consistently recalling overall dietary habits but had some difficulties estimating the frequency of consumption of regularly consumed food items. PMID:26855775
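
    The tertile cross-classification figures reported above (same-tertile versus opposite-tertile percentages) are straightforward to compute from two administrations of the same instrument; the sketch below is a generic illustration on simulated intake data, with all variable names and values being assumptions.

      import numpy as np
      import pandas as pd

      def tertile_agreement(intake_1, intake_2):
          """Percent of participants classified into the same and into opposite tertiles."""
          t1 = pd.qcut(intake_1, 3, labels=False)
          t2 = pd.qcut(intake_2, 3, labels=False)
          same = np.mean(t1 == t2) * 100.0
          opposite = np.mean(np.abs(t1 - t2) == 2) * 100.0
          return same, opposite

      rng = np.random.default_rng(0)
      first = rng.gamma(shape=2.0, scale=100.0, size=48)    # e.g. vegetable intake, g/day
      second = first * rng.normal(1.0, 0.25, size=48)       # noisy repeat administration
      print(tertile_agreement(first, second))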

  13. Validity and Reproducibility of a Spanish Dietary History

    PubMed Central

    Guallar-Castillón, Pilar; Sagardui-Villamor, Jon; Balboa-Castillo, Teresa; Sala-Vila, Aleix; Ariza Astolfi, Mª José; Sarrión Pelous, Mª Dolores; León-Muñoz, Luz María; Graciani, Auxiliadora; Laclaustra, Martín; Benito, Cristina; Banegas, José Ramón; Artalejo, Fernando Rodríguez

    2014-01-01

    Objective To assess the validity and reproducibility of food and nutrient intake estimated with the electronic diet history of ENRICA (DH-E), which collects information on numerous aspects of the Spanish diet. Methods The validity of food and nutrient intake was estimated using Pearson correlation coefficients between the DH-E and the mean of seven 24-hour recalls collected every 2 months over the previous year. The reproducibility was estimated using intraclass correlation coefficients between two DH-E made one year apart. Results The correlation coefficients between the DH-E and the mean of seven 24-hour recalls for the main food groups were cereals (r = 0.66), meat (r = 0.66), fish (r = 0.42), vegetables (r = 0.62) and fruits (r = 0.44). The mean correlation coefficient for all 15 food groups considered was 0.53. The correlations for macronutrients were: energy (r = 0.76), proteins (r = 0.58), lipids (r = 0.73), saturated fat (r = 0.73), monounsaturated fat (r = 0.59), polyunsaturated fat (r = 0.57), and carbohydrates (r = 0.66). The mean correlation coefficient for all 41 nutrients studied was 0.55. The intraclass correlation coefficient between the two DH-E was greater than 0.40 for most foods and nutrients. Conclusions The DH-E shows good validity and reproducibility for estimating usual intake of foods and nutrients. PMID:24465878

  14. Repeatability and Reproducibility of Decisions by Latent Fingerprint Examiners

    PubMed Central

    Ulery, Bradford T.; Hicklin, R. Austin; Buscaglia, JoAnn; Roberts, Maria Antonia

    2012-01-01

    The interpretation of forensic fingerprint evidence relies on the expertise of latent print examiners. We tested latent print examiners on the extent to which they reached consistent decisions. This study assessed intra-examiner repeatability by retesting 72 examiners on comparisons of latent and exemplar fingerprints, after an interval of approximately seven months; each examiner was reassigned 25 image pairs for comparison, out of a total pool of 744 image pairs. We compare these repeatability results with reproducibility (inter-examiner) results derived from our previous study. Examiners repeated 89.1% of their individualization decisions, and 90.1% of their exclusion decisions; most of the changed decisions resulted in inconclusive decisions. Repeatability of comparison decisions (individualization, exclusion, inconclusive) was 90.0% for mated pairs, and 85.9% for nonmated pairs. Repeatability and reproducibility were notably lower for comparisons assessed by the examiners as “difficult” than for “easy” or “moderate” comparisons, indicating that examiners' assessments of difficulty may be useful for quality assurance. No false positive errors were repeated (n = 4); 30% of false negative errors were repeated. One percent of latent value decisions were completely reversed (no value even for exclusion vs. of value for individualization). Most of the inter- and intra-examiner variability concerned whether the examiners considered the information available to be sufficient to reach a conclusion; this variability was concentrated on specific image pairs such that repeatability and reproducibility were very high on some comparisons and very low on others. Much of the variability appears to be due to making categorical decisions in borderline cases. PMID:22427888

  15. Reproducibility and Transparency in Ocean-Climate Modeling

    NASA Astrophysics Data System (ADS)

    Hannah, N.; Adcroft, A.; Hallberg, R.; Griffies, S. M.

    2015-12-01

    Reproducibility is a cornerstone of the scientific method. Within geophysical modeling and simulation achieving reproducibility can be difficult, especially given the complexity of numerical codes, enormous and disparate data sets, and variety of supercomputing technology. We have made progress on this problem in the context of a large project - the development of new ocean and sea ice models, MOM6 and SIS2. Here we present useful techniques and experience. We use version control not only for code but for the entire experiment working directory, including configuration (run-time parameters, component versions), input data and checksums on experiment output. This allows us to document when the solutions to experiments change, whether due to code updates or changes in input data. To avoid distributing large input datasets we provide the tools for generating these from the sources, rather than provide raw input data. Bugs can be a source of non-determinism and hence irreproducibility, e.g. reading from or branching on uninitialized memory. To expose these we routinely run system tests, using a memory debugger, multiple compilers and different machines. Additional confidence in the code comes from specialised tests, for example automated dimensional analysis and domain transformations. This has entailed adopting a code style where we deliberately restrict what a compiler can do when re-arranging mathematical expressions. In the spirit of open science, all development is in the public domain. This leads to a positive feedback, where increased transparency and reproducibility makes using the model easier for external collaborators, who in turn provide valuable contributions. To facilitate users installing and running the model we provide (version controlled) digital notebooks that illustrate and record analysis of output. This has the dual role of providing a gross, platform-independent testing capability and a means to document model output and analysis.
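
    One concrete way to realize the "checksums on experiment output" practice described above is to hash every output file and keep the resulting manifest under version control next to the run-time parameters, so that any change in the answers shows up as a diff. The sketch below is a generic illustration of that idea, not the MOM6/SIS2 tooling itself; the directory name and file pattern are assumptions.

      import hashlib
      import json
      from pathlib import Path

      def checksum_manifest(run_dir, pattern="*.nc"):
          """Return {relative_path: sha256} for every output file matching pattern."""
          manifest = {}
          for path in sorted(Path(run_dir).rglob(pattern)):
              digest = hashlib.sha256(path.read_bytes()).hexdigest()
              manifest[str(path.relative_to(run_dir))] = digest
          return manifest

      # Committing this manifest alongside the run-time parameters means any change
      # in the solutions (or in the inputs that produced them) appears as a diff.
      run_dir = "experiments/global_ocean_ice"          # hypothetical experiment directory
      Path(run_dir, "output.sha256.json").write_text(
          json.dumps(checksum_manifest(run_dir), indent=2))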

  16. Reproducing continuous radio blackout using glow discharge plasma

    SciTech Connect

    Xie, Kai; Li, Xiaoping; Liu, Donglin; Shao, Mingxu; Zhang, Hanlu

    2013-10-15

    A novel plasma generator is described that offers large-scale, continuous, non-magnetized plasma with a 30-cm-diameter hollow structure, which provides a path for an electromagnetic wave. The plasma is excited by a low-pressure glow discharge, with electron densities ranging from 10⁹ to 2.5 × 10¹¹ cm⁻³. An electromagnetic wave propagation experiment reproduced a continuous radio blackout in the UHF, L, and S bands. The results are consistent with theoretical expectations. The proposed method is suitable for simulating a plasma sheath and for research on communications, navigation, electromagnetic mitigation, and antenna compensation in plasma sheaths.
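
    The band coverage reported above follows directly from the electron plasma frequency, which sets the cutoff below which an electromagnetic wave cannot propagate through an unmagnetized plasma; the quick check below uses the standard engineering approximation f_p ≈ 8980 √n_e Hz (n_e in cm⁻³), which is general plasma physics rather than anything specific to this apparatus.

      import math

      def plasma_cutoff_hz(n_e_cm3):
          """Electron plasma frequency (propagation cutoff) for an unmagnetized plasma."""
          return 8980.0 * math.sqrt(n_e_cm3)     # n_e in cm^-3, result in Hz

      for n_e in (1e9, 1e10, 1e11, 2.5e11):
          print(f"n_e = {n_e:.1e} cm^-3  ->  cutoff ~ {plasma_cutoff_hz(n_e) / 1e9:.2f} GHz")
      # At the quoted peak density of 2.5e11 cm^-3 the cutoff is ~4.5 GHz, above the
      # UHF, L, and S bands, consistent with a continuous blackout in those bands.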

  17. Data quality in predictive toxicology: reproducibility of rodent carcinogenicity experiments.

    PubMed Central

    Gottmann, E; Kramer, S; Pfahringer, B; Helma, C

    2001-01-01

    We compared 121 replicate rodent carcinogenicity assays from the two parts (National Cancer Institute/National Toxicology Program and literature) of the Carcinogenic Potency Database (CPDB) to estimate the reliability of these experiments. We estimated a concordance of 57% between the overall rodent carcinogenicity classifications from both sources. This value did not improve substantially when additional biologic information (species, sex, strain, target organs) was considered. These results indicate that rodent carcinogenicity assays are much less reproducible than previously expected, an effect that should be considered in the development of structure-activity relationship models and the risk assessment process. PMID:11401763

  18. Inter-study reproducibility of cardiovascular magnetic resonance tagging

    PubMed Central

    2013-01-01

    Background The aim of this study is to determine the test-retest reliability of the measurement of regional myocardial function by cardiovascular magnetic resonance (CMR) tagging using spatial modulation of magnetization. Methods Twenty-five participants underwent CMR tagging twice over 12 ± 7 days. To assess the role of slice orientation on strain measurement, two healthy volunteers had a first exam, followed by image acquisition repeated with slices rotated ±15 degrees out of true short axis, followed by a second exam in the true short axis plane. To assess the role of slice location, two healthy volunteers had whole heart tagging. The harmonic phase (HARP) method was used to analyze the tagged images. Peak midwall circumferential strain (Ecc), radial strain (Err), Lambda 1, Lambda 2, and Angle α were determined in basal, mid and apical slices. LV torsion, systolic and early diastolic circumferential strain and torsion rates were also determined. Results LV Ecc and torsion had excellent intra-, interobserver, and inter-study intra-class correlation coefficients (ICC range, 0.7 to 0.9). Err, Lambda 1, Lambda 2 and angle had excellent intra- and interobserver ICCs but lower inter-study ICCs. Angle had the lowest inter-study reproducibility. Torsion rates had superior intra-, interobserver, and inter-study reproducibility to strain rates. The measurements of LV Ecc were comparable in all three slices with different short axis orientations (standard deviation of mean Ecc was 0.09, 0.18 and 0.16 at basal, mid and apical slices, respectively). The mean difference in LV Ecc between slices was more pronounced in most of the basal slices compared to the rest of the heart. Conclusions Intraobserver and interobserver reproducibility of all strain and torsion parameters was excellent. Inter-study reproducibility of CMR tagging by SPAMM varied between different parameters as described in the results above and was superior for Ecc and LV torsion. The variation in LV Ecc

  19. Multi-Parametric Neuroimaging Reproducibility: A 3T Resource Study

    PubMed Central

    Landman, Bennett A.; Huang, Alan J.; Gifford, Aliya; Vikram, Deepti S.; Lim, Issel Anne L.; Farrell, Jonathan A.D.; Bogovic, John A.; Hua, Jun; Chen, Min; Jarso, Samson; Smith, Seth A.; Joel, Suresh; Mori, Susumu; Pekar, James J.; Barker, Peter B.; Prince, Jerry L.; van Zijl, Peter C.M.

    2010-01-01

    Modern MRI image processing methods have yielded quantitative, morphometric, functional, and structural assessments of the human brain. These analyses typically exploit carefully optimized protocols for specific imaging targets. Algorithm investigators have several excellent public data resources to use to test, develop, and optimize their methods. Recently, there has been an increasing focus on combining MRI protocols in multi-parametric studies. Notably, these have included innovative approaches for fusing connectivity inferences with functional and/or anatomical characterizations. Yet, validation of the reproducibility of these interesting and novel methods has been severely hampered by the limited availability of appropriate multi-parametric data. We present an imaging protocol optimized to include state-of-the-art assessment of brain function, structure, micro-architecture, and quantitative parameters within a clinically feasible 60 minute protocol on a 3T MRI scanner. We present scan-rescan reproducibility of these imaging contrasts based on 21 healthy volunteers (11 M/10 F, 22–61 y/o). The cortical gray matter, cortical white matter, ventricular cerebrospinal fluid, thalamus, putamen, caudate, cerebellar gray matter, cerebellar white matter, and brainstem were identified with mean volume-wise reproducibility of 3.5%. We tabulate the mean intensity, variability and reproducibility of each contrast in a region of interest approach, which is essential for prospective study planning and retrospective power analysis considerations. Anatomy was highly consistent on structural acquisition (~1–5% variability), while variation on diffusion and several other quantitative scans was higher (~<10%). Some sequences are particularly variable in specific structures (ASL exhibited variation of 28% in the cerebral white matter) or in thin structures (quantitative T2 varied by up to 73% in the caudate) due, in large part, to variability in automated ROI placement. The

  20. Right ventricular ejection fraction measured by first-pass intravenous krypton-81m: reproducibility and comparison with technetium-99m.

    PubMed

    Wong, D F; Natarajan, T K; Summer, W; Tibbits, P A; Beck, T; Koller, D; Kasecamp, W; Lamb, J; Olynyk, J; Philp, M S

    1985-11-01

    Study of the effects of various diseases and therapeutic manipulation of pulmonary vascular resistance on the right ventricle has been restricted by methodologic limitations. The radioactive gas in solution, krypton-81m, was used to study the right ventricle, and the technique was compared with a technetium-99m method. In 22 subjects, first-pass krypton-81m right ventricular ejection fraction, acquired both in list mode and electrocardiogram-gated frame mode, correlated well (r = 0.81 and 0.86, respectively, p less than 0.01) with that determined by technetium-99m first-pass studies over a broad range of ventricular function. The reproducibility of the technique was excellent (r = 0.84 and 0.95 for each acquisition mode, respectively). Krypton-81m first-pass studies provide accurate and reproducible estimates of right ventricular function. Use of krypton allows multiple measurements, with or without perturbations, over a short period of time.

  1. Host-associated differentiation in a highly polyphagous, sexually reproducing insect herbivore

    PubMed Central

    Antwi, Josephine B; Sword, Gregory A; Medina, Raul F

    2015-01-01

    Insect herbivores may undergo genetic divergence on their host plants through host-associated differentiation (HAD). Much of what we know about HAD involves insect species with narrow host ranges (i.e., specialists) that spend part or all of their life cycle inside their hosts, and/or reproduce asexually (e.g., parthenogenetic insects), all of which are thought to facilitate HAD. However, sexually reproducing polyphagous insects can also exhibit HAD. Few sexually reproducing insects have been tested for HAD, and when they have been, insects from only a handful of potential host-plant populations have been tested, making it difficult to predict how common HAD is when one considers the entire species’ host range. This question is particularly relevant when considering insect pests, as host-associated populations may differ in traits relevant to their control. Here, we tested for HAD in a cotton (Gossypium hirsutum) pest, the cotton fleahopper (CFH) (Pseudatomoscelis seriatus), a sexually reproducing, highly polyphagous hemipteran insect. A previous study detected one incidence of HAD among three of its host plants. We used amplified fragment length polymorphism (AFLP) markers to assess HAD in CFH collected from an expanded array of 13 host-plant species belonging to seven families. Overall, four genetically distinct populations were found. One genetically distinct genotype was exclusively associated with one of the host-plant species while the other three were observed across more than one host-plant species. The relatively low degree of HAD in CFH compared to the pea aphid, another hemipteran insect, stresses the likely importance of sexual recombination as a factor increasing the likelihood of HAD. PMID:26257868

  2. Segmental reproducibility of retinal blood flow velocity measurements using retinal function imager

    PubMed Central

    Chhablani, Jay; Bartsch, Dirk-Uwe; Kozak, Igor; Cheng, Lingyun; Alshareef, Rayan A; Rezeq, Sami S; Sampat, Kapil M; Garg, Sunir J; Burgansky-Eliash, Zvia; Freeman, William R

    2013-01-01

    Background To evaluate the reproducibility of blood flow velocity measurements of individual retinal blood vessel segments using a retinal function imager (RFI). Methods Eighteen eyes of 15 healthy subjects were enrolled prospectively at three centers. All subjects underwent RFI imaging in two separate sessions 15 min apart by a single experienced photographer at each center. An average of five to seven serial RFI images were obtained. All images were transferred electronically to one center, and were analyzed by a single observer. Multiple blood vessel segments (each shorter than 100 μm) were co-localized on first and second session images taken at different times of the same fundus using built-in software. Velocities of corresponding segments were determined, and then the inter-session reproducibility of flow velocity was assessed by the concordance correlation coefficient (CCC), coefficient of reproducibility (CR), and coefficient of variation (CV). Results Inter-session CCC for flow velocity was 0.97 (95% confidence interval (CI), 0.966 to 0.9797). The CR was 1.49 mm/sec (95% CI, 1.39 to 1.59 mm/sec), and CV was 10.9%. The average arterial blood flow velocity was 3.16 mm/sec, and average venous blood flow velocity was 3.15 mm/sec. The CR for arterial and venous blood flow velocity was 1.61 mm/sec and 1.27 mm/sec, respectively. Conclusion RFI provides reproducible measurements for retinal blood flow velocity for individual blood vessel segments, with 10.9% variability. PMID:23700326
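
    The concordance correlation coefficient used above (Lin's CCC) penalizes both scatter and any systematic offset between sessions, CCC = 2s_xy / (s_x² + s_y² + (x̄ − ȳ)²). The sketch below shows the generic computation on fabricated paired velocities; it is an illustration of the statistic, not of the RFI analysis pipeline.

      import numpy as np

      def concordance_ccc(x, y):
          """Lin's concordance correlation coefficient between paired measurements."""
          x, y = np.asarray(x, float), np.asarray(y, float)
          sxy = np.cov(x, y)[0, 1]
          return 2.0 * sxy / (x.var(ddof=1) + y.var(ddof=1) + (x.mean() - y.mean()) ** 2)

      # Fabricated segment velocities (mm/s) from two imaging sessions
      session_1 = [2.8, 3.4, 3.1, 2.5, 3.9, 3.0]
      session_2 = [2.9, 3.3, 3.2, 2.7, 3.8, 3.1]
      print(round(concordance_ccc(session_1, session_2), 3))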

  3. Reproducibility of measurements of regional resting and hyperemic myocardial blood flow assessed with PET

    SciTech Connect

    Nagamachi, S.; Czernin, J.; Kim, A.S.

    1996-10-01

    PET with ¹³N-ammonia permits the noninvasive quantification of myocardial blood flow (MBF) in humans. The present study was done to assess the reproducibility of quantitative blood flow measurements at rest and during pharmacologically induced hyperemia in healthy individuals. Thirty healthy volunteers (26 men, 4 women) were studied. Paired measurements of MBF at rest (n = 21), during adenosine (n = 15) and during dipyridamole (n = 7) were performed using a two-compartment model for ¹³N-ammonia PET. The mean difference between baseline and follow-up blood flow (% difference) was calculated to assess reproducibility. No significant difference was observed between resting blood flow at baseline or follow-up (15.8% ± 15.8%; p = ns). Baseline and follow-up resting blood flow were linearly correlated (r = 0.63, p < 0.005). Normalization of resting blood flow to the rate pressure product improved the reproducibility significantly (15.8% ± 15.8% versus 10.1% ± 10.5%, p < 0.05). Baseline and follow-up hyperemic myocardial blood flow did not differ (11.8% ± 9.4%; p = ns) and were linearly correlated (r = 0.69, p < 0.0005). MBF at rest can be measured reproducibly with ¹³N-ammonia PET. The individual response to pharmacologic stress appears to be relatively consistent. Thus, serial blood flow measurements with ¹³N-ammonia PET can be used to quantify the effect of various interventions on MBF and vasodilatory reserve. 41 refs., 3 figs., 4 tabs.
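
    The rate-pressure-product normalization mentioned above is a standard adjustment for the dependence of resting flow on cardiac workload, usually written as MBF_corr = MBF × 10⁴ / (HR × SBP); the reference value of 10⁴ and the example numbers below are conventional or illustrative assumptions rather than values from this study.

      def rpp_normalized_flow(mbf_ml_min_g, heart_rate_bpm, systolic_bp_mmhg,
                              reference_rpp=1e4):
          """Scale resting myocardial blood flow to a reference rate-pressure product."""
          return mbf_ml_min_g * reference_rpp / (heart_rate_bpm * systolic_bp_mmhg)

      # Example: 0.95 ml/min/g measured at 70 bpm and 125 mmHg systolic pressure
      print(round(rpp_normalized_flow(0.95, 70, 125), 3))   # ~1.086 ml/min/g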

  4. Validity and Reproducibility of a Habitual Dietary Fibre Intake Short Food Frequency Questionnaire

    PubMed Central

    Healey, Genelle; Brough, Louise; Murphy, Rinki; Hedderley, Duncan; Butts, Chrissie; Coad, Jane

    2016-01-01

    Low dietary fibre intake has been associated with poorer health outcomes; therefore, the ability to quickly assess an individual’s dietary fibre intake would prove useful in clinical practice and for research purposes. Current dietary assessment methods such as food records and food frequency questionnaires are time-consuming and burdensome, and there are presently no published short dietary fibre intake questionnaires that can quantify an individual’s total habitual dietary fibre intake and classify individuals as low, moderate or high habitual dietary fibre consumers. Therefore, we aimed to develop and validate a habitual dietary fibre intake short food frequency questionnaire (DFI-FFQ) which can quickly and accurately classify individuals based on their habitual dietary fibre intake. In this study the DFI-FFQ was validated against the Monash University comprehensive nutrition assessment questionnaire (CNAQ). Fifty-two healthy, normal weight male (n = 17) and female (n = 35) participants, aged between 21 and 61 years, completed the DFI-FFQ twice and the CNAQ once. All eligible participants completed the study; however, the data from 46% of the participants were excluded from analysis secondary to misreporting. The DFI-FFQ cannot accurately quantify total habitual dietary fibre intakes; however, it is a quick, valid and reproducible tool in classifying individuals based on their habitual dietary fibre intakes. PMID:27626442

  5. A synthesis approach for reproducing the response of aircraft panels to a turbulent boundary layer excitation.

    PubMed

    Bravo, Teresa; Maury, Cédric

    2011-01-01

    Random wall-pressure fluctuations due to the turbulent boundary layer (TBL) are a feature of the air flow over an aircraft fuselage under cruise conditions, creating undesirable effects such as cabin noise annoyance. In order to test potential solutions to reduce the TBL-induced noise, a cost-efficient alternative to in-flight or wind-tunnel measurements involves the laboratory simulation of the response of aircraft sidewalls to high-speed subsonic TBL excitation. Previously published work has shown that TBL simulation using a near-field array of loudspeakers is only feasible in the low frequency range due to the rapid decay of the spanwise correlation length with frequency. This paper demonstrates through theoretical criteria how the wavenumber filtering capabilities of the radiating panel reduces the number of sources required, thus dramatically enlarging the frequency range over which the response of the TBL-excited panel is accurately reproduced. Experimental synthesis of the panel response to high-speed TBL excitation is found to be feasible over the hydrodynamic coincidence frequency range using a reduced set of near-field loudspeakers driven by optimal signals. Effective methodologies are proposed for an accurate reproduction of the TBL-induced sound power radiated by the panel into a free-field and when coupled to a cavity.
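
    The "rapid decay of the spanwise correlation length with frequency" invoked above is typically described with a Corcos-type coherence, for which the correlation length is L_y = U_c / (γ_y ω) and therefore shrinks as 1/frequency. The sketch below simply evaluates that length for cruise-like parameters; the convection-speed ratio and decay constant are typical literature values, not numbers taken from this paper.

      import numpy as np

      U_INF = 230.0            # free-stream speed [m/s], high subsonic cruise (assumed)
      U_C = 0.7 * U_INF        # convection speed of wall-pressure fluctuations (typical ratio)
      GAMMA_Y = 0.72           # Corcos spanwise decay constant (typical literature value)

      def spanwise_correlation_length(freq_hz):
          """Corcos spanwise correlation length L_y = U_c / (gamma_y * omega), in metres."""
          omega = 2.0 * np.pi * float(freq_hz)
          return U_C / (GAMMA_Y * omega)

      for f in (100.0, 500.0, 2000.0):
          print(f"{f:6.0f} Hz  ->  L_y ~ {spanwise_correlation_length(f) * 100:.1f} cm")
      # The shrinking correlation length is what drives up the number of independent
      # near-field sources needed to synthesize the TBL field as frequency increases.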

  6. Validity and Reproducibility of a Habitual Dietary Fibre Intake Short Food Frequency Questionnaire.

    PubMed

    Healey, Genelle; Brough, Louise; Murphy, Rinki; Hedderley, Duncan; Butts, Chrissie; Coad, Jane

    2016-01-01

    Low dietary fibre intake has been associated with poorer health outcomes; therefore, the ability to quickly assess an individual's dietary fibre intake would prove useful in clinical practice and for research purposes. Current dietary assessment methods such as food records and food frequency questionnaires are time-consuming and burdensome, and there are presently no published short dietary fibre intake questionnaires that can quantify an individual's total habitual dietary fibre intake and classify individuals as low, moderate or high habitual dietary fibre consumers. Therefore, we aimed to develop and validate a habitual dietary fibre intake short food frequency questionnaire (DFI-FFQ) which can quickly and accurately classify individuals based on their habitual dietary fibre intake. In this study the DFI-FFQ was validated against the Monash University comprehensive nutrition assessment questionnaire (CNAQ). Fifty-two healthy, normal weight male (n = 17) and female (n = 35) participants, aged between 21 and 61 years, completed the DFI-FFQ twice and the CNAQ once. All eligible participants completed the study; however, the data from 46% of the participants were excluded from analysis secondary to misreporting. The DFI-FFQ cannot accurately quantify total habitual dietary fibre intakes; however, it is a quick, valid and reproducible tool in classifying individuals based on their habitual dietary fibre intakes.

  7. A synthesis approach for reproducing the response of aircraft panels to a turbulent boundary layer excitation.

    PubMed

    Bravo, Teresa; Maury, Cédric

    2011-01-01

    Random wall-pressure fluctuations due to the turbulent boundary layer (TBL) are a feature of the air flow over an aircraft fuselage under cruise conditions, creating undesirable effects such as cabin noise annoyance. In order to test potential solutions to reduce the TBL-induced noise, a cost-efficient alternative to in-flight or wind-tunnel measurements involves the laboratory simulation of the response of aircraft sidewalls to high-speed subsonic TBL excitation. Previously published work has shown that TBL simulation using a near-field array of loudspeakers is only feasible in the low frequency range due to the rapid decay of the spanwise correlation length with frequency. This paper demonstrates through theoretical criteria how the wavenumber filtering capabilities of the radiating panel reduces the number of sources required, thus dramatically enlarging the frequency range over which the response of the TBL-excited panel is accurately reproduced. Experimental synthesis of the panel response to high-speed TBL excitation is found to be feasible over the hydrodynamic coincidence frequency range using a reduced set of near-field loudspeakers driven by optimal signals. Effective methodologies are proposed for an accurate reproduction of the TBL-induced sound power radiated by the panel into a free-field and when coupled to a cavity. PMID:21302997

  8. Validity and Reproducibility of a Habitual Dietary Fibre Intake Short Food Frequency Questionnaire.

    PubMed

    Healey, Genelle; Brough, Louise; Murphy, Rinki; Hedderley, Duncan; Butts, Chrissie; Coad, Jane

    2016-01-01

    Low dietary fibre intake has been associated with poorer health outcomes; therefore, the ability to quickly assess an individual's dietary fibre intake would prove useful in clinical practice and for research purposes. Current dietary assessment methods such as food records and food frequency questionnaires are time-consuming and burdensome, and there are presently no published short dietary fibre intake questionnaires that can quantify an individual's total habitual dietary fibre intake and classify individuals as low, moderate or high habitual dietary fibre consumers. Therefore, we aimed to develop and validate a habitual dietary fibre intake short food frequency questionnaire (DFI-FFQ) which can quickly and accurately classify individuals based on their habitual dietary fibre intake. In this study the DFI-FFQ was validated against the Monash University comprehensive nutrition assessment questionnaire (CNAQ). Fifty-two healthy, normal weight male (n = 17) and female (n = 35) participants, aged between 21 and 61 years, completed the DFI-FFQ twice and the CNAQ once. All eligible participants completed the study; however, the data from 46% of the participants were excluded from analysis secondary to misreporting. The DFI-FFQ cannot accurately quantify total habitual dietary fibre intakes; however, it is a quick, valid and reproducible tool in classifying individuals based on their habitual dietary fibre intakes. PMID:27626442

  9. Accurate energy levels for singly ionized platinum (Pt II)

    NASA Technical Reports Server (NTRS)

    Reader, Joseph; Acquista, Nicolo; Sansonetti, Craig J.; Engleman, Rolf, Jr.

    1988-01-01

    New observations of the spectrum of Pt II have been made with hollow-cathode lamps. The region from 1032 to 4101 A was observed photographically with a 10.7-m normal-incidence spectrograph. The region from 2245 to 5223 A was observed with a Fourier-transform spectrometer. Wavelength measurements were made for 558 lines. The uncertainties vary from 0.0005 to 0.004 A. From these measurements and three parity-forbidden transitions in the infrared, accurate values were determined for 28 even and 72 odd energy levels of Pt II.

  10. MR Reproducibility in the Assessment of Uterine Fibroids for Patients Scheduled for Uterine Artery Embolization

    SciTech Connect

    Volkers, Nicole A. Hehenkamp, Wouter J. K.; Spijkerboer, Anje M.; Moolhuijzen, Albert D.; Birnie, Erwin; Ankum, Willem M.; Reekers, Jim A.

    2008-03-15

    Magnetic resonance imaging (MRI) is increasingly applied in the evaluation of uterine fibroids. However, little is known about the reproducibility of MRI in the assessment of uterine fibroids. This study evaluates the inter- and intraobserver variation in the assessment of the uterine fibroids and concomitant adenomyosis in women scheduled for uterine artery embolization (UAE). Forty patients (mean age: 44.5 years) with symptomatic uterine fibroids who were scheduled for UAE underwent T₁- and T₂-weighted MRI. To study inter- and intraobserver agreement 40 MR images were evaluated independently by two observers and reevaluated by both observers 4 months later. Inter- and intraobserver agreement was calculated using Cohen's κ statistic and intraclass correlation coefficient for categorical and continuous variables, respectively. Inter-observer agreement for uterine volumes (κ = 0.99, p < 0.0001), dominant fibroid volumes (κ = 0.98, p ≤ 0.0001), and number of fibroids (κ = 0.88; CI, 0.77-0.93; p < 0.0001) was excellent. For the T₁- and T₂-weighted signal intensity of the dominant fibroid there was good agreement between the observers (87%; 95% CI, 71.9%-95.6%) and the intraobserver agreement was good for observer A (95%; 95% CI, 83.1%-99.4%) and moderate for observer B (κ = 0.47). The interobserver agreement with respect to the presence of adenomyosis was good (κ = 0.73, p < 0.0001), while both intraobserver agreements were fair to moderate (observer A, κ = 0.55, p = 0.0003; and observer B, κ = 0.66, p < 0.0001). In conclusion, MRI criteria used for the selection of suitable UAE patients show good inter- and intraobserver reproducibility.

  11. On the importance of having accurate data for astrophysical modelling

    NASA Astrophysics Data System (ADS)

    Lique, Francois

    2016-06-01

    The Herschel telescope and the ALMA and NOEMA interferometers have opened new windows of observation for wavelengths ranging from far infrared to sub-millimeter with spatial and spectral resolutions previously unmatched. To make the most of these observations, an accurate knowledge of the physical and chemical processes occurring in the interstellar and circumstellar media is essential. In this presentation, I will discuss the current needs of astrophysics in terms of molecular data and I will show that accurate molecular data are crucial for the proper determination of the physical conditions in molecular clouds. First, I will focus on collisional excitation studies that are needed for molecular line modelling beyond the Local Thermodynamic Equilibrium (LTE) approach. In particular, I will show how new collisional data for the HCN and HNC isomers, two tracers of star forming conditions, have made it possible to solve the problem of their respective abundances in cold molecular clouds. I will also present the latest collisional data that have been computed in order to analyse new highly resolved observations provided by the ALMA interferometer. Then, I will present the calculation of accurate rate constants for the F+H2 → HF+H and Cl+H2 ↔ HCl+H reactions, which have allowed a more accurate determination of the physical conditions in diffuse molecular clouds. I will also present the recent work on the ortho-para-H2 conversion due to hydrogen exchange, which allows a more accurate determination of the ortho-to-para-H2 ratio in the universe and implies a significant revision of the cooling mechanism in astrophysical media.

  12. A Bayesian Perspective on the Reproducibility Project: Psychology.

    PubMed

    Etz, Alexander; Vandekerckhove, Joachim

    2016-01-01

    We revisit the results of the recent Reproducibility Project: Psychology by the Open Science Collaboration. We compute Bayes factors-a quantity that can be used to express comparative evidence for an hypothesis but also for the null hypothesis-for a large subset (N = 72) of the original papers and their corresponding replication attempts. In our computation, we take into account the likely scenario that publication bias had distorted the originally published results. Overall, 75% of studies gave qualitatively similar results in terms of the amount of evidence provided. However, the evidence was often weak (i.e., Bayes factor < 10). The majority of the studies (64%) did not provide strong evidence for either the null or the alternative hypothesis in either the original or the replication, and no replication attempts provided strong evidence in favor of the null. In all cases where the original paper provided strong evidence but the replication did not (15%), the sample size in the replication was smaller than the original. Where the replication provided strong evidence but the original did not (10%), the replication sample size was larger. We conclude that the apparent failure of the Reproducibility Project to replicate many target effects can be adequately explained by overestimation of effect sizes (or overestimation of evidence against the null hypothesis) due to small sample sizes and publication bias in the psychological literature. We further conclude that traditional sample sizes are insufficient and that a more widespread adoption of Bayesian methods is desirable. PMID:26919473
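
    The Bayes factors in this study were computed with proper default priors; as a rough back-of-the-envelope alternative (not the authors' method), the BIC approximation BF₀₁ ≈ exp((BIC₁ − BIC₀)/2) already conveys how modest an evidence ratio a typical significant result can carry. The fitted BIC values below are fabricated for illustration.

      import math

      def bf01_from_bic(bic_alt, bic_null):
          """Approximate Bayes factor in favor of the null via the BIC approximation."""
          return math.exp((bic_alt - bic_null) / 2.0)

      # Fabricated fits in which the alternative model only barely beats the null
      bf01 = bf01_from_bic(bic_alt=150.1, bic_null=152.3)
      print(round(bf01, 2), round(1.0 / bf01, 2))   # BF01 ~ 0.33, i.e. BF10 ~ 3 (weak evidence)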

  13. Selection on soil microbiomes reveals reproducible impacts on plant function.

    PubMed

    Panke-Buisse, Kevin; Poole, Angela C; Goodrich, Julia K; Ley, Ruth E; Kao-Kniffin, Jenny

    2015-04-01

    Soil microorganisms found in the root zone impact plant growth and development, but the potential to harness these benefits is hampered by the sheer abundance and diversity of the players influencing desirable plant traits. Here, we report a high level of reproducibility of soil microbiomes in altering plant flowering time and soil functions when partnered within and between plant hosts. We used a multi-generation experimental system using Arabidopsis thaliana Col to select for soil microbiomes inducing earlier or later flowering times of their hosts. We then inoculated the selected microbiomes from the tenth generation of plantings into the soils of three additional A. thaliana genotypes (Ler, Be, RLD) and a related crucifer (Brassica rapa). With the exception of Ler, all other plant hosts showed a shift in flowering time corresponding with the inoculation of early- or late-flowering microbiomes. Analysis of the soil microbial community using 16 S rRNA gene sequencing showed distinct microbiota profiles assembling by flowering time treatment. Plant hosts grown with the late-flowering-associated microbiomes showed consequent increases in inflorescence biomass for three A. thaliana genotypes and an increase in total biomass for B. rapa. The increase in biomass was correlated with two- to five-fold enhancement of microbial extracellular enzyme activities associated with nitrogen mineralization in soils. The reproducibility of the flowering phenotype across plant hosts suggests that microbiomes can be selected to modify plant traits and coordinate changes in soil resource pools.

  14. Selection on soil microbiomes reveals reproducible impacts on plant function.

    PubMed

    Panke-Buisse, Kevin; Poole, Angela C; Goodrich, Julia K; Ley, Ruth E; Kao-Kniffin, Jenny

    2015-04-01

    Soil microorganisms found in the root zone impact plant growth and development, but the potential to harness these benefits is hampered by the sheer abundance and diversity of the players influencing desirable plant traits. Here, we report a high level of reproducibility of soil microbiomes in altering plant flowering time and soil functions when partnered within and between plant hosts. We used a multi-generation experimental system using Arabidopsis thaliana Col to select for soil microbiomes inducing earlier or later flowering times of their hosts. We then inoculated the selected microbiomes from the tenth generation of plantings into the soils of three additional A. thaliana genotypes (Ler, Be, RLD) and a related crucifer (Brassica rapa). With the exception of Ler, all other plant hosts showed a shift in flowering time corresponding with the inoculation of early- or late-flowering microbiomes. Analysis of the soil microbial community using 16 S rRNA gene sequencing showed distinct microbiota profiles assembling by flowering time treatment. Plant hosts grown with the late-flowering-associated microbiomes showed consequent increases in inflorescence biomass for three A. thaliana genotypes and an increase in total biomass for B. rapa. The increase in biomass was correlated with two- to five-fold enhancement of microbial extracellular enzyme activities associated with nitrogen mineralization in soils. The reproducibility of the flowering phenotype across plant hosts suggests that microbiomes can be selected to modify plant traits and coordinate changes in soil resource pools. PMID:25350154

  15. A Bayesian Perspective on the Reproducibility Project: Psychology.

    PubMed

    Etz, Alexander; Vandekerckhove, Joachim

    2016-01-01

    We revisit the results of the recent Reproducibility Project: Psychology by the Open Science Collaboration. We compute Bayes factors-a quantity that can be used to express comparative evidence for an hypothesis but also for the null hypothesis-for a large subset (N = 72) of the original papers and their corresponding replication attempts. In our computation, we take into account the likely scenario that publication bias had distorted the originally published results. Overall, 75% of studies gave qualitatively similar results in terms of the amount of evidence provided. However, the evidence was often weak (i.e., Bayes factor < 10). The majority of the studies (64%) did not provide strong evidence for either the null or the alternative hypothesis in either the original or the replication, and no replication attempts provided strong evidence in favor of the null. In all cases where the original paper provided strong evidence but the replication did not (15%), the sample size in the replication was smaller than the original. Where the replication provided strong evidence but the original did not (10%), the replication sample size was larger. We conclude that the apparent failure of the Reproducibility Project to replicate many target effects can be adequately explained by overestimation of effect sizes (or overestimation of evidence against the null hypothesis) due to small sample sizes and publication bias in the psychological literature. We further conclude that traditional sample sizes are insufficient and that a more widespread adoption of Bayesian methods is desirable.

  16. [Study of the validity and reproducibility of passive ozone monitors].

    PubMed

    Cortez-Lugo, M; Romieu, I; Palazuelos-Rendón, E; Hernández-Avila, M

    1995-01-01

    The aim of this study was to evaluate the validity and reproducibility of ozone measurements obtained with passive ozone monitors against those registered with a continuous ozone monitor, to determine the applicability of passive monitors in epidemiological research. The study was carried out during November and December 1992. Indoor and outdoor classroom air ozone concentrations were analyzed using 28 passive monitors and a continuous monitor. The correlation between both measurements was highly significant (r = 0.89, p < 0.001), indicating very good validity. Also, the correlation between measurements obtained with two different passive monitors exposed concurrently was very high (r = 0.97, p < 0.001), indicating good reproducibility of the passive monitor measurements. The relative error between the concentrations measured by the passive monitors and those from the continuous monitor tended to decrease with increasing ozone concentrations. The results suggest that passive monitors should be used to determine cumulative ozone exposures exceeding 100 ppb, corresponding to exposure periods greater than five days, when used to analyze indoor air.

  17. Data reproducibility of pace strategy in a laboratory test run.

    PubMed

    de França, Elias; Xavier, Ana Paula; Hirota, Vinicius Barroso; Côrrea, Sônia Cavalcanti; Caperuto, Érico Chagas

    2016-06-01

    This data paper contains data related to a reproducibility test for running pacing strategy in an intermittent running test until exhaustion. Ten participants underwent a crossover study (test and retest) with an intermittent running test. The test was composed of three-minute sets (at 1 km/h above Onset Blood Lactate Accumulation) until volitional exhaustion. To assess pace strategy change, in the first test participants chose the rest time interval (RTI) between sets (ranging from 30 to 60 s), and in the second test the maximum allowed RTI was the RTI chosen in the first test, or less if desired. To verify the reproducibility of the test, rating of perceived exertion (RPE), heart rate (HR) and blood plasma lactate concentration ([La]p) were collected at rest, immediately after each set and at the end of the tests. RTI, RPE, HR, [La]p and time to exhaustion were not statistically different (p>0.05) between test and retest, and all showed good intraclass correlation. PMID:27081672

  18. Reproducibility of UAV-based photogrammetric surface models

    NASA Astrophysics Data System (ADS)

    Anders, Niels; Smith, Mike; Cammeraat, Erik; Keesstra, Saskia

    2016-04-01

    Soil erosion, rapid geomorphological change and vegetation degradation are major threats to the human and natural environment in many regions. Unmanned Aerial Vehicles (UAVs) and Structure-from-Motion (SfM) photogrammetry are invaluable tools for the collection of highly detailed aerial imagery and subsequent low cost production of 3D landscapes for an assessment of landscape change. Despite the widespread use of UAVs for image acquisition in monitoring applications, the reproducibility of UAV data products has not been explored in detail. This paper investigates this reproducibility by comparing the surface models and orthophotos derived from different UAV flights that vary in flight direction and altitude. The study area is located near Lorca, Murcia, SE Spain, which is a semi-arid medium-relief locale. The area is comprised of terraced agricultural fields that have been abandoned for about 40 years and have suffered subsequent damage through piping and gully erosion. In this work we focused upon variation in cell size, vertical and horizontal accuracy, and horizontal positioning of recognizable landscape features. The results suggest that flight altitude has a significant impact on reconstructed point density and related cell size, whilst flight direction affects the spatial distribution of vertical accuracy. The horizontal positioning of landscape features is relatively consistent between the different flights. We conclude that UAV data products are suitable for monitoring campaigns for land cover purposes or geomorphological mapping, but special care is required when used for monitoring changes in elevation.

  19. Reproducible Research Practices and Transparency across the Biomedical Literature

    PubMed Central

    Khoury, Muin J.; Schully, Sheri D.; Ioannidis, John P. A.

    2016-01-01

    There is a growing movement to encourage reproducibility and transparency practices in the scientific community, including public access to raw data and protocols, the conduct of replication studies, systematic integration of evidence in systematic reviews, and the documentation of funding and potential conflicts of interest. In this survey, we assessed the current status of reproducibility and transparency addressing these indicators in a random sample of 441 biomedical journal articles published in 2000–2014. Only one study provided a full protocol and none made all raw data directly available. Replication studies were rare (n = 4), and only 16 studies had their data included in a subsequent systematic review or meta-analysis. The majority of studies did not mention anything about funding or conflicts of interest. The percentage of articles with no statement of conflict decreased substantially between 2000 and 2014 (94.4% in 2000 to 34.6% in 2014); the percentage of articles reporting statements of conflicts (0% in 2000, 15.4% in 2014) or no conflicts (5.6% in 2000, 50.0% in 2014) increased. Articles published in journals in the clinical medicine category versus other fields were almost twice as likely to not include any information on funding and to have private funding. This study provides baseline data to compare future progress in improving these indicators in the scientific literature. PMID:26726926

  20. Selection on soil microbiomes reveals reproducible impacts on plant function

    PubMed Central

    Panke-Buisse, Kevin; Poole, Angela C; Goodrich, Julia K; Ley, Ruth E; Kao-Kniffin, Jenny

    2015-01-01

    Soil microorganisms found in the root zone impact plant growth and development, but the potential to harness these benefits is hampered by the sheer abundance and diversity of the players influencing desirable plant traits. Here, we report a high level of reproducibility of soil microbiomes in altering plant flowering time and soil functions when partnered within and between plant hosts. We used a multi-generation experimental system using Arabidopsis thaliana Col to select for soil microbiomes inducing earlier or later flowering times of their hosts. We then inoculated the selected microbiomes from the tenth generation of plantings into the soils of three additional A. thaliana genotypes (Ler, Be, RLD) and a related crucifer (Brassica rapa). With the exception of Ler, all other plant hosts showed a shift in flowering time corresponding with the inoculation of early- or late-flowering microbiomes. Analysis of the soil microbial community using 16 S rRNA gene sequencing showed distinct microbiota profiles assembling by flowering time treatment. Plant hosts grown with the late-flowering-associated microbiomes showed consequent increases in inflorescence biomass for three A. thaliana genotypes and an increase in total biomass for B. rapa. The increase in biomass was correlated with two- to five-fold enhancement of microbial extracellular enzyme activities associated with nitrogen mineralization in soils. The reproducibility of the flowering phenotype across plant hosts suggests that microbiomes can be selected to modify plant traits and coordinate changes in soil resource pools. PMID:25350154

  1. Fluctuation-Driven Neural Dynamics Reproduce Drosophila Locomotor Patterns

    PubMed Central

    Cruchet, Steeve; Gustafson, Kyle; Benton, Richard; Floreano, Dario

    2015-01-01

    The neural mechanisms determining the timing of even simple actions, such as when to walk or rest, are largely mysterious. One intriguing, but untested, hypothesis posits a role for ongoing activity fluctuations in neurons of central action selection circuits that drive animal behavior from moment to moment. To examine how fluctuating activity can contribute to action timing, we paired high-resolution measurements of freely walking Drosophila melanogaster with data-driven neural network modeling and dynamical systems analysis. We generated fluctuation-driven network models whose outputs—locomotor bouts—matched those measured from sensory-deprived Drosophila. From these models, we identified those that could also reproduce a second, unrelated dataset: the complex time-course of odor-evoked walking for genetically diverse Drosophila strains. Dynamical models that best reproduced both Drosophila basal and odor-evoked locomotor patterns exhibited specific characteristics. First, ongoing fluctuations were required. In a stochastic resonance-like manner, these fluctuations allowed neural activity to escape stable equilibria and to exceed a threshold for locomotion. Second, odor-induced shifts of equilibria in these models caused a depression in locomotor frequency following olfactory stimulation. Our models predict that activity fluctuations in action selection circuits cause behavioral output to more closely match sensory drive and may therefore enhance navigation in complex sensory environments. Together these data reveal how simple neural dynamics, when coupled with activity fluctuations, can give rise to complex patterns of animal behavior. PMID:26600381
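
    The "stochastic resonance-like" escape from a stable equilibrium described above can be caricatured with a one-dimensional Langevin toy model: without noise the activity variable sits at its resting fixed point, while ongoing fluctuations occasionally carry it over the threshold that triggers a locomotor bout. The sketch below is a deliberately minimal stand-in with arbitrary parameters, not the data-driven network models of the paper.

      import numpy as np

      rng = np.random.default_rng(1)

      def count_bouts(noise_sigma, threshold=1.0, steps=500_000, dt=1e-3):
          """Count threshold crossings of a noisy bistable variable x' = x - x**3 + noise."""
          x, bouts, above = -1.0, 0, False         # start at the 'rest' fixed point x = -1
          for _ in range(steps):
              drift = x - x ** 3                   # double-well dynamics, stable points at +/-1
              x += drift * dt + noise_sigma * np.sqrt(dt) * rng.standard_normal()
              if x > threshold and not above:      # a new excursion past the 'locomotion' threshold
                  bouts += 1
                  above = True
              elif x < 0.0:
                  above = False
          return bouts

      for sigma in (0.0, 0.4, 0.7):
          print(f"noise sigma = {sigma:.1f}  ->  bouts = {count_bouts(sigma)}")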

  2. Data reproducibility of pace strategy in a laboratory test run

    PubMed Central

    de França, Elias; Xavier, Ana Paula; Hirota, Vinicius Barroso; Côrrea, Sônia Cavalcanti; Caperuto, Érico Chagas

    2016-01-01

    This data paper contains data related to a reproducibility test for running pacing strategy in an intermittent running test until exhaustion. Ten participants underwent a crossover study (test and retest) with an intermittent running test. The test was composed of three-minute sets (at 1 km/h above Onset Blood Lactate Accumulation) until volitional exhaustion. To assess pace strategy change, in the first test participants chose the rest time interval (RTI) between sets (ranging from 30 to 60 s), and in the second test the RTI could be at most the value chosen in the first test, or shorter if desired. To verify the reproducibility of the test, rating of perceived exertion (RPE), heart rate (HR) and blood plasma lactate concentration ([La]p) were collected at rest, immediately after each set and at the end of the tests. RTI, RPE, HR, [La]p and time to exhaustion were not statistically different (p>0.05) between test and retest, and they showed good intraclass correlation. PMID:27081672

  3. Effect of Soil Moisture Content on the Splash Phenomenon Reproducibility

    PubMed Central

    Ryżak, Magdalena; Bieganowski, Andrzej; Polakowski, Cezary

    2015-01-01

    One method for studying splash (the first phase of water erosion) is the analysis of images captured with high-speed cameras. The aim of this study was to determine the reproducibility of single-drop splash measurements under simulated precipitation. The drops producing the splash fell from a height of 1.5 m. Tests were carried out using two types of soil: Eutric Cambisol (loamy silt) and Orthic Luvisol (sandy loam); three initial pressure heads were applied: 16 kPa, 3.1 kPa, and 0.1 kPa. Images for one, five, and 10 drops were recorded at a rate of 2000 frames per second. It was found that (i) the dispersion of soil caused by the striking of the 1st drop was significantly different from the splash impact caused by subsequent drops; (ii) with every drop, the splash phenomenon proceeded more reproducibly, that is, the numbers of soil and/or water particles that splashed became increasingly similar; (iii) the number of particles detached during the splash was strongly correlated with its surface area; and (iv) the thicker the water film on the surface, the smaller the width of the splash crown. PMID:25785859

  4. Reproducibility of the 6-minute walk test in obese adults.

    PubMed

    Beriault, K; Carpentier, A C; Gagnon, C; Ménard, J; Baillargeon, J-P; Ardilouze, J-L; Langlois, M-F

    2009-10-01

    The six-minute walk test (6MWT) is an inexpensive, quick and safe tool to evaluate the functional capacity of patients with heart failure and chronic obstructive pulmonary disease. The aim of this study was to determine the reproducibility of the 6MWT in overweight and obese individuals. We thus undertook a prospective repeated-measure validity study taking place in our academic weight management outpatient clinic. The 6MWT was conducted twice on the same day in 21 overweight or obese adult subjects (15 females and 6 males). Repeatability of walking distance was the primary outcome. Anthropometric measures, blood pressure and heart rate were also recorded. Participants' mean BMI was 37.2±9.8 kg/m² (range: 27.0-62.3 kg/m²). Walking distances in the morning (mean=452±90 m) and in the afternoon (mean=458±97 m) were highly correlated (r=0.948; 95% Confidence Interval 0.877-0.978; p<0.001). Walking distance was negatively correlated with BMI (r=-0.47, p=0.03), waist circumference (r=-0.43, p=0.05) and pre-test heart rate (r=-0.54, p=0.01). Our findings indicate that the 6MWT is highly reproducible in obese subjects and could thus be used as a fitness indicator in clinical studies and clinical care in this population.

  5. On the Accurate Prediction of CME Arrival At the Earth

    NASA Astrophysics Data System (ADS)

    Zhang, Jie; Hess, Phillip

    2016-07-01

    We will discuss relevant issues regarding the accurate prediction of CME arrival at the Earth, from both observational and theoretical points of view. In particular, we clarify the importance of separating the study of CME ejecta from the ejecta-driven shock in interplanetary CMEs (ICMEs). For a number of CME-ICME events well observed by SOHO/LASCO, STEREO-A and STEREO-B, we carry out 3-D measurements by superimposing geometries onto the ejecta and the sheath separately. These measurements are then used to constrain a Drag-Based Model, which is improved by incorporating a height dependence of the drag coefficient. Combining all these factors allows us to create predictions for both fronts at 1 AU and compare them with actual in-situ observations. We show an ability to predict the sheath arrival with an average error of under 4 hours and an RMS error of about 1.5 hours. For the CME ejecta, the error is less than two hours with an RMS error within an hour. Through using the best observations of CMEs, we show the power of our method in accurately predicting CME arrival times. The limitations and implications of our prediction method will be discussed.
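
    The Drag-Based Model referred to above can be integrated in a few lines. The sketch below uses the standard DBM form dv/dt = -gamma|v - w|(v - w); the initial speed, ambient solar-wind speed w, and the height-dependent drag coefficient gamma(r) are illustrative assumptions, not the authors' fitted values.

        import numpy as np

        # Minimal drag-based CME propagation sketch (illustrative parameters):
        # dr/dt = v,  dv/dt = -gamma(r) * |v - w| * (v - w).
        AU_KM = 1.496e8                       # 1 AU in km
        RSUN_KM = 6.96e5                      # solar radius in km
        w = 400.0                             # ambient solar-wind speed [km/s], assumed
        r, v = 20.0 * RSUN_KM, 1000.0         # start at 20 Rs with 1000 km/s
        t, dt = 0.0, 60.0                     # one-minute time step [s]

        def gamma(r_km):
            # toy height dependence: drag weakens as the CME moves outward
            return 1e-7 * (20.0 * RSUN_KM / r_km)      # [1/km]

        while r < AU_KM:
            a = -gamma(r) * abs(v - w) * (v - w)       # drag deceleration [km/s^2]
            v += a * dt
            r += v * dt
            t += dt

        print(f"arrival after {t / 3600.0:.1f} h at v = {v:.0f} km/s")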

  6. Reproducibility of precipitation distributions over extratropical continental regions in the CMIP5

    NASA Astrophysics Data System (ADS)

    Hirota, Nagio; Takayabu, Yukari

    2013-04-01

    Reproducibility of precipitation distributions over extratropical continental regions by CMIP5 climate models in their historical runs is evaluated, in comparison with GPCP (V2.2), CMAP (V0911), and the daily gridded gauge data set APHRODITE. Surface temperature, cloud radiative forcing, and atmospheric circulations are also compared with observations from CRU-UEA, CERES, and the ERA-Interim/ERA-40/JRA reanalyses. It is shown that many CMIP5 models underestimate and overestimate summer precipitation over West and East Eurasia, respectively. These precipitation biases correspond to moisture transport associated with a cyclonic circulation bias over the whole Eurasian continent. Meanwhile, many models underestimate cloud over the Eurasian continent, and the associated shortwave cloud radiative forcing results in a significant warm bias. Evaporation feedback amplifies the warm bias over West Eurasia. These processes consistently explain the summer precipitation biases over the Eurasian continent. We also examined the reproducibility of winter precipitation, but robust results have not yet been obtained because of the large observational uncertainty associated with the adjustment of snow measurements in windy conditions. Better observational data sets are necessary for further model validation. Acknowledgment: This study is supported by the PMM RA of JAXA, the Green Network of Excellence (GRENE) Program of the Ministry of Education, Culture, Sports, Science and Technology, Japan, and the Environment Research and Technology Development Fund (A-1201) of the Ministry of the Environment, Japan.

  7. Modified chemiluminescent NO analyzer accurately measures NOX

    NASA Technical Reports Server (NTRS)

    Summers, R. L.

    1978-01-01

    Installation of a molybdenum nitric oxide (NO)-to-higher oxides of nitrogen (NOx) converter in a chemiluminescent gas analyzer, together with an air purge, allows accurate measurement of NOx in exhaust gases containing as much as thirty percent carbon monoxide (CO). Measurements using a conventional analyzer are highly inaccurate for NOx if as little as five percent CO is present. In the modified analyzer, molybdenum has a high tolerance to CO, and the air purge substantially quenches NOx destruction. In tests, the modified chemiluminescent analyzer accurately measured NO and NOx concentrations for over 4 months with no degradation in performance.

  8. Reproducibility of an aerobic endurance test for nonexpert swimmers

    PubMed Central

    Veronese da Costa, Adalberto; Costa, Manoel da Cunha; Carlos, Daniel Medeiros; Guerra, Luis Marcos de Medeiros; Silva, Antônio José; Barbosa, Tiago Manoel Cabral dos Santos

    2012-01-01

    Background: This study aimed to verify the reproducibility of an aerobic test for determining nonexpert swimmers' endurance. Methods: The sample consisted of 24 male swimmers (age: 22.79 ± 3.90 years; weight: 74.72 ± 11.44 kg; height: 172.58 ± 4.99 cm; and fat percentage: 15.19% ± 3.21%), who swim for 1 hour three times a week. A new instrument was used in this study (a Progressive Swim Test): the swimmer wore an underwater MP3 player and increased their swimming speed on hearing a beep after every 25 meters. Each swimmer's heart rate was recorded before the test (BHR) and again after the test (AHR). The rating of perceived exertion (RPE) and the number of laps performed (NLP) were also recorded. The sample size was estimated using G*Power software (v 3.0.10; Franz Faul, Kiel University, Kiel, Germany). The descriptive values were expressed as mean and standard deviation. After confirming the normality of the data using both the Shapiro–Wilk and Levene tests, a paired t-test was performed to compare the data. The Pearson's linear correlation (r) and intraclass correlation coefficient (ICC) tests were used to determine relative reproducibility. The standard error of measurement (SEM) and the coefficient of variation (CV) were used to determine absolute reproducibility. The limits of agreement and the bias of the absolute and relative values between days were determined by Bland–Altman plots. All values had a significance level of P < 0.05. Results: There were significant differences in AHR (P = 0.03) and NLP (P = 0.01) between the 2 days of testing. The obtained values were r > 0.50 and ICC > 0.66. The SEM had a variation of ±2% and the CV was <10%. Most cases were within the upper and lower limits of Bland–Altman plots, suggesting agreement of the results. The applicability of NLP showed greater robustness (r and ICC > 0.90; SEM < 1%; CV < 3%), indicating that the other variables can be used to predict incremental changes in the physiological condition
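
    The reproducibility statistics named above (ICC, SEM, CV, and Bland-Altman limits of agreement) can be computed for any two-session design as in the sketch below. The test/retest values are invented for illustration, not the study's data, and the ICC shown is the simple one-way form.

        import numpy as np

        # Test-retest reproducibility sketch: one-way ICC, SEM, CV, and
        # Bland-Altman bias with limits of agreement (illustrative values).
        test = np.array([24., 26., 30., 28., 22., 25., 27., 29., 31., 23.])
        retest = np.array([25., 27., 29., 28., 23., 24., 28., 30., 30., 24.])

        data = np.column_stack([test, retest])          # subjects x sessions
        n, k = data.shape
        grand_mean = data.mean()
        ms_between = k * data.mean(axis=1).var(ddof=1)  # between-subject mean square
        ms_within = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
        icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)  # ICC(1,1)

        diff = retest - test
        sem = diff.std(ddof=1) / np.sqrt(2)             # SEM from within-subject SD
        cv = 100.0 * sem / grand_mean                   # CV as % of the grand mean
        bias = diff.mean()
        loa = 1.96 * diff.std(ddof=1)                   # half-width of limits of agreement

        print(f"ICC={icc:.2f}  SEM={sem:.2f}  CV={cv:.1f}%  bias={bias:.2f} +/- {loa:.2f}")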

  9. Paleomagnetic analysis of curved thrust belts reproduced by physical models

    NASA Astrophysics Data System (ADS)

    Costa, Elisabetta; Speranza, Fabio

    2003-12-01

    This paper presents a new methodology for studying the evolution of curved mountain belts by means of paleomagnetic analyses performed on analogue models. Eleven models were designed to reproduce various tectonic settings in thin-skinned tectonics. Our models analyze in particular those features reported in the literature as possible causes of peculiar rotational patterns in the outermost as well as in the more internal fronts. In all the models the sedimentary cover was reproduced by frictional low-cohesion materials (sand and glass micro-beads), which detached either on frictional or on viscous layers; the latter were reproduced in the models by silicone. The sand forming the models had previously been mixed with magnetite-dominated powder. Before deformation, the models were magnetized by means of two permanent magnets generating within each model a quasi-linear magnetic field of intensity variable between 20 and 100 mT. After deformation, the models were cut into closely spaced vertical sections and sampled by means of 1×1-cm Plexiglas cylinders at several locations along the curved fronts. Care was taken to collect paleomagnetic samples only within virtually undeformed thrust sheets, avoiding zones affected by pervasive shear. Afterwards, the natural remanent magnetization of these samples was measured, and alternating field demagnetization was used to isolate the principal components. The characteristic components of magnetization isolated were used to estimate the vertical-axis rotations occurring during model deformation. We find that indenters pushing into deforming belts from behind form non-rotational curved outer fronts. The more internal fronts show oroclinal-type rotations of a smaller magnitude than that expected for a perfect orocline. Lateral symmetrical obstacles in the foreland colliding with forward-propagating belts also produce non-rotational outer curved fronts, whereas in between and inside the obstacles a perfect orocline forms.

  10. Building Consensus on Community Standards for Reproducible Science

    NASA Astrophysics Data System (ADS)

    Lehnert, K. A.; Nielsen, R. L.

    2015-12-01

    In geochemistry, the traditional model by which standard methods for generating, presenting, and using data are developed has relied on input from the community, the results of seminal studies, and a variety of authoritative bodies, and it has required a great deal of time. The rate of technological and related policy change has accelerated to the point that this historical model does not satisfy the needs of the community, publishers, or funders. The development of a new mechanism for building consensus raises a number of questions: Which aspects of our data are the focus of reproducibility standards? Who sets the standards? How do we subdivide the development of the consensus? We propose an open, transparent, and inclusive approach to the development of data and reproducibility standards that is organized around specific sub-disciplines and driven by the community of practitioners in those sub-disciplines. It should involve editors, program managers, and representatives of domain data facilities as well as professional societies, but avoid making any single group the final authority. A successful example of this model is the Editors Roundtable, a cross section of editors, funders, and data facility managers that discussed and agreed on leading practices for the reporting of geochemical data in publications, including accessibility and format of the data, data quality information, and metadata and identifiers for samples (Goldstein et al., 2014). We argue that the development of data and reproducibility standards needs to rely heavily on representatives from the community of practitioners to set priorities and provide perspective. Groups of editors, practicing scientists, and other stakeholders would be assigned the task of reviewing existing practices and recommending changes as deemed necessary. They would weigh the costs and benefits of changing the standards for that community, propose appropriate tools to facilitate those changes, work through the professional societies

  11. Multi-centre reproducibility of diffusion MRI parameters for clinical sequences in the brain.

    PubMed

    Grech-Sollars, Matthew; Hales, Patrick W; Miyazaki, Keiko; Raschke, Felix; Rodriguez, Daniel; Wilson, Martin; Gill, Simrandip K; Banks, Tina; Saunders, Dawn E; Clayden, Jonathan D; Gwilliam, Matt N; Barrick, Thomas R; Morgan, Paul S; Davies, Nigel P; Rossiter, James; Auer, Dorothee P; Grundy, Richard; Leach, Martin O; Howe, Franklyn A; Peet, Andrew C; Clark, Chris A

    2015-04-01

    The purpose of this work was to assess the reproducibility of diffusion imaging, and in particular the apparent diffusion coefficient (ADC), intra-voxel incoherent motion (IVIM) parameters and diffusion tensor imaging (DTI) parameters, across multiple centres using clinically available protocols with limited harmonization between sequences. An ice-water phantom and nine healthy volunteers were scanned across five centres on eight scanners (four Siemens 1.5T, four Philips 3T). The mean ADC, IVIM parameters (diffusion coefficient D and perfusion fraction f) and DTI parameters (mean diffusivity MD and fractional anisotropy FA) were measured in grey matter, white matter and specific brain sub-regions. A mixed effect model was used to measure the intra- and inter-scanner coefficient of variation (CV) for each of the five parameters. ADC, D, MD and FA had a good intra- and inter-scanner reproducibility in both grey and white matter, with a CV ranging between 1% and 7.4%; mean 2.6%. Other brain regions also showed high levels of reproducibility except for small structures such as the choroid plexus. The IVIM parameter f had a higher intra-scanner CV of 8.4% and inter-scanner CV of 24.8%. No major difference in the inter-scanner CV for ADC, D, MD and FA was observed when analysing the 1.5T and 3T scanners separately. ADC, D, MD and FA all showed good intra-scanner reproducibility, with the inter-scanner reproducibility being comparable or faring slightly worse, suggesting that using data from multiple scanners does not have an adverse effect compared with using data from the same scanner. The IVIM parameter f had a poorer inter-scanner CV when scanners of different field strengths were combined, and the parameter was also affected by the scan acquisition resolution. This study shows that the majority of diffusion MRI derived parameters are robust across 1.5T and 3T scanners and suitable for use in multi-centre clinical studies and trials.
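
    A mixed-effects model of the kind used above can be fitted with standard tools to split measurement variability into between-scanner and within-scanner components. The sketch below uses synthetic ADC values and the statsmodels MixedLM interface; the scanner and subject counts, noise levels, and CV definitions are assumptions for illustration rather than the authors' exact analysis.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Synthetic ADC data: 8 scanners x 9 subjects, with a random per-scanner
        # offset (between-scanner variance) plus residual noise (within-scanner).
        rng = np.random.default_rng(1)
        rows = []
        for scanner in range(8):
            scanner_offset = rng.normal(0.0, 0.01)
            for subject in range(9):
                adc = 0.80 + scanner_offset + rng.normal(0.0, 0.02)  # 10^-3 mm^2/s
                rows.append({"scanner": scanner, "subject": subject, "adc": adc})
        df = pd.DataFrame(rows)

        # Random-intercept model: scanner as the grouping factor.
        result = smf.mixedlm("adc ~ 1", df, groups=df["scanner"]).fit()
        mean_adc = float(result.fe_params.iloc[0])
        var_between = float(result.cov_re.iloc[0, 0])   # scanner random-effect variance
        var_within = float(result.scale)                # residual variance
        print(f"inter-scanner CV: {100 * np.sqrt(var_between) / mean_adc:.1f}%")
        print(f"intra-scanner CV: {100 * np.sqrt(var_within) / mean_adc:.1f}%")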

  12. Multi-centre reproducibility of diffusion MRI parameters for clinical sequences in the brain

    PubMed Central

    Grech-Sollars, Matthew; Hales, Patrick W; Miyazaki, Keiko; Raschke, Felix; Rodriguez, Daniel; Wilson, Martin; Gill, Simrandip K; Banks, Tina; Saunders, Dawn E; Clayden, Jonathan D; Gwilliam, Matt N; Barrick, Thomas R; Morgan, Paul S; Davies, Nigel P; Rossiter, James; Auer, Dorothee P; Grundy, Richard; Leach, Martin O; Howe, Franklyn A; Peet, Andrew C; Clark, Chris A

    2015-01-01

    The purpose of this work was to assess the reproducibility of diffusion imaging, and in particular the apparent diffusion coefficient (ADC), intra-voxel incoherent motion (IVIM) parameters and diffusion tensor imaging (DTI) parameters, across multiple centres using clinically available protocols with limited harmonization between sequences. An ice–water phantom and nine healthy volunteers were scanned across five centres on eight scanners (four Siemens 1.5T, four Philips 3T). The mean ADC, IVIM parameters (diffusion coefficient D and perfusion fraction f) and DTI parameters (mean diffusivity MD and fractional anisotropy FA) were measured in grey matter, white matter and specific brain sub-regions. A mixed effect model was used to measure the intra- and inter-scanner coefficient of variation (CV) for each of the five parameters. ADC, D, MD and FA had a good intra- and inter-scanner reproducibility in both grey and white matter, with a CV ranging between 1% and 7.4%; mean 2.6%. Other brain regions also showed high levels of reproducibility except for small structures such as the choroid plexus. The IVIM parameter f had a higher intra-scanner CV of 8.4% and inter-scanner CV of 24.8%. No major difference in the inter-scanner CV for ADC, D, MD and FA was observed when analysing the 1.5T and 3T scanners separately. ADC, D, MD and FA all showed good intra-scanner reproducibility, with the inter-scanner reproducibility being comparable or faring slightly worse, suggesting that using data from multiple scanners does not have an adverse effect compared with using data from the same scanner. The IVIM parameter f had a poorer inter-scanner CV when scanners of different field strengths were combined, and the parameter was also affected by the scan acquisition resolution. This study shows that the majority of diffusion MRI derived parameters are robust across 1.5T and 3T scanners and suitable for use in multi-centre clinical studies and trials.

  13. Reproducible, Scalable Fusion Gene Detection from RNA-Seq.

    PubMed

    Arsenijevic, Vladan; Davis-Dusenbery, Brandi N

    2016-01-01

    Chromosomal rearrangements resulting in the creation of novel gene products, termed fusion genes, have been identified as driving events in the development of multiple types of cancer. As these gene products typically do not exist in normal cells, they represent valuable prognostic and therapeutic targets. Advances in next-generation sequencing and computational approaches have greatly improved our ability to detect and identify fusion genes. Nevertheless, these approaches require significant computational resources. Here we describe an approach which leverages cloud computing technologies to perform fusion gene detection from RNA sequencing data at any scale. We additionally highlight methods to enhance reproducibility of bioinformatics analyses which may be applied to any next-generation sequencing experiment. PMID:26667464

  14. [Expansion of undergraduate nursing and the labor market: reproducing inequalities?].

    PubMed

    Silva, Kênia Lara; de Sena, Roseni Rosângela; Tavares, Tatiana Silva; Wan der Maas, Lucas

    2012-01-01

    This study aimed to analyze the relationship between the increase in the number of degree courses in nursing and the nursing job market. It is a descriptive exploratory study with a quantitative approach, which used data on Undergraduate Nursing courses, the supply of nurses, links with health facilities, and formal jobs in nursing in the state of Minas Gerais. The evolution of Undergraduate Nursing courses reveals a decline in supply and demand in recent years. This context is shaped by a labor market marked by the contradiction between a quantitative surplus of professionals, particularly in the state's less developed areas, and a low percentage of nurses available to care for the population's health. These characteristics of the nursing labor market reproduce inequalities; furthermore, aspects such as the regulation of nursing education and the creation of new jobs need to be discussed further.

  15. Extended Eden model reproduces growth of an acellular slime mold

    NASA Astrophysics Data System (ADS)

    Wagner, Geri; Halvorsrud, Ragnhild; Meakin, Paul

    1999-11-01

    A stochastic growth model was used to simulate the growth of the acellular slime mold Physarum polycephalum on substrates where the nutrients were confined in separate drops. Growth of Physarum on such substrates was previously studied experimentally and found to produce a range of different growth patterns [Phys. Rev. E 57, 941 (1998)]. The model represented the aging of cluster sites and differed from the original Eden model in that the occupation probability of perimeter sites depended on the time of occupation of adjacent cluster sites. This feature led to a bias in the selection of growth directions. A moderate degree of persistence was found to be crucial to reproduce the biological growth patterns under various conditions. Persistence in growth combined quick propagation in heterogeneous environments with a high probability of locating sources of nutrients.
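
    The sketch below illustrates the aging idea in an Eden-type model: each perimeter site is weighted by how recently its occupied neighbours were added, so growth tends to persist in recently active directions. The weighting rule, persistence time scale, and lattice size are illustrative assumptions, not the published model's exact formulation.

        import numpy as np

        # Eden-type growth with site aging: perimeter sites adjacent to recently
        # occupied cluster sites are more likely to be occupied next.
        rng = np.random.default_rng(2)
        tau = 5.0                                   # persistence time scale (assumed)
        occupied = {(0, 0): 0}                      # site -> occupation step
        neighbours = [(1, 0), (-1, 0), (0, 1), (0, -1)]

        for step in range(1, 2000):
            weights = {}
            for (x, y), t_occ in occupied.items():
                for dx, dy in neighbours:
                    site = (x + dx, y + dy)
                    if site not in occupied:
                        weights[site] = weights.get(site, 0.0) + np.exp(-(step - t_occ) / tau)
            sites = list(weights)
            p = np.array([weights[s] for s in sites])
            chosen = sites[rng.choice(len(sites), p=p / p.sum())]
            occupied[chosen] = step

        coords = np.array(list(occupied))
        radius = np.sqrt((coords ** 2).sum(axis=1).mean())
        print(f"{len(occupied)} sites, radius of gyration: {radius:.1f}")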

  16. Reproducibility in density functional theory calculations of solids.

    PubMed

    Lejaeghere, Kurt; Bihlmayer, Gustav; Björkman, Torbjörn; Blaha, Peter; Blügel, Stefan; Blum, Volker; Caliste, Damien; Castelli, Ivano E; Clark, Stewart J; Dal Corso, Andrea; de Gironcoli, Stefano; Deutsch, Thierry; Dewhurst, John Kay; Di Marco, Igor; Draxl, Claudia; Dułak, Marcin; Eriksson, Olle; Flores-Livas, José A; Garrity, Kevin F; Genovese, Luigi; Giannozzi, Paolo; Giantomassi, Matteo; Goedecker, Stefan; Gonze, Xavier; Grånäs, Oscar; Gross, E K U; Gulans, Andris; Gygi, François; Hamann, D R; Hasnip, Phil J; Holzwarth, N A W; Iuşan, Diana; Jochym, Dominik B; Jollet, François; Jones, Daniel; Kresse, Georg; Koepernik, Klaus; Küçükbenli, Emine; Kvashnin, Yaroslav O; Locht, Inka L M; Lubeck, Sven; Marsman, Martijn; Marzari, Nicola; Nitzsche, Ulrike; Nordström, Lars; Ozaki, Taisuke; Paulatto, Lorenzo; Pickard, Chris J; Poelmans, Ward; Probert, Matt I J; Refson, Keith; Richter, Manuel; Rignanese, Gian-Marco; Saha, Santanu; Scheffler, Matthias; Schlipf, Martin; Schwarz, Karlheinz; Sharma, Sangeeta; Tavazza, Francesca; Thunström, Patrik; Tkatchenko, Alexandre; Torrent, Marc; Vanderbilt, David; van Setten, Michiel J; Van Speybroeck, Veronique; Wills, John M; Yates, Jonathan R; Zhang, Guo-Xu; Cottenier, Stefaan

    2016-03-25

    The widespread popularity of density functional theory has given rise to an extensive range of dedicated codes for predicting molecular and crystalline properties. However, each code implements the formalism in a different way, raising questions about the reproducibility of such predictions. We report the results of a community-wide effort that compared 15 solid-state codes, using 40 different potentials or basis set types, to assess the quality of the Perdew-Burke-Ernzerhof equations of state for 71 elemental crystals. We conclude that predictions from recent codes and pseudopotentials agree very well, with pairwise differences that are comparable to those between different high-precision experiments. Older methods, however, have less precise agreement. Our benchmark provides a framework for users and developers to document the precision of new applications and methodological improvements.
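
    Comparisons of this kind are often summarized by a Delta-style figure of merit: the root-mean-square difference between two codes' equation-of-state curves E(V) over a common volume window. The sketch below evaluates such a quantity for two toy Birch-Murnaghan fits; the parameters are invented, and the exact normalization used in the published benchmark may differ.

        import numpy as np

        # Toy Delta-style comparison of two equations of state E(V) per atom.
        def birch_murnaghan(v, e0, v0, b0, b0p):
            eta = (v0 / v) ** (2.0 / 3.0)
            return e0 + 9.0 * v0 * b0 / 16.0 * (
                (eta - 1.0) ** 3 * b0p + (eta - 1.0) ** 2 * (6.0 - 4.0 * eta)
            )

        v = np.linspace(14.0, 18.0, 400)                       # volume grid [A^3/atom]
        e_code_a = birch_murnaghan(v, 0.0, 16.0, 0.62, 4.5)    # eV/atom, invented fit
        e_code_b = birch_murnaghan(v, 0.0, 16.1, 0.60, 4.4)    # slightly different fit

        # On a uniform grid the RMS difference approximates sqrt(integral / width).
        delta = np.sqrt(np.mean((e_code_a - e_code_b) ** 2))
        print(f"Delta = {delta * 1000:.2f} meV/atom")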

  17. GigaDB: promoting data dissemination and reproducibility

    PubMed Central

    Sneddon, Tam P.; Si Zhe, Xiao; Edmunds, Scott C.; Li, Peter; Goodman, Laurie; Hunter, Christopher I.

    2014-01-01

    Often papers are published where the underlying data supporting the research are not made available because of the limitations of making such large data sets publicly and permanently accessible. Even if the raw data are deposited in public archives, the essential analysis intermediaries, scripts or software are frequently not made available, meaning the science is not reproducible. The GigaScience journal is attempting to address this issue with the associated data storage and dissemination portal, the GigaScience database (GigaDB). Here we present the current version of GigaDB and reveal plans for the next generation of improvements. However, most importantly, we are soliciting responses from you, the users, to ensure that future developments are focused on the data storage and dissemination issues that still need resolving. Database URL: http://www.gigadb.org PMID:24622612

  18. Reproducibility in density functional theory calculations of solids.

    PubMed

    Lejaeghere, Kurt; Bihlmayer, Gustav; Björkman, Torbjörn; Blaha, Peter; Blügel, Stefan; Blum, Volker; Caliste, Damien; Castelli, Ivano E; Clark, Stewart J; Dal Corso, Andrea; de Gironcoli, Stefano; Deutsch, Thierry; Dewhurst, John Kay; Di Marco, Igor; Draxl, Claudia; Dułak, Marcin; Eriksson, Olle; Flores-Livas, José A; Garrity, Kevin F; Genovese, Luigi; Giannozzi, Paolo; Giantomassi, Matteo; Goedecker, Stefan; Gonze, Xavier; Grånäs, Oscar; Gross, E K U; Gulans, Andris; Gygi, François; Hamann, D R; Hasnip, Phil J; Holzwarth, N A W; Iuşan, Diana; Jochym, Dominik B; Jollet, François; Jones, Daniel; Kresse, Georg; Koepernik, Klaus; Küçükbenli, Emine; Kvashnin, Yaroslav O; Locht, Inka L M; Lubeck, Sven; Marsman, Martijn; Marzari, Nicola; Nitzsche, Ulrike; Nordström, Lars; Ozaki, Taisuke; Paulatto, Lorenzo; Pickard, Chris J; Poelmans, Ward; Probert, Matt I J; Refson, Keith; Richter, Manuel; Rignanese, Gian-Marco; Saha, Santanu; Scheffler, Matthias; Schlipf, Martin; Schwarz, Karlheinz; Sharma, Sangeeta; Tavazza, Francesca; Thunström, Patrik; Tkatchenko, Alexandre; Torrent, Marc; Vanderbilt, David; van Setten, Michiel J; Van Speybroeck, Veronique; Wills, John M; Yates, Jonathan R; Zhang, Guo-Xu; Cottenier, Stefaan

    2016-03-25

    The widespread popularity of density functional theory has given rise to an extensive range of dedicated codes for predicting molecular and crystalline properties. However, each code implements the formalism in a different way, raising questions about the reproducibility of such predictions. We report the results of a community-wide effort that compared 15 solid-state codes, using 40 different potentials or basis set types, to assess the quality of the Perdew-Burke-Ernzerhof equations of state for 71 elemental crystals. We conclude that predictions from recent codes and pseudopotentials agree very well, with pairwise differences that are comparable to those between different high-precision experiments. Older methods, however, have less precise agreement. Our benchmark provides a framework for users and developers to document the precision of new applications and methodological improvements. PMID:27013736

  19. geoknife: Reproducible web-processing of large gridded datasets

    USGS Publications Warehouse

    Read, Jordan S.; Walker, Jordan I.; Appling, Alison P.; Blodgett, David L.; Read, Emily Kara; Winslow, Luke A.

    2016-01-01

    Geoprocessing of large gridded data according to overlap with irregular landscape features is common to many large-scale ecological analyses. The geoknife R package was created to facilitate reproducible analyses of gridded datasets found on the U.S. Geological Survey Geo Data Portal web application or elsewhere, using a web-enabled workflow that eliminates the need to download and store large datasets that are reliably hosted on the Internet. The package provides access to several data subset and summarization algorithms that are available on remote web processing servers. Outputs from geoknife include spatial and temporal data subsets, spatially-averaged time series values filtered by user-specified areas of interest, and categorical coverage fractions for various land-use types.

  20. Whole blood metal ion measurement reproducibility between different laboratories.

    PubMed

    Rahmé, Michel; Lavigne, Martin; Barry, Janie; Cirtiu, Ciprian Mihai; Bélanger, Patrick; Vendittoli, Pascal-André

    2014-11-01

    Monitoring patients' metal ion blood concentrations can be useful in cases of problematic metal-on-metal hip implants. Our objective was to evaluate the reproducibility of metal ion levels measured by two different laboratories. Whole blood samples were collected from 46 patients with metal-on-metal hip arthroplasty. For each patient, two whole blood samples were collected and analyzed by two laboratories. Laboratory 1 had higher results than laboratory 2. There was a clinically significant absolute difference between the two laboratories, above the predetermined threshold, in 35% of Cr samples and 38% of Co samples. Not all laboratories use the same technologies for their measurements. Therefore, the decision to revise a metal-on-metal hip arthroplasty should rely on metal ion trends measured in the same laboratory.

  1. Initial evaluations of the reproducibility of vapor-diffusion crystallization.

    PubMed

    Newman, Janet; Xu, Jian; Willis, Michael C

    2007-07-01

    Experiments were set up to test how the crystallization drop size affects the crystallization process; in the test cases studied, increasing the drop size led to increasing numbers of crystals. Other data produced from a high-throughput automation-system run were analyzed in order to gauge the effect of replication on the success of crystallization screening. With over 40-fold multiplicity, lysozyme was found to crystallize in over half of the conditions in a standard 96-condition screen. However, despite the fact that industry-standard lysozyme was used in our tests, it was rare that we obtained crystals reproducibly; this suggests that replication whilst screening might improve the success rate of macromolecular crystallization.

  2. Investigating the reproducibility of a complex multifocal radiosurgery treatment

    PubMed Central

    Niebanck, M; Juang, T; Newton, J; Adamovics, J; Wang, Z; Oldham, M

    2013-01-01

    Stereotactic radiosurgery has become a widely used technique to treat solid tumors and secondary metastases of the brain. Multiple targets can be simultaneously treated with a single isocenter in order to reduce set-up time and improve patient comfort and workflow. In this study, a 5-arc multifocal RapidArc treatment was delivered to multiple PRESAGE® dosimeters in order to explore the repeatability of the treatment. The three delivery measurements agreed well with each other, with less than 3% standard deviation of dose in the target. The deliveries also agreed well with the treatment plan, with gamma passing rates greater than 90% (5% dose-difference and 2 mm distance-to-agreement criteria). The optical-CT PRESAGE® system provided reproducible measurements for treatment verification, provided the measurements were made immediately following treatment. PMID:27081397
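
    The gamma passing rate quoted above combines a dose-difference criterion with a distance-to-agreement criterion. The sketch below implements a simplified 1-D global gamma analysis on synthetic dose profiles using the same 5%/2 mm criteria; real verification, as in the study, is performed on full 3-D dose distributions.

        import numpy as np

        # Simplified 1-D global gamma analysis (5% dose difference, 2 mm DTA).
        def gamma_pass_rate(x, dose_ref, dose_eval, dd=0.05, dta=2.0):
            norm = dd * dose_ref.max()                 # global dose criterion
            gammas = np.empty_like(dose_ref)
            for i, (xi, di) in enumerate(zip(x, dose_ref)):
                dist2 = ((x - xi) / dta) ** 2
                dose2 = ((dose_eval - di) / norm) ** 2
                gammas[i] = np.sqrt(np.min(dist2 + dose2))
            return np.mean(gammas <= 1.0)

        x = np.linspace(-30.0, 30.0, 301)              # position [mm]
        planned = 20.0 * np.exp(-x ** 2 / (2 * 8.0 ** 2))                 # synthetic dose [Gy]
        measured = 1.02 * 20.0 * np.exp(-(x - 0.5) ** 2 / (2 * 8.0 ** 2))

        print(f"gamma pass rate: {100 * gamma_pass_rate(x, planned, measured):.1f}%")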

  3. New model for datasets citation and extraction reproducibility in VAMDC

    NASA Astrophysics Data System (ADS)

    Zwölf, Carlo Maria; Moreau, Nicolas; Dubernet, Marie-Lise

    2016-09-01

    In this paper we present a new paradigm for the identification of datasets extracted from the Virtual Atomic and Molecular Data Centre (VAMDC) e-science infrastructure. Such identification includes information on the origin and version of the datasets, references associated with individual data in the datasets, as well as timestamps linked to the extraction procedure. This paradigm is described through the modifications of the language used to exchange data within the VAMDC and through the services that will implement those modifications. This new paradigm should enforce the traceability of datasets, favor the reproducibility of dataset extraction, and facilitate the systematic citation of the authors who originally measured and/or calculated the extracted atomic and molecular data.

  4. Bottom-up coarse-grained models that accurately describe the structure, pressure, and compressibility of molecular liquids

    SciTech Connect

    Dunn, Nicholas J. H.; Noid, W. G.

    2015-12-28

    The present work investigates the capability of bottom-up coarse-graining (CG) methods for accurately modeling both structural and thermodynamic properties of all-atom (AA) models for molecular liquids. In particular, we consider 1, 2, and 3-site CG models for heptane, as well as 1 and 3-site CG models for toluene. For each model, we employ the multiscale coarse-graining method to determine interaction potentials that optimally approximate the configuration dependence of the many-body potential of mean force (PMF). We employ a previously developed “pressure-matching” variational principle to determine a volume-dependent contribution to the potential, U_V(V), that approximates the volume-dependence of the PMF. We demonstrate that the resulting CG models describe AA density fluctuations with qualitative, but not quantitative, accuracy. Accordingly, we develop a self-consistent approach for further optimizing U_V, such that the CG models accurately reproduce the equilibrium density, compressibility, and average pressure of the AA models, although the CG models still significantly underestimate the atomic pressure fluctuations. Additionally, by comparing this array of models that accurately describe the structure and thermodynamic pressure of heptane and toluene at a range of different resolutions, we investigate the impact of bottom-up coarse-graining upon thermodynamic properties. In particular, we demonstrate that U_V accounts for the reduced cohesion in the CG models. Finally, we observe that bottom-up coarse-graining introduces subtle correlations between the resolution, the cohesive energy density, and the “simplicity” of the model.

  5. Bottom-up coarse-grained models that accurately describe the structure, pressure, and compressibility of molecular liquids

    NASA Astrophysics Data System (ADS)

    Dunn, Nicholas J. H.; Noid, W. G.

    2015-12-01

    The present work investigates the capability of bottom-up coarse-graining (CG) methods for accurately modeling both structural and thermodynamic properties of all-atom (AA) models for molecular liquids. In particular, we consider 1, 2, and 3-site CG models for heptane, as well as 1 and 3-site CG models for toluene. For each model, we employ the multiscale coarse-graining method to determine interaction potentials that optimally approximate the configuration dependence of the many-body potential of mean force (PMF). We employ a previously developed "pressure-matching" variational principle to determine a volume-dependent contribution to the potential, UV(V), that approximates the volume-dependence of the PMF. We demonstrate that the resulting CG models describe AA density fluctuations with qualitative, but not quantitative, accuracy. Accordingly, we develop a self-consistent approach for further optimizing UV, such that the CG models accurately reproduce the equilibrium density, compressibility, and average pressure of the AA models, although the CG models still significantly underestimate the atomic pressure fluctuations. Additionally, by comparing this array of models that accurately describe the structure and thermodynamic pressure of heptane and toluene at a range of different resolutions, we investigate the impact of bottom-up coarse-graining upon thermodynamic properties. In particular, we demonstrate that UV accounts for the reduced cohesion in the CG models. Finally, we observe that bottom-up coarse-graining introduces subtle correlations between the resolution, the cohesive energy density, and the "simplicity" of the model.

  6. Reproducibility of MRI segmentation using a feature space method

    NASA Astrophysics Data System (ADS)

    Soltanian-Zadeh, Hamid; Windham, Joe P.; Scarpace, Lisa; Murnock, Tanya

    1998-06-01

    This paper presents reproducibility studies for the segmentation results obtained by our optimal MRI feature space method. The steps of the work accomplished are as follows. (1) Eleven patients with brain tumors were imaged by a 1.5 T General Electric Signa MRI System. Four T2-weighted and two T1-weighted images (before and after Gadolinium injection) were acquired for each patient. (2) Images of a slice through the center of the tumor were selected for processing. (3) Patient information was removed from the image headers and new names (unrecognizable by the image analysts) were given to the images. These images were blindly analyzed by the image analysts. (4) Segmentation results obtained by the two image analysts at two time points were compared to assess the reproducibility of the segmentation method. For each tissue segmented in each patient study, a comparison was done by kappa statistics and a similarity measure (an approximation of kappa statistics used by other researchers), to evaluate the number of pixels that were in both of the segmentation results obtained by the two image analysts (agreement) relative to the number of pixels that were not in both (disagreement). An overall agreement comparison was done by finding means and standard deviations of the kappa statistics and the similarity measure found for each tissue type in the studies. The kappa statistic for white matter was the largest (0.80), followed by those for gray matter (0.68), partial volume (0.67), total lesion (0.66), and CSF (0.44). The similarity measure showed the same trend but was always higher than the kappa statistic. It was 0.85 for white matter, 0.77 for gray matter, 0.73 for partial volume, 0.72 for total lesion, and 0.47 for CSF.
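
    The agreement statistics described above can be reproduced on any pair of segmentation masks. The sketch below computes Cohen's kappa and a Dice-style similarity index on two synthetic binary masks; the masks and noise levels are invented, and the Dice index is only assumed here to correspond to the paper's "similarity measure".

        import numpy as np

        # Two synthetic analysts' masks: the same square "lesion" with different
        # amounts of boundary/labelling noise (illustrative only).
        rng = np.random.default_rng(3)
        truth = np.zeros((128, 128), dtype=bool)
        truth[40:90, 40:90] = True
        mask_a = truth ^ (rng.random(truth.shape) < 0.02)
        mask_b = truth ^ (rng.random(truth.shape) < 0.03)

        # Cohen's kappa over all pixels (agreement corrected for chance).
        p_observed = np.mean(mask_a == mask_b)
        p_a, p_b = mask_a.mean(), mask_b.mean()
        p_expected = p_a * p_b + (1 - p_a) * (1 - p_b)
        kappa = (p_observed - p_expected) / (1 - p_expected)

        # Dice-style similarity index between the two masks.
        dice = 2 * np.logical_and(mask_a, mask_b).sum() / (mask_a.sum() + mask_b.sum())
        print(f"kappa = {kappa:.2f}, similarity (Dice) = {dice:.2f}")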

  7. Galaxy Zoo: reproducing galaxy morphologies via machine learning

    NASA Astrophysics Data System (ADS)

    Banerji, Manda; Lahav, Ofer; Lintott, Chris J.; Abdalla, Filipe B.; Schawinski, Kevin; Bamford, Steven P.; Andreescu, Dan; Murray, Phil; Raddick, M. Jordan; Slosar, Anze; Szalay, Alex; Thomas, Daniel; Vandenberg, Jan

    2010-07-01

    We present morphological classifications obtained using machine learning for objects in the Sloan Digital Sky Survey DR6 that have been classified by Galaxy Zoo into three classes, namely early types, spirals and point sources/artefacts. An artificial neural network is trained on a subset of objects classified by the human eye, and we test whether the machine-learning algorithm can reproduce the human classifications for the rest of the sample. We find that the success of the neural network in matching the human classifications depends crucially on the set of input parameters chosen for the machine-learning algorithm. The colours and parameters associated with profile fitting are reasonably effective at separating the objects into three classes. However, these results are considerably improved by adding adaptive shape parameters as well as concentration and texture. The adaptive moments, concentration and texture parameters alone cannot distinguish between early type galaxies and the point sources/artefacts. Using a set of 12 parameters, the neural network is able to reproduce the human classifications to better than 90 per cent for all three morphological classes. We find that using a training set that is incomplete in magnitude does not degrade our results, given our particular choice of input parameters to the network. We conclude that it is promising to use machine-learning algorithms to perform morphological classification for the next generation of wide-field imaging surveys and that the Galaxy Zoo catalogue provides an invaluable training set for such purposes. This publication has been made possible by the participation of more than 100000 volunteers in the Galaxy Zoo project. Their contributions are individually acknowledged at http://www.galaxyzoo.org/Volunteers.aspx.
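
    The workflow described above, training on human classifications and then testing how often the machine reproduces them, can be sketched with off-the-shelf tools. The example below uses synthetic colour/concentration/texture-like features and a scikit-learn multilayer perceptron rather than the paper's network; the feature values and class separations are invented.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier

        # Synthetic three-class problem standing in for early types, spirals, and
        # point sources/artefacts; features mimic (colour, concentration, texture).
        rng = np.random.default_rng(4)
        n = 3000
        labels = rng.integers(0, 3, n)
        centres = np.array([[2.2, 0.45, 3.0],
                            [1.2, 0.35, 2.2],
                            [0.8, 0.50, 1.0]])
        features = centres[labels] + rng.normal(0.0, 0.25, (n, 3))

        x_train, x_test, y_train, y_test = train_test_split(
            features, labels, test_size=0.3, random_state=0)
        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
        clf.fit(x_train, y_train)
        print(f"agreement with 'human' labels: {100 * clf.score(x_test, y_test):.1f}%")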

  8. A workflow for reproducing mean benthic gas fluxes

    NASA Astrophysics Data System (ADS)

    Fulweiler, Robinson W.; Emery, Hollie E.; Maguire, Timothy J.

    2016-08-01

    Long-term data sets provide unique opportunities to examine temporal variability of key ecosystem processes. The need for such data sets is becoming increasingly important as we try to quantify the impact of human activities across various scales and in some cases, as we try to determine the success of management interventions. Unfortunately, long-term benthic flux data sets for coastal ecosystems are rare and curating them is a challenge. If we wish to make our data available to others now and into the future, however, then we need to provide mechanisms that allow others to understand our methods, access the data, reproduce the results, and see updates as they become available. Here we use techniques, learned through the EarthCube Ontosoft Geoscience Paper of the Future project, to develop best practices to allow us to share a long-term data set of directly measured net sediment N2 fluxes and sediment oxygen demand at two sites in Narragansett Bay, Rhode Island (USA). This technical report describes the process we used, the challenges we faced, and the steps we will take in the future to ensure transparency and reproducibility. By developing these data and software sharing tools we hope to help disseminate well-curated data with provenance as well as products from these data, so that the community can better assess how this temperate estuary has changed over time. We also hope to provide a data sharing model for others to follow so that long-term estuarine data are more easily shared and not lost over time.

  9. Repeatability and Reproducibility of Manual Choroidal Volume Measurements Using Enhanced Depth Imaging Optical Coherence Tomography

    PubMed Central

    Chhablani, Jay; Barteselli, Giulio; Wang, Haiyan; El-Emam, Sharif; Kozak, Igor; Doede, Aubrey L.; Bartsch, Dirk-Uwe; Cheng, Lingyun; Freeman, William R.

    2012-01-01

    Purpose To evaluate the repeatability and reproducibility of manual choroidal volume (CV) measurements by spectral-domain optical coherence tomography (SD-OCT) using enhanced depth imaging (EDI). Methods Sixty eyes of 32 patients with or without ocular chorioretinal diseases were enrolled prospectively. Thirty-one choroidal scans were performed on each eye, centered at the fovea, using a raster protocol. Two masked observers demarcated choroidal boundaries by using built-in automated retinal segmentation software in two separate sessions. Observers were masked to each other's and their own previous readings. A standardized grid centered on the fovea was positioned automatically by the OCT software, and values for average CVs and total CVs in three concentric rings were noted. The agreement between intraobserver measurements and between interobserver measurements was assessed using the concordance correlation coefficient (CCC). Bland-Altman plots were used to assess the clinically relevant magnitude of differences between inter- and intraobserver measurements. Results The interobserver CCC for the overall average CV was very high, 0.9956 (95% confidence interval [CI], 0.991–0.9968). CCCs for all three Early Treatment Diabetic Retinopathy Study concentric rings between the two graders were 0.98 to 0.99 (95% CI, 0.97–0.98). Similarly, the intraobserver repeatability for the two graders ranged from 0.98 to 0.99. The interobserver coefficient of reproducibility was approximately 0.42 mm3 (95% CI, 0.34–0.5 mm3) for the average CV. Conclusions CV measurement by manual segmentation using built-in automated retinal segmentation software on EDI-SD-OCT is highly reproducible and repeatable and has a very small range of variability. PMID:22427584
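
    The concordance correlation coefficient (CCC) used above can be computed directly from paired measurements. The sketch below implements Lin's CCC on two invented sets of observer readings; the values are illustrative only.

        import numpy as np

        # Lin's concordance correlation coefficient between two observers.
        def concordance_cc(x, y):
            mx, my = x.mean(), y.mean()
            vx, vy = x.var(), y.var()                    # population variances
            covariance = np.mean((x - mx) * (y - my))
            return 2 * covariance / (vx + vy + (mx - my) ** 2)

        observer_1 = np.array([8.9, 9.4, 7.8, 10.2, 8.5, 9.0, 7.6, 9.8])  # mm^3, invented
        observer_2 = np.array([9.0, 9.3, 7.9, 10.0, 8.6, 9.1, 7.8, 9.7])
        print(f"CCC = {concordance_cc(observer_1, observer_2):.3f}")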

  10. Toward Transparent and Reproducible Science: Using Open Source "Big Data" Tools for Water Resources Assessment

    NASA Astrophysics Data System (ADS)

    Buytaert, W.; Zulkafli, Z. D.; Vitolo, C.

    2014-12-01

    Transparency and reproducibility are fundamental properties of good science. In the current era of large and diverse datasets and long and complex workflows for data analysis and inference, ensuring such transparency and reproducibility is challenging. Hydrological science is a good case in point, because the discipline typically uses a large variety of datasets ranging from local observations to large-scale remotely sensed products. These data are often obtained from various different sources, and integrated using complex yet uncertain modelling tools. In this paper, we present and discuss methods of ensuring transparency and reproducibility in scientific workflows for hydrological data analysis for the purpose of water resources assessment, using relevant examples of emerging open source "big data" tools. First, we discuss standards for data storage, access, and processing that allow improving the modularity of a hydrological analysis workflow. In particular, standards emerging from the Open Geospatial Consortium, such as the Sensor Observation Service and the Web Coverage Service, hold promise. However, some bottlenecks, such as the availability of data models and the ability to work with spatio-temporal subsets of large datasets, need further development. Next, we focus on available methods to build transparent data processing workflows. Again, standards such as OGC's Web Processing Service are being developed to facilitate web-based analytics. Yet, in practice, the experimental nature of these standards and web services in general often requires a more pragmatic approach. The availability of web technologies in popular open source data analysis environments such as R and Python often makes them an attractive solution for workflow creation and sharing. Lastly, we elaborate on the potential that open source solutions hold in the context of participatory approaches to data collection and knowledge generation. Using examples from the tropical Andes and the Himalayas, we

  11. Can Appraisers Rate Work Performance Accurately?

    ERIC Educational Resources Information Center

    Hedge, Jerry W.; Laue, Frances J.

    The ability of individuals to make accurate judgments about others is examined and literature on this subject is reviewed. A wide variety of situational factors affects the appraisal of performance. It is generally accepted that the purpose of the appraisal influences the accuracy of the appraiser. The instrumentation, or tools, available to the…

  12. Accurate pointing of tungsten welding electrodes

    NASA Technical Reports Server (NTRS)

    Ziegelmeier, P.

    1971-01-01

    Thoriated-tungsten is pointed accurately and quickly by using sodium nitrite. Point produced is smooth and no effort is necessary to hold the tungsten rod concentric. The chemically produced point can be used several times longer than ground points. This method reduces time and cost of preparing tungsten electrodes.

  13. Reproducibility of UAV-based earth surface topography based on structure-from-motion algorithms.

    NASA Astrophysics Data System (ADS)

    Clapuyt, François; Vanacker, Veerle; Van Oost, Kristof

    2014-05-01

    A representation of the earth surface at very high spatial resolution is crucial for accurately mapping small geomorphic landforms with high precision. Very high resolution digital surface models (DSMs) can then be used to quantify changes in earth surface topography over time, based on differencing of DSMs taken at various moments in time. However, it is essential to have both high accuracy for each topographic representation and consistency between measurements over time, as DSM differencing automatically leads to error propagation. This study investigates the reproducibility of reconstructions of earth surface topography based on structure-from-motion (SfM) algorithms. To this end, we equipped an eight-propeller drone with a standard reflex camera. This equipment can easily be deployed in the field, as it is a lightweight, low-cost system in comparison with classic aerial photo surveys and terrestrial or airborne LiDAR scanning. Four sets of aerial photographs were created for one test field. The sets of airphotos differ in focal length and viewing angle, i.e. nadir view versus ground-level view. In addition, the importance of the accuracy of ground control points for the construction of a georeferenced point cloud was assessed using two different GPS devices with horizontal accuracies at the sub-meter and sub-decimeter level, respectively. Airphoto datasets were processed with the SfM algorithm and the resulting point clouds were georeferenced. Then, the surface representations were compared with each other to assess the reproducibility of the earth surface topography. Finally, consistency between independent datasets is discussed.
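
    Once two surface models are available, DSM differencing with explicit error propagation follows the pattern sketched below: the per-survey vertical uncertainties combine into a level of detection, and only cells exceeding it are treated as real change. The grids, uncertainties, and 95% threshold are illustrative assumptions, not values from this study.

        import numpy as np

        # DSM-of-difference with a simple propagated level of detection (LoD).
        rng = np.random.default_rng(5)
        shape = (200, 200)
        dsm_t1 = rng.normal(100.0, 0.5, shape)                   # elevations [m], survey 1
        dsm_t2 = dsm_t1 + 0.02 + rng.normal(0.0, 0.05, shape)    # small change + noise
        sigma_1, sigma_2 = 0.04, 0.05                            # vertical errors [m], assumed

        dod = dsm_t2 - dsm_t1                                    # DSM of difference
        lod = 1.96 * np.sqrt(sigma_1 ** 2 + sigma_2 ** 2)        # 95% level of detection
        significant = np.abs(dod) > lod
        print(f"LoD = {lod:.3f} m, significant change in {100 * significant.mean():.1f}% of cells")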

  14. Reproducible computational biology experiments with SED-ML - The Simulation Experiment Description Markup Language

    PubMed Central

    2011-01-01

    Background The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. Results In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. Conclusions With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from

  15. Novel TPLO Alignment Jig/Saw Guide Reproduces Freehand and Ideal Osteotomy Positions

    PubMed Central

    2016-01-01

    Objectives To evaluate the ability of an alignment jig/saw guide to reproduce appropriate osteotomy positions in the tibial plateau leveling osteotomy (TPLO) in the dog. Methods Lateral radiographs of 65 clinical TPLO procedures using an alignment jig and freehand osteotomy performed by experienced TPLO surgeons using a 24 mm radial saw blade between Dec 2005–Dec 2007 and Nov 2013–Nov 2015 were reviewed. The freehand osteotomy position was compared to potential osteotomy positions using the alignment jig/saw guide. The proximal and distal jig pin holes on postoperative radiographs were used to align the jig to the bone; saw guide position was selected to most closely match the osteotomy performed. The guide-to-osteotomy fit was categorized by the distance between the actual osteotomy and proposed saw guide osteotomy at its greatest offset (≤1 mm = excellent; ≤2 mm = good; ≤3 mm = satisfactory; >3 mm = poor). Results Sixty-four of 65 TPLO osteotomies could be matched satisfactorily by the saw guide. Proximal jig pin placement 3–4 mm from the joint surface and pin location in a craniocaudal plane on the proximal tibia were significantly associated with the guide-to-osteotomy fit (P = 0.021 and P = 0.047, respectively). Clinical Significance The alignment jig/saw guide can be used to reproduce appropriate freehand osteotomy position for TPLO. Furthermore, an ideal osteotomy position centered on the tibial intercondylar tubercles also is possible. Accurate placement of the proximal jig pin is a crucial step for correct positioning of the saw guide in either instance. PMID:27556230

  16. Reproducing static and dynamic biodiversity patterns in tropical forests: the critical role of environmental variance.

    PubMed

    Fung, Tak; O'Dwyer, James P; Rahman, Kassim Abd; Fletcher, Christine D; Chisholm, Ryan A

    2016-05-01

    Ecological communities are subjected to stochasticity in the form of demographic and environmental variance. Stochastic models that contain only demographic variance (neutral models) provide close quantitative fits to observed species-abundance distributions (SADs) but substantially underestimate observed temporal species-abundance fluctuations. To provide a holistic assessment of whether models with demographic and environmental variance perform better than neutral models, the fit of both to SADs and temporal species-abundance fluctuations at the same time has to be tested quantitatively. In this study, we quantitatively test how closely a model with demographic and environmental variance reproduces total numbers of species, total abundances, SADs and temporal species-abundance fluctuations for two tropical forest tree communities, using decadal data from long-term monitoring plots and considering individuals larger than two size thresholds for each community. We find that the model can indeed closely reproduce these static and dynamic patterns of biodiversity in the two communities for the two size thresholds, with better overall fits than corresponding neutral models. Therefore, our results provide evidence that stochastic models incorporating demographic and environmental variance can simultaneously capture important static and dynamic biodiversity patterns arising in tropical forest communities. PMID:27349097
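
    A minimal way to contrast demographic-only and demographic-plus-environmental stochasticity, loosely in the spirit of the models discussed above, is sketched below: Poisson sampling of offspring numbers supplies demographic variance, and a random per-species growth-rate term drawn each year supplies environmental variance. The dynamics, parameter values, and summary statistic are illustrative assumptions, not the authors' fitted model.

        import numpy as np

        # Toy community dynamics with demographic (Poisson) and environmental
        # (random growth rate) variance; compare sigma_env = 0 with sigma_env > 0.
        rng = np.random.default_rng(6)

        def simulate(n_species=200, years=50, sigma_env=0.0):
            n = rng.integers(5, 500, n_species).astype(float)    # initial abundances
            series = [n.copy()]
            for _ in range(years):
                r = rng.normal(0.0, sigma_env, n_species)        # environmental term
                n = rng.poisson(n * np.exp(r)).astype(float)     # demographic term
                series.append(n.copy())
            return np.array(series)

        for sigma in (0.0, 0.3):
            traj = simulate(sigma_env=sigma)
            step_sd = np.std(np.diff(np.log(traj + 1.0), axis=0))  # temporal fluctuations
            print(f"sigma_env={sigma}: log-abundance step SD = {step_sd:.2f}")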

  18. Visual-Stratigraphic Dating of the GISP2 Ice Core: Basis, Reproducibility, and Application

    NASA Technical Reports Server (NTRS)

    Alley, R. B.; Shuman, C. A.; Meese, D. A.; Gow, A. J.; Taylor, K. C.; Cuffey, K. M.; Fitzpatrick, J. J.; Grootes, P. M.; Zielinski, G. A.; Ram, M.; Spinelli, G.; Elder, B.

    1997-01-01

    Annual layers are visible in the Greenland Ice Sheet Project 2 ice core from central Greenland, allowing rapid dating of the core. Changes in bubble and grain structure caused by near-surface, primarily summertime formation of hoar complexes provide the main visible annual marker in the Holocene, and changes in "cloudiness" of the ice correlated with dustiness mark Wisconsinan annual cycles; both markers are evident and have been intercalibrated in early Holocene ice. Layer counts are reproducible between different workers and for one worker at different times, with 1% error over century-length times in the Holocene. Reproducibility is typically 5% in Wisconsinan ice-age ice and decreases with increasing age and depth. Cumulative ages from visible stratigraphy are not significantly different from independent ages of prominent events for ice older than the historical record and younger than approximately 50,000 years. Visible observations are not greatly degraded by "brittle ice" or many other core-quality problems, allowing construction of long, consistently sampled time series. High accuracy requires careful study of the core by dedicated observers.

  19. Symphony: A Framework for Accurate and Holistic WSN Simulation

    PubMed Central

    Riliskis, Laurynas; Osipov, Evgeny

    2015-01-01

    Research on wireless sensor networks has progressed rapidly over the last decade, and these technologies have been widely adopted for both industrial and domestic uses. Several operating systems have been developed, along with a multitude of network protocols for all layers of the communication stack. Industrial Wireless Sensor Network (WSN) systems must satisfy strict criteria and are typically more complex and larger in scale than domestic systems. Together with the non-deterministic behavior of network hardware in real settings, this greatly complicates the debugging and testing of WSN functionality. To facilitate the testing, validation, and debugging of large-scale WSN systems, we have developed a simulation framework that accurately reproduces the processes that occur inside real equipment, including both hardware- and software-induced delays. The core of the framework consists of a virtualized operating system and an emulated hardware platform that is integrated with the general purpose network simulator ns-3. Our framework enables the user to adjust the real code base as would be done in real deployments and also to test the boundary effects of different hardware components on the performance of distributed applications and protocols. Additionally we have developed a clock emulator with several different skew models and a component that handles sensory data feeds. The new framework should substantially shorten WSN application development cycles. PMID:25723144
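
    The clock emulator mentioned above can be pictured as a transformation of an ideal time base by a per-node skew model. The sketch below shows one common choice (constant drift plus an offset and Gaussian read jitter); the class name, parameters, and model are assumptions for illustration and are not taken from Symphony's code base.

        import random

        class SkewedClock:
            """Emulated node clock: local = (1 + drift) * true + offset + jitter."""
            def __init__(self, drift_ppm=50.0, offset_s=0.002, jitter_sd_s=1e-5, seed=0):
                self.drift = drift_ppm * 1e-6
                self.offset = offset_s
                self.jitter_sd = jitter_sd_s
                self.rng = random.Random(seed)

            def read(self, true_time_s: float) -> float:
                jitter = self.rng.gauss(0.0, self.jitter_sd)
                return (1.0 + self.drift) * true_time_s + self.offset + jitter

        clock = SkewedClock()
        for t in (0.0, 10.0, 100.0):
            print(t, round(clock.read(t), 6))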

  20. Symphony: a framework for accurate and holistic WSN simulation.

    PubMed

    Riliskis, Laurynas; Osipov, Evgeny

    2015-01-01

    Research on wireless sensor networks has progressed rapidly over the last decade, and these technologies have been widely adopted for both industrial and domestic uses. Several operating systems have been developed, along with a multitude of network protocols for all layers of the communication stack. Industrial Wireless Sensor Network (WSN) systems must satisfy strict criteria and are typically more complex and larger in scale than domestic systems. Together with the non-deterministic behavior of network hardware in real settings, this greatly complicates the debugging and testing of WSN functionality. To facilitate the testing, validation, and debugging of large-scale WSN systems, we have developed a simulation framework that accurately reproduces the processes that occur inside real equipment, including both hardware- and software-induced delays. The core of the framework consists of a virtualized operating system and an emulated hardware platform that is integrated with the general purpose network simulator ns-3. Our framework enables the user to adjust the real code base as would be done in real deployments and also to test the boundary effects of different hardware components on the performance of distributed applications and protocols. Additionally we have developed a clock emulator with several different skew models and a component that handles sensory data feeds. The new framework should substantially shorten WSN application development cycles.

  1. A highly accurate ab initio potential energy surface for methane

    NASA Astrophysics Data System (ADS)

    Owens, Alec; Yurchenko, Sergei N.; Yachmenev, Andrey; Tennyson, Jonathan; Thiel, Walter

    2016-09-01

    A new nine-dimensional potential energy surface (PES) for methane has been generated using state-of-the-art ab initio theory. The PES is based on explicitly correlated coupled cluster calculations with extrapolation to the complete basis set limit and incorporates a range of higher-level additive energy corrections. These include core-valence electron correlation, higher-order coupled cluster terms beyond perturbative triples, scalar relativistic effects, and the diagonal Born-Oppenheimer correction. Sub-wavenumber accuracy is achieved for the majority of experimentally known vibrational energy levels with the four fundamentals of (12)CH4 reproduced with a root-mean-square error of 0.70 cm(-1). The computed ab initio equilibrium C-H bond length is in excellent agreement with previous values despite pure rotational energies displaying minor systematic errors as J (rotational excitation) increases. It is shown that these errors can be significantly reduced by adjusting the equilibrium geometry. The PES represents the most accurate ab initio surface to date and will serve as a good starting point for empirical refinement.
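
    The 0.70 cm(-1) figure quoted above is a root-mean-square deviation over computed versus observed band centres. The sketch below shows the statistic itself; the numerical band centres are placeholders for illustration and are not the paper's data.

        import math

        def rms_error(computed, observed):
            """Root-mean-square deviation between computed and observed term values (cm-1)."""
            residuals = [c - o for c, o in zip(computed, observed)]
            return math.sqrt(sum(r * r for r in residuals) / len(residuals))

        # Placeholder band centres (cm-1) for four fundamentals, for illustration only.
        observed_cm1 = [1310.8, 1533.3, 2916.5, 3019.5]
        computed_cm1 = [1311.4, 1532.6, 2917.3, 3018.9]
        print(round(rms_error(computed_cm1, observed_cm1), 2))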

  2. A highly accurate ab initio potential energy surface for methane.

    PubMed

    Owens, Alec; Yurchenko, Sergei N; Yachmenev, Andrey; Tennyson, Jonathan; Thiel, Walter

    2016-09-14

    A new nine-dimensional potential energy surface (PES) for methane has been generated using state-of-the-art ab initio theory. The PES is based on explicitly correlated coupled cluster calculations with extrapolation to the complete basis set limit and incorporates a range of higher-level additive energy corrections. These include core-valence electron correlation, higher-order coupled cluster terms beyond perturbative triples, scalar relativistic effects, and the diagonal Born-Oppenheimer correction. Sub-wavenumber accuracy is achieved for the majority of experimentally known vibrational energy levels with the four fundamentals of (12)CH4 reproduced with a root-mean-square error of 0.70 cm(-1). The computed ab initio equilibrium C-H bond length is in excellent agreement with previous values despite pure rotational energies displaying minor systematic errors as J (rotational excitation) increases. It is shown that these errors can be significantly reduced by adjusting the equilibrium geometry. The PES represents the most accurate ab initio surface to date and will serve as a good starting point for empirical refinement. PMID:27634258

  3. Towards Accurate Molecular Modeling of Plastic Bonded Explosives

    NASA Astrophysics Data System (ADS)

    Chantawansri, T. L.; Andzelm, J.; Taylor, D.; Byrd, E.; Rice, B.

    2010-03-01

    There is substantial interest in identifying the controlling factors that influence the susceptibility of polymer bonded explosives (PBXs) to accidental initiation. Numerous Molecular Dynamics (MD) simulations of PBXs using the COMPASS force field have been reported in recent years, where the validity of the force field in modeling the solid EM fill has been judged solely on its ability to reproduce lattice parameters, which is an insufficient metric. Performance of the COMPASS force field in modeling EMs and the polymeric binder has been assessed by calculating structural, thermal, and mechanical properties, where only fair agreement with experimental data is obtained. We performed MD simulations using the COMPASS force field for the polymer binder hydroxyl-terminated polybutadiene and five EMs: cyclotrimethylenetrinitramine, 1,3,5,7-tetranitro-1,3,5,7-tetraazacyclooctane, 2,4,6,8,10,12-hexanitrohexaazaisowurtzitane, 2,4,6-trinitro-1,3,5-benzenetriamine, and pentaerythritol tetranitrate. Predicted EM crystallographic and molecular structural parameters, as well as calculated properties for the binder, will be compared with experimental results for different simulation conditions. We also present novel simulation protocols, which improve agreement between experimental and computational results, thus leading to more accurate modeling of PBXs.

  5. Accurate ab initio vibrational energies of methyl chloride

    SciTech Connect

    Owens, Alec; Yurchenko, Sergei N.; Yachmenev, Andrey; Tennyson, Jonathan; Thiel, Walter

    2015-06-28

    Two new nine-dimensional potential energy surfaces (PESs) have been generated using high-level ab initio theory for the two main isotopologues of methyl chloride, CH3(35)Cl and CH3(37)Cl. The respective PESs, CBS-35(HL) and CBS-37(HL), are based on explicitly correlated coupled cluster calculations with extrapolation to the complete basis set (CBS) limit, and incorporate a range of higher-level (HL) additive energy corrections to account for core-valence electron correlation, higher-order coupled cluster terms, scalar relativistic effects, and diagonal Born-Oppenheimer corrections. Variational calculations of the vibrational energy levels were performed using the computer program TROVE, whose functionality has been extended to handle molecules of the form XY3Z. Fully converged energies were obtained by means of a complete vibrational basis set extrapolation. The CBS-35(HL) and CBS-37(HL) PESs reproduce the fundamental term values with root-mean-square errors of 0.75 and 1.00 cm(-1), respectively. An analysis of the combined effect of the HL corrections and CBS extrapolation on the vibrational wavenumbers indicates that both are needed to compute accurate theoretical results for methyl chloride. We believe that it would be extremely challenging to go beyond the accuracy currently achieved for CH3Cl without empirical refinement of the respective PESs.
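
    For context, the complete basis set (CBS) limit referred to here is commonly estimated from calculations in two basis sets with cardinal numbers X < Y via an inverse-cubic extrapolation of the correlation energy. One widely used form, shown only as background and not necessarily the exact scheme used in this work, is

        E_{X} = E_{\mathrm{CBS}} + A\,X^{-3}
        \quad\Longrightarrow\quad
        E_{\mathrm{CBS}} \approx \frac{X^{3}E_{X} - Y^{3}E_{Y}}{X^{3} - Y^{3}},

    where E_X and E_Y are the energies obtained with the two basis sets and the constant A is eliminated between the two equations.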

  6. Feedback about More Accurate versus Less Accurate Trials: Differential Effects on Self-Confidence and Activation

    ERIC Educational Resources Information Center

    Badami, Rokhsareh; VaezMousavi, Mohammad; Wulf, Gabriele; Namazizadeh, Mahdi

    2012-01-01

    One purpose of the present study was to examine whether self-confidence or anxiety would be differentially affected by feedback from more accurate rather than less accurate trials. The second purpose was to determine whether arousal variations (activation) would predict performance. On Day 1, participants performed a golf putting task under one of…

  7. General theory of experiment containing reproducible data: The reduction to an ideal experiment

    NASA Astrophysics Data System (ADS)

    Nigmatullin, Raoul R.; Zhang, Wei; Striccoli, Domenico

    2015-10-01

    The authors suggest a general theory for considering all experiments associated with measurements of reproducible data in one unified scheme. The suggested algorithm does not contain unjustified suppositions, and the final function extracted from these measurements can be compared with the hypothesis suggested by the theory adopted for the explanation of the object/phenomenon studied. This true function is free from the influence of the apparatus (instrumental) function and, when a "best fit" or most acceptable hypothesis is absent, can be presented as a segment of the Fourier series. The discrete set of the decomposition coefficients describes the final function quantitatively and can serve as an intermediate model that coincides with the amplitude-frequency response (AFR) of the object studied. It can also be used by theoreticians for comparison of the suggested theory with experimental observations. Two examples (Raman spectra of distilled water and packet exchange between two wireless sensor nodes) confirm the basic elements of this general theory. From this general theory the following important conclusions follow: 1. The Prony decomposition should be used in the detection of quasi-periodic processes and for quantitative description of reproducible data. 2. A segment of the Fourier series should be used as the fitting function for describing observable data corresponding to an ideal experiment. The transition from the initial Prony decomposition to the conventional Fourier transform also implies the elimination of the apparatus function, which plays an important role in reproducible data processing. 3. The suggested theory will be helpful for the creation of a unified metrological standard (UMS) that should be used in comparison of similar data obtained from the same object studied but in different laboratories with the usage of different equipment. 4. Many cases when the conventional theory confirms the experimental
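
    The "segment of the Fourier series" advocated in conclusion 2 above can be fitted to reproducible data by ordinary least squares against a truncated trigonometric basis. The sketch below is a generic illustration of that step only; the harmonic count, period, and synthetic data are assumptions, not the authors' procedure.

        import numpy as np

        def fit_fourier_segment(t, y, n_harmonics=5, period=None):
            """Least-squares fit of y(t) to a truncated Fourier series
            a0 + sum_k [a_k cos(2*pi*k*t/T) + b_k sin(2*pi*k*t/T)]."""
            t = np.asarray(t, dtype=float)
            T = period if period is not None else t[-1] - t[0]
            cols = [np.ones_like(t)]
            for k in range(1, n_harmonics + 1):
                cols.append(np.cos(2 * np.pi * k * t / T))
                cols.append(np.sin(2 * np.pi * k * t / T))
            A = np.column_stack(cols)
            coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
            return coeffs, A @ coeffs          # coefficients and fitted values

        t = np.linspace(0.0, 1.0, 200)
        y = np.sin(2 * np.pi * 3 * t) + 0.1 * np.random.default_rng(0).standard_normal(t.size)
        coeffs, fit = fit_fourier_segment(t, y, n_harmonics=5, period=1.0)
        print(coeffs.round(3))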

  8. Feedback about more accurate versus less accurate trials: differential effects on self-confidence and activation.

    PubMed

    Badami, Rokhsareh; VaezMousavi, Mohammad; Wulf, Gabriele; Namazizadeh, Mahdi

    2012-06-01

    One purpose of the present study was to examine whether self-confidence or anxiety would be differentially affected by feedback from more accurate rather than less accurate trials. The second purpose was to determine whether arousal variations (activation) would predict performance. On day 1, participants performed a golf putting task under one of two conditions: one group received feedback on the most accurate trials, whereas another group received feedback on the least accurate trials. On day 2, participants completed an anxiety questionnaire and performed a retention test. Skin conductance level, as a measure of arousal, was determined. The results indicated that feedback about more accurate trials resulted in more effective learning as well as increased self-confidence. Also, activation was a predictor of performance. PMID:22808705

  9. Two highly accurate methods for pitch calibration

    NASA Astrophysics Data System (ADS)

    Kniel, K.; Härtig, F.; Osawa, S.; Sato, O.

    2009-11-01

    Among profile, helix, and tooth thickness, pitch is one of the most important parameters in involute gear measurement evaluation. In principle, coordinate measuring machines (CMM) and CNC-controlled gear measuring machines, as a variant of a CMM, are suited for these kinds of gear measurements. Now the Japan National Institute of Advanced Industrial Science and Technology (NMIJ/AIST) and the German national metrology institute, the Physikalisch-Technische Bundesanstalt (PTB), have each independently developed highly accurate pitch calibration methods applicable to CMMs or gear measuring machines. Both calibration methods are based on the so-called closure technique, which allows the separation of the systematic errors of the measurement device from the errors of the gear. For the verification of both calibration methods, NMIJ/AIST and PTB performed measurements on a specially designed pitch artifact. The comparison of the results shows that both methods can be used for highly accurate calibrations of pitch standards.
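
    The closure technique named above can be illustrated with a toy model in which every reading is the sum of an artifact (gear) deviation and an instrument deviation; measuring the artifact in every rotational position lets the two contributions be separated by averaging, because each deviation set sums to zero over a full closure. The sketch below only illustrates that principle and is not the NMIJ/AIST or PTB procedure.

        import numpy as np

        rng = np.random.default_rng(42)
        N = 18                                  # angular positions / teeth
        gear = rng.normal(0.0, 1.0, N)
        gear -= gear.mean()                     # artifact deviations sum to zero
        machine = rng.normal(0.0, 0.5, N)
        machine -= machine.mean()               # instrument deviations sum to zero

        # Closure measurements: artifact rotated by r positions, read at machine position i.
        meas = np.array([[gear[(i + r) % N] + machine[i] for i in range(N)]
                         for r in range(N)])

        machine_est = meas.mean(axis=0)         # gear deviations average out
        gear_est = np.array([np.mean([meas[r, (k - r) % N] for r in range(N)])
                             for k in range(N)])  # instrument deviations average out
        print(np.max(np.abs(machine_est - machine)), np.max(np.abs(gear_est - gear)))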

  10. Accurate guitar tuning by cochlear implant musicians.

    PubMed

    Lu, Thomas; Huang, Juan; Zeng, Fan-Gang

    2014-01-01

    Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show unexpected results that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with CI than his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that tuning error was larger at ∼ 30 Hz for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task. PMID:24651081
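
    Tuning by beats, as described above, exploits the fact that two tones at nearby frequencies f1 and f2 produce an amplitude envelope that waxes and wanes at |f1 - f2| Hz, a slow modulation that can be heard even when the pitch difference itself cannot. A minimal numeric sketch follows; the frequencies are chosen only for illustration.

        import math

        f1, f2 = 110.0, 110.4          # Hz: open A string versus a slightly sharp reference
        sample_rate, duration = 8000, 2.0
        beat_frequency = abs(f1 - f2)  # 0.4 Hz amplitude envelope

        mixture = [math.sin(2 * math.pi * f1 * n / sample_rate) +
                   math.sin(2 * math.pi * f2 * n / sample_rate)
                   for n in range(int(sample_rate * duration))]
        print(beat_frequency, round(max(mixture), 3))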

  11. Accurate Guitar Tuning by Cochlear Implant Musicians

    PubMed Central

    Lu, Thomas; Huang, Juan; Zeng, Fan-Gang

    2014-01-01

    Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show unexpected results that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with CI than his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that tuning error was larger at ∼30 Hz for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task. PMID:24651081

  12. Preparation and accurate measurement of pure ozone.

    PubMed

    Janssen, Christof; Simone, Daniela; Guinet, Mickaël

    2011-03-01

    Preparation of high purity ozone as well as precise and accurate measurement of its pressure are metrological requirements that are difficult to meet due to ozone decomposition occurring in pressure sensors. The most stable and precise transducer heads are heated and, therefore, prone to accelerated ozone decomposition, limiting measurement accuracy and compromising purity. Here, we describe a vacuum system and a method for ozone production, suitable to accurately determine the pressure of pure ozone by avoiding the problem of decomposition. We use an inert gas in a particularly designed buffer volume and can thus achieve high measurement accuracy and negligible degradation of ozone with purities of 99.8% or better. The high degree of purity is ensured by comprehensive compositional analyses of ozone samples. The method may also be applied to other reactive gases. PMID:21456766

  14. Accurate modeling of parallel scientific computations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Townsend, James C.

    1988-01-01

    Scientific codes are usually parallelized by partitioning a grid among processors. To achieve top performance it is necessary to partition the grid so as to balance workload and minimize communication/synchronization costs. This problem is particularly acute when the grid is irregular, changes over the course of the computation, and is not known until load time. Critical mapping and remapping decisions rest on the ability to accurately predict performance, given a description of a grid and its partition. This paper discusses one approach to this problem, and illustrates its use on a one-dimensional fluids code. The models constructed are shown to be accurate, and are used to find optimal remapping schedules.
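
    A performance model of the kind described can be as simple as charging each processor for its local work plus a per-boundary communication cost and taking the maximum over processors. The sketch below is a generic illustration of that idea with made-up cost constants; it is not the model developed in the paper.

        def predict_step_time(partition_sizes, work_per_cell=1.0e-6, comm_per_boundary=5.0e-5):
            """Predicted time for one step of a 1-D grid computation: each processor pays
            for its own cells plus one exchange per interior partition boundary."""
            p = len(partition_sizes)
            times = []
            for i, cells in enumerate(partition_sizes):
                boundaries = (i > 0) + (i < p - 1)
                times.append(cells * work_per_cell + boundaries * comm_per_boundary)
            return max(times)

        balanced = [2500, 2500, 2500, 2500]
        skewed = [4000, 3000, 2000, 1000]
        print(predict_step_time(balanced), predict_step_time(skewed))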

  15. Line gas sampling system ensures accurate analysis

    SciTech Connect

    Not Available

    1992-06-01

    Tremendous changes in the natural gas business have resulted in new approaches to the way natural gas is measured. Electronic flow measurement has altered the business forever, with developments in instrumentation and a new sensitivity to the importance of proper natural gas sampling techniques. This paper reports that YZ Industries Inc., Snyder, Texas, combined its 40 years of sampling experience with the latest in microprocessor-based technology to develop the KynaPak 2000 series, the first on-line natural gas sampling system that is both compact and extremely accurate. For the analysis to be accurate, the composition of the sampled gas must be representative of the whole and related to flow. If so, relative measurement and sampling techniques are married, gas volumes are accurately accounted for, and adjustments to composition can be made.

  16. Accurate mask model for advanced nodes

    NASA Astrophysics Data System (ADS)

    Zine El Abidine, Nacer; Sundermann, Frank; Yesilada, Emek; Ndiaye, El Hadji Omar; Mishra, Kushlendra; Paninjath, Sankaranarayanan; Bork, Ingo; Buck, Peter; Toublan, Olivier; Schanen, Isabelle

    2014-07-01

    Standard OPC models consist of a physical optical model and an empirical resist model. The resist model compensates for the optical model's imprecision on top of modeling resist development. The optical model imprecision may result from mask topography effects and real mask information, including mask e-beam writing and mask process contributions. For advanced technology nodes, significant progress has been made in modeling mask topography to improve optical model accuracy. However, mask information is difficult to decorrelate from the standard OPC model. Our goal is to establish an accurate mask model through a dedicated calibration exercise. In this paper, we present a flow to calibrate an accurate mask model, enabling its implementation. The study covers the different effects that should be embedded in the mask model as well as the experiments required to model them.

  17. Accurate and occlusion-robust multi-view stereo

    NASA Astrophysics Data System (ADS)

    Zhu, Zhaokun; Stamatopoulos, Christos; Fraser, Clive S.

    2015-11-01

    This paper proposes an accurate multi-view stereo method for image-based 3D reconstruction that features robustness in the presence of occlusions. The new method offers improvements in dealing with two fundamental image matching problems. The first concerns the selection of the support window model, while the second centers upon accurate visibility estimation for each pixel. The support window model is based on an approximate 3D support plane described by a depth and two per-pixel depth offsets. For the visibility estimation, the multi-view constraint is initially relaxed by generating separate support plane maps for each support image using a modified PatchMatch algorithm. Then the most likely visible support image, which represents the minimum visibility of each pixel, is extracted via a discrete Markov Random Field model and it is further augmented by parameter clustering. Once the visibility is estimated, multi-view optimization taking into account all redundant observations is conducted to achieve optimal accuracy in the 3D surface generation for both depth and surface normal estimates. Finally, multi-view consistency is utilized to eliminate any remaining observational outliers. The proposed method is experimentally evaluated using well-known Middlebury datasets, and results obtained demonstrate that it is amongst the most accurate of the methods thus far reported via the Middlebury MVS website. Moreover, the new method exhibits a high completeness rate.

  18. Accurate maser positions for MALT-45

    NASA Astrophysics Data System (ADS)

    Jordan, Christopher; Bains, Indra; Voronkov, Maxim; Lo, Nadia; Jones, Paul; Muller, Erik; Cunningham, Maria; Burton, Michael; Brooks, Kate; Green, James; Fuller, Gary; Barnes, Peter; Ellingsen, Simon; Urquhart, James; Morgan, Larry; Rowell, Gavin; Walsh, Andrew; Loenen, Edo; Baan, Willem; Hill, Tracey; Purcell, Cormac; Breen, Shari; Peretto, Nicolas; Jackson, James; Lowe, Vicki; Longmore, Steven

    2013-10-01

    MALT-45 is an untargeted survey, mapping the Galactic plane in CS (1-0), Class I methanol masers, SiO masers and thermal emission, and high frequency continuum emission. After obtaining images from the survey, a number of masers were detected, but without accurate positions. This project seeks to resolve each maser and its environment, with the ultimate goal of placing the Class I methanol maser into a timeline of high mass star formation.

  19. Accurate maser positions for MALT-45

    NASA Astrophysics Data System (ADS)

    Jordan, Christopher; Bains, Indra; Voronkov, Maxim; Lo, Nadia; Jones, Paul; Muller, Erik; Cunningham, Maria; Burton, Michael; Brooks, Kate; Green, James; Fuller, Gary; Barnes, Peter; Ellingsen, Simon; Urquhart, James; Morgan, Larry; Rowell, Gavin; Walsh, Andrew; Loenen, Edo; Baan, Willem; Hill, Tracey; Purcell, Cormac; Breen, Shari; Peretto, Nicolas; Jackson, James; Lowe, Vicki; Longmore, Steven

    2013-04-01

    MALT-45 is an untargeted survey, mapping the Galactic plane in CS (1-0), Class I methanol masers, SiO masers and thermal emission, and high frequency continuum emission. After obtaining images from the survey, a number of masers were detected, but without accurate positions. This project seeks to resolve each maser and its environment, with the ultimate goal of placing the Class I methanol maser into a timeline of high mass star formation.

  20. Dynamic Contrast-enhanced MR Imaging in Renal Cell Carcinoma: Reproducibility of Histogram Analysis on Pharmacokinetic Parameters.

    PubMed

    Wang, Hai-Yi; Su, Zi-Hua; Xu, Xiao; Sun, Zhi-Peng; Duan, Fei-Xue; Song, Yuan-Yuan; Li, Lu; Wang, Ying-Wei; Ma, Xin; Guo, Ai-Tao; Ma, Lin; Ye, Hui-Yi

    2016-01-01

    Pharmacokinetic parameters derived from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) have been increasingly used to evaluate the permeability of tumor vessels. Histogram metrics are a recognized promising method of quantitative MR imaging that has recently been introduced in the analysis of DCE-MRI pharmacokinetic parameters in oncology due to tumor heterogeneity. In this study, 21 patients with renal cell carcinoma (RCC) underwent paired DCE-MRI studies on a 3.0 T MR system. Extended Tofts model and population-based arterial input function were used to calculate kinetic parameters of RCC tumors. Mean value and histogram metrics (Mode, Skewness and Kurtosis) of each pharmacokinetic parameter were generated automatically using ImageJ software. Intra- and inter-observer reproducibility and scan-rescan reproducibility were evaluated using intra-class correlation coefficients (ICCs) and coefficient of variation (CoV). Our results demonstrated that the histogram method (Mode, Skewness and Kurtosis) was not superior to the conventional Mean value method in reproducibility evaluation on DCE-MRI pharmacokinetic parameters (Ktrans & Ve) in renal cell carcinoma, especially for Skewness and Kurtosis which showed lower intra-, inter-observer and scan-rescan reproducibility than Mean value. Our findings suggest that additional studies are necessary before wide incorporation of histogram metrics in quantitative analysis of DCE-MRI pharmacokinetic parameters. PMID:27380733
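
    The agreement statistics named above are straightforward to compute once per-scan parameter values are available. The sketch below shows the coefficient of variation together with moment-based skewness and excess kurtosis; the Ktrans values are illustrative numbers, not the study's data.

        import statistics

        def coefficient_of_variation(values):
            """CoV = standard deviation / mean, often quoted as a percentage."""
            return statistics.stdev(values) / statistics.fmean(values)

        def skewness(values):
            m, s = statistics.fmean(values), statistics.pstdev(values)
            return sum((v - m) ** 3 for v in values) / (len(values) * s ** 3)

        def excess_kurtosis(values):
            m, s = statistics.fmean(values), statistics.pstdev(values)
            return sum((v - m) ** 4 for v in values) / (len(values) * s ** 4) - 3.0

        ktrans_scan1 = [0.21, 0.24, 0.22, 0.27, 0.25]   # illustrative values only
        ktrans_scan2 = [0.20, 0.26, 0.23, 0.25, 0.26]
        print(round(coefficient_of_variation(ktrans_scan1 + ktrans_scan2) * 100, 1), "%")
        print(round(skewness(ktrans_scan1), 2), round(excess_kurtosis(ktrans_scan1), 2))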

  1. Dynamic Contrast-enhanced MR Imaging in Renal Cell Carcinoma: Reproducibility of Histogram Analysis on Pharmacokinetic Parameters

    PubMed Central

    Wang, Hai-yi; Su, Zi-hua; Xu, Xiao; Sun, Zhi-peng; Duan, Fei-xue; Song, Yuan-yuan; Li, Lu; Wang, Ying-wei; Ma, Xin; Guo, Ai-tao; Ma, Lin; Ye, Hui-yi

    2016-01-01

    Pharmacokinetic parameters derived from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) have been increasingly used to evaluate the permeability of tumor vessel. Histogram metrics are a recognized promising method of quantitative MR imaging that has been recently introduced in analysis of DCE-MRI pharmacokinetic parameters in oncology due to tumor heterogeneity. In this study, 21 patients with renal cell carcinoma (RCC) underwent paired DCE-MRI studies on a 3.0 T MR system. Extended Tofts model and population-based arterial input function were used to calculate kinetic parameters of RCC tumors. Mean value and histogram metrics (Mode, Skewness and Kurtosis) of each pharmacokinetic parameter were generated automatically using ImageJ software. Intra- and inter-observer reproducibility and scan–rescan reproducibility were evaluated using intra-class correlation coefficients (ICCs) and coefficient of variation (CoV). Our results demonstrated that the histogram method (Mode, Skewness and Kurtosis) was not superior to the conventional Mean value method in reproducibility evaluation on DCE-MRI pharmacokinetic parameters (K trans & Ve) in renal cell carcinoma, especially for Skewness and Kurtosis which showed lower intra-, inter-observer and scan-rescan reproducibility than Mean value. Our findings suggest that additional studies are necessary before wide incorporation of histogram metrics in quantitative analysis of DCE-MRI pharmacokinetic parameters. PMID:27380733

  2. Accurate phase-shift velocimetry in rock.

    PubMed

    Shukla, Matsyendra Nath; Vallatos, Antoine; Phoenix, Vernon R; Holmes, William M

    2016-06-01

    Spatially resolved Pulsed Field Gradient (PFG) velocimetry techniques can provide precious information concerning flow through opaque systems, including rocks. This velocimetry data is used to enhance flow models in a wide range of systems, from oil behaviour in reservoir rocks to contaminant transport in aquifers. Phase-shift velocimetry is the fastest way to produce velocity maps but critical issues have been reported when studying flow through rocks and porous media, leading to inaccurate results. Combining PFG measurements for flow through Bentheimer sandstone with simulations, we demonstrate that asymmetries in the molecular displacement distributions within each voxel are the main source of phase-shift velocimetry errors. We show that when flow-related average molecular displacements are negligible compared to self-diffusion ones, symmetric displacement distributions can be obtained while phase measurement noise is minimised. We elaborate a complete method for the production of accurate phase-shift velocimetry maps in rocks and low porosity media and demonstrate its validity for a range of flow rates. This development of accurate phase-shift velocimetry now enables more rapid and accurate velocity analysis, potentially helping to inform both industrial applications and theoretical models. PMID:27111139
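
    For background, phase-shift velocimetry maps mean displacement onto signal phase: in the narrow-gradient-pulse picture, spins moving with velocity v along a bipolar gradient of amplitude g, pulse duration \delta and separation \Delta accumulate a phase proportional to the first gradient moment. A standard textbook relation, not specific to this paper, is

        \phi \,=\, \gamma\, g\, \delta\, \Delta\, v,

    with \gamma the gyromagnetic ratio, so a velocity map follows from the measured phase once g, \delta and \Delta are known. The abstract's point is that asymmetric intra-voxel displacement distributions bias this phase-derived estimate.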

  3. Accurate phase-shift velocimetry in rock

    NASA Astrophysics Data System (ADS)

    Shukla, Matsyendra Nath; Vallatos, Antoine; Phoenix, Vernon R.; Holmes, William M.

    2016-06-01

    Spatially resolved Pulsed Field Gradient (PFG) velocimetry techniques can provide precious information concerning flow through opaque systems, including rocks. This velocimetry data is used to enhance flow models in a wide range of systems, from oil behaviour in reservoir rocks to contaminant transport in aquifers. Phase-shift velocimetry is the fastest way to produce velocity maps but critical issues have been reported when studying flow through rocks and porous media, leading to inaccurate results. Combining PFG measurements for flow through Bentheimer sandstone with simulations, we demonstrate that asymmetries in the molecular displacement distributions within each voxel are the main source of phase-shift velocimetry errors. We show that when flow-related average molecular displacements are negligible compared to self-diffusion ones, symmetric displacement distributions can be obtained while phase measurement noise is minimised. We elaborate a complete method for the production of accurate phase-shift velocimetry maps in rocks and low porosity media and demonstrate its validity for a range of flow rates. This development of accurate phase-shift velocimetry now enables more rapid and accurate velocity analysis, potentially helping to inform both industrial applications and theoretical models.

  5. Oral Reading Observation System Observer's Training Manual.

    ERIC Educational Resources Information Center

    Brady, Mary Ella; And Others

    A self-instructional program for use by teachers of the handicapped, this training manual was developed to teach accurate coding with the Oral Reading Observation System (OROS), an observation system designed to code teacher-pupil verbal interaction during oral reading instruction. The body of the manual is organized to correspond to the nine…

  6. Isotope and trace element proxies in sclerosponge skeletons: reproducibility and alteration through sampling

    NASA Astrophysics Data System (ADS)

    Kuhlmann, K.; Haase-Schramm, A.; Böhm, F.; Eisenhauer, A.; Joachimski, M. M.; Dullo, W.-C.

    2003-04-01

    During the last decade sclerosponge skeletons have been increasingly used as proxy recorders, e.g. for reconstructions of mixed layer temperature histories, variations of the carbon isotopic composition of seawater or trace metal input to the oceans. We investigated the influences of drilling and bleaching on the reproducibility of the most commonly used proxies in the skeletons of Ceratoporella nicholsoni: δ13C, δ18O and Sr/Ca ratios. We further compare proxy records from different specimens that were correlated by U-Th dating. We find good reproducibility for δ13C and Sr/Ca ratios. On the other hand, δ18O records show no reproducible trends and do not correlate with the Sr/Ca records. Bleaching alters the isotopic composition of the samples and decreases the reproducibility. Sr/Ca ratios are not affected by bleaching. XRD analysis shows that fast sample drilling in the dense aragonitic skeletons can produce up to about 1 percent of calcite. Isotope values from samples drilled with different drill speeds show no significant variation, even at elevated calcite contents. Analysis of the organic carbon content shows a 3 cm wide zone in the youngest part of the skeletons with slightly elevated values (0.25 percent). In the older skeletal parts organic carbon contents are lower (0.1 percent). X-ray radiographs show no porosity change with increasing age of the skeleton except for a thin (<5 mm) rim with higher porosities below the oral surface. Overall, porosities are very low (<4 percent). These observations largely exclude early diagenetic influences through secondary cementation or contamination by organic carbon phases. We conclude that the skeletons of C. nicholsoni are very well suited as recorders of environmental proxies like Sr/Ca and δ13C.

  7. Influencing Factors on Reproducibility and Stability of MRI NIPAM Polymer Gel Dosimeter

    PubMed Central

    Pak, Farideh; Farajollahi, Alireza; Movafaghi, Ali; Naseri, Alireza

    2013-01-01

    Introduction: At present, the polymer gel dosimeter is considered to be the best possible dosimeter for measuring 3-dimensional radiation dose distribution in radiotherapy. These gels are normally toxic; therefore, manufacturing, handling and discarding them require special attention. In order to find a less toxic recipe, the N-isopropylacrylamide polymer gel (NIPAM) was introduced. In this study, the reproducibility and stability of the NIPAM polymer gel dose response, together with some influencing factors related to MR imaging, were studied. Methods: The NIPAM gel was prepared according to a method described by Senden et al. in 2006. The gels were irradiated approximately 2 h after manufacturing and MR images of the gel were made 24 h after irradiation. The effects of different batches, post-irradiation time and the MRI room temperature on reproducibility and stability of the polymer gel dose response were explored by analyzing the NMR response (R2) of the gel. Results: At a fixed temperature, the response of the gel was found to be stable 24 h after irradiation. The results showed that the dose response of the NIPAM polymer gel is highly reproducible within and between batches of chemicals. No inhomogeneity was observed for magnetic fields in the specified position of measurements, and a 5°C fluctuation was recorded for the MRI room temperature. Conclusion: Fluctuation in MRI room temperature necessitates that stringent attention be paid to controlling the gel temperature at the time of imaging. The new formulation of polymer gel ensures stability of the gels’ spatial resolution and makes it a suitable dosimeter for distant or remote measurements. PMID:24455479

  8. Short-Term Displacement and Reproducibility of the Breast and Nodal Targets Under Active Breathing Control

    SciTech Connect

    Moran, Jean M. E-mail: jmmoran@med.umich.edu; Balter, James M.; Ben-David, Merav A.; Marsh, Robin B. C; Herk, Marcel van; Pierce, Lori J.

    2007-06-01

    Purpose: The short-term displacement and reproducibility of the breast or chest wall, and the internal mammary (IM), infraclavicular (ICV), and supraclavicular (SCV) nodal regions have been assessed as a function of breath-hold state using an active breathing control (ABC) device for patients receiving loco-regional breast radiation therapy. Methods and Materials: Ten patients underwent computed tomographic scanning using an ABC device at breath-hold states of end-exhale and 20%, 40%, 60%, and 80% of vital capacity (VC). Patients underwent scanning before treatment and at one third and two thirds of the way through treatment. A regional registration was performed for each target using a rigid-body transformation with mutual information as a metric. Results: Between exhale and 40% of VC, the mean displacement was 0.27/0.34, 0.24/0.31, 0.22/0.19, and 0.13/0.19 cm anterior/superior for the breast or chest wall, and IM, ICV, and SCV nodes, respectively. At 80% of VC, the mean displacement from exhale was 0.84/0.88, 0.76/0.79, 0.70/0.79, and 0.54/0.56 cm anterior/superior for the breast or chest wall, and IM, ICV, and SCV nodes, respectively. The short-term reproducibility (standard deviation) was <0.3 and ≤0.4 cm for 40% and 80% of VC, respectively. Displacements up to 1.9 cm were observed for individual patients. Conclusions: The short-term reproducibility of target position is ≤0.4 cm using ABC for all structures for all breath-hold states. This information can be used to guide treatment planning optimization studies that consider the effect of motion on target and normal tissue doses with and without active breathing control.

  9. [Reproducibility of cytologic diagnosis: study of CRISAP Ile-de-France].

    PubMed

    Barrès, D; Bergeron, C

    2000-02-01

    The cervical Pap smear is the method of choice for early detection of precancerous cervical lesions. However, the sensitivity of the Pap smear is not perfect, nor is the specificity, in particular for minor atypia; quality control is the best method to improve the efficiency of the test. The "Centre de regroupement informatique et statistique des données en anatomie pathologique" (CRISAP) Ile-de-France has initiated an external quality control with a protocol aimed at assessing diagnosis reproducibility among observers. Thirteen pathologists agreed to participate on a voluntary basis for this protocol, which consisted of the rereading of 650 slides chosen at random. Each participant reread 50 cases and sent 50 cases to be reread anonymously. The diagnosis was given according to the Bethesda classification. The reproducibility was assessed using the percentage of agreement and kappa statistics. The percentage of agreement for the whole group was 65% and the weighted kappa 0.66. When normal and unsatisfactory cases were combined as negative and atypical squamous cells of undetermined significance (ASCUS), low-grade squamous intraepithelial lesion (LGSIL) and high-grade squamous intraepithelial lesion (HGSIL) as positive, the percentage of agreement was 83% and kappa 0.66. When normal, unsatisfactory and ASCUS were combined as negative and LGSIL and HGSIL as positive, the percentage of agreement rose to 90% and kappa to 0.76. This attempt at assessing the reproducibility of cytologic diagnosis using an informal method led to good participation among the pathologists. Fairly good agreement for overall cytologic diagnosis was found, though some variability remained concerning the diagnosis of unsatisfactory cases and those labelled as ASCUS. These results should lead the participants to reflect upon and eventually reassess their criteria on these specific cases.
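
    The kappa statistic quoted above corrects the raw percentage agreement for the agreement expected by chance alone. A minimal sketch of unweighted Cohen's kappa computed from a two-rater confusion matrix follows; the table entries are illustrative and are not the CRISAP data.

        def cohens_kappa(confusion):
            """Unweighted Cohen's kappa for two raters; confusion[i][j] counts
            cases rated category i by rater A and category j by rater B."""
            n = len(confusion)
            total = sum(sum(row) for row in confusion)
            p_observed = sum(confusion[i][i] for i in range(n)) / total
            row_tot = [sum(row) for row in confusion]
            col_tot = [sum(confusion[i][j] for i in range(n)) for j in range(n)]
            p_chance = sum(r * c for r, c in zip(row_tot, col_tot)) / total ** 2
            return (p_observed - p_chance) / (1.0 - p_chance)

        # Illustrative 2x2 table: negative vs positive calls by two pathologists.
        table = [[40, 5],
                 [4, 11]]
        print(round(cohens_kappa(table), 2))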

  10. On the reproducibility of protein crystal structures: five atomic resolution structures of trypsin

    PubMed Central

    Liebschner, Dorothee; Dauter, Miroslawa; Brzuszkiewicz, Anna; Dauter, Zbigniew

    2013-01-01

    Structural studies of proteins usually rely on a model obtained from one crystal. By investigating the details of this model, crystallographers seek to obtain insight into the function of the macromolecule. It is therefore important to know which details of a protein structure are reproducible or to what extent they might differ. To address this question, the high-resolution structures of five crystals of bovine trypsin obtained under analogous conditions were compared. Global parameters and structural details were investigated. All of the models were of similar quality and the pairwise merged intensities had large correlation coefficients. The Cα and backbone atoms of the structures superposed very well. The occupancy of ligands in regions of low thermal motion was reproducible, whereas solvent molecules containing heavier atoms (such as sulfur) or those located on the surface could differ significantly. The coordination lengths of the calcium ion were conserved. A large proportion of the multiple conformations refined to similar occupancies and the residues adopted similar orientations. More than three quarters of the water-molecule sites were conserved within 0.5 Å and more than one third were conserved within 0.1 Å. An investigation of the protonation states of histidine residues and carboxylate moieties was consistent for all of the models. Radiation-damage effects to disulfide bridges were observed for the same residues and to similar extents. Main-chain bond lengths and angles averaged to similar values and were in agreement with the Engh and Huber targets. Other features, such as peptide flips and the double conformation of the inhibitor molecule, were also reproducible in all of the trypsin structures. Therefore, many details are similar in models obtained from different crystals. However, several features of residues or ligands located in flexible parts of the macromolecule may vary significantly, such as side-chain orientations and the occupancies

  11. A reproducible grading scale for histological assessment of inflammation in ulcerative colitis

    PubMed Central

    Geboes, K; Riddell, R; Ost, A; Jensfelt, B; Persson, T; Lofberg, R

    2000-01-01

    BACKGROUND—Evaluation of histological activity in ulcerative colitis needs to be reproducible but has rarely been tested. This could be useful both clinically and in clinical trials.
AIM—To develop reproducible criteria which are valid in the assessment of acute inflammation (activity) and chronicity, and to evaluate these features in an interobserver variability study.
METHODS—A six grade classification system for inflammation was developed which could also be fine tuned within each grade. The grades were: 0, structural change only; 1, chronic inflammation; 2, lamina propria neutrophils; 3, neutrophils in epithelium; 4, crypt destruction; and 5, erosions or ulcers. Ninety nine haematoxylin-eosin sections from endoscopically inflamed and non-inflamed mucosa from patients with distal ulcerative colitis were assessed in two separate readings by three pathologists independently and without knowledge of the clinical status. Interobserver agreement was compared pairwise using kappa statistics.
RESULTS—Initially, kappa values between the observers were 0.20, 0.42, and 0.26, which are too low to be of value. Following development of a semiquantitative pictorial scale for each criterion, kappa values improved to 0.62, 0.70, and 0.59. For activity defined by neutrophils between epithelial cells, kappa values were 0.903, 1.000, and 0.907. Complete agreement was reached in 64% of samples of endoscopically normal and in 66% of endoscopically inflamed tissue. Neutrophils in epithelium correlated with the presence of crypt destruction and ulceration.
CONCLUSION—A histological activity system was developed for ulcerative colitis that showed good reproducibility and modest agreement with the endoscopic grading system which it complemented. It has potential value both clinically and in clinical trials.


Keywords: ulcerative colitis; biopsy; inflammation; scoring system PMID:10940279

  12. Personalized, Shareable Geoscience Dataspaces For Simplifying Data Management and Improving Reproducibility

    NASA Astrophysics Data System (ADS)

    Malik, T.; Foster, I.; Goodall, J. L.; Peckham, S. D.; Baker, J. B. H.; Gurnis, M.

    2015-12-01

    Research activities are iterative, collaborative, and now data- and compute-intensive. As a result, even the many researchers who work in small laboratories must often create, acquire, manage, and manipulate large volumes of diverse data and keep track of complex software. They face difficult data and software management challenges, and data sharing and reproducibility are neglected. There is significant federal investment in powerful cyberinfrastructure, in part to lessen the burden associated with modern data- and compute-intensive research. Similarly, geoscience communities are establishing research repositories to facilitate data preservation. Yet we observe that a large fraction of the geoscience community continues to struggle with data and software management. The reason, studies suggest, is not lack of awareness but rather that tools do not adequately support time-consuming data life cycle activities. Through the NSF/EarthCube-funded GeoDataspace project, we are building personalized, shareable dataspaces that help scientists connect their individual or research group efforts with the community at large. The dataspaces provide a lightweight multiplatform research data management system with tools for recording research activities in what we call geounits, so that a geoscientist can at any time snapshot and preserve, both for their own use and to share with the community, all data and code required to understand and reproduce a study. A software-as-a-service (SaaS) deployment model enhances usability of core components and integration with widely used software systems. In this talk we will present the open-source GeoDataspace project and demonstrate how it is enabling reproducibility across the geoscience domains of hydrology, space science, and modeling toolkits.

  13. Reproducible increases in blood pressure during intermittent noise exposure: underlying haemodynamic mechanisms specific to passive coping.

    PubMed

    Sawada, Y

    1993-01-01

    The purpose of the present study was to investigate the reproducibility of the increases in blood pressure found in our recent study on exposure to intermittent noise, to confirm the haemodynamic mechanism raising blood pressure (via an increase in peripheral vascular resistance expected to be specific to passive coping), and to assess baroreceptor cardiac reflex sensitivity in connection with the blood pressure elevation. A group of 16 young normotensive men participated in the experiment and underwent a 10-min intermittent exposure to pink noise at 100 dB (sound pressure level). The subjects also underwent three other stresses: a 1-min cold pressor test, a 3-min isometric handgrip and 3-min of mental arithmetic. The results indicated that blood pressure was elevated reproducibly for most of the noise exposure periods and that peripheral vascular resistance increased simultaneously, as expected. Baroreflex sensitivity was not suppressed. The results, as a whole, were in agreement with our recent findings for exposure to a similar type of noise and thus the reproducibility was corroborated. The mechanism raising blood pressure was similar in the cold pressor test. Conversely, during the isometric handgrip and mental arithmetic, blood pressure elevations were attributable mainly to increases in cardiac output. The implications of the opposing haemodynamic mechanisms raising blood pressure among the four stressful tasks have been discussed in relation to active versus passive coping required for each task. Differences in the magnitude of suppression observed in baroreflex sensitivity among the tasks have also been discussed in the context of defence reactions. PMID:8299606

  14. Reproducibility of Brain Morphometry from Short-Term Repeat Clinical MRI Examinations: A Retrospective Study

    PubMed Central

    Liu, Hon-Man; Chen, Shan-Kai; Chen, Ya-Fang; Lee, Chung-Wei; Yeh, Lee-Ren

    2016-01-01

    Purpose To assess the inter-session reproducibility of automatically segmented MRI-derived measures by FreeSurfer in a group of subjects with normal-appearing MR images. Materials and Methods After retrospectively reviewing a brain MRI database from our institute consisting of 14,758 adults, those subjects who had repeat scans and had no history of neurodegenerative disorders were selected for morphometry analysis using FreeSurfer. A total of 34 subjects were grouped by MRI scanner model. After automatic segmentation using FreeSurfer, label-wise comparison (involving area, thickness, and volume) was performed on all segmented results. An intraclass correlation coefficient was used to estimate the agreement between sessions. A Wilcoxon signed rank test was used to assess the population mean rank differences across sessions. Mean-difference analysis was used to evaluate the difference intervals across scanners. Absolute percent difference was used to estimate the reproducibility errors across the MRI models. A Kruskal-Wallis test was used to determine the across-scanner effect. Results The agreement in segmentation results for area, volume, and thickness measurements of all segmented anatomical labels was generally higher in Signa Excite and Verio models when compared with Sonata and TrioTim models. There were significant rank differences found across sessions in some labels of different measures. Smaller difference intervals in global volume measurements were noted on images acquired by Signa Excite and Verio models. For some brain regions, significant MRI model effects were observed on certain segmentation results. Conclusions Short-term scan-rescan reliability of automatic brain MRI morphometry is feasible in the clinical setting. However, since repeatability of software performance is contingent on the reproducibility of the scanner performance, the scanner performance must be calibrated before conducting such studies or before using such software for retrospective
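
    Of the agreement measures listed above, the absolute percent difference between two sessions is the simplest to state. The sketch below shows it for a single structure; the volumes are illustrative values, not data from the study.

        def absolute_percent_difference(v1, v2):
            """|v1 - v2| relative to the session mean, expressed as a percentage."""
            return abs(v1 - v2) / ((v1 + v2) / 2.0) * 100.0

        # Illustrative hippocampal volumes (mm^3) from two scan sessions of one subject.
        session1, session2 = 4125.0, 4068.0
        print(round(absolute_percent_difference(session1, session2), 2), "%")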

  15. Developmental pesticide exposure reproduces features of attention deficit hyperactivity disorder

    PubMed Central

    Richardson, Jason R.; Taylor, Michele M.; Shalat, Stuart L.; Guillot, Thomas S.; Caudle, W. Michael; Hossain, Muhammad M.; Mathews, Tiffany A.; Jones, Sara R.; Cory-Slechta, Deborah A.; Miller, Gary W.

    2015-01-01

    Attention-deficit hyperactivity disorder (ADHD) is estimated to affect 8–12% of school-age children worldwide. ADHD is a complex disorder with significant genetic contributions. However, no single gene has been linked to a significant percentage of cases, suggesting that environmental factors may contribute to ADHD. Here, we used behavioral, molecular, and neurochemical techniques to characterize the effects of developmental exposure to the pyrethroid pesticide deltamethrin. We also used epidemiologic methods to determine whether there is an association between pyrethroid exposure and diagnosis of ADHD. Mice exposed to the pyrethroid pesticide deltamethrin during development exhibit several features reminiscent of ADHD, including elevated dopamine transporter (DAT) levels, hyperactivity, working memory and attention deficits, and impulsive-like behavior. Increased DAT and D1 dopamine receptor levels appear to be responsible for the behavioral deficits. Epidemiologic data reveal that children aged 6–15 with detectable levels of pyrethroid metabolites in their urine were more than twice as likely to be diagnosed with ADHD. Our epidemiologic finding, combined with the recapitulation of ADHD behavior in pesticide-treated mice, provides a mechanistic basis to suggest that developmental pyrethroid exposure is a risk factor for ADHD.—Richardson, J. R., Taylor, M. M., Shalat, S. L., Guillot III, T. S., Caudle, W. M., Hossain, M. M., Mathews, T. A., Jones, S. R., Cory-Slechta, D. A., Miller, G. W. Developmental pesticide exposure reproduces features of attention deficit hyperactivity disorder. PMID:25630971

  16. How to Obtain Reproducible Results for Lithium Sulfur Batteries

    SciTech Connect

    Zheng, Jianming; Lu, Dongping; Gu, Meng; Wang, Chong M.; Zhang, Jiguang; Liu, Jun; Xiao, Jie

    2013-01-01

    The basic requirements for obtaining reliable Li-S battery data are discussed in this work. Unlike Li-ion batteries, the electrolyte-rich environment significantly affects the cycling stability of Li-S batteries prepared and tested under otherwise identical conditions. This has been attributed to the different concentrations of polysulfide-containing electrolytes in the cells, which profoundly influence both the sulfur cathode and the lithium anode. At an optimized sulfur-to-electrolyte (S/E) ratio of 50 g L-1, a good balance among electrolyte viscosity, wetting ability, diffusion rate of dissolved polysulfides, and nucleation/growth of short-chain Li2S/Li2S2 is established, along with largely reduced contamination on the lithium anode side. Accordingly, good cyclability, high reversible capacity, and high Coulombic efficiency are achieved in Li-S cells with a controlled S/E ratio without any additive. Other factors, such as the sulfur content in the composite and the sulfur loading on the electrode, also require careful attention in the Li-S system in order to generate reproducible results and to gauge the various methods used to improve Li-S battery technology.

  17. The rapid reproducers paradox: population control and individual procreative rights.

    PubMed

    Wissenburg, M

    1998-01-01

    This article argues that population policies need to be evaluated from macro and micro perspectives and to consider individual rights. Ecological arguments that are stringent conditions of liberal democracy are assessed against a moral standard. The moral standard is applied to a series of reasons for limiting procreative rights in the cause of sustainability. The focus is directly on legally enforced antinatalist measures and not on indirect policies with incentives and disincentives. The explicit assumption is that population policy violates fairness to individuals for societal gain and that population policies are incompatible with stringent conditions of liberal democracy. The author identifies the individual-societal tradeoff as the "rapid reproducers paradox." The perfect sustainable population level is either not possible or is a repugnant alternative. Twelve ecological arguments are presented, and none is found compatible with notions of a liberal democracy. Three alternative antinatalist options are the acceptance of less rigid but still coercive policies, amendments to the conception of liberal democracy, or loss of hope and the choice of noncoercive solutions to sustainability, none of which is found viable. If voluntary abstinence and distributive solutions fail, then frugal demand options and technological supply options will both be necessary.

  18. A silicon retina that reproduces signals in the optic nerve.

    PubMed

    Zaghloul, Kareem A; Boahen, Kwabena

    2006-12-01

    Prosthetic devices may someday be used to treat lesions of the central nervous system. Similar to neural circuits, these prosthetic devices should adapt their properties over time, independent of external control. Here we describe an artificial retina, constructed in silicon using single-transistor synaptic primitives, with two forms of locally controlled adaptation: luminance adaptation and contrast gain control. Both forms of adaptation rely on local modulation of synaptic strength, thus meeting the criteria of internal control. Our device is the first to reproduce the responses of the four major ganglion cell types that drive visual cortex, producing 3600 spiking outputs in total. We demonstrate how the responses of our device's ganglion cells compare to those measured from the mammalian retina. Replicating the retina's synaptic organization in our chip made it possible to perform these computations using a hundred times less energy than a microprocessor, and to match the mammalian retina in size and weight. With this level of efficiency and autonomy, it is now possible to develop fully implantable intraocular prostheses.

  19. Reproducing Natural Spider Silks’ Copolymer Behavior in Synthetic Silk Mimics

    PubMed Central

    An, Bo; Jenkins, Janelle E.; Sampath, Sujatha; Holland, Gregory P.; Hinman, Mike; Yarger, Jeffery L.; Lewis, Randolph

    2012-01-01

    Dragline silk from orb-weaving spiders is a copolymer of two large proteins, major ampullate spidroin 1 (MaSp1) and 2 (MaSp2). The ratio of these proteins is known to have a large variation across different species of orb-weaving spiders. NMR results from gland material of two different species of spiders, N. clavipes and A. aurantia, indicate that MaSp1 proteins are more easily formed into β-sheet nanostructures, while MaSp2 proteins form random coil and helical structures. To test if this behavior of natural silk proteins could be reproduced by recombinantly produced spider silk mimic protein, recombinant MaSp1/MaSp2 mixed fibers as well as chimeric silk fibers from MaSp1 and MaSp2 sequences in a single protein were produced based on the variable ratio and conserved motifs of MaSp1 and MaSp2 in native silk fiber. Mechanical properties, solid-state NMR, and XRD results of tested synthetic fibers indicate the differing roles of MaSp1 and MaSp2 in the fiber and verify the importance of postspin stretching treatment in helping the fiber to form the proper spatial structure. PMID:23110450

  20. Repeatability and reproducibility of aquatic testing with zinc dithiophosphate

    SciTech Connect

    Hooter, D.L.; Hoke, D.I.; Kraska, R.C.; Wojewodka, R.A.

    1994-12-31

    This testing program was designed to characterize the repeatability and reproducibility of aquatic screening studies with a water-insoluble chemical substance. Zinc dithiophosphate was selected for its limited water solubility and moderate aquatic toxicity. Acute tests were conducted using fathead minnows and Daphnia magna, according to guidelines developed to minimize random sources of non-repeatability. The organisms were exposed to zinc dithiophosphate in static tests, using an oil-water dispersion method for the fathead minnows and a water-accommodated-fraction (WAF) method for the Daphnia magna. Testing was conducted in moderately hard water at predetermined nominal concentrations of 0.1, 1.0, 10.0, 100.0, and 1000.0 ppm or ppm WAF. Twenty-four studies were contracted among three separate commercial contract laboratories. The program results demonstrate the diverse range of intralaboratory and interlaboratory variability depending on the organism type, and emphasize the need for further study and for caution in the design and implementation of aquatic testing for insoluble materials.

  1. Diet rapidly and reproducibly alters the human gut microbiome.

    PubMed

    David, Lawrence A; Maurice, Corinne F; Carmody, Rachel N; Gootenberg, David B; Button, Julie E; Wolfe, Benjamin E; Ling, Alisha V; Devlin, A Sloan; Varma, Yug; Fischbach, Michael A; Biddinger, Sudha B; Dutton, Rachel J; Turnbaugh, Peter J

    2014-01-23

    Long-term dietary intake influences the structure and activity of the trillions of microorganisms residing in the human gut, but it remains unclear how rapidly and reproducibly the human gut microbiome responds to short-term macronutrient change. Here we show that the short-term consumption of diets composed entirely of animal or plant products alters microbial community structure and overwhelms inter-individual differences in microbial gene expression. The animal-based diet increased the abundance of bile-tolerant microorganisms (Alistipes, Bilophila and Bacteroides) and decreased the levels of Firmicutes that metabolize dietary plant polysaccharides (Roseburia, Eubacterium rectale and Ruminococcus bromii). Microbial activity mirrored differences between herbivorous and carnivorous mammals, reflecting trade-offs between carbohydrate and protein fermentation. Foodborne microbes from both diets transiently colonized the gut, including bacteria, fungi and even viruses. Finally, increases in the abundance and activity of Bilophila wadsworthia on the animal-based diet support a link between dietary fat, bile acids and the outgrowth of microorganisms capable of triggering inflammatory bowel disease. In concert, these results demonstrate that the gut microbiome can rapidly respond to altered diet, potentially facilitating the diversity of human dietary lifestyles.

  2. Virtual Raters for Reproducible and Objective Assessments in Radiology.

    PubMed

    Kleesiek, Jens; Petersen, Jens; Döring, Markus; Maier-Hein, Klaus; Köthe, Ullrich; Wick, Wolfgang; Hamprecht, Fred A; Bendszus, Martin; Biller, Armin

    2016-04-27

    Volumetric measurements in radiologic images are important for monitoring tumor growth and treatment response. To make these more reproducible and objective, we introduce the concept of virtual raters (VRs). A virtual rater is obtained by combining the knowledge of machine-learning algorithms trained on past annotations of multiple human raters with the instantaneous rating of one human expert. The human expert is thus virtually guided by several experts. To evaluate the approach we perform experiments with multi-channel magnetic resonance imaging (MRI) data sets. Besides gross tumor volume (GTV), we also investigate subcategories such as edema, contrast-enhancing and non-enhancing tumor. The first data set consists of N = 71 longitudinal follow-up scans of 15 patients suffering from glioblastoma (GB). The second data set comprises N = 30 scans of low- and high-grade gliomas. For comparison we computed the Pearson correlation, intra-class correlation coefficient (ICC) and Dice score. Virtual raters always lead to an improvement in inter- and intra-rater agreement. Comparing the 2D Response Assessment in Neuro-Oncology (RANO) measurements to the volumetric measurements of the virtual raters results in a deviating rating in one-third of the cases. Hence, we believe that our approach will have an impact on the evaluation of clinical studies as well as on routine imaging diagnostics.
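
    For readers unfamiliar with the agreement metrics listed in this abstract, the sketch below computes a Dice overlap for two hypothetical binary tumor masks and a Pearson correlation for two hypothetical volume series; it illustrates the metrics only and is not the authors' virtual-rater algorithm.

    ```python
    # Illustrative sketch only (not the virtual-rater algorithm): two of the
    # agreement metrics named above, for hypothetical masks and volumes.
    import numpy as np
    from scipy.stats import pearsonr

    def dice(mask_a, mask_b):
        """Dice overlap of two boolean segmentation masks."""
        inter = np.logical_and(mask_a, mask_b).sum()
        return 2.0 * inter / (mask_a.sum() + mask_b.sum())

    rng = np.random.default_rng(1)
    ref = rng.random((64, 64, 32)) > 0.7          # hypothetical reference mask
    rater = ref.copy()
    flip = rng.random(ref.shape) > 0.98           # perturb ~2% of voxels
    rater[flip] = ~rater[flip]

    vols_rater1 = rng.normal(30, 10, 15)          # hypothetical GTVs (mL)
    vols_rater2 = vols_rater1 + rng.normal(0, 2, 15)

    print(f"Dice = {dice(ref, rater):.3f}")
    print(f"Pearson r of volumes = {pearsonr(vols_rater1, vols_rater2)[0]:.3f}")
    ```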

  3. Reproducibility of tactile assessments for children with unilateral cerebral palsy.

    PubMed

    Auld, Megan Louise; Ware, Robert S; Boyd, Roslyn Nancy; Moseley, G Lorimer; Johnston, Leanne Marie

    2012-05-01

    A systematic review identified tactile assessments used in children with cerebral palsy (CP), but their reproducibility is unknown. Sixteen children with unilateral CP and 31 typically developing children (TDC) were assessed 2-4 weeks apart. Test-retest percent agreements within one point for children with unilateral CP (and TDC) were Semmes-Weinstein monofilaments: 75% (90%); single-point localization: 69% (97%); static two-point discrimination: 93% (97%); and moving two-point discrimination: 87% (97%). Test-retest reliability for registration and unilateral spatial tactile perception tests was high in children with CP (intraclass correlation coefficient [ICC] = 0.79-0.96). Two tests demonstrated a learning effect for children with CP, double simultaneous and tactile texture perception. Stereognosis had a ceiling effect for TDC (ICC = 0) and variability for children with CP (% exact agreement = 47%-50%). The Semmes-Weinstein monofilaments, single-point localization, and both static and moving two-point discrimination are recommended for use in practice and research. Although recommended to provide a comprehensive assessment, the measures of double simultaneous, stereognosis, and tactile texture perception may not be responsive to change over time in children with unilateral CP.

  4. Magnetofection: A Reproducible Method for Gene Delivery to Melanoma Cells

    PubMed Central

    Prosen, Lara; Prijic, Sara; Music, Branka; Lavrencak, Jaka; Cemazar, Maja; Sersa, Gregor

    2013-01-01

    Magnetofection is a nanoparticle-mediated approach for transfection of cells, tissues, and tumors. Of specific interest is the use of superparamagnetic iron oxide nanoparticles (SPIONs) as a delivery system for therapeutic genes. Magnetofection has already been described in some proof-of-principle studies; however, fine tuning of the synthesis of SPIONs is necessary for its broader application. The physicochemical properties of SPIONs, synthesized by co-precipitation in an alkaline aqueous medium, were tested after varying different parameters of the synthesis procedure. The storage time of the iron(II) sulfate salt, the type of purified water, and the synthesis temperature did not affect the physicochemical properties of the SPIONs. Varying the parameters of the synthesis procedure also did not influence magnetofection efficacy. However, for pronounced expression of the gene encoded by the plasmid DNA, it was crucial to functionalize poly(acrylic) acid-stabilized SPIONs (SPIONs-PAA) with polyethyleneimine (PEI) without adjusting the inherently alkaline pH of its aqueous solution to physiological pH. In conclusion, the co-precipitation of iron(II) and iron(III) sulfate salts with subsequent PAA stabilization, PEI functionalization, and plasmid DNA binding is a robust method resulting in reproducible and efficient magnetofection. Important for achieving high gene expression, however, is the pH of the PEI aqueous solution used for SPIONs-PAA functionalization, which should be in the alkaline range. PMID:23862136

  5. Reproducibility of Vibrionaceae population structure in coastal bacterioplankton

    PubMed Central

    Szabo, Gitta; Preheim, Sarah P; Kauffman, Kathryn M; David, Lawrence A; Shapiro, Jesse; Alm, Eric J; Polz, Martin F

    2013-01-01

    How reproducibly microbial populations assemble in the wild remains poorly understood. Here, we assess evidence for ecological specialization and predictability of fine-scale population structure and habitat association in coastal ocean Vibrionaceae across years. We compare Vibrionaceae lifestyles in the bacterioplankton (combinations of free-living, particle, or zooplankton associations) measured using the same sampling scheme in 2006 and 2009 to assess whether the same groups show the same environmental association year after year. This reveals complex dynamics with populations falling primarily into two categories: (i) nearly equally represented in each of the two samplings and (ii) highly skewed, often to an extent that they appear exclusive to one or the other sampling times. Importantly, populations recovered at the same abundance in both samplings occupied highly similar habitats suggesting predictable and robust environmental association while skewed abundances of some populations may be triggered by shifts in ecological conditions. The latter is supported by difference in the composition of large eukaryotic plankton between years, with samples in 2006 being dominated by copepods, and those in 2009 by diatoms. Overall, the comparison supports highly predictable population-habitat linkage but highlights the fact that complex, and often unmeasured, environmental dynamics in habitat occurrence may have strong effects on population dynamics. PMID:23178668

  6. Reproducibility of cold provocation in patients with Raynaud's phenomenon.

    PubMed

    Wigley, F M; Malamet, R; Wise, R A

    1987-08-01

    Twenty-five patients with Raynaud's phenomenon had serial cold challenges during a double blinded drug trial. The data were analyzed to determine the reproducibility of cold provocation in the induction of critical closure of the digital artery in patients with Raynaud's phenomenon. Finger systolic pressure (FSP) was measured after local digital cooling using a digital strain gauge placed around the distal phalanx. Nineteen of 25 patients completed the study. The prevalence of inducing a Raynaud's attack decreased with each successive cold challenge from 74% of patients at initial challenge to 42% at the 3rd challenge. A lower temperature was required to induce a Raynaud's attack at last challenge (10.6 +/- 0.6 degrees C) compared to the first cold challenge (13.2 +/- 1.0 degrees C). Our data demonstrate adaptation to a laboratory cold challenge through the winter months in patients with Raynaud's phenomenon and show it is an important factor in objectively assessing drug efficacy in the treatment of Raynaud's phenomenon.

  7. Reproducing natural spider silks' copolymer behavior in synthetic silk mimics.

    PubMed

    An, Bo; Jenkins, Janelle E; Sampath, Sujatha; Holland, Gregory P; Hinman, Mike; Yarger, Jeffery L; Lewis, Randolph

    2012-12-10

    Dragline silk from orb-weaving spiders is a copolymer of two large proteins, major ampullate spidroin 1 (MaSp1) and 2 (MaSp2). The ratio of these proteins is known to have a large variation across different species of orb-weaving spiders. NMR results from gland material of two different species of spiders, N. clavipes and A. aurantia, indicate that MaSp1 proteins are more easily formed into β-sheet nanostructures, while MaSp2 proteins form random coil and helical structures. To test if this behavior of natural silk proteins could be reproduced by recombinantly produced spider silk mimic protein, recombinant MaSp1/MaSp2 mixed fibers as well as chimeric silk fibers from MaSp1 and MaSp2 sequences in a single protein were produced based on the variable ratio and conserved motifs of MaSp1 and MaSp2 in native silk fiber. Mechanical properties, solid-state NMR, and XRD results of tested synthetic fibers indicate the differing roles of MaSp1 and MaSp2 in the fiber and verify the importance of postspin stretching treatment in helping the fiber to form the proper spatial structure. PMID:23110450

  8. Reproducing Natural Spider Silks' Copolymer Behavior in Synthetic Silk Mimics

    SciTech Connect

    An, Bo; Jenkins, Janelle E; Sampath, Sujatha; Holland, Gregory P; Hinman, Mike; Yarger, Jeffery L; Lewis, Randolph

    2012-10-30

    Dragline silk from orb-weaving spiders is a copolymer of two large proteins, major ampullate spidroin 1 (MaSp1) and 2 (MaSp2). The ratio of these proteins is known to have a large variation across different species of orb-weaving spiders. NMR results from gland material of two different species of spiders, N. clavipes and A. aurantia, indicate that MaSp1 proteins are more easily formed into β-sheet nanostructures, while MaSp2 proteins form random coil and helical structures. To test if this behavior of natural silk proteins could be reproduced by recombinantly produced spider silk mimic protein, recombinant MaSp1/MaSp2 mixed fibers as well as chimeric silk fibers from MaSp1 and MaSp2 sequences in a single protein were produced based on the variable ratio and conserved motifs of MaSp1 and MaSp2 in native silk fiber. Mechanical properties, solid-state NMR, and XRD results of tested synthetic fibers indicate the differing roles of MaSp1 and MaSp2 in the fiber and verify the importance of postspin stretching treatment in helping the fiber to form the proper spatial structure.

  9. A silicon retina that reproduces signals in the optic nerve

    NASA Astrophysics Data System (ADS)

    Zaghloul, Kareem A.; Boahen, Kwabena

    2006-12-01

    Prosthetic devices may someday be used to treat lesions of the central nervous system. Similar to neural circuits, these prosthetic devices should adapt their properties over time, independent of external control. Here we describe an artificial retina, constructed in silicon using single-transistor synaptic primitives, with two forms of locally controlled adaptation: luminance adaptation and contrast gain control. Both forms of adaptation rely on local modulation of synaptic strength, thus meeting the criteria of internal control. Our device is the first to reproduce the responses of the four major ganglion cell types that drive visual cortex, producing 3600 spiking outputs in total. We demonstrate how the responses of our device's ganglion cells compare to those measured from the mammalian retina. Replicating the retina's synaptic organization in our chip made it possible to perform these computations using a hundred times less energy than a microprocessor—and to match the mammalian retina in size and weight. With this level of efficiency and autonomy, it is now possible to develop fully implantable intraocular prostheses.

  10. Stochastic simulations of minimal self-reproducing cellular systems.

    PubMed

    Mavelli, Fabio; Ruiz-Mirazo, Kepa

    2007-10-29

    This paper is a theoretical attempt to gain insight into the problem of how self-assembling vesicles (closed bilayer structures) could progressively turn into minimal self-producing and self-reproducing cells, i.e. into interesting candidates for (proto)biological systems. With this aim, we make use of a recently developed object-oriented platform to carry out stochastic simulations of chemical reaction networks that take place in dynamic cellular compartments. We apply this new tool to study the behaviour of different minimal cell models, making realistic assumptions about the physico-chemical processes and conditions involved (e.g. thermodynamic equilibrium/non-equilibrium, variable volume-to-surface relationship, osmotic pressure, solute diffusion across the membrane due to concentration gradients, buffering effect). The new programming platform has been designed to analyse not only how a single protometabolic cell could maintain itself, grow or divide, but also how a collection of these cells could 'evolve' as a result of their mutual interactions in a common environment. PMID:17510021

  11. Virtual Raters for Reproducible and Objective Assessments in Radiology.

    PubMed

    Kleesiek, Jens; Petersen, Jens; Döring, Markus; Maier-Hein, Klaus; Köthe, Ullrich; Wick, Wolfgang; Hamprecht, Fred A; Bendszus, Martin; Biller, Armin

    2016-01-01

    Volumetric measurements in radiologic images are important for monitoring tumor growth and treatment response. To make these more reproducible and objective, we introduce the concept of virtual raters (VRs). A virtual rater is obtained by combining the knowledge of machine-learning algorithms trained on past annotations of multiple human raters with the instantaneous rating of one human expert. The human expert is thus virtually guided by several experts. To evaluate the approach we perform experiments with multi-channel magnetic resonance imaging (MRI) data sets. Besides gross tumor volume (GTV), we also investigate subcategories such as edema, contrast-enhancing and non-enhancing tumor. The first data set consists of N = 71 longitudinal follow-up scans of 15 patients suffering from glioblastoma (GB). The second data set comprises N = 30 scans of low- and high-grade gliomas. For comparison we computed the Pearson correlation, intra-class correlation coefficient (ICC) and Dice score. Virtual raters always lead to an improvement in inter- and intra-rater agreement. Comparing the 2D Response Assessment in Neuro-Oncology (RANO) measurements to the volumetric measurements of the virtual raters results in a deviating rating in one-third of the cases. Hence, we believe that our approach will have an impact on the evaluation of clinical studies as well as on routine imaging diagnostics. PMID:27118379

  12. Resting Functional Connectivity of Language Networks: Characterization and Reproducibility

    PubMed Central

    Tomasi, Dardo; Volkow, Nora D.

    2011-01-01

    The neural basis of language comprehension and production has been associated with superior temporal (Wernicke’s) and inferior frontal (Broca’s) cortical areas, respectively. However, recent resting state functional connectivity (RSFC) and lesion studies implicate a more extended network in language processing. Using a large RSFC dataset from 970 healthy subjects and seed regions in Broca’s and Wernicke’s areas, we recapitulate this extended network, which includes adjoining prefrontal, temporal and parietal regions as well as the bilateral caudate and the left putamen/globus pallidus and subthalamic nucleus. We also show that the language network has a predominance of short-range functional connectivity (except for posterior Wernicke’s area, which exhibited predominant long-range connectivity), consistent with a reliance on local processing. The long-range connectivity was predominantly left lateralized (except for anterior Wernicke’s area, which exhibited rightward lateralization). The language network also exhibited anticorrelated activity with the auditory (only for Wernicke’s area) and visual cortices, suggesting integrated sequential activity with regions involved in listening to or reading words. Assessment of the intra-subject reproducibility of this network and its characterization in individuals with language dysfunction are needed to determine its potential as a biomarker for language disorders. PMID:22212597

  13. Optimizing reproducibility evaluation for random amplified polymorphic DNA markers.

    PubMed

    Ramos, J R; Telles, M P C; Diniz-Filho, J A F; Soares, T N; Melo, D B; Oliveira, G

    2008-01-01

    The random amplified polymorphic DNA (RAPD) technique is often criticized because it usually shows low levels of repeatability; thus it can generate spurious bands. These problems can be partially overcome by rigid laboratory protocols and by performing repeatability tests. However, because it is expensive and time-consuming to obtain genetic data twice for all individuals, a few randomly chosen individuals are usually selected for a priori repeatability analysis, introducing a potential bias in genetic parameter estimates. We developed a procedure to optimize repeatability analysis based on RAPD data, which was applied to evaluate genetic variability in three local populations of Tibouchina papyrus, an endemic Cerrado plant found in elevated rocky fields in Brazil. We used a simulated annealing procedure to select the smallest number of individuals that contain all bands and repeated the analyses only for those bands that were reproduced in these individuals. We compared genetic parameter estimates using the HICKORY and POPGENE software packages on an unreduced data set and on data sets in which we eliminated bands based on the repeatability of individuals selected by simulated annealing and based on three randomly selected individuals. Genetic parameter estimates were very similar when we used the optimization procedure to reduce the number of bands analyzed, but as expected, selecting only three individuals to evaluate the repeatability of bands produced very different estimates. We conclude that the problems of repeatability attributed to RAPD markers could be due to bias in the selection of loci and primers and not necessarily to the RAPD technique per se. PMID:19065774
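
    The selection step described above, choosing the smallest set of individuals that jointly carry every band, can be sketched as a simulated-annealing search over a binary individuals-by-bands matrix. The code below is a hedged illustration of that idea with simulated data, not the authors' implementation.

    ```python
    # Hedged sketch (not the authors' implementation): simulated annealing to
    # pick a small set of individuals that jointly carry every RAPD band,
    # assuming a binary individuals-by-bands presence matrix.
    import numpy as np

    def covers_all(bands, subset):
        return bands[list(subset)].any(axis=0).all()

    def anneal_min_cover(bands, n_iter=20000, t0=2.0, seed=0):
        rng = np.random.default_rng(seed)
        n_ind = bands.shape[0]
        current = set(range(n_ind))                  # start with everyone selected
        best = set(current)
        for i in range(n_iter):
            temp = t0 * (1 - i / n_iter) + 1e-6
            cand = set(current)
            cand.symmetric_difference_update({int(rng.integers(n_ind))})  # toggle one
            if not cand or not covers_all(bands, cand):
                continue
            delta = len(cand) - len(current)         # objective: subset size
            if delta <= 0 or rng.random() < np.exp(-delta / temp):
                current = cand
                if len(current) < len(best):
                    best = set(current)
        return sorted(best)

    # Hypothetical 40 individuals x 25 bands presence/absence matrix.
    rng = np.random.default_rng(42)
    bands = (rng.random((40, 25)) > 0.6).astype(int)
    bands[0] = 1                                     # ensure full coverage is possible
    subset = anneal_min_cover(bands)
    print(f"{len(subset)} individuals cover all bands: {subset}")
    ```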

  14. Diet rapidly and reproducibly alters the human gut microbiome.

    PubMed

    David, Lawrence A; Maurice, Corinne F; Carmody, Rachel N; Gootenberg, David B; Button, Julie E; Wolfe, Benjamin E; Ling, Alisha V; Devlin, A Sloan; Varma, Yug; Fischbach, Michael A; Biddinger, Sudha B; Dutton, Rachel J; Turnbaugh, Peter J

    2014-01-23

    Long-term dietary intake influences the structure and activity of the trillions of microorganisms residing in the human gut, but it remains unclear how rapidly and reproducibly the human gut microbiome responds to short-term macronutrient change. Here we show that the short-term consumption of diets composed entirely of animal or plant products alters microbial community structure and overwhelms inter-individual differences in microbial gene expression. The animal-based diet increased the abundance of bile-tolerant microorganisms (Alistipes, Bilophila and Bacteroides) and decreased the levels of Firmicutes that metabolize dietary plant polysaccharides (Roseburia, Eubacterium rectale and Ruminococcus bromii). Microbial activity mirrored differences between herbivorous and carnivorous mammals, reflecting trade-offs between carbohydrate and protein fermentation. Foodborne microbes from both diets transiently colonized the gut, including bacteria, fungi and even viruses. Finally, increases in the abundance and activity of Bilophila wadsworthia on the animal-based diet support a link between dietary fat, bile acids and the outgrowth of microorganisms capable of triggering inflammatory bowel disease. In concert, these results demonstrate that the gut microbiome can rapidly respond to altered diet, potentially facilitating the diversity of human dietary lifestyles. PMID:24336217

  15. Reproducibility of Differential Proteomic Technologies in CPTAC Fractionated Xenografts

    PubMed Central

    2015-01-01

    The NCI Clinical Proteomic Tumor Analysis Consortium (CPTAC) employed a pair of reference xenograft proteomes for initial platform validation and ongoing quality control of its data collection for The Cancer Genome Atlas (TCGA) tumors. These two xenografts, representing basal and luminal-B human breast cancer, were fractionated and analyzed on six mass spectrometers in a total of 46 replicates divided between iTRAQ and label-free technologies, spanning a total of 1095 LC–MS/MS experiments. These data represent a unique opportunity to evaluate the stability of proteomic differentiation by mass spectrometry over many months of time for individual instruments or across instruments running dissimilar workflows. We evaluated iTRAQ reporter ions, label-free spectral counts, and label-free extracted ion chromatograms as strategies for data interpretation (source code is available from http://homepages.uc.edu/~wang2x7/Research.htm). From these assessments, we found that differential genes from a single replicate were confirmed by other replicates on the same instrument from 61 to 93% of the time. When comparing across different instruments and quantitative technologies, using multiple replicates, differential genes were reproduced by other data sets from 67 to 99% of the time. Projecting gene differences to biological pathways and networks increased the degree of similarity. These overlaps send an encouraging message about the maturity of technologies for proteomic differentiation. PMID:26653538
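
    The replicate-to-replicate confirmation percentages quoted above can be illustrated with a small helper that reports what fraction of one replicate's differential genes reappear in another replicate's list. The gene names below are hypothetical and the function is not part of the CPTAC analysis code.

    ```python
    # Illustrative sketch (not the CPTAC analysis code): the fraction of one
    # replicate's differential genes that are confirmed by another replicate.
    def confirmation_rate(genes_a, genes_b):
        """Percent of replicate A's differential genes also found in replicate B."""
        genes_a, genes_b = set(genes_a), set(genes_b)
        if not genes_a:
            return 0.0
        return 100.0 * len(genes_a & genes_b) / len(genes_a)

    # Hypothetical differential gene lists from two replicates.
    rep1 = {"ERBB2", "ESR1", "MKI67", "TP53", "GATA3", "KRT5"}
    rep2 = {"ERBB2", "ESR1", "MKI67", "GATA3", "FOXA1"}
    print(f"confirmed: {confirmation_rate(rep1, rep2):.0f}%")
    ```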

  16. Diet rapidly and reproducibly alters the human gut microbiome

    PubMed Central

    David, Lawrence A.; Maurice, Corinne F.; Carmody, Rachel N.; Gootenberg, David B.; Button, Julie E.; Wolfe, Benjamin E.; Ling, Alisha V.; Devlin, A. Sloan; Varma, Yug; Fischbach, Michael A.; Biddinger, Sudha B.; Dutton, Rachel J.; Turnbaugh, Peter J.

    2013-01-01

    Long-term diet influences the structure and activity of the trillions of microorganisms residing in the human gut1–5, but it remains unclear how rapidly and reproducibly the human gut microbiome responds to short-term macronutrient change. Here, we show that the short-term consumption of diets composed entirely of animal or plant products alters microbial community structure and overwhelms inter-individual differences in microbial gene expression. The animal-based diet increased the abundance of bile-tolerant microorganisms (Alistipes, Bilophila, and Bacteroides) and decreased the levels of Firmicutes that metabolize dietary plant polysaccharides (Roseburia, Eubacterium rectale, and Ruminococcus bromii). Microbial activity mirrored differences between herbivorous and carnivorous mammals2, reflecting trade-offs between carbohydrate and protein fermentation. Foodborne microbes from both diets transiently colonized the gut, including bacteria, fungi, and even viruses. Finally, increases in the abundance and activity of Bilophila wadsworthia on the animal-based diet support a link between dietary fat, bile acids, and the outgrowth of microorganisms capable of triggering inflammatory bowel disease6. In concert, these results demonstrate that the gut microbiome can rapidly respond to altered diet, potentially facilitating the diversity of human dietary lifestyles. PMID:24336217

  17. Virtual Raters for Reproducible and Objective Assessments in Radiology

    PubMed Central

    Kleesiek, Jens; Petersen, Jens; Döring, Markus; Maier-Hein, Klaus; Köthe, Ullrich; Wick, Wolfgang; Hamprecht, Fred A.; Bendszus, Martin; Biller, Armin

    2016-01-01

    Volumetric measurements in radiologic images are important for monitoring tumor growth and treatment response. To make these more reproducible and objective, we introduce the concept of virtual raters (VRs). A virtual rater is obtained by combining the knowledge of machine-learning algorithms trained on past annotations of multiple human raters with the instantaneous rating of one human expert. The human expert is thus virtually guided by several experts. To evaluate the approach we perform experiments with multi-channel magnetic resonance imaging (MRI) data sets. Besides gross tumor volume (GTV), we also investigate subcategories such as edema, contrast-enhancing and non-enhancing tumor. The first data set consists of N = 71 longitudinal follow-up scans of 15 patients suffering from glioblastoma (GB). The second data set comprises N = 30 scans of low- and high-grade gliomas. For comparison we computed the Pearson correlation, intra-class correlation coefficient (ICC) and Dice score. Virtual raters always lead to an improvement in inter- and intra-rater agreement. Comparing the 2D Response Assessment in Neuro-Oncology (RANO) measurements to the volumetric measurements of the virtual raters results in a deviating rating in one-third of the cases. Hence, we believe that our approach will have an impact on the evaluation of clinical studies as well as on routine imaging diagnostics. PMID:27118379

  18. Reproducing stone monument photosynthetic-based colonization under laboratory conditions.

    PubMed

    Miller, Ana Zélia; Laiz, Leonila; Gonzalez, Juan Miguel; Dionísio, Amélia; Macedo, Maria Filomena; Saiz-Jimenez, Cesareo

    2008-11-01

    In order to understand the biodeterioration process occurring on stone monuments, we analyzed the microbial communities involved in these processes and studied their ability to colonize stones in controlled laboratory experiments. In this study, a natural green biofilm from a limestone monument was cultivated, inoculated on stone probes of the same lithotype, and incubated in a laboratory chamber. This incubation system, which exposes stone samples to intermittently sprinkled water, allowed the development of photosynthetic biofilms similar to those occurring on stone monuments. Denaturing gradient gel electrophoresis (DGGE) analysis was used to evaluate the major microbial components of the laboratory biofilms. Cyanobacteria, green microalgae, bacteria and fungi were identified by DNA-based molecular analysis targeting the 16S and 18S ribosomal RNA genes. The natural green biofilm was mainly composed of the Chlorophyta Chlorella, Stichococcus, and Trebouxia, and of Cyanobacteria belonging to the genera Leptolyngbya and Pleurocapsa. A number of bacteria belonging to Alphaproteobacteria, Bacteroidetes and Verrucomicrobia were identified, as well as fungi from the Ascomycota. The laboratory colonization experiment on stone probes showed a colonization pattern similar to that occurring on stone monuments. The methodology described in this paper allowed us to reproduce colonization equivalent to the natural biodeterioration process.

  19. Library preparation for highly accurate population sequencing of RNA viruses

    PubMed Central

    Acevedo, Ashley; Andino, Raul

    2015-01-01

    Circular resequencing (CirSeq) is a novel technique for efficient and highly accurate next-generation sequencing (NGS) of RNA virus populations. The foundation of this approach is the circularization of fragmented viral RNAs, which are then redundantly encoded into tandem repeats by ‘rolling-circle’ reverse transcription. When sequenced, the redundant copies within each read are aligned to derive a consensus sequence of their initial RNA template. This process yields sequencing data with error rates far below the variant frequencies observed for RNA viruses, facilitating ultra-rare variant detection and accurate measurement of low-frequency variants. Although library preparation takes ~5 d, the high-quality data generated by CirSeq simplifies downstream data analysis, making this approach substantially more tractable for experimentalists. PMID:24967624
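
    The core of the approach, deriving a consensus from the redundant tandem copies within each read, can be illustrated with a toy majority-vote sketch. The repeat length and bases below are hypothetical, and this is not the published CirSeq pipeline.

    ```python
    # Conceptual sketch only (not the published CirSeq pipeline): derive a
    # per-position majority consensus from the tandem repeat copies within a
    # single rolling-circle read, assuming the repeat length is already known.
    from collections import Counter

    def consensus_from_read(read, repeat_len):
        copies = [read[i:i + repeat_len] for i in range(0, len(read), repeat_len)]
        copies = [c for c in copies if len(c) == repeat_len]   # drop partial copy
        consensus = []
        for pos in range(repeat_len):
            column = Counter(c[pos] for c in copies)
            consensus.append(column.most_common(1)[0][0])      # majority base
        return "".join(consensus)

    # Hypothetical read: three copies of a 12 nt template, one sequencing error.
    read = "ACGTTGCAGGTA" + "ACGTTGCAGGTA" + "ACGTTGAAGGTA"
    print(consensus_from_read(read, repeat_len=12))   # -> ACGTTGCAGGTA
    ```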

  20. Using Copula Distributions to Support More Accurate Imaging-Based Diagnostic Classifiers for Neuropsychiatric Disorders

    PubMed Central

    Bansal, Ravi; Hao, Xuejun; Liu, Jun; Peterson, Bradley S.

    2014-01-01

    Many investigators have tried to apply machine learning techniques to magnetic resonance images (MRIs) of the brain in order to diagnose neuropsychiatric disorders. Usually the number of brain imaging measures (such as measures of cortical thickness and measures of local surface morphology) derived from the MRIs (i.e., their dimensionality) has been large (e.g. >10) relative to the number of participants who provide the MRI data (<100). Sparse data in a high dimensional space increases the variability of the classification rules that machine learning algorithms generate, thereby limiting the validity, reproducibility, and generalizability of those classifiers. The accuracy and stability of the classifiers can improve significantly if the multivariate distributions of the imaging measures can be estimated accurately. To accurately estimate the multivariate distributions using sparse data, we propose to estimate first the univariate distributions of imaging data and then combine them using a Copula to generate more accurate estimates of their multivariate distributions. We then sample the estimated Copula distributions to generate dense sets of imaging measures and use those measures to train classifiers. We hypothesize that the dense sets of brain imaging measures will generate classifiers that are stable to variations in brain imaging measures, thereby improving the reproducibility, validity, and generalizability of diagnostic classification algorithms in imaging datasets from clinical populations. In our experiments, we used both computer-generated and real-world brain imaging datasets to assess the accuracy of multivariate Copula distributions in estimating the corresponding multivariate distributions of real-world imaging data. Our experiments showed that diagnostic classifiers generated using imaging measures sampled from the Copula were significantly more accurate and more reproducible than were the classifiers generated using either the real-world imaging
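
    A minimal sketch of the proposed workflow, fitting univariate marginals, coupling them through a copula, and sampling a dense synthetic set of measures, is given below. It assumes a Gaussian copula with empirical marginals, which is one common choice and not necessarily the copula family used by the authors; the imaging measures are simulated.

    ```python
    # Hedged sketch (a Gaussian copula with empirical marginals is assumed;
    # not necessarily the authors' copula family): estimate marginals, couple
    # them through a copula, and sample dense synthetic imaging measures.
    import numpy as np
    from scipy import stats

    def fit_gaussian_copula(data):
        """data: observations x measures. Returns correlation of normal scores."""
        n = data.shape[0]
        ranks = np.apply_along_axis(stats.rankdata, 0, data)
        z = stats.norm.ppf(ranks / (n + 1))              # normal scores
        return np.corrcoef(z, rowvar=False)

    def sample_copula(data, corr, n_samples, seed=0):
        rng = np.random.default_rng(seed)
        z = rng.multivariate_normal(np.zeros(corr.shape[0]), corr, size=n_samples)
        u = stats.norm.cdf(z)                            # uniform margins
        # Map back through the empirical quantiles of each measure.
        return np.column_stack([np.quantile(data[:, j], u[:, j])
                                for j in range(data.shape[1])])

    # Hypothetical sparse sample: 40 subjects x 3 imaging measures.
    rng = np.random.default_rng(3)
    sparse = rng.multivariate_normal([2.5, 3.1, 0.8],
                                     [[0.04, 0.02, 0.00],
                                      [0.02, 0.05, 0.01],
                                      [0.00, 0.01, 0.02]], size=40)
    corr = fit_gaussian_copula(sparse)
    dense = sample_copula(sparse, corr, n_samples=5000)
    print(dense.shape, np.corrcoef(dense, rowvar=False).round(2))
    ```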

  1. Evaluation of global impact models' ability to reproduce runoff characteristics over the central United States

    NASA Astrophysics Data System (ADS)

    Giuntoli, Ignazio; Villarini, Gabriele; Prudhomme, Christel; Mallakpour, Iman; Hannah, David M.

    2015-09-01

    The central United States experiences a wide array of hydrological extremes, with the 1993, 2008, 2013, and 2014 flooding events and the 1988 and 2012 droughts representing some of the most recent extremes, and is an area where water availability is critical for agricultural production. This study aims to evaluate the ability of a set of global impact models (GIMs) from the Water Model Intercomparison Project to reproduce the regional hydrology of the central United States for the period 1963-2001. Hydrological indices describing annual daily maximum, medium and minimum flow, and their timing are extracted from both modeled daily runoff data by nine GIMs and from observed daily streamflow measured at 252 river gauges. We compare trend patterns for these indices, and their ability to capture runoff volume differences for the 1988 drought and 1993 flood. In addition, we use a subset of 128 gauges and corresponding grid cells to perform a detailed evaluation of the models on a gauge-to-grid cell basis. Results indicate that these GIMs capture the overall trends in high, medium, and low flows well. However, the models differ from observations with respect to the timing of high and medium flows. More specifically, GIMs that only include water balance tend to be closer to the observations than GIMs that also include the energy balance. In general, as it would be expected, the performance of the GIMs is the best when describing medium flows, as opposed to the two ends of the runoff spectrum. With regards to low flows, some of the GIMs have considerably large pools of zeros or low values in their time series, undermining their ability in capturing low flow characteristics and weakening the ensemble's output. Overall, this study provides a valuable examination of the capability of GIMs to reproduce observed regional hydrology over a range of quantities for the central United States.
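
    The hydrological indices described above (annual daily maximum, median and minimum flow, and the timing of the maximum) can be extracted from a daily runoff series as sketched below, together with a simple linear trend in the annual maxima. The series is simulated and the snippet is not the study's evaluation code.

    ```python
    # Minimal sketch (not the study's evaluation code): annual maximum, median,
    # and minimum daily flow, the day-of-year of the annual maximum, and a
    # simple linear trend in the annual maxima, from a daily runoff series.
    import numpy as np
    import pandas as pd
    from scipy import stats

    # Hypothetical daily runoff series (mm/day), 1963-2001.
    idx = pd.date_range("1963-01-01", "2001-12-31", freq="D")
    rng = np.random.default_rng(7)
    flow = pd.Series(rng.gamma(2.0, 1.5, len(idx)), index=idx)

    by_year = flow.groupby(flow.index.year)
    annual = by_year.agg(q_max="max", q_med="median", q_min="min")
    annual["t_max"] = by_year.apply(lambda s: s.idxmax().dayofyear)  # timing (DOY)

    slope, intercept, r, p, se = stats.linregress(annual.index, annual["q_max"])
    print(annual.head())
    print(f"trend in annual maxima: {slope:+.4f} mm/day per year (p = {p:.2f})")
    ```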

  2. High Frequency QRS ECG Accurately Detects Cardiomyopathy

    NASA Technical Reports Server (NTRS)

    Schlegel, Todd T.; Arenare, Brian; Poulin, Gregory; Moser, Daniel R.; Delgado, Reynolds

    2005-01-01

    High frequency (HF, 150-250 Hz) analysis over the entire QRS interval of the ECG is more sensitive than conventional ECG for detecting myocardial ischemia. However, the accuracy of HF QRS ECG for detecting cardiomyopathy is unknown. We obtained simultaneous resting conventional and HF QRS 12-lead ECGs in 66 patients with cardiomyopathy (EF = 23.2 plus or minus 6.1%, mean plus or minus SD) and in 66 age- and gender-matched healthy controls using PC-based ECG software recently developed at NASA. The single most accurate ECG parameter for detecting cardiomyopathy was an HF QRS morphological score that takes into consideration the total number and severity of reduced amplitude zones (RAZs) present plus the clustering of RAZs together in contiguous leads. This RAZ score had an area under the receiver operating characteristic (ROC) curve of 0.91, and was 88% sensitive, 82% specific and 85% accurate for identifying cardiomyopathy at an optimum score cut-off of 140 points. Although conventional ECG parameters such as the QRS and QTc intervals were also significantly longer in patients than controls (P less than 0.001, BBBs excluded), these conventional parameters were less accurate (area under the ROC = 0.77 and 0.77, respectively) than HF QRS morphological parameters for identifying underlying cardiomyopathy. The total amplitude of the HF QRS complexes, as measured by summed root mean square voltages (RMSVs), also differed between patients and controls (33.8 plus or minus 11.5 vs. 41.5 plus or minus 13.6 mV, respectively, P less than 0.003), but this parameter was even less accurate in distinguishing the two groups (area under ROC = 0.67) than the HF QRS morphologic and conventional ECG parameters. Diagnostic accuracy was optimal (86%) when the RAZ score from the HF QRS ECG and the QTc interval from the conventional ECG were used simultaneously with cut-offs of greater than or equal to 40 points and greater than or equal to 445 ms, respectively. In conclusion, 12-lead HF QRS ECG employing

  3. Impact of soil parameter and physical process on reproducibility of hydrological processes by land surface model in semiarid grassland

    NASA Astrophysics Data System (ADS)

    Miyazaki, S.; Yorozu, K.; Asanuma, J.; Kondo, M.; Saito, K.

    2014-12-01

    The land surface model (LSM) represents land-atmosphere interactions in earth system models used for climate change research. In this study, we evaluated the impact of soil parameters and physical processes on the reproducibility of hydrological processes by the LSM Minimal Advanced Treatments of Surface Interaction and RunOff (MATSIRO; Takata et al., 2003, GPC), forced by meteorological data observed at grassland sites in the semiarid climates of China and Mongolia. MATSIRO was tested in offline mode over the semiarid grassland sites at Tongyu (44.42 deg. N, 122.87 deg. E, altitude: 184 m) in China, and Kherlen Bayan Ulaan (KBU; 47.21 deg. N, 108.74 deg. E, altitude: 1235 m) and Arvaikheer (46.23 deg. N, 102.82 deg. E, altitude: 1813 m) in Mongolia. Although all sites are located in semiarid grassland, the climate conditions differ among them: the annual air temperature and precipitation are 5.7 deg. C and 388 mm (Tongyu), 1.2 deg. C and 180 mm (KBU), and 0.4 deg. C and 245 mm (Arvaikheer), allowing us to evaluate the effect of climate conditions on model performance. Three kinds of experiments were carried out: a run with the default parameters (CTL); a run with observed parameters for soil physics, soil hydrology, and vegetation (OBS); and a run with a refined MATSIRO that includes the effect of ice on thermal parameters and unfrozen water below freezing, using the same parameters as the OBS run (OBSr). Validation data were provided by CEOP (http://www.ceop.net/), RAISE (http://raise.suiri.tsukuba.ac.jp/), and GAME-AAN (Miyazaki et al., 2004, JGR) for Tongyu, KBU, and Arvaikheer, respectively. Net radiation, soil temperature (Ts), and latent heat flux (LE) were well reproduced by the OBS and OBSr runs. Changing the soil physical and hydraulic parameters affected the reproducibility of soil temperature (Ts) and soil moisture (SM) as well as the energy flux components, especially the sensible heat flux (H) and soil heat flux (G). The reason for the great improvement on the

  4. Procedure for accurate fabrication of tissue compensators with high-density material

    NASA Astrophysics Data System (ADS)

    Mejaddem, Younes; Lax, Ingmar; Adakkai K, Shamsuddin

    1997-02-01

    An accurate method for producing compensating filters using high-density material (Cerrobend) is described. The procedure consists of two cutting steps in a Styrofoam block: (i) levelling a surface of the block to a reference level; (ii) depth-modulated milling of the levelled block in accordance with pre-calculated thickness profiles of the compensator. The calculated thickness (generated by a dose planning system) can be reproduced within acceptable accuracy. The desired compensator thickness manufactured according to this procedure is reproduced to within 0.1 mm, corresponding to a 0.5% change in dose at a beam quality of 6 MV. The results of our quality control checks performed with the technique of stylus profiling measurements show an accuracy of 0.04 mm in the milling process over an arbitrary profile along the milled-out Styrofoam block.

  5. Assessment of the performance of numerical modeling in reproducing a replenishment of sediments in a water-worked channel

    NASA Astrophysics Data System (ADS)

    Juez, C.; Battisacco, E.; Schleiss, A. J.; Franca, M. J.

    2016-06-01

    The artificial replenishment of sediment is used as a method to re-establish sediment continuity downstream of a dam. However, the impact of this technique on the hydraulic conditions, and on the resulting bed morphology, is yet to be understood. Several numerical tools for modeling sediment transport and morphology evolution have been developed in recent years and can be used for this application. These models range from 1D to 3D approaches: the former are overly simplistic for simulating such a complex geometry, while the latter often require a prohibitive computational effort. 2D models, however, are computationally efficient and in these cases may already provide sufficiently accurate predictions of the morphology evolution caused by sediment replenishment in a river. Here, the 2D shallow water equations combined with the Exner equation are solved by means of a weakly coupled strategy. The classical friction approach used to reproduce the channel bed roughness has been modified to account for the morphological effect of replenishment, which induces fining of the channel bed. Computational outcomes are compared with four sets of experimental data obtained from several replenishment configurations studied in the laboratory. The experiments differ in placement volume and configuration. A set of analysis parameters is proposed for the experimental-numerical comparison, with particular attention to the spreading, covered surface, and travel distance of the placed replenishment grains. The numerical tool reliably reproduces the overall tendency shown by the experimental data, and the effect of roughness fining is better reproduced with the approach proposed herein. However, the sediment clusters observed in the experiments are not well reproduced numerically in regions of the channel with a limited number of sediment grains.
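
    For orientation, the governing system named above (2D shallow water equations plus the Exner equation for bed evolution) can be written schematically as follows; the notation and the bedload closure are generic and not necessarily the authors' exact formulation.

    ```latex
    % Generic schematic of the governing system named above; notation and the
    % bedload closure q_s are illustrative, not the authors' exact formulation.
    \[
    \begin{aligned}
    &\partial_t h + \nabla \cdot (h\,\mathbf{u}) = 0,\\
    &\partial_t (h\,\mathbf{u}) + \nabla \cdot (h\,\mathbf{u}\otimes\mathbf{u})
       + g\,h\,\nabla (h + z_b) = -\,\frac{\boldsymbol{\tau}_b}{\rho},\\
    &(1-\lambda_p)\,\partial_t z_b + \nabla \cdot \mathbf{q}_s(h,\mathbf{u}) = 0,
    \end{aligned}
    \]
    ```

    Here h is the flow depth, u the depth-averaged velocity, z_b the bed elevation, lambda_p the bed porosity, tau_b the bed shear stress, and q_s a bedload flux closure. In a weakly coupled scheme the hydrodynamic equations are advanced first with the bed frozen, after which the Exner equation updates z_b using the new flow field.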

  6. Accurate, in vivo NIR measurement of skeletal muscle oxygenation through fat

    NASA Astrophysics Data System (ADS)

    Jin, Chunguang; Zou, Fengmei; Ellerby, Gwenn E. C.; Scott, Peter; Peshlov, Boyan; Soller, Babs R.

    2010-02-01

    Noninvasive near infrared (NIR) spectroscopic measurement of muscle oxygenation requires the penetration of light through overlying skin and fat layers. We have previously demonstrated a dual-light-source design and orthogonalization algorithm that corrects for interference from skin absorption and fat scattering. To achieve accurate muscle oxygen saturation (SmO2) measurement, one must select the appropriate source-detector (SD) distance to completely penetrate the fat layer. Methods: Six healthy subjects were supine for 15 min to normalize tissue oxygenation across the body. NIR spectra were collected from the calf, shoulder, and lower and upper thigh muscles with long SD distances of 30 mm, 35 mm, 40 mm and 45 mm. Spectral preprocessing with the short SD (3 mm) spectrum preceded SmO2 calculation with a Taylor series expansion method. Three-way ANOVA was used to compare SmO2 values over varying fat thickness, subjects and SD distances. Results: Overlying fat layers varied in thickness from 4.9 mm to 19.6 mm across all subjects. SmO2 values measured at the four locations were comparable for each subject (p=0.133), regardless of fat thickness and SD distance. SmO2 (mean +/- std dev) measured at the calf, shoulder, and lower and upper thigh was 62+/-3%, 59+/-8%, 61+/-2% and 61+/-4%, respectively, for an SD distance of 30 mm. In these subjects no significant influence of SD distance was observed (p=0.948). Conclusions: The results indicate that for our sensor design a 30 mm SD is sufficient to penetrate through a 19 mm fat layer and that orthogonalization with the short SD spectrum effectively removed spectral interference from fat, resulting in a reproducible determination of SmO2.
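
    The preprocessing idea, removing from the long-SD spectrum the part explained by the short-SD (skin- and fat-dominated) spectrum, can be caricatured as a plain vector projection, as in the toy sketch below. This is a generic orthogonalization with simulated spectra, not the published dual-source algorithm or the Taylor-series SmO2 calculation.

    ```python
    # Toy sketch of the general orthogonalization idea (a plain vector
    # projection; NOT the published dual-source preprocessing or the
    # Taylor-series SmO2 calculation): remove from the long-SD spectrum the
    # component that lies along the short-SD spectrum.
    import numpy as np

    def orthogonalize(long_sd, short_sd):
        """Return long_sd with its projection onto short_sd removed."""
        short_unit = short_sd / np.linalg.norm(short_sd)
        return long_sd - np.dot(long_sd, short_unit) * short_unit

    # Hypothetical absorbance spectra over 100 wavelength channels.
    rng = np.random.default_rng(5)
    wavelengths = np.linspace(700, 1000, 100)                 # nm
    skin_fat = np.exp(-((wavelengths - 930) / 40) ** 2)       # fat-like feature
    muscle = np.exp(-((wavelengths - 760) / 30) ** 2)         # deoxy-Hb-like feature

    short_sd = skin_fat + 0.02 * rng.normal(size=100)
    long_sd = 0.6 * muscle + 0.8 * skin_fat + 0.02 * rng.normal(size=100)

    corrected = orthogonalize(long_sd, short_sd)
    print(f"fat-feature correlation before: {np.corrcoef(long_sd, skin_fat)[0, 1]:.2f}")
    print(f"fat-feature correlation after:  {np.corrcoef(corrected, skin_fat)[0, 1]:.2f}")
    ```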

  7. Progression of Parkinson's disease pathology is reproduced by intragastric administration of rotenone in mice.

    PubMed

    Pan-Montojo, Francisco; Anichtchik, Oleg; Dening, Yanina; Knels, Lilla; Pursche, Stefan; Jung, Roland; Jackson, Sandra; Gille, Gabriele; Spillantini, Maria Grazia; Reichmann, Heinz; Funk, Richard H W

    2010-01-01

    In patients with Parkinson's disease (PD), the associated pathology follows a characteristic pattern involving inter alia the enteric nervous system (ENS), the dorsal motor nucleus of the vagus (DMV), the intermediolateral nucleus of the spinal cord and the substantia nigra, providing the basis for the neuropathological staging of the disease. Here we report that intragastrically administered rotenone, a commonly used pesticide that inhibits Complex I of the mitochondrial respiratory chain, is able to reproduce PD pathological staging as found in patients. Our results show that low doses of chronically and intragastrically administered rotenone induce alpha-synuclein accumulation in all the above-mentioned nervous system structures of wild-type mice. Moreover, we also observed inflammation and alpha-synuclein phosphorylation in the ENS and DMV. HPLC analysis showed no rotenone levels in the systemic blood or the central nervous system (detection limit [rotenone]<20 nM) and mitochondrial Complex I measurements showed no systemic Complex I inhibition after 1.5 months of treatment. These alterations are sequential, appearing only in synaptically connected nervous structures, treatment time-dependent and accompanied by inflammatory signs and motor dysfunctions. These results strongly suggest that the local effect of pesticides on the ENS might be sufficient to induce PD-like progression and to reproduce the neuroanatomical and neurochemical features of PD staging. It provides new insight into how environmental factors could trigger PD and suggests a transsynaptic mechanism by which PD might spread throughout the central nervous system. PMID:20098733

  8. Progression of Parkinson's Disease Pathology Is Reproduced by Intragastric Administration of Rotenone in Mice

    PubMed Central

    Pan-Montojo, Francisco; Anichtchik, Oleg; Dening, Yanina; Knels, Lilla; Pursche, Stefan; Jung, Roland; Jackson, Sandra; Gille, Gabriele; Spillantini, Maria Grazia; Reichmann, Heinz; Funk, Richard H. W.

    2010-01-01

    In patients with Parkinson's disease (PD), the associated pathology follows a characteristic pattern involving inter alia the enteric nervous system (ENS), the dorsal motor nucleus of the vagus (DMV), the intermediolateral nucleus of the spinal cord and the substantia nigra, providing the basis for the neuropathological staging of the disease. Here we report that intragastrically administered rotenone, a commonly used pesticide that inhibits Complex I of the mitochondrial respiratory chain, is able to reproduce PD pathological staging as found in patients. Our results show that low doses of chronically and intragastrically administered rotenone induce alpha-synuclein accumulation in all the above-mentioned nervous system structures of wild-type mice. Moreover, we also observed inflammation and alpha-synuclein phosphorylation in the ENS and DMV. HPLC analysis showed no rotenone levels in the systemic blood or the central nervous system (detection limit [rotenone]<20 nM) and mitochondrial Complex I measurements showed no systemic Complex I inhibition after 1.5 months of treatment. These alterations are sequential, appearing only in synaptically connected nervous structures, treatment time-dependent and accompanied by inflammatory signs and motor dysfunctions. These results strongly suggest that the local effect of pesticides on the ENS might be sufficient to induce PD-like progression and to reproduce the neuroanatomical and neurochemical features of PD staging. It provides new insight into how environmental factors could trigger PD and suggests a transsynaptic mechanism by which PD might spread throughout the central nervous system. PMID:20098733

  9. Spatial mapping and statistical reproducibility of an array of 256 one-dimensional quantum wires

    SciTech Connect

    Al-Taie, H.; Kelly, M. J.; Smith, L. W.; Lesage, A. A. J.; Griffiths, J. P.; Beere, H. E.; Jones, G. A. C.; Ritchie, D. A.; Smith, C. G.; See, P.

    2015-08-21

    We utilize a multiplexing architecture to measure the conductance properties of an array of 256 split gates. We investigate the reproducibility of the pinch off and one-dimensional definition voltage as a function of spatial location on two different cooldowns, and after illuminating the device. The reproducibility of both these properties on the two cooldowns is high, the result of the density of the two-dimensional electron gas returning to a similar state after thermal cycling. The spatial variation of the pinch-off voltage reduces after illumination; however, the variation of the one-dimensional definition voltage increases due to an anomalous feature in the center of the array. A technique which quantifies the homogeneity of split-gate properties across the array is developed which captures the experimentally observed trends. In addition, the one-dimensional definition voltage is used to probe the density of the wafer at each split gate in the array on a micron scale using a capacitive model.

  10. Spatial mapping and statistical reproducibility of an array of 256 one-dimensional quantum wires

    NASA Astrophysics Data System (ADS)

    Al-Taie, H.; Smith, L. W.; Lesage, A. A. J.; See, P.; Griffiths, J. P.; Beere, H. E.; Jones, G. A. C.; Ritchie, D. A.; Kelly, M. J.; Smith, C. G.

    2015-08-01

    We utilize a multiplexing architecture to measure the conductance properties of an array of 256 split gates. We investigate the reproducibility of the pinch off and one-dimensional definition voltage as a function of spatial location on two different cooldowns, and after illuminating the device. The reproducibility of both these properties on the two cooldowns is high, the result of the density of the two-dimensional electron gas returning to a similar state after thermal cycling. The spatial variation of the pinch-off voltage reduces after illumination; however, the variation of the one-dimensional definition voltage increases due to an anomalous feature in the center of the array. A technique which quantifies the homogeneity of split-gate properties across the array is developed which captures the experimentally observed trends. In addition, the one-dimensional definition voltage is used to probe the density of the wafer at each split gate in the array on a micron scale using a capacitive model.

  11. Reproducible, stable and fast electrochemical activity from easy to make graphene on copper electrodes.

    PubMed

    Bosch-Navarro, Concha; Laker, Zachary P L; Rourke, Jonathan P; Wilson, Neil R

    2015-11-28

    The electrochemical activity of graphene is of fundamental importance to applications from energy storage to sensing, but has proved difficult to determine unambiguously owing to the challenges inherent in fabricating well-defined graphene electrodes free from contamination. Here, we report the electrochemical activity of chemical vapour deposition (CVD) graphene grown on copper foil without further treatment, achieved through an appropriate choice of electrolyte. Fast electron-transfer kinetics are observed for both inner- and outer-sphere redox couples with fully covered graphene-on-copper electrodes (k° = 0.014 ± 0.001 cm s⁻¹ and k° = 0.012 ± 0.001 cm s⁻¹ for potassium ferrocyanide(II) and hexaammineruthenium(III) chloride, respectively). Unlike highly oriented pyrolytic graphite electrodes, the electrochemical response of the graphene-on-copper electrodes is stable, with no apparent electrode fouling even with inner-sphere redox couples, and is reproducible independent of the time between growth and measurement. Comparison between fully covered electrodes and partial graphene coverage with varying grain sizes (from roughly 50 μm to <10 μm) shows that in this instance the basal plane of graphene is electrochemically active. These CVD-grown graphene-on-copper electrodes are quick, cheap and reproducible to make, and hence provide a convenient platform for further investigation of graphene electrochemistry and the effect of covalent and non-covalent modification. PMID:26477748
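
    As a companion to the rate constants quoted in this record, the sketch below shows one textbook route to a standard heterogeneous rate constant k° from cyclic-voltammetry peak separation (Nicholson's method). It is a hedged illustration only: the record does not state how its k° values were obtained, and the diffusion coefficient, scan rate, peak separation and the empirical working-curve fit used here are assumptions chosen for the example.

    # Illustrative sketch: k0 from CV peak separation via Nicholson's method.
    # Not necessarily the procedure behind the k0 values in the record above;
    # D, the scan rate, dEp and the working-curve fit are illustrative assumptions.
    import math

    F = 96485.0   # Faraday constant, C/mol
    R = 8.314     # gas constant, J/(mol K)

    def psi_from_peak_separation(delta_ep_mV, n=1):
        """Approximate empirical fit to Nicholson's working curve; intended
        only for moderate peak separations (roughly 60-200 mV * n)."""
        x = n * delta_ep_mV
        return (-0.6288 + 0.0021 * x) / (1.0 - 0.017 * x)

    def k0_from_psi(psi, scan_rate_V_s, D_cm2_s=7.6e-6, n=1, T=298.15):
        """k0 in cm/s, assuming equal diffusion coefficients for both species."""
        return psi * math.sqrt(math.pi * D_cm2_s * n * F * scan_rate_V_s / (R * T))

    psi = psi_from_peak_separation(delta_ep_mV=75.0)   # hypothetical peak separation
    print(f"estimated k0 = {k0_from_psi(psi, scan_rate_V_s=0.1):.3f} cm/s")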

  12. Reproducible copy number variation patterns among single circulating tumor cells of lung cancer patients.

    PubMed

    Ni, Xiaohui; Zhuo, Minglei; Su, Zhe; Duan, Jianchun; Gao, Yan; Wang, Zhijie; Zong, Chenghang; Bai, Hua; Chapman, Alec R; Zhao, Jun; Xu, Liya; An, Tongtong; Ma, Qi; Wang, Yuyan; Wu, Meina; Sun, Yu; Wang, Shuhang; Li, Zhenxiang; Yang, Xiaodan; Yong, Jun; Su, Xiao-Dong; Lu, Youyong; Bai, Fan; Xie, X Sunney; Wang, Jie

    2013-12-24

    Circulating tumor cells (CTCs) enter the peripheral blood from primary tumors and seed metastases. Genome sequencing of CTCs could offer noninvasive prognosis or even diagnosis, but has been hampered by the low single-cell genome coverage achievable with scarce CTCs. Here, we report the use of the recently developed multiple annealing and looping-based amplification cycles for whole-genome amplification of single CTCs from lung cancer patients. We observed characteristic cancer-associated single-nucleotide variations and insertions/deletions in the exomes of CTCs. These mutations provided information relevant to individualized therapy, such as drug resistance and phenotypic transition, but were heterogeneous from cell to cell. In contrast, every CTC from an individual patient, regardless of cancer subtype, exhibited reproducible copy number variation (CNV) patterns similar to those of the metastatic tumor of the same patient. Interestingly, different patients with the same subtype, lung adenocarcinoma (ADC), shared similar CNV patterns in their CTCs, whereas patients with small-cell lung cancer showed CNV patterns distinctly different from those of ADC patients. Our findings suggest that CNVs at certain genomic loci are selected for during cancer metastasis. The reproducibility of cancer-specific CNVs offers potential for CTC-based cancer diagnostics.
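
    The "reproducible CNV patterns" in this record come from comparing copy-number profiles between single cells. The sketch below is a minimal, hedged illustration of that kind of comparison, not the pipeline used in the study: the bin count, the mean-normalization to a diploid baseline, the Pearson-correlation similarity measure and the synthetic read counts are all assumptions for the example.

    # Minimal sketch: per-bin read counts -> copy-number profiles -> cell-to-cell
    # similarity. Bin size, normalization and the correlation metric are assumed.
    import numpy as np

    def copy_number_profile(bin_counts, ploidy=2):
        """Convert raw per-bin read counts to an approximate copy-number profile."""
        counts = np.asarray(bin_counts, dtype=float)
        return ploidy * counts / counts.mean()   # genome-wide mean scaled to the ploidy

    def profile_similarity(profile_a, profile_b):
        """Pearson correlation between two copy-number profiles."""
        return np.corrcoef(profile_a, profile_b)[0, 1]

    # Fake data: two CTCs from the same patient share a gained region; a third cell does not.
    rng = np.random.default_rng(0)
    base = np.ones(500)
    gain = base.copy()
    gain[100:150] = 1.5                                  # shared amplified region
    ctc1 = copy_number_profile(rng.poisson(100 * gain))
    ctc2 = copy_number_profile(rng.poisson(100 * gain))
    other = copy_number_profile(rng.poisson(100 * base))
    print(profile_similarity(ctc1, ctc2), profile_similarity(ctc1, other))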

  13. Comparability and reproducibility of apex cardiogram recorded with six different transducer systems.

    PubMed Central

    Willems, J L; Denef, B; Kesteloot, H; De Geest, H

    1979-01-01

    A comparison was made in 7 dogs of the results obtained with 6 different apex cardiographic transducers applied before, during, and after controlled infusion of angiotensin and isoprenaline. The electrocardiogram, internal phonocardiogram, aortic and left ventricular pressures (using a Telco micromanometer), and apex cardiogram were recorded simultaneously on magnetic tape and paper. Digital computer techniques were used to derive the various measurements. The comparison of the 6 transducer systems was made especially with respect to measurements derived from the normalised derivative, calculated using total as well as developed pressure or displacement. Measurements derived from left ventricular pressure were highly reproducible: differences in the 'contractility' indices varied between 0.5 and 1.9 per cent. Indices from the apex cardiogram recorded with the 6 different transducer systems showed variations of up to 20 per cent, with mean values varying between 3.2 and 8.1 per cent. There was a systematic deviation for one transducer system, which was responsible for a significant part of the observed variability. It may be concluded that, to ensure maximal reproducibility, the technical characteristics of the apex cardiograph transducer should be taken into account and an optimal recording technique should be used. PMID:465246
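
    The "normalised derivative" indices in this record are of the (dP/dt)/P family of contractility measures. The sketch below is a hedged illustration of how such indices are computed from a sampled pressure trace; the synthetic waveform, the sampling rate, the assumed end-diastolic pressure and the threshold used to avoid small denominators are assumptions and do not reproduce the study's actual recordings.

    # Hedged sketch: peak dP/dt and normalized-derivative indices from a
    # synthetic left ventricular pressure trace (assumed waveform and parameters).
    import numpy as np

    fs = 1000.0                                   # sampling rate, Hz
    t = np.arange(0, 0.8, 1 / fs)                 # one synthetic 0.8 s beat
    lv_pressure = 10 + 110 * np.sin(np.pi * t / 0.8) ** 4   # mmHg, fake trace
    diastolic = 10.0                              # assumed end-diastolic pressure, mmHg

    dp_dt = np.gradient(lv_pressure, 1 / fs)      # mmHg/s
    developed = lv_pressure - diastolic           # "developed" pressure

    peak_dp_dt = dp_dt.max()
    mask = developed > 5.0                        # avoid dividing by near-zero pressure
    norm_deriv_total = (dp_dt[mask] / lv_pressure[mask]).max()   # using total pressure
    norm_deriv_dev = (dp_dt[mask] / developed[mask]).max()       # using developed pressure

    print(f"peak dP/dt = {peak_dp_dt:.0f} mmHg/s")
    print(f"max (dP/dt)/P_total = {norm_deriv_total:.1f} s^-1")
    print(f"max (dP/dt)/P_dev   = {norm_deriv_dev:.1f} s^-1")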

  14. Repeatability and Reproducibility of Anterior Segment Measurements in Normal Eyes Using Dual Scheimpflug Analyzer

    PubMed Central

    Altıparmak, Zeynep; Yağcı, Ramazan; Güler, Emre; Arslanyılmaz, Zeynel; Canbal, Metin; Hepşen, İbrahim F.

    2015-01-01

    Objectives: To assess the repeatability and reproducibility of anterior segment measurements, including aberrometric measurements, provided by a dual Scheimpflug analyzer (Galilei) in normal eyes. Materials and Methods: Three repeated consecutive measurements were taken by two independent examiners. The following parameters were evaluated: total corneal power and posterior corneal power, corneal higher-order wavefront aberrations (6.0 mm pupil), pachymetry at the central, paracentral, and peripheral zones, and anterior chamber depth (ACD). Repeatability was assessed by calculating the within-subject standard deviation, precision, repeatability, and intraclass correlation coefficient (ICC). Bland-Altman analysis was used to assess reproducibility. Results: Thirty eyes of 30 patients were included. The best ICC values were obtained for corneal pachymetry and ACD. For both observers, acceptable ICCs were also achieved for the other parameters, the only exceptions being posterior corneal astigmatism and total higher-order aberration. The 95% limits of agreement (LoA) for all measurements showed small variability between the two examiners. Conclusion: The Galilei system provided reliable measurements of anterior segment parameters, and the instrument can therefore be used with confidence in routine clinical practice and for research purposes. PMID:27800242
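
    The statistics named in this record (within-subject standard deviation, repeatability, ICC, Bland-Altman limits of agreement) can be illustrated with a short sketch. It is a hedged example only: the fake anterior chamber depth data, the one-way random-effects ICC formulation and the 2.77 x Sw repeatability convention are assumptions, and the study may have used a different ICC model.

    # Illustrative sketch of common repeatability/reproducibility statistics on
    # fake data; the one-way ICC model and synthetic measurements are assumptions.
    import numpy as np

    def within_subject_sd(measurements):
        """Sw from a (subjects x repeats) array for a single observer."""
        m = np.asarray(measurements, dtype=float)
        return np.sqrt(m.var(axis=1, ddof=1).mean())

    def icc_oneway(measurements):
        """One-way random-effects ICC(1,1) from a (subjects x repeats) array."""
        m = np.asarray(measurements, dtype=float)
        n, k = m.shape
        grand = m.mean()
        ms_between = k * ((m.mean(axis=1) - grand) ** 2).sum() / (n - 1)
        ms_within = ((m - m.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
        return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

    def bland_altman_loa(obs1, obs2):
        """Mean difference and 95% limits of agreement between two observers."""
        diff = np.asarray(obs1, float) - np.asarray(obs2, float)
        spread = 1.96 * diff.std(ddof=1)
        return diff.mean(), diff.mean() - spread, diff.mean() + spread

    rng = np.random.default_rng(1)
    true_acd = rng.normal(3.1, 0.3, size=30)                       # fake ACD values, mm
    obs1 = true_acd[:, None] + rng.normal(0, 0.02, size=(30, 3))   # 3 repeats, examiner 1
    obs2 = true_acd[:, None] + rng.normal(0, 0.02, size=(30, 3))   # 3 repeats, examiner 2

    sw = within_subject_sd(obs1)
    print(f"Sw = {sw:.3f} mm, repeatability = {2.77 * sw:.3f} mm, ICC = {icc_oneway(obs1):.3f}")
    print("Bland-Altman (mean diff, lower LoA, upper LoA):",
          bland_altman_loa(obs1.mean(axis=1), obs2.mean(axis=1)))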

  15. Evaluation of the color stability of two techniques for reproducing artificial irides after microwave polymerization

    PubMed Central

    GOIATO, Marcelo Coelho; dos SANTOS, Daniela Micheline; MORENO, Amália; GENNARI-FILHO, Humberto; PELLIZZER, Eduardo Piza

    2011-01-01

    The use of ocular prostheses in ophthalmic patients aims to rebuild facial aesthetics and provide an artificial substitute for the visual organ. Natural weathering conditions promote discoloration of artificial irides, and many studies have attempted to produce irides with greater chromatic paint durability using different paint materials. Objectives: The present study evaluated the color stability of artificial irides obtained with two techniques (oil painting and digital image) and submitted to microwave polymerization. Material and Methods: Forty samples were fabricated simulating ocular prostheses. Each sample consisted of one disc of acrylic resin N1 and one disc of colorless acrylic resin, with the iris interposed between the discs. Irides in brown and blue were obtained by oil painting or digital image. Color stability was determined with a reflection spectrophotometer, and measurements were taken before and after microwave polymerization. Statistical analysis of the techniques for reproducin