Science.gov

Sample records for "accurately reproduce experimental"

  1. Accurate measurements of dynamics and reproducibility in small genetic networks

    PubMed Central

    Dubuis, Julien O; Samanta, Reba; Gregor, Thomas

    2013-01-01

    Quantification of gene expression has become a central tool for understanding genetic networks. In many systems, the only viable way to measure protein levels is by immunofluorescence, which is notorious for its limited accuracy. Using the early Drosophila embryo as an example, we show that careful identification and control of experimental error allows for highly accurate gene expression measurements. We generated antibodies in different host species, allowing for simultaneous staining of four Drosophila gap genes in individual embryos. Careful error analysis of hundreds of expression profiles reveals that less than ∼20% of the observed embryo-to-embryo fluctuations stem from experimental error. These measurements make it possible to extract not only very accurate mean gene expression profiles but also their naturally occurring fluctuations of biological origin and corresponding cross-correlations. We use this analysis to extract gap gene profile dynamics with ∼1 min accuracy. The combination of these new measurements and analysis techniques reveals a twofold increase in profile reproducibility owing to a collective network dynamics that relays positional accuracy from the maternal gradients to the pair-rule genes. PMID:23340845
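
    A minimal sketch of the kind of variance partitioning such an error analysis relies on, assuming a generic two-antibody scheme in which the same gene is stained twice in the same embryos; the arrays and noise levels below are invented for illustration and are not the authors' data or exact procedure:

        import numpy as np

        # Hypothetical setup: the same gap gene stained with two independent antibodies
        # in the same embryos (one value per embryo at a fixed spatial position).
        rng = np.random.default_rng(0)
        true_signal = rng.normal(1.0, 0.15, size=200)           # biological variability
        stain_a = true_signal + rng.normal(0, 0.05, size=200)   # measurement noise, antibody A
        stain_b = true_signal + rng.normal(0, 0.05, size=200)   # measurement noise, antibody B

        # With shared biology and independent errors, Var(a - b) = 2*sigma_err^2,
        # so half the variance of the difference estimates the error variance of
        # a single measurement.
        var_err = np.var(stain_a - stain_b, ddof=1) / 2
        var_total = np.var(stain_a, ddof=1)                     # biological + experimental
        print(f"fraction of embryo-to-embryo variance from experimental error: "
              f"{var_err / var_total:.2f}")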

  2. Automated morphometry provides accurate and reproducible virtual staging of liver fibrosis in chronic hepatitis C

    PubMed Central

    Calès, Paul; Chaigneau, Julien; Hunault, Gilles; Michalak, Sophie; Cavaro-Menard, Christine; Fasquel, Jean-Baptiste; Bertrais, Sandrine; Rousselet, Marie-Christine

    2015-01-01

    …morphometric scores provide reproducible and accurate diagnoses of fibrosis stages via “virtual expert pathologist.” PMID:26110088

  3. Generating clock signals for a cycle accurate, cycle reproducible FPGA based hardware accelerator

    DOEpatents

    Asaad, Sameh W.; Kapur, Mohit

    2016-01-05

    A method, system and computer program product are disclosed for generating clock signals for a cycle accurate FPGA based hardware accelerator used to simulate operations of a device-under-test (DUT). In one embodiment, the DUT includes multiple device clocks generating multiple device clock signals at multiple frequencies and at a defined frequency ratio; and the FPGA hardware accelerator includes multiple accelerator clocks generating multiple accelerator clock signals to operate the FPGA hardware accelerator to simulate the operations of the DUT. In one embodiment, operations of the DUT are mapped to the FPGA hardware accelerator, and the accelerator clock signals are generated at multiple frequencies and at the defined frequency ratio of the frequencies of the multiple device clocks, to maintain cycle accuracy between the DUT and the FPGA hardware accelerator. In an embodiment, the FPGA hardware accelerator may be used to control the frequencies of the multiple device clocks.

  4. Cycle accurate and cycle reproducible memory for an FPGA based hardware accelerator

    DOEpatents

    Asaad, Sameh W.; Kapur, Mohit

    2016-03-15

    A method, system and computer program product are disclosed for using a Field Programmable Gate Array (FPGA) to simulate operations of a device under test (DUT). The DUT includes a device memory having a first number of input ports, and the FPGA is associated with a target memory having a second number of input ports, the second number being less than the first number. In one embodiment, a given set of inputs is applied to the device memory at a frequency Fd and in a defined cycle of time, and the given set of inputs is applied to the target memory at a frequency Ft. Ft is greater than Fd and cycle accuracy is maintained between the device memory and the target memory. In an embodiment, a cycle accurate model of the DUT memory is created by separating the DUT memory interface protocol from the target memory storage array.
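
    The frequency relationship described above can be illustrated with a small sketch: if the DUT memory has more ports than the target memory, the target memory is clocked fast enough to time-multiplex all DUT accesses into one device cycle. The function below is a generic illustration of that idea, not the patented design; the port counts and the 100 MHz device clock are made-up examples.

        import math

        def target_memory_frequency(f_device_hz: float, dut_ports: int, target_ports: int) -> float:
            """Minimum target-memory clock that lets a memory with fewer ports serve
            all DUT port accesses within one device clock cycle (illustrative only)."""
            accesses_per_device_cycle = math.ceil(dut_ports / target_ports)
            return f_device_hz * accesses_per_device_cycle

        # Example: a 4-port DUT memory emulated with a single-port FPGA block RAM
        # while the device clock Fd runs at 100 MHz -> Ft must be at least 400 MHz.
        print(target_memory_frequency(100e6, dut_ports=4, target_ports=1) / 1e6, "MHz")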

  5. A cricoid cartilage compression device for the accurate and reproducible application of cricoid pressure.

    PubMed

    Taylor, R J; Smurthwaite, G; Mehmood, I; Kitchen, G B; Baker, R D

    2015-01-01

    We describe the development and laboratory assessment of a refined prototype tactile feedback device for the safe and accurate application of cricoid pressure. We recruited 20 operating department practitioners and compared their performance of cricoid pressure on a training simulator using both the device and a manual unaided technique. The device significantly reduced the spread of the applied force: average (SE) root mean squared error decreased from 8.23 (0.48) N to 5.23 (0.32) N (p < 0.001). The average (SE) upwards bias in applied force also decreased, from 2.30 (0.74) N to 0.88 (0.48) N (p < 0.01). Most importantly, the percentage of force applications that deviated from target by more than 10 N decreased from 18% to 7% (p < 0.01). The device requires no prior training, is cheap to manufacture, is single-use and requires no power to operate, whilst ensuring that the correct force is consistently applied. PMID:25267415
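
    The summary statistics quoted above are straightforward to compute from raw force recordings. The sketch below uses invented force samples and assumes a 30 N target (a commonly recommended cricoid-pressure force, used here purely for illustration):

        import numpy as np

        target_n = 30.0                      # assumed target cricoid force in newtons
        applied = np.array([28.4, 33.1, 25.9, 41.2, 30.8, 19.5, 31.7, 29.3])  # dummy data

        errors = applied - target_n
        rmse = np.sqrt(np.mean(errors ** 2))            # spread about the target
        bias = np.mean(errors)                          # systematic over/under application
        pct_off_by_10 = 100 * np.mean(np.abs(errors) > 10.0)

        print(f"RMSE = {rmse:.2f} N, bias = {bias:.2f} N, "
              f"{pct_off_by_10:.0f}% of applications off target by > 10 N")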

  6. Nebulizer calibration using lithium chloride: an accurate, reproducible and user-friendly method.

    PubMed

    Ward, R J; Reid, D W; Leonard, R F; Johns, D P; Walters, E H

    1998-04-01

    Conventional gravimetric (weight loss) calibration of jet nebulizers overestimates their aerosol output by up to 80% due to unaccounted evaporative loss. We examined two methods of measuring true aerosol output from jet nebulizers. A new adaptation of a widely available clinical assay for lithium (determined by flame photometry, LiCl method) was compared to an existing electrochemical method based on fluoride detection (NaF method). The agreement between the two methods and the repeatability of each method were examined. Ten Mefar jet nebulizers were studied using a Mefar MK3 inhalation dosimeter. There was no significant difference between the two methods (p=0.76) with mean aerosol output of the 10 nebulizers being 7.40 mg x s(-1) (SD 1.06; range 5.86-9.36 mg x s(-1)) for the NaF method and 7.27 mg x s(-1) (SD 0.82; range 5.52-8.26 mg x s(-1)) for the LiCl method. The LiCl method had a coefficient of repeatability of 1.3 mg x s(-1) compared with 3.7 mg x s(-1) for the NaF method. The LiCl method accurately measured true aerosol output and was considerably easier to use. It was also more repeatable, and hence more precise, than the NaF method. Because the LiCl method uses an assay that is routinely available from hospital biochemistry laboratories, it is easy to use and, thus, can readily be adopted by busy respiratory function departments. PMID:9623700
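
    For readers unfamiliar with the repeatability statistic quoted above, the sketch below shows one common Bland-Altman-style calculation from paired repeat measurements; the values are invented and the exact definition used in the paper may differ:

        import numpy as np

        # Paired aerosol-output measurements (same nebulizers, same method, two runs);
        # the values are invented.
        run1 = np.array([7.1, 6.8, 8.2, 7.5, 6.9, 7.8, 8.0, 7.3, 6.5, 7.9])
        run2 = np.array([7.3, 6.6, 8.1, 7.9, 7.0, 7.5, 8.3, 7.2, 6.7, 7.7])

        # One common convention: coefficient of repeatability = 1.96 x the standard
        # deviation of the differences between repeated measurements (smaller = better).
        cr = 1.96 * np.std(run1 - run2, ddof=1)
        print(f"coefficient of repeatability: {cr:.2f} (same units as the measurements)")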

  7. Randomized block experimental designs can increase the power and reproducibility of laboratory animal experiments.

    PubMed

    Festing, Michael F W

    2014-01-01

    Randomized block experimental designs have been widely used in agricultural and industrial research for many decades. Usually they are more powerful, have higher external validity, are less subject to bias, and produce more reproducible results than the completely randomized designs typically used in research involving laboratory animals. Reproducibility can be further increased by using time as a blocking factor. These benefits can be achieved at no extra cost. A small experiment investigating the effect of an antioxidant on the activity of a liver enzyme in four inbred mouse strains, which had two replications (blocks) separated by a period of two months, illustrates this approach. The widespread failure to use these designs more widely in research involving laboratory animals has probably led to a substantial waste of animals, money, and scientific resources and slowed down the development of new treatments for human and animal diseases. PMID:25541548
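
    A small sketch of how a randomized block design might be generated and analysed with a two-way ANOVA (treatment plus block). The strains, block labels, effect sizes and data are hypothetical and are not taken from the paper:

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf
        from statsmodels.stats.anova import anova_lm

        rng = np.random.default_rng(1)
        rows = []
        for block in ["block1", "block2"]:            # e.g. replications two months apart
            block_shift = rng.normal(0, 0.5)          # whatever differs between time blocks
            for strain in ["A", "B", "C", "D"]:
                for treatment in ["control", "antioxidant"]:
                    y = (10.0
                         + (0.8 if treatment == "antioxidant" else 0.0)
                         + block_shift
                         + rng.normal(0, 0.4))
                    rows.append(dict(block=block, strain=strain, treatment=treatment, activity=y))
        df = pd.DataFrame(rows)

        # Blocking enters the model as a factor, removing block-to-block variation
        # from the error term and so increasing power for the treatment comparison.
        model = smf.ols("activity ~ C(treatment) + C(strain) + C(block)", data=df).fit()
        print(anova_lm(model))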

  8. Accurate and reproducible detection of proteins in water using an extended-gate type organic transistor biosensor

    NASA Astrophysics Data System (ADS)

    Minamiki, Tsukuru; Minami, Tsuyoshi; Kurita, Ryoji; Niwa, Osamu; Wakida, Shin-ichi; Fukuda, Kenjiro; Kumaki, Daisuke; Tokito, Shizuo

    2014-06-01

    In this Letter, we describe an accurate antibody detection method using a fabricated extended-gate type organic field-effect transistor (OFET), which can be operated below 3 V. The protein-sensing portion of the designed device is the gate electrode functionalized with streptavidin. Streptavidin possesses high molecular recognition ability for biotin, which specifically allows for the detection of biotinylated proteins. Here, we attempted to detect biotinylated immunoglobulin G (IgG) and observed a shift in the threshold voltage of the OFET upon the addition of the antibody in an aqueous solution with a competing bovine serum albumin interferent. The detection limit for the biotinylated IgG was 8 nM, which indicates the potential utility of the designed device in healthcare applications.

  9. Reproducibility and Variability of the Cost Functions Reconstructed from Experimental Recordings in Multi-Finger Prehension

    PubMed Central

    Niu, Xun; Latash, Mark L.; Zatsiorsky, Vladimir M.

    2012-01-01

    The main goal of the study is to examine whether the cost (objective) functions reconstructed from experimental recordings in multi-finger prehension tasks are reproducible over time, i.e., whether the functions reflect stable preferences of the subjects and can be considered personal characteristics of motor coordination. Young, healthy participants grasped an instrumented handle with varied values of external torque, load and target grasping force and repeated the trials on three days: Day 1, Day 2, and Day 7. By following Analytical Inverse Optimization (ANIO) computation procedures, the cost functions for individual subjects were reconstructed from the experimental recordings (individual finger forces) for each day. The cost functions represented second-order polynomials of finger forces with non-zero linear terms. To check whether the obtained cost functions were reproducible over time, a cross-validation was performed: a cost function obtained on Day i was applied to experimental data observed on Day j (i≠j). In spite of the observed day-to-day variability of the performance and the cost functions, the ANIO reconstructed cost functions were found to be reproducible over time: application of a cost function Ci to the data of Day j (i≠j) resulted in smaller deviations from the experimental observations than using other commonly used cost functions. Other findings are: (a) The 2nd order coefficients Ki of the cost function showed negative linear relations with finger force magnitudes. This fact may be interpreted as encouraging involvement of stronger fingers in tasks requiring higher total force magnitude production. (b) The finger forces were distributed on a 2-dimensional plane in the 4-dimensional finger force space, which has been confirmed for all subjects and all testing sessions. (c) The discovered principal components in the principal component analysis of the finger forces agreed well with the principle of superposition, i.e. the complex action of…

  10. Reproducible Science

    PubMed Central

    Casadevall, Arturo; Fang, Ferric C.

    2010-01-01

    The reproducibility of an experimental result is a fundamental assumption in science. Yet, results that are merely confirmatory of previous findings are given low priority and can be difficult to publish. Furthermore, the complex and chaotic nature of biological systems imposes limitations on the replicability of scientific experiments. This essay explores the importance and limits of reproducibility in scientific manuscripts. PMID:20876290

  11. Robust Heterogeneous Anisotropic Elastic Network Model Precisely Reproduces the Experimental B-factors of Biomolecules.

    PubMed

    Xia, Fei; Tong, Dudu; Lu, Lanyuan

    2013-08-13

    A computational method called the progressive fluctuation matching (PFM) is developed for constructing robust heterogeneous anisotropic network models (HANMs) for biomolecular systems. An HANM derived through the PFM approach consists of harmonic springs with realistic positive force constants, and yields the calculated B-factors that are basically identical to the experimental ones. For the four tested protein systems including crambin, trypsin inhibitor, HIV-1 protease, and lysozyme, the root-mean-square deviations between the experimental and the computed B-factors are only 0.060, 0.095, 0.247, and 0.049 Å², respectively, and the correlation coefficients are 0.99 for all. By comparing the HANM/ANM normal modes to their counterparts derived from both an atomistic force field and an NMR structure ensemble, it is found that HANM may provide more accurate results on protein dynamics. PMID:26584122
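
    The two figures of merit quoted above, the root-mean-square deviation and the correlation between experimental and computed B-factors, are simple to compute; a sketch with placeholder per-residue values:

        import numpy as np

        # Placeholder per-residue B-factors; in practice b_exp would come from the PDB
        # entry and b_calc from the normal-mode (HANM/ANM) calculation.
        b_exp = np.array([12.1, 15.4, 18.9, 22.3, 17.6, 14.2])
        b_calc = np.array([12.0, 15.7, 18.5, 22.6, 17.4, 14.3])

        rmsd = np.sqrt(np.mean((b_exp - b_calc) ** 2))   # in Angstrom^2
        corr = np.corrcoef(b_exp, b_calc)[0, 1]          # Pearson correlation coefficient
        print(f"RMSD = {rmsd:.3f} A^2, r = {corr:.2f}")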

  12. Experimental and Numerical Investigation of Forging Process to Reproduce a 3D Aluminium Foam Complex Shape

    SciTech Connect

    Filice, Luigino; Gagliardi, Francesco; Umbrello, Domenico; Shivpuri, Rajiv

    2007-05-17

    Metallic foams represent one of the most exciting materials introduced into the manufacturing scenario in recent years. In the study addressed here, experimental and numerical investigations of the forging of a simple foam billet into complex sculptured parts were carried out. In particular, the deformation behavior of metallic foams and the development of density gradients were investigated through a series of experimental forging tests in order to produce a selected portion of a hip prosthesis. The human bone replacement was chosen as a case study due to its industrial demand and its particular 3D complex shape. A finite element code (Deform 3D) was utilized for modeling the foam behavior during the forging process, and an accurate material rheology description was used based on a porous material model which includes the measured local density. Once the effectiveness of the utilized finite element model was verified through comparison with the experimental evidence, a numerical study of the influence of the foam density was carried out. The numerical results show that the initial billet density plays an important role in the prediction of the final shape, the optimization of the flash, and the estimation of the punch load.

  13. Experimental and Numerical Investigation of Forging Process to Reproduce a 3D Aluminium Foam Complex Shape

    NASA Astrophysics Data System (ADS)

    Filice, Luigino; Gagliardi, Francesco; Shivpuri, Rajiv; Umbrello, Domenico

    2007-05-01

    Metallic foams represent one of the most exciting materials introduced into the manufacturing scenario in recent years. In the study addressed here, experimental and numerical investigations of the forging of a simple foam billet into complex sculptured parts were carried out. In particular, the deformation behavior of metallic foams and the development of density gradients were investigated through a series of experimental forging tests in order to produce a selected portion of a hip prosthesis. The human bone replacement was chosen as a case study due to its industrial demand and its particular 3D complex shape. A finite element code (Deform 3D®) was utilized for modeling the foam behavior during the forging process, and an accurate material rheology description was used based on a porous material model which includes the measured local density. Once the effectiveness of the utilized finite element model was verified through comparison with the experimental evidence, a numerical study of the influence of the foam density was carried out. The numerical results show that the initial billet density plays an important role in the prediction of the final shape, the optimization of the flash, and the estimation of the punch load.

  14. Accurate theoretical and experimental characterization of optical grating coupler.

    PubMed

    Fesharaki, Faezeh; Hossain, Nadir; Vigne, Sebastien; Chaker, Mohamed; Wu, Ke

    2016-09-01

    Periodic structures, acting as reflectors, filters, and couplers, are fundamental building blocks in many optical devices. In this paper, a three-dimensional simulation of a grating coupler, a well-known periodic structure, is conducted. Guided waves and leakage characteristics of an out-of-plane grating coupler are studied in detail, and its coupling efficiency is examined. Furthermore, a numerical calibration analysis is applied through a commercial software package on the basis of a full-wave finite-element method to calculate the complex propagation constant of the structure and to evaluate the radiation pattern. For experimental evaluation, an optimized grating coupler is fabricated using electron-beam lithography and plasma etching. An excellent agreement between simulations and measurements is observed, thereby validating the demonstrated method. PMID:27607706

  15. Calibrated simulations of Z opacity experiments that reproduce the experimentally measured plasma conditions

    NASA Astrophysics Data System (ADS)

    Nagayama, T.; Bailey, J. E.; Loisel, G.; Rochau, G. A.; MacFarlane, J. J.; Golovkin, I.

    2016-02-01

    Recently, frequency-resolved iron opacity measurements at electron temperatures of 170-200 eV and electron densities of (0.7-4.0) × 10²² cm⁻³ revealed a 30-400% disagreement with the calculated opacities [J. E. Bailey et al., Nature (London) 517, 56 (2015), 10.1038/nature14048]. The discrepancies have a high impact on astrophysics, atomic physics, and high-energy density physics, and it is important to verify our understanding of the experimental platform with simulations. Reliable simulations are challenging because the temporal and spatial evolution of the source radiation and of the sample plasma are both complex and incompletely diagnosed. In this article, we describe simulations that reproduce the measured temperature and density in recent iron opacity experiments performed at the Sandia National Laboratories Z facility. The time-dependent spectral irradiance at the sample is estimated using the measured time- and space-dependent source radiation distribution, in situ source-to-sample distance measurements, and a three-dimensional (3D) view-factor code. The inferred spectral irradiance is used to drive 1D sample radiation hydrodynamics simulations. The images recorded by slit-imaged space-resolved spectrometers are modeled by solving radiation transport of the source radiation through the sample. We find that the same drive radiation time history successfully reproduces the measured plasma conditions for eight different opacity experiments. These results provide a quantitative physical explanation for the observed dependence of both temperature and density on the sample configuration. Simulated spectral images for the experiments without the FeMg sample show quantitative agreement with the measured spectral images. The agreement in spectral profile, spatial profile, and brightness provides further confidence in our understanding of the backlight-radiation time history and image formation. These simulations bridge the static-uniform picture of the data…

  16. Calibrated simulations of Z opacity experiments that reproduce the experimentally measured plasma conditions.

    PubMed

    Nagayama, T; Bailey, J E; Loisel, G; Rochau, G A; MacFarlane, J J; Golovkin, I

    2016-02-01

    Recently, frequency-resolved iron opacity measurements at electron temperatures of 170-200 eV and electron densities of (0.7-4.0) × 10²² cm⁻³ revealed a 30-400% disagreement with the calculated opacities [J. E. Bailey et al., Nature (London) 517, 56 (2015)]. The discrepancies have a high impact on astrophysics, atomic physics, and high-energy density physics, and it is important to verify our understanding of the experimental platform with simulations. Reliable simulations are challenging because the temporal and spatial evolution of the source radiation and of the sample plasma are both complex and incompletely diagnosed. In this article, we describe simulations that reproduce the measured temperature and density in recent iron opacity experiments performed at the Sandia National Laboratories Z facility. The time-dependent spectral irradiance at the sample is estimated using the measured time- and space-dependent source radiation distribution, in situ source-to-sample distance measurements, and a three-dimensional (3D) view-factor code. The inferred spectral irradiance is used to drive 1D sample radiation hydrodynamics simulations. The images recorded by slit-imaged space-resolved spectrometers are modeled by solving radiation transport of the source radiation through the sample. We find that the same drive radiation time history successfully reproduces the measured plasma conditions for eight different opacity experiments. These results provide a quantitative physical explanation for the observed dependence of both temperature and density on the sample configuration. Simulated spectral images for the experiments without the FeMg sample show quantitative agreement with the measured spectral images. The agreement in spectral profile, spatial profile, and brightness provides further confidence in our understanding of the backlight-radiation time history and image formation. These simulations bridge the static-uniform picture of the data interpretation and the…

  17. Calibrated simulations of Z opacity experiments that reproduce the experimentally measured plasma conditions

    DOE PAGES Beta

    Nagayama, T.; Bailey, J. E.; Loisel, G.; Rochau, G. A.; MacFarlane, J. J.; Golovkin, I.

    2016-02-05

    Recently, frequency-resolved iron opacity measurements at electron temperatures of 170–200 eV and electron densities of (0.7–4.0) × 10²² cm⁻³ revealed a 30–400% disagreement with the calculated opacities [J. E. Bailey et al., Nature (London) 517, 56 (2015)]. The discrepancies have a high impact on astrophysics, atomic physics, and high-energy density physics, and it is important to verify our understanding of the experimental platform with simulations. Reliable simulations are challenging because the temporal and spatial evolution of the source radiation and of the sample plasma are both complex and incompletely diagnosed. In this article, we describe simulations that reproduce the measured temperature and density in recent iron opacity experiments performed at the Sandia National Laboratories Z facility. The time-dependent spectral irradiance at the sample is estimated using the measured time- and space-dependent source radiation distribution, in situ source-to-sample distance measurements, and a three-dimensional (3D) view-factor code. The inferred spectral irradiance is used to drive 1D sample radiation hydrodynamics simulations. The images recorded by slit-imaged space-resolved spectrometers are modeled by solving radiation transport of the source radiation through the sample. We find that the same drive radiation time history successfully reproduces the measured plasma conditions for eight different opacity experiments. These results provide a quantitative physical explanation for the observed dependence of both temperature and density on the sample configuration. Simulated spectral images for the experiments without the FeMg sample show quantitative agreement with the measured spectral images. The agreement in spectral profile, spatial profile, and brightness provides further confidence in our understanding of the backlight-radiation time history and image formation. Furthermore, these simulations bridge the static-uniform picture of the…

  18. [Improved reproducibility of contrast echocardiography by SH U 454. Experimental studies using digital subtraction echocardiography].

    PubMed

    Grube, E; Fritzsch, T

    1986-06-01

    The right heart chambers of 10 animals were contrasted by conventional echo-contrast media (NaCl, CO2, H2O2, indocyanine green (ICG), haemaccel) and a newly developed echo-contrast medium (SH U 454) and studied by 2-D echocardiography. By means of digital subtraction echocardiography (DSE) endocardial borders were defined automatically, and the results were compared with the manual input of endocardial borders of original and contrast echocardiograms. The area enclosed by these borders served as the basis for the calculation of reproducibility (in %) and correlations. The following correlation coefficients (r) and SEE were calculated between the areas defined by the different contrast media and DSE and manually derived borders: r = 0.85, 3.98 cm² (ICG), and 0.89, 1.00 cm² (haemaccel). The best results were obtained using SH U 454 in concentrations between 100 and 300 mg/ml. The correlation coefficients were in the range of r = 0.95 and 0.98 with an SEE of 0.21 to 0.56 cm² between manually and automatically derived contours. Comparing the reproducibility of data between the different evaluation methods we found the following results: manual input of endocardial borders in original echocardiograms 12.3%-16.9%; manual definition of endocardial borders in contrast echocardiograms 2.0% (SH U 454) - 15.7% (CO2); automatic contour finding in original echocardiograms 8.6%-28.9% (mean 21.6%); automatic definition of endocardium by DSE in contrast echocardiograms 7.6% (ICG) - 0.9% (SH U 454, 300 mg/ml). Our results demonstrate that digital subtraction echocardiography is a simple and safe procedure to define endocardial contours if echo contrast media lead to a uniform and homogeneous opacification of the left and right cardiac cavities.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:3529670

  19. Elusive reproducibility.

    PubMed

    Gori, Gio Batta

    2014-08-01

    Reproducibility remains a mirage for many biomedical studies because inherent experimental uncertainties generate idiosyncratic outcomes. The authentication and error rates of primary empirical data are often elusive, while multifactorial confounders beset experimental setups. Substantive methodological remedies are difficult to conceive, signifying that many biomedical studies yield more or less plausible results, depending on the attending uncertainties. Real life applications of those results remain problematic, with important exceptions for counterfactual field validations of strong experimental signals, notably for some vaccines and drugs, and for certain safety and occupational measures. It is argued that industrial, commercial and public policies and regulations could not ethically rely on unreliable biomedical results; rather, they should be rationally grounded on transparent cost-benefit tradeoffs. PMID:24882687

  20. Experimentally reproduced textures and mineral chemistries of high-titanium mare basalts

    NASA Technical Reports Server (NTRS)

    Usselman, T. M.; Lofgren, G. E.; Williams, R. J.; Donaldson, C. H.

    1975-01-01

    Many of the textures, morphologies, and mineral chemistries of the high-titanium mare basalts have been experimentally duplicated using single-stage cooling histories. Lunar high-titanium mare basalts are modeled in a 1 m thick gravitationally differentiating flow based on cooling rates, thermal models, and modal olivine contents. The low-pressure equilibrium phase relations of a synthetic high-titanium basalt composition were investigated as a function of oxygen fugacity, and petrographic criteria are developed for the recognition of phenocrysts which were present in the liquid at the time of eruption.

  21. An easy, rapid, and reproducible way to create a split-thickness wound for experimental purposes.

    PubMed

    Gümüş, Nazim; Özkaya, Neşe Kurt; Bulut, Hüseyin Eray; Yilmaz, Sarper

    2014-09-01

    …The wounds in group 3 were significantly deeper than those in groups 1 and 2. In all groups, the mean epidermal thickness at the wound surface differed significantly from that of healthy skin. The healing times of the wounds also differed significantly between the groups. Creation of a split-skin wound using the waterjet system provides a wound of reproducible size and depth in a standardized and rapid manner. Moreover, it allows precise and controlled wound creation in rat skin. PMID:25102393

  22. Two-dimensional discrete element models of debris avalanches: Parameterization and the reproducibility of experimental results

    NASA Astrophysics Data System (ADS)

    Banton, J.; Villard, P.; Jongmans, D.; Scavia, C.

    2009-11-01

    Application of the discrete element method (DEM) to model avalanches of granular materials requires determining the correct geometric and rheological parameters for and between the particles as well as for the basal surface. The use of spherical (circular in 2-D) particles enhances particle rolling, yielding excessive runout values. The solution usually adopted to correct this effect is to introduce a drag force which artificially slows down the particle velocities. The aim of this study is to test the capability of the DEM to simulate well-controlled unsteady channelized granular flows, considering the measured properties of the particles and of the basal surface which naturally contribute to dissipate energy. We first performed a parametric analysis on a simple 2-D model in order to estimate the influence of particle shape, friction parameters, and restitution coefficients on the dynamics of the flow and on the deposit geometry. We then simulated three channelized laboratory experiments performed with two materials and two bed linings. Using the geometrical layout and the values of the mechanical parameters provided by the authors, we obtained a remarkable agreement between the observed and 2-D simulated deposit shapes for the three experiments. Also, the computed mass evolution with time was very consistent with the experimental snapshots in all cases. These results highlight the capability of the DEM technique for modeling avalanches of granular material when the particle shape as well as the friction and restitution coefficients are properly considered.

  23. Experimental Evolution under Fluctuating Thermal Conditions Does Not Reproduce Patterns of Adaptive Clinal Differentiation in Drosophila melanogaster.

    PubMed

    Kellermann, Vanessa; Hoffmann, Ary A; Kristensen, Torsten Nygaard; Moghadam, Neda Nasiri; Loeschcke, Volker

    2015-11-01

    Experimental evolution can be a useful tool for testing the impact of environmental factors on adaptive changes in populations, and this approach is being increasingly used to understand the potential for evolutionary responses in populations under changing climates. However, selective factors will often be more complex in natural populations than in laboratory environments and produce different patterns of adaptive differentiation. Here we test the ability of laboratory experimental evolution under different temperature cycles to reproduce well-known patterns of clinal variation in Drosophila melanogaster. Six fluctuating thermal regimes mimicking the natural temperature conditions along the east coast of Australia were initiated. Contrary to expectations based on field patterns, there was no evidence for adaptation to thermal regimes as reflected by changes in cold and heat resistance after 1-3 years of laboratory natural selection. While laboratory evolution led to changes in starvation resistance, development time, and body size, patterns were not consistent with those seen in natural populations. These findings highlight the complexity of factors affecting trait evolution in natural populations and indicate that caution is required when inferring likely evolutionary responses from the outcome of experimental evolution studies. PMID:26655772

  24. All-atom molecular dynamics analysis of multi-peptide systems reproduces peptide solubility in line with experimental observations

    PubMed Central

    Kuroda, Yutaka; Suenaga, Atsushi; Sato, Yuji; Kosuda, Satoshi; Taiji, Makoto

    2016-01-01

    In order to investigate the contribution of individual amino acids to protein and peptide solubility, we carried out 100 ns molecular dynamics (MD) simulations of 10⁶ Å³ cubic boxes containing ~3 × 10⁴ water molecules and 27 tetra-peptides regularly positioned at 23 Å from each other and composed of a single amino acid type for all natural amino acids but cysteine and glycine. The calculations were performed using Amber with a standard force field on a special purpose MDGRAPE-3 computer, without introducing any “artificial” hydrophobic interactions. Tetra-peptides composed of I, V, L, M, N, Q, F, W, Y, and H formed large amorphous clusters, and those containing A, P, S, and T formed smaller ones. Tetra-peptides made of D, E, K, and R did not cluster at all. These observations correlated well with experimental solubility tendencies as well as hydrophobicity scales with correlation coefficients of 0.5 to > 0.9. Repulsive Coulomb interactions were dominant in ensuring high solubility, whereas both Coulomb and van der Waals (vdW) energies contributed to the aggregations of low solubility amino acids. Overall, this very first all-atom molecular dynamics simulation of a multi-peptide system appears to reproduce the basic properties of peptide solubility, essentially in line with experimental observations. PMID:26817663

  25. Continuous nanoflow-scanning electrochemical microscopy: voltammetric characterization and application for accurate and reproducible imaging of enzyme-labeled protein microarrays.

    PubMed

    Kai, Tianhan; Chen, Shu; Monterroso, Estuardo; Zhou, Feimeng

    2015-04-21

    The coupling of scanning electrochemical microscopy (SECM) to a continuous nanoflow (CNF) system is accomplished with the use of a microconcentric ring electrode/injector probe. The gold microring electrode encapsulated by a glass sheath is robust and can be beveled and polished. The CNF system, comprising a precision gas displacement pump and a rotary valve, is capable of delivering solution to the center of the SECM probe in the range of 1-150 nL/min. Major advantages of the CNF-SECM imaging mode over the conventional SECM generation/collection (G/C) mode include higher imaging resolution, immunity from interferences by species in the bulk solution or at other sites of the substrate, elimination of the feedback current that could interfere with the G/C data interpretation, and versatility of initiating surface reactions/processes via introducing different reactants into the flowing stream. Parameters such as flow rates, probe/substrate separations, and collection efficiencies are examined and optimized. Higher resolution, reproducibility, and accuracy are demonstrated through the application of CNF-SECM to horseradish peroxidase (HRP)-amplified imaging of protein microarrays. By flowing H2O2 and ferrocenemethanol through the injector and detecting the surface-generated ferriceniummethanol, human IgG spots covered with HRP-labeled antihuman IgG can be detected in the range of 13 nM-1.333 μM with a detection limit of 3.0 nM. In addition, consistent images of microarray spots for selective and high-density detection of analytes can be attained. PMID:25831146

  26. Molecular speciated isotope dilution mass spectrometric methods for accurate, reproducible and direct quantification of reduced, oxidized and total glutathione in biological samples.

    PubMed

    Fahrenholz, Timothy; Wolle, Mesay Mulugeta; Kingston, H M Skip; Faber, Scott; Kern, John C; Pamuku, Matt; Miller, Logan; Chatragadda, Hemasudha; Kogelnik, Andreas

    2015-01-20

    Novel protocols were developed to accurately quantify reduced (GSH), oxidized (GSSG) and total (tGSH) glutathione in biological samples using molecular speciated isotope dilution mass spectrometry (SIDMS). For GSH and GSSG measurement, the sample was spiked with isotopically enriched analogues of the analytes (³¹⁰GSH and ⁶¹⁶GSSG), along with N-ethylmaleimide (NEM), and treated with acetonitrile to solubilize the endogenous analytes via protein precipitation and equilibrate them with the spikes. The supernatant was analyzed by liquid chromatography-tandem mass spectrometry (LC-MS/MS), and the analytes were quantified with simultaneous tracking and correction for auto-oxidation of GSH to GSSG. For tGSH assay, a ³¹⁰GSH-spiked sample was treated with dithiothreitol (DTT) to convert disulfide-bonded glutathione to GSH. After removing the protein, the supernatant was analyzed by LC-MS/MS and the analyte was quantified by single-spiking isotope dilution mass spectrometry (IDMS). The mathematical relationships in IDMS and SIDMS quantifications are based on isotopic ratios and do not involve calibration curves. The protocols were validated using spike recovery tests and by analyzing synthetic standard solutions. Red blood cell (RBC) and saliva samples obtained from healthy subjects, and whole blood samples collected and shipped from a remote location were analyzed. The concentrations of tGSH in the RBC and whole blood samples were 2 orders of magnitude higher than those found in saliva. The fractions of GSSG were 0.2-2.2% (RBC and blood) and 15-47% (saliva) of the free glutathione (GSH + 2×GSSG) in the corresponding samples. Up to 3% GSH was auto-oxidized to GSSG during sample workup; the highest oxidations (>1%) were in the saliva samples. PMID:25519489
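
    The statement that IDMS quantification rests on isotope ratios rather than calibration curves can be made concrete with a generic single-spike isotope dilution balance. The sketch below is derived from the basic mass balance only; the abundances and amounts are invented and it is not the paper's SIDMS equation set:

        def idms_amount(n_spike, r_measured, a_sample, b_sample, a_spike, b_spike):
            """Moles of analyte in the sample from a single-spike isotope dilution.

            a_* and b_* are the abundances of the reference and spike isotopologues in
            the natural analyte and in the enriched spike; r_measured is the measured
            reference/spike signal ratio of the blend.  Derived from the mass balance
            n_x*a_x + n_s*a_s = r * (n_x*b_x + n_s*b_s).
            """
            return n_spike * (r_measured * b_spike - a_spike) / (a_sample - r_measured * b_sample)

        # Illustrative numbers only (not real glutathione isotopologue abundances).
        n_x = idms_amount(n_spike=1.0e-9, r_measured=0.85,
                          a_sample=0.98, b_sample=0.02,
                          a_spike=0.05, b_spike=0.95)
        print(f"analyte amount: {n_x:.2e} mol")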

  27. Turtle utricle dynamic behavior using a combined anatomically accurate model and experimentally measured hair bundle stiffness

    PubMed Central

    Davis, J.L.; Grant, J.W.

    2014-01-01

    Anatomically correct turtle utricle geometry was incorporated into two finite element models. The geometrically accurate model included an appropriately shaped macular surface and otoconial layer, and compact gel and column filament (or shear) layer thicknesses and thickness distributions. The first model included a shear layer in which the effect of hair bundle stiffness was included as part of the shear layer modulus. This solid model’s undamped natural frequency was matched to an experimentally measured value. This frequency match established a realistic value of the effective shear layer Young’s modulus of 16 Pascals. We feel this is the most accurate prediction of this shear layer modulus and fits with other estimates (Kondrachuk, 2001b). The second model incorporated only beam elements in the shear layer to represent hair cell bundle stiffness. The beam element stiffnesses were further distributed to represent their location on the neuroepithelial surface. Experimentally measured mean hair bundle stiffness values from the striola were used in the striolar region, and mean extrastriolar hair bundle stiffness values were used in the extrastriolar region. The results from this second model indicated that hair cell bundle stiffness contributes approximately 40% to the overall stiffness of the shear layer–hair cell bundle complex. This analysis shows that high-mass saccules, in general, achieve high gain at the sacrifice of frequency bandwidth. We propose that the mechanism by which this can be achieved is an increase in otoconial layer mass. The theoretical difference in gain (deflection per acceleration) is shown for saccules with large otoconial layer mass relative to saccules and utricles with small otoconial layer mass. Also discussed is the necessity for these high-mass saccules to increase their overall system shear layer stiffness. Undamped natural frequencies and mode shapes for these sensors are shown. PMID:25445820

  28. Identification and Evaluation of Reference Genes for Accurate Transcription Normalization in Safflower under Different Experimental Conditions

    PubMed Central

    Li, Dandan; Hu, Bo; Wang, Qing; Liu, Hongchang; Pan, Feng; Wu, Wei

    2015-01-01

    Safflower (Carthamus tinctorius L.) has received a significant amount of attention as a medicinal plant and oilseed crop. Gene expression studies provide a theoretical molecular biology foundation for improving new traits and developing new cultivars. Real-time quantitative PCR (RT-qPCR) has become a crucial approach for gene expression analysis. In addition, appropriate reference genes (RGs) are essential for accurate and rapid relative quantification analysis of gene expression. In this study, fifteen candidate RGs involved in multiple metabolic pathways of plants were finally selected and validated under different experimental treatments, at different seed development stages and in different cultivars and tissues for real-time PCR experiments. These genes were ABCS, 60SRPL10, RANBP1, UBCL, MFC, UBCE2, EIF5A, COA, EF1-β, EF1, GAPDH, ATPS, MBF1, GTPB and GST. The suitability evaluation was executed by the geNorm and NormFinder programs. Overall, EF1, UBCE2, EIF5A, ATPS and 60SRPL10 were the most stable genes, and MBF1, as well as MFC, were the most unstable genes by geNorm and NormFinder software in all experimental samples. To verify the validation of RGs selected by the two programs, the expression analysis of 7 CtFAD2 genes in safflower seeds at different developmental stages under cold stress was executed using different RGs in RT-qPCR experiments for normalization. The results showed similar expression patterns when the most stable RGs selected by geNorm or NormFinder software were used. However, the differences were detected using the most unstable reference genes. The most stable combination of genes selected in this study will help to achieve more accurate and reliable results in a wide variety of samples in safflower. PMID:26457898
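
    A compact sketch of a geNorm-style stability value as referred to above (M is a gene's average pairwise variation with the other candidates, taken as the standard deviation of log2 expression ratios across samples); the expression matrix here is random placeholder data, not safflower measurements:

        import numpy as np

        def genorm_m(expr):
            """expr: samples x genes matrix of relative expression quantities.
            Returns a geNorm-style M value per gene (lower = more stable)."""
            log_expr = np.log2(expr)
            n_genes = expr.shape[1]
            m = np.zeros(n_genes)
            for j in range(n_genes):
                pairwise_sd = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
                               for k in range(n_genes) if k != j]
                m[j] = np.mean(pairwise_sd)
            return m

        rng = np.random.default_rng(2)
        expr = rng.lognormal(mean=0.0, sigma=0.3, size=(12, 5))   # placeholder data
        print(np.round(genorm_m(expr), 3))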

  29. The NHLBI-Sponsored Consortium for preclinicAl assESsment of cARdioprotective Therapies (CAESAR): A New Paradigm for Rigorous, Accurate, and Reproducible Evaluation of Putative Infarct-Sparing Interventions in Mice, Rabbits, and Pigs

    PubMed Central

    Jones, Steven P.; Tang, Xian-Liang; Guo, Yiru; Steenbergen, Charles; Lefer, David J.; Kukreja, Rakesh C.; Kong, Maiying; Li, Qianhong; Bhushan, Shashi; Zhu, Xiaoping; Du, Junjie; Nong, Yibing; Stowers, Heather L.; Kondo, Kazuhisa; Hunt, Gregory N.; Goodchild, Traci T.; Orr, Adam; Chang, Carlos C.; Ockaili, Ramzi; Salloum, Fadi N.; Bolli, Roberto

    2014-01-01

    …rigorous, accurate, and reproducible. PMID:25499773

  30. Developing a reproducible non-line-of-sight experimental setup for testing wireless medical device coexistence utilizing ZigBee.

    PubMed

    LaSorte, Nickolas J; Rajab, Samer A; Refai, Hazem H

    2012-11-01

    The integration of heterogeneous wireless technologies is believed to aid revolutionary healthcare delivery in hospitals and residential care. Wireless medical device coexistence is a growing concern given the ubiquity of wireless technology. In spite of this, a consensus standard that addresses risks associated with wireless heterogeneous networks has not been adopted. This paper serves as a starting point by recommending a practice for assessing the coexistence of a wireless medical device in a non-line-of-sight environment utilizing 802.15.4 in a practical, versatile, and reproducible test setup. This paper provides an extensive survey of other coexistence studies concerning 802.15.4 and 802.11 and reports on the authors' coexistence testing inside and outside an anechoic chamber. Results are compared against a non-line-of-sight test setup. Findings relative to co-channel and adjacent channel interference were consistent with results reported in the literature. PMID:22907957

  31. Host-parasite relationships in experimental airborne tuberculosis. II. Reproducible infection by means of an inoculum preserved at -70 C.

    PubMed

    Grover, A A; Kim, H K; Wiegeshaus, E H; Smith, D W

    1967-10-01

    The use of an inoculum preserved at low temperature for the infection of guinea pigs by the respiratory route was evaluated. In a preliminary study with Mycobacterium bovis (BCG), some of the conditions required for maximal recovery of viable cells stored at low temperature were examined. Survival of BCG was decreased by rapid freezing to -70 C and by storage at -20 C, but there was no decrease when BCG was frozen slowly and stored at -70 C or -196 C. In a subsequent study, the effect of storage at -70 C on viability and infectivity of M. tuberculosis (H37Rv) was considered. There was no loss of viability of H37Rv cells suspended in Dubos broth and stored 1 year at -70 C. This suspension showed no loss of infectivity as assessed by the number of primary pulmonary lesions initiated in guinea pigs. Constant viability and infectivity of a suspension stored at low temperature assures the reproducibility of the amount of infection and facilitates comparisons between experiments. This and other advantages of storage at low temperature are discussed. PMID:4964472

  32. Material response mechanisms are needed to obtain highly accurate experimental shock wave data

    NASA Astrophysics Data System (ADS)

    Forbes, Jerry

    2015-06-01

    The field of shock wave compression of matter has provided a simple set of equations relating thermodynamic and kinematic parameters that describe the conservation of mass, momentum and energy across a steady shock wave with one-dimensional flow. Well-known condensed matter shock wave experimental results will be reviewed to see whether the assumptions required for deriving these simple Rankine-Hugoniot (R-H) equations are met. Note that the material compression model is not required for deriving the 1-D conservation flow equations across a steady shock front. However, this statement is misleading from a practical experimental viewpoint, since obtaining small systematic errors in measured shock wave parameters requires the material compression and release mechanisms to be known. A brief review will be presented on systematic errors in shock wave data from common experimental techniques for fluids, elastic-plastic solids, materials with negative volume phase transitions, glass and ceramic materials, and high explosives. Issues related to time scales of experiments and quasi-steady flow will also be presented.
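
    For reference, the one-dimensional jump conditions the abstract refers to are usually written as follows for material initially at rest, with shock velocity U_s, particle velocity u_p, pressure P, specific volume V and specific internal energy E (subscripts 0 and 1 denote the unshocked and shocked states):

        \rho_0 U_s = \rho_1 (U_s - u_p)                   % conservation of mass
        P_1 - P_0 = \rho_0 U_s u_p                        % conservation of momentum
        E_1 - E_0 = \tfrac{1}{2}(P_1 + P_0)(V_0 - V_1)    % conservation of energy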

  33. The accurate measurement of second virial coefficients using self-interaction chromatography: experimental considerations.

    PubMed

    Quigley, A; Heng, J Y Y; Liddell, J M; Williams, D R

    2013-11-01

    Measurement of B22, the second virial coefficient, is an important technique for describing the solution behaviour of proteins, especially as it relates to precipitation, aggregation and crystallisation phenomena. This paper describes best practice for calculating B22 values from self-interaction chromatograms (SIC) for aqueous protein solutions. Detailed analysis of SIC peak shapes for lysozyme shows that non-Gaussian peaks are commonly encountered for SIC, with typical peak asymmetries of 10%. This asymmetry reflects a non-linear chromatographic retention process, in this case heterogeneity of the protein-protein interactions. Therefore, it is important to use centre-of-mass calculations for determining accurate retention volumes and thus B22 values. Empirical peak-maximum chromatogram analysis, often reported in the literature, can result in errors of up to 50% in B22 values. A methodology is reported here for determining both the mean and the variance in B22 from SIC experiments, which includes a correction for normal longitudinal peak broadening. The variance in B22 due to chemical effects is quantified statistically and is a measure of the heterogeneity of protein-protein interactions in solution. In the case of lysozyme, a wide range of B22 values is measured, which can vary significantly from the average B22 values. PMID:23623796
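
    The difference between the recommended centre-of-mass (first-moment) retention volume and the peak-maximum shortcut can be shown on a synthetic, slightly tailed peak; the chromatogram below is invented, and the subsequent conversion of retention volume to B22 through the usual SIC working equations is not reproduced here:

        import numpy as np

        # Synthetic, tailed chromatogram: elution volume (mL) versus detector signal.
        v = np.linspace(0.0, 10.0, 2001)
        signal = (np.exp(-0.5 * ((v - 4.0) / 0.30) ** 2)
                  + 0.35 * np.exp(-0.5 * ((v - 4.6) / 0.60) ** 2))   # tailing term

        v_peak_max = v[np.argmax(signal)]                            # empirical peak maximum
        v_centroid = np.trapz(v * signal, v) / np.trapz(signal, v)   # first moment (centre of mass)

        print(f"peak maximum: {v_peak_max:.3f} mL, centroid: {v_centroid:.3f} mL")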

  34. An experimental correction proposed for an accurate determination of mass diffusivity of wood in steady regime

    NASA Astrophysics Data System (ADS)

    Zohoun, Sylvain; Agoua, Eusèbe; Degan, Gérard; Perre, Patrick

    2002-08-01

    This paper presents an experimental study of mass diffusion in the hygroscopic region of four temperate species and three tropical ones. In order to simplify the interpretation of the phenomena, a dimensionless parameter called reduced diffusivity is defined. This parameter varies from 0 to 1. The method is first based on the determination of that parameter from mass-flux measurements, taking into account the operating conditions of the standard device (tightness, dimensional variations and easy installation of the wood samples, good stability of temperature and humidity). Secondly, the reasons why that parameter has to be corrected are presented. An abacus for this correction of the mass diffusivity of wood in the steady regime has been plotted. This work constitutes a useful advance for characterising forest species.

  35. An experimental device for accurate ultrasounds measurements in liquid foods at high pressure

    NASA Astrophysics Data System (ADS)

    Hidalgo-Baltasar, E.; Taravillo, M.; Baonza, V. G.; Sanz, P. D.; Guignon, B.

    2012-12-01

    The use of high hydrostatic pressure to ensure safe and high-quality products has markedly increased in the food industry during the last decade. Ultrasonic sensors can be employed to control such processes in the same way as they are currently used in processes carried out at ambient pressure. However, their installation, calibration and use are particularly challenging in a high pressure environment. Besides, data on the acoustic properties of food under pressure, and even of water, are quite scarce in the pressure range of interest for food treatment (namely, above 200 MPa). The objective of this work was to establish a methodology to determine the speed of sound in foods under pressure. An ultrasonic sensor using the multiple reflections method was adapted to lab-scale HHP equipment to determine the speed of sound in water between 253.15 and 348.15 K, and at pressures up to 700 MPa. The experimental speed-of-sound data were compared to the data calculated from the equation of state of water (IAPWS-95 formulation). From this analysis, the cell path calibration procedure was validated. After this calibration, the speed of sound could be determined in liquid foods with this sensor with a relative uncertainty between (0.22 and 0.32) % at a confidence level of 95 % over the whole pressure domain.
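
    The calibration described above rests on simple pulse-echo arithmetic, sketched below under the assumption of a multiple-reflection cell in which c = 2L/Δt between successive echoes; the reference sound speed is a placeholder for the IAPWS-95 value at the calibration temperature and pressure, and the delays are invented:

        # Calibration step: a known sound speed in water fixes the acoustic path length.
        c_water_ref = 1482.3     # m/s, placeholder for the IAPWS-95 value at the chosen (T, P)
        dt_water = 27.0e-6       # s, measured delay between successive echoes in water
        L = 0.5 * c_water_ref * dt_water        # one-way path length of the cell

        # Measurement step: the same cell filled with a liquid food sample.
        dt_sample = 25.4e-6      # s, measured echo spacing in the sample
        c_sample = 2.0 * L / dt_sample
        print(f"path length = {L * 1e3:.2f} mm, sample sound speed = {c_sample:.1f} m/s")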

  36. The use of experimental bending tests to more accurate numerical description of TBC damage process

    NASA Astrophysics Data System (ADS)

    Sadowski, T.; Golewski, P.

    2016-04-01

    Thermal barrier coatings (TBCs) have been extensively used in aircraft engines to protect critical engine parts such as blades and combustion chambers, which are exposed to high temperatures and a corrosive environment. The blades of turbine engines are additionally exposed to high mechanical loads. These loads are created by the high rotational speed of the rotor (30 000 rot/min), causing tensile and bending stresses. Therefore, experimental testing of coated samples is necessary in order to determine the strength properties of TBCs. Beam samples with dimensions of 50×10×2 mm were used in these studies. The TBC system consisted of a 150 μm thick bond coat (NiCoCrAlY) and a 300 μm thick top coat (YSZ) made by the APS (air plasma spray) process. Samples were tested in three-point bending under various loads. After the bending tests, the samples were subjected to microscopic observation to determine the number of cracks and their depth. These results were used to build a numerical model and calibrate material data in the Abaqus program. A brittle cracking damage model was applied for the TBC layer, which allows elements to be removed once a damage criterion is reached. Surface-based cohesive behavior was used to model the delamination that may occur at the boundary between the bond coat and the top coat.

  37. Theory of bi-molecular association dynamics in 2D for accurate model and experimental parameterization of binding rates

    NASA Astrophysics Data System (ADS)

    Yogurtcu, Osman N.; Johnson, Margaret E.

    2015-08-01

    The dynamics of association between diffusing and reacting molecular species are routinely quantified using simple rate-equation kinetics that assume both well-mixed concentrations of species and a single rate constant for parameterizing the binding rate. In two dimensions (2D), however, even when systems are well-mixed, the assumption of a single characteristic rate constant for describing association is not generally accurate, due to the properties of diffusional searching in dimensions d ≤ 2. Establishing rigorous bounds for discriminating between 2D reactive systems that will be accurately described by rate equations with a single rate constant, and those that will not, is critical for both modeling and experimentally parameterizing binding reactions restricted to surfaces such as cellular membranes. We show here that in regimes of intrinsic reaction rate (ka) and diffusion (D) parameters ka/D > 0.05, a single rate constant cannot be fit to the dynamics of concentrations of associating species independently of the initial conditions. Instead, a more sophisticated multi-parametric description than rate equations is necessary to robustly characterize bimolecular reactions from experiment. Our quantitative bounds derive from our new analysis of 2D rate behavior predicted from Smoluchowski theory. Using a recently developed single-particle reaction-diffusion algorithm, which we extend here to 2D, we are able to test and validate the predictions of Smoluchowski theory and several other theories of reversible reaction dynamics in 2D for the first time. Finally, our results also mean that simulations of reactive systems in 2D using rate equations must be undertaken with caution when reactions have ka/D > 0.05, regardless of the simulation volume. We introduce here a simple formula for an adaptive, concentration-dependent rate constant for these chemical kinetics simulations which improves on existing formulas to better capture non-equilibrium reaction dynamics from dilute…
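
    A tiny sketch of the dimensionless check implied by the bound quoted above, together with the single-rate-constant kinetics it discriminates against; the parameter values are arbitrary examples, not fitted quantities from the paper:

        from scipy.integrate import solve_ivp

        def single_rate_ok(ka, D, threshold=0.05):
            """In 2D, a single-rate-constant description is expected to hold only
            while the intrinsic-rate-to-diffusion ratio stays below the bound."""
            return (ka / D) <= threshold

        def rate_eq(t, y, k):
            # The well-mixed kinetics being tested: dA/dt = dB/dt = -k*A*B (irreversible).
            a, b = y
            return [-k * a * b, -k * a * b]

        ka, D = 0.8, 10.0    # both in area/time units (e.g. um^2/s), so ka/D is dimensionless
        print("single rate constant adequate:", single_rate_ok(ka, D))   # 0.08 > 0.05 -> False

        sol = solve_ivp(rate_eq, (0.0, 50.0), [1.0, 0.5], args=(0.05,))
        print("A(t = 50) =", round(float(sol.y[0, -1]), 4))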

  38. Development and experimental verification of a finite element method for accurate analysis of a surface acoustic wave device

    NASA Astrophysics Data System (ADS)

    Mohibul Kabir, K. M.; Matthews, Glenn I.; Sabri, Ylias M.; Russo, Salvy P.; Ippolito, Samuel J.; Bhargava, Suresh K.

    2016-03-01

    Accurate analysis of surface acoustic wave (SAW) devices is highly important due to their use in ever-growing applications in electronics, telecommunication and chemical sensing. In this study, a novel approach for analyzing the SAW devices was developed based on a series of two-dimensional finite element method (FEM) simulations, which has been experimentally verified. It was found that the frequency response of the two SAW device structures, each having slightly different bandwidth and center lobe characteristics, can be successfully obtained utilizing the current density of the electrodes via FEM simulations. The two SAW structures were based on XY Lithium Niobate (LiNbO3) substrates and had two and four electrode finger pairs in both of their interdigital transducers, respectively. Later, SAW devices were fabricated in accordance with the simulated models and their measured frequency responses were found to correlate well with the obtained simulations results. The results indicated that better match between calculated and measured frequency response can be obtained when one of the input electrode finger pairs was set at zero volts and all the current density components were taken into account when calculating the frequency response of the simulated SAW device structures.

  39. Accurate experimental and theoretical comparisons between superconductor-insulator-superconductor mixers showing weak and strong quantum effects

    NASA Technical Reports Server (NTRS)

    Mcgrath, W. R.; Richards, P. L.; Face, D. W.; Prober, D. E.; Lloyd, F. L.

    1988-01-01

    A systematic study of the gain and noise in superconductor-insulator-superconductor mixers employing Ta based, Nb based, and Pb-alloy based tunnel junctions was made. These junctions displayed both weak and strong quantum effects at a signal frequency of 33 GHz. The effects of energy gap sharpness and subgap current were investigated and are quantitatively related to mixer performance. Detailed comparisons are made of the mixing results with the predictions of a three-port model approximation to the Tucker theory. Mixer performance was measured with a novel test apparatus which is accurate enough to allow for the first quantitative tests of theoretical noise predictions. It is found that the three-port model of the Tucker theory underestimates the mixer noise temperature by a factor of about 2 for all of the mixers. In addition, predicted values of available mixer gain are in reasonable agreement with experiment when quantum effects are weak. However, as quantum effects become strong, the predicted available gain diverges to infinity, which is in sharp contrast to the experimental results. Predictions of coupled gain do not always show such divergences.

  20. Opening Reproducible Research

    NASA Astrophysics Data System (ADS)

    Nüst, Daniel; Konkol, Markus; Pebesma, Edzer; Kray, Christian; Klötgen, Stephanie; Schutzeichel, Marc; Lorenz, Jörg; Przibytzin, Holger; Kussmann, Dirk

    2016-04-01

    Open access is not only a form of publishing such that research papers become available to the general public free of charge, it also refers to a trend in science toward making the act of doing research more open and transparent. When science transitions to open access, we mean not only access to papers and to the research data being collected or generated, but also access to the data used and the procedures carried out in the research paper. Increasingly, scientific results are generated by numerical manipulation of data that were already collected, and may involve simulation experiments that are completely carried out computationally. Reproducibility of research findings, the ability to repeat experimental procedures and confirm previously found results, is at the heart of the scientific method (Pebesma, Nüst and Bivand, 2012). As opposed to the collection of experimental data in labs or nature, computational experiments lend themselves very well to reproduction. Some of the reasons why scientists do not publish data and computational procedures that allow reproduction will be hard to change, e.g. privacy concerns in the data, fear of embarrassment or of losing a competitive advantage. Other reasons, however, involve technical aspects, and include the lack of standard procedures to publish such information and the lack of benefits after publishing them. We aim to resolve these two technical aspects. We propose a system that supports the evolution of scientific publications from static papers into dynamic, executable research documents. The DFG-funded experimental project Opening Reproducible Research (ORR) aims at the main aspects of open access by improving the exchange of, facilitating productive access to, and simplifying the reuse of research results that are published over the Internet. Central to the project is a new form for creating and providing research results, the executable research compendium (ERC), which not only enables third parties to

  1. Impact of the H275Y and I223V Mutations in the Neuraminidase of the 2009 Pandemic Influenza Virus In Vitro and Evaluating Experimental Reproducibility

    PubMed Central

    Paradis, Eric G.; Pinilla, Lady Tatiana; Holder, Benjamin P.; Abed, Yacine; Boivin, Guy; Beauchemin, Catherine A.A.

    2015-01-01

    The 2009 pandemic H1N1 (H1N1pdm09) influenza virus is naturally susceptible to neuraminidase (NA) inhibitors, but mutations in the NA protein can cause oseltamivir resistance. The H275Y and I223V amino acid substitutions in the NA of the H1N1pdm09 influenza strain have been separately observed in patients exhibiting oseltamivir-resistance. Here, we apply mathematical modelling techniques to compare the fitness of the wild-type H1N1pdm09 strain relative to each of these two mutants. We find that both the H275Y and I223V mutations in the H1N1pdm09 background significantly lengthen the duration of the eclipse phase (by 2.5 h and 3.6 h, respectively), consistent with these NA mutations delaying the release of viral progeny from newly infected cells. Cells infected by H1N1pdm09 virus carrying the I223V mutation display a disadvantageous, shorter infectious lifespan (17 h shorter) than those infected with the wild-type or MUT-H275Y strains. In terms of compensating traits, the H275Y mutation in the H1N1pdm09 background results in increased virus infectiousness, as we reported previously, whereas the I223V exhibits none, leaving it overall less fit than both its wild-type counterpart and the MUT-H275Y strain. Using computer simulated competition experiments, we determine that in the presence of oseltamivir at doses even below standard therapy, both the MUT-H275Y and MUT-I223V dominate their wild-type counterpart in all aspects, and the MUT-H275Y outcompetes the MUT-I223V. The H275Y mutation should therefore be more commonly observed than the I223V mutation in circulating H1N1pdm09 strains, assuming both mutations have a similar impact or no significant impact on between-host transmission. We also show that mathematical modelling offers a relatively inexpensive and reliable means to quantify inter-experimental variability and assess the reproducibility of results. PMID:25992792
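    For readers unfamiliar with this class of models, the sketch below implements a generic target-cell-limited model with an eclipse compartment, the kind of within-host ODE model the abstract refers to; the model form and all parameter values are illustrative assumptions, not the authors' fitted strains or estimates.

    # Generic within-host influenza model with an eclipse phase (a sketch, not
    # the authors' fitted model). Lengthening tau_E mimics the delayed release
    # of progeny virus attributed to the NA mutations in the abstract above.
    import numpy as np
    from scipy.integrate import solve_ivp

    beta = 3e-5    # infection rate constant, illustrative
    tau_E = 8.0    # mean eclipse-phase duration (h), illustrative
    tau_I = 40.0   # mean infectious-cell lifespan (h), illustrative
    p = 0.05       # virion production rate per infectious cell, illustrative
    c = 0.1        # virion clearance rate (1/h), illustrative

    def rhs(t, y):
        T, E, I, V = y            # target, eclipse, infectious cells; free virus
        dT = -beta * T * V
        dE = beta * T * V - E / tau_E
        dI = E / tau_E - I / tau_I
        dV = p * I - c * V
        return [dT, dE, dI, dV]

    y0 = [4e8, 0.0, 0.0, 1.0]     # illustrative initial conditions
    sol = solve_ivp(rhs, (0.0, 240.0), y0, rtol=1e-8, atol=1e-6)
    print("peak viral load %.3g at t = %.1f h"
          % (sol.y[3].max(), sol.t[np.argmax(sol.y[3])]))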

  2. ReproPhylo: An Environment for Reproducible Phylogenomics

    PubMed Central

    Szitenberg, Amir; John, Max; Blaxter, Mark L.; Lunt, David H.

    2015-01-01

    The reproducibility of experiments is key to the scientific process, and particularly necessary for accurate reporting of analyses in data-rich fields such as phylogenomics. We present ReproPhylo, a phylogenomic analysis environment developed to ensure experimental reproducibility, to facilitate the handling of large-scale data, and to assist methodological experimentation. Reproducibility, and instantaneous repeatability, is built into the ReproPhylo system and does not require user intervention or configuration because it stores the experimental workflow as a single, serialized Python object containing explicit provenance and environment information. This ‘single file’ approach ensures the persistence of provenance across iterations of the analysis, with changes automatically managed by the version control program Git. This file and the accompanying Git repository are the primary reproducibility outputs of the program. In addition, ReproPhylo produces an extensive human-readable report and generates a comprehensive experimental archive file, both of which are suitable for submission with publications. The system facilitates thorough experimental exploration of both parameters and data. ReproPhylo is a platform-independent CC0 Python module and is easily installed as a Docker image or a WinPython self-sufficient package, with a Jupyter Notebook GUI, or as a slimmer version in a Galaxy distribution. PMID:26335558
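    The 'single serialized file plus Git' pattern described above can be illustrated in a few lines of generic Python; the snippet below is a sketch of the pattern only, not the ReproPhylo API, and the file name and workflow contents are arbitrary (it assumes it is run inside an existing Git repository).

    # Generic sketch of storing an analysis state as one serialized object and
    # versioning it with Git; this is not ReproPhylo code.
    import pickle
    import platform
    import subprocess
    import sys

    workflow = {
        "step": "alignment",
        "parameters": {"gap_open": -10, "gap_extend": -0.5},
        "environment": {"python": sys.version, "platform": platform.platform()},
    }

    with open("experiment.pkl", "wb") as fh:
        pickle.dump(workflow, fh)        # one file holds workflow + provenance

    # record this iteration of the analysis in version control
    subprocess.run(["git", "add", "experiment.pkl"], check=True)
    subprocess.run(["git", "commit", "-m", "update workflow state"], check=True)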

  3. ReproPhylo: An Environment for Reproducible Phylogenomics.

    PubMed

    Szitenberg, Amir; John, Max; Blaxter, Mark L; Lunt, David H

    2015-09-01

    The reproducibility of experiments is key to the scientific process, and particularly necessary for accurate reporting of analyses in data-rich fields such as phylogenomics. We present ReproPhylo, a phylogenomic analysis environment developed to ensure experimental reproducibility, to facilitate the handling of large-scale data, and to assist methodological experimentation. Reproducibility, and instantaneous repeatability, is built into the ReproPhylo system and does not require user intervention or configuration because it stores the experimental workflow as a single, serialized Python object containing explicit provenance and environment information. This 'single file' approach ensures the persistence of provenance across iterations of the analysis, with changes automatically managed by the version control program Git. This file and the accompanying Git repository are the primary reproducibility outputs of the program. In addition, ReproPhylo produces an extensive human-readable report and generates a comprehensive experimental archive file, both of which are suitable for submission with publications. The system facilitates thorough experimental exploration of both parameters and data. ReproPhylo is a platform-independent CC0 Python module and is easily installed as a Docker image or a WinPython self-sufficient package, with a Jupyter Notebook GUI, or as a slimmer version in a Galaxy distribution. PMID:26335558

  4. Toward the optimization of double-pulse LIBS underwater: effects of experimental parameters on the reproducibility and dynamics of laser-induced cavitation bubble.

    PubMed

    Cristoforetti, Gabriele; Tiberi, Marco; Simonelli, Andrea; Marsili, Paolo; Giammanco, Francesco

    2012-03-01

    Double-pulse laser-induced breakdown spectroscopy (LIBS) was recently proposed for the analysis of underwater samples, since it overcomes the drawbacks of rapid plasma quenching and of large continuum emission, typical of single-pulse ablation. Despite the attractiveness of the method, this approach nevertheless suffers from poor spectroscopic reproducibility, which is partially due to the limited reproducibility of the cavitation bubble induced by the first laser pulse, since the pressure and dimensions of the bubble strongly affect plasma emission. In this work, we investigated the reproducibility and the dynamics of the cavitation bubble induced on a solid target in water, and how they depend on pulse duration, energy, and wavelength, as well as on target composition. Results are discussed in terms of the effects on the laser ablation process produced by the crater formation and by the interaction of the laser pulse with floating particles and gas bubbles. This work, preliminary to the optimization of the spectroscopic signal, provides insight into the phenomena occurring during laser ablation in water, together with useful information for the choice of the laser source to be used in the apparatus. PMID:22410923

  5. Repeatability and Reproducibility of Corneal Biometric Measurements Using the Visante Omni and a Rabbit Experimental Model of Post-Surgical Corneal Ectasia

    PubMed Central

    Liu, Yu-Chi; Konstantopoulos, Aris; Riau, Andri K.; Bhayani, Raj; Lwin, Nyein C.; Teo, Ericia Pei Wen; Yam, Gary Hin Fai; Mehta, Jodhbir S.

    2015-01-01

    Purpose: To investigate the repeatability and reproducibility of the Visante Omni in obtaining topography measurements of rabbit corneas and to develop a post-surgical model of corneal ectasia. Methods: Eight rabbits were used to study the repeatability and reproducibility by assessing the intra- and interobserver bias and limits of agreement. Another nine rabbits that underwent laser in situ keratomileusis (LASIK) of different diopters (D) were used for the development of the ectasia model. All eyes were examined with the Visante Omni, and corneal ultrastructure was evaluated with transmission electron microscopy (TEM). Results: There was no significant intra- or interobserver difference for mean steep and flat keratometry (K) values of simulated K, anterior, and posterior elevation measurements. Eyes that underwent −5 D LASIK had a significant increase in mean amplitude of astigmatism and posterior surface elevation with time (P for trend < 0.05). At 2 and 3 months, the −5 D LASIK group had significantly greater mean amplitude of astigmatism (P = 0.036; P = 0.027) and posterior surface elevation (both P < 0.005) compared with the control group. On TEM, the mean collagen fibril diameter and interfibril distance in the −5 D LASIK eyes were significantly greater than in controls at 3 months (P = 0.018; P < 0.001). Conclusions: The Visante Omni provided imaging of the rabbit cornea with good repeatability and reproducibility. Application of −5 D LASIK treatment produced a rabbit model of corneal ectasia that was gradual in development and simulated the human condition. Translational Relevance: The results provide the foundations for the future evaluation of novel treatment modalities for post-surgical ectasia and keratoconus. PMID:25938004

  6. Reproducibility of NIF hohlraum measurements

    NASA Astrophysics Data System (ADS)

    Moody, J. D.; Ralph, J. E.; Turnbull, D. P.; Casey, D. T.; Albert, F.; Bachmann, B. L.; Doeppner, T.; Divol, L.; Grim, G. P.; Hoover, M.; Landen, O. L.; MacGowan, B. J.; Michel, P. A.; Moore, A. S.; Pino, J. E.; Schneider, M. B.; Tipton, R. E.; Smalyuk, V. A.; Strozzi, D. J.; Widmann, K.; Hohenberger, M.

    2015-11-01

    The strategy of experimentally "tuning" the implosion in a NIF hohlraum ignition target towards increasing hot-spot pressure, areal density of compressed fuel, and neutron yield relies on a level of experimental reproducibility. We examine the reproducibility of experimental measurements for a collection of 15 identical NIF hohlraum experiments. The measurements include incident laser power, backscattered optical power, x-ray measurements, hot-electron fraction and energy, and target characteristics. We use exact statistics to set 1-sigma confidence levels on the variations in each of the measurements. Of particular interest are the backscatter and the laser-induced hot-spot locations on the hohlraum wall. Hohlraum implosion designs typically include variability specifications [S. W. Haan et al., Phys. Plasmas 18, 051001 (2011)]. We describe our findings and compare them with the specifications. This work was performed under the auspices of the U.S. Department of Energy by University of California, Lawrence Livermore National Laboratory under Contract W-7405-Eng-48.
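    As an illustration of the kind of calculation involved, the sketch below computes a 1-sigma confidence interval for the shot-to-shot standard deviation of a set of nominally identical measurements, assuming normally distributed variations; the data values are invented and this particular interval construction is an assumption, not necessarily the exact statistics used by the authors.

    # Confidence interval for the standard deviation of n repeated shots,
    # assuming normal variations (illustrative data, not NIF measurements).
    import numpy as np
    from scipy.stats import chi2

    yields = np.array([4.1, 3.9, 4.3, 4.0, 4.2, 3.8, 4.1, 4.0,
                       4.2, 3.9, 4.1, 4.0, 4.3, 3.9, 4.1])  # 15 shots, made up
    n = yields.size
    s2 = yields.var(ddof=1)                    # unbiased sample variance

    conf = 0.683                               # "1-sigma" confidence level
    lo = (n - 1) * s2 / chi2.ppf(0.5 + conf / 2, n - 1)
    hi = (n - 1) * s2 / chi2.ppf(0.5 - conf / 2, n - 1)
    print("std dev %.3f, 1-sigma CI [%.3f, %.3f]"
          % (np.sqrt(s2), np.sqrt(lo), np.sqrt(hi)))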

  7. Blast-induced biomechanical loading of the rat: an experimental and anatomically accurate computational blast injury model.

    PubMed

    Sundaramurthy, Aravind; Alai, Aaron; Ganpule, Shailesh; Holmberg, Aaron; Plougonven, Erwan; Chandra, Namas

    2012-09-01

    Blast waves generated by improvised explosive devices (IEDs) cause traumatic brain injury (TBI) in soldiers and civilians. In vivo animal models that use shock tubes are extensively used in laboratories to simulate field conditions, to identify mechanisms of injury, and to develop injury thresholds. In this article, we place rats in different locations along the length of the shock tube (i.e., inside, outside, and near the exit), to examine the role of animal placement location (APL) in the biomechanical load experienced by the animal. We found that the biomechanical load on the brain and internal organs in the thoracic cavity (lungs and heart) varied significantly depending on the APL. When the specimen is positioned outside, organs in the thoracic cavity experience a higher pressure for a longer duration, in contrast to APL inside the shock tube. This in turn will possibly alter the injury type, severity, and lethality. We found that the optimal APL is where the Friedlander waveform is first formed inside the shock tube. Once the optimal APL was determined, the effect of the incident blast intensity on the surface and intracranial pressure was measured and analyzed. Noticeably, surface and intracranial pressure increases linearly with the incident peak overpressures, though surface pressures are significantly higher than the other two. Further, we developed and validated an anatomically accurate finite element model of the rat head. With this model, we determined that the main pathway of pressure transmission to the brain was through the skull and not through the snout; however, the snout plays a secondary role in diffracting the incoming blast wave towards the skull. PMID:22620716

  8. Experimental scale and dimensionality requirements for reproducing and studying coupled land-atmosphere-vegetative processes in the intermediate scale laboratory settings

    NASA Astrophysics Data System (ADS)

    Trautz, Andrew; Illangasekare, Tissa; Rodriguez-Iturbe, Ignacio; Helmig, Rainer; Heck, Katharina

    2016-04-01

    Past investigations of coupled land-atmosphere-vegetative processes have been constrained to two extremes: small laboratory bench-scale testing and field-scale testing. In recognition of the limitations of studying the scale-dependency of these fundamental processes at either extreme, researchers have recently begun to promote the use of experimentation at intermediate scales between the bench and field scales. A requirement for employing intermediate-scale testing to refine heat and mass transport theory regarding land-atmosphere-vegetative processes is high spatio-temporal resolution datasets generated under carefully controlled experimental conditions in which both small and field scale phenomena can be observed. Field experimentation often fails to meet these criteria as a result of sensor network limitations as well as the natural complexities and uncertainties introduced by heterogeneity and constantly changing atmospheric conditions. Laboratory experimentation, which is used to study three-dimensional (3-D) processes, is often conducted in 2-D test systems as a result of space, instrumentation, and cost constraints. In most flow and transport problems, 2-D testing is not considered a serious limitation because the bypassing of flow and transport due to geo-biochemical heterogeneities can still be studied. Constraining the study of atmosphere-soil-vegetation interactions to 2-D systems introduces a new challenge given that the soil moisture dynamics associated with these interactions occur in three dimensions. This is an important issue that needs to be addressed as ever more intricate and specialized experimental apparatuses, like the climate-controlled wind tunnel-porous media test system at CESEP, are being constructed and used for these types of studies. The purpose of this study is therefore to investigate the effects of laboratory experimental dimensionality on observed soil moisture dynamics in the context of bare-soil evaporation and evapotranspiration

  9. Z-scan theoretical and experimental studies for accurate measurements of the nonlinear refractive index and absorption of optical glasses near damage threshold

    NASA Astrophysics Data System (ADS)

    Olivier, Thomas; Billard, Franck; Akhouayri, Hassan

    2004-06-01

    Self-focusing is one of the dramatic phenomena that may occur during the propagation of a high power laser beam in a nonlinear material. This phenomenon leads to a degradation of the wave front and may also lead to photoinduced damage of the material. Realistic simulations of the propagation of high power laser beams require an accurate knowledge of the nonlinear refractive index γ. In the particular case of fused silica and in the nanosecond regime, it seems that electronic mechanisms as well as electrostriction and thermal effects can lead to a significant refractive index variation. Compared to the different methods used to measure this parameter, the Z-scan method is simple, offers good sensitivity and may give absolute measurements if the incident beam is accurately characterized. However, this method requires a very good knowledge of the incident beam and of its propagation inside a nonlinear sample. We used a split-step propagation algorithm to simulate Z-scan curves for arbitrary beam shape, sample thickness and nonlinear phase shift. According to our simulations and a rigorous analysis of the Z-scan measured signal, it appears that some unjustified approximations lead to very large errors. Thus, by reducing possible errors in the interpretation of Z-scan experimental studies, we performed accurate measurements of the nonlinear refractive index of fused silica that show the significant contribution of nanosecond mechanisms.
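    The split-step propagation algorithm mentioned above can be sketched compactly: alternate paraxial diffraction steps in Fourier space with nonlinear (Kerr) phase steps through the sample. The snippet below propagates a Gaussian beam through a single sample position; a full Z-scan would repeat this for many sample positions along z and record the far-field aperture transmission. Beam, sample and material parameters are illustrative, not those of the paper.

    # Split-step Fourier propagation of a Gaussian beam through a Kerr sample
    # (a sketch of the algorithm class only; all parameters are illustrative).
    import numpy as np

    lam = 1.064e-6                 # wavelength (m)
    k0 = 2 * np.pi / lam
    n0, n2 = 1.45, 3e-20           # linear index, nonlinear index (m^2/W)
    w0 = 30e-6                     # beam waist (m)
    I0 = 5e13                      # on-axis peak intensity (W/m^2)
    L, nz = 1e-3, 50               # sample thickness (m), number of z steps
    dz = L / nz

    N, width = 256, 1e-3
    x = np.linspace(-width / 2, width / 2, N)
    X, Y = np.meshgrid(x, x)
    E = np.sqrt(I0) * np.exp(-(X**2 + Y**2) / w0**2)   # field at sample entrance

    kx = 2 * np.pi * np.fft.fftfreq(N, d=x[1] - x[0])
    KX, KY = np.meshgrid(kx, kx)
    H = np.exp(-1j * (KX**2 + KY**2) * dz / (2 * k0 * n0))  # paraxial diffraction

    for _ in range(nz):
        E = np.fft.ifft2(np.fft.fft2(E) * H)            # linear half of the step
        E *= np.exp(1j * k0 * n2 * np.abs(E)**2 * dz)   # Kerr nonlinear phase

    print("on-axis nonlinear phase accumulated: %.2e rad" % (k0 * n2 * I0 * L))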

  10. Impact of Surface Water Layers on Protein–Ligand Binding: How Well Are Experimental Data Reproduced by Molecular Dynamics Simulations in a Thermolysin Test Case?

    PubMed

    Betz, Michael; Wulsdorf, Tobias; Krimmer, Stefan G; Klebe, Gerhard

    2016-01-25

    Drug binding involves changes of the local water structure around proteins, including water rearrangements across the surface-solvation layers around protein and ligand portions exposed to the newly formed complex surface. For a series of thermolysin-binding phosphonamidates, we discovered that variations of the partly exposed P2'-substituents modulate binding affinity by up to 10 kJ mol(-1), with even larger enthalpy/entropy partitioning of the binding signature. The observed profiles cannot be completely explained by desolvation effects. Instead, the quality and completeness of the surface water network wrapping around the formed complexes provide an explanation for the observed structure-activity relationship. We used molecular dynamics to compute surface water networks and predict solvation sites around the complexes. A fairly good correspondence with experimental difference electron densities in high-resolution crystal structures is achieved, although in detail some problems with the potentials were discovered: charge-assisted contacts to waters appeared to be exaggerated by AMBER, and the stabilizing contributions of water-to-methyl contacts were underestimated. PMID:26691064

  11. Reproducible research in palaeomagnetism

    NASA Astrophysics Data System (ADS)

    Lurcock, Pontus; Florindo, Fabio

    2015-04-01

    The reproducibility of research findings is attracting increasing attention across all scientific disciplines. In palaeomagnetism as elsewhere, computer-based analysis techniques are becoming more commonplace, complex, and diverse. Analyses can often be difficult to reproduce from scratch, both for the original researchers and for others seeking to build on the work. We present a palaeomagnetic plotting and analysis program designed to make reproducibility easier. Part of the problem is the divide between interactive and scripted (batch) analysis programs. An interactive desktop program with a graphical interface is a powerful tool for exploring data and iteratively refining analyses, but usually cannot operate without human interaction. This makes it impossible to re-run an analysis automatically, or to integrate it into a larger automated scientific workflow - for example, a script to generate figures and tables for a paper. In some cases the parameters of the analysis process itself are not saved explicitly, making it hard to repeat or improve the analysis even with human interaction. Conversely, non-interactive batch tools can be controlled by pre-written scripts and configuration files, allowing an analysis to be 'replayed' automatically from the raw data. However, this advantage comes at the expense of exploratory capability: iteratively improving an analysis entails a time-consuming cycle of editing scripts, running them, and viewing the output. Batch tools also tend to require more computer expertise from their users. PuffinPlot is a palaeomagnetic plotting and analysis program which aims to bridge this gap. First released in 2012, it offers both an interactive, user-friendly desktop interface and a batch scripting interface, both making use of the same core library of palaeomagnetic functions. We present new improvements to the program that help to integrate the interactive and batch approaches, allowing an analysis to be interactively explored and refined

  12. Magnetogastrography (MGG) Reproducibility Assessments

    NASA Astrophysics Data System (ADS)

    de la Roca-Chiapas, J. M.; Córdova, T.; Hernández, E.; Solorio, S.; Solís Ortiz, S.; Sosa, M.

    2006-09-01

    Seven healthy subjects underwent a magnetic pulse of 32 mT for 17 ms, seven times in 90 minutes. The procedure was repeated one and two weeks later. Gastric emptying was assessed for each of the measurements, and an ANOVA was performed for every group of data. The gastric emptying time was 19.22 ± 5 min. The estimated reproducibility was above 85%. Therefore, magnetogastrography seems to be an excellent technique to be implemented in routine clinical trials.

  13. Experimental study on the application of a compressed-sensing (CS) algorithm to dental cone-beam CT (CBCT) for accurate, low-dose image reconstruction

    NASA Astrophysics Data System (ADS)

    Oh, Jieun; Cho, Hyosung; Je, Uikyu; Lee, Minsik; Kim, Hyojeong; Hong, Daeki; Park, Yeonok; Lee, Seonhwa; Cho, Heemoon; Choi, Sungil; Koo, Yangseo

    2013-03-01

    In practical applications of three-dimensional (3D) tomographic imaging, there are often challenges for image reconstruction from insufficient data. In computed tomography (CT), for example, image reconstruction from few views would enable fast scanning with reduced doses to the patient. In this study, we investigated and implemented an efficient reconstruction method based on a compressed-sensing (CS) algorithm, which exploits the sparseness of the gradient image to achieve substantially high accuracy, for accurate, low-dose dental cone-beam CT (CBCT) reconstruction. We applied the algorithm to a commercially available dental CBCT system (Expert7™, Vatech Co., Korea) and performed experimental work to demonstrate the algorithm for image reconstruction in insufficient sampling problems. We successfully reconstructed CBCT images from several undersampled datasets and evaluated the reconstruction quality in terms of the universal quality index (UQI). Experimental demonstrations of the CS-based reconstruction algorithm appear to show that it can be applied to current dental CBCT systems for reducing imaging doses and improving image quality.
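    The idea of exploiting gradient sparsity can be illustrated with a toy problem: the sketch below recovers a piecewise-constant 1D signal from undersampled random linear measurements by gradient descent on a least-squares term plus a smoothed total-variation penalty. It is a one-dimensional stand-in for the CBCT reconstruction described above, with made-up dimensions and regularization settings, not the authors' implementation.

    # Toy compressed-sensing reconstruction with a (smoothed) gradient-sparsity
    # penalty; illustrative only, not the authors' CBCT code.
    import numpy as np

    rng = np.random.default_rng(0)
    n, m = 200, 80                       # signal length, number of measurements
    x_true = np.zeros(n)
    x_true[50:90] = 1.0
    x_true[120:160] = -0.5               # piecewise-constant (sparse gradient)

    A = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix as a
    b = A @ x_true                                 # stand-in for the CT system

    lam, eps, step = 0.01, 0.01, 0.05
    x = np.zeros(n)
    for _ in range(10000):
        grad_data = A.T @ (A @ x - b)              # gradient of 0.5*||Ax - b||^2
        dx = np.diff(x)
        w = dx / np.sqrt(dx**2 + eps**2)           # derivative of smoothed |dx|
        grad_tv = np.zeros(n)
        grad_tv[:-1] -= w
        grad_tv[1:] += w
        x -= step * (grad_data + lam * grad_tv)

    print("relative reconstruction error: %.3f"
          % (np.linalg.norm(x - x_true) / np.linalg.norm(x_true)))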

  14. Reproducing in cities.

    PubMed

    Mace, Ruth

    2008-02-01

    Reproducing in cities has always been costly, leading to lower fertility (that is, lower birth rates) in urban than in rural areas. Historically, although cities provided job opportunities, initially residents incurred the penalty of higher infant mortality, but as mortality rates fell at the end of the 19th century, European birth rates began to plummet. Fertility decline in Africa only started recently and has been dramatic in some cities. Here it is argued that both historical and evolutionary demographers are interpreting fertility declines across the globe in terms of the relative costs of child rearing, which increase to allow children to outcompete their peers. Now largely free from the fear of early death, postindustrial societies may create an environment that generates runaway parental investment, which will continue to drive fertility ever lower. PMID:18258904

  15. Reproducible measurements of MPI performance characteristics.

    SciTech Connect

    Gropp, W.; Lusk, E.

    1999-06-25

    In this paper we describe the difficulties inherent in making accurate, reproducible measurements of message-passing performance. We describe some of the mistakes often made in attempting such measurements and the consequences of such mistakes. We describe mpptest, a suite of performance measurement programs developed at Argonne National Laboratory, that attempts to avoid such mistakes and obtain reproducible measures of MPI performance that can be useful to both MPI implementers and MPI application writers. We include a number of illustrative examples of its use.
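    In the same spirit, a minimal ping-pong measurement can be written in a few lines; the sketch below uses mpi4py (an assumption on the reader's toolchain; mpptest itself is a C program) and follows the basic advice of repeating each message-size test many times and reporting robust statistics rather than a single timing. It is meant to be launched on two ranks, e.g. mpiexec -n 2 python pingpong.py.

    # Minimal ping-pong latency sketch with repeated measurements (mpi4py).
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    for nbytes in (8, 1024, 65536):
        buf = np.zeros(nbytes, dtype=np.uint8)
        times = []
        for _ in range(1000):                       # many repetitions per size
            comm.Barrier()
            t0 = MPI.Wtime()
            if rank == 0:
                comm.Send(buf, dest=1, tag=0)
                comm.Recv(buf, source=1, tag=1)
            elif rank == 1:
                comm.Recv(buf, source=0, tag=0)
                comm.Send(buf, dest=0, tag=1)
            times.append((MPI.Wtime() - t0) / 2.0)  # one-way time estimate
        if rank == 0:
            print("%6d bytes: min %.2f us, median %.2f us"
                  % (nbytes, 1e6 * min(times), 1e6 * float(np.median(times))))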

  16. Accurate experimental determination of the isotope effects on the triple point temperature of water. I. Dependence on the 2H abundance

    NASA Astrophysics Data System (ADS)

    Faghihi, V.; Peruzzi, A.; Aerts-Bijma, A. T.; Jansen, H. G.; Spriensma, J. J.; van Geel, J.; Meijer, H. A. J.

    2015-12-01

    Variation in the isotopic composition of water is one of the major contributors to uncertainty in the realization of the triple point of water (TPW). Although the dependence of the TPW on the isotopic composition of the water has been known for years, there is still a lack of a detailed and accurate experimental determination of the values for the correction constants. This paper is the first of two articles (Part I and Part II) that address quantification of isotope abundance effects on the triple point temperature of water. In this paper, we describe our experimental assessment of the 2H isotope effect. We manufactured five triple point cells with prepared water mixtures covering a range of 2H isotopic abundances that widely encompasses the natural abundance range, while the 18O and 17O isotopic abundances were kept approximately constant and the 18O/17O ratio was close to the Meijer-Li relationship for natural waters. The selected range of 2H isotopic abundances led to cells that realized TPW temperatures between approximately -140 μK and +2500 μK with respect to the TPW temperature as realized by VSMOW (Vienna Standard Mean Ocean Water). Our experiment led to a determination of the δ2H correction parameter, A2H = 673 μK / (‰ deviation of δ2H from VSMOW), with a combined uncertainty of 4 μK (k = 1, or 1σ).

  17. Reproducible Experiment Platform

    NASA Astrophysics Data System (ADS)

    Likhomanenko, Tatiana; Rogozhnikov, Alex; Baranov, Alexander; Khairullin, Egor; Ustyuzhanin, Andrey

    2015-12-01

    Data analysis in fundamental sciences nowadays is an essential process that pushes frontiers of our knowledge and leads to new discoveries. At the same time we can see that the complexity of those analyses increases rapidly due to (a) the enormous volumes of the datasets being analyzed, (b) the variety of techniques and algorithms one has to check inside a single analysis, and (c) the distributed nature of research teams, which requires special communication media for knowledge and information exchange between individual researchers. There is considerable resemblance between the techniques and problems arising in the areas of industrial information retrieval and particle physics. To address those problems we propose the Reproducible Experiment Platform (REP), a software infrastructure to support a collaborative ecosystem for computational science. It is a Python-based solution for research teams that allows running computational experiments on shared datasets, obtaining repeatable results, and making consistent comparisons of the obtained results. We present some key features of REP based on case studies which include trigger optimization and physics analysis studies at the LHCb experiment.

  18. ROCS: A reproducibility index and confidence score for interaction proteomics

    PubMed Central

    2013-01-01

    -MS experiments, each containing well characterized protein interactions, allowing for systematic benchmarking of ROCS. We show that our method may be used on its own to make accurate identification of specific, biologically relevant protein-protein interactions or in combination with other AP-MS scoring methods to significantly improve inferences. Conclusions Our method addresses important issues encountered in AP-MS datasets, making ROCS a very promising tool for this purpose, either on its own or especially in conjunction with other methods. We anticipate that our methodology may be used more generally in proteomics studies and databases, where experimental reproducibility issues arise. The method is implemented in the R language, and is available as an R package called "ROCS", freely available from the CRAN repository http://cran.r-project.org/. PMID:24156626

  19. Assessing the reproducibility of discriminant function analyses

    PubMed Central

    Andrew, Rose L.; Albert, Arianne Y.K.; Renaut, Sebastien; Rennison, Diana J.; Bock, Dan G.

    2015-01-01

    Data are the foundation of empirical research, yet all too often the datasets underlying published papers are unavailable, incorrect, or poorly curated. This is a serious issue, because future researchers are then unable to validate published results or reuse data to explore new ideas and hypotheses. Even if data files are securely stored and accessible, they must also be accompanied by accurate labels and identifiers. To assess how often problems with metadata or data curation affect the reproducibility of published results, we attempted to reproduce Discriminant Function Analyses (DFAs) from the field of organismal biology. DFA is a commonly used statistical analysis that has changed little since its inception almost eight decades ago, and therefore provides an opportunity to test reproducibility among datasets of varying ages. Of the 100 papers we initially surveyed, fourteen were excluded because they did not present the common types of quantitative result from their DFA or gave insufficient details of their DFA. Of the remaining 86 datasets, there were 15 cases for which we were unable to confidently relate the dataset we received to the one used in the published analysis. The reasons ranged from incomprehensible or absent variable labels to the DFA being performed on an unspecified subset of the data or the dataset we received being incomplete. We focused on reproducing three common summary statistics from DFAs: the percent variance explained, the percentage correctly assigned and the largest discriminant function coefficient. The reproducibility of the first two was fairly high (20 of 26, and 44 of 60 datasets, respectively), whereas our success rate with the discriminant function coefficients was lower (15 of 26 datasets). When considering all three summary statistics, we were able to completely reproduce 46 (65%) of 71 datasets. While our results show that a majority of studies are reproducible, they highlight the fact that many studies still are not the
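    For readers who want to extract the same three summary statistics from their own data, the sketch below computes them with scikit-learn's LinearDiscriminantAnalysis on a synthetic dataset; it illustrates the quantities being checked, not the scripts used in this survey, and the dataset and settings are arbitrary.

    # Percent variance explained, percent correctly assigned, and the largest
    # discriminant function coefficient, computed on synthetic data.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    X, y = make_classification(n_samples=300, n_features=6, n_informative=4,
                               n_classes=3, n_clusters_per_class=1,
                               random_state=1)

    lda = LinearDiscriminantAnalysis(solver="eigen").fit(X, y)

    pct_var = 100 * lda.explained_variance_ratio_[0]   # first discriminant axis
    pct_correct = 100 * lda.score(X, y)                # resubstitution accuracy
    largest_coef = np.abs(lda.scalings_[:, 0]).max()   # largest DF1 coefficient

    print("DF1 variance explained:  %.1f%%" % pct_var)
    print("correctly assigned:      %.1f%%" % pct_correct)
    print("largest DF1 coefficient: %.3f" % largest_coef)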

  20. NNLOPS accurate associated HW production

    NASA Astrophysics Data System (ADS)

    Astill, William; Bizon, Wojciech; Re, Emanuele; Zanderighi, Giulia

    2016-06-01

    We present a next-to-next-to-leading order accurate description of associated HW production consistently matched to a parton shower. The method is based on reweighting events obtained with the HW plus one jet NLO accurate calculation implemented in POWHEG, extended with the MiNLO procedure, to reproduce NNLO accurate Born distributions. Since the Born kinematics is more complex than the cases treated before, we use a parametrization of the Collins-Soper angles to reduce the number of variables required for the reweighting. We present phenomenological results at 13 TeV, with cuts suggested by the Higgs Cross section Working Group.

  1. First accurate experimental study of Mu reactivity from a state-selected reactant in the gas phase: the Mu + H2(1) reaction rate at 300 K

    NASA Astrophysics Data System (ADS)

    Bakule, Pavel; Sukhorukov, Oleksandr; Ishida, Katsuhiko; Pratt, Francis; Fleming, Donald; Momose, Takamasa; Matsuda, Yasuyuki; Torikai, Eiko

    2015-02-01

    This paper reports on the experimental background and methodology leading to recent results on the first accurate measurement of the reaction rate of the muonium (Mu) atom from a state-selected reactant in the gas phase: the Mu + H2(1) → MuH + H reaction at 300 K, and its comparison with rigorous quantum rate theory, Bakule et al (2012 J. Phys. Chem. Lett. 3 2755). Stimulated Raman pumping, induced by 532 nm light from the 2nd harmonic of a Nd:YAG laser, was used to produce H2 in its first vibrational (v = 1) state, H2(1), in a single Raman/reaction cell. A pulsed muon beam (from ‘ISIS’, at 50 Hz) matched the 25 Hz repetition rate of the laser, allowing data taking in equal ‘Laser-On/Laser-Off’ modes of operation. The signal to noise was improved by over an order of magnitude in comparison with an earlier proof-of-principle experiment. The success of the present experiment also relied on optimizing the overlap of the laser profile with the extended stopping distribution of the muon beam at 50 bar H2 pressure, in which Monte Carlo simulations played a central role. The rate constant, found from the analysis of three separate measurements, which includes a correction for the loss of H2(1) concentration due to collisional relaxation with unpumped H2 during the time of each measurement, is kMu(1) = 9.9 (+1.7/−1.4) × 10⁻¹³ cm³ s⁻¹ at 300 K. This is in good to excellent agreement with rigorous quantum rate calculations on the complete configuration interaction/Born-Huang surface, as reported earlier by Bakule et al., and which are also briefly commented on herein.

  2. Explorations in statistics: statistical facets of reproducibility.

    PubMed

    Curran-Everett, Douglas

    2016-06-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eleventh installment of Explorations in Statistics explores statistical facets of reproducibility. If we obtain an experimental result that is scientifically meaningful and statistically unusual, we would like to know that our result reflects a general biological phenomenon that another researcher could reproduce if (s)he repeated our experiment. But more often than not, we may learn this researcher cannot replicate our result. The National Institutes of Health and the Federation of American Societies for Experimental Biology have created training modules and outlined strategies to help improve the reproducibility of research. These particular approaches are necessary, but they are not sufficient. The principles of hypothesis testing and estimation are inherent to the notion of reproducibility in science. If we want to improve the reproducibility of our research, then we need to rethink how we apply fundamental concepts of statistics to our science. PMID:27231259

  3. Reproducibility in a multiprocessor system

    DOEpatents

    Bellofatto, Ralph A; Chen, Dong; Coteus, Paul W; Eisley, Noel A; Gara, Alan; Gooding, Thomas M; Haring, Rudolf A; Heidelberger, Philip; Kopcsay, Gerard V; Liebsch, Thomas A; Ohmacht, Martin; Reed, Don D; Senger, Robert M; Steinmacher-Burow, Burkhard; Sugawara, Yutaka

    2013-11-26

    Fixing a problem is usually greatly aided if the problem is reproducible. To ensure reproducibility of a multiprocessor system, the following aspects are proposed: a deterministic system start state, a single system clock, phase alignment of clocks in the system, system-wide synchronization events, reproducible execution of system components, deterministic chip interfaces, zero-impact communication with the system, precise stop of the system, and a scan of the system state.

  4. Study of resonance interactions in polyatomic molecules on the basis of highly accurate experimental data: Set of strongly interacting Bands ν10(B1), ν7(B2), ν4(A2), ν8(B2), ν3(A1) and ν6(B1) of CH2=CD2

    NASA Astrophysics Data System (ADS)

    Ulenikov, O. N.; Gromova, O. V.; Bekhtereva, E. S.; Berezkin, K. B.; Kashirina, N. V.; Tan, T. L.; Sydow, C.; Maul, C.; Bauerecker, S.

    2016-09-01

    The highly accurate (experimental accuracy in line positions ~(1–3) × 10⁻⁴ cm⁻¹) FTIR ro-vibrational spectra of CH2=CD2 in the region of 600-1300 cm⁻¹, where the fundamental bands ν10, ν7, ν4, ν8, ν3, and ν6 are located, were recorded and analyzed with a Hamiltonian model which takes into account resonance interactions between all six studied bands. About 12 200 ro-vibrational transitions belonging to these bands (considerably more than in the preceding studies of the bands ν10, ν7, ν8, ν3 and ν6; transitions belonging to the ν4 band were assigned for the first time) were assigned in the experimental spectra, with maximum quantum numbers Jmax/Kamax equal to 31/20, 46/18, 33/11, 50/26, 44/20 and 42/21 for the bands ν10, ν7, ν4, ν8, ν3, and ν6, respectively. On that basis, a set of 133 vibrational, rotational, centrifugal distortion and resonance interaction parameters was obtained from the weighted fit. They reproduce the values of 3920 initial "experimental" ro-vibrational energy levels (positions of about 12 200 experimentally recorded and assigned transitions) with the rms error drms = 2.3 × 10⁻⁴ cm⁻¹.

  5. Contextual sensitivity in scientific reproducibility.

    PubMed

    Van Bavel, Jay J; Mende-Siedlecki, Peter; Brady, William J; Reinero, Diego A

    2016-06-01

    In recent years, scientists have paid increasing attention to reproducibility. For example, the Reproducibility Project, a large-scale replication attempt of 100 studies published in top psychology journals, found that only 39% could be unambiguously reproduced. There is a growing consensus among scientists that the lack of reproducibility in psychology and other fields stems from various methodological factors, including low statistical power, researcher degrees of freedom, and an emphasis on publishing surprising positive results. However, there is a contentious debate about the extent to which failures to reproduce certain results might also reflect contextual differences (often termed "hidden moderators") between the original research and the replication attempt. Although psychologists have found extensive evidence that contextual factors alter behavior, some have argued that context is unlikely to influence the results of direct replications precisely because these studies use the same methods as those used in the original research. To help resolve this debate, we recoded the 100 original studies from the Reproducibility Project on the extent to which the research topic of each study was contextually sensitive. Results suggested that the contextual sensitivity of the research topic was associated with replication success, even after statistically adjusting for several methodological characteristics (e.g., statistical power, effect size). The association between contextual sensitivity and replication success did not differ across psychological subdisciplines. These results suggest that researchers, replicators, and consumers should be mindful of contextual factors that might influence a psychological process. We offer several guidelines for dealing with contextual sensitivity in reproducibility. PMID:27217556

  6. Reproducible quantitative proteotype data matrices for systems biology

    PubMed Central

    Röst, Hannes L.; Malmström, Lars; Aebersold, Ruedi

    2015-01-01

    Historically, many mass spectrometry–based proteomic studies have aimed at compiling an inventory of protein compounds present in a biological sample, with the long-term objective of creating a proteome map of a species. However, to answer fundamental questions about the behavior of biological systems at the protein level, accurate and unbiased quantitative data are required in addition to a list of all protein components. Fueled by advances in mass spectrometry, the proteomics field has thus recently shifted focus toward the reproducible quantification of proteins across a large number of biological samples. This provides the foundation to move away from pure enumeration of identified proteins toward quantitative matrices of many proteins measured across multiple samples. It is argued here that data matrices consisting of highly reproducible, quantitative, and unbiased proteomic measurements across a high number of conditions, referred to here as quantitative proteotype maps, will become the fundamental currency in the field and provide the starting point for downstream biological analysis. Such proteotype data matrices, for example, are generated by the measurement of large patient cohorts, time series, or multiple experimental perturbations. They are expected to have a large effect on systems biology and personalized medicine approaches that investigate the dynamic behavior of biological systems across multiple perturbations, time points, and individuals. PMID:26543201

  7. Accurate Molecular Polarizabilities Based on Continuum Electrostatics

    PubMed Central

    Truchon, Jean-François; Nicholls, Anthony; Iftimie, Radu I.; Roux, Benoît; Bayly, Christopher I.

    2013-01-01

    A novel approach for representing the intramolecular polarizability as a continuum dielectric is introduced to account for molecular electronic polarization. It is shown, using a finite-difference solution to the Poisson equation, that the Electronic Polarization from Internal Continuum (EPIC) model yields accurate gas-phase molecular polarizability tensors for a test set of 98 challenging molecules composed of heteroaromatics, alkanes and diatomics. The electronic polarization originates from a high intramolecular dielectric that produces polarizabilities consistent with B3LYP/aug-cc-pVTZ and experimental values when surrounded by vacuum dielectric. In contrast to other approaches to model electronic polarization, this simple model avoids the polarizability catastrophe and accurately calculates molecular anisotropy with the use of very few fitted parameters and without resorting to auxiliary sites or anisotropic atomic centers. On average, the unsigned errors in the average polarizability and the anisotropy compared to B3LYP are 2% and 5%, respectively. The correlation between the polarizability components from B3LYP and this approach leads to an R2 of 0.990 and a slope of 0.999. Even the F2 anisotropy, shown to be a difficult case for existing polarizability models, can be reproduced within 2% error. In addition to providing new parameters for a rapid method directly applicable to the calculation of polarizabilities, this work extends the widely used Poisson equation to areas where accurate molecular polarizabilities matter. PMID:23646034

  8. Open Science and Research Reproducibility.

    PubMed

    Munafò, Marcus

    2016-01-01

    Many scientists, journals and funders are concerned about the low reproducibility of many scientific findings. One approach that may serve to improve the reliability and robustness of research is open science. Here I argue that the process of pre-registering study protocols, sharing study materials and data, and posting preprints of manuscripts may serve to improve quality control procedures at every stage of the research pipeline, and in turn improve the reproducibility of published work. PMID:27350794

  9. Open Science and Research Reproducibility

    PubMed Central

    Munafò, Marcus

    2016-01-01

    Many scientists, journals and funders are concerned about the low reproducibility of many scientific findings. One approach that may serve to improve the reliability and robustness of research is open science. Here I argue that the process of pre-registering study protocols, sharing study materials and data, and posting preprints of manuscripts may serve to improve quality control procedures at every stage of the research pipeline, and in turn improve the reproducibility of published work. PMID:27350794

  10. Towards Reproducibility in Computational Hydrology

    NASA Astrophysics Data System (ADS)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei

    2016-04-01

    The ability to reproduce published scientific findings is a foundational principle of scientific research. Independent observation helps to verify the legitimacy of individual findings; build upon sound observations so that we can evolve hypotheses (and models) of how catchments function; and move them from specific circumstances to more general theory. The rise of computational research has brought increased focus on the issue of reproducibility across the broader scientific literature. This is because publications based on computational research typically do not contain sufficient information to enable the results to be reproduced, and therefore verified. Given the rise of computational analysis in hydrology over the past 30 years, to what extent is reproducibility, or a lack thereof, a problem in hydrology? Whilst much hydrological code is accessible, the actual code and workflow that produced, and therefore document the provenance of, published scientific findings are rarely available. We argue that in order to advance and make more robust the process of hypothesis testing and knowledge creation within the computational hydrological community, we need to build on existing open data initiatives and adopt common standards and infrastructures to: first, make code re-usable and easy to find through consistent use of metadata; second, publish well-documented workflows that combine re-usable code together with data to enable published scientific findings to be reproduced; finally, use unique persistent identifiers (e.g. DOIs) to reference re-usable and reproducible code, thereby clearly showing the provenance of published scientific findings. Whilst extra effort is required to make work reproducible, there are benefits to both the individual and the broader community in doing so, which will improve the credibility of the science in the face of the need for societies to adapt to changing hydrological environments.

  11. Reproducibility of airway wall thickness measurements

    NASA Astrophysics Data System (ADS)

    Schmidt, Michael; Kuhnigk, Jan-Martin; Krass, Stefan; Owsijewitsch, Michael; de Hoop, Bartjan; Peitgen, Heinz-Otto

    2010-03-01

    Airway remodeling and accompanying changes in wall thickness are known to be a major symptom of chronic obstructive pulmonary disease (COPD), associated with reduced lung function in diseased individuals. Further investigation of this disease, as well as monitoring of disease progression and treatment effect, demands accurate and reproducible assessment of airway wall thickness in CT datasets. With wall thicknesses in the sub-millimeter range, this task remains challenging even with today's high-resolution CT datasets. To provide accurate measurements, taking partial volume effects into account is mandatory. The Full-Width-at-Half-Maximum (FWHM) method has been shown to be inappropriate for small airways1,2 and several improved algorithms for objective quantification of airway wall thickness have been proposed.1-8 In this paper, we describe an algorithm based on a closed-form solution proposed by Weinheimer et al.7 We locally estimate the lung density parameter required for the closed-form solution to account for possible variations of parenchyma density between different lung regions, inspiration states and contrast agent concentrations. The general accuracy of the algorithm is evaluated using basic tubular software and hardware phantoms. Furthermore, we present results on the reproducibility of the algorithm with respect to clinical CT scans, varying reconstruction kernels, and repeated acquisitions, which is crucial for longitudinal observations.

  12. ITK: enabling reproducible research and open science

    PubMed Central

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46. PMID:24600387

  13. Rotary head type reproducing apparatus

    DOEpatents

    Takayama, Nobutoshi; Edakubo, Hiroo; Kozuki, Susumu; Takei, Masahiro; Nagasawa, Kenichi

    1986-01-01

    In an apparatus of the kind arranged to reproduce, with a plurality of rotary heads, an information signal from a record bearing medium having many recording tracks which are parallel to each other with the information signal recorded therein and with a plurality of different pilot signals of different frequencies also recorded one by one, one in each of the recording tracks, a plurality of different reference signals of different frequencies are simultaneously generated. A tracking error is detected by using the different reference signals together with the pilot signals which are included in signals reproduced from the plurality of rotary heads.

  14. An open investigation of the reproducibility of cancer biology research.

    PubMed

    Errington, Timothy M; Iorns, Elizabeth; Gunn, William; Tan, Fraser Elisabeth; Lomax, Joelle; Nosek, Brian A

    2014-01-01

    It is widely believed that research that builds upon previously published findings has reproduced the original work. However, it is rare for researchers to perform or publish direct replications of existing results. The Reproducibility Project: Cancer Biology is an open investigation of reproducibility in preclinical cancer biology research. We have identified 50 high impact cancer biology articles published in the period 2010-2012, and plan to replicate a subset of experimental results from each article. A Registered Report detailing the proposed experimental designs and protocols for each subset of experiments will be peer reviewed and published prior to data collection. The results of these experiments will then be published in a Replication Study. The resulting open methodology and dataset will provide evidence about the reproducibility of high-impact results, and an opportunity to identify predictors of reproducibility. PMID:25490932

  15. An open investigation of the reproducibility of cancer biology research

    PubMed Central

    Errington, Timothy M; Iorns, Elizabeth; Gunn, William; Tan, Fraser Elisabeth; Lomax, Joelle; Nosek, Brian A

    2014-01-01

    It is widely believed that research that builds upon previously published findings has reproduced the original work. However, it is rare for researchers to perform or publish direct replications of existing results. The Reproducibility Project: Cancer Biology is an open investigation of reproducibility in preclinical cancer biology research. We have identified 50 high impact cancer biology articles published in the period 2010-2012, and plan to replicate a subset of experimental results from each article. A Registered Report detailing the proposed experimental designs and protocols for each subset of experiments will be peer reviewed and published prior to data collection. The results of these experiments will then be published in a Replication Study. The resulting open methodology and dataset will provide evidence about the reproducibility of high-impact results, and an opportunity to identify predictors of reproducibility. DOI: http://dx.doi.org/10.7554/eLife.04333.001 PMID:25490932

  16. Reproducible Bioinformatics Research for Biologists

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This book chapter describes the current Big Data problem in Bioinformatics and the resulting issues with performing reproducible computational research. The core of the chapter provides guidelines and summaries of current tools/techniques that a noncomputational researcher would need to learn to pe...

  17. Performance reproducibility index for classification

    PubMed Central

    Yousefi, Mohammadmahdi R.; Dougherty, Edward R.

    2012-01-01

    Motivation: A common practice in biomarker discovery is to decide whether a large laboratory experiment should be carried out based on the results of a preliminary study on a small set of specimens. Consideration of the efficacy of this approach motivates the introduction of a probabilistic measure, for whether a classifier showing promising results in a small-sample preliminary study will perform similarly on a large independent sample. Given the error estimate from the preliminary study, if the probability of reproducible error is low, then there is really no purpose in substantially allocating more resources to a large follow-on study. Indeed, if the probability of the preliminary study providing likely reproducible results is small, then why even perform the preliminary study? Results: This article introduces a reproducibility index for classification, measuring the probability that a sufficiently small error estimate on a small sample will motivate a large follow-on study. We provide a simulation study based on synthetic distribution models that possess known intrinsic classification difficulties and emulate real-world scenarios. We also set up similar simulations on four real datasets to show the consistency of results. The reproducibility indices for different distributional models, real datasets and classification schemes are empirically calculated. The effects of reporting and multiple-rule biases on the reproducibility index are also analyzed. Availability: We have implemented in C code the synthetic data distribution model, classification rules, feature selection routine and error estimation methods. The source code is available at http://gsp.tamu.edu/Publications/supplementary/yousefi12a/. Supplementary simulation results are also included. Contact: edward@ece.tamu.edu Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:22954625
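    The idea can be made concrete with a small simulation: the sketch below estimates, under an arbitrary Gaussian two-class model, the probability that a classifier whose small-sample (resubstitution) error estimate looks promising also achieves a similar error on a large independent sample. The model, thresholds, classifier and error estimator are illustrative choices only, not the authors' definitions or their C implementation.

    # Simulation in the spirit of a reproducibility index for classification.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)
    dim, delta = 5, 0.8                  # feature dimension, class separation
    n_small, n_large = 30, 2000
    threshold, tolerance = 0.25, 0.05    # "promising" error, allowed degradation

    def sample(n):
        y = rng.integers(0, 2, n)
        X = rng.standard_normal((n, dim)) + delta * y[:, None]
        return X, y

    promising, reproduced = 0, 0
    for _ in range(500):
        Xs, ys = sample(n_small)
        clf = LinearDiscriminantAnalysis().fit(Xs, ys)
        err_small = 1 - clf.score(Xs, ys)            # resubstitution estimate
        if err_small <= threshold:
            promising += 1
            Xl, yl = sample(n_large)                 # large independent sample
            if 1 - clf.score(Xl, yl) <= err_small + tolerance:
                reproduced += 1

    print("estimated reproducibility index: %.2f" % (reproduced / max(promising, 1)))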

  18. Can Contemporary Density Functional Theory Predict Energy Spans in Molecular Catalysis Accurately Enough To Be Applicable for in Silico Catalyst Design? A Computational/Experimental Case Study for the Ruthenium-Catalyzed Hydrogenation of Olefins.

    PubMed

    Rohmann, Kai; Hölscher, Markus; Leitner, Walter

    2016-01-13

    The catalytic hydrogenation of cyclohexene and 1-methylcyclohexene is investigated experimentally and by means of density functional theory (DFT) computations using novel ruthenium Xantphos(Ph) (4,5-bis(diphenylphosphino)-9,9-dimethylxanthene) and Xantphos(Cy) (4,5-bis(dicyclohexylphosphino)-9,9-dimethylxanthene) precatalysts [Ru(Xantphos(Ph))(PhCO2)(Cl)] (1) and [Ru(Xantphos(Cy))(PhCO2)(Cl)] (2), the synthesis, characterization, and crystal structures of which are reported. The intention of this work is to (i) understand the reaction mechanisms on the microscopic level and (ii) compare experimentally observed activation barriers with computed barriers. The Gibbs free activation energy ΔG(⧧) was obtained experimentally with precatalyst 1 from Eyring plots for the hydrogenation of cyclohexene (ΔG(⧧) = 17.2 ± 1.0 kcal/mol) and 1-methylcyclohexene (ΔG(⧧) = 18.8 ± 2.4 kcal/mol), while the Gibbs free activation energy ΔG(⧧) for the hydrogenation of cyclohexene with precatalyst 2 was determined to be 21.1 ± 2.3 kcal/mol. Plausible activation pathways and catalytic cycles were computed in the gas phase (M06-L/def2-SVP). A variety of popular density functionals (ωB97X-D, LC-ωPBE, CAM-B3LYP, B3LYP, B97-D3BJ, B3LYP-D3, BP86-D3, PBE0-D3, M06-L, MN12-L) were used to reoptimize the turnover determining states in the solvent phase (DF/def2-TZVP; IEF-PCM and/or SMD) to investigate how well the experimentally obtained activation barriers can be reproduced by the calculations. The density functionals B97-D3BJ, MN12-L, M06-L, B3LYP-D3, and CAM-B3LYP reproduce the experimentally observed activation barriers for both olefins very well, with very small (0.1 kcal/mol) to moderate (3.0 kcal/mol) mean deviations from the experimental values, indicating that, in the field of hydrogenation catalysis, most of these functionals are useful for in silico catalyst design prior to experimental work. PMID:26713773

  19. Accurate experimental determination of the isotope effects on the triple point temperature of water. II. Combined dependence on the 18O and 17O abundances

    NASA Astrophysics Data System (ADS)

    Faghihi, V.; Kozicki, M.; Aerts-Bijma, A. T.; Jansen, H. G.; Spriensma, J. J.; Peruzzi, A.; Meijer, H. A. J.

    2015-12-01

    This paper is the second of two articles on the quantification of isotope effects on the triple point temperature of water. In this second article, we address the combined effects of 18O and 17O isotopes. We manufactured five triple point cells with waters whose 18O and 17O abundances widely exceed the natural abundance range while maintaining their natural 18O/17O relationship. The 2H isotopic abundance was kept close to that of VSMOW (Vienna Standard Mean Ocean Water). These cells realized triple point temperatures ranging from -220 μK to 1420 μK with respect to the temperature realized by a triple point cell filled with VSMOW. Our experiment allowed us to determine an accurate and reliable value for the newly defined combined 18, 17O correction parameter of AO = 630 μK with a combined uncertainty of 10 μK. To apply this correction, only the 18O abundance of the TPW needs to be known (and the water needs to be of natural origin). Using the results of our two articles, we recommend a correction equation along with the coefficient values for isotopic compositions differing from that of VSMOW and compare the effect of this new equation on a number of triple point cells from the literature and from our own institute. Using our correction equation, the uncertainty in the isotope correction for triple point cell waters used around the world will be <1 μK.
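
    As an orientation aid only (the exact functional form, reference conditions and uncertainty treatment are those defined in the paper and are not reproduced here), a single-parameter correction of the kind described, taken to be linear in the 18O abundance expressed as a fractional deviation from VSMOW, would read

        \Delta T_{\mathrm{iso}} \;\approx\; A_{\mathrm{O}}\,\delta^{18}\mathrm{O}, \qquad A_{\mathrm{O}} = 630\ \mu\mathrm{K},

    so that, purely as a hypothetical example, a water depleted by δ18O = -0.05 relative to VSMOW would realize a triple point roughly 32 μK below that of a VSMOW-filled cell.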

  20. New experimental methodology, setup and LabView program for accurate absolute thermoelectric power and electrical resistivity measurements between 25 and 1600 K: Application to pure copper, platinum, tungsten, and nickel at very high temperatures

    SciTech Connect

    Abadlia, L.; Mayoufi, M.; Gasser, F.; Khalouk, K.; Gasser, J. G.

    2014-09-15

    In this paper we describe an experimental setup designed to measure simultaneously and very accurately the resistivity and the absolute thermoelectric power, also called absolute thermopower or absolute Seebeck coefficient, of solid and liquid conductors/semiconductors over a wide range of temperatures (room temperature to 1600 K in the present work). A careful analysis of the existing experimental data allowed us to extend the absolute thermoelectric power scale of platinum to the range 0-1800 K with two new polynomial expressions. The experimental device is controlled by a LabView program. A detailed description of the accurate dynamic measurement methodology is given in this paper. We measure the absolute thermoelectric power and the electrical resistivity and deduce with a good accuracy the thermal conductivity using the relations between the three electronic transport coefficients, going beyond the classical Wiedemann-Franz law. We use this experimental setup and methodology to give very accurate new results for pure copper, platinum, and nickel, especially at very high temperatures. Resistivity and absolute thermopower measurements can, however, be more than an objective in themselves. Resistivity characterizes the bulk of a material, while absolute thermoelectric power characterizes the material at the point where the electrical contact is established with a couple of metallic elements (forming a thermocouple). In a forthcoming paper we will show that the measurement of resistivity and absolute thermoelectric power advantageously characterizes the (change of) phase, probably as well as DSC (if not better), since the change of phases can easily be followed during several hours/days at constant temperature.
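
    For orientation only (this is the classical Wiedemann-Franz estimate that the paper's beyond-Wiedemann-Franz analysis refines, not the authors' procedure; the resistivity value is an assumed round number, not a result from the paper), the thermal conductivity deduced from a measured resistivity is a one-line calculation:

        # Classical Wiedemann-Franz estimate of the electronic thermal conductivity
        # from a measured resistivity: kappa = L0 * T / rho (Sommerfeld Lorenz number).
        # The resistivity value is an illustrative assumption, not data from the paper.
        L0 = 2.44e-8           # W·Ohm/K^2, theoretical Lorenz number
        T = 300.0              # K
        rho = 1.7e-8           # Ohm·m, roughly pure copper near room temperature (assumed)
        kappa = L0 * T / rho   # W/(m·K)
        print(f"kappa ~ {kappa:.0f} W/(m·K)")   # ~430 W/(m·K), close to tabulated values for Cu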

  1. Construction of Spectroscopically Accurate IR Linelists for NH3 and CO2

    NASA Astrophysics Data System (ADS)

    Huang, X.; Schwenke, D. W.; Lee, T. J.

    2011-05-01

    The strategy of using the best theory together with high-resolution experiment was applied to NH3 and CO2: that is, refine a highly accurate ab initio PES with the most reliable HITRAN or pure experimental data. With 0.01 - 0.02 cm-1 accuracy, our calculations clearly go far beyond simply reproducing experimental data: they are also capable of revealing many deficiencies in the current experimental analysis of the various isotopologues, as well as providing reliable predictions with similar accuracy.

  2. PSYCHOLOGY. Estimating the reproducibility of psychological science.

    PubMed

    2015-08-28

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams. PMID:26315443

  3. Reproducibility of muscle oxygen saturation.

    PubMed

    Thiel, C; Vogt, L; Himmelreich, H; Hübscher, M; Banzer, W

    2011-04-01

    The present study evaluated the reproducibility of tissue oxygenation in relation to oxygen consumption (VO2) across cycle exercise intensities in a test-retest design. 12 subjects (25.7±2.1 years; 24.7±1.9 kg · m(-2)) twice performed an incremental bicycle exercise protocol, while tissue oxygen saturation (StO2) in the vastus lateralis muscle was monitored by a commercially available NIRS unit and VO2 determined by an open-circuit indirect calorimetric system. Coefficients of variation across rest, workloads corresponding to 25, 50 and 75% of individual maximum capacity, and maximum load were 5.8, 4.6, 6.1, 8.0, 11.0% (StO2) and 7.6, 6.0, 3.7, 3.4, 3.1% (VO2), respectively. 95 % CI of relative test-retest differences ranged from -5.6 to +5.4% at 25% load to -17.2 to +7.5% at maximum load for StO2 and from -7.3 to +7.7% at rest to -3.3 to +3.2% at maximum load for VO2. With advancing exercise intensity, within-subject variability of StO2 was augmented, whereas VO2 variability slightly attenuated. NIRS measurements at higher workloads need to be interpreted with caution. PMID:21271493
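
    As a generic illustration of the test-retest statistics quoted above (the paired values below are invented, not the study's data), a within-subject coefficient of variation and Bland-Altman-style limits of agreement can be computed as follows:

        # Hedged sketch: within-subject CV and 95% limits of agreement for test-retest data.
        # CV uses the common convention SD(differences)/sqrt(2) divided by the grand mean.
        # All values are invented for illustration.
        import numpy as np

        day1 = np.array([62.0, 58.5, 71.2, 65.0, 60.3])   # e.g. StO2 (%) on visit 1 (made up)
        day2 = np.array([60.5, 61.0, 69.8, 66.4, 58.9])   # same subjects on visit 2 (made up)

        diff = day1 - day2
        within_sd = diff.std(ddof=1) / np.sqrt(2)
        cv = 100.0 * within_sd / np.concatenate([day1, day2]).mean()
        loa = diff.mean() + np.array([-1.96, 1.96]) * diff.std(ddof=1)
        print(f"within-subject CV ~ {cv:.1f}%, limits of agreement ~ {np.round(loa, 1)}")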

  4. Reproducible analyses of microbial food for advanced life support systems

    NASA Technical Reports Server (NTRS)

    Petersen, Gene R.

    1988-01-01

    The use of yeasts in controlled ecological life support systems (CELSS) for microbial food regeneration in space required the accurate and reproducible analysis of intracellular carbohydrate and protein levels. The reproducible analysis of glycogen was a key element in estimating overall content of edibles in candidate yeast strains. Typical analytical methods for estimating glycogen in Saccharomyces were not found to be entirely applicable to other candidate strains. Rigorous cell lysis coupled with acid/base fractionation followed by specific enzymatic glycogen analyses were required to obtain accurate results in two strains of Candida. A profile of edible fractions of these strains was then determined. The suitability of yeasts as food sources in CELSS food production processes is discussed.

  5. Musical training generalises across modalities and reveals efficient and adaptive mechanisms for reproducing temporal intervals.

    PubMed

    Aagten-Murphy, David; Cappagli, Giulia; Burr, David

    2014-03-01

    Expert musicians are able to time their actions accurately and consistently during a musical performance. We investigated how musical expertise influences the ability to reproduce auditory intervals and how this generalises across different techniques and sensory modalities. We first compared various reproduction strategies and interval lengths, to examine the effects in general and to optimise experimental conditions for testing the effect of music, and found that the effects were robust and consistent across different paradigms. Focussing on a 'ready-set-go' paradigm, subjects reproduced time intervals drawn from distributions varying in total length (176, 352 or 704 ms) or in the number of discrete intervals within the total length (3, 5, 11 or 21 discrete intervals). Overall, Musicians performed more veridically than Non-Musicians, and all subjects reproduced auditory-defined intervals more accurately than visually-defined intervals. However, Non-Musicians, particularly with visual stimuli, consistently exhibited a substantial and systematic regression towards the mean interval. When subjects judged intervals from distributions of longer total length they tended to regress more towards the mean, while the ability to discriminate between discrete intervals within the distribution had little influence on subject error. These results are consistent with a Bayesian model that minimizes reproduction errors by incorporating a central tendency prior weighted by the subject's own temporal precision relative to the current distribution of intervals. Finally, a strong correlation was observed between all durations of formal musical training and total reproduction errors in both modalities (accounting for 30% of the variance). Taken together these results demonstrate that formal musical training improves temporal reproduction, and that this improvement transfers from audition to vision. They further demonstrate the flexibility of sensorimotor mechanisms in adapting to
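
    The central-tendency account described above can be illustrated with the standard Gaussian prior-likelihood combination (a generic textbook form with invented parameters, not the authors' fitted model):

        # Generic sketch of a central-tendency prior: the reproduced interval is a
        # precision-weighted average of the presented interval and the distribution mean.
        # All numbers are illustrative assumptions, not fitted values from the study.
        sigma_sensory = 40.0    # ms, assumed temporal noise of the observer
        sigma_prior = 120.0     # ms, assumed spread of the current interval distribution
        prior_mean = 600.0      # ms, assumed mean of the sampled intervals

        def reproduced(t_presented):
            w = sigma_prior**2 / (sigma_prior**2 + sigma_sensory**2)   # weight on the evidence
            return w * t_presented + (1.0 - w) * prior_mean            # regression to the mean

        for t in (450.0, 600.0, 750.0):
            print(t, "->", round(reproduced(t), 1))
        # A noisier observer (larger sigma_sensory) gets a smaller w and therefore a
        # stronger regression toward the mean, the pattern reported for Non-Musicians.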

  6. Wire like link for cycle reproducible and cycle accurate hardware accelerator

    SciTech Connect

    Asaad, Sameh; Kapur, Mohit; Parker, Benjamin D

    2015-04-07

    First and second field programmable gate arrays are provided which implement first and second blocks of a circuit design to be simulated. The field programmable gate arrays are operated at a first clock frequency and a wire like link is provided to send a plurality of signals between them. The wire like link includes a serializer, on the first field programmable gate array, to serialize the plurality of signals; a deserializer on the second field programmable gate array, to deserialize the plurality of signals; and a connection between the serializer and the deserializer. The serializer and the deserializer are operated at a second clock frequency, greater than the first clock frequency, and the second clock frequency is selected such that latency of transmission and reception of the plurality of signals is less than the period corresponding to the first clock frequency.
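
    A back-of-the-envelope check of the timing constraint described in this abstract (illustrative numbers only; the signal count, framing overhead and margin are assumptions, not values from the patent):

        # Sketch of the constraint: serializing N signals plus framing overhead at the
        # fast link clock must complete within one period of the slow design clock.
        # All numbers are illustrative assumptions.
        f_design = 5e6          # Hz, slow FPGA design (emulation) clock
        n_signals = 64          # signals multiplexed over the wire-like link (assumed)
        overhead_bits = 8       # framing/sync bits per transfer (assumed)
        margin = 0.8            # use at most 80% of the slow-clock period (assumed)

        bits_per_period = n_signals + overhead_bits
        f_link_min = bits_per_period * f_design / margin
        print(f"link clock must exceed ~{f_link_min / 1e6:.0f} MHz to move "
              f"{bits_per_period} bits within one {1e6 / f_design:.1f} us design-clock period")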

  7. Applicability of Density Functional Theory in Reproducing Accurate Vibrational Spectra of Surface Bound Species

    SciTech Connect

    Matanovic, Ivana; Atanassov, Plamen; Kiefer, Boris; Garzon, Fernando; Henson, Neil J.

    2014-10-05

    The structural equilibrium parameters, the adsorption energies, and the vibrational frequencies of the nitrogen molecule and the hydrogen atom adsorbed on the (111) surface of rhodium have been investigated using different generalized-gradient approximation (GGA), nonlocal correlation, meta-GGA, and hybrid functionals, namely, Perdew, Burke, and Ernzerhof (PBE), Revised-RPBE, vdW-DF, Tao, Perdew, Staroverov, and Scuseria functional (TPSS), and Heyd, Scuseria, and Ernzerhof (HSE06) functional in the plane wave formalism. Among the five tested functionals, nonlocal vdW-DF and meta-GGA TPSS functionals are most successful in describing energetics of dinitrogen physisorption to the Rh(111) surface, while the PBE functional provides the correct chemisorption energy for the hydrogen atom. It was also found that TPSS functional produces the best vibrational spectra of the nitrogen molecule and the hydrogen atom on rhodium within the harmonic formalism with the error of -2.62 and -1.1% for the N-N stretching and Rh-H stretching frequency. Thus, TPSS functional was proposed as a method of choice for obtaining vibrational spectra of low weight adsorbates on metallic surfaces within the harmonic approximation. At the anharmonic level, by decoupling the Rh-H and N-N stretching modes from the bulk phonons and by solving one- and two-dimensional Schrödinger equation associated with the Rh-H, Rh-N, and N-N potential energy we calculated the anharmonic correction for N-N and Rh-H stretching modes as -31 cm(-1) and -77 cm(-1) at PBE level. Anharmonic vibrational frequencies calculated with the use of the hybrid HSE06 function are in best agreement with available experiments.

  8. Applicability of density functional theory in reproducing accurate vibrational spectra of surface bound species.

    PubMed

    Matanović, Ivana; Atanassov, Plamen; Kiefer, Boris; Garzon, Fernando H; Henson, Neil J

    2014-10-01

    The structural equilibrium parameters, the adsorption energies, and the vibrational frequencies of the nitrogen molecule and the hydrogen atom adsorbed on the (111) surface of rhodium have been investigated using different generalized-gradient approximation (GGA), nonlocal correlation, meta-GGA, and hybrid functionals, namely, Perdew, Burke, and Ernzerhof (PBE), Revised-RPBE, vdW-DF, Tao, Perdew, Staroverov, and Scuseria functional (TPSS), and Heyd, Scuseria, and Ernzerhof (HSE06) functional in the plane wave formalism. Among the five tested functionals, nonlocal vdW-DF and meta-GGA TPSS functionals are most successful in describing energetics of dinitrogen physisorption to the Rh(111) surface, while the PBE functional provides the correct chemisorption energy for the hydrogen atom. It was also found that TPSS functional produces the best vibrational spectra of the nitrogen molecule and the hydrogen atom on rhodium within the harmonic formalism with the error of -2.62 and -1.1% for the N-N stretching and Rh-H stretching frequency. Thus, TPSS functional was proposed as a method of choice for obtaining vibrational spectra of low weight adsorbates on metallic surfaces within the harmonic approximation. At the anharmonic level, by decoupling the Rh-H and N-N stretching modes from the bulk phonons and by solving one- and two-dimensional Schrödinger equation associated with the Rh-H, Rh-N, and N-N potential energy we calculated the anharmonic correction for N-N and Rh-H stretching modes as -31 cm(-1) and -77 cm(-1) at PBE level. Anharmonic vibrational frequencies calculated with the use of the hybrid HSE06 function are in best agreement with available experiments. PMID:25164265

  9. Vapor Pressure of Aqueous Solutions of Electrolytes Reproduced with Coarse-Grained Models without Electrostatics.

    PubMed

    Perez Sirkin, Yamila A; Factorovich, Matías H; Molinero, Valeria; Scherlis, Damian A

    2016-06-14

    The vapor pressure of water is a key property in a large class of applications from the design of membranes for fuel cells and separations to the prediction of the mixing state of atmospheric aerosols. Molecular simulations have been used to compute vapor pressures, and a few studies on liquid mixtures and solutions have been reported on the basis of the Gibbs Ensemble Monte Carlo method in combination with atomistic force fields. These simulations are costly, making them impractical for the prediction of the vapor pressure of complex materials. The goal of the present work is twofold: (1) to demonstrate the use of the grand canonical screening approach (Factorovich, M. H. J. Chem. Phys. 2014, 140, 064111) to compute the vapor pressure of solutions and to extend the methodology for the treatment of systems without a liquid-vapor interface and (2) to investigate the ability of computationally efficient high-resolution coarse-grained models based on the mW monatomic water potential and ions described exclusively with short-range interactions to reproduce the relative vapor pressure of aqueous solutions. We find that coarse-grained models of LiCl and NaCl solutions faithfully reproduce the experimental relative pressures up to high salt concentrations, despite the inability of these models to predict cohesive energies of the solutions or the salts. A thermodynamic analysis reveals that the coarse-grained models achieve the experimental activity coefficients of water in solution through a compensation of severely underestimated hydration and vaporization free energies of the salts. Our results suggest that coarse-grained models developed to replicate the hydration structure and the effective ion-ion attraction in solution may lead to this compensation. Moreover, they suggest an avenue for the design of coarse-grained models that accurately reproduce the activity coefficients of solutions. PMID:27196963

  10. Anisotropic Turbulence Modeling for Accurate Rod Bundle Simulations

    SciTech Connect

    Baglietto, Emilio

    2006-07-01

    An improved anisotropic eddy viscosity model has been developed for accurate predictions of the thermal hydraulic performance of nuclear reactor fuel assemblies. The proposed model adopts a non-linear formulation of the stress-strain relationship in order to reproduce the anisotropic phenomena, in combination with an optimized low-Reynolds-number formulation based on Direct Numerical Simulation (DNS) data that produces correct damping of the turbulent viscosity in the near-wall region. This work underlines the importance of accurate anisotropic modeling to faithfully reproduce the scale of the turbulence-driven secondary flows inside the bundle subchannels, by comparison with various isothermal and heated experimental cases. The very small-scale secondary motion is responsible for the increased turbulence transport which produces a noticeable homogenization of the velocity distribution and consequently of the circumferential cladding temperature distribution, which is of main interest in bundle design. Various fully developed bare bundle test cases are shown for different geometrical and flow conditions, where the proposed model shows clearly improved predictions, in close agreement with experimental findings, for regular as well as distorted geometries. Finally, the applicability of the model for practical bundle calculations is evaluated through its application in the high-Reynolds form on coarse grids, with excellent results. (author)

  11. Statistical analysis of accurate prediction of local atmospheric optical attenuation with a new model according to weather together with beam wandering compensation system: a season-wise experimental investigation

    NASA Astrophysics Data System (ADS)

    Arockia Bazil Raj, A.; Padmavathi, S.

    2016-07-01

    Atmospheric parameters strongly affect the performance of a Free Space Optical Communication (FSOC) system when the optical wave is propagating through the inhomogeneous turbulent medium. Developing a model to obtain an accurate prediction of optical attenuation according to meteorological parameters becomes significant for understanding the behaviour of the FSOC channel during different seasons. A dedicated free space optical link experimental set-up is developed for the range of 0.5 km at an altitude of 15.25 m. The diurnal profile of received power and corresponding meteorological parameters are continuously measured using the developed optoelectronic assembly and weather station, respectively, and stored in a data logging computer. Measured meteorological parameters (as input factors) and optical attenuation (as response factor) of size [177147 × 4] are used for linear regression analysis and to design the mathematical model that is most suitable for predicting the atmospheric optical attenuation at our test field. A model that exhibits an R2 value of 98.76% and an average percentage deviation of 1.59% is considered for practical implementation. The prediction accuracy of the proposed model is investigated, along with comparative results obtained from some of the existing models, in terms of Root Mean Square Error (RMSE) during different local seasons over a one-year period. An average RMSE value of 0.043 dB/km is obtained over the wide dynamic range of meteorological parameter variations.
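
    A minimal sketch of the kind of multiple linear regression described (a generic numpy least-squares fit on synthetic placeholder data; the predictors and the synthetic relationship are assumptions, not the paper's measurements):

        # Generic multiple-linear-regression sketch: fit attenuation (dB/km) against
        # meteorological predictors and report R^2 and RMSE. Synthetic data only.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 5000
        temp = rng.uniform(15, 40, n)      # deg C (synthetic)
        rh = rng.uniform(30, 95, n)        # relative humidity, % (synthetic)
        wind = rng.uniform(0, 10, n)       # m/s (synthetic)
        atten = 2.0 + 0.03 * rh - 0.02 * temp + 0.1 * wind + rng.normal(0, 0.2, n)  # assumed

        X = np.column_stack([np.ones(n), temp, rh, wind])
        coef, *_ = np.linalg.lstsq(X, atten, rcond=None)     # ordinary least squares
        pred = X @ coef
        rmse = np.sqrt(np.mean((atten - pred) ** 2))
        r2 = 1.0 - np.sum((atten - pred) ** 2) / np.sum((atten - atten.mean()) ** 2)
        print(f"coefficients={np.round(coef, 3)}  R^2={r2:.3f}  RMSE={rmse:.3f} dB/km")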

  12. Accurate on-line mass flow measurements in supercritical fluid chromatography.

    PubMed

    Tarafder, Abhijit; Vajda, Péter; Guiochon, Georges

    2013-12-13

    This work demonstrates the possible advantages and the challenges of accurate on-line measurements of the CO2 mass flow rate during supercritical fluid chromatography (SFC) operations. Only the mass flow rate is constant along the column in SFC. The volume flow rate is not. The critical importance of accurate measurements of mass flow rates for the achievement of reproducible data and the serious difficulties encountered in supercritical fluid chromatography for its assessment were discussed earlier based on the physical properties of carbon dioxide. In this report, we experimentally demonstrate the problems encountered when performing mass flow rate measurements and the gain that can possibly be achieved by acquiring reproducible data using a Coriolis flow meter. The results obtained show how the use of a highly accurate mass flow meter permits, besides the determination of accurate values of the mass flow rate, a systematic, constant diagnosis of the correct operation of the instrument and the monitoring of the condition of the carbon dioxide pump. PMID:24210558

  13. Assessment of the performance of numerical modeling in reproducing a replenishment of sediments in a water-worked channel

    NASA Astrophysics Data System (ADS)

    Juez, C.; Battisacco, E.; Schleiss, A. J.; Franca, M. J.

    2016-06-01

    The artificial replenishment of sediment is used as a method to re-establish sediment continuity downstream of a dam. However, the impact of this technique on the hydraulic conditions, and the resulting bed morphology, is yet to be understood. Several numerical tools have been developed during the last years for modeling sediment transport and morphology evolution which can be used for this application. These models range from 1D to 3D approaches: the former are overly simplistic for the simulation of such a complex geometry, while the latter often require a prohibitive computational effort. However, 2D models are computationally efficient and in these cases may already provide sufficiently accurate predictions of the morphology evolution caused by the sediment replenishment in a river. Here, the 2D shallow water equations in combination with the Exner equation are solved by means of a weakly coupled strategy. The classical friction approach considered for reproducing the bed channel roughness has been modified to take into account the morphological effect of replenishment, which causes a fining of the channel bed. Computational outcomes are compared with four sets of experimental data obtained from several replenishment configurations studied in the laboratory. The experiments differ in terms of placement volume and configuration. A set of analysis parameters is proposed for the experimental-numerical comparison, with particular attention to the spreading, covered surface and travel distance of placed replenishment grains. The numerical tool is reliable in reproducing the overall tendency shown by the experimental data. The effect of roughness fining is better reproduced with the approach proposed herein. However, it is also highlighted that the sediment clusters found in the experiment are not well reproduced numerically in regions of the channel with a limited number of sediment grains.
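
    As a heavily simplified, hedged illustration of weak hydrodynamic-morphodynamic coupling (a 1-D quasi-steady toy with a Grass-type transport law, not the 2-D shallow-water/Exner solver used in the study; every parameter is an assumption):

        # 1-D toy weak coupling: at each step the flow is recomputed from a fixed water
        # level and unit discharge (quasi-steady), then the bed is advanced with the
        # Exner equation, (1 - p) dzb/dt + dqs/dx = 0, using a Grass transport law.
        # Everything here is an illustrative assumption, not the study's configuration.
        import numpy as np

        nx, length = 200, 100.0
        dx = length / nx
        x = np.linspace(0.0, length, nx)
        zb = 0.1 * np.exp(-((x - 30.0) / 5.0) ** 2)   # initial bed hump (m), assumed
        eta, qw = 1.0, 1.0                            # fixed water level (m), unit discharge (m^2/s)
        porosity, a_grass = 0.4, 1e-3                 # bed porosity, Grass coefficient (assumed)
        dt, nsteps = 0.05, 2000

        for _ in range(nsteps):
            h = eta - zb                              # flow depth from the frozen water surface
            u = qw / h                                # depth-averaged velocity
            qs = a_grass * u ** 3                     # sediment flux (Grass law)
            zb -= dt / (1.0 - porosity) * np.gradient(qs, dx)   # Exner bed update

        print("bed hump crest has migrated to x ~", round(float(x[np.argmax(zb)]), 1), "m")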

  14. Technical issues in using robots to reproduce joint specific gait.

    PubMed

    Rosvold, J M; Darcy, S P; Peterson, R C; Achari, Y; Corr, D T; Marchuk, L L; Frank, C B; Shrive, N G; Rosvold, Joshua M; Darcy, Shon P; Peterson, Robert C; Achari, Yamini; Corr, David T; Marchuk, Linda L; Frank, Cyril B; Shrive, Nigel G

    2011-05-01

    Reproduction of the in vivo motions of joints has become possible with improvements in robot technology and in vivo measuring techniques. A motion analysis system has been used to measure the motions of the tibia and femur of the ovine stifle joint during normal gait. These in vivo motions are then reproduced with a parallel robot. To ensure that the motion of the joint is accurately reproduced and that the resulting data are reliable, the testing frame, the data acquisition system, and the effects of limitations of the testing platform need to be considered. Of the latter, the stiffness of the robot and the ability of the control system to process sequential points on the path of motion in a timely fashion for repeatable path accuracy are of particular importance. Use of the system developed will lead to a better understanding of the mechanical environment of joints and ligaments in vivo. PMID:21599101

  15. Grading More Accurately

    ERIC Educational Resources Information Center

    Rom, Mark Carl

    2011-01-01

    Grades matter. College grading systems, however, are often ad hoc and prone to mistakes. This essay focuses on one factor that contributes to high-quality grading systems: grading accuracy (or "efficiency"). I proceed in several steps. First, I discuss the elements of "efficient" (i.e., accurate) grading. Next, I present analytical results…

  16. Numerical reproducibility for implicit Monte Carlo simulations

    SciTech Connect

    Cleveland, M.; Brunner, T.; Gentile, N.

    2013-07-01

    We describe and compare different approaches for achieving numerical reproducibility in photon Monte Carlo simulations. Reproducibility is desirable for code verification, testing, and debugging. Parallelism creates a unique problem for achieving reproducibility in Monte Carlo simulations because it changes the order in which values are summed. This is a numerical problem because double precision arithmetic is not associative. In [1], a way of eliminating this roundoff error using integer tallies was described. This approach successfully achieves reproducibility at the cost of lost accuracy by rounding double precision numbers to fewer significant digits. This integer approach, and other extended reproducibility techniques, are described and compared in this work. Increased precision alone is not enough to ensure reproducibility of photon Monte Carlo simulations. Non-arbitrary-precision approaches required a varying degree of rounding to achieve reproducibility. For the problems investigated in this work, double precision global accuracy was achievable by using 100 bits of precision or greater on all unordered sums, which were subsequently rounded to double precision at the end of every time-step. (authors)
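
    The non-associativity problem and one mitigation in the spirit of the integer-tally approach mentioned above can be demonstrated in a few lines (a generic numerical illustration, not the code or tallies used in the cited work):

        # Floating-point addition is not associative, so parallel reductions that change
        # the summation order can change the result. Rounding each contribution onto a
        # fixed-point (integer) grid before summing makes the total order-independent,
        # at the cost of the discarded low-order bits. Generic illustration only.
        import random

        random.seed(0)
        values = [random.uniform(0.0, 1.0) * 10.0 ** random.randint(-8, 8) for _ in range(100000)]
        shuffled = values[:]
        random.shuffle(shuffled)

        print(sum(values) - sum(shuffled))     # typically a small nonzero roundoff difference

        SCALE = 2 ** 32                        # fixed-point resolution (assumed)
        def fixed_point_sum(xs):
            return sum(round(x * SCALE) for x in xs) / SCALE   # integer tally, then rescale

        print(fixed_point_sum(values) - fixed_point_sum(shuffled))   # exactly 0.0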

  17. Towards Accurate Molecular Modeling of Plastic Bonded Explosives

    NASA Astrophysics Data System (ADS)

    Chantawansri, T. L.; Andzelm, J.; Taylor, D.; Byrd, E.; Rice, B.

    2010-03-01

    There is substantial interest in identifying the controlling factors that influence the susceptibility of polymer bonded explosives (PBXs) to accidental initiation. Numerous Molecular Dynamics (MD) simulations of PBXs using the COMPASS force field have been reported in recent years, where the validity of the force field in modeling the solid EM fill has been judged solely on its ability to reproduce lattice parameters, which is an insufficient metric. Performance of the COMPASS force field in modeling EMs and the polymeric binder has been assessed by calculating structural, thermal, and mechanical properties, where only fair agreement with experimental data is obtained. We performed MD simulations using the COMPASS force field for the polymer binder hydroxyl-terminated polybutadiene and five EMs: cyclotrimethylenetrinitramine, 1,3,5,7-tetranitro-1,3,5,7-tetra-azacyclo-octane, 2,4,6,8,10,12-hexanitrohexaazaisowurtzitane, 2,4,6-trinitro-1,3,5-benzenetriamine, and pentaerythritol tetranitrate. Predicted EM crystallographic and molecular structural parameters, as well as calculated properties for the binder, will be compared with experimental results for different simulation conditions. We also present novel simulation protocols, which improve agreement between experimental and computational results, thus leading to the accurate modeling of PBXs.

  18. Accurate monotone cubic interpolation

    NASA Technical Reports Server (NTRS)

    Huynh, Hung T.

    1991-01-01

    Monotone piecewise cubic interpolants are simple and effective. They are generally third-order accurate, except near strict local extrema where accuracy degenerates to second-order due to the monotonicity constraint. Algorithms for piecewise cubic interpolants, which preserve monotonicity as well as uniform third and fourth-order accuracy are presented. The gain of accuracy is obtained by relaxing the monotonicity constraint in a geometric framework in which the median function plays a crucial role.
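
    For context, the standard monotone cubic that such schemes improve upon is readily available; the sketch below uses SciPy's PCHIP interpolant (a conventional monotonicity-preserving cubic, not the higher-order algorithms introduced in the paper):

        # Standard monotone piecewise-cubic (PCHIP) interpolation on arbitrary sample data,
        # shown for comparison with the higher-order monotone schemes described above.
        import numpy as np
        from scipy.interpolate import PchipInterpolator

        x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
        y = np.array([0.0, 0.1, 0.9, 1.0, 1.0, 1.2])   # monotone data with a flat stretch

        p = PchipInterpolator(x, y)
        xs = np.linspace(0.0, 5.0, 1001)
        print(np.round(p(np.linspace(0.0, 5.0, 11)), 3))   # no overshoot at the plateau
        print(bool(np.all(np.diff(p(xs)) >= -1e-12)))      # monotonicity preserved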

  19. Accurate Finite Difference Algorithms

    NASA Technical Reports Server (NTRS)

    Goodrich, John W.

    1996-01-01

    Two families of finite difference algorithms for computational aeroacoustics are presented and compared. All of the algorithms are single step explicit methods, they have the same order of accuracy in both space and time, with examples up to eleventh order, and they have multidimensional extensions. One of the algorithm families has spectral-like high resolution. Propagation with high order and high resolution algorithms can produce accurate results after O(10(exp 6)) periods of propagation with eight grid points per wavelength.

  20. Reproducible research in vadose zone sciences

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A significant portion of present-day soil and Earth science research is computational, involving complex data analysis pipelines, advanced mathematical and statistical models, and sophisticated computer codes. Opportunities for scientific progress are greatly diminished if reproducing and building o...

  1. Thou Shalt Be Reproducible! A Technology Perspective

    PubMed Central

    Mair, Patrick

    2016-01-01

    This article elaborates on reproducibility in psychology from a technological viewpoint. Modern open source computational environments that foster reproducibility throughout the whole research life cycle, and to which emerging psychology researchers should be sensitized, are shown and explained. First, data archiving platforms that make datasets publicly available are presented. Second, R is advocated as the data-analytic lingua franca in psychology for achieving reproducible statistical analysis. Third, dynamic report generation environments for writing reproducible manuscripts that integrate text, data analysis, and statistical outputs such as figures and tables in a single document are described. Supplementary materials are provided in order to get the reader started with these technologies. PMID:27471486

  2. Optimal seeding of self-reproducing systems.

    PubMed

    Menezes, Amor A; Kabamba, Pierre T

    2012-01-01

    This article is motivated by the need to minimize the number of elements required to establish a self-reproducing system. One such system is a self-reproducing extraterrestrial robotic colony, which reduces the launch payload mass for space exploration compared to current mission configurations. In this work, self-reproduction is achieved by the actions of a robot on available resources. An important consideration for the establishment of any self-reproducing system is the identification of a seed, for instance, a set of resources and a set of robots that utilize them to produce all of the robots in the colony. This article outlines a novel algorithm to determine an optimal seed for self-reproducing systems, with application to a self-reproducing extraterrestrial robotic colony. Optimality is understood as the minimization of a cost function of the resources and, in this article, the robots. Since artificial self-reproduction is currently an open problem, the algorithm is illustrated with a simple robotic self-replicating system from the literature and with a more complicated self-reproducing example from nature. PMID:22035080

  3. Establishment of reproducible osteosarcoma rat model using orthotopic implantation technique.

    PubMed

    Yu, Zhe; Sun, Honghui; Fan, Qingyu; Long, Hua; Yang, Tongtao; Ma, Bao'an

    2009-05-01

    osteosarcoma model was shown to be feasible: the take rate was high, surgical mortality was negligible and the procedure was simple to perform and easily reproduced. It may be a useful tool in the investigation of antiangiogenic and anticancer therapeutics. Ultrasound was found to be a highly accurate tool for tumor diagnosis, localization and measurement and may be recommended for monitoring tumor growth in this model. PMID:19360291

  4. A synthesis approach for reproducing the response of aircraft panels to a turbulent boundary layer excitation.

    PubMed

    Bravo, Teresa; Maury, Cédric

    2011-01-01

    Random wall-pressure fluctuations due to the turbulent boundary layer (TBL) are a feature of the air flow over an aircraft fuselage under cruise conditions, creating undesirable effects such as cabin noise annoyance. In order to test potential solutions to reduce the TBL-induced noise, a cost-efficient alternative to in-flight or wind-tunnel measurements involves the laboratory simulation of the response of aircraft sidewalls to high-speed subsonic TBL excitation. Previously published work has shown that TBL simulation using a near-field array of loudspeakers is only feasible in the low frequency range due to the rapid decay of the spanwise correlation length with frequency. This paper demonstrates through theoretical criteria how the wavenumber filtering capabilities of the radiating panel reduces the number of sources required, thus dramatically enlarging the frequency range over which the response of the TBL-excited panel is accurately reproduced. Experimental synthesis of the panel response to high-speed TBL excitation is found to be feasible over the hydrodynamic coincidence frequency range using a reduced set of near-field loudspeakers driven by optimal signals. Effective methodologies are proposed for an accurate reproduction of the TBL-induced sound power radiated by the panel into a free-field and when coupled to a cavity. PMID:21302997

  5. Relevance relations for the concept of reproducibility

    PubMed Central

    Atmanspacher, H.; Bezzola Lambert, L.; Folkers, G.; Schubiger, P. A.

    2014-01-01

    The concept of reproducibility is widely considered a cornerstone of scientific methodology. However, recent problems with the reproducibility of empirical results in large-scale systems and in biomedical research have cast doubts on its universal and rigid applicability beyond the so-called basic sciences. Reproducibility is a particularly difficult issue in interdisciplinary work where the results to be reproduced typically refer to different levels of description of the system considered. In such cases, it is mandatory to distinguish between more and less relevant features, attributes or observables of the system, depending on the level at which they are described. For this reason, we propose a scheme for a general ‘relation of relevance’ between the level of complexity at which a system is considered and the granularity of its description. This relation implies relevance criteria for particular selected aspects of a system and its description, which can be operationally implemented by an interlevel relation called ‘contextual emergence’. It yields a formally sound and empirically applicable procedure to translate between descriptive levels and thus construct level-specific criteria for reproducibility in an overall consistent fashion. Relevance relations merged with contextual emergence challenge the old idea of one fundamental ontology from which everything else derives. At the same time, our proposal is specific enough to resist the backlash into a relativist patchwork of unconnected model fragments. PMID:24554574

  6. Reproducibility responsibilities in the HPC arena

    SciTech Connect

    Fahey, Mark R; McLay, Robert

    2014-01-01

    Expecting bit-for-bit reproducibility in the HPC arena is not feasible because of the ever-changing hardware and software. No user's application is an island; it lives in an HPC ecosystem that changes over time. Old hardware stops working and even old software won't run on new hardware. Further, software libraries change over time, either by changing the internals or even the interfaces. So bit-for-bit reproducibility should not be expected. Rather, a reasonable expectation is that results are reproducible within error bounds, or that the answers are close (which is its own debate). To expect a researcher to reproduce their own results or the results of others within some error bounds, there must be enough information to recreate all the details of the experiment. This requires complete documentation of all phases of the researcher's workflow: from code to versioning to programming and runtime environments to publishing of data. This argument is the core statement of the Yale 2009 Declaration on Reproducible Research [1]. Although the HPC ecosystem is often outside the researchers' control, the application code could be built almost identically and there is a chance for very similar results with only round-off error differences. To achieve complete documentation at every step, the researcher, the computing center, and the funding agencies all have a role. In this thesis, the role of the researcher is expanded upon as compared to the Yale report, and the role of the computing centers is described.

  7. To Your Health: NLM update transcript - The road to reproducible research?

    MedlinePlus

    ... Medicine reproducibility conference he was interested in the feasibility of (and we quote): 'something like a Clinicaltrials. ... describes when consistent results occur in repeated scientific studies (despite small variations within an experimental set-up), ...

  8. Accurate measurement of time

    NASA Astrophysics Data System (ADS)

    Itano, Wayne M.; Ramsey, Norman F.

    1993-07-01

    The paper discusses current methods for accurate measurements of time by conventional atomic clocks, with particular attention given to the principles of operation of atomic-beam frequency standards, atomic hydrogen masers, and atomic fountain and to the potential use of strings of trapped mercury ions as a time device more stable than conventional atomic clocks. The areas of application of the ultraprecise and ultrastable time-measuring devices that tax the capacity of modern atomic clocks include radio astronomy and tests of relativity. The paper also discusses practical applications of ultraprecise clocks, such as navigation of space vehicles and pinpointing the exact position of ships and other objects on earth using the GPS.

  9. Accurate quantum chemical calculations

    NASA Technical Reports Server (NTRS)

    Bauschlicher, Charles W., Jr.; Langhoff, Stephen R.; Taylor, Peter R.

    1989-01-01

    An important goal of quantum chemical calculations is to provide an understanding of chemical bonding and molecular electronic structure. A second goal, the prediction of energy differences to chemical accuracy, has been much harder to attain. First, the computational resources required to achieve such accuracy are very large, and second, it is not straightforward to demonstrate that an apparently accurate result, in terms of agreement with experiment, does not result from a cancellation of errors. Recent advances in electronic structure methodology, coupled with the power of vector supercomputers, have made it possible to solve a number of electronic structure problems exactly using the full configuration interaction (FCI) method within a subspace of the complete Hilbert space. These exact results can be used to benchmark approximate techniques that are applicable to a wider range of chemical and physical problems. The methodology of many-electron quantum chemistry is reviewed. Methods are considered in detail for performing FCI calculations. The application of FCI methods to several three-electron problems in molecular physics are discussed. A number of benchmark applications of FCI wave functions are described. Atomic basis sets and the development of improved methods for handling very large basis sets are discussed: these are then applied to a number of chemical and spectroscopic problems; to transition metals; and to problems involving potential energy surfaces. Although the experiences described give considerable grounds for optimism about the general ability to perform accurate calculations, there are several problems that have proved less tractable, at least with current computer resources, and these and possible solutions are discussed.

  10. The Economics of Reproducibility in Preclinical Research

    PubMed Central

    Freedman, Leonard P.; Cockburn, Iain M.; Simcoe, Timothy S.

    2015-01-01

    Low reproducibility rates within life science research undermine cumulative knowledge production and contribute to both delays and costs of therapeutic drug development. An analysis of past studies indicates that the cumulative (total) prevalence of irreproducible preclinical research exceeds 50%, resulting in approximately US$28,000,000,000 (US$28B)/year spent on preclinical research that is not reproducible—in the United States alone. We outline a framework for solutions and a plan for long-term improvements in reproducibility rates that will help to accelerate the discovery of life-saving therapies and cures. PMID:26057340

  11. Composting in small laboratory pilots: Performance and reproducibility

    SciTech Connect

    Lashermes, G.; Barriuso, E.; Le Villio-Poitrenaud, M.; Houot, S.

    2012-02-15

    Highlights: We design an innovative small-scale composting device including six 4-l reactors. We investigate the performance and reproducibility of composting on a small scale. Thermophilic conditions are established by self-heating in all replicates. Biochemical transformations, organic matter losses and stabilisation are realistic. The organic matter evolution exhibits good reproducibility for all six replicates. - Abstract: Small-scale reactors (<10 l) have been employed in composting research, but few attempts have assessed the performance of composting considering the transformations of organic matter. Moreover, composting at small scales is often performed by imposing a fixed temperature, thus creating artificial conditions, and the reproducibility of composting has rarely been reported. The objectives of this study are to design an innovative small-scale composting device safeguarding self-heating to drive the composting process and to assess the performance and reproducibility of composting in small-scale pilots. The experimental setup included six 4-l reactors used for composting a mixture of sewage sludge and green wastes. The performance of the process was assessed by monitoring the temperature, O2 consumption and CO2 emissions, and characterising the biochemical evolution of organic matter. A good reproducibility was found for the six replicates with coefficients of variation for all parameters generally lower than 19%. An intense self-heating ensured the existence of a spontaneous thermophilic phase in all reactors. The average loss of total organic matter (TOM) was 46% of the initial content. Compared to the initial mixture, the hot water soluble fraction decreased by 62%, the hemicellulose-like fraction by 68%, the cellulose-like fraction by 50% and the lignin-like fractions by 12% in the final

  12. Reproducibility, Controllability, and Optimization of Lenr Experiments

    NASA Astrophysics Data System (ADS)

    Nagel, David J.

    2006-02-01

    Low-energy nuclear reaction (LENR) measurements are significantly and increasingly reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments.

  13. Natural Disasters: Earth Science Readings. Reproducibles.

    ERIC Educational Resources Information Center

    Lobb, Nancy

    Natural Disasters is a reproducible teacher book that explains what scientists believe to be the causes of a variety of natural disasters and suggests steps that teachers and students can take to be better prepared in the event of a natural disaster. It contains both student and teacher sections. Teacher sections include vocabulary, an answer key,…

  14. Reproducibility of ambulatory blood pressure load.

    PubMed

    Zachariah, P K; Sheps, S G; Bailey, K R; Wiltgen, C M; Moore, A G

    1990-12-01

    Twenty-two hypertensive patients were monitored during two separate drug-free occasions with a Del Mar Avionics ambulatory device. Blood pressure loads (percentage of systolic and diastolic readings more than 140 and 90 mmHg, respectively) and mean BP were measured both to determine their reproducibility and to examine how they correlate with each other. The systolic and diastolic mean awake BPs for day 1 and day 2 were 140/93 mmHg and 140/91 mmHg, respectively, and BP loads were 45%/55% and 43%/54%. Moreover, mean BP loads correlated highly (r = 0.93) with mean BP values taken on the same day. Both ambulatory mean SBP and BP load were highly reproducible (r = 0.87 and 0.80, respectively, during the awake hours), and mean DBP and load were fairly reproducible (r = 0.59 and 0.39, respectively, during the awake hours). Clinically, however, both were consistent from day 1 to day 2. Mean and individual standard deviations also were reproducible for both systolic and diastolic pressures and loads. PMID:2096203
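
    The "load" statistic quoted above is simply the fraction of ambulatory readings above a threshold; a trivial illustration with invented readings (not the study's data):

        # Blood pressure load: percentage of awake readings above 140 mmHg systolic
        # or 90 mmHg diastolic. The readings below are invented for illustration.
        systolic = [152, 138, 147, 141, 135, 160, 128, 149, 144, 137]
        diastolic = [96, 88, 94, 91, 85, 101, 82, 95, 92, 87]

        sys_load = 100.0 * sum(s > 140 for s in systolic) / len(systolic)
        dia_load = 100.0 * sum(d > 90 for d in diastolic) / len(diastolic)
        print(f"systolic load = {sys_load:.0f}%, diastolic load = {dia_load:.0f}%")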

  15. Design Procedure and Fabrication of Reproducible Silicon Vernier Devices for High-Performance Refractive Index Sensing

    PubMed Central

    Troia, Benedetto; Khokhar, Ali Z.; Nedeljkovic, Milos; Reynolds, Scott A.; Hu, Youfang; Mashanovich, Goran Z.; Passaro, Vittorio M. N.

    2015-01-01

    In this paper, we propose a generalized procedure for the design of integrated Vernier devices for high performance chemical and biochemical sensing. In particular, we demonstrate the accurate control of the most critical design and fabrication parameters of silicon-on-insulator cascade-coupled racetrack resonators operating in the second regime of the Vernier effect, around 1.55 μm. The experimental implementation of our design strategies has allowed a rigorous and reliable investigation of the influence of racetrack resonator and directional coupler dimensions as well as of waveguide process variability on the operation of Vernier devices. Figures of merit of our Vernier architectures have been measured experimentally, evidencing a high reproducibility and a very good agreement with the theoretical predictions, as also confirmed by relative errors even lower than 1%. Finally, a Vernier gain as high as 30.3, average insertion loss of 2.1 dB and extinction ratio up to 30 dB have been achieved. PMID:26067193

  16. Design Procedure and Fabrication of Reproducible Silicon Vernier Devices for High-Performance Refractive Index Sensing.

    PubMed

    Troia, Benedetto; Khokhar, Ali Z; Nedeljkovic, Milos; Reynolds, Scott A; Hu, Youfang; Mashanovich, Goran Z; Passaro, Vittorio M N

    2015-01-01

    In this paper, we propose a generalized procedure for the design of integrated Vernier devices for high performance chemical and biochemical sensing. In particular, we demonstrate the accurate control of the most critical design and fabrication parameters of silicon-on-insulator cascade-coupled racetrack resonators operating in the second regime of the Vernier effect, around 1.55 μm. The experimental implementation of our design strategies has allowed a rigorous and reliable investigation of the influence of racetrack resonator and directional coupler dimensions as well as of waveguide process variability on the operation of Vernier devices. Figures of merit of our Vernier architectures have been measured experimentally, evidencing a high reproducibility and a very good agreement with the theoretical predictions, as also confirmed by relative errors even lower than 1%. Finally, a Vernier gain as high as 30.3, average insertion loss of 2.1 dB and extinction ratio up to 30 dB have been achieved. PMID:26067193
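
    For orientation, the Vernier gain reported for cascade-coupled resonators of this kind is commonly expressed through the free spectral ranges of the two resonators (a textbook relation, not a formula taken from this paper):

        G = \frac{\mathrm{FSR}_{\mathrm{filter}}}{\left|\,\mathrm{FSR}_{\mathrm{filter}} - \mathrm{FSR}_{\mathrm{sensor}}\,\right|}

    so that, purely as a hypothetical example, two racetracks with FSRs of 3.13 nm and 3.03 nm would give G ≈ 31, the same order as the gain of 30.3 reported above.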

  17. A Physical Activity Questionnaire: Reproducibility and Validity

    PubMed Central

    Barbosa, Nicolas; Sanchez, Carlos E.; Vera, Jose A.; Perez, Wilson; Thalabard, Jean-Christophe; Rieu, Michel

    2007-01-01

    This study evaluates the Quantification de L'Activite Physique en Altitude chez les Enfants (QAPACE) supervised self-administered questionnaire reproducibility and validity on the estimation of the mean daily energy expenditure (DEE) in Bogotá's schoolchildren. The comprehension was assessed on 324 students, whereas the reproducibility was studied on a different random sample of 162 who were exposed twice to it. Reproducibility was assessed using both the Bland-Altman plot and the intra-class correlation coefficient (ICC). The validity was studied in a sample of 18 girls and 18 boys randomly selected, who completed the test-retest study. The DEE derived from the questionnaire was compared with the laboratory measurement results of the peak oxygen uptake (Peak VO2) from ergo-spirometry and the Leger Test. The reproducibility ICC was 0.96 (95% C.I. 0.95-0.97); by age categories 8-10, 0.94 (0.89-0.97); 11-13, 0.98 (0.96-0.99); 14-16, 0.95 (0.91-0.98). The ICC between mean TEE as estimated by the questionnaire and the direct and indirect Peak VO2 was 0.76 (0.66) (p<0.01); by age categories, 8-10, 11-13, and 14-16 were 0.89 (0.87), 0.76 (0.78) and 0.88 (0.80) respectively. The QAPACE questionnaire is reproducible and valid for estimating PA and showed a high correlation with the Peak VO2 uptake. Key points: The presence of a supervisor and the limited size of the group, with the possibility of answering their questions, could explain the high reproducibility of this questionnaire. No study in the literature had directly addressed the issue of estimating a yearly average PA including school and vacation periods. A two-step procedure, in the population of schoolchildren of Bogotá, gives confidence in the use of the QAPACE questionnaire in a large epidemiological survey in related populations. PMID:24149485
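
    As a generic illustration of the test-retest ICC quoted above (synthetic paired scores and a simple one-way random-effects estimator, not the study's data or its exact model):

        # One-way random-effects ICC(1,1) for test-retest data:
        # ICC = (MSB - MSW) / (MSB + (k - 1) * MSW). Synthetic scores only.
        import numpy as np

        scores = np.array([            # rows = subjects, columns = two administrations
            [310, 322], [415, 400], [508, 515], [620, 611], [287, 301],
            [350, 344], [470, 482], [560, 549], [398, 405], [430, 441]], dtype=float)

        n, k = scores.shape
        grand = scores.mean()
        ms_between = k * np.sum((scores.mean(axis=1) - grand) ** 2) / (n - 1)
        ms_within = np.sum((scores - scores.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))
        icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
        print(f"ICC(1,1) ~ {icc:.3f}")   # close to 1 when repeated administrations agree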

  18. A Simple and Accurate Method for Measuring Enzyme Activity.

    ERIC Educational Resources Information Center

    Yip, Din-Yan

    1997-01-01

    Presents methods commonly used for investigating enzyme activity using catalase and presents a new method for measuring catalase activity that is more reliable and accurate. Provides results that are readily reproduced and quantified. Can also be used for investigations of enzyme properties such as the effects of temperature, pH, inhibitors,…

  19. A meshfree unification: reproducing kernel peridynamics

    NASA Astrophysics Data System (ADS)

    Bessa, M. A.; Foster, J. T.; Belytschko, T.; Liu, Wing Kam

    2014-06-01

    This paper is the first investigation establishing the link between the meshfree state-based peridynamics method and other meshfree methods, in particular with the moving least squares reproducing kernel particle method (RKPM). It is concluded that the discretization of state-based peridynamics leads directly to an approximation of the derivatives that can be obtained from RKPM. However, state-based peridynamics obtains the same result at a significantly lower computational cost which motivates its use in large-scale computations. In light of the findings of this study, an update to the method is proposed such that the limitations regarding application of boundary conditions and the use of non-uniform grids are corrected by using the reproducing kernel approximation.

  20. The Road to Reproducibility in Animal Research.

    PubMed

    Jilka, Robert L

    2016-07-01

    Reproducibility of research findings is the hallmark of scientific advance. However, the recently noted lack of reproducibility and transparency of published research using animal models of human biology and disease has alarmed funders, scientists, and the public. Improved reporting of methodology and better use of statistical tools are needed to enhance the quality and utility of published research. Reporting guidelines like Animal Research: Reporting In Vivo Experiments (ARRIVE) have been devised to achieve these goals, but most biomedical research journals, including the JBMR, have not been able to obtain high compliance. Cooperative efforts among authors, reviewers and editors-empowered by increased awareness of their responsibilities, and enabled by user-friendly guidelines-are needed to solve this problem. © 2016 American Society for Bone and Mineral Research. PMID:27255286

  1. Reproducibility study for volume estimation in MRI of the brain using the Eigenimage algorithm

    NASA Astrophysics Data System (ADS)

    Windham, Joe P.; Peck, Donald J.; Soltanian-Zadeh, Hamid

    1995-05-01

    Accurate and reproducible volume calculations are essential for diagnosis and treatment evaluation in many medical situations. Current techniques employ planimetric methods that are very time-consuming if reliable results are to be obtained. The reproducibility and accuracy of these methods depend on the user and the complexity of the volume being measured. We have reported on an algorithm for volume calculation that uses the Eigenimage filter to segment a desired feature from surrounding, interfering features. The pixel intensities of the resulting image preserve information pertaining to partial volume averaging effects in each voxel, thus providing an accurate volume calculation. Also, the amount of time required is significantly reduced, as compared to planimetric methods, and the reproducibility is less user-dependent and is independent of the volume shape. In simulations and phantom studies the errors in accuracy and reproducibility of this method were less than 2%. The purpose of this study was to determine the reproducibility of the method for volume calculations of the human brain. Ten volunteers were imaged and the volumes of white matter, gray matter, and CSF were estimated. The time required to calculate the volume for all three tissues was approximately one minute per slice. The inter- and intra-observer reproducibility errors were less than 5% on average for all volumes calculated. These results were found to depend on the proper selection of the ROIs used to define the tissue signature vectors and on the non-uniformity of the MRI system.
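
    Because the filtered image preserves partial-volume information in its pixel intensities, the final volume estimate reduces to a weighted sum of voxels; a hedged sketch (synthetic array and assumed voxel dimensions, not the authors' implementation):

        # Volume from a partial-volume-preserving filtered image: each voxel contributes
        # in proportion to its filtered intensity in [0, 1]. Synthetic data and voxel size.
        import numpy as np

        rng = np.random.default_rng(2)
        filtered = np.clip(rng.normal(0.2, 0.3, (10, 256, 256)), 0.0, 1.0)   # slices x rows x cols
        voxel_volume_mm3 = 0.9 * 0.9 * 5.0    # in-plane resolution x slice thickness (assumed)
        volume_ml = filtered.sum() * voxel_volume_mm3 / 1000.0
        print(f"estimated volume ~ {volume_ml:.0f} mL")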

  2. Reproducibility and imputation of air toxics data.

    PubMed

    Le, Hien Q; Batterman, Stuart A; Wahl, Robert L

    2007-12-01

    Ambient air quality datasets include missing data, values below method detection limits and outliers, and the precision and accuracy of the measurements themselves are often unknown. At the same time, many analyses require continuous data sequences and assume that measurements are error-free. While a variety of data imputation and cleaning techniques are available, the evaluation of such techniques remains limited. This study evaluates the performance of these techniques for ambient air toxics measurements, a particularly challenging application, and includes the analysis of intra- and inter-laboratory precision. The analysis uses an unusually complete dataset, consisting of daily measurements of over 70 species of carbonyls and volatile organic compounds (VOCs) collected over a one year period in Dearborn, Michigan, including 122 pairs of replicates. Analysis was restricted to compounds found above detection limits in ≥20% of the samples. Outliers were detected using the Gumbel extreme value distribution. Error models for inter- and intra-laboratory reproducibility were derived from replicate samples. Imputation variables were selected using a generalized additive model, and the performance of two techniques, multiple imputation and optimal linear estimation, was evaluated for three missingness patterns. Many species were rarely detected or had very poor reproducibility. Error models developed for seven carbonyls showed median intra- and inter-laboratory errors of 22% and 25%, respectively. Better reproducibility was seen for the 16 VOCs meeting detection and reproducibility criteria. Imputation performance depended on the compound and missingness pattern. Data missing at random could be adequately imputed, but imputations for row-wise deletions, the most common type of missingness pattern encountered, were not informative. The analysis shows that air toxics data require significant efforts to identify and mitigate errors, outliers and missing observations.
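
    To illustrate the outlier-screening step named above, here is a minimal sketch of fitting a Gumbel (extreme value type I) distribution and flagging values beyond a high quantile. The synthetic concentrations, the right-tailed fit, and the 99th-percentile cutoff are assumptions for illustration; the study's actual screening criteria are not reproduced.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
conc = rng.lognormal(mean=0.0, sigma=0.6, size=365)   # synthetic daily concentrations (illustrative)
conc[10] = 60.0                                       # implant one extreme value

# Fit a Gumbel (extreme value type I) distribution and flag values beyond a high quantile.
loc, scale = stats.gumbel_r.fit(conc)
cutoff = stats.gumbel_r.ppf(0.99, loc=loc, scale=scale)
outliers = np.where(conc > cutoff)[0]
print(f"cutoff = {cutoff:.2f}, flagged indices: {outliers}")
```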

  3. An International Ki67 Reproducibility Study

    PubMed Central

    2013-01-01

    Background In breast cancer, immunohistochemical assessment of proliferation using the marker Ki67 has potential use in both research and clinical management. However, lack of consistency across laboratories has limited Ki67’s value. A working group was assembled to devise a strategy to harmonize Ki67 analysis and increase scoring concordance. Toward that goal, we conducted a Ki67 reproducibility study. Methods Eight laboratories received 100 breast cancer cases arranged into 1-mm core tissue microarrays—one set stained by the participating laboratory and one set stained by the central laboratory, both using antibody MIB-1. Each laboratory scored Ki67 as the percentage of positively stained invasive tumor cells using its own method. Six laboratories repeated scoring of 50 locally stained cases on 3 different days. Sources of variation were analyzed using random effects models with log2-transformed measurements. Reproducibility was quantified by intraclass correlation coefficient (ICC), and the approximate two-sided 95% confidence intervals (CIs) for the true intraclass correlation coefficients in these experiments were provided. Results Intralaboratory reproducibility was high (ICC = 0.94; 95% CI = 0.93 to 0.97). Interlaboratory reproducibility was only moderate (central staining: ICC = 0.71, 95% CI = 0.47 to 0.78; local staining: ICC = 0.59, 95% CI = 0.37 to 0.68). The geometric mean of Ki67 values for each laboratory across the 100 cases ranged from 7.1% to 23.9% with central staining and from 6.1% to 30.1% with local staining. Factors contributing to interlaboratory discordance included tumor region selection, counting method, and subjective assessment of staining positivity. Formal counting methods gave more consistent results than visual estimation. Conclusions Substantial variability in Ki67 scoring was observed among some of the world’s most experienced laboratories. Ki67 values and cutoffs for clinical decision-making cannot be transferred between laboratories without
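
    For readers unfamiliar with the reproducibility metric used above, the following is a minimal sketch of a one-way random-effects intraclass correlation on log2-transformed scores. The synthetic data, the ICC(1) form, and the simple replicate layout are assumptions; the study fitted richer random effects models and reported confidence intervals.

```python
import numpy as np

def icc_oneway(scores):
    """One-way random-effects ICC(1). scores: cases x replicate measurements."""
    n, k = scores.shape
    grand = scores.mean()
    case_means = scores.mean(axis=1)
    msb = k * np.sum((case_means - grand) ** 2) / (n - 1)               # between-case mean square
    msw = np.sum((scores - case_means[:, None]) ** 2) / (n * (k - 1))   # within-case mean square
    return (msb - msw) / (msb + (k - 1) * msw)

rng = np.random.default_rng(1)
true_ki67 = rng.uniform(2, 60, size=100)                                 # % positive cells per case (synthetic)
reps = np.log2(true_ki67[:, None] * rng.lognormal(0, 0.15, (100, 3)))    # 3 replicate scorings, log2 scale
print(f"ICC = {icc_oneway(reps):.2f}")
```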

  4. Reproducibility of liquid oxygen impact test results

    NASA Technical Reports Server (NTRS)

    Gayle, J. B.

    1975-01-01

    Results for 12,000 impacts on a wide range of materials were studied to determine the reproducibility of the liquid oxygen impact test method. Standard deviations representing the overall variability of results were in close agreement with the expected values for a binomial process. This indicates that the major source of variability is due to the go/no-go nature of the test method and that variations due to sampling and testing operations were not significant.

  5. Tools and techniques for computational reproducibility.

    PubMed

    Piccolo, Stephen R; Frampton, Michael B

    2016-01-01

    When reporting research findings, scientists document the steps they followed so that others can verify and build upon the research. When those steps have been described in sufficient detail that others can retrace the steps and obtain similar results, the research is said to be reproducible. Computers play a vital role in many research disciplines and present both opportunities and challenges for reproducibility. Computers can be programmed to execute analysis tasks, and those programs can be repeated and shared with others. The deterministic nature of most computer programs means that the same analysis tasks, applied to the same data, will often produce the same outputs. However, in practice, computational findings often cannot be reproduced because of complexities in how software is packaged, installed, and executed-and because of limitations associated with how scientists document analysis steps. Many tools and techniques are available to help overcome these challenges; here we describe seven such strategies. With a broad scientific audience in mind, we describe the strengths and limitations of each approach, as well as the circumstances under which each might be applied. No single strategy is sufficient for every scenario; thus we emphasize that it is often useful to combine approaches. PMID:27401684

  6. A Framework for Reproducible Latent Fingerprint Enhancements.

    PubMed

    Carasso, Alfred S

    2014-01-01

    Photoshop processing of latent fingerprints is the preferred methodology among law enforcement forensic experts, but that approach is not fully reproducible and may lead to questionable enhancements. Alternative, independent, fully reproducible enhancements, using IDL Histogram Equalization and IDL Adaptive Histogram Equalization, can produce better-defined ridge structures, along with considerable background information. Applying a systematic slow motion smoothing procedure to such IDL enhancements, based on the rapid FFT solution of a Lévy stable fractional diffusion equation, can attenuate background detail while preserving ridge information. The resulting smoothed latent print enhancements are comparable to, but distinct from, forensic Photoshop images suitable for input into automated fingerprint identification systems (AFIS). In addition, this progressive smoothing procedure can be reexamined by displaying the suite of progressively smoother IDL images. That suite can be stored, providing an audit trail that allows monitoring for possible loss of useful information, in transit to the user-selected optimal image. Such independent and fully reproducible enhancements provide a valuable frame of reference that may be helpful in informing, complementing, and possibly validating the forensic Photoshop methodology. PMID:26601028
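
    The smoothing step described above can be illustrated with a minimal sketch of Lévy stable fractional diffusion applied in Fourier space, where one diffusion step multiplies the 2-D FFT of the image by exp(-t·|k|^α). The exponent, time values, and synthetic ridge image are illustrative assumptions; the IDL enhancement steps and the paper's parameter choices are not reproduced.

```python
import numpy as np

def levy_smooth(image, alpha=1.5, t=0.5):
    """Smooth an image by one step of Levy stable fractional diffusion:
    multiply the 2-D FFT by exp(-t * |k|**alpha)."""
    ny, nx = image.shape
    ky = np.fft.fftfreq(ny)[:, None]
    kx = np.fft.fftfreq(nx)[None, :]
    k = 2 * np.pi * np.hypot(kx, ky)
    return np.real(np.fft.ifft2(np.fft.fft2(image) * np.exp(-t * k ** alpha)))

rng = np.random.default_rng(2)
ridges = np.sin(np.linspace(0, 40, 256))[None, :] * np.ones((256, 1))        # synthetic ridge pattern
noisy = ridges + 0.8 * rng.standard_normal((256, 256))
suite = [levy_smooth(noisy, alpha=1.5, t=t) for t in (0.2, 0.5, 1.0, 2.0)]    # progressively smoother images
print([f"{img.std():.3f}" for img in suite])  # background variability decreases along the suite
```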

  7. A Framework for Reproducible Latent Fingerprint Enhancements

    PubMed Central

    Carasso, Alfred S.

    2014-01-01

    Photoshop processing of latent fingerprints is the preferred methodology among law enforcement forensic experts, but that approach is not fully reproducible and may lead to questionable enhancements. Alternative, independent, fully reproducible enhancements, using IDL Histogram Equalization and IDL Adaptive Histogram Equalization, can produce better-defined ridge structures, along with considerable background information. Applying a systematic slow motion smoothing procedure to such IDL enhancements, based on the rapid FFT solution of a Lévy stable fractional diffusion equation, can attenuate background detail while preserving ridge information. The resulting smoothed latent print enhancements are comparable to, but distinct from, forensic Photoshop images suitable for input into automated fingerprint identification systems (AFIS). In addition, this progressive smoothing procedure can be reexamined by displaying the suite of progressively smoother IDL images. That suite can be stored, providing an audit trail that allows monitoring for possible loss of useful information, in transit to the user-selected optimal image. Such independent and fully reproducible enhancements provide a valuable frame of reference that may be helpful in informing, complementing, and possibly validating the forensic Photoshop methodology. PMID:26601028

  8. Robust tissue classification for reproducible wound assessment in telemedicine environments

    NASA Astrophysics Data System (ADS)

    Wannous, Hazem; Treuillet, Sylvie; Lucas, Yves

    2010-04-01

    In telemedicine environments, a standardized and reproducible assessment of wounds, using a simple free-handled digital camera, is an essential requirement. However, to ensure robust tissue classification, particular attention must be paid to the complete design of the color processing chain. We introduce the key steps including color correction, merging of expert labeling, and segmentation-driven classification based on support vector machines. The tool thus developed ensures stability under lighting-condition, viewpoint, and camera changes, to achieve accurate and robust classification of skin tissues. Clinical tests demonstrate that such an advanced tool, which forms part of a complete 3-D and color wound assessment system, significantly improves the monitoring of the healing process. It achieves an overlap score of 79.3% against 69.1% for a single expert, after mapping on the medical reference developed from the image labeling by a college of experts.

  9. Reproducibility of Variant Calls in Replicate Next Generation Sequencing Experiments

    PubMed Central

    Qi, Yuan; Liu, Xiuping; Liu, Chang-gong; Wang, Bailing; Hess, Kenneth R.; Symmans, W. Fraser; Shi, Weiwei; Pusztai, Lajos

    2015-01-01

    Nucleotide alterations detected by next generation sequencing are not always true biological changes but could represent sequencing errors. Even highly accurate methods can yield substantial error rates when applied to millions of nucleotides. In this study, we examined the reproducibility of nucleotide variant calls in replicate sequencing experiments of the same genomic DNA. We performed targeted sequencing of all known human protein kinase genes (kinome) (~3.2 Mb) using the SOLiD v4 platform. Seventeen breast cancer samples were sequenced in duplicate (n=14) or triplicate (n=3) to assess concordance of all calls and single nucleotide variant (SNV) calls. The concordance rates over the entire sequenced region were >99.99%, while the concordance rates for SNVs were 54.3-75.5%. There was substantial variation in basic sequencing metrics from experiment to experiment. The type of nucleotide substitution and genomic location of the variant had little impact on concordance but concordance increased with coverage level, variant allele count (VAC), variant allele frequency (VAF), variant allele quality and p-value of SNV-call. The most important determinants of concordance were VAC and VAF. Even using the highest stringency of QC metrics the reproducibility of SNV calls was around 80% suggesting that erroneous variant calling can be as high as 20-40% in a single experiment. The sequence data have been deposited into the European Genome-phenome Archive (EGA) with accession number EGAS00001000826. PMID:26136146
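
    As a minimal sketch of the concordance bookkeeping described above, the snippet below compares two replicate variant call sets keyed by (chromosome, position, alternate allele); the calls are synthetic, and the study's filters on coverage, allele count, and allele frequency are omitted.

```python
# Concordance of SNV calls between two replicate runs, keyed by (chrom, pos, alt). Synthetic calls.
rep1 = {("chr1", 1000, "T"), ("chr7", 2500, "A"), ("chr17", 4200, "G")}
rep2 = {("chr1", 1000, "T"), ("chr17", 4200, "G"), ("chrX", 9100, "C")}

shared = rep1 & rep2
union = rep1 | rep2
concordance = len(shared) / len(union)          # fraction of calls seen in both replicates
print(f"concordant calls: {len(shared)}/{len(union)} ({concordance:.1%})")
```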

  10. A highly reproducible rotenone model of Parkinson's disease.

    PubMed

    Cannon, Jason R; Tapias, Victor; Na, Hye Mee; Honick, Anthony S; Drolet, Robert E; Greenamyre, J Timothy

    2009-05-01

    The systemic rotenone model of Parkinson's disease (PD) accurately replicates many aspects of the pathology of human PD and has provided insights into the pathogenesis of PD. The major limitation of the rotenone model has been its variability, both in terms of the percentage of animals that develop a clear-cut nigrostriatal lesion and the extent of that lesion. The goal here was to develop an improved and highly reproducible rotenone model of PD. In these studies, male Lewis rats in three age groups (3, 7 or 12-14 months) were administered rotenone (2.75 or 3.0 mg/kg/day) in a specialized vehicle by daily intraperitoneal injection. All rotenone-treated animals developed bradykinesia, postural instability, and/or rigidity, which were reversed by apomorphine, consistent with a lesion of the nigrostriatal dopamine system. Animals were sacrificed when the PD phenotype became debilitating. Rotenone treatment caused a 45% loss of tyrosine hydroxylase-positive substantia nigra neurons and a commensurate loss of striatal dopamine. Additionally, in rotenone-treated animals, alpha-synuclein and poly-ubiquitin positive aggregates were observed in dopamine neurons of the substantia nigra. In summary, this version of the rotenone model is highly reproducible and may provide an excellent tool to test new neuroprotective strategies. PMID:19385059

  11. 10 CFR 95.43 - Authority to reproduce.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... detects the unauthorized reproduction of classified documents is encouraged. (b) Unless restricted by the CSA, Secret and Confidential documents may be reproduced. Reproduced copies of classified...

  12. 10 CFR 95.43 - Authority to reproduce.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... detects the unauthorized reproduction of classified documents is encouraged. (b) Unless restricted by the CSA, Secret and Confidential documents may be reproduced. Reproduced copies of classified...

  13. 10 CFR 95.43 - Authority to reproduce.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... detects the unauthorized reproduction of classified documents is encouraged. (b) Unless restricted by the CSA, Secret and Confidential documents may be reproduced. Reproduced copies of classified...

  14. 10 CFR 95.43 - Authority to reproduce.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... detects the unauthorized reproduction of classified documents is encouraged. (b) Unless restricted by the CSA, Secret and Confidential documents may be reproduced. Reproduced copies of classified...

  15. A highly accurate interatomic potential for argon

    NASA Astrophysics Data System (ADS)

    Aziz, Ronald A.

    1993-09-01

    A modified potential based on the individually damped model of Douketis, Scoles, Marchetti, Zen, and Thakkar [J. Chem. Phys. 76, 3057 (1982)] is presented which fits, within experimental error, the accurate ultraviolet (UV) vibration-rotation spectrum of argon determined by UV laser absorption spectroscopy by Herman, LaRocque, and Stoicheff [J. Chem. Phys. 89, 4535 (1988)]. Other literature potentials fail to do so. The potential also is shown to predict a large number of other properties and is probably the most accurate characterization of the argon interaction constructed to date.

  16. Towards reproducible, scalable lateral molecular electronic devices

    NASA Astrophysics Data System (ADS)

    Durkan, Colm; Zhang, Qian

    2014-08-01

    An approach to reproducibly fabricate molecular electronic devices is presented. Lateral nanometer-scale gaps with high yield are formed in Au/Pd nanowires by a combination of electromigration and Joule-heating-induced thermomechanical stress. The resulting nanogap devices are used to measure the electrical properties of small numbers of two different molecular species with different end-groups, namely 1,4-butane dithiol and 1,5-diamino-2-methylpentane. Fluctuations in the current reveal that in the case of the dithiol molecule devices, individual molecules conduct intermittently, with the fluctuations becoming more pronounced at larger biases.

  17. Queer nuclear families? Reproducing and transgressing heteronormativity.

    PubMed

    Folgerø, Tor

    2008-01-01

    During the past decade the public debate on gay and lesbian adoptive rights has been extensive in the Norwegian media. The debate illustrates how women and men planning to raise children in homosexual family constellations challenge prevailing cultural norms and existing concepts of kinship and family. The article discusses how lesbian mothers and gay fathers understand and redefine their own family practices. An essential point in this article is the fundamental ambiguity in these families' accounts of themselves-how they simultaneously transgress and reproduce heteronormative assumptions about childhood, fatherhood, motherhood, family and kinship. PMID:18771116

  18. Open and reproducible global land use classification

    NASA Astrophysics Data System (ADS)

    Nüst, Daniel; Václavík, Tomáš; Pross, Benjamin

    2015-04-01

    Researchers led by the Helmholtz Centre for Environmental Research (UFZ) developed a new world map of land use systems based on over 30 diverse indicators (http://geoportal.glues.geo.tu-dresden.de/stories/landsystemarchetypes.html) of land use intensity, climate and environmental and socioeconomic factors. They identified twelve land system archetypes (LSA) using a data-driven classification algorithm (self-organizing maps) to assess global impacts of land use on the environment, and found unexpected similarities across global regions. We present how the algorithm behind this analysis can be published as an executable web process using 52°North WPS4R (https://wiki.52north.org/bin/view/Geostatistics/WPS4R) within the GLUES project (http://modul-a.nachhaltiges-landmanagement.de/en/scientific-coordination-glues/). WPS4R is an open source collaboration platform for researchers, analysts and software developers to publish R scripts (http://www.r-project.org/) as a geo-enabled OGC Web Processing Service (WPS) process. The interoperable interface to call the geoprocess allows both reproducibility of the analysis and integration of user data without knowledge about web services or classification algorithms. The open platform allows everybody to replicate the analysis in their own environments. The LSA WPS process has several input parameters, which can be changed via a simple web interface. The input parameters are used to configure both the WPS environment and the LSA algorithm itself. The encapsulation as a web process allows integration of non-public datasets, while at the same time the publication requires a well-defined documentation of the analysis. We demonstrate this platform specifically to domain scientists and show how reproducibility and open source publication of analyses can be enhanced. We also discuss future extensions of the reproducible land use classification, such as the possibility for users to enter their own areas of interest to the system and

  19. Towards reproducible, scalable lateral molecular electronic devices

    SciTech Connect

    Durkan, Colm Zhang, Qian

    2014-08-25

    An approach to reproducibly fabricate molecular electronic devices is presented. Lateral nanometer-scale gaps with high yield are formed in Au/Pd nanowires by a combination of electromigration and Joule-heating-induced thermomechanical stress. The resulting nanogap devices are used to measure the electrical properties of small numbers of two different molecular species with different end-groups, namely 1,4-butane dithiol and 1,5-diamino-2-methylpentane. Fluctuations in the current reveal that in the case of the dithiol molecule devices, individual molecules conduct intermittently, with the fluctuations becoming more pronounced at larger biases.

  20. Spectroscopically Accurate Line Lists for Application in Sulphur Chemistry

    NASA Astrophysics Data System (ADS)

    Underwood, D. S.; Azzam, A. A. A.; Yurchenko, S. N.; Tennyson, J.

    2013-09-01

    for inclusion in standard atmospheric and planetary spectroscopic databases. The methods used to compute the ab initio potential energy and dipole moment surfaces involved minor corrections to the equilibrium S-O distance, which produced good agreement with experimentally determined rotational energies. However, the purely ab initio method was not able to reproduce an equally spectroscopically accurate representation of vibrational motion. We therefore present an empirical refinement to this original, ab initio potential surface, based on the experimental data available. This will not only be used to reproduce the room-temperature spectrum to a greater degree of accuracy, but is essential in the production of a larger, accurate line list necessary for the simulation of higher temperature spectra: we aim for coverage suitable for T ≤ 800 K. Our preliminary studies on SO3 have also shown it to exhibit an interesting "forbidden" rotational spectrum and "clustering" of rotational states; to our knowledge this phenomenon has not been observed in other examples of trigonal planar molecules and is also an investigative avenue we wish to pursue. Finally, the IR absorption bands for SO2 and SO3 exhibit a strong overlap, and the inclusion of SO2 as a complement to our studies is something that we will be interested in doing in the near future.

  1. Comparative reproducibility of defibrillation threshold and upper limit of vulnerability.

    PubMed

    Swerdlow, C D; Davie, S; Ahern, T; Chen, P S

    1996-12-01

    defibrillation efficacy under different experimental conditions, the sample sizes required to detect differences of 2 J, 3 J, and 4 J (80% power, P < 0.05) were 52, 24, and 15 for DFT versus 15, 8, and 6 for ULV. We conclude that a simple, clinically applicable method for determination of ULV is more reproducible than the single point DFT. Measured correlations between the ULV and the single point DFT are limited by the reproducibility of the DFT measurement. PMID:8994950

  2. Reproducibility Data on SUMMiT

    SciTech Connect

    Irwin, Lloyd; Jakubczak, Jay; Limary, Siv; McBrayer, John; Montague, Stephen; Smith, James; Sniegowski, Jeffry; Stewart, Harold; de Boer, Maarten

    1999-07-16

    SUMMiT (Sandia Ultra-planar Multi-level MEMS Technology) at the Sandia National Laboratories' MDL (Microelectronics Development Laboratory) is a standardized MEMS (Microelectromechanical Systems) technology that allows designers to fabricate concept prototypes. This technology provides four polysilicon layers plus three sacrificial oxide layers (with the third oxide layer being planarized) to enable fabrication of complex mechanical systems-on-a-chip. Quantified reproducibility of the SUMMiT process is important for process engineers as well as designers. Summary statistics for critical MEMS technology parameters such as film thickness, line width, and sheet resistance will be reported for the SUMMiT process. Additionally, data from Van der Pauw test structures will be presented. Data on film thickness, film uniformity and critical dimensions of etched line widths are collected from both process and monitor wafers during manufacturing using film thickness metrology tools and SEM tools. A standardized diagnostic module is included in each SUMMiT run to obtain post-processing parametric data for monitoring run-to-run reproducibility, such as Van der Pauw structures for measuring sheet resistance. This characterization of the SUMMiT process enables design for manufacturability in the SUMMiT technology.

  3. Response to Comment on "Estimating the reproducibility of psychological science".

    PubMed

    Anderson, Christopher J; Bahník, Štěpán; Barnett-Cowan, Michael; Bosco, Frank A; Chandler, Jesse; Chartier, Christopher R; Cheung, Felix; Christopherson, Cody D; Cordes, Andreas; Cremata, Edward J; Della Penna, Nicolas; Estel, Vivien; Fedor, Anna; Fitneva, Stanka A; Frank, Michael C; Grange, James A; Hartshorne, Joshua K; Hasselman, Fred; Henninger, Felix; van der Hulst, Marije; Jonas, Kai J; Lai, Calvin K; Levitan, Carmel A; Miller, Jeremy K; Moore, Katherine S; Meixner, Johannes M; Munafò, Marcus R; Neijenhuijs, Koen I; Nilsonne, Gustav; Nosek, Brian A; Plessow, Franziska; Prenoveau, Jason M; Ricker, Ashley A; Schmidt, Kathleen; Spies, Jeffrey R; Stieger, Stefan; Strohminger, Nina; Sullivan, Gavin B; van Aert, Robbie C M; van Assen, Marcel A L M; Vanpaemel, Wolf; Vianello, Michelangelo; Voracek, Martin; Zuni, Kellylynn

    2016-03-01

    Gilbert et al. conclude that evidence from the Open Science Collaboration's Reproducibility Project: Psychology indicates high reproducibility, given the study methodology. Their very optimistic assessment is limited by statistical misconceptions and by causal inferences from selectively interpreted, correlational data. Using the Reproducibility Project: Psychology data, both optimistic and pessimistic conclusions about reproducibility are possible, and neither are yet warranted. PMID:26941312

  4. Reproducibility of electrochemical noise data from coated metal systems

    SciTech Connect

    Bierwagen, G.P.; Mills, D.J.; Tallman, D.E.; Skerry, B.S.

    1996-12-31

    The use of electrochemical noise (ECN) as a method to characterize the corrosion-protection properties of organic coatings on metal substrates was pioneered by Skerry and Eden, and since then has been used by others as a probe for coated metal corrosion studies. However, no statistical examination of the reproducibility of the data from such measurements has been published. In the data the authors present, they have performed a systematic analysis of important experimental variables in such systems. They have examined the method for accuracy and reproducibility with respect to sample preparation, sample immersion, and metal substrate preparation. They have taken several marine coatings systems typical of US Navy use, prepared duplicate samples of coated metal systems, and examined them under the same immersion exposure. The variables they considered for reproducibility are paint application (in three-coat systems), metal panel preparation (grit-blasted steel), and immersion conditions. The authors present ECN data with respect to immersion time for the noise voltage standard deviation σV, the noise current standard deviation σI, and the noise resistance Rn as given by σV/σI. The variation among supposedly identical sample pairs immersed and monitored under identical conditions is presented. The statistics of the time records of the data are considered, and the variations with respect to specific coatings classes are also considered within the limits of the data. Based on these data, comments concerning ECN on coated metal systems as a predictive test method are presented along with special considerations that must be made to properly use the method for coating ranking and lifetime prediction.
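
    The noise resistance quantity defined above, Rn = σV/σI, can be computed from paired time records as in the minimal sketch below; the mean-removal detrending and the synthetic records are illustrative assumptions.

```python
import numpy as np

def noise_resistance(voltage, current):
    """Electrochemical noise resistance R_n = sigma_V / sigma_I for one time record."""
    v = voltage - np.mean(voltage)       # simple mean removal; polynomial detrending is also common
    i = current - np.mean(current)
    return np.std(v, ddof=1) / np.std(i, ddof=1)

rng = np.random.default_rng(3)
v_noise = 1e-4 * rng.standard_normal(2048)    # V, synthetic record
i_noise = 1e-9 * rng.standard_normal(2048)    # A, synthetic record
print(f"R_n ~ {noise_resistance(v_noise, i_noise):.3e} ohm")
```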

  5. The importance of accurate adiabatic interaction potentials for the correct description of electronically nonadiabatic vibrational energy transfer: A combined experimental and theoretical study of NO(v = 3) collisions with a Au(111) surface

    SciTech Connect

    Golibrzuch, Kai; Shirhatti, Pranav R.; Kandratsenka, Alexander; Wodtke, Alec M.; Bartels, Christof; Max Planck Institute for Biophysical Chemistry, Göttingen 37077 ; Rahinov, Igor; Auerbach, Daniel J.; Max Planck Institute for Biophysical Chemistry, Göttingen 37077; Department of Chemistry and Biochemistry, University of California Santa Barbara, Santa Barbara, California 93106

    2014-01-28

    We present a combined experimental and theoretical study of NO(v = 3 → 3, 2, 1) scattering from a Au(111) surface at incidence translational energies ranging from 0.1 to 1.2 eV. Experimentally, molecular beam–surface scattering is combined with vibrational overtone pumping and quantum-state selective detection of the recoiling molecules. Theoretically, we employ a recently developed first-principles approach, which employs an Independent Electron Surface Hopping (IESH) algorithm to model the nonadiabatic dynamics on a Newns-Anderson Hamiltonian derived from density functional theory. This approach has been successful when compared to previously reported NO/Au scattering data. The experiments presented here show that vibrational relaxation probabilities increase with incidence energy of translation. The theoretical simulations incorrectly predict high relaxation probabilities at low incidence translational energy. We show that this behavior originates from trajectories exhibiting multiple bounces at the surface, associated with deeper penetration and favored (N-down) molecular orientation, resulting in a higher average number of electronic hops and thus stronger vibrational relaxation. The experimentally observed narrow angular distributions suggest that mainly single-bounce collisions are important. Restricting the simulations by selecting only single-bounce trajectories improves agreement with experiment. The multiple bounce artifacts discovered in this work are also present in simulations employing electronic friction and even for electronically adiabatic simulations, meaning they are not a direct result of the IESH algorithm. This work demonstrates how even subtle errors in the adiabatic interaction potential, especially those that influence the interaction time of the molecule with the surface, can lead to an incorrect description of electronically nonadiabatic vibrational energy transfer in molecule-surface collisions.

  6. Is Grannum grading of the placenta reproducible?

    NASA Astrophysics Data System (ADS)

    Moran, Mary; Ryan, John; Brennan, Patrick C.; Higgins, Mary; McAuliffe, Fionnuala M.

    2009-02-01

    Current ultrasound assessment of placental calcification relies on Grannum grading. The aim of this study was to assess if this method is reproducible by measuring inter- and intra-observer variation in grading placental images, under strictly controlled viewing conditions. Thirty placental images were acquired and digitally saved. Five experienced sonographers independently graded the images on two separate occasions. In order to eliminate any technological factors which could affect data reliability and consistency all observers reviewed images at the same time. To optimise viewing conditions ambient lighting was maintained between 25-40 lux, with monitors calibrated to the GSDF standard to ensure consistent brightness and contrast. Kappa (κ) analysis of the grades assigned was used to measure inter- and intra-observer reliability. Intra-observer agreement had a moderate mean κ-value of 0.55, with individual comparisons ranging from 0.30 to 0.86. Two images saved from the same patient, during the same scan, were each graded as I, II and III by the same observer. A mean κ-value of 0.30 (range from 0.13 to 0.55) indicated fair inter-observer agreement over the two occasions and only one image was graded consistently the same by all five observers. The study findings confirmed the lack of reproducibility associated with Grannum grading of the placenta despite optimal viewing conditions and highlight the need for new methods of assessing placental health in order to improve neonatal outcomes. Alternative methods for quantifying placental calcification such as a software based technique and 3D ultrasound assessment need to be explored.
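
    For reference, the κ statistic used above to quantify observer agreement can be computed as in the following minimal sketch, assuming two observers assigning Grannum grades 0-III to the same images (synthetic grades shown); the study's averaging over five observers and two occasions is not reproduced.

```python
from sklearn.metrics import cohen_kappa_score

# Grannum grades (0-III) assigned to the same 10 placental images by two observers (synthetic).
observer_a = [0, 1, 1, 2, 2, 3, 1, 0, 2, 3]
observer_b = [0, 1, 2, 2, 1, 3, 1, 0, 3, 3]

kappa = cohen_kappa_score(observer_a, observer_b)
print(f"inter-observer kappa = {kappa:.2f}")   # 0.60 here; values below ~0.4 indicate poor agreement
```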

  7. Selection on soil microbiomes reveals reproducible impacts on plant function.

    PubMed

    Panke-Buisse, Kevin; Poole, Angela C; Goodrich, Julia K; Ley, Ruth E; Kao-Kniffin, Jenny

    2015-04-01

    Soil microorganisms found in the root zone impact plant growth and development, but the potential to harness these benefits is hampered by the sheer abundance and diversity of the players influencing desirable plant traits. Here, we report a high level of reproducibility of soil microbiomes in altering plant flowering time and soil functions when partnered within and between plant hosts. We used a multi-generation experimental system with Arabidopsis thaliana Col to select for soil microbiomes inducing earlier or later flowering times of their hosts. We then inoculated the selected microbiomes from the tenth generation of plantings into the soils of three additional A. thaliana genotypes (Ler, Be, RLD) and a related crucifer (Brassica rapa). With the exception of Ler, all other plant hosts showed a shift in flowering time corresponding with the inoculation of early- or late-flowering microbiomes. Analysis of the soil microbial community using 16S rRNA gene sequencing showed distinct microbiota profiles assembling by flowering time treatment. Plant hosts grown with the late-flowering-associated microbiomes showed consequent increases in inflorescence biomass for three A. thaliana genotypes and an increase in total biomass for B. rapa. The increase in biomass was correlated with two- to five-fold enhancement of microbial extracellular enzyme activities associated with nitrogen mineralization in soils. The reproducibility of the flowering phenotype across plant hosts suggests that microbiomes can be selected to modify plant traits and coordinate changes in soil resource pools. PMID:25350154

  8. Effect of population heterogenization on the reproducibility of mouse behavior: a multi-laboratory study.

    PubMed

    Richter, S Helene; Garner, Joseph P; Zipser, Benjamin; Lewejohann, Lars; Sachser, Norbert; Touma, Chadi; Schindler, Britta; Chourbaji, Sabine; Brandwein, Christiane; Gass, Peter; van Stipdonk, Niek; van der Harst, Johanneke; Spruijt, Berry; Võikar, Vootele; Wolfer, David P; Würbel, Hanno

    2011-01-01

    In animal experiments, animals, husbandry and test procedures are traditionally standardized to maximize test sensitivity and minimize animal use, assuming that this will also guarantee reproducibility. However, by reducing within-experiment variation, standardization may limit inference to the specific experimental conditions. Indeed, we have recently shown in mice that standardization may generate spurious results in behavioral tests, accounting for poor reproducibility, and that this can be avoided by population heterogenization through systematic variation of experimental conditions. Here, we examined whether a simple form of heterogenization effectively improves reproducibility of test results in a multi-laboratory situation. Each of six laboratories independently ordered 64 female mice of two inbred strains (C57BL/6NCrl, DBA/2NCrl) and examined them for strain differences in five commonly used behavioral tests under two different experimental designs. In the standardized design, experimental conditions were standardized as much as possible in each laboratory, while they were systematically varied with respect to the animals' test age and cage enrichment in the heterogenized design. Although heterogenization tended to improve reproducibility by increasing within-experiment variation relative to between-experiment variation, the effect was too weak to account for the large variation between laboratories. However, our findings confirm the potential of systematic heterogenization for improving reproducibility of animal experiments and highlight the need for effective and practicable heterogenization strategies. PMID:21305027

  9. Effect of Population Heterogenization on the Reproducibility of Mouse Behavior: A Multi-Laboratory Study

    PubMed Central

    Richter, S. Helene; Garner, Joseph P.; Zipser, Benjamin; Lewejohann, Lars; Sachser, Norbert; Touma, Chadi; Schindler, Britta; Chourbaji, Sabine; Brandwein, Christiane; Gass, Peter; van Stipdonk, Niek; van der Harst, Johanneke; Spruijt, Berry; Võikar, Vootele; Wolfer, David P.; Würbel, Hanno

    2011-01-01

    In animal experiments, animals, husbandry and test procedures are traditionally standardized to maximize test sensitivity and minimize animal use, assuming that this will also guarantee reproducibility. However, by reducing within-experiment variation, standardization may limit inference to the specific experimental conditions. Indeed, we have recently shown in mice that standardization may generate spurious results in behavioral tests, accounting for poor reproducibility, and that this can be avoided by population heterogenization through systematic variation of experimental conditions. Here, we examined whether a simple form of heterogenization effectively improves reproducibility of test results in a multi-laboratory situation. Each of six laboratories independently ordered 64 female mice of two inbred strains (C57BL/6NCrl, DBA/2NCrl) and examined them for strain differences in five commonly used behavioral tests under two different experimental designs. In the standardized design, experimental conditions were standardized as much as possible in each laboratory, while they were systematically varied with respect to the animals' test age and cage enrichment in the heterogenized design. Although heterogenization tended to improve reproducibility by increasing within-experiment variation relative to between-experiment variation, the effect was too weak to account for the large variation between laboratories. However, our findings confirm the potential of systematic heterogenization for improving reproducibility of animal experiments and highlight the need for effective and practicable heterogenization strategies. PMID:21305027

  10. Extended Eden model reproduces growth of an acellular slime mold

    NASA Astrophysics Data System (ADS)

    Wagner, Geri; Halvorsrud, Ragnhild; Meakin, Paul

    1999-11-01

    A stochastic growth model was used to simulate the growth of the acellular slime mold Physarum polycephalum on substrates where the nutrients were confined in separate drops. Growth of Physarum on such substrates was previously studied experimentally and found to produce a range of different growth patterns [Phys. Rev. E 57, 941 (1998)]. The model represented the aging of cluster sites and differed from the original Eden model in that the occupation probability of perimeter sites depended on the time of occupation of adjacent cluster sites. This feature led to a bias in the selection of growth directions. A moderate degree of persistence was found to be crucial to reproduce the biological growth patterns under various conditions. Persistence in growth combined quick propagation in heterogeneous environments with a high probability of locating sources of nutrients.
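
    The growth rule described above can be illustrated with a minimal sketch of an Eden-type simulation in which the occupation probability of a perimeter site decays with the age of its youngest occupied neighbour, giving directional persistence. The exponential weighting, lattice size, and parameter values are illustrative assumptions and are not the paper's exact model.

```python
import numpy as np

def aged_eden(size=81, steps=800, tau=25.0, seed=0):
    """Eden-type growth in which perimeter sites adjacent to recently occupied
    cluster sites are preferentially filled (weight exp(-age / tau))."""
    rng = np.random.default_rng(seed)
    t_occ = {(size // 2, size // 2): 0}                   # site -> occupation time (seed at t = 0)
    nbrs = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    for t in range(1, steps + 1):
        # Build the perimeter: empty sites with at least one occupied neighbour,
        # keeping the age of the youngest such neighbour.
        perim = {}
        for (y, x), t0 in t_occ.items():
            for dy, dx in nbrs:
                s = (y + dy, x + dx)
                if s not in t_occ and 0 <= s[0] < size and 0 <= s[1] < size:
                    perim[s] = min(perim.get(s, np.inf), t - t0)
        sites = list(perim)
        w = np.exp(-np.array([perim[s] for s in sites]) / tau)
        t_occ[sites[rng.choice(len(sites), p=w / w.sum())]] = t
    return t_occ

cluster = aged_eden()
print(len(cluster), "occupied sites")                     # seed + one site per growth step -> 801
```

    Larger tau weakens the persistence bias and recovers growth closer to the original Eden model.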

  11. Reproducing kernel Hilbert space based single infrared image super resolution

    NASA Astrophysics Data System (ADS)

    Chen, Liangliang; Deng, Liangjian; Shen, Wei; Xi, Ning; Zhou, Zhanxin; Song, Bo; Yang, Yongliang; Cheng, Yu; Dong, Lixin

    2016-07-01

    The spatial resolution of infrared (IR) images is limited by lens optical diffraction, sensor array pitch size and pixel dimension. In this work, a robust model is proposed to reconstruct a high-resolution infrared image from a single low-resolution sampling, where the image features are discussed and classified as reflective, cooled emissive and uncooled emissive based on the infrared irradiation source. A spline-based reproducing kernel Hilbert space and an approximative Heaviside function are deployed to model the smooth part and the edge component of the image, respectively. By adjusting the parameters of the Heaviside function, the proposed model can enhance distinct parts of the image. The experimental results show that the model is applicable to both reflective and emissive low-resolution infrared images to improve thermal contrast. The overall outcome is a high-resolution IR image, which gives the IR camera better measurement accuracy and reveals more detail at long distances.
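
    One common form of an approximative Heaviside function with an adjustable steepness parameter is shown in the minimal sketch below; the arctangent parametrization and the eps value are assumptions for illustration, and the spline RKHS smooth-part model is not reproduced.

```python
import numpy as np

def smooth_heaviside(x, eps=0.05):
    """Smooth approximation to the Heaviside step: 0.5 * (1 + (2/pi) * arctan(x / eps)).
    Smaller eps gives a sharper edge."""
    return 0.5 * (1.0 + (2.0 / np.pi) * np.arctan(x / eps))

x = np.linspace(-1, 1, 9)
print(np.round(smooth_heaviside(x, eps=0.05), 3))   # transitions from ~0 to ~1 near x = 0
```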

  12. Empirical Bayes for Group (DCM) Studies: A Reproducibility Study

    PubMed Central

    Litvak, Vladimir; Garrido, Marta; Zeidman, Peter; Friston, Karl

    2015-01-01

    This technical note addresses some key reproducibility issues in the dynamic causal modelling of group studies of event related potentials. Specifically, we address the reproducibility of Bayesian model comparison (and inferences about model parameters) from three important perspectives namely: (i) reproducibility with independent data (obtained by averaging over odd and even trials); (ii) reproducibility over formally distinct models (namely, classic ERP and canonical microcircuit or CMC models); and (iii) reproducibility over inversion schemes (inversion of the grand average and estimation of group effects using empirical Bayes). Our hope was to illustrate the degree of reproducibility one can expect from DCM when analysing different data, under different models with different analyses. PMID:26733846

  13. Empirical Bayes for Group (DCM) Studies: A Reproducibility Study.

    PubMed

    Litvak, Vladimir; Garrido, Marta; Zeidman, Peter; Friston, Karl

    2015-01-01

    This technical note addresses some key reproducibility issues in the dynamic causal modelling of group studies of event related potentials. Specifically, we address the reproducibility of Bayesian model comparison (and inferences about model parameters) from three important perspectives namely: (i) reproducibility with independent data (obtained by averaging over odd and even trials); (ii) reproducibility over formally distinct models (namely, classic ERP and canonical microcircuit or CMC models); and (iii) reproducibility over inversion schemes (inversion of the grand average and estimation of group effects using empirical Bayes). Our hope was to illustrate the degree of reproducibility one can expect from DCM when analysing different data, under different models with different analyses. PMID:26733846

  14. Reproducibility of neuroimaging analyses across operating systems

    PubMed Central

    Glatard, Tristan; Lewis, Lindsay B.; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C.

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed. PMID:25964757
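
    The Dice coefficient used above to compare subcortical classifications across operating systems can be computed as in this minimal sketch, assuming binary label masks; the masks here are synthetic.

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice coefficient 2|A ∩ B| / (|A| + |B|) between two binary masks."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

rng = np.random.default_rng(4)
seg_os_a = rng.random((64, 64, 64)) > 0.7                 # synthetic "subcortical" mask from OS A
seg_os_b = seg_os_a.copy()
flip = rng.random(seg_os_b.shape) < 0.02                  # small platform-dependent differences
seg_os_b[flip] = ~seg_os_b[flip]
print(f"Dice = {dice(seg_os_a, seg_os_b):.3f}")
```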

  15. The reproducible radio outbursts of SS Cygni

    NASA Astrophysics Data System (ADS)

    Russell, T. D.; Miller-Jones, J. C. A.; Sivakoff, G. R.; Altamirano, D.; O'Brien, T. J.; Page, K. L.; Templeton, M. R.; Körding, E. G.; Knigge, C.; Rupen, M. P.; Fender, R. P.; Heinz, S.; Maitra, D.; Markoff, S.; Migliari, S.; Remillard, R. A.; Russell, D. M.; Sarazin, C. L.; Waagen, E. O.

    2016-08-01

    We present the results of our intensive radio observing campaign of the dwarf nova SS Cyg during its 2010 April outburst. We argue that the observed radio emission was produced by synchrotron emission from a transient radio jet. Comparing the radio light curves from previous and subsequent outbursts of this system (including high-resolution observations from outbursts in 2011 and 2012) shows that the typical long and short outbursts of this system exhibit reproducible radio outbursts that do not vary significantly between outbursts, which is consistent with the similarity of the observed optical, ultraviolet and X-ray light curves. Contemporaneous optical and X-ray observations show that the radio emission appears to have been triggered at the same time as the initial X-ray flare, which occurs as disk material first reaches the boundary layer. This raises the possibility that the boundary region may be involved in jet production in accreting white dwarf systems. Our high spatial resolution monitoring shows that the compact jet remained active throughout the outburst with no radio quenching.

  16. REPRODUCIBLE AND SHAREABLE QUANTIFICATIONS OF PATHOGENICITY

    PubMed Central

    Manrai, Arjun K; Wang, Brice L; Patel, Chirag J; Kohane, Isaac S

    2016-01-01

    There are now hundreds of thousands of pathogenicity assertions that relate genetic variation to disease, but most of this clinically utilized variation has no accepted quantitative disease risk estimate. Recent disease-specific studies have used control sequence data to reclassify large amounts of prior pathogenic variation, but there is a critical need to scale up both the pace and feasibility of such pathogenicity reassessments across human disease. In this manuscript we develop a shareable computational framework to quantify pathogenicity assertions. We release a reproducible “digital notebook” that integrates executable code, text annotations, and mathematical expressions in a freely accessible statistical environment. We extend previous disease-specific pathogenicity assessments to over 6,000 diseases and 160,000 assertions in the ClinVar database. Investigators can use this platform to prioritize variants for reassessment and tailor genetic model parameters (such as prevalence and heterogeneity) to expose the uncertainty underlying pathogenicity-based risk assessments. Finally, we release a website that links users to pathogenic variation for a queried disease, supporting literature, and implied disease risk calculations subject to user-defined and disease-specific genetic risk models in order to facilitate variant reassessments. PMID:26776189

  17. The reproducible radio outbursts of SS Cygni

    NASA Astrophysics Data System (ADS)

    Russell, T. D.; Miller-Jones, J. C. A.; Sivakoff, G. R.; Altamirano, D.; O'Brien, T. J.; Page, K. L.; Templeton, M. R.; Körding, E. G.; Knigge, C.; Rupen, M. P.; Fender, R. P.; Heinz, S.; Maitra, D.; Markoff, S.; Migliari, S.; Remillard, R. A.; Russell, D. M.; Sarazin, C. L.; Waagen, E. O.

    2016-08-01

    We present the results of our intensive radio observing campaign of the dwarf nova SS Cyg during its 2010 April outburst. We argue that the observed radio emission was produced by synchrotron emission from a transient radio jet. Comparing the radio light curves from previous and subsequent outbursts of this system (including high-resolution observations from outbursts in 2011 and 2012) shows that the typical long and short outbursts of this system exhibit reproducible radio outbursts that do not vary significantly between outbursts, which is consistent with the similarity of the observed optical, ultraviolet and X-ray light curves. Contemporaneous optical and X-ray observations show that the radio emission appears to have been triggered at the same time as the initial X-ray flare, which occurs as disc material first reaches the boundary layer. This raises the possibility that the boundary region may be involved in jet production in accreting white dwarf systems. Our high spatial resolution monitoring shows that the compact jet remained active throughout the outburst with no radio quenching.

  18. Reproducibility of the cutoff probe for the measurement of electron density

    NASA Astrophysics Data System (ADS)

    Kim, D. W.; You, S. J.; Kwon, J. H.; You, K. H.; Seo, B. H.; Kim, J. H.; Yoon, J.-S.; Oh, W. Y.

    2016-06-01

    Since plasma processing control based on plasma diagnostics has attracted considerable attention in industry, the reproducibility of the diagnostics used in this application has become of great interest. Because the cutoff probe is one of the potential candidates for this application, knowing the reproducibility of the cutoff probe measurement is quite important in cutoff probe application research. To test this, a comparative study among different cutoff probe measurements was performed. The comparative study revealed a remarkable result: the cutoff probe has great reproducibility for electron density measurement, i.e., there is little difference among measurements made with different probes by different experimenters. The reason for this result is discussed in terms of the basic measurement principle of the cutoff probe and a comparative experiment with a Langmuir probe.

  19. Retention projection enables accurate calculation of liquid chromatographic retention times across labs and methods.

    PubMed

    Abate-Pella, Daniel; Freund, Dana M; Ma, Yan; Simón-Manso, Yamil; Hollender, Juliane; Broeckling, Corey D; Huhman, David V; Krokhin, Oleg V; Stoll, Dwight R; Hegeman, Adrian D; Kind, Tobias; Fiehn, Oliver; Schymanski, Emma L; Prenni, Jessica E; Sumner, Lloyd W; Boswell, Paul G

    2015-09-18

    Identification of small molecules by liquid chromatography-mass spectrometry (LC-MS) can be greatly improved if the chromatographic retention information is used along with mass spectral information to narrow down the lists of candidates. Linear retention indexing remains the standard for sharing retention data across labs, but it is unreliable because it cannot properly account for differences in the experimental conditions used by various labs, even when the differences are relatively small and unintentional. On the other hand, an approach called "retention projection" properly accounts for many intentional differences in experimental conditions, and when combined with a "back-calculation" methodology described recently, it also accounts for unintentional differences. In this study, the accuracy of this methodology is compared with linear retention indexing across eight different labs. When each lab ran a test mixture under a range of multi-segment gradients and flow rates they selected independently, retention projections averaged 22-fold more accurate for uncharged compounds because they properly accounted for these intentional differences, which were more pronounced in steep gradients. When each lab ran the test mixture under nominally the same conditions, which is the ideal situation to reproduce linear retention indices, retention projections still averaged 2-fold more accurate because they properly accounted for many unintentional differences between the LC systems. To the best of our knowledge, this is the most successful study to date aiming to calculate (or even just to reproduce) LC gradient retention across labs, and it is the only study in which retention was reliably calculated under various multi-segment gradients and flow rates chosen independently by labs. PMID:26292625
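
    For contrast with retention projection, linear retention indexing amounts to interpolating an analyte's retention time onto an index scale defined by bracketing reference standards, as in the minimal sketch below; the reference compounds, times, and index values are synthetic.

```python
import numpy as np

def linear_retention_index(t_analyte, ref_times, ref_indices):
    """Linear retention index: interpolate the analyte's retention time onto the
    index scale defined by bracketing reference standards."""
    return float(np.interp(t_analyte, ref_times, ref_indices))

ref_times = [2.1, 4.8, 7.9, 11.2, 14.6]        # retention times of reference standards (min, synthetic)
ref_indices = [100, 200, 300, 400, 500]        # their assigned index values
print(linear_retention_index(6.3, ref_times, ref_indices))   # ~248 for an analyte eluting at 6.3 min
```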

  20. Modelling soil erosion at European scale: towards harmonization and reproducibility

    NASA Astrophysics Data System (ADS)

    Bosco, C.; de Rigo, D.; Dewitte, O.; Poesen, J.; Panagos, P.

    2015-02-01

    Soil erosion by water is one of the most widespread forms of soil degradation. The loss of soil as a result of erosion can lead to decline in organic matter and nutrient contents, breakdown of soil structure and reduction of the water-holding capacity. Measuring soil loss across the whole landscape is impractical and thus research is needed to improve methods of estimating soil erosion with computational modelling, upon which integrated assessment and mitigation strategies may be based. Despite the efforts, the prediction value of existing models is still limited, especially at regional and continental scale, because a systematic knowledge of local climatological and soil parameters is often unavailable. A new approach for modelling soil erosion at regional scale is here proposed. It is based on the joint use of low-data-demanding models and innovative techniques for better estimating model inputs. The proposed modelling architecture has at its basis the semantic array programming paradigm and a strong effort towards computational reproducibility. An extended version of the Revised Universal Soil Loss Equation (RUSLE) has been implemented merging different empirical rainfall-erosivity equations within a climatic ensemble model and adding a new factor for a better consideration of soil stoniness within the model. Pan-European soil erosion rates by water have been estimated through the use of publicly available data sets and locally reliable empirical relationships. The accuracy of the results is corroborated by a visual plausibility check (63% of a random sample of grid cells are accurate, 83% at least moderately accurate, bootstrap p ≤ 0.05). A comparison with country-level statistics of pre-existing European soil erosion maps is also provided.
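
    The model structure described above builds on the multiplicative RUSLE form A = R · K · LS · C · P. The minimal sketch below shows that structure with a simple ensemble mean over alternative rainfall-erosivity estimates and an extra stoniness factor; all factor values, the ensemble members, and the stoniness correction are illustrative placeholders rather than the study's calibrated inputs.

```python
import numpy as np

# RUSLE: A = R * K * LS * C * P  (soil loss, t ha^-1 yr^-1)
def rusle(R, K, LS, C, P, stoniness=1.0):
    """Soil loss with an additional multiplicative stoniness reduction factor (illustrative)."""
    return R * K * LS * C * P * stoniness

# Ensemble of rainfall-erosivity estimates from alternative empirical equations (synthetic values).
R_estimates = np.array([720.0, 655.0, 790.0])          # MJ mm ha^-1 h^-1 yr^-1
R_ensemble = R_estimates.mean()

A = rusle(R=R_ensemble, K=0.032, LS=1.8, C=0.12, P=1.0, stoniness=0.85)
print(f"estimated soil loss: {A:.1f} t/ha/yr")
```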

  1. Modelling soil erosion at European scale: towards harmonization and reproducibility

    NASA Astrophysics Data System (ADS)

    Bosco, C.; de Rigo, D.; Dewitte, O.; Poesen, J.; Panagos, P.

    2014-04-01

    Soil erosion by water is one of the most widespread forms of soil degradation. The loss of soil as a result of erosion can lead to decline in organic matter and nutrient contents, breakdown of soil structure and reduction of the water holding capacity. Measuring soil loss across the whole landscape is impractical and thus research is needed to improve methods of estimating soil erosion with computational modelling, upon which integrated assessment and mitigation strategies may be based. Despite the efforts, the prediction value of existing models is still limited, especially at regional and continental scale. A new approach for modelling soil erosion at large spatial scale is here proposed. It is based on the joint use of low data demanding models and innovative techniques for better estimating model inputs. The proposed modelling architecture has at its basis the semantic array programming paradigm and a strong effort towards computational reproducibility. An extended version of the Revised Universal Soil Loss Equation (RUSLE) has been implemented merging different empirical rainfall-erosivity equations within a climatic ensemble model and adding a new factor for a better consideration of soil stoniness within the model. Pan-European soil erosion rates by water have been estimated through the use of publicly available datasets and locally reliable empirical relationships. The accuracy of the results is corroborated by a visual plausibility check (63% of a random sample of grid cells are accurate, 83% at least moderately accurate, bootstrap p ≤ 0.05). A comparison with country level statistics of pre-existing European maps of soil erosion by water is also provided.

  2. How to accurately bypass damage

    PubMed Central

    Broyde, Suse; Patel, Dinshaw J.

    2016-01-01

    Ultraviolet radiation can cause cancer through DNA damage — specifically, by linking adjacent thymine bases. Crystal structures show how the enzyme DNA polymerase η accurately bypasses such lesions, offering protection. PMID:20577203

  3. Accurate Evaluation of Quantum Integrals

    NASA Technical Reports Server (NTRS)

    Galant, David C.; Goorvitch, D.

    1994-01-01

    Combining an appropriate finite difference method with Richardson's extrapolation results in a simple, highly accurate numerical method for solving the Schrödinger equation. Important results are that error estimates are provided, and that one can extrapolate expectation values rather than the wavefunctions to obtain highly accurate expectation values. We discuss the eigenvalues and the error growth in repeated Richardson's extrapolation, and show that expectation values calculated on a crude mesh can be extrapolated to obtain expectation values of high accuracy.
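
    The idea above can be illustrated with a minimal sketch that combines a second-order finite-difference discretization of the Schrödinger equation with one Richardson extrapolation step, here for the harmonic-oscillator ground state (exact energy 0.5 in scaled units); the repeated extrapolation and the extrapolation of expectation values described in the abstract are not reproduced.

```python
import numpy as np
from scipy.linalg import eigh_tridiagonal

def ground_state_energy(h, L=10.0):
    """Lowest eigenvalue of -1/2 u'' + 1/2 x^2 u = E u on [-L, L] with step h
    (second-order central differences, Dirichlet boundaries)."""
    x = np.arange(-L + h, L, h)
    diag = 1.0 / h**2 + 0.5 * x**2
    off = np.full(x.size - 1, -0.5 / h**2)
    return eigh_tridiagonal(diag, off, select="i", select_range=(0, 0))[0][0]

E_h = ground_state_energy(0.1)
E_h2 = ground_state_energy(0.05)
E_rich = E_h2 + (E_h2 - E_h) / (2**2 - 1)        # eliminate the leading O(h^2) error term
print(f"{E_h:.8f}  {E_h2:.8f}  {E_rich:.8f}  (exact 0.5)")
```

    Because the discretization error scales as h^2, halving the step and combining the two results cancels the leading error term, which is the essence of one Richardson step.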

  4. Monte Carlo modeling provides accurate calibration factors for radionuclide activity meters.

    PubMed

    Zagni, F; Cicoria, G; Lucconi, G; Infantino, A; Lodi, F; Marengo, M

    2014-12-01

    Accurate determination of calibration factors for radionuclide activity meters is crucial for quantitative studies and for the optimization step of radiation protection, as these detectors are widespread in radiopharmacy and nuclear medicine facilities. In this work we developed a Monte Carlo model of a widely used activity meter using the Geant4 simulation toolkit; more precisely, the "PENELOPE" EM physics models were employed. The model was validated by means of several certified sources, traceable to primary activity standards, and other sources locally standardized with spectrometry measurements, plus other experimental tests. Great care was taken to accurately reproduce the geometrical details of the gas chamber and of the activity sources, each of which is different in shape and enclosed in a unique container. Both the relative calibration factors and the ionization currents obtained with simulations were compared against experimental measurements; further tests were carried out, such as a comparison of the relative response of the chamber for a source placed at different positions. The results showed a satisfactory level of accuracy in the energy range of interest, with discrepancies lower than 4% for all the tested parameters. This shows that accurate Monte Carlo modeling of this type of detector is feasible using the low-energy physics models embedded in Geant4. The resulting Monte Carlo model is a powerful tool for first-instance determination of new calibration factors for non-standard radionuclides or custom containers when a reference source is not available. Moreover, the model provides a setup for further research and optimization with regard to materials and geometrical details of the measuring arrangement, such as the ionization chamber itself or the container configuration. PMID:25195174

  5. Accurate and Efficient Resolution of Overlapping Isotopic Envelopes in Protein Tandem Mass Spectra

    PubMed Central

    Xiao, Kaijie; Yu, Fan; Fang, Houqin; Xue, Bingbing; Liu, Yan; Tian, Zhixin

    2015-01-01

    It has long been an analytical challenge to accurately and efficiently resolve extremely dense overlapping isotopic envelopes (OIEs) in protein tandem mass spectra to confidently identify proteins. Here, we report a computationally efficient method, called OIE_CARE, to resolve OIEs by calculating the relative deviation between the ideal and observed experimental abundance. In the OIE_CARE method, the ideal experimental abundance of a particular overlapping isotopic peak (OIP) is first calculated for all the OIEs sharing this OIP. The relative deviation (RD) of the overall observed experimental abundance of this OIP relative to the summed ideal value is then calculated. The final individual abundance of the OIP for each OIE is the individual ideal experimental abundance multiplied by 1 + RD. Initial studies were performed using higher-energy collisional dissociation tandem mass spectra on myoglobin (with direct infusion) and the intact E. coli proteome (with liquid chromatographic separation). Comprehensive data were obtained at both the protein and proteome levels, with high confidence and good reproducibility. The resolving method reported here can, in principle, be extended to resolve any envelope-type overlapping data for which the corresponding theoretical reference values are available. PMID:26439836
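
    The apportioning rule quoted above (each envelope's ideal abundance scaled by 1 + RD) can be written down directly; the sketch below is our paraphrase of that single step with invented numbers, not code from the OIE_CARE software.

```python
# Sketch of the abundance-apportioning rule described above; the variable
# names and numbers are ours, not from the OIE_CARE software.
import numpy as np

def apportion_overlapping_peak(observed_abundance, ideal_abundances):
    """Split one overlapping isotopic peak (OIP) among the isotopic
    envelopes (OIEs) that share it."""
    ideal = np.asarray(ideal_abundances, dtype=float)
    summed_ideal = ideal.sum()
    # relative deviation (RD) of the observed total from the summed ideal value
    rd = (observed_abundance - summed_ideal) / summed_ideal
    # each envelope receives its ideal abundance scaled by (1 + RD)
    return ideal * (1.0 + rd)

# Hypothetical example: two envelopes share one peak of observed abundance 1200
print(apportion_overlapping_peak(1200.0, [500.0, 600.0]))
# -> [545.45..., 654.54...]; the apportioned values sum to the observed 1200
```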

  6. Accurate Evaluation of Ion Conductivity of the Gramicidin A Channel Using a Polarizable Force Field without Any Corrections.

    PubMed

    Peng, Xiangda; Zhang, Yuebin; Chu, Huiying; Li, Yan; Zhang, Dinglin; Cao, Liaoran; Li, Guohui

    2016-06-14

    Classical molecular dynamics (MD) simulations of membrane proteins face significant challenges in accurately reproducing and predicting experimental observables such as ion conductance and permeability, because they cannot precisely describe the electronic interactions in heterogeneous systems. In this work, the free energy profiles of K(+) and Na(+) permeating through the gramicidin A channel are characterized by using the AMOEBA polarizable force field with a total sampling time of 1 μs. Our results indicate that explicitly introducing multipole terms and polarization into the electrostatic potential considerably reduces the permeation free energy barrier of K(+) through the gA channel compared with the overestimated results obtained from fixed-charge models. Moreover, the estimated maximum conductances for both K(+) and Na(+) passing through the gA channel, obtained without any corrections, are much closer to the experimental results than those from classical MD simulations, demonstrating the power of AMOEBA for investigating membrane proteins. PMID:27171823

  7. Semiautomated, Reproducible Batch Processing of Soy

    NASA Technical Reports Server (NTRS)

    Thoerne, Mary; Byford, Ivan W.; Chastain, Jack W.; Swango, Beverly E.

    2005-01-01

    A computer-controlled apparatus processes batches of soybeans into one or more of a variety of food products, under conditions that can be chosen by the user and reproduced from batch to batch. Examples of products include soy milk, tofu, okara (an insoluble protein and fiber byproduct of soy milk), and whey. Most processing steps take place without intervention by the user. This apparatus was developed for use in research on processing of soy. It is also a prototype of other soy-processing apparatuses for research, industrial, and home use. Prior soy-processing equipment includes household devices that automatically produce soy milk but do not automatically produce tofu. The designs of prior soy-processing equipment require users to manually transfer intermediate solid soy products and to press them manually and, hence, under conditions that are not consistent from batch to batch. Prior designs do not afford choices of processing conditions: Users cannot use previously developed soy-processing equipment to investigate the effects of variations of techniques used to produce soy milk (e.g., cold grinding, hot grinding, and pre-cook blanching) and of such process parameters as cooking times and temperatures, grinding times, soaking times and temperatures, rinsing conditions, and sizes of particles generated by grinding. In contrast, the present apparatus is amenable to such investigations. The apparatus (see figure) includes a processing tank and a jacketed holding or coagulation tank. The processing tank can be capped by either of two different heads and can contain either of two different insertable mesh baskets. The first head includes a grinding blade and heating elements. The second head includes an automated press piston. One mesh basket, designated the okara basket, has oblong holes with a size equivalent to about 40 mesh [40 openings per inch (about 16 openings per centimeter)]. The second mesh basket, designated the tofu basket, has holes of 70 mesh [70 openings

  8. A Mechanical System to Reproduce Cardiovascular Flows

    NASA Astrophysics Data System (ADS)

    Lindsey, Thomas; Valsecchi, Pietro

    2010-11-01

    Within the framework of the "Pumps&Pipes" collaboration between ExxonMobil Upstream Research Company and The DeBakey Heart and Vascular Center in Houston, a hydraulic control system was developed to accurately simulate general cardiovascular flows. The final goal of the development of the apparatus was the reproduction of the periodic flow of blood through the heart cavity with the capability of varying frequency and amplitude, as well as designing the systolic/diastolic volumetric profile over one period. The system consists of a computer-controlled linear actuator that drives hydraulic fluid in a closed loop to a secondary hydraulic cylinder. The test section of the apparatus is located inside a MRI machine, and the closed loop serves to physically separate all metal moving parts (control system and actuator cylinder) from the MRI-compatible pieces. The secondary cylinder is composed of nonmetallic elements and directly drives the test section circulatory flow loop. The circulatory loop consists of nonmetallic parts and several types of Newtonian and non-Newtonian fluids, which model the behavior of blood. This design allows for a periodic flow of blood-like fluid pushed through a modeled heart cavity capable of replicating any healthy heart condition as well as simulating anomalous conditions. The behavior of the flow inside the heart can thus be visualized by MRI techniques.

  9. Research Reproducibility in Geosciences: Current Landscape, Practices and Perspectives

    NASA Astrophysics Data System (ADS)

    Yan, An

    2016-04-01

    Reproducibility of research can gauge the validity of its findings. Yet we currently lack an understanding of how much of a problem research reproducibility is in the geosciences. We conducted an online survey of faculty and graduate students in the geosciences and received 136 responses from research institutions and universities in the Americas, Asia, Europe and other parts of the world. The survey examined (1) the current state of research reproducibility in the geosciences, by asking about researchers' experiences with unsuccessful replication attempts and the obstacles that led to these failures; (2) current reproducibility practices in the community, by asking what efforts researchers make to reproduce others' work and to make their own work reproducible, and what underlying factors contribute to irreproducibility; and (3) perspectives on reproducibility, by collecting researchers' thoughts and opinions on this issue. The survey results indicate that nearly 80% of respondents who had ever tried to reproduce a published study had failed at least once. Only one third of the respondents received helpful feedback when they contacted the authors of a published study for data, code, or other information. The primary factors that lead to unsuccessful replication attempts are insufficient detail in the instructions given in the published literature and inaccessibility of the data, code and tools needed in the study. Our findings suggest a remarkable lack of research reproducibility in the geosciences. Changing the incentive mechanisms in academia, and developing policies and tools that facilitate open data and code sharing, are promising ways for the geosciences community to alleviate this reproducibility problem.

  10. Reproducible erythroid aplasia caused by mycophenolate mofetil.

    PubMed

    Arbeiter, K; Greenbaum, L; Balzar, E; Müller, T; Hofmeister, F; Bidmon, B; Aufricht, C

    2000-03-01

    Anemia secondary to mycophenolate mofetil (MMF) was recently described in experimental animals. A clinical association between MMF and anemia has been observed, but there are no proven reports. We describe a girl with chronic graft failure who developed erythroid aplasia under immunosuppression with MMF. She showed prompt resolution when MMF was discontinued and a recurrence of this clinical course when MMF was restarted. As re-challenge with a medication is the most definitive approach for showing a direct relationship between the drug and the side effect, this case clearly demonstrates that MMF can cause erythroid aplasia. PMID:10752755

  11. Performance of Density Functional Models to Reproduce Observed 13Cα Chemical Shifts of Proteins in Solution

    PubMed Central

    Vila, Jorge A.; Baldoni, Héctor A.; Scheraga, Harold A.

    2009-01-01

    The purpose of this work is to test several density functional models (namely, OPBE, O3LYP, OPW91, BPW91, OB98, BPBE, B971, OLYP, PBE1PBE, and B3LYP) to determine their accuracy and speed for computing 13Cα chemical shifts in proteins. The test is applied to 10 NMR-derived conformations of the 76-residue α/β protein ubiquitin (Protein Data Bank ID 1D3Z). With each functional, the 13Cα shielding was computed for 760 amino acid residues by using a combination of approaches that includes, but is not limited to, treating each amino acid X in the sequence as a terminally blocked tripeptide with the sequence Ac-GXG-NMe in the conformation of the regularized experimental protein structure. Because computation of the 13Cα chemical shifts, not the shieldings, is the main goal of this work, the 13Cα shielding of the reference, tetramethylsilane, is also investigated, and both an effective and a computed tetramethylsilane shielding value are provided for each functional. Despite the small differences observed among the functionals tested, the results indicate that four of them, namely OPBE, OPW91, OB98, and OLYP, are the most accurate for reproducing observed 13Cα chemical shifts of proteins in solution, and are also among the fastest. This study also provides evidence for the applicability of these functionals to proteins of any size or class, and for the validation of our previous results and conclusions, obtained from calculations with the slower B3LYP functional. PMID:18780343

  12. Automated curve matching techniques for reproducible, high-resolution palaeomagnetic dating

    NASA Astrophysics Data System (ADS)

    Lurcock, Pontus; Channell, James

    2016-04-01

    High-resolution relative palaeointensity (RPI) and palaeosecular variation (PSV) data are increasingly important for accurate dating of sedimentary sequences, often in combination with oxygen isotope (δ18O) measurements. A chronology is established by matching a measured downcore signal to a dated reference curve, but there is no standard methodology for performing this correlation. Traditionally, matching is done by eye, but this becomes difficult when two parameters (e.g. RPI and δ18O) are being matched simultaneously, and cannot be done entirely objectively or repeatably. More recently, various automated techniques have appeared for matching one or more signals. We present Scoter, a user-friendly program for dating by signal matching and for comparing different matching techniques. Scoter is a cross-platform application implemented in Python, and consists of a general-purpose signal processing and correlation library linked to a graphical desktop front-end. RPI, PSV, and other records can be opened, pre-processed, and automatically matched with reference curves. A Scoter project can be exported as a self-contained bundle, encapsulating the input data, pre-processing steps, and correlation parameters, as well as the program itself. The analysis can be automatically replicated by anyone using only the resources in the bundle, ensuring full reproducibility. The current version of Scoter incorporates an experimental signal-matching algorithm based on simulated annealing, as well as an interface to the well-established Match program of Lisiecki and Lisiecki (2002), enabling results of the two approaches to be compared directly.
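
    For readers unfamiliar with signal matching, the toy sketch below shows the basic idea of fitting an age model by minimizing the misfit between a depth series and a dated reference curve; it uses a brute-force linear age model purely for illustration and does not reproduce Scoter's simulated-annealing algorithm or the Match program.

```python
# Toy illustration of curve matching: fit a linear age model age = a + b*depth
# by brute-force search over (a, b), minimizing the squared mismatch between
# the measured record and a dated reference curve. This is NOT Scoter's
# simulated-annealing algorithm or the Match program; it only shows the idea.
import numpy as np

def best_linear_age_model(depth, signal, ref_age, ref_signal, a_grid, b_grid):
    best_cost, best_ab = np.inf, None
    for a in a_grid:
        for b in b_grid:
            age = a + b * depth
            ref_at_age = np.interp(age, ref_age, ref_signal)
            cost = np.mean((signal - ref_at_age) ** 2)
            if cost < best_cost:
                best_cost, best_ab = cost, (a, b)
    return best_cost, best_ab

# Synthetic test: the "downcore" record samples the reference at age = 2 + 0.5*depth
ref_age = np.linspace(0.0, 100.0, 1001)
ref_signal = np.sin(ref_age / 5.0)
depth = np.linspace(0.0, 180.0, 200)
signal = np.interp(2.0 + 0.5 * depth, ref_age, ref_signal)

cost, (a, b) = best_linear_age_model(depth, signal, ref_age, ref_signal,
                                     a_grid=np.linspace(0.0, 5.0, 51),
                                     b_grid=np.linspace(0.1, 1.0, 91))
print(a, b)  # recovers approximately a = 2.0, b = 0.5
```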

  13. The importance of accurate experimental data to marginal field development

    SciTech Connect

    Overa, S.J.; Lingelem, M.N.

    1997-12-31

    Since exploration started in the Norwegian North Sea in 1965, a total of 196 fields have been discovered. Less than one-third of these fields have been developed. The marginal fields cannot be developed economically with current technology, even though some of them have significant reserves. The total cost to develop one of these large installations is estimated at 2-5 billion US dollars. Therefore, new technology is needed to lower the design and installation costs of each unit. The need for new physical property data is shown. The value of valid operating data from present units is also pointed out.

  14. Fast and Accurate Exhaled Breath Ammonia Measurement

    PubMed Central

    Solga, Steven F.; Mudalel, Matthew L.; Spacek, Lisa A.; Risby, Terence H.

    2014-01-01

    This exhaled breath ammonia method uses a fast and highly sensitive spectroscopic technique known as quartz-enhanced photoacoustic spectroscopy (QEPAS), based on a quantum cascade laser. The monitor is coupled to a sampler that measures mouth pressure and carbon dioxide. The system is temperature controlled and specifically designed to address the reactivity of this compound. The sampler provides immediate feedback to the subject and the technician on the quality of the breath effort. Together with the quick response time of the monitor, this system is capable of accurately measuring exhaled breath ammonia representative of deep lung systemic levels. Because the system is easy to use and produces real-time results, it has enabled experiments to identify factors that influence measurements. For example, mouth rinse and oral pH reproducibly and significantly affect results and therefore must be controlled. Temperature and mode of breathing are other examples. As our understanding of these factors evolves, error is reduced and clinical studies become more meaningful. This system is very reliable and individual measurements are inexpensive. The sampler is relatively inexpensive and quite portable, but the monitor is neither. This limits options for some clinical studies and provides a rationale for future innovations. PMID:24962141

  15. The R software environment in reproducible geoscientific research

    NASA Astrophysics Data System (ADS)

    Pebesma, Edzer; Nüst, Daniel; Bivand, Roger

    2012-04-01

    Reproducibility is an important aspect of scientific research, because the credibility of science is at stake when research is not reproducible. Like science, the development of good, reliable scientific software is a social process. A mature and growing community relies on the R software environment for carrying out geoscientific research. Here we describe why people use R and how it helps in communicating and reproducing research.

  16. Reproducible LTE uplink performance analysis using precomputed interference signals

    NASA Astrophysics Data System (ADS)

    Pauli, Volker; Nisar, Muhammad Danish; Seidel, Eiko

    2011-12-01

    The consideration of realistic uplink inter-cell interference is essential for the overall performance testing of future cellular systems, and in particular for the evaluation of radio resource management (RRM) algorithms. Most beyond-3G communication systems employ orthogonal multiple access in uplink (SC-FDMA in LTE and OFDMA in WiMAX), and additionally rely on frequency-selective RRM (scheduling) algorithms. This makes the task of accurate modeling of uplink interference both crucial and non-trivial. Traditional methods for its modeling (e.g., via additive white Gaussian noise interference sources) have therefore proved ineffective for realistically modeling uplink interference in next-generation cellular systems. In this article, we propose the use of realistic precomputed interference patterns for LTE uplink performance analysis and testing. The interference patterns are generated via an LTE system-level simulator for a given set of scenario parameters, such as cell configuration, user configurations, and traffic models. The generated interference patterns (some of which are made publicly available) can be employed to benchmark the performance of any LTE uplink system in both lab simulations and field trials for practical deployments. It is worth mentioning that the proposed approach can also be extended to other cellular communication systems employing OFDMA-like multiple access with frequency-selective RRM techniques. The proposed approach offers two advantages. First, it allows for repeatability and reproducibility of the performance analysis. This is of crucial significance not only for researchers and developers to analyze the behavior and performance of their systems, but also for the network operators to compare the performance of competing system vendors. Second, the proposed testing mechanism evades the need for deployment of multiple cells (with multiple active users in each) to achieve realistic field trials, thereby resulting in

  17. Crowdsourcing reproducible seizure forecasting in human and canine epilepsy

    PubMed Central

    Wagenaar, Joost; Abbot, Drew; Adkins, Phillip; Bosshard, Simone C.; Chen, Min; Tieng, Quang M.; He, Jialune; Muñoz-Almaraz, F. J.; Botella-Rocamora, Paloma; Pardo, Juan; Zamora-Martinez, Francisco; Hills, Michael; Wu, Wei; Korshunova, Iryna; Cukierski, Will; Vite, Charles; Patterson, Edward E.; Litt, Brian; Worrell, Gregory A.

    2016-01-01

    See Mormann and Andrzejak (doi:10.1093/brain/aww091) for a scientific commentary on this article.   Accurate forecasting of epileptic seizures has the potential to transform clinical epilepsy care. However, progress toward reliable seizure forecasting has been hampered by lack of open access to long duration recordings with an adequate number of seizures for investigators to rigorously compare algorithms and results. A seizure forecasting competition was conducted on kaggle.com using open access chronic ambulatory intracranial electroencephalography from five canines with naturally occurring epilepsy and two humans undergoing prolonged wide bandwidth intracranial electroencephalographic monitoring. Data were provided to participants as 10-min interictal and preictal clips, with approximately half of the 60 GB data bundle labelled (interictal/preictal) for algorithm training and half unlabelled for evaluation. The contestants developed custom algorithms and uploaded their classifications (interictal/preictal) for the unknown testing data, and a randomly selected 40% of data segments were scored and results broadcasted on a public leader board. The contest ran from August to November 2014, and 654 participants submitted 17 856 classifications of the unlabelled test data. The top performing entry scored 0.84 area under the classification curve. Following the contest, additional held-out unlabelled data clips were provided to the top 10 participants and they submitted classifications for the new unseen data. The resulting area under the classification curves were well above chance forecasting, but did show a mean 6.54 ± 2.45% (min, max: 0.30, 20.2) decline in performance. The kaggle.com model using open access data and algorithms generated reproducible research that advanced seizure forecasting. The overall performance from multiple contestants on unseen data was better than a random predictor, and demonstrates the feasibility of seizure forecasting in canine and

  18. Crowdsourcing reproducible seizure forecasting in human and canine epilepsy.

    PubMed

    Brinkmann, Benjamin H; Wagenaar, Joost; Abbot, Drew; Adkins, Phillip; Bosshard, Simone C; Chen, Min; Tieng, Quang M; He, Jialune; Muñoz-Almaraz, F J; Botella-Rocamora, Paloma; Pardo, Juan; Zamora-Martinez, Francisco; Hills, Michael; Wu, Wei; Korshunova, Iryna; Cukierski, Will; Vite, Charles; Patterson, Edward E; Litt, Brian; Worrell, Gregory A

    2016-06-01

    See Mormann and Andrzejak (doi:10.1093/brain/aww091) for a scientific commentary on this article. Accurate forecasting of epileptic seizures has the potential to transform clinical epilepsy care. However, progress toward reliable seizure forecasting has been hampered by lack of open access to long duration recordings with an adequate number of seizures for investigators to rigorously compare algorithms and results. A seizure forecasting competition was conducted on kaggle.com using open access chronic ambulatory intracranial electroencephalography from five canines with naturally occurring epilepsy and two humans undergoing prolonged wide bandwidth intracranial electroencephalographic monitoring. Data were provided to participants as 10-min interictal and preictal clips, with approximately half of the 60 GB data bundle labelled (interictal/preictal) for algorithm training and half unlabelled for evaluation. The contestants developed custom algorithms and uploaded their classifications (interictal/preictal) for the unknown testing data, and a randomly selected 40% of data segments were scored and results broadcasted on a public leader board. The contest ran from August to November 2014, and 654 participants submitted 17 856 classifications of the unlabelled test data. The top performing entry scored 0.84 area under the classification curve. Following the contest, additional held-out unlabelled data clips were provided to the top 10 participants and they submitted classifications for the new unseen data. The resulting area under the classification curves were well above chance forecasting, but did show a mean 6.54 ± 2.45% (min, max: 0.30, 20.2) decline in performance. The kaggle.com model using open access data and algorithms generated reproducible research that advanced seizure forecasting. The overall performance from multiple contestants on unseen data was better than a random predictor, and demonstrates the feasibility of seizure forecasting in canine and human
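
    The leaderboard metric mentioned above, area under the ROC curve for preictal-versus-interictal scores, can be computed with a simple rank statistic; the sketch below is an illustration with hypothetical scores, not Kaggle's scoring code, and tied scores are ignored for brevity.

```python
# Rank-based area under the ROC curve for preictal (1) vs interictal (0)
# scores; an illustration of the contest metric, not Kaggle's scoring code.
# Tied scores are not handled (average ranks would be needed for that).
import numpy as np

def roc_auc(labels, scores):
    """Probability that a randomly chosen preictal clip scores higher than a
    randomly chosen interictal clip (Mann-Whitney U / (n_pos * n_neg))."""
    labels = np.asarray(labels)
    scores = np.asarray(scores)
    ranks = np.empty(len(scores))
    ranks[np.argsort(scores)] = np.arange(1, len(scores) + 1)
    n_pos = labels.sum()
    n_neg = len(labels) - n_pos
    u = ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2.0
    return u / (n_pos * n_neg)

# Hypothetical classifier scores for six clips
print(roc_auc([0, 0, 1, 0, 1, 1], [0.10, 0.40, 0.35, 0.80, 0.65, 0.90]))  # ~0.667
```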

  19. Enhancing reproducibility of ultrasonic measurements by new users

    NASA Astrophysics Data System (ADS)

    Pramanik, Manojit; Gupta, Madhumita; Krishnan, Kajoli Banerjee

    2013-03-01

    Perception of the operator influences ultrasound image acquisition and processing. Lower costs are attracting new users to medical ultrasound. Anticipating an increase in this trend, we conducted a study to quantify the variability in ultrasonic measurements made by novice users and to identify methods to reduce it. We designed a protocol with four presets and trained four new users to scan and manually measure the head circumference of a fetal phantom with an ultrasound scanner. In the first phase, the users followed this protocol in seven distinct sessions. They then received feedback on the quality of the scans from an expert. In the second phase, two of the users repeated the entire protocol aided by visual cues provided to them during scanning. We performed off-line measurements on all the images using a fully automated algorithm capable of measuring the head circumference from fetal phantom images. The ground truth (198.1±1.6 mm) was based on sixteen scans and measurements made by an expert. Our analysis shows that: (1) the inter-observer variability of manual measurements was 5.5 mm, whereas the inter-observer variability of automated measurements was only 0.6 mm in the first phase; (2) consistency of image appearance improved and mean manual measurements were 4-5 mm closer to the ground truth in the second phase; and (3) automated measurements were more precise, accurate and less sensitive to different presets than manual measurements in both phases. Our results show that visual aids and automation can bring more reproducibility to ultrasonic measurements made by new users.

  20. Ability of a “minimum” microbial food web model to reproduce response patterns observed in mesocosms manipulated with N and P, glucose, and Si

    NASA Astrophysics Data System (ADS)

    Thingstad, T. Frede; Havskum, Harry; Zweifel, Ulla Li; Berdalet, Elisa; Sala, M. Montserrat; Peters, Francesc; Alcaraz, Miquel; Scharek, Renate; Perez, Maite; Jacquet, Stéphan; Flaten, Gro Anita Fonnes; Dolan, John R.; Marrasé, Celia; Rassoulzadegan, Fereidoun; Hagstrøm, Åke; Vaulot, Daniel

    2007-01-01

    We compared an idealised mathematical model of the lower part of the pelagic food web to experimental data from a mesocosm experiment in which the supplies of mineral nutrients (nitrogen and phosphorus), bioavailable dissolved organic carbon (BDOC, as glucose), and silicate were manipulated. The central hypothesis of the experiment was that bacterial consumption of BDOC depends on whether the growth rate of heterotrophic bacteria is limited by organic-C or by mineral nutrients. In previous work, this hypothesis was examined qualitatively using a conceptual food web model. Here we explore the extent to which a "simplest possible" mathematical version of this conceptual model can reproduce the observed dynamics. The model combines algal-bacterial competition for mineral nutrients (phosphorus) and accounts for alternative limitation of bacterial and diatom growth rates by organic carbon and by silicate, respectively. Due to a slower succession in the diatom-copepod link than in the flagellate-ciliate link, silicate availability increases the magnitude and extends the duration of phytoplankton blooms induced by mineral nutrient addition. As a result, Si interferes negatively with bacterial consumption of BDOC by increasing and prolonging algal-bacterial competition for mineral nutrients. In order to reproduce the difference in primary production between Si and non-Si amended treatments, we had to assume a carbon overflow mechanism in diatom C-fixation. This model satisfactorily reproduced central features observed in the mesocosm experiment, including the dynamics of glucose consumption and of algal, bacterial, and mesozooplankton biomass. While the parameter set chosen allows the model to reproduce the pattern seen in bacterial production, we were not able to find a single set of parameters that simultaneously reproduces both the level and the pattern observed for bacterial production. Profound changes in bacterial morphology and stoichiometry were reported in
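
    To make the idea of a "minimum" model concrete, the toy sketch below couples algal and bacterial Monod growth on a shared phosphorus pool, with bacterial growth additionally capped by labile organic carbon in Liebig (minimum) fashion; all parameter values and yields, and the omission of grazers and silicate, are our own simplifications, not the authors' model.

```python
# Toy "minimum" food-web sketch: algae (A) and bacteria (B) compete for
# phosphate (P); bacterial growth is additionally capped by labile organic
# carbon (C) in Liebig (minimum) fashion. Parameters, yields and the omission
# of grazers and silicate are invented for illustration; this is not the
# authors' model.
import numpy as np
from scipy.integrate import solve_ivp

def minimum_web(t, y, mu_a=1.0, mu_b=2.0, k_p=0.1, k_c=10.0,
                y_ap=1.0, y_bp=1.0, y_bc=0.05, loss=0.1):
    A, B, P, C = y
    growth_a = mu_a * P / (k_p + P) * A                       # P-limited algae
    growth_b = mu_b * min(P / (k_p + P), C / (k_c + C)) * B   # C- or P-limited bacteria
    dA = growth_a - loss * A
    dB = growth_b - loss * B
    dP = -growth_a / y_ap - growth_b / y_bp                   # shared phosphate pool
    dC = -growth_b / y_bc                                     # glucose consumption
    return [dA, dB, dP, dC]

sol = solve_ivp(minimum_web, (0.0, 20.0), [0.01, 0.01, 1.0, 50.0], max_step=0.1)
print(sol.y[:, -1])  # final algal and bacterial biomass, remaining P and C
```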

  1. Virtual Reference Environments: a simple way to make research reproducible

    PubMed Central

    Hurley, Daniel G.; Budden, David M.

    2015-01-01

    ‘Reproducible research’ has received increasing attention over the past few years as bioinformatics and computational biology methodologies become more complex. Although reproducible research is progressing in several valuable ways, we suggest that recent increases in internet bandwidth and disk space, along with the availability of open-source and free-software licences for tools, enable another simple step to make research reproducible. In this article, we urge the creation of minimal virtual reference environments implementing all the tools necessary to reproduce a result, as a standard part of publication. We address potential problems with this approach, and show an example environment from our own work. PMID:25433467

  2. Interlaboratory reproducibility of large-scale human protein-complex analysis by standardized AP-MS.

    PubMed

    Varjosalo, Markku; Sacco, Roberto; Stukalov, Alexey; van Drogen, Audrey; Planyavsky, Melanie; Hauri, Simon; Aebersold, Ruedi; Bennett, Keiryn L; Colinge, Jacques; Gstaiger, Matthias; Superti-Furga, Giulio

    2013-04-01

    The characterization of all protein complexes of human cells under defined physiological conditions using affinity purification-mass spectrometry (AP-MS) is a highly desirable step in the quest to understand the phenotypic effects of genomic information. However, such a challenging goal has not yet been achieved, as it requires reproducibility of the experimental workflow and high data consistency across different studies and laboratories. We systematically investigated the reproducibility of a standardized AP-MS workflow by performing a rigorous interlaboratory comparative analysis of the interactomes of 32 human kinases. We show that it is possible to achieve high interlaboratory reproducibility of this standardized workflow despite differences in mass spectrometry configurations and subtle sample preparation-related variations and that combination of independent data sets improves the approach sensitivity, resulting in even more-detailed networks. Our analysis demonstrates the feasibility of obtaining a high-quality map of the human protein interactome with a multilaboratory project. PMID:23455922

  3. Development of hydrophobic surface substrates enabling reproducible drop-and-dry spectroscopic measurements.

    PubMed

    Lee, Jinah; Duy, Pham Khac; Park, Seok Chan; Chung, Hoeil

    2016-06-01

    We investigated several spectroscopic substrates with hydrophobic surfaces that were able to form reproducible droplets of aqueous samples for reliable high throughput drop-and-dry measurements. An amine-coated substrate, a polytetrafluoroethylene (PTFE) disk, and a perfluorooctyltrichlorosilane (FTS) coated substrate were prepared and initially evaluated for use in the determination of fat concentrations in milks using near-infrared (NIR) spectroscopy. Since the dried milk spots were not compositionally uniform due to the localization of components during sample drying, NIR spectra were collected by fully covering each spot to ensure a correct compositional representation of the sample. The amine-coated substrate yielded more reproducible dried milk patterns because its hydrophobicity was optimal for loading an appropriate amount of milk with decreased component localization after drying. The relative standard deviation (RSD) of the absorbance at 4330 cm⁻¹ was 1.0%, thereby resulting in the more accurate determination of fat concentration. In addition, infrared (IR) spectroscopic discrimination between wild and transgenic tobaccos using their extracts was attempted. The extracted metabolites had a low concentration, so an FTS-coated CaF2 substrate that maximized sample loading was used to improve measurement sensitivity and produce reproducible droplets. The RSD of the absorbance at 1070 cm⁻¹ was only 0.8%. Our strategy produced droplets that had consistent sizes and provided reproducible IR spectral features, which enabled the differentiation between wild and transgenic tobacco groups in the principal component (PC) score domain. PMID:27130086

  4. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    Title 46 (Shipping), Vol. 2: COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED), MARINE ENGINEERING, PIPING SYSTEMS AND APPURTENANCES, Selection and Limitations of Piping Joints, § 56.30-3 Piping joints (reproduces 110). The type of piping joint used shall be...

  5. 10 CFR 1016.35 - Authority to reproduce Restricted Data.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Title 10 (Energy), Vol. 4: DEPARTMENT OF ENERGY (GENERAL PROVISIONS), SAFEGUARDING OF RESTRICTED DATA, Control of Information, § 1016.35 Authority to reproduce Restricted Data. Secret Restricted Data will not be...

  6. An Open Science and Reproducible Research Primer for Landscape Ecologists

    EPA Science Inventory

    In recent years many funding agencies, some publishers, and even the United States government have enacted policies that encourage open science and strive for reproducibility; however, the knowledge and skills to implement open science and enable reproducible research are not yet...

  7. Attempts to reproduce vacuolar myelinopathy in domestic swine and chickens.

    PubMed

    Lewis-Weis, Lynn A; Gerhold, Richard W; Fischer, John R

    2004-07-01

    Avian vacuolar myelinopathy (AVM) was first recognized as a cause of bald eagle (Haliaeetus leucocephalus) mortality in 1994 in Arkansas (USA) and has since caused over 90 bald eagle and numerous American coot (Fulica americana) mortalities in five southeastern states. The cause of AVM remains undetermined but is suspected to be a biotoxin. Naturally occurring AVM has been limited to wild waterbirds, raptors, and one species of shorebird, and has been reproduced experimentally in red-tailed hawks (Buteo jamaicensis). In this study, chickens and swine were evaluated for susceptibility to vacuolar myelinopathy with the intent of developing animal models for research and to identify specific tissues in affected coots that contain the causative agent. Additionally, submerged, aquatic vegetation, primarily hydrilla (Hydrilla verticillata), and associated material collected from a reservoir during an AVM outbreak was fed to chickens in an effort to reproduce the disease. In two separate experiments, six 4-wk-old leghorn chickens and ten 5-wk-old leghorn chickens were fed coot tissues. In a third experiment, five 3-mo-old domestic swine and one red-tailed hawk, serving as a positive control, were fed coot tissues. In these experiments, treatment animals received tissues (brain, fat, intestinal tract, kidney, liver, and/or muscle) from coots with AVM lesions collected at a lake during an AVM outbreak. Negative control chickens and one pig received tissues from coots without AVM lesions that had been collected at a lake where AVM has never been documented. In a fourth experiment, eight 3-wk-old leghorn chickens were fed aquatic vegetation material. Four chickens received material from the same lake from which coots with AVM lesions were collected for the previous experiments, and four control chickens were fed material from the lake where AVM has never been documented. Blood was collected and physical and neurologic exams were conducted on animals before and once per week

  8. An Effective and Reproducible Model of Ventricular Fibrillation in Crossbred Yorkshire Swine (Sus scrofa) for Use in Physiologic Research

    PubMed Central

    Burgert, James M; Johnson, Arthur D; Garcia-Blanco, Jose C; Craig, W John; O'Sullivan, Joseph C

    2015-01-01

    Transcutaneous electrical induction (TCEI) has been used to induce ventricular fibrillation (VF) in laboratory swine for physiologic and resuscitation research. Many studies do not describe the method of TCEI in detail, thus making replication by future investigators difficult. Here we describe a detailed method of electrically inducing VF that was used successfully in a prospective, experimental resuscitation study. Specifically, an electrical current was passed through the heart to induce VF in crossbred Yorkshire swine (n = 30); the current was generated by using two 22-gauge spinal needles, with one placed above and one below the heart, and three 9V batteries connected in series. VF developed in 28 of the 30 pigs (93%) within 10 s of beginning the procedure. In the remaining 2 swine, VF was induced successfully after medial redirection of the superior parasternal needle. The TCEI method is simple, reproducible, and cost-effective. TCEI may be especially valuable to researchers with limited access to funding, sophisticated equipment, or colleagues experienced in interventional cardiology techniques. The TCEI method might be most appropriate for pharmacologic studies requiring VF, VF resulting from the R-on-T phenomenon (as in prolonged QT syndrome), and VF arising from other ectopic or reentrant causes. However, the TCEI method does not accurately model the most common cause of VF, acute coronary occlusive disease. Researchers must consider the limitations of TCEI that may affect internal and external validity of collected data, when designing experiments using this model of VF. PMID:26473349

  9. Towards Accurate Application Characterization for Exascale (APEX)

    SciTech Connect

    Hammond, Simon David

    2015-09-01

    Sandia National Laboratories has been engaged in hardware and software codesign activities for a number of years; indeed, it might be argued that prototype clusters as far back as the CPLANT machines, and many large capability resources including ASCI Red and RedStorm, were examples of codesigned solutions. As the research supporting our codesign activities has moved closer to investigating on-node runtime behavior, a natural hunger has grown for detailed analysis of both hardware and algorithm performance from the perspective of low-level operations. The Application Characterization for Exascale (APEX) LDRD was a project conceived to address some of these concerns. The research was primarily intended to focus on generating accurate and reproducible low-level performance metrics using tools that could scale to production-class code bases. Alongside this research was an advocacy and analysis role: evaluating tools for production use, working with leading industry vendors to develop and refine solutions required by our code teams, and engaging directly with production code developers to form a context for the application analysis and a bridge to the research community within Sandia. On each of these accounts significant progress has been made, particularly, as this report will cover, in the low-level analysis of operations for important classes of algorithms. This report summarizes the development of a collection of tools under the APEX research program and leaves to other SAND and L2 milestone reports the description of codesign progress with Sandia’s production users/developers.

  10. Predict amine solution properties accurately

    SciTech Connect

    Cheng, S.; Meisen, A.; Chakma, A.

    1996-02-01

    Improved process design begins with using accurate physical property data. Especially in the preliminary design stage, physical property data such as density, viscosity, thermal conductivity and specific heat can affect the overall performance of absorbers, heat exchangers, reboilers and pumps. These properties can also influence temperature profiles in heat transfer equipment and thus control or affect the rate of amine breakdown. Aqueous-amine solution physical property data are available in graphical form, but graphical data are not convenient for computer-based calculations. The developed equations allow improved correlations of derived physical property estimates with published data. Expressions are given which can be used to estimate physical properties of methyldiethanolamine (MDEA), monoethanolamine (MEA) and diglycolamine (DGA) solutions.
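
    As an illustration of replacing graphical property data with correlations suitable for computer-based design, the sketch below fits a simple linear density correlation to invented data points; the data, functional form and coefficients are placeholders, not the published MDEA/MEA/DGA expressions.

```python
# Illustration of turning tabulated or graphical property data into a simple
# correlation for computer-based design. The data points and the linear form
# are invented placeholders, not the published MDEA/MEA/DGA expressions.
import numpy as np

# hypothetical (temperature [C], amine mass fraction, density [kg/m^3]) points
T   = np.array([25.0, 25.0, 40.0, 40.0, 60.0, 60.0])
w   = np.array([0.30, 0.50, 0.30, 0.50, 0.30, 0.50])
rho = np.array([1030.0, 1045.0, 1022.0, 1036.0, 1010.0, 1024.0])

# least-squares fit of rho ~ a0 + a1*T + a2*w
X = np.column_stack([np.ones_like(T), T, w])
coeffs, *_ = np.linalg.lstsq(X, rho, rcond=None)

def density(temperature, mass_fraction):
    """Correlation-based density estimate (placeholder units and coefficients)."""
    return coeffs[0] + coeffs[1] * temperature + coeffs[2] * mass_fraction

print(coeffs)
print(density(50.0, 0.40))
```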

  11. Accurate thickness measurement of graphene

    NASA Astrophysics Data System (ADS)

    Shearer, Cameron J.; Slattery, Ashley D.; Stapleton, Andrew J.; Shapter, Joseph G.; Gibson, Christopher T.

    2016-03-01

    Graphene has emerged as a material with a vast variety of applications. The electronic, optical and mechanical properties of graphene are strongly influenced by the number of layers present in a sample. As a result, the dimensional characterization of graphene films is crucial, especially with the continued development of new synthesis methods and applications. A number of techniques exist to determine the thickness of graphene films including optical contrast, Raman scattering and scanning probe microscopy techniques. Atomic force microscopy (AFM), in particular, is used extensively since it provides three-dimensional images that enable the measurement of the lateral dimensions of graphene films as well as the thickness, and by extension the number of layers present. However, in the literature AFM has proven to be inaccurate with a wide range of measured values for single layer graphene thickness reported (between 0.4 and 1.7 nm). This discrepancy has been attributed to tip-surface interactions, image feedback settings and surface chemistry. In this work, we use standard and carbon nanotube modified AFM probes and a relatively new AFM imaging mode known as PeakForce tapping mode to establish a protocol that will allow users to accurately determine the thickness of graphene films. In particular, the error in measuring the first layer is reduced from 0.1-1.3 nm to 0.1-0.3 nm. Furthermore, in the process we establish that the graphene-substrate adsorbate layer and imaging force, in particular the pressure the tip exerts on the surface, are crucial components in the accurate measurement of graphene using AFM. These findings can be applied to other 2D materials.

  12. Toward Accurate and Quantitative Comparative Metagenomics.

    PubMed

    Nayfach, Stephen; Pollard, Katherine S

    2016-08-25

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  13. Using the mouse to model human disease: increasing validity and reproducibility

    PubMed Central

    Justice, Monica J.; Dhillon, Paraminder

    2016-01-01

    ABSTRACT Experiments that use the mouse as a model for disease have recently come under scrutiny because of the repeated failure of data, particularly derived from preclinical studies, to be replicated or translated to humans. The usefulness of mouse models has been questioned because of irreproducibility and poor recapitulation of human conditions. Newer studies, however, point to bias in reporting results and improper data analysis as key factors that limit reproducibility and validity of preclinical mouse research. Inaccurate and incomplete descriptions of experimental conditions also contribute. Here, we provide guidance on best practice in mouse experimentation, focusing on appropriate selection and validation of the model, sources of variation and their influence on phenotypic outcomes, minimum requirements for control sets, and the importance of rigorous statistics. Our goal is to raise the standards in mouse disease modeling to enhance reproducibility, reliability and clinical translation of findings. PMID:26839397

  14. Reproducible and controllable induction voltage adder for scaled beam experiments.

    PubMed

    Sakai, Yasuo; Nakajima, Mitsuo; Horioka, Kazuhiko

    2016-08-01

    A reproducible and controllable induction adder was developed using solid-state switching devices and Finemet cores for scaled beam compression experiments. A gate controlled MOSFET circuit was developed for the controllable voltage driver. The MOSFET circuit drove the induction adder at low magnetization levels of the cores which enabled us to form reproducible modulation voltages with jitter less than 0.3 ns. Preliminary beam compression experiments indicated that the induction adder can improve the reproducibility of modulation voltages and advance the beam physics experiments. PMID:27587112

  15. Accurate determination of rates from non-uniformly sampled relaxation data.

    PubMed

    Stetz, Matthew A; Wand, A Joshua

    2016-08-01

    The application of non-uniform sampling (NUS) to relaxation experiments traditionally used to characterize the fast internal motion of proteins is quantitatively examined. Experimentally acquired Poisson-gap sampled data reconstructed with iterative soft thresholding are compared to regular sequentially sampled (RSS) data. Using ubiquitin as a model system, it is shown that 25 % sampling is sufficient for the determination of quantitatively accurate relaxation rates. When the sampling density is fixed at 25 %, the accuracy of rates is shown to increase sharply with the total number of sampled points until eventually converging near the inherent reproducibility of the experiment. Perhaps contrary to some expectations, it is found that accurate peak height reconstruction is not required for the determination of accurate rates. Instead, inaccuracies in rates arise from inconsistencies in reconstruction across the relaxation series that primarily manifest as a non-linearity in the recovered peak height. This indicates that the performance of an NUS relaxation experiment cannot be predicted from comparison of peak heights using a single RSS reference spectrum. The generality of these findings was assessed using three alternative reconstruction algorithms, eight different relaxation measurements, and three additional proteins that exhibit varying degrees of spectral complexity. From these data, it is revealed that non-linearity in peak height reconstruction across the relaxation series is strongly correlated with errors in NUS-derived relaxation rates. Importantly, it is shown that this correlation can be exploited to reliably predict the performance of an NUS-relaxation experiment by using three or more RSS reference planes from the relaxation series. The RSS reference time points can also serve to provide estimates of the uncertainty of the sampled intensity, which for a typical relaxation times series incurs no penalty in total acquisition time. PMID:27393626
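
    The rate-determination step itself is a simple nonlinear fit of peak heights versus relaxation delay; the sketch below shows only that step with hypothetical data and does not cover the upstream NUS reconstruction (Poisson-gap sampling, iterative soft thresholding).

```python
# Sketch of the rate-extraction step only: fit I(t) = I0 * exp(-R * t) to peak
# heights across a relaxation series. The NUS reconstruction itself (Poisson-gap
# sampling, iterative soft thresholding) happens upstream and is not shown.
import numpy as np
from scipy.optimize import curve_fit

def mono_exp(t, i0, rate):
    return i0 * np.exp(-rate * t)

# hypothetical relaxation delays (s) and peak heights for one residue
delays  = np.array([0.01, 0.05, 0.10, 0.20, 0.40, 0.80])
heights = np.array([0.98, 0.84, 0.70, 0.49, 0.25, 0.06])

popt, pcov = curve_fit(mono_exp, delays, heights, p0=[1.0, 3.0])
rate, rate_err = popt[1], np.sqrt(np.diag(pcov))[1]
print(f"R = {rate:.2f} +/- {rate_err:.2f} s^-1")
```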

  16. A technique to ensure the reproducibility of a cast post and core.

    PubMed

    Naas, Haitem M M; Dashti, Mohammad Hossein; Hashemian, Roxana; Hifeda, Nedda Y

    2014-12-01

    The post-and-core pattern duplication technique is a simple, cost-effective, and accurate method of ensuring the reproducibility of a cast post and core. An acrylic resin pattern is fabricated for an endodontically treated tooth. The post portion of the pattern is duplicated with a polyvinyl siloxane impression material in the lower compartment of a container. The core portion is then duplicated with a polyether impression material in the upper compartment. After the original pattern has been retrieved, the duplicate resin pattern is fabricated in the provided space. This technique will improve efficiency if damage or loss of the pattern or the actual cast post and core occurs. PMID:25156094

  17. Accurate Cross Sections for Microanalysis

    PubMed Central

    Rez, Peter

    2002-01-01

    Calculating the intensity of x-ray emission in electron beam microanalysis requires knowledge of the energy distribution of the electrons in the solid, the energy variation of the ionization cross section of the relevant subshell, the fraction of ionization events producing the x rays of interest, and the absorption coefficient of the x rays on the path to the detector. The theoretical predictions and experimental data available for ionization cross sections are limited mainly to the K shells of a few elements. Results of systematic plane-wave Born approximation calculations with exchange for K, L, and M shell ionization cross sections over the range of electron energies used in microanalysis are presented. Comparisons are made with experimental measurements for selected K shells, and it is shown that the plane-wave theory is not appropriate for overvoltages less than 2.5. PMID:27446747

  18. Sympathetic neural reactivity to mental stress in humans: test-retest reproducibility.

    PubMed

    Fonkoue, Ida T; Carter, Jason R

    2015-12-01

    Mental stress consistently increases arterial blood pressure, but this reliable pressor response is often associated with highly variable muscle sympathetic nerve activity (MSNA) responsiveness between individuals. Although MSNA has been shown to be reproducible within individuals at rest and during the cold pressor test (CPT), intraindividual reproducibility of MSNA responsiveness to mental stress has not been adequately explored. The purpose of this study was to examine MSNA reactivity to mental stress across three experimental sessions. Sixteen men and women (age 21 ± 1 yr) performed two experimental sessions within a single laboratory visit and a third experimental session 1 mo later. Each experimental session consisted of a mental stress trial via mental arithmetic and a CPT trial. Blood pressure, heart rate (HR), and MSNA were measured, and the consistencies of these variables were determined using intraclass correlation (Cronbach's α coefficient). MSNA, mean arterial pressure (MAP), and HR were highly reproducible across the baselines preceding mental stress (Cronbach's α ≥ 0.816, P ≤ 0.001) and CPT (Cronbach's α ≥ 0.782, P ≤ 0.001). Across the three mental stress trials, changes in MSNA (Cronbach's α = 0.875; P = 0.001), MAP (Cronbach's α = 0.749; P < 0.001), and HR (Cronbach's α = 0.919; P < 0.001) were reproducible. During CPT, changes in MSNA (Cronbach's α = 0.805; P = 0.008), MAP (Cronbach's α = 0.878; P < 0.001), and HR (Cronbach's α = 0.927; P < 0.001) remained consistent across the three sessions. In conclusion, our findings demonstrate that MSNA reactivity to mental stress is consistent within a single laboratory visit and across laboratory sessions conducted on separate days. PMID:26400186

  19. 223. FREQUENTLY REPRODUCED VIEW OF GWMP SHOWING VARIABLE WIDTH MEDIANS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    223. FREQUENTLY REPRODUCED VIEW OF GWMP SHOWING VARIABLE WIDTH MEDIANS WITH INDEPENDENT ALIGNMENTS FROM KEY BRIDGE LOOKING NORTHWEST, 1953. - George Washington Memorial Parkway, Along Potomac River from McLean to Mount Vernon, VA, Mount Vernon, Fairfax County, VA

  20. Photographic copy of reproduced photograph dated 1942. Exterior view, west ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Photographic copy of reproduced photograph dated 1942. Exterior view, west elevation. Building camouflaged during World War II. - Grand Central Air Terminal, 1310 Air Way, Glendale, Los Angeles County, CA

  1. 8. Historic American Buildings Survey Reproduced from the collections of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. Historic American Buildings Survey Reproduced from the collections of the Library of Congress, Accession No. 45041 Geographical File ('Nantucket, Mass.') Division of Prints and Photographs c. 1880 - Jethro Coffin House, Sunset Hill, Nantucket, Nantucket County, MA

  2. Towards the computations of accurate spectroscopic parameters and vibrational spectra for organic compounds

    NASA Astrophysics Data System (ADS)

    Hochlaf, M.; Puzzarini, C.; Senent, M. L.

    2015-07-01

    We present multi-component computations of rotational constants and vibrational and torsional levels for medium-sized molecules. Through the treatment of two organic sulphur molecules, ethyl mercaptan and dimethyl sulphide, which are relevant for atmospheric and astrophysical media, we point out the outstanding capabilities of the explicitly correlated coupled-cluster (CCSD(T)-F12) method in conjunction with the cc-pVTZ-F12 basis set for the accurate prediction of such quantities. Indeed, we show that the CCSD(T)-F12/cc-pVTZ-F12 equilibrium rotational constants are in good agreement with those obtained by means of a composite scheme based on CCSD(T) calculations that accounts for the extrapolation to the complete basis set (CBS) limit and core-correlation effects [CCSD(T)/CBS+CV], thus leading to values of ground-state rotational constants rather close to the corresponding experimental data. For vibrational and torsional levels, our analysis reveals that the anharmonic frequencies derived from CCSD(T)-F12/cc-pVTZ-F12 harmonic frequencies and anharmonic corrections (Δν = ω - ν) at the CCSD/cc-pVTZ level closely agree with experimental results. The pattern of the torsional transitions and the shape of the potential energy surfaces along the torsional modes are also well reproduced using the CCSD(T)-F12/cc-pVTZ-F12 energies. Interestingly, this good accuracy is accompanied by a strong reduction of the computational costs. This makes the procedures proposed here the schemes of choice for effective and accurate prediction of spectroscopic properties of organic compounds. Finally, popular density functional approaches are compared with the coupled cluster (CC) methodologies in torsional studies. The long-range CAM-B3LYP functional of Handy and co-workers is recommended for large systems.

  3. Accurate determination of the interaction between Λ hyperons and nucleons from auxiliary field diffusion Monte Carlo calculations

    NASA Astrophysics Data System (ADS)

    Lonardoni, D.; Pederiva, F.; Gandolfi, S.

    2014-01-01

    Background: An accurate assessment of the hyperon-nucleon interaction is of great interest in view of recent observations of very massive neutron stars. The challenge is to build a realistic interaction that can be used over a wide range of masses and in infinite matter starting from the available experimental data on the binding energy of light hypernuclei. To this end, accurate calculations of the hyperon binding energy in a hypernucleus are necessary. Purpose: We present a quantum Monte Carlo study of Λ and ΛΛ hypernuclei up to A =91. We investigate the contribution of two- and three-body Λ-nucleon forces to the Λ binding energy. Method: Ground state energies are computed solving the Schrödinger equation for nonrelativistic baryons by means of the auxiliary field diffusion Monte Carlo algorithm extended to the hypernuclear sector. Results: We show that a simple adjustment of the parameters of the ΛNN three-body force yields a very good agreement with available experimental data over a wide range of hypernuclear masses. In some cases no experiments have been performed yet, and we give new predictions. Conclusions: The newly fitted ΛNN force properly describes the physics of medium-heavy Λ hypernuclei, correctly reproducing the saturation property of the hyperon separation energy.

  4. Accurate ab Initio Spin Densities

    PubMed Central

    2012-01-01

    We present an approach for the calculation of spin density distributions for molecules that require very large active spaces for a qualitatively correct description of their electronic structure. Our approach is based on the density-matrix renormalization group (DMRG) algorithm to calculate the spin density matrix elements as a basic quantity for the spatially resolved spin density distribution. The spin density matrix elements are directly determined from the second-quantized elementary operators optimized by the DMRG algorithm. As an analytic convergence criterion for the spin density distribution, we employ our recently developed sampling-reconstruction scheme [J. Chem. Phys. 2011, 134, 224101] to build an accurate complete-active-space configuration-interaction (CASCI) wave function from the optimized matrix product states. The spin density matrix elements can then also be determined as an expectation value employing the reconstructed wave function expansion. Furthermore, the explicit reconstruction of a CASCI-type wave function provides insight into chemically interesting features of the molecule under study such as the distribution of α and β electrons in terms of Slater determinants, CI coefficients, and natural orbitals. The methodology is applied to an iron nitrosyl complex which we have identified as a challenging system for standard approaches [J. Chem. Theory Comput. 2011, 7, 2740]. PMID:22707921

  5. On The Reproducibility of Seasonal Land-surface Climate

    SciTech Connect

    Phillips, T J

    2004-10-22

    The sensitivity of the continental seasonal climate to initial conditions is estimated from an ensemble of decadal simulations of an atmospheric general circulation model with the same specifications of radiative forcings and monthly ocean boundary conditions, but with different initial states of atmosphere and land. As measures of the "reproducibility" of continental climate for different initial conditions, spatio-temporal correlations are computed across paired realizations of eleven model land-surface variables in which the seasonal cycle is either included or excluded; the former case is pertinent to climate simulation, and the latter to seasonal anomaly prediction. It is found that the land-surface variables which include the seasonal cycle are impacted only marginally by changes in initial conditions; moreover, their seasonal climatologies exhibit high spatial reproducibility. In contrast, the reproducibility of a seasonal land-surface anomaly is generally low, although it is substantially higher in the Tropics; its spatial reproducibility also markedly fluctuates in tandem with warm and cold phases of the El Niño/Southern Oscillation. However, the overall degree of reproducibility depends strongly on the particular land-surface anomaly considered. It is also shown that the predictability of a land-surface anomaly implied by its reproducibility statistics is consistent with what is inferred from more conventional predictability metrics. Implications of these results for climate model intercomparison projects and for operational forecasts of seasonal continental climate are also elaborated.
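
    As a hedged illustration of the paired-realization measure described above, the sketch below computes a spatio-temporal correlation between two runs of one land-surface variable, optionally removing the mean annual cycle first; the array shapes and function name are assumptions for illustration, not the study's code.

```python
import numpy as np

def reproducibility_correlation(run_a, run_b, remove_seasonal_cycle=False):
    """Spatio-temporal correlation between two paired realizations.

    run_a, run_b : arrays of shape (n_months, n_gridpoints) holding one
    land-surface variable from two simulations that differ only in their
    initial conditions (shapes and names are illustrative assumptions).
    """
    a = np.asarray(run_a, float).copy()
    b = np.asarray(run_b, float).copy()
    if remove_seasonal_cycle:
        # Subtract each run's mean annual cycle so only anomalies remain.
        months = np.arange(a.shape[0]) % 12
        for m in range(12):
            idx = months == m
            a[idx] -= a[idx].mean(axis=0)
            b[idx] -= b[idx].mean(axis=0)
    a = a.ravel() - a.mean()
    b = b.ravel() - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
```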

  6. Standardization of Hemagglutination Inhibition Assay for Influenza Serology Allows for High Reproducibility between Laboratories.

    PubMed

    Zacour, Mary; Ward, Brian J; Brewer, Angela; Tang, Patrick; Boivin, Guy; Li, Yan; Warhuus, Michelle; McNeil, Shelly A; LeBlanc, Jason J; Hatchette, Todd F

    2016-03-01

    Standardization of the hemagglutination inhibition (HAI) assay for influenza serology is challenging. Poor reproducibility of HAI results from one laboratory to another is widely cited, limiting comparisons between candidate vaccines in different clinical trials and posing challenges for licensing authorities. In this study, we standardized HAI assay materials, methods, and interpretive criteria across five geographically dispersed laboratories of a multidisciplinary influenza research network and then evaluated intralaboratory and interlaboratory variations in HAI titers by repeatedly testing standardized panels of human serum samples. Duplicate precision and reproducibility from comparisons between assays within laboratories were 99.8% (99.2% to 100%) and 98.0% (93.3% to 100%), respectively. The results for 98.9% (95% to 100%) of the samples were within 2-fold of all-laboratory consensus titers, and the results for 94.3% (85% to 100%) of the samples were within 2-fold of our reference laboratory data. Low-titer samples showed the greatest variability in comparisons between assays and between sites. Classification of seroprotection (titer ≥ 40) was accurate in 93.6% or 89.5% of cases in comparison to the consensus or reference laboratory classification, respectively. This study showed that with carefully chosen standardization processes, high reproducibility of HAI results between laboratories is indeed achievable. PMID:26818953
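
    The two agreement criteria used above (within 2-fold of a consensus titer, and concordant seroprotection calls at a titer of 40) are simple to compute; the sketch below is an illustrative summary of that kind of comparison, with function and argument names assumed rather than taken from the study.

```python
import numpy as np

def hai_agreement(titers, reference, fold=2, seroprotection_cutoff=40):
    """Summarize agreement of HAI titers against a consensus/reference panel.

    titers, reference : titers for the same serum panel from a test lab and
    from the consensus or reference lab (names are illustrative).
    Returns (fraction within `fold`-fold of reference,
             fraction with concordant seroprotection calls).
    """
    t = np.asarray(titers, float)
    r = np.asarray(reference, float)
    within_fold = float(np.mean((t >= r / fold) & (t <= r * fold)))
    concordant = float(np.mean((t >= seroprotection_cutoff) ==
                               (r >= seroprotection_cutoff)))
    return within_fold, concordant

# Toy panel of six sera (made-up numbers).
print(hai_agreement([10, 40, 80, 160, 20, 640], [10, 80, 80, 320, 40, 640]))
```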

  7. Using prediction markets to estimate the reproducibility of scientific research.

    PubMed

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A; Johannesson, Magnus

    2015-12-15

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants' individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a "statistically significant" finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications. PMID:26553988
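
    The claim that a single "statistically significant" finding is weak evidence when prior probabilities are low can be illustrated with the standard positive-predictive-value calculation; the power and alpha values below are assumptions chosen for illustration, not figures from the paper.

```python
def posterior_true(prior, power=0.8, alpha=0.05):
    """P(hypothesis true | significant result), standard PPV formula."""
    return (power * prior) / (power * prior + alpha * (1.0 - prior))

# With the median prior of ~9% reported in the abstract and assumed
# power/alpha, one significant finding lifts the probability only to ~0.61;
# a significant, well-powered replication is needed to push it much higher.
p1 = posterior_true(0.09)   # ~0.61 after the original study
p2 = posterior_true(p1)     # ~0.96 after a significant replication
print(round(p1, 2), round(p2, 2))
```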

  8. Making neurophysiological data analysis reproducible: why and how?

    PubMed

    Delescluse, Matthieu; Franconville, Romain; Joucla, Sébastien; Lieury, Tiffany; Pouzat, Christophe

    2012-01-01

    Reproducible data analysis is an approach aiming at complementing classical printed scientific articles with everything required to independently reproduce the results they present. "Everything" covers here: the data, the computer codes and a precise description of how the code was applied to the data. A brief history of this approach is presented first, starting with what economists have been calling replication since the early eighties and ending with what is now called reproducible research in computational data analysis oriented fields like statistics and signal processing. Since efficient tools are instrumental for a routine implementation of these approaches, a description of some of the available ones is presented next. A toy example then demonstrates the use of two open source software programs for reproducible data analysis: the "Sweave family" and the org-mode of Emacs. The former is bound to R while the latter can be used with R, Matlab, Python and many more "generalist" data processing software packages. Both solutions can be used with Unix-like, Windows and Mac families of operating systems. It is argued that neuroscientists could communicate their results much more efficiently by adopting the reproducible research paradigm from their lab books all the way to their articles, theses and books. PMID:21986476

  9. Using prediction markets to estimate the reproducibility of scientific research

    PubMed Central

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A.; Johannesson, Magnus

    2015-01-01

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants’ individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a “statistically significant” finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications. PMID:26553988

  10. Reproducibility of regional brain metabolic responses to lorazepam

    SciTech Connect

    Wang, G.J.; Volkow, N.D.; Overall, J.

    1996-10-01

    Changes in regional brain glucose metabolism in response to benzodiazepine agonists have been used as indicators of benzodiazepine-GABA receptor function. The purpose of this study was to assess the reproducibility of these responses. Sixteen healthy right-handed men underwent scanning with PET and [¹⁸F]fluorodeoxyglucose (FDG) twice: before placebo and before lorazepam (30 µg/kg). The same double FDG procedure was repeated 6-8 wk later on the same men to assess test-retest reproducibility. The regional absolute brain metabolic values obtained during the second evaluation were significantly lower than those obtained from the first evaluation regardless of condition (p ≤ 0.001). Lorazepam significantly and consistently decreased whole-brain metabolism, and the magnitude and regional pattern of the changes were comparable for both studies (12.3% ± 6.9% and 13.7% ± 7.4%). Lorazepam effects were largest in the thalamus (22.2% ± 8.6% and 22.4% ± 6.9%) and occipital cortex (19% ± 8.9% and 21.8% ± 8.9%). Relative metabolic measures were highly reproducible for both the pharmacologic and the replication conditions. This study measured the test-retest reproducibility of regional brain metabolic responses; although the global and regional metabolic values were significantly lower for the repeated evaluation, the response to lorazepam was highly reproducible.

  11. Reproducibility of radiomics for deciphering tumor phenotype with imaging

    PubMed Central

    Zhao, Binsheng; Tan, Yongqiang; Tsai, Wei-Yann; Qi, Jing; Xie, Chuanmiao; Lu, Lin; Schwartz, Lawrence H.

    2016-01-01

    Radiomics (radiogenomics) characterizes tumor phenotypes based on quantitative image features derived from routine radiologic imaging to improve cancer diagnosis, prognosis, prediction and response to therapy. Although radiomic features must be reproducible to qualify as biomarkers for clinical care, little is known about how routine imaging acquisition techniques/parameters affect reproducibility. To begin to fill this knowledge gap, we assessed the reproducibility of a comprehensive, commonly-used set of radiomic features using a unique, same-day repeat computed tomography data set from lung cancer patients. Each scan was reconstructed at 6 imaging settings, varying slice thicknesses (1.25 mm, 2.5 mm and 5 mm) and reconstruction algorithms (sharp, smooth). Reproducibility was assessed using the repeat scans reconstructed at identical imaging setting (6 settings in total). In separate analyses, we explored differences in radiomic features due to different imaging parameters by assessing the agreement of these radiomic features extracted from the repeat scans reconstructed at the same slice thickness but different algorithms (3 settings in total). Our data suggest that radiomic features are reproducible over a wide range of imaging settings. However, smooth and sharp reconstruction algorithms should not be used interchangeably. These findings will raise awareness of the importance of properly setting imaging acquisition parameters in radiomics/radiogenomics research. PMID:27009765
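
    Test-retest agreement of a radiomic feature is often summarized with Lin's concordance correlation coefficient; whether this study used that particular metric is not stated here, so the sketch below is an illustrative computation with made-up feature values.

```python
import numpy as np

def concordance_correlation(x, y):
    """Lin's concordance correlation coefficient between repeat measurements."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return 2 * cov / (vx + vy + (mx - my) ** 2)

# One radiomic feature extracted from same-day repeat scans reconstructed
# at identical slice thickness and kernel (values are made up).
scan1 = [13.2, 9.8, 21.4, 17.0, 11.1]
scan2 = [13.0, 10.1, 20.9, 17.6, 11.4]
print(round(concordance_correlation(scan1, scan2), 3))
```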

  12. Repeatability, Reproducibility and Standardisation of a Laser Doppler Imaging Technique for the Evaluation of Normal Mouse Hindlimb Perfusion

    PubMed Central

    Greco, Adelaide; Ragucci, Monica; Liuzzi, Raffaele; Gargiulo, Sara; Gramanzini, Matteo; Coda, Anna Rita Daniela; Albanese, Sandra; Mancini, Marcello; Salvatore, Marco; Brunetti, Arturo

    2013-01-01

    Background Preclinical perfusion studies are useful for the improvement of diagnosis and therapy in dermatologic, cardiovascular and rheumatic human diseases. The Laser Doppler Perfusion Imaging (LDPI) technique has been used to evaluate superficial alterations of the skin microcirculation in surgically induced murine hindlimb ischemia. We assessed the reproducibility and the accuracy of LDPI acquisitions and identified several critical factors that could affect LDPI measurements in mice. Methods Twenty mice were analysed. Statistical standardisation and a repeatability and reproducibility analysis were performed on mouse perfusion signals with respect to differences in body temperature, the presence or absence of hair, the type of anaesthesia used for LDPI measurements and the position of the mouse body. Results We found excellent correlations among measurements made by the same operator (i.e., repeatability) under the same experimental conditions and by two different operators (i.e., reproducibility). A Bland-Altman analysis showed the absence of bias in repeatability (p = 0.29) or reproducibility (p = 0.89). The limits of agreement for repeatability were –0.357 and –0.033, and for reproducibility, they were –0.270 and 0.238. Significant differences in perfusion values were observed in different experimental groups. Conclusions Different experimental conditions must be considered as a starting point for the evaluation of new drugs and strategic therapies. PMID:23275085
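
    The Bland-Altman analysis mentioned in the results reduces to a bias and 95% limits of agreement over paired measurements; the sketch below is a minimal version of that calculation, with operator labels and perfusion values invented for illustration.

```python
import numpy as np

def bland_altman(measure_1, measure_2):
    """Bias and 95% limits of agreement between paired perfusion measurements."""
    a, b = np.asarray(measure_1, float), np.asarray(measure_2, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Paired LDPI perfusion values from the same operator (repeatability) or
# from two operators (reproducibility); numbers here are illustrative.
op1 = [0.82, 0.95, 1.10, 0.77, 0.90]
op2 = [0.80, 0.99, 1.05, 0.79, 0.93]
bias, (lo, hi) = bland_altman(op1, op2)
print(round(bias, 3), round(lo, 3), round(hi, 3))
```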

  13. Towards an accurate bioimpedance identification

    NASA Astrophysics Data System (ADS)

    Sanchez, B.; Louarroudi, E.; Bragos, R.; Pintelon, R.

    2013-04-01

    This paper describes the local polynomial method (LPM) for estimating the time-invariant bioimpedance frequency response function (FRF), considering both the output-error (OE) and the errors-in-variables (EIV) identification frameworks, and compares it with the traditional cross- and autocorrelation spectral analysis techniques. The bioimpedance FRF is measured with the multisine electrical impedance spectroscopy (EIS) technique. To show the overwhelming accuracy of the LPM approach, both the LPM and the classical cross- and autocorrelation spectral analysis techniques are evaluated using the same experimental data coming from a nonsteady-state measurement of time-varying in vivo myocardial tissue. The estimated error sources at the measurement frequencies due to noise, σ_nZ, and to the stochastic nonlinear distortions, σ_ZNL, have been converted to Ω and plotted over the bioimpedance spectrum for each framework. Ultimately, the impedance spectra have been fitted to a Cole impedance model using both an unweighted and a weighted complex nonlinear least squares (CNLS) algorithm. A table is provided with the relative standard errors on the estimated parameters to reveal which system identification framework should be used.
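
    The final fitting step can be sketched as follows; this is a generic, unweighted complex nonlinear least squares fit of a single-dispersion Cole model, with the parameterization, starting values, and synthetic data chosen as assumptions for illustration rather than a description of the paper's exact procedure.

```python
import numpy as np
from scipy.optimize import least_squares

def cole_impedance(freq_hz, r_inf, r0, tau, alpha):
    """Single-dispersion Cole model: Z(w) = R_inf + (R0 - R_inf)/(1 + (j*w*tau)**alpha)."""
    w = 2 * np.pi * freq_hz
    return r_inf + (r0 - r_inf) / (1 + (1j * w * tau) ** alpha)

def fit_cole(freq_hz, z_measured, p0=(50.0, 500.0, 1e-5, 0.8)):
    """Unweighted CNLS fit: stack real and imaginary residuals."""
    def residuals(p):
        z = cole_impedance(freq_hz, *p)
        return np.concatenate([(z - z_measured).real, (z - z_measured).imag])
    return least_squares(residuals, p0, bounds=([0, 0, 0, 0], [np.inf] * 3 + [1.0]))

# Usage on a synthetic multisine-style frequency grid with additive noise.
f = np.logspace(3, 6, 30)
z_true = cole_impedance(f, 40, 400, 2e-6, 0.85)
rng = np.random.default_rng(0)
z_noisy = z_true + rng.normal(0, 1, f.size) + 1j * rng.normal(0, 1, f.size)
print(fit_cole(f, z_noisy).x)
```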

  14. Accuracy of femoral templating in reproducing anatomical femoral offset in total hip replacement.

    PubMed

    Davies, H; Foote, J; Spencer, R F

    2007-01-01

    Restoration of hip biomechanics is a crucial component of successful total hip replacement. Preoperative templating is recommended to ensure that the size and orientation of implants is optimised. We studied how closely natural femoral offset could be reproduced using the manufacturers' templates for 10 femoral stems in common use in the UK. A series of 23 consecutive preoperative radiographs from patients who had undergone unilateral total hip replacement for unilateral osteoarthritis of the hip was employed. The change in offset between the templated position of the best-fitting template and the anatomical centre of the hip was measured. The templates were then ranked according to their ability to reproduce the normal anatomical offset. The most accurate was the CPS-Plus (Root Mean Square Error 2.0 mm) followed in rank order by: C stem (2.16), CPT (2.40), Exeter (3.23), Stanmore (3.28), Charnley (3.65), Corail (3.72), ABG II (4.30), Furlong HAC (5.08) and Furlong modular (7.14). A similar pattern of results was achieved when the standard error of variability of offset was analysed. We observed a wide variation in the ability of the femoral prosthesis templates to reproduce normal femoral offset. This variation was independent of the seniority of the observer. The templates of modern polished tapered stems with high modularity were best able to reproduce femoral offset. The current move towards digitisation of X-rays may offer manufacturers an opportunity to improve template designs in certain instances, and to develop appropriate computer software. PMID:19197861

  15. Skill, reproducibility and potential predictability of the West African monsoon in coupled GCMs

    NASA Astrophysics Data System (ADS)

    Philippon, N.; Doblas-Reyes, F. J.; Ruti, P. M.

    2010-07-01

    In the framework of the ENSEMBLES FP6 project, an ensemble prediction system based on five different state-of-the-art European coupled models has been developed. This study evaluates the performance of these models for forecasting the West African monsoon (WAM) at the monthly time scale. From simulations started on 1 May of each year and covering the period 1991-2001, the reproducibility and potential predictability (PP) of key parameters of the WAM (rainfall, zonal and meridional wind at four levels from the surface to 200 hPa, and specific humidity, from July to September) are assessed. The Sahelian rainfall mode of variability is not accurately reproduced, contrary to the Guinean rainfall one: the correlation between observations (from CMAP) and the multi-model ensemble mean is 0.17 and 0.55, respectively. For the Sahelian mode, the correlation is consistent with a low PP of about 6%. The PP of the Guinean mode is higher, about 44%, suggesting a stronger forcing of the sea surface temperature on rainfall variability over this region. Parameters relative to the atmospheric dynamics are on average much more skillful and reproducible than rainfall. Among them, the first mode of variability of the zonal wind at 200 hPa, which depicts the Tropical Easterly Jet, is correlated at 0.79 with its "observed" counterpart (from the NCEP/DOE2 reanalyses) and has a PP of 39%. Moreover, the models reproduce the correlations between all the atmospheric dynamics parameters and the Sahelian rainfall in a satisfactory way. In that context, a statistical adaptation of the atmospheric dynamics forecasts, using a linear regression model with the leading principal components of the atmospheric dynamical parameters studied, leads to moderate receiver operating characteristic (ROC) area-under-the-curve and correlation skill scores for the Sahelian rainfall. These scores are, however, much higher than those obtained using the modelled rainfall.

  16. Reproducibility of ad libitum energy intake with the use of a computerized vending machine system

    PubMed Central

    Votruba, Susanne B; Franks, Paul W; Krakoff, Jonathan; Salbe, Arline D

    2010-01-01

    Background: Accurate assessment of energy intake is difficult but critical for the evaluation of eating behavior and intervention effects. Consequently, methods to assess ad libitum energy intake under controlled conditions have been developed. Objective: Our objective was to evaluate the reproducibility of ad libitum energy intake with the use of a computerized vending machine system. Design: Twelve individuals (mean ± SD: 36 ± 8 y old; 41 ± 8% body fat) consumed a weight-maintaining diet for 3 d; subsequently, they self-selected all food with the use of a computerized vending machine system for an additional 3 d. Mean daily energy intake was calculated from the actual weight of foods consumed and expressed as a percentage of weight-maintenance energy needs (%WMEN). Subjects repeated the study multiple times during 2 y. The within-person reproducibility of energy intake was determined through the calculation of the intraclass correlation coefficients (ICCs) between visits. Results: Daily energy intake for all subjects was 5020 ± 1753 kcal during visit 1 and 4855 ± 1615 kcal during visit 2. There were no significant associations between energy intake and body weight, body mass index, or percentage body fat while subjects used the vending machines, which indicates that intake was not driven by body size or need. Despite overconsumption (%WMEN = 181 ± 57%), the reproducibility of intake between visits, whether expressed as daily energy intake (ICC = 0.90), %WMEN (ICC = 0.86), weight of food consumed (ICC = 0.87), or fat intake (g/d; ICC = 0.87), was highly significant (P < 0.0001). Conclusion: Although ad libitum energy intake exceeded %WMEN, the within-person reliability of this intake across multiple visits was high, which makes this a reproducible method for the measurement of ad libitum intake in subjects who reside in a research unit. This trial was registered at clinicaltrials.gov as NCT00342732. PMID:19923376
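
    The abstract reports intraclass correlation coefficients but does not state which ICC form was used, so the sketch below assumes the two-way, single-measure consistency form (ICC(3,1)) purely for illustration; the intake numbers are made up.

```python
import numpy as np

def icc_consistency(data):
    """Two-way mixed, single-measure, consistency ICC, i.e. ICC(3,1).

    data : array of shape (n_subjects, n_visits). The specific ICC form used
    in the original study is not stated, so this choice is an assumption.
    """
    y = np.asarray(data, float)
    n, k = y.shape
    grand = y.mean()
    ss_total = ((y - grand) ** 2).sum()
    ss_subjects = k * ((y.mean(axis=1) - grand) ** 2).sum()
    ss_visits = n * ((y.mean(axis=0) - grand) ** 2).sum()
    ms_subjects = ss_subjects / (n - 1)
    ms_error = (ss_total - ss_subjects - ss_visits) / ((n - 1) * (k - 1))
    return (ms_subjects - ms_error) / (ms_subjects + (k - 1) * ms_error)

# Daily energy intake (kcal) for four subjects at two vending-machine visits
# (illustrative values only).
intake = [[5100, 4950], [3800, 3900], [6200, 6050], [4400, 4600]]
print(round(icc_consistency(intake), 2))
```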

  17. On the reproducibility of science: unique identification of research resources in the biomedical literature

    PubMed Central

    Brush, Matthew H.; Paddock, Holly; Ponting, Laura; Tripathy, Shreejoy J.; LaRocca, Gregory M.; Haendel, Melissa A.

    2013-01-01

    Scientific reproducibility has been at the forefront of many news stories and there exist numerous initiatives to help address this problem. We posit that one contributor is simply a lack of the specificity required to enable adequate research reproducibility. In particular, the inability to uniquely identify research resources, such as antibodies and model organisms, makes it difficult or impossible to reproduce experiments even where the science is otherwise sound. In order to better understand the magnitude of this problem, we designed an experiment to ascertain the “identifiability” of research resources in the biomedical literature. We evaluated recent journal articles in the fields of Neuroscience, Developmental Biology, Immunology, Cell and Molecular Biology and General Biology, selected randomly based on a diversity of impact factors for the journals, publishers, and experimental method reporting guidelines. We attempted to uniquely identify model organisms (mouse, rat, zebrafish, worm, fly and yeast), antibodies, knockdown reagents (morpholinos or RNAi), constructs, and cell lines. Specific criteria were developed to determine if a resource was uniquely identifiable, and included examining relevant repositories (such as model organism databases, and the Antibody Registry), as well as vendor sites. The results of this experiment show that 54% of resources are not uniquely identifiable in publications, regardless of domain, journal impact factor, or reporting requirements. For example, in many cases the organism strain in which the experiment was performed or antibody that was used could not be identified. Our results show that identifiability is a serious problem for reproducibility. Based on these results, we provide recommendations to authors, reviewers, journal editors, vendors, and publishers. Scientific efficiency and reproducibility depend upon a research-wide improvement of this substantial problem in science today. PMID:24032093

  18. Benchmarking contactless acquisition sensor reproducibility for latent fingerprint trace evidence

    NASA Astrophysics Data System (ADS)

    Hildebrandt, Mario; Dittmann, Jana

    2015-03-01

    Optical, nanometer-range, contactless, non-destructive sensor devices are promising acquisition techniques in crime scene trace forensics, e.g. for digitizing latent fingerprint traces. Before new approaches are introduced in crime investigations, innovations need to be positively tested and their quality ensured. In this paper we investigate sensor reproducibility by studying different scans from four sensors: two chromatic white light sensors (CWL600/CWL1mm), one confocal laser scanning microscope, and one NIR/VIS/UV reflection spectrometer. First, we perform intra-sensor reproducibility testing for the CWL600 with a privacy-conforming test set of artificial-sweat-printed, computer-generated fingerprints. We use 24 different fingerprint patterns as original samples (printing samples/templates) for printing with artificial sweat (physical trace samples) and their acquisition with the contactless sensors, resulting in 96 sensor images, called scan or acquired samples. The second test set, for inter-sensor reproducibility assessment, consists of the first three patterns from the first test set, acquired in two consecutive scans using each device. We suggest using a simple feature set in the spatial and frequency domains, known from signal processing, and test its suitability with six different classifiers for classifying scan data into small differences (reproducible) and large differences (non-reproducible). Furthermore, we suggest comparing the classification results with biometric verification scores (calculated with NBIS, with a threshold of 40) as a biometric reproducibility score. The Bagging classifier is the most reliable classifier in nearly all cases in our experiments, and the results are also confirmed by the biometric matching rates.

  19. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2013-07-01 2013-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  20. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2010-07-01 2010-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  1. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2011-07-01 2011-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  2. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2014-07-01 2014-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  3. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2012-07-01 2012-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  4. REPRODUCIBILITY OF SMALL SCALE HYDRAULIC EXPERIMENT ON CROSS-LEVEE BREACH BY OVERFLOW

    NASA Astrophysics Data System (ADS)

    Watanabe, Yasuharu; Yamamoto, Masato; Shimada, Tomonori

    River bank failure can cause severe disasters, and many experiments have been conducted to study it. Full-scale hydraulic experiments are rare because space and budget limit what can be conducted; most experiments are therefore performed at small scale and need to be compared with the actual phenomenon. A large-scale hydraulic experiment on river bank failure was conducted at the Chiyoda experimental flume in 2008. We conducted small-scale experiments on river bank failure and examined their reproducibility by comparison with the Chiyoda experiment. Fine-grained soil strongly influences the widening process of bank failure: in order to reproduce the results of the Chiyoda flume experiment on dike failure, it is necessary to remove the fine material from the material of the scaled dikes. It is also found that the time scale of the widening process of bank failure follows the Froude similarity law.
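
    The Froude time-scale statement at the end can be made explicit. Under geometric scaling by a length ratio λ (the notation below is illustrative, not taken from the paper), equal Froude numbers in model and prototype imply:

```latex
% Froude similarity between scale model (m) and prototype (p),
% with length ratio \lambda = L_m / L_p
\mathrm{Fr} = \frac{U}{\sqrt{g L}} = \text{const.}
\quad\Rightarrow\quad
\frac{U_m}{U_p} = \sqrt{\lambda},
\qquad
\frac{t_m}{t_p} = \frac{L_m / U_m}{L_p / U_p} = \sqrt{\lambda}
```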

  5. Respiratory effort correction strategies to improve the reproducibility of lung expansion measurements

    SciTech Connect

    Du, Kaifang; Reinhardt, Joseph M.; Christensen, Gary E.; Ding, Kai; Bayouth, John E.

    2013-12-15

    correlated with respiratory effort difference (R = 0.744 for ELV in the cohort with tidal volume difference greater than 100 cc). In general for all subjects, global normalization, ETV and ELV significantly improved reproducibility compared to no effort correction (p = 0.009, 0.002, 0.005 respectively). When tidal volume difference was small (less than 100 cc), none of the three effort correction strategies improved reproducibility significantly (p = 0.52, 0.46, 0.46 respectively). For the cohort (N = 13) with tidal volume difference greater than 100 cc, the average gamma pass rate improves from 57.3% before correction to 66.3% after global normalization, and 76.3% after ELV. ELV was found to be significantly better than global normalization (p = 0.04 for all subjects, and p = 0.003 for the cohort with tidal volume difference greater than 100 cc).Conclusions: All effort correction strategies improve the reproducibility of the authors' pulmonary ventilation measures, and the improvement of reproducibility is highly correlated with the changes in respiratory effort. ELV gives better results as effort difference increase, followed by ETV, then global. However, based on the spatial and temporal heterogeneity in the lung expansion rate, a single scaling factor (e.g., global normalization) appears to be less accurate to correct the ventilation map when changes in respiratory effort are large.

  6. Accurate Fission Data for Nuclear Safety

    NASA Astrophysics Data System (ADS)

    Solders, A.; Gorelov, D.; Jokinen, A.; Kolhinen, V. S.; Lantz, M.; Mattera, A.; Penttilä, H.; Pomp, S.; Rakopoulos, V.; Rinta-Antila, S.

    2014-05-01

    The Accurate fission data for nuclear safety (AlFONS) project aims at high-precision measurements of fission yields, using the renewed IGISOL mass separator facility in combination with a new high-current light-ion cyclotron at the University of Jyväskylä. The 30 MeV proton beam will be used to create fast and thermal neutron spectra for the study of neutron-induced fission yields. Thanks to a series of mass-separating elements, culminating with the JYFLTRAP Penning trap, it is possible to achieve a mass resolving power on the order of a few hundred thousand. In this paper we present the experimental setup and the design of a neutron converter target for IGISOL. The goal is to have a flexible design. For studies of exotic nuclei far from stability, a high neutron flux (10¹² neutrons/s) at energies of 1-30 MeV is desired, while for reactor applications neutron spectra that resemble those of thermal and fast nuclear reactors are preferred. It is also desirable to be able to produce (semi-)monoenergetic neutrons for benchmarking and to study the energy dependence of fission yields. The scientific program is extensive and is planned to start in 2013 with a measurement of isomeric yield ratios of proton-induced fission in uranium. This will be followed by studies of independent yields of thermal and fast neutron-induced fission of various actinides.

  7. Must Kohn-Sham oscillator strengths be accurate at threshold?

    SciTech Connect

    Yang Zenghui; Burke, Kieron; Faassen, Meta van

    2009-09-21

    The exact ground-state Kohn-Sham (KS) potential for the helium atom is known from accurate wave function calculations of the ground-state density. The threshold for photoabsorption from this potential matches the physical system exactly. By carefully studying its absorption spectrum, we show the answer to the title question is no. To address this problem in detail, we generate a highly accurate simple fit of a two-electron spectrum near the threshold, and apply the method to both the experimental spectrum and that of the exact ground-state Kohn-Sham potential.

  8. Accurate Measurement of the Effects of All Amino-Acid Mutations on Influenza Hemagglutinin.

    PubMed

    Doud, Michael B; Bloom, Jesse D

    2016-01-01

    Influenza genes evolve mostly via point mutations, and so knowing the effect of every amino-acid mutation provides information about evolutionary paths available to the virus. We and others have combined high-throughput mutagenesis with deep sequencing to estimate the effects of large numbers of mutations to influenza genes. However, these measurements have suffered from substantial experimental noise due to a variety of technical problems, the most prominent of which is bottlenecking during the generation of mutant viruses from plasmids. Here we describe advances that ameliorate these problems, enabling us to measure with greatly improved accuracy and reproducibility the effects of all amino-acid mutations to an H1 influenza hemagglutinin on viral replication in cell culture. The largest improvements come from using a helper virus to reduce bottlenecks when generating viruses from plasmids. Our measurements confirm at much higher resolution the results of previous studies suggesting that antigenic sites on the globular head of hemagglutinin are highly tolerant of mutations. We also show that other regions of hemagglutinin-including the stalk epitopes targeted by broadly neutralizing antibodies-have a much lower inherent capacity to tolerate point mutations. The ability to accurately measure the effects of all influenza mutations should enhance efforts to understand and predict viral evolution. PMID:27271655
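
    One common way to summarize this kind of mutagenesis-plus-deep-sequencing data is a log2 enrichment of each mutation relative to wild type between the pre-selection library and the passaged virus; this is a generic summary statistic sketched for illustration, not necessarily the estimator used by the authors, and the counts below are invented.

```python
import numpy as np

def log2_enrichment(pre_mut, pre_wt, post_mut, post_wt, pseudocount=0.5):
    """Log2 enrichment of a mutation between pre-selection (library) and
    post-selection (passaged virus) read counts, relative to wild type at
    the same site. Generic metric, not the study's exact estimator."""
    pre_ratio = (pre_mut + pseudocount) / (pre_wt + pseudocount)
    post_ratio = (post_mut + pseudocount) / (post_wt + pseudocount)
    return np.log2(post_ratio / pre_ratio)

# A tolerated mutation keeps (or gains) frequency after selection...
print(round(log2_enrichment(pre_mut=120, pre_wt=9000, post_mut=150, post_wt=9000), 2))
# ...while a deleterious one is strongly depleted.
print(round(log2_enrichment(pre_mut=120, pre_wt=9000, post_mut=4, post_wt=9000), 2))
```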

  9. Accurate Measurement of the Effects of All Amino-Acid Mutations on Influenza Hemagglutinin

    PubMed Central

    Doud, Michael B.; Bloom, Jesse D.

    2016-01-01

    Influenza genes evolve mostly via point mutations, and so knowing the effect of every amino-acid mutation provides information about evolutionary paths available to the virus. We and others have combined high-throughput mutagenesis with deep sequencing to estimate the effects of large numbers of mutations to influenza genes. However, these measurements have suffered from substantial experimental noise due to a variety of technical problems, the most prominent of which is bottlenecking during the generation of mutant viruses from plasmids. Here we describe advances that ameliorate these problems, enabling us to measure with greatly improved accuracy and reproducibility the effects of all amino-acid mutations to an H1 influenza hemagglutinin on viral replication in cell culture. The largest improvements come from using a helper virus to reduce bottlenecks when generating viruses from plasmids. Our measurements confirm at much higher resolution the results of previous studies suggesting that antigenic sites on the globular head of hemagglutinin are highly tolerant of mutations. We also show that other regions of hemagglutinin—including the stalk epitopes targeted by broadly neutralizing antibodies—have a much lower inherent capacity to tolerate point mutations. The ability to accurately measure the effects of all influenza mutations should enhance efforts to understand and predict viral evolution. PMID:27271655

  10. History and progress on accurate measurements of the Planck constant

    NASA Astrophysics Data System (ADS)

    Steiner, Richard

    2013-01-01

    The measurement of the Planck constant, h, is entering a new phase. The CODATA 2010 recommended value is 6.626 069 57 × 10⁻³⁴ J s, but it has been a long road, and the trip is not over yet. Since its discovery as a fundamental physical constant to explain various effects in quantum theory, h has become especially important in defining standards for electrical measurements and soon, for mass determination. Measuring h in the International System of Units (SI) started as experimental attempts merely to prove its existence. Many decades passed while newer experiments measured physical effects that were the influence of h combined with other physical constants: elementary charge, e, and the Avogadro constant, N_A. As experimental techniques improved, the precision of the value of h expanded. When the Josephson and quantum Hall theories led to new electronic devices, and a hundred year old experiment, the absolute ampere, was altered into a watt balance, h not only became vital in definitions for the volt and ohm units, but suddenly it could be measured directly and even more accurately. Finally, as measurement uncertainties now approach a few parts in 10⁸ from the watt balance experiments and Avogadro determinations, its importance has been linked to a proposed redefinition of a kilogram unit of mass. The path to higher accuracy in measuring the value of h was not always an example of continuous progress. Since new measurements periodically led to changes in its accepted value and the corresponding SI units, it is helpful to see why there were bumps in the road and where the different branch lines of research joined in the effort. Recalling the bumps along this road will hopefully avoid their repetition in the upcoming SI redefinition debates. This paper begins with a brief history of the methods to measure a combination of fundamental constants, thus indirectly obtaining the Planck constant. The historical path is followed in the section describing how the improved

  11. History and progress on accurate measurements of the Planck constant.

    PubMed

    Steiner, Richard

    2013-01-01

    The measurement of the Planck constant, h, is entering a new phase. The CODATA 2010 recommended value is 6.626 069 57 × 10⁻³⁴ J s, but it has been a long road, and the trip is not over yet. Since its discovery as a fundamental physical constant to explain various effects in quantum theory, h has become especially important in defining standards for electrical measurements and soon, for mass determination. Measuring h in the International System of Units (SI) started as experimental attempts merely to prove its existence. Many decades passed while newer experiments measured physical effects that were the influence of h combined with other physical constants: elementary charge, e, and the Avogadro constant, N_A. As experimental techniques improved, the precision of the value of h expanded. When the Josephson and quantum Hall theories led to new electronic devices, and a hundred year old experiment, the absolute ampere, was altered into a watt balance, h not only became vital in definitions for the volt and ohm units, but suddenly it could be measured directly and even more accurately. Finally, as measurement uncertainties now approach a few parts in 10⁸ from the watt balance experiments and Avogadro determinations, its importance has been linked to a proposed redefinition of a kilogram unit of mass. The path to higher accuracy in measuring the value of h was not always an example of continuous progress. Since new measurements periodically led to changes in its accepted value and the corresponding SI units, it is helpful to see why there were bumps in the road and where the different branch lines of research joined in the effort. Recalling the bumps along this road will hopefully avoid their repetition in the upcoming SI redefinition debates. This paper begins with a brief history of the methods to measure a combination of fundamental constants, thus indirectly obtaining the Planck constant. The historical path is followed in the section describing how the

  12. Reproducibility of Tactile Assessments for Children with Unilateral Cerebral Palsy

    ERIC Educational Resources Information Center

    Auld, Megan Louise; Ware, Robert S.; Boyd, Roslyn Nancy; Moseley, G. Lorimer; Johnston, Leanne Marie

    2012-01-01

    A systematic review identified tactile assessments used in children with cerebral palsy (CP), but their reproducibility is unknown. Sixteen children with unilateral CP and 31 typically developing children (TDC) were assessed 2-4 weeks apart. Test-retest percent agreements within one point for children with unilateral CP (and TDC) were…

  13. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 2 2012-10-01 2012-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PIPING SYSTEMS AND... joint tightness, mechanical strength and the nature of the fluid handled....

  14. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PIPING SYSTEMS AND... joint tightness, mechanical strength and the nature of the fluid handled....

  15. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 2 2013-10-01 2013-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PIPING SYSTEMS AND... joint tightness, mechanical strength and the nature of the fluid handled....

  16. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PIPING SYSTEMS AND... joint tightness, mechanical strength and the nature of the fluid handled....

  17. Reproducibility of polycarbonate reference material in toxicity evaluation

    NASA Technical Reports Server (NTRS)

    Hilado, C. J.; Huttlinger, P. A.

    1981-01-01

    A specific lot of bisphenol A polycarbonate has been used for almost four years as the reference material for the NASA-USF-PSC toxicity screening test method. The reproducibility of the test results over this period of time indicates that certain plastics may be more suitable reference materials than the more traditional cellulosic materials.

  18. Reproducibility of an imaging based prostate cancer prognostic assay

    NASA Astrophysics Data System (ADS)

    Khan, Faisal M.; Powell, Douglas; Bayer-Zubek, Valentina; Soares, Rui; Mott, Allison; Fernandez, Gerardo; Mesa-Tejada, Ricardo; Donovan, Michael J.

    2011-03-01

    The Prostate Px prognostic assay offered by Aureon Biosciences is designed to predict progression post primary treatment for prostate cancer patients based on their diagnostic biopsy specimen. The assay is driven by the automated image analysis of biological specimens. Three different histological sections are analyzed for morphometric as well as immunofluorescence protein expression properties within areas of tumor digitally masked by expert pathologists. The assay was developed on a multi-institution cohort of up to 9 images from each of 1027 patients. The variation in histological sections, staining, pathologist tumor masking and the region of image acquisition all have the potential to significantly impact imaging features and consequently the reproducibility of the assay's results for the same patient. This study analyzed the reproducibility of the assay in 50 patients who were re-processed within 3 months in a blinded fashion as de-novo patients. The key assay results reported were in agreement in 94% of the cases. The two independent endpoints of risk classification reproduced results in 90% and 92% of the predictions. This work presents one of the first assessments of the reproducibility of a commercial assay's results given the inherent variations in images and quantitative imaging characteristics in a commercial setting.

  19. Latin America Today: An Atlas of Reproducible Pages. Revised Edition.

    ERIC Educational Resources Information Center

    World Eagle, Inc., Wellesley, MA.

    This document contains reproducible maps, charts and graphs of Latin America for use by teachers and students. The maps are divided into five categories (1) the land; (2) peoples, countries, cities, and governments; (3) the national economies, product, trade, agriculture, and resources; (4) energy, education, employment, illicit drugs, consumer…

  20. Measurement of Liver Iron Concentration by MRI Is Reproducible

    PubMed Central

    Alústiza, José María; Emparanza, José I.; Castiella, Agustín; Casado, Alfonso; Aldazábal, Pablo; San Vicente, Manuel; Garcia, Nerea; Asensio, Ana Belén; Banales, Jesús; Salvador, Emma; Moyua, Aranzazu; Arozena, Xabier; Zarco, Miguel; Jauregui, Lourdes; Vicente, Ohiana

    2015-01-01

    Purpose. The objectives were (i) construction of a phantom to reproduce the behavior of iron overload in the liver by MRI and (ii) assessment of the variability of a previously validated method to quantify liver iron concentration between different MRI devices using the phantom and patients. Materials and Methods. A phantom reproducing the liver/muscle ratios of two patients with intermediate and high iron overload was constructed. Nine patients with different levels of iron overload were studied in 4 multivendor devices, and 8 of them were studied twice in the machine where the model was developed. The phantom was analysed in the same equipment and 14 times in the reference machine. Results. FeCl3 solutions containing 0.3, 0.5, 0.6, and 1.2 mg Fe/mL were chosen to generate the phantom. The average intramachine variability for patients was 10% and the average intermachine variability was 8%. For the phantom, the intramachine coefficient of variation was always below 0.1, and the average intermachine variability was 10% for moderate and 5% for high iron overload. Conclusion. The phantom reproduces the behavior of patients with moderate or high iron overload. The proposed method of calculating liver iron concentration is reproducible across several different 1.5 T systems. PMID:25874207

  1. Highly reproducible SERS arrays directly written by inkjet printing.

    PubMed

    Yang, Qiang; Deng, Mengmeng; Li, Huizeng; Li, Mingzhu; Zhang, Cong; Shen, Weizhi; Li, Yanan; Guo, Dan; Song, Yanlin

    2015-01-14

    SERS arrays with uniform gold nanoparticle distribution were fabricated by direct-writing with an inkjet printing method. Quantitative analysis based on Raman detection was achieved with a small standard statistical deviation of less than 4% for the reproducibility and less than 5% for the long-term stability for 12 weeks. PMID:25308163

  2. Slide rule-type color chart predicts reproduced photo tones

    NASA Technical Reports Server (NTRS)

    Griffin, J. D.

    1966-01-01

    Slide rule-type color chart determines the final reproduced gray tones in the production of briefing charts that are photographed in black and white. The chart shows both the color, identified by drafting-paint manufacturer's name and mixture number, and the gray tone resulting from black and white photographic reproduction.

  3. Reproducibility of topographic measures of the glaucomatous optic nerve head

    PubMed Central

    Geyer, O; Michaeli-Cohen, A; Silver, D; Versano, D; Neudorfer, M; Dzhanov, R; Lazar, M

    1998-01-01

    AIMS/BACKGROUND—Laser scanning tomography provides an assessment of three dimensional optic disc topography. For the clinical purpose of follow up of glaucoma patients, the repeatability of the various measured variables is essential. In the present study, the reproducibility of morphometric variables calculated by the topographic scanning system, TopSS (Laser Diagnostic Technology, San Diego, CA) was investigated.
METHODS—Two independent measurements (30 minutes apart) each consisting of three complete images of the optic disc were performed on 16 eyes of 16 glaucoma patients using a TopSS. The instrument calculates 14 morphometric variables for the characterisation of the optic disc.
RESULTS—Two-tailed paired tests indicated good reproducibility for all variables. However, correlation and regression analyses showed that only three variables (volume below, half depth area, and average cup depth) are acceptably reproducible.
CONCLUSION—The TopSS provides three variables which describe the physiological shape of the optic disc that have high reproducibility. These three variables might be useful for following the progression of optic disc changes in glaucoma patients.

 Keywords: optic nerve head; scanning laser; glaucoma; tomography PMID:9536873

  4. The United States Today: An Atlas of Reproducible Pages.

    ERIC Educational Resources Information Center

    World Eagle, Inc., Wellesley, MA.

    Black and white maps, graphs and tables that may be reproduced are presented in this volume focusing on the United States. Some of the features of the United States depicted are: size, population, agriculture and resources, manufactures, trade, citizenship, employment, income, poverty, the federal budget, energy, health, education, crime, and the…

  5. Reproducibility of the vascular response to heating in human skin.

    PubMed

    Savage, M V; Brengelmann, G L

    1994-04-01

    Blood flow in human skin increases enormously in response to direct heating. If local skin temperature is held above 42 °C, blood flow eventually stabilizes at a level beyond which other influences, barring change in blood pressure, can produce no further increase. If this maximal level is a reproducible characteristic of an individual's cutaneous vasculature, it could be useful in comparing individuals; for example, in their response to hyperthermia. Our experiments were carried out to discover whether the maximal response of the vasculature of the skin of the forearm can be reproduced within reasonable limits and, also, to clarify the time course of the response. We used water sprayed over the surface of the forearms of 10 subjects to hold skin temperature above 42 °C for 60 min. During the last 10 min of heating, forearm blood flow (via venous occlusion plethysmography) was stable, at a level ranging from 16 to 38 ml·min⁻¹·100 ml⁻¹. This level, normalized to a blood pressure of 100 mmHg, was reproduced in a given individual on four or five occasions, with an average coefficient of variation of 10%. The response was 77 ± 11% (SD) complete after 20 min of heating. Elapsed time at 90% of the final value was 35 ± 9 (SD) min. We conclude that the maximal forearm blood flow response to local heating is a reproducible characteristic of the cutaneous vasculature with potential utility in the scaling of responses between and within individuals. PMID:8045857

  6. Can quasiclassical trajectory calculations reproduce the extreme kinetic isotope effect observed in the muonic isotopologues of the H + H2 reaction?

    NASA Astrophysics Data System (ADS)

    Jambrina, P. G.; García, Ernesto; Herrero, Víctor J.; Sáez-Rábanos, Vicente; Aoiz, F. J.

    2011-07-01

    Rate coefficients for the mass-extreme isotopologues of the H + H2 reaction, namely, Mu + H2, where Mu is muonium, and Heμ + H2, where Heμ is a He atom in which one of the electrons has been replaced by a negative muon, have been calculated in the 200-1000 K temperature range by means of accurate quantum mechanical (QM) and quasiclassical trajectory (QCT) calculations and compared with the experimental and theoretical results recently reported by Fleming et al. [Science 331, 448 (2011); doi:10.1126/science.1199421]. The QCT calculations can reproduce the experimental and QM rate coefficients and kinetic isotope effect (KIE), k_Mu(T)/k_Heμ(T), if the Gaussian binning procedure (QCT-GB), weighting the trajectories according to their proximity to the right quantal vibrational action, is applied. The analysis of the results shows that the large zero point energy of the MuH product is the key factor for the large KIE observed.
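
    The Gaussian binning weighting can be sketched as below; the width parameter is an assumption chosen for illustration, and the calculation of the classical vibrational action from each trajectory (the hard part of a real QCT-GB code) is deliberately omitted.

```python
import numpy as np

def gaussian_binning_weight(v_classical, width=0.1):
    """Weight a trajectory by how close its real-valued classical vibrational
    action v_classical lies to the nearest integer quantum number, instead of
    simple histogram (standard) binning."""
    v_quantum = np.round(v_classical)
    return np.exp(-((v_classical - v_quantum) ** 2) / (2.0 * width ** 2))

def gaussian_binned_average(observable, v_classical, width=0.1):
    """Gaussian-binned average of a per-trajectory observable
    (e.g. a reaction indicator used to build a rate coefficient)."""
    w = gaussian_binning_weight(np.asarray(v_classical, float), width)
    return float(np.sum(w * np.asarray(observable, float)) / np.sum(w))
```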

  7. Reproducibility of dual-photon absorptiometry using a clinical phantom

    SciTech Connect

    DaCosta, M.; DeLaney, M.; Goldsmith, S.J.

    1985-05-01

    The use of dual-photon absorptiometry (DPA) bone mineral density (BMD) measurements for diagnosing osteoporosis and monitoring its therapy has been established. The objective of this study is to determine the reproducibility of DPA measurements. A phantom was constructed using a section of human bony pelvis and lumbo-sacral spine. Provisions were made to mimic changes in patient girth. To evaluate the DPA reproducibility within a single day, 12 consecutive studies were performed on the phantom using standard acquisition and processing procedures. The mean BMD ± 1 SD (in g/cm²) of lumbar vertebrae 2-4 was 0.771 ± 0.007, with a coefficient of variation (CV, 1 SD) of 0.97%. This evaluation was repeated 7 times over the next 4 months, with 3 to 6 studies performed each time; the maximum CV found was 1.93%. In order to evaluate the DPA reproducibility over time, phantom studies were performed over a 7 month period which included a ¹⁵³Gd source change. The mean BMD was 0.770 ± 0.017 with a CV of 2.15%. DPA reproducibility with patient girth changes was evaluated by performing the phantom studies at water depths of 12.5, 17.0 and 20.0 cm. Five studies at each depth were performed using standard acquisition and processing procedures. The mean BMD was 0.779 ± 0.012 with a CV of 1.15%. Based on these results, BMD measurements by DPA are reproducible within 2%. This reliability is maintained for studies performed over extended periods of time and is independent of changes in patient girth.

  8. Tract Specific Reproducibility of Tractography Based Morphology and Diffusion Metrics

    PubMed Central

    Besseling, René M. H.; Jansen, Jacobus F. A.; Overvliet, Geke M.; Vaessen, Maarten J.; Braakman, Hilde M. H.; Hofman, Paul A. M.; Aldenkamp, Albert P.; Backes, Walter H.

    2012-01-01

    Introduction The reproducibility of tractography is important to determine its sensitivity to pathological abnormalities. The reproducibility of tract morphology has not yet been systematically studied and the recently developed tractography contrast Tract Density Imaging (TDI) has not yet been assessed at the tract specific level. Materials and Methods Diffusion tensor imaging (DTI) and probabilistic constrained spherical deconvolution (CSD) tractography are performed twice in 9 healthy subjects. Tractography is based on common space seed and target regions and performed for several major white matter tracts. Tractograms are converted to tract segmentations and inter-session reproducibility of tract morphology is assessed using Dice similarity coefficient (DSC). The coefficient of variation (COV) and intraclass correlation coefficient (ICC) are calculated of the following tract metrics: fractional anisotropy (FA), apparent diffusion coefficient (ADC), volume, and TDI. Analyses are performed both for proximal (deep white matter) and extended (including subcortical white matter) tract segmentations. Results Proximal DSC values were 0.70–0.92. DSC values were 5–10% lower in extended compared to proximal segmentations. COV/ICC values of FA, ADC, volume and TDI were 1–4%/0.65–0.94, 2–4%/0.62–0.94, 3–22%/0.53–0.96 and 8–31%/0.48–0.70, respectively, with the lower COV and higher ICC values found in the proximal segmentations. Conclusion For all investigated metrics, reproducibility depended on the segmented tract. FA and ADC had relatively low COV and relatively high ICC, indicating clinical potential. Volume had higher COV but its moderate to high ICC values in most tracts still suggest subject-differentiating power. Tract TDI had high COV and relatively low ICC, which reflects unfavorable reproducibility. PMID:22485157
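
    The Dice similarity coefficient used above has a simple definition; the sketch below computes it for two binary tract segmentations on a common grid, with toy masks standing in for real thresholded tractograms.

```python
import numpy as np

def dice_similarity(mask_a, mask_b):
    """Dice similarity coefficient between two binary tract segmentations."""
    a = np.asarray(mask_a, bool)
    b = np.asarray(mask_b, bool)
    intersection = np.logical_and(a, b).sum()
    total = a.sum() + b.sum()
    return 2.0 * intersection / total if total else 1.0

# Two sessions' segmentations of the same tract on a common grid
# (tiny toy volumes; real masks come from thresholded tractograms).
s1 = np.zeros((4, 4, 4), bool); s1[1:3, 1:3, 1:3] = True
s2 = np.zeros((4, 4, 4), bool); s2[1:3, 1:3, 0:3] = True
print(round(dice_similarity(s1, s2), 2))   # 0.8 for these toy masks
```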

  9. Dynamic pseudos: How accurate outside their parent case?

    SciTech Connect

    Ekrann, S.; Mykkeltveit, J.

    1995-12-31

    If properly constructed, dynamic pseudos allow the parent solution from which they were derived to be exactly reproduced, in a certain well-defined sense, in a subsequent coarse grid simulation. The paper reports extensive numerical experimentation, in 1D homogeneous and heterogeneous media, to determine the performance of pseudos when used outside their parent case. The authors perturb fluid viscosities and injection rate, as well as realization. Parent solutions are produced analytically, via a generalization of the Buckley-Leverett technique, as are true solutions in off-parent cases. Capillarity is neglected in these experiments, while gravity is sometimes retained in order to force rate sensitivity.

  10. Reproducibility of intensity-based estimates of lung ventilation

    PubMed Central

    Du, Kaifang; Bayouth, John E.; Ding, Kai; Christensen, Gary E.; Cao, Kunlin; Reinhardt, Joseph M.

    2013-01-01

    Purpose: Lung function depends on lung expansion and contraction during the respiratory cycle. Respiratory-gated CT imaging and image registration can be used to estimate the regional lung volume change by observing CT voxel density changes during inspiration or expiration. In this study, the authors examine the reproducibility of intensity-based estimates of lung tissue expansion and contraction in three mechanically ventilated sheep and ten spontaneously breathing humans. The intensity-based estimates are compared to the estimates of lung function derived from the image registration deformation field. Methods: A 4DCT data set was acquired for a cohort of spontaneously breathing humans and anesthetized and mechanically ventilated sheep. For each subject, two 4DCT scans were performed with a short time interval between acquisitions. From each 4DCT data set, an image pair consisting of a volume reconstructed near end inspiration and a volume reconstructed near end exhalation was selected. The end inspiration and end exhalation images were registered using a tissue volume preserving deformable registration algorithm. The CT density change in the registered image pair was used to compute the intensity-based specific air volume change (SAC) and the intensity-based Jacobian (IJAC), while the transformation-based Jacobian (TJAC) was computed directly from the image registration deformation field. IJAC is introduced to make the intensity-based and transformation-based methods comparable since SAC and Jacobian may not be associated with the same physiological phenomenon and have different units. Scan-to-scan variations in respiratory effort were corrected using a global scaling factor for normalization. A gamma index metric was introduced to quantify voxel-by-voxel reproducibility considering both differences in ventilation and distance between matching voxels. The authors also tested how different CT prefiltering levels affected intensity-based ventilation reproducibility. Results
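
    The transformation-based Jacobian mentioned above is the determinant of the Jacobian of the deformation, evaluated voxel by voxel; values above 1 indicate local expansion and values below 1 indicate contraction. A hedged sketch of that computation using finite differences on a synthetic displacement field (not the authors' registration pipeline):

        # Voxel-wise Jacobian determinant of a 3D displacement field via finite differences.
        # Sketch of the transformation-based ventilation measure; the displacement field
        # below is synthetic, not image-registration output.
        import numpy as np

        def jacobian_determinant(disp, spacing=(1.0, 1.0, 1.0)):
            """disp: array (3, nz, ny, nx) of displacements along z, y, x (same units as spacing)."""
            grads = np.empty((3, 3) + disp.shape[1:])
            for i in range(3):          # displacement component
                for j in range(3):      # direction of differentiation
                    grads[i, j] = np.gradient(disp[i], spacing[j], axis=j)
            jac = grads + np.eye(3).reshape(3, 3, 1, 1, 1)  # F = I + grad(u)
            # determinant of the 3x3 matrix at every voxel
            return np.linalg.det(np.moveaxis(jac, (0, 1), (-2, -1)))

        # Synthetic field: uniform 5% expansion along z and nothing else -> J ~ 1.05.
        shape = (20, 20, 20)
        disp = np.zeros((3,) + shape)
        z = np.arange(shape[0]).reshape(-1, 1, 1)
        disp[0] = 0.05 * z

        print(np.round(jacobian_determinant(disp).mean(), 3))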

  11. A reproducible approach to high-throughput biological data acquisition and integration

    PubMed Central

    Rahnavard, Gholamali; Waldron, Levi; McIver, Lauren; Shafquat, Afrah; Franzosa, Eric A.; Miropolsky, Larissa; Sweeney, Christopher

    2015-01-01

    Modern biological research requires rapid, complex, and reproducible integration of multiple experimental results generated both internally and externally (e.g., from public repositories). Although large systematic meta-analyses are among the most effective approaches both for clinical biomarker discovery and for computational inference of biomolecular mechanisms, identifying, acquiring, and integrating relevant experimental results from multiple sources for a given study can be time-consuming and error-prone. To enable efficient and reproducible integration of diverse experimental results, we developed a novel approach for standardized acquisition and analysis of high-throughput and heterogeneous biological data. This allowed, first, novel biomolecular network reconstruction in human prostate cancer, which correctly recovered and extended the NFκB signaling pathway. Next, we investigated host-microbiome interactions. In less than an hour of analysis time, the system retrieved data and integrated six germ-free murine intestinal gene expression datasets to identify the genes most influenced by the gut microbiota, which comprised a set of immune-response and carbohydrate metabolism processes. Finally, we constructed integrated functional interaction networks to compare connectivity of peptide secretion pathways in the model organisms Escherichia coli, Bacillus subtilis, and Pseudomonas aeruginosa. PMID:26157642

  12. The general AMBER force field (GAFF) can accurately predict thermodynamic and transport properties of many ionic liquids.

    PubMed

    Sprenger, K G; Jaeger, Vance W; Pfaendtner, Jim

    2015-05-01

    We have applied molecular dynamics to calculate thermodynamic and transport properties of a set of 19 room-temperature ionic liquids. Since accurately simulating the thermophysical properties of solvents strongly depends upon the force field of choice, we tested the accuracy of the general AMBER force field, without refinement, for the case of ionic liquids. Electrostatic point charges were developed using ab initio calculations and a charge scaling factor of 0.8 to more accurately predict dynamic properties. The density, heat capacity, molar enthalpy of vaporization, self-diffusivity, and shear viscosity of the ionic liquids were computed and compared to experimentally available data, and good agreement across a wide range of cation and anion types was observed. Results show that, for a wide range of ionic liquids, the general AMBER force field, with no tuning of parameters, can reproduce a variety of thermodynamic and transport properties with similar accuracy to that of other published, often IL-specific, force fields. PMID:25853313

  13. Accurate Determination of Conformational Transitions in Oligomeric Membrane Proteins

    PubMed Central

    Sanz-Hernández, Máximo; Vostrikov, Vitaly V.; Veglia, Gianluigi; De Simone, Alfonso

    2016-01-01

    The structural dynamics governing collective motions in oligomeric membrane proteins play key roles in vital biomolecular processes at cellular membranes. In this study, we present a structural refinement approach that combines solid-state NMR experiments and molecular simulations to accurately describe concerted conformational transitions identifying the overall structural, dynamical, and topological states of oligomeric membrane proteins. The accuracy of the structural ensembles generated with this method is shown to reach the statistical error limit, and is further demonstrated by correctly reproducing orthogonal NMR data. We demonstrate the accuracy of this approach by characterising the pentameric state of phospholamban, a key player in the regulation of calcium uptake in the sarcoplasmic reticulum, and by probing its dynamical activation upon phosphorylation. Our results underline the importance of using an ensemble approach to characterise the conformational transitions that are often responsible for the biological function of oligomeric membrane protein states. PMID:26975211

  14. Direct computation of parameters for accurate polarizable force fields

    SciTech Connect

    Verstraelen, Toon; Vandenbrande, Steven; Ayers, Paul W.

    2014-11-21

    We present an improved electronic linear response model to incorporate polarization and charge-transfer effects in polarizable force fields. This model is a generalization of the Atom-Condensed Kohn-Sham Density Functional Theory (DFT), approximated to second order (ACKS2): it can now be defined with any underlying variational theory (next to KS-DFT) and it can include atomic multipoles and off-center basis functions. Parameters in this model are computed efficiently as expectation values of an electronic wavefunction, obviating the need for their calibration, regularization, and manual tuning. In the limit of a complete density and potential basis set in the ACKS2 model, the linear response properties of the underlying theory for a given molecular geometry are reproduced exactly. A numerical validation with a test set of 110 molecules shows that very accurate models can already be obtained with fluctuating charges and dipoles. These features greatly facilitate the development of polarizable force fields.

  15. On the importance of recrystallization to reproduce the Taylor impact specimen shape of a pure nickel

    NASA Astrophysics Data System (ADS)

    Couque, Hervé

    2015-09-01

    Taylor tests are a means to investigate the dynamic plastic and failure behaviour of metals under compression. By taking into account the strengthening occurring at high strain rates, the final Taylor diameter of a pure nickel specimen impacted at 453 m/s has been numerically reproduced to within 13%. Through post-mortem observations of the specimen impacted at 453 m/s, a recrystallization process was found to occur, resulting in a softening of the pure nickel. Subsequent numerical simulations taking this softening into account were found to reduce the difference between the experimental and numerical diameters by 10%.

  16. Automated, Reproducible, Titania-Based Phosphopeptide Enrichment Strategy for Label-Free Quantitative Phosphoproteomics

    PubMed Central

    Richardson, Brenna McJury; Soderblom, Erik J.; Thompson, J. Will; Moseley, M. Arthur

    2013-01-01

    An automated phosphopeptide enrichment strategy is described using titanium dioxide (TiO2)-packed, fused silica capillaries for use with liquid chromatography (LC)-mass spectrometry (MS)/MS-based, label-free proteomics workflows. To correlate an optimum peptide:TiO2 loading ratio between different particle types, the ratio of phenyl phosphate-binding capacities was used. The optimum loading for the column was then verified through replicate enrichments of a range of quantities of digested rat brain tissue cell lysate. Fractions were taken during sample loading, multiple wash steps, and the elution steps and analyzed by LC-MS/MS to gauge the efficiency and reproducibility of the enrichment. Greater than 96% of the total phosphopeptides were detected in the elution fractions, indicating efficient trapping of the phosphopeptides on the first pass of enrichment. The quantitative reproducibility of the automated setup was also improved greatly with phosphopeptide intensities from replicate enrichments exhibiting a median coefficient of variation (CV) of 5.8%, and 80% of the identified phosphopeptides had CVs below 11.1%, while maintaining >85% specificity. By providing this high degree of analytical reproducibility, this method allows for label-free phosphoproteomics over large sample sets with complex experimental designs (multiple biological conditions, multiple biological replicates, multiple time-points, etc.), including large-scale clinical cohorts. PMID:23542237

  17. Assessing the relative effectiveness of statistical downscaling and distribution mapping in reproducing rainfall statistics based on climate model results

    NASA Astrophysics Data System (ADS)

    Langousis, Andreas; Mamalakis, Antonios; Deidda, Roberto; Marrocu, Marino

    2016-01-01

    To improve the skill of climate models (CMs) in reproducing the statistics of daily rainfall at a basin level, two types of statistical approaches have been suggested. One is statistical correction of CM rainfall outputs based on historical series of precipitation. The other, usually referred to as statistical rainfall downscaling, is the use of stochastic models to conditionally simulate rainfall series, based on large-scale atmospheric forcing from CMs. While promising, the latter approach has attracted less attention in recent years, since the developed downscaling schemes involved complex weather identification procedures, while demonstrating limited success in reproducing several statistical features of rainfall. In a recent effort, Langousis and Kaleris () developed a statistical framework for simulation of daily rainfall intensities conditional on upper-air variables, which is simpler to implement and more accurately reproduces several statistical properties of actual rainfall records. Here we study the relative performance of: (a) direct statistical correction of CM rainfall outputs using nonparametric distribution mapping, and (b) the statistical downscaling scheme of Langousis and Kaleris (), in reproducing the historical rainfall statistics, including rainfall extremes, at a regional level. This is done for an intermediate-sized catchment in Italy, i.e., the Flumendosa catchment, using rainfall and atmospheric data from four CMs of the ENSEMBLES project. The obtained results are promising, since the proposed downscaling scheme is more accurate and robust in reproducing a number of historical rainfall statistics, independent of the CM used and the characteristics of the calibration period. This is particularly the case for yearly rainfall maxima.
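
    Approach (a), direct statistical correction by nonparametric distribution mapping, amounts to quantile mapping: each model value is passed through the empirical CDF of the historical model rainfall and mapped to the observed value at the same non-exceedance probability. A minimal sketch with synthetic data, ignoring the dry-day handling and seasonal stratification a real application would need:

        # Nonparametric quantile (distribution) mapping of climate-model rainfall to observations.
        # Sketch only: synthetic data, no dry-day treatment or seasonal stratification.
        import numpy as np

        def quantile_map(model_hist, obs_hist, model_new):
            """Map model_new through the empirical CDF of model_hist onto the quantiles of obs_hist."""
            quantiles = np.linspace(0.0, 1.0, 101)
            model_q = np.quantile(model_hist, quantiles)
            obs_q = np.quantile(obs_hist, quantiles)
            # Non-exceedance probability of each new model value under the historical model
            # distribution, then the observed value at that same probability.
            cdf_vals = np.interp(model_new, model_q, quantiles)
            return np.interp(cdf_vals, quantiles, obs_q)

        rng = np.random.default_rng(1)
        obs_hist = rng.gamma(shape=0.8, scale=12.0, size=5000)     # "observed" daily rainfall (mm)
        model_hist = rng.gamma(shape=0.8, scale=8.0, size=5000)    # biased model rainfall
        model_future = rng.gamma(shape=0.8, scale=8.8, size=5000)  # model projection

        corrected = quantile_map(model_hist, obs_hist, model_future)
        print(f"raw model mean = {model_future.mean():.1f} mm")
        print(f"corrected mean = {corrected.mean():.1f} mm (obs mean {obs_hist.mean():.1f} mm)")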

  18. Properties of galaxies reproduced by a hydrodynamic simulation.

    PubMed

    Vogelsberger, M; Genel, S; Springel, V; Torrey, P; Sijacki, D; Xu, D; Snyder, G; Bird, S; Nelson, D; Hernquist, L

    2014-05-01

    Previous simulations of the growth of cosmic structures have broadly reproduced the 'cosmic web' of galaxies that we see in the Universe, but failed to create a mixed population of elliptical and spiral galaxies, because of numerical inaccuracies and incomplete physical models. Moreover, they were unable to track the small-scale evolution of gas and stars to the present epoch within a representative portion of the Universe. Here we report a simulation that starts 12 million years after the Big Bang, and traces 13 billion years of cosmic evolution with 12 billion resolution elements in a cube of 106.5 megaparsecs a side. It yields a reasonable population of ellipticals and spirals, reproduces the observed distribution of galaxies in clusters and characteristics of hydrogen on large scales, and at the same time matches the 'metal' and hydrogen content of galaxies on small scales. PMID:24805343

  19. Managing risks in drug discovery: reproducibility of published findings.

    PubMed

    Kannt, Aimo; Wieland, Thomas

    2016-04-01

    In spite of tremendous advances in biopharmaceutical science and technology, the productivity of pharmaceutical research and development has been steadily declining over the last decades. The reasons for this decline are manifold and range from improved standard of care that is more and more difficult to top to inappropriate management of technical and translational risks along the R&D value chain. In this short review, major types of risks in biopharmaceutical R&D and means to address them will be described. A special focus will be on a risk, i.e., the lack of reproducibility of published information, that has so far not been fully appreciated and systematically analyzed. Measures to improve reproducibility and trust in published information will be discussed. PMID:26883784

  20. An exploration of graph metric reproducibility in complex brain networks

    PubMed Central

    Telesford, Qawi K.; Burdette, Jonathan H.; Laurienti, Paul J.

    2013-01-01

    The application of graph theory to brain networks has become increasingly popular in the neuroimaging community. These investigations and analyses have led to a greater understanding of the brain's complex organization. More importantly, it has become a useful tool for studying the brain under various states and conditions. With the ever expanding popularity of network science in the neuroimaging community, there is increasing interest to validate the measurements and calculations derived from brain networks. Underpinning these studies is the desire to use brain networks in longitudinal studies or as clinical biomarkers to understand changes in the brain. A highly reproducible tool for brain imaging could potentially prove useful as a clinical tool. In this review, we examine recent studies in network reproducibility and their implications for analysis of brain networks. PMID:23717257

  1. Longitudinal study of free running exercise challenge: reproducibility.

    PubMed Central

    Powell, C V; White, R D; Primhak, R A

    1996-01-01

    The reproducibility of free running exercise challenge has been examined in an unselected population of 8-10 year olds. Using a standardised protocol, monthly exercise tests were performed on 143 children over one year. A positive test was defined using both a 15% and 20% fall in peak expiratory flow after exercise. The mean (95% confidence interval, CI) population frequency for a positive test at 15% fall was 14.9% (6.5 to 23.3) and coefficient of variation 24.6%. For a 20% fall, the mean (95% CI) population frequency was 7.9% (2.9 to 12.9) and coefficient of variation 27.8%. Seventy two (50.3%) of the children gave at least one positive response at 15% fall. Exercise testing is not reproducible in the community setting and should not be used as a screening test. Exercise data from epidemiological studies of asthma should be interpreted with caution. PMID:8660071

  2. Reproducibility of the Tronzo and AO classifications for transtrochanteric fractures☆

    PubMed Central

    Mattos, Carlos Augusto; Jesus, Alexandre Atsushi Koza; Floter, Michelle dos Santos; Nunes, Luccas Franco Bettencourt; Sanches, Bárbara de Baptista; Zabeu, José Luís Amim

    2015-01-01

    Objective To analyze the reproducibility of the Tronzo and AO classifications for transtrochanteric fractures. Method This was a cross-sectional study in which the intraobserver and interobserver concordance between two readings made by 11 observers was analyzed. The analysis of the variations used the kappa statistical method. Results Moderate concordance was found in relation to the AO classification, while slight concordance was found for the Tronzo classification. Conclusion This study found that the AO/ASIF classification for transtrochanteric fractures presented greater intra- and interobserver reproducibility and that greater concordance was correlated with greater experience of the observers. Without division into subgroups, the AO/ASIF classification was shown, as described in the literature, to be acceptable for clinical use in relation to transtrochanteric fractures of the femur, although it did not show absolute concordance, given that its concordance level was only moderate. Nonetheless, its concordance was better than that of the Tronzo classification. PMID:26535193
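
    Cohen's kappa, used above to quantify concordance, corrects observed agreement for agreement expected by chance: kappa = (p_o - p_e) / (1 - p_e). A minimal sketch for two observers classifying the same fractures (the labels are illustrative, not study data):

        # Cohen's kappa for two observers classifying the same set of fractures (toy labels).
        from collections import Counter

        def cohens_kappa(rater1, rater2):
            """kappa = (p_o - p_e) / (1 - p_e) for two equal-length label sequences."""
            n = len(rater1)
            p_o = sum(a == b for a, b in zip(rater1, rater2)) / n     # observed agreement
            c1, c2 = Counter(rater1), Counter(rater2)
            labels = set(c1) | set(c2)
            p_e = sum((c1[l] / n) * (c2[l] / n) for l in labels)      # chance agreement
            return (p_o - p_e) / (1 - p_e)

        obs_a = ["A1", "A2", "A2", "A3", "A1", "A2", "A3", "A1", "A2", "A2"]
        obs_b = ["A1", "A2", "A3", "A3", "A1", "A2", "A2", "A1", "A2", "A2"]
        print(f"kappa = {cohens_kappa(obs_a, obs_b):.2f}")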

  3. Pressure Stabilizer for Reproducible Picoinjection in Droplet Microfluidic Systems

    PubMed Central

    Rhee, Minsoung; Light, Yooli K.; Yilmaz, Suzan; Adams, Paul D.; Saxena, Deepak

    2014-01-01

    Picoinjection is a promising technique to add reagents into pre-formed emulsion droplets on chip; however, it is sensitive to pressure fluctuation, making stable operation of the picoinjector challenging. We present a chip architecture using a simple pressure stabilizer for consistent and highly reproducible picoinjection in multi-step biochemical assays with droplets. Incorporation of the stabilizer immediately upstream of a picoinjector or a combination of injectors greatly reduces pressure fluctuations enabling reproducible and effective picoinjection in systems where the pressure varies actively during operation. We demonstrate the effectiveness of the pressure stabilizer for an integrated platform for on-demand encapsulation of bacterial cells followed by picoinjection of reagents for lysing the encapsulated cells. The pressure stabilizer was also used for picoinjection of multiple displacement amplification (MDA) reagents to achieve genomic DNA amplification of lysed bacterial cells. PMID:25270338

  4. MASSIVE DATA, THE DIGITIZATION OF SCIENCE, AND REPRODUCIBILITY OF RESULTS

    ScienceCinema

    None

    2011-10-06

    As the scientific enterprise becomes increasingly computational and data-driven, the nature of the information communicated must change. Without inclusion of the code and data with published computational results, we are engendering a credibility crisis in science. Controversies such as ClimateGate, the microarray-based drug sensitivity clinical trials under investigation at Duke University, and retractions from prominent journals due to unverified code suggest the need for greater transparency in our computational science. In this talk I argue that the scientific method be restored to (1) a focus on error control as central to scientific communication and (2) complete communication of the underlying methodology producing the results, i.e., reproducibility. I outline barriers to these goals based on recent survey work (Stodden 2010), and suggest solutions such as the "Reproducible Research Standard" (Stodden 2009), giving open licensing options designed to create an intellectual property framework for scientists consonant with longstanding scientific norms.

  5. Accurate Development of Thermal Neutron Scattering Cross Section Libraries

    SciTech Connect

    Hawari, Ayman; Dunn, Michael

    2014-06-10

    The objective of this project is to develop a holistic (fundamental and accurate) approach for generating thermal neutron scattering cross section libraries for a collection of important neutron moderators and reflectors. The primary components of this approach are the physical accuracy and completeness of the generated data libraries. Consequently, for the first time, thermal neutron scattering cross section data libraries will be generated that are based on accurate theoretical models, that are carefully benchmarked against experimental and computational data, and that contain complete covariance information that can be used in propagating the data uncertainties through the various components of the nuclear design and execution process. To achieve this objective, computational and experimental investigations will be performed on a carefully selected subset of materials that play a key role in all stages of the nuclear fuel cycle.

  6. Intersubject variability and reproducibility of 15O PET studies.

    PubMed

    Coles, Jonathan P; Fryer, Tim D; Bradley, Peter G; Nortje, Jurgens; Smielewski, Peter; Rice, Kenneth; Clark, John C; Pickard, John D; Menon, David K

    2006-01-01

    Oxygen-15 positron emission tomography (15O PET) can provide important data regarding patients with head injury. We provide reference data on intersubject variability and reproducibility of cerebral blood flow (CBF), cerebral blood volume (CBV), cerebral metabolism (CMRO2) and oxygen extraction fraction (OEF) in patients and healthy controls, and explored alternative ways of assessing reproducibility within the context of a single PET study. In addition, we used independent measurements of CBF and CMRO2 to investigate the effect of mathematical correlation on the relationship between flow and metabolism. In patients, intersubject coefficients of variation (CoV) for CBF, CMRO2 and OEF were larger than in controls (32.9% ± 2.2%, 23.2% ± 2.0% and 22.5% ± 3.4% versus 13.5% ± 1.4%, 12.8% ± 1.1% and 7.3% ± 1.2%), while CoV for CBV were lower (15.2% ± 2.1% versus 22.5% ± 2.8%) (P<0.001). The CoV for the test-retest reproducibility of CBF, CBV, CMRO2 and OEF in patients were 2.1% ± 1.5%, 3.8% ± 3.0%, 3.7% ± 3.0% and 4.6% ± 3.5%, respectively. These were much lower than the intersubject CoV figures, and were similar to alternative measures of reproducibility obtained by fractionating data from a single study. The physiological relationship between flow and metabolism was preserved even when mathematically independent measures were used for analysis. These data provide a context for the design and interpretation of interventional PET studies. While ideally each centre should develop its own bank of such data, the figures provided will allow initial generic approximations of sample size for such studies. PMID:15988475

  7. Improved reproducibility by assuring confidence in measurements in biomedical research.

    PubMed

    Plant, Anne L; Locascio, Laurie E; May, Willie E; Gallagher, Patrick D

    2014-09-01

    ‘Irreproducibility’ is symptomatic of a broader challenge in measurement in biomedical research. From the US National Institute of Standards and Technology (NIST) perspective of rigorous metrology, reproducibility is only one aspect of establishing confidence in measurements. Appropriate controls, reference materials, statistics and informatics are required for a robust measurement process. Research is required to establish these tools for biological measurements, which will lead to greater confidence in research results. PMID:25166868

  8. Ab initio molecular dynamics of liquid water using embedded-fragment second-order many-body perturbation theory towards its accurate property prediction.

    PubMed

    Willow, Soohaeng Yoo; Salim, Michael A; Kim, Kwang S; Hirata, So

    2015-01-01

    A direct, simultaneous calculation of properties of a liquid using an ab initio electron-correlated theory has long been unthinkable. Here we present structural, dynamical, and response properties of liquid water calculated by ab initio molecular dynamics using the embedded-fragment spin-component-scaled second-order many-body perturbation method with the aug-cc-pVDZ basis set. This level of theory is chosen as it accurately and inexpensively reproduces the water dimer potential energy surface from the coupled-cluster singles, doubles, and noniterative triples with the aug-cc-pVQZ basis set, which is nearly exact. The calculated radial distribution function, self-diffusion coefficient, coordination number, and dipole moment, as well as the infrared and Raman spectra are in excellent agreement with experimental results. The shapes and widths of the OH stretching bands in the infrared and Raman spectra and their isotropic-anisotropic Raman noncoincidence, which reflect the diverse local hydrogen-bond environment, are also reproduced computationally. The simulation also reveals intriguing dynamic features of the environment, which are difficult to probe experimentally, such as a surprisingly large fluctuation in the coordination number and the detailed mechanism by which the hydrogen donating water molecules move across the first and second shells, thereby causing this fluctuation. PMID:26400690

  9. Ab initio molecular dynamics of liquid water using embedded-fragment second-order many-body perturbation theory towards its accurate property prediction

    PubMed Central

    Willow, Soohaeng Yoo; Salim, Michael A.; Kim, Kwang S.; Hirata, So

    2015-01-01

    A direct, simultaneous calculation of properties of a liquid using an ab initio electron-correlated theory has long been unthinkable. Here we present structural, dynamical, and response properties of liquid water calculated by ab initio molecular dynamics using the embedded-fragment spin-component-scaled second-order many-body perturbation method with the aug-cc-pVDZ basis set. This level of theory is chosen as it accurately and inexpensively reproduces the water dimer potential energy surface from the coupled-cluster singles, doubles, and noniterative triples with the aug-cc-pVQZ basis set, which is nearly exact. The calculated radial distribution function, self-diffusion coefficient, coordination number, and dipole moment, as well as the infrared and Raman spectra are in excellent agreement with experimental results. The shapes and widths of the OH stretching bands in the infrared and Raman spectra and their isotropic-anisotropic Raman noncoincidence, which reflect the diverse local hydrogen-bond environment, are also reproduced computationally. The simulation also reveals intriguing dynamic features of the environment, which are difficult to probe experimentally, such as a surprisingly large fluctuation in the coordination number and the detailed mechanism by which the hydrogen donating water molecules move across the first and second shells, thereby causing this fluctuation. PMID:26400690

  10. Git can facilitate greater reproducibility and increased transparency in science

    PubMed Central

    2013-01-01

    Background Reproducibility is the hallmark of good science. Maintaining a high degree of transparency in scientific reporting is essential not just for gaining trust and credibility within the scientific community but also for facilitating the development of new ideas. Sharing data and computer code associated with publications is becoming increasingly common, motivated partly in response to data deposition requirements from journals and mandates from funders. Despite this increase in transparency, it is still difficult to reproduce or build upon the findings of most scientific publications without access to a more complete workflow. Findings Version control systems (VCS), which have long been used to maintain code repositories in the software industry, are now finding new applications in science. One such open source VCS, Git, provides a lightweight yet robust framework that is ideal for managing the full suite of research outputs such as datasets, statistical code, figures, lab notes, and manuscripts. For individual researchers, Git provides a powerful way to track and compare versions, retrace errors, explore new approaches in a structured manner, while maintaining a full audit trail. For larger collaborative efforts, Git and Git hosting services make it possible for everyone to work asynchronously and merge their contributions at any time, all the while maintaining a complete authorship trail. In this paper I provide an overview of Git along with use-cases that highlight how this tool can be leveraged to make science more reproducible and transparent, foster new collaborations, and support novel uses. PMID:23448176
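
    As a concrete illustration of the workflow described above, the sketch below places a small analysis script and a data file under Git version control from Python. It assumes the git command-line tool is installed, that user.name and user.email are configured, and that it is run in a fresh directory; the file names are hypothetical.

        # Minimal sketch: put a small research workflow under Git version control from Python.
        # Equivalent to running the git commands in a shell; run in a fresh directory.
        import subprocess
        from pathlib import Path

        repo = Path("demo-analysis")
        repo.mkdir(exist_ok=True)
        (repo / "analysis.py").write_text("print('hello, reproducible world')\n")
        (repo / "data.csv").write_text("x,y\n1,2\n3,4\n")

        def git(*args):
            # check=True raises if git reports an error (e.g. missing user.name/email).
            subprocess.run(["git", *args], cwd=repo, check=True)

        git("init")
        git("add", "analysis.py", "data.csv")
        git("commit", "-m", "Initial commit: script and raw data")
        git("log", "--oneline")  # full audit trail of the tracked research outputs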

  11. Mechanostructure and composition of highly reproducible decellularized liver matrices.

    PubMed

    Mattei, G; Di Patria, V; Tirella, A; Alaimo, A; Elia, G; Corti, A; Paolicchi, A; Ahluwalia, A

    2014-02-01

    Despite the increasing number of papers on decellularized scaffolds, there is little consensus on the optimum method of decellularizing biological tissue such that the micro-architecture and protein content of the matrix are conserved as far as possible. Focusing on the liver, the aim of this study was therefore to develop a method for the production of well-characterized and reproducible matrices that best preserves the structure and composition of the native extracellular matrix (ECM). Given the importance of matrix stiffness in regulating cell response, the mechanical properties of the decellularized tissue were also considered. The testing and analysis framework is based on the characterization of decellularized and untreated samples in the same reproducible initial state (i.e., the equilibrium swollen state). Decellularized ECM (dECM) were characterized using biochemical, histological, mechanical and structural analyses to identify the best procedure to ensure complete cell removal while preserving most of the native ECM structure and composition. Using this method, sterile decellularized porcine ECM with highly conserved intra-lobular micro-structure and protein content were obtained in a consistent and reproducible manner using the equilibrium swollen state of tissue or matrix as a reference. A significant reduction in the compressive elastic modulus was observed for liver dECM with respect to native tissue, suggesting a re-examination of design parameters for ECM-mimicking scaffolds for engineering tissues in vitro. PMID:24184179

  12. Reproducibility of SELDI Spectra Across Time and Laboratories

    PubMed Central

    Diao, Lixia; Clarke, Charlotte H.; Coombes, Kevin R.; Hamilton, Stanley R.; Roth, Jack; Mao, Li; Czerniak, Bogdan; Baggerly, Keith A.; Morris, Jeffrey S.; Fung, Eric T.; Bast, Robert C.

    2011-01-01

    The reproducibility of mass spectrometry (MS) data collected using surface enhanced laser desorption/ionization-time of flight (SELDI-TOF) has been questioned. This investigation was designed to test the reproducibility of SELDI data collected over time by multiple users and instruments. Five laboratories prepared arrays once every week for six weeks. Spectra were collected on separate instruments in the individual laboratories. Additionally, all of the arrays produced each week were rescanned on a single instrument in one laboratory. Lab-to-lab and array-to-array variability in alignment parameters were larger than the variability attributable to running samples during different weeks. The coefficient of variation (CV) in spectrum intensity ranged from 25% at baseline, to 80% in the matrix noise region, to about 50% during the exponential drop from the maximum matrix noise. Before normalization, the median CV of the peak heights was 72% and reduced to about 20% after normalization. Additionally, for the spectra from a common instrument, the CV ranged from 5% at baseline, to 50% in the matrix noise region, to 20% during the drop from the maximum matrix noise. Normalization reduced the variability in peak heights to about 18%. With proper processing methods, SELDI instruments produce spectra containing large numbers of reproducibly located peaks, with consistent heights. PMID:21552492

  13. Planar heterojunction perovskite solar cells with superior reproducibility

    NASA Astrophysics Data System (ADS)

    Jeon, Ye-Jin; Lee, Sehyun; Kang, Rira; Kim, Jueng-Eun; Yeo, Jun-Seok; Lee, Seung-Hoon; Kim, Seok-Soon; Yun, Jin-Mun; Kim, Dong-Yu

    2014-11-01

    Perovskite solar cells (PeSCs) have been considered one of the competitive next generation power sources. To date, light-to-electric conversion efficiencies have rapidly increased to over 10%, and further improvements are expected. However, the poor device reproducibility of PeSCs ascribed to their inhomogeneously covered film morphology has hindered their practical application. Here, we demonstrate high-performance PeSCs with superior reproducibility by introducing small amounts of N-cyclohexyl-2-pyrrolidone (CHP) as a morphology controller into N,N-dimethylformamide (DMF). As a result, highly homogeneous film morphology, similar to that achieved by vacuum-deposition methods, as well as a high PCE of 10% and an extremely small performance deviation within 0.14% were achieved. This study represents a method for realizing efficient and reproducible planar heterojunction (PHJ) PeSCs through morphology control, taking a major step forward in the low-cost and rapid production of PeSCs by solving one of the biggest problems of PHJ perovskite photovoltaic technology through a facile method.

  14. Interrater reproducibility of clinical tests for rotator cuff lesions

    PubMed Central

    Ostor, A; Richards, C; Prevost, A; Hazleman, B; Speed, C

    2004-01-01

    Background: Rotator cuff lesions are common in the community but reproducibility of tests for shoulder assessment has not been adequately appraised and there is no uniform approach to their use. Objective: To study interrater reproducibility of standard tests for shoulder evaluation among a rheumatology specialist, rheumatology trainee, and research nurse. Methods: 136 patients were reviewed over 12 months at a major teaching hospital. The three assessors examined each patient in random order and were unaware of each other's evaluation. Each shoulder was examined in a standard manner by recognised tests for specific lesions and a diagnostic algorithm was used. Between-observer agreement was determined by calculating Cohen's κ coefficients (measuring agreement beyond that expected by chance). Results: Fair to substantial agreement was obtained for the observations of tenderness, painful arc, and external rotation. Tests for supraspinatus and subscapularis also showed at least fair agreement between observers. 40/55 (73%) κ coefficient assessments were rated at >0.2, indicating at least fair concordance between observers; 21/55 (38%) were rated at >0.4, indicating at least moderate concordance between observers. Conclusion: The reproducibility of certain tests, employed by observers of varying experience, in the assessment of the rotator cuff and general shoulder disease was determined. This has implications for delegation of shoulder assessment to nurse specialists, the development of a simplified evaluation schedule for general practitioners, and uniformity in epidemiological research studies. PMID:15361389

  15. Dosimetric Algorithm to Reproduce Isodose Curves Obtained from a LINAC

    PubMed Central

    Estrada Espinosa, Julio Cesar; Martínez Ovalle, Segundo Agustín; Pereira Benavides, Cinthia Kotzian

    2014-01-01

    In this work, isodose curves are obtained with a new dosimetric algorithm that uses numerical data from the percentage depth dose (PDD) and the maximum absorbed dose profile, calculated by Monte Carlo for an 18 MV LINAC. The software reproduces the absorbed dose percentage throughout the irradiated volume quickly and with a good approximation. To validate the results, the full geometry of an 18 MV LINAC and a water phantom were modelled; the simulations were run with the MCNPX code to obtain the PDD and the profiles at all depths of the radiation beam. These data were then used by the code to produce the dose percentage at any point of the irradiated volume. The absorbed dose was reproduced at any point of the irradiated volume for any voxel size, even when the voxels are reduced to the size of a pixel. The dosimetric algorithm is able to reproduce the absorbed dose delivered by a radiation beam to a water phantom from the PDD and profiles, whose maximum percentage value lies in the build-up region. Calculation time for the algorithm is only a few seconds, compared with the days required when the calculation is carried out by Monte Carlo. PMID:25045398
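
    A common way to turn a PDD table and an off-axis profile into a relative dose at an arbitrary point is the separable approximation D(d, x) ≈ PDD(d) × OAR(x) / 100, interpolating both tables. The sketch below illustrates that generic idea with made-up table values; it is not necessarily the exact scheme of the paper's Monte Carlo-derived algorithm:

        # Separable reconstruction of relative dose from a PDD curve and an off-axis profile.
        # Generic sketch with illustrative table values, not the paper's algorithm.
        import numpy as np

        depths_cm = np.array([0.0, 1.0, 3.2, 5.0, 10.0, 20.0, 30.0])      # 18 MV-like PDD
        pdd_pct   = np.array([30.0, 75.0, 100.0, 97.0, 81.0, 55.0, 37.0])

        offaxis_cm  = np.array([-7.0, -5.0, -4.0, 0.0, 4.0, 5.0, 7.0])    # 10x10 cm field profile
        profile_pct = np.array([3.0, 50.0, 98.0, 100.0, 98.0, 50.0, 3.0])

        def relative_dose(depth_cm, x_cm):
            """Percent of maximum dose at (depth, off-axis distance), by 1D interpolation."""
            pdd = np.interp(depth_cm, depths_cm, pdd_pct)
            oar = np.interp(x_cm, offaxis_cm, profile_pct)
            return pdd * oar / 100.0

        for d, x in [(3.2, 0.0), (10.0, 0.0), (10.0, 4.0), (10.0, 6.0)]:
            print(f"depth {d:5.1f} cm, off-axis {x:4.1f} cm -> {relative_dose(d, x):5.1f} %")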

  16. Dosimetric algorithm to reproduce isodose curves obtained from a LINAC.

    PubMed

    Estrada Espinosa, Julio Cesar; Martínez Ovalle, Segundo Agustín; Pereira Benavides, Cinthia Kotzian

    2014-01-01

    In this work, isodose curves are obtained with a new dosimetric algorithm that uses numerical data from the percentage depth dose (PDD) and the maximum absorbed dose profile, calculated by Monte Carlo for an 18 MV LINAC. The software reproduces the absorbed dose percentage throughout the irradiated volume quickly and with a good approximation. To validate the results, the full geometry of an 18 MV LINAC and a water phantom were modelled; the simulations were run with the MCNPX code to obtain the PDD and the profiles at all depths of the radiation beam. These data were then used by the code to produce the dose percentage at any point of the irradiated volume. The absorbed dose was reproduced at any point of the irradiated volume for any voxel size, even when the voxels are reduced to the size of a pixel. The dosimetric algorithm is able to reproduce the absorbed dose delivered by a radiation beam to a water phantom from the PDD and profiles, whose maximum percentage value lies in the build-up region. Calculation time for the algorithm is only a few seconds, compared with the days required when the calculation is carried out by Monte Carlo. PMID:25045398

  17. Planar heterojunction perovskite solar cells with superior reproducibility

    PubMed Central

    Jeon, Ye-Jin; Lee, Sehyun; Kang, Rira; Kim, Jueng-Eun; Yeo, Jun-Seok; Lee, Seung-Hoon; Kim, Seok-Soon; Yun, Jin-Mun; Kim, Dong-Yu

    2014-01-01

    Perovskite solar cells (PeSCs) have been considered one of the competitive next generation power sources. To date, light-to-electric conversion efficiencies have rapidly increased to over 10%, and further improvements are expected. However, the poor device reproducibility of PeSCs ascribed to their inhomogeneously covered film morphology has hindered their practical application. Here, we demonstrate high-performance PeSCs with superior reproducibility by introducing small amounts of N-cyclohexyl-2-pyrrolidone (CHP) as a morphology controller into N,N-dimethylformamide (DMF). As a result, highly homogeneous film morphology, similar to that achieved by vacuum-deposition methods, as well as a high PCE of 10% and an extremely small performance deviation within 0.14% were achieved. This study represents a method for realizing efficient and reproducible planar heterojunction (PHJ) PeSCs through morphology control, taking a major step forward in the low-cost and rapid production of PeSCs by solving one of the biggest problems of PHJ perovskite photovoltaic technology through a facile method. PMID:25377945

  18. Reproducing Kernels in Harmonic Spaces and Their Numerical Implementation

    NASA Astrophysics Data System (ADS)

    Nesvadba, Otakar

    2010-05-01

    In harmonic analysis, such as the modelling of the Earth's gravity field, the importance of a Hilbert space of harmonic functions with a reproducing kernel is often discussed. Moreover, in the case of an unbounded domain given by the exterior of a sphere or an ellipsoid, the reproducing kernel K(x,y) can be expressed analytically by closed formulas or by infinite series. Nevertheless, the straightforward numerical implementation of these formulas leads to a number of problems, mostly connected with floating-point arithmetic and number representation. The contribution discusses numerical instabilities in K(x,y) and grad K(x,y) that can be overcome by employing elementary functions, in particular expm1 and log1p. The suggested evaluation scheme for reproducing kernels offers uniform formulas within the whole solution domain as well as superior speed and near-perfect accuracy (10^-16 for IEC 60559 double-precision numbers) compared with the straightforward formulas. The formulas can be easily implemented on the majority of computer platforms, especially when the C standard library ISO/IEC 9899:1999 is available.
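
    The instabilities mentioned above arise when terms of the form exp(x) - 1 or log(1 + x) are evaluated naively for small x, where catastrophic cancellation destroys the significant digits; expm1 and log1p evaluate the same quantities accurately. A small Python illustration of the effect (the kernel formulas themselves are not reproduced here):

        # Why expm1/log1p matter: naive exp(x)-1 and log(1+x) lose precision for small x,
        # while the dedicated functions stay accurate to roughly machine epsilon.
        import math

        for x in (1e-5, 1e-10, 1e-15):
            naive_exp = math.exp(x) - 1.0
            good_exp = math.expm1(x)
            naive_log = math.log(1.0 + x)
            good_log = math.log1p(x)
            print(f"x = {x:.0e}")
            print(f"  exp(x)-1 : naive {naive_exp:.16e}   expm1 {good_exp:.16e}")
            print(f"  log(1+x) : naive {naive_log:.16e}   log1p {good_log:.16e}")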

  19. Accurate theoretical chemistry with coupled pair models.

    PubMed

    Neese, Frank; Hansen, Andreas; Wennmohs, Frank; Grimme, Stefan

    2009-05-19

    Quantum chemistry has found its way into the everyday work of many experimental chemists. Calculations can predict the outcome of chemical reactions, afford insight into reaction mechanisms, and be used to interpret structure and bonding in molecules. Thus, contemporary theory offers tremendous opportunities in experimental chemical research. However, even with present-day computers and algorithms, we cannot solve the many-particle Schrödinger equation exactly; inevitably some error is introduced in approximating the solutions of this equation. Thus, the accuracy of quantum chemical calculations is of critical importance. The affordable accuracy depends on molecular size and particularly on the total number of atoms: for orientation, ethanol has 9 atoms, aspirin 21 atoms, morphine 40 atoms, sildenafil 63 atoms, paclitaxel 113 atoms, insulin nearly 800 atoms, and quaternary hemoglobin almost 12,000 atoms. Currently, molecules with up to approximately 10 atoms can be very accurately studied by coupled cluster (CC) theory, approximately 100 atoms with second-order Møller-Plesset perturbation theory (MP2), approximately 1000 atoms with density functional theory (DFT), and beyond that number with semiempirical quantum chemistry and force-field methods. The overwhelming majority of present-day calculations in the 100-atom range use DFT. Although these methods have been very successful in quantum chemistry, they do not offer a well-defined hierarchy of calculations that allows one to systematically converge to the correct answer. Recently a number of rather spectacular failures of DFT methods have been found, even for seemingly simple systems such as hydrocarbons, fueling renewed interest in wave function-based methods that incorporate the relevant physics of electron correlation in a more systematic way. Thus, it would be highly desirable to fill the gap between 10 and 100 atoms with highly correlated ab initio methods. We have found that one of the earliest (and now

  20. Learning the Inverse Dynamics of Robotic Manipulators in Structured Reproducing Kernel Hilbert Space.

    PubMed

    Cheng, Ching-An; Huang, Han-Pang; Hsu, Huan-Kun; Lai, Wei-Zh; Cheng, Chih-Chun

    2016-07-01

    We investigate the modeling of inverse dynamics without prior kinematic information for holonomic rigid-body robots. Despite success in compensating robot dynamics and friction, general inverse dynamics models are nontrivial. Rigid-body models are restrictive or inefficient; learning-based models are generalizable yet require large training data. The structured kernels address the dilemma by embedding the robot dynamics in reproducing kernel Hilbert space. The proposed kernels autonomously converge to rigid-body models but require fewer samples; with a semi-parametric framework that incorporates additional parametric basis for friction, the structured kernels can efficiently model general rigid-body robots. We tested the proposed scheme in simulations and experiments; the models that consider the structure of function space are more accurate. PMID:26316286
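
    For comparison with the structured kernels described above, a plain RBF kernel ridge regressor can be trained on the same inverse-dynamics learning problem. The sketch below does that for a toy one-joint system; it is only a generic baseline assuming scikit-learn and synthetic data, not the paper's method:

        # Plain RBF kernel ridge regression as a baseline for learning inverse dynamics
        # (torque from joint position/velocity/acceleration). NOT the paper's structured
        # kernel, which embeds rigid-body structure in the RKHS.
        import numpy as np
        from sklearn.kernel_ridge import KernelRidge

        rng = np.random.default_rng(0)
        n = 2000
        q, qd, qdd = (rng.uniform(-1, 1, n) for _ in range(3))

        # Toy 1-DOF "robot": tau = m*qdd + c*qd + g*sin(q) + measurement noise
        tau = 2.0 * qdd + 0.5 * qd + 9.81 * np.sin(q) + 0.05 * rng.normal(size=n)

        X = np.column_stack([q, qd, qdd])
        model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=2.0).fit(X[:1500], tau[:1500])
        pred = model.predict(X[1500:])
        rmse = np.sqrt(np.mean((pred - tau[1500:]) ** 2))
        print(f"test RMSE = {rmse:.3f} (torque units)")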

  1. Initial experience of MAGIC gels, their reproducibility and their practical application in the clinic

    NASA Astrophysics Data System (ADS)

    Barry, A.; Lewis, D. G.

    2004-01-01

    In this study we report on our initial experience with MAGIC gels as a dosimetric tool. In particular, we address the issue of the reproducibility of the gel's response to radiation by measuring the spin-spin relaxation times of gels irradiated to known doses using magnetic resonance imaging (MRI) and a conventional multi-echo CPMG pulse sequence. As a practical implementation of MAGIC gels into the clinic is required, the time to acquire images using MRI must be short. For this reason, the effect of the echo train length used in determining the spin-spin relaxation times was assessed as an initial investigation into whether alternative pulse sequences could be used to accurately measure the gel's relaxation properties.
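
    The spin-spin relaxation time is typically extracted from the multi-echo CPMG signal by fitting the monoexponential decay S(TE) = S0 exp(-TE/T2), for example as a log-linear least-squares fit. A minimal sketch on synthetic, noise-free echoes (real gel data would be noisy and may need weighting or truncation of the echo train):

        # Estimate T2 from a multi-echo CPMG decay by a log-linear least-squares fit.
        # Synthetic, noise-free echoes for illustration only.
        import numpy as np

        t2_true_ms = 250.0
        s0 = 1000.0
        echo_times_ms = np.arange(20.0, 640.0, 20.0)   # 32-echo train, 20 ms spacing
        signal = s0 * np.exp(-echo_times_ms / t2_true_ms)

        # ln S = ln S0 - TE / T2  ->  slope = -1/T2
        slope, intercept = np.polyfit(echo_times_ms, np.log(signal), 1)
        print(f"fitted T2 = {-1.0 / slope:.1f} ms, fitted S0 = {np.exp(intercept):.0f} "
              f"(true T2 = {t2_true_ms} ms)")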

  2. High-Reproducibility and High-Accuracy Method for Automated Topic Classification

    NASA Astrophysics Data System (ADS)

    Lancichinetti, Andrea; Sirer, M. Irmak; Wang, Jane X.; Acuna, Daniel; Körding, Konrad; Amaral, Luís A. Nunes

    2015-01-01

    Much of human knowledge sits in large databases of unstructured text. Leveraging this knowledge requires algorithms that extract and record metadata on unstructured text documents. Assigning topics to documents will enable intelligent searching, statistical characterization, and meaningful classification. Latent Dirichlet allocation (LDA) is the state of the art in topic modeling. Here, we perform a systematic theoretical and numerical analysis that demonstrates that current optimization techniques for LDA often yield results that are not accurate in inferring the most suitable model parameters. Adapting approaches from community detection in networks, we propose a new algorithm that displays high reproducibility and high accuracy and also has high computational efficiency. We apply it to a large set of documents in the English Wikipedia and reveal its hierarchical structure.
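
    The run-to-run variability that motivates this work is easy to observe with any off-the-shelf LDA implementation: fitting the same corpus with different random seeds can yield different topics. A minimal sketch assuming scikit-learn, on a toy corpus:

        # Fit LDA twice with different random seeds on a toy corpus; differences between
        # the recovered topics illustrate the reproducibility issue discussed above.
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation

        docs = [
            "galaxy star telescope dark matter",
            "star formation galaxy cluster survey",
            "neuron synapse brain cortex imaging",
            "brain network neuron connectivity imaging",
            "rainfall climate model downscaling statistics",
            "climate rainfall extremes statistics catchment",
        ]

        vec = CountVectorizer()
        X = vec.fit_transform(docs)
        vocab = vec.get_feature_names_out()

        for seed in (0, 1):
            lda = LatentDirichletAllocation(n_components=3, random_state=seed, max_iter=200)
            lda.fit(X)
            top_words = [[vocab[i] for i in comp.argsort()[-3:][::-1]] for comp in lda.components_]
            print(f"seed {seed}: {top_words}")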

  3. Automated, reproducible delineation of zones at risk from inundation by large volcanic debris flows

    USGS Publications Warehouse

    Schilling, Steve P.; Iverson, Richard M.

    1997-01-01

    Large debris flows can pose hazards to people and property downstream from volcanoes. We have developed a rapid, reproducible, objective, and inexpensive method to delineate distal debris-flow hazard zones. Our method employs the results of scaling and statistical analyses of the geometry of volcanic debris flows (lahars) to predict inundated valley cross-sectional areas (A) and planimetric areas (B) as functions of lahar volume. We use a range of specified lahar volumes to evaluate A and B. In a Geographic Information System (GIS) we employ the resulting range of predicted A and B to delineate gradations in inundation hazard, which is highest near the volcano and along valley thalwegs and diminishes as distances from the volcano and elevations above valley floors increase. Comparison of our computer-generated hazard maps with those constructed using traditional, field-based methods indicates that our method can provide an accurate means of delineating lahar hazard zones.
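
    The method predicts inundated valley cross-sectional area A and planimetric area B as power laws of lahar volume V, of the form A = c_A V^(2/3) and B = c_B V^(2/3); nested design volumes then give nested hazard zones. The coefficients in the sketch below are placeholders for illustration, not the calibrated values from the statistical analysis:

        # Power-law scaling of lahar inundation: cross-section A and planimetric area B vs volume V.
        # Coefficients are assumed placeholders; the method calibrates them from past lahars.
        C_A = 0.05    # assumed; gives A in m^2 for V in m^3
        C_B = 200.0   # assumed; gives B in m^2 for V in m^3

        def inundation(volume_m3):
            a = C_A * volume_m3 ** (2.0 / 3.0)   # valley cross-sectional area filled (m^2)
            b = C_B * volume_m3 ** (2.0 / 3.0)   # planimetric (map-view) area inundated (m^2)
            return a, b

        # Nested design volumes give nested zones; the largest volume defines the outermost zone.
        for v in (1e6, 1e7, 1e8, 1e9):
            a, b = inundation(v)
            print(f"V = {v:8.0e} m^3 ->  A = {a:9.0f} m^2,  B = {b / 1e6:7.1f} km^2")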

  4. Nonexposure accurate location K-anonymity algorithm in LBS.

    PubMed

    Jia, Jinying; Zhang, Fengli

    2014-01-01

    This paper tackles location privacy protection in current location-based services (LBS) where mobile users have to report their exact location information to an LBS provider in order to obtain their desired services. Location cloaking has been proposed and well studied to protect user privacy. It blurs the user's accurate coordinate and replaces it with a well-shaped cloaked region. However, to obtain such an anonymous spatial region (ASR), nearly all existent cloaking algorithms require knowing the accurate locations of all users. Therefore, location cloaking without exposing the user's accurate location to any party is urgently needed. In this paper, we present two such nonexposure accurate location cloaking algorithms. They are designed for K-anonymity, and cloaking is performed based on the identifications (IDs) of the grid areas which were reported by all the users, instead of directly on their accurate coordinates. Experimental results show that our algorithms are more secure than the existent cloaking algorithms, need not have all the users reporting their locations all the time, and can generate smaller ASR. PMID:24605060
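
    The central idea, reporting only grid-cell identifiers and growing a cloaked region around the requester's cell until it covers at least K users, can be sketched in a few lines. This is a simplified illustration of that idea, not a reimplementation of the paper's two algorithms:

        # Simplified grid-ID based K-anonymity cloaking: expand a square of grid cells around
        # the requester until at least K users' reported cell IDs fall inside it.
        from collections import Counter
        import random

        GRID = 100  # the map is a GRID x GRID lattice of cells; users report only cell IDs

        def cell_id(cx, cy):
            return cy * GRID + cx

        def cloak(requester_cell, reported_cells, k):
            """Return ((x0, y0), (x1, y1)) cell-coordinate bounds of the anonymous spatial region."""
            counts = Counter(reported_cells)
            rx, ry = requester_cell % GRID, requester_cell // GRID
            for r in range(GRID):  # grow the square ring by ring
                x0, x1 = max(0, rx - r), min(GRID - 1, rx + r)
                y0, y1 = max(0, ry - r), min(GRID - 1, ry + r)
                inside = sum(counts[cell_id(x, y)]
                             for x in range(x0, x1 + 1) for y in range(y0, y1 + 1))
                if inside >= k:
                    return (x0, y0), (x1, y1)
            return (0, 0), (GRID - 1, GRID - 1)

        # Toy population of reported cell IDs (no exact coordinates are ever used).
        random.seed(0)
        population = [cell_id(random.randrange(GRID), random.randrange(GRID)) for _ in range(500)]
        me = cell_id(42, 17)
        print(cloak(me, population + [me], k=10))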

  5. Nonexposure Accurate Location K-Anonymity Algorithm in LBS

    PubMed Central

    2014-01-01

    This paper tackles location privacy protection in current location-based services (LBS) where mobile users have to report their exact location information to an LBS provider in order to obtain their desired services. Location cloaking has been proposed and well studied to protect user privacy. It blurs the user's accurate coordinate and replaces it with a well-shaped cloaked region. However, to obtain such an anonymous spatial region (ASR), nearly all existent cloaking algorithms require knowing the accurate locations of all users. Therefore, location cloaking without exposing the user's accurate location to any party is urgently needed. In this paper, we present two such nonexposure accurate location cloaking algorithms. They are designed for K-anonymity, and cloaking is performed based on the identifications (IDs) of the grid areas which were reported by all the users, instead of directly on their accurate coordinates. Experimental results show that our algorithms are more secure than the existent cloaking algorithms, need not have all the users reporting their locations all the time, and can generate smaller ASR. PMID:24605060

  6. Accurate evaluation of homogenous and nonhomogeneous gas emissivities

    NASA Technical Reports Server (NTRS)

    Tiwari, S. N.; Lee, K. P.

    1984-01-01

    Spectral transmittance and total band absorptance of selected infrared bands of carbon dioxide and water vapor are calculated by using the line-by-line and quasi-random band models and these are compared with available experimental results to establish the validity of the quasi-random band model. Various wide-band model correlations are employed to calculate the total band absorptance and total emissivity of these two gases under homogeneous and nonhomogeneous conditions. These results are compared with available experimental results under identical conditions. From these comparisons, it is found that the quasi-random band model can provide quite accurate results and is quite suitable for most atmospheric applications.

  7. Reproducible and Consistent Quantification of the Saccharomyces cerevisiae Proteome by SWATH-mass spectrometry*

    PubMed Central

    Selevsek, Nathalie; Chang, Ching-Yun; Gillet, Ludovic C.; Navarro, Pedro; Bernhardt, Oliver M.; Reiter, Lukas; Cheng, Lin-Yang; Vitek, Olga; Aebersold, Ruedi

    2015-01-01

    Targeted mass spectrometry by selected reaction monitoring (S/MRM) has proven to be a suitable technique for the consistent and reproducible quantification of proteins across multiple biological samples and a wide dynamic range. This performance profile is an important prerequisite for systems biology and biomedical research. However, the method is limited to the measurements of a few hundred peptides per LC-MS analysis. Recently, we introduced SWATH-MS, a combination of data independent acquisition and targeted data analysis that vastly extends the number of peptides/proteins quantified per sample, while maintaining the favorable performance profile of S/MRM. Here we applied the SWATH-MS technique to quantify changes over time in a large fraction of the proteome expressed in Saccharomyces cerevisiae in response to osmotic stress. We sampled cell cultures in biological triplicates at six time points following the application of osmotic stress and acquired single injection data independent acquisition data sets on a high-resolution 5600 tripleTOF instrument operated in SWATH mode. Proteins were quantified by the targeted extraction and integration of transition signal groups from the SWATH-MS datasets for peptides that are proteotypic for specific yeast proteins. We consistently identified and quantified more than 15,000 peptides and 2500 proteins across the 18 samples. We demonstrate high reproducibility between technical and biological replicates across all time points and protein abundances. In addition, we show that the abundance of hundreds of proteins was significantly regulated upon osmotic shock, and pathway enrichment analysis revealed that the proteins reacting to osmotic shock are mainly involved in the carbohydrate and amino acid metabolism. Overall, this study demonstrates the ability of SWATH-MS to efficiently generate reproducible, consistent, and quantitatively accurate measurements of a large fraction of a proteome across multiple samples. PMID

  8. Reproducibility of erythrocyte polyamine measurements and correlation with plasma micronutrients in an antioxidant vitamin intervention study.

    PubMed

    Wang, W; Kucuk, O; Franke, A A; Liu, L Q; Custer, L J; Higuchi, C M

    1996-07-01

    Erythrocyte polyamine measurements have been previously investigated as candidate biomarkers for hyperproliferation and recently as a potential intermediate endpoint in clinical chemoprevention trials with difluoromethylornithine, an inhibitor of polyamine biosynthesis. This study was performed to determine the reproducibility of erythrocyte polyamine measurements and their possible correlation with plasma micronutrients in seven healthy adults in an antioxidant vitamin intervention study. As part of this cross-over intervention study, three subjects took beta-carotene (31.4 mg/day) plus D-alpha-tocopherol acetate (720 IU/day) supplements during the first 3 months and four subjects took the supplements during the second 3 months. Heparinized blood samples were collected at baseline and every month over a total of 6 months for simultaneous determination of erythrocyte polyamines and plasma micronutrients by the high-performance liquid chromatographic method. For all the measures of erythrocyte polyamines the intraindividual variation was smaller than that between subjects, and three or four measurements were required to accurately characterize long-term erythrocyte polyamines for an individual. The intra-class correlations were moderately high for all erythrocyte polyamine measurements, indicating a good reproducibility for intra-individual erythrocyte polyamine measurements. Based on monthly values, significant inverse correlations were found between erythrocyte spermidine and the plasma levels of retinol (r = -0.50) and lutein (r = -0.52). There were also significant inverse associations between erythrocyte spermine and plasma levels of alpha-tocopherol (r = -0.29), lutein (r = -0.44), lycopene (r = -0.29), beta-cryptoxanthin (r = -0.30), and total carotenoids (r = -0.29). The effects of supplementation upon the associations between erythrocyte polyamines and plasma nutrient levels were additionally addressed. The results indicate an acceptable longitudinal reproducibility

  9. Reproducible and consistent quantification of the Saccharomyces cerevisiae proteome by SWATH-mass spectrometry.

    PubMed

    Selevsek, Nathalie; Chang, Ching-Yun; Gillet, Ludovic C; Navarro, Pedro; Bernhardt, Oliver M; Reiter, Lukas; Cheng, Lin-Yang; Vitek, Olga; Aebersold, Ruedi

    2015-03-01

    Targeted mass spectrometry by selected reaction monitoring (S/MRM) has proven to be a suitable technique for the consistent and reproducible quantification of proteins across multiple biological samples and a wide dynamic range. This performance profile is an important prerequisite for systems biology and biomedical research. However, the method is limited to the measurements of a few hundred peptides per LC-MS analysis. Recently, we introduced SWATH-MS, a combination of data independent acquisition and targeted data analysis that vastly extends the number of peptides/proteins quantified per sample, while maintaining the favorable performance profile of S/MRM. Here we applied the SWATH-MS technique to quantify changes over time in a large fraction of the proteome expressed in Saccharomyces cerevisiae in response to osmotic stress. We sampled cell cultures in biological triplicates at six time points following the application of osmotic stress and acquired single injection data independent acquisition data sets on a high-resolution 5600 tripleTOF instrument operated in SWATH mode. Proteins were quantified by the targeted extraction and integration of transition signal groups from the SWATH-MS datasets for peptides that are proteotypic for specific yeast proteins. We consistently identified and quantified more than 15,000 peptides and 2500 proteins across the 18 samples. We demonstrate high reproducibility between technical and biological replicates across all time points and protein abundances. In addition, we show that the abundance of hundreds of proteins was significantly regulated upon osmotic shock, and pathway enrichment analysis revealed that the proteins reacting to osmotic shock are mainly involved in the carbohydrate and amino acid metabolism. Overall, this study demonstrates the ability of SWATH-MS to efficiently generate reproducible, consistent, and quantitatively accurate measurements of a large fraction of a proteome across multiple samples. PMID

  10. REPRODUCIBILITY OF INTRA-ABDOMINAL PRESSURE MEASURED DURING PHYSICAL ACTIVITIES VIA A WIRELESS VAGINAL TRANSDUCER

    PubMed Central

    Egger, Marlene J.; Hamad, Nadia M.; Hitchcock, Robert W.; Coleman, Tanner J.; Shaw, Janet M.; Hsu, Yvonne; Nygaard, Ingrid E.

    2014-01-01

    Aims In the urodynamics laboratory setting, a wireless pressure transducer, developed to facilitate research exploring intra-abdominal pressure (IAP) and pelvic floor disorders, was highly accurate. We aimed to study the reproducibility of IAP measured using this transducer in women during activities performed in an exercise science laboratory. Methods Fifty-seven women (mean ± SD: age 30.4 ± 9.3 years; body mass index = 22.4 ± 2.68 kg/m2) completed two standardized activity sessions using the same transducer at least three days apart. Pressure data for 31 activities were transmitted wirelessly to a base station and analyzed for mean net maximal IAP, area under the curve (AUC) and first moment of the area (FMA). Activities included typical exercises, lifting 13.6 to 18.2 kg, and simulated household tasks. Analysis for test-retest reliability included Bland-Altman plots with absolute limits of agreement (ALOA), Wilcoxon signed rank tests to assess significant differences between sessions, intraclass correlations, and kappa statistics to assess inter-session agreement in the highest vs. other quintiles of maximal IAP. Results Few activities exhibited significant differences between sessions in maximal IAP, or in AUC and FMA values. For 13 activities, the agreement between repeat measures of maximal IAP was better than ± 10 cm H2O; for 20 activities, better than ± 15 cm H2O. ALOA increased with mean IAP. The highest quintile of IAP demonstrated fair/substantial agreement between sessions in 25 of 30 activities. Conclusion Reproducibility of IAP depends on the activity undertaken. Interventions geared towards lowering IAP should account for this, and maximize efforts to improve IAP reproducibility. PMID:25730430
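
    The test-retest analysis above rests on Bland-Altman absolute limits of agreement between the two activity sessions. The snippet below is a generic sketch of that computation for a single activity; the session-1/session-2 maximal IAP arrays are hypothetical, not study data.

        # Bland-Altman sketch: mean difference (bias) and absolute limits of agreement
        # (bias +/- 1.96 * SD of the differences) for paired test-retest measurements.
        import numpy as np

        session1 = np.array([42.0, 55.3, 61.2, 48.7, 70.1])  # hypothetical maximal IAP, cm H2O
        session2 = np.array([44.5, 52.8, 63.0, 47.9, 66.4])

        diff = session2 - session1
        bias = diff.mean()
        half_width = 1.96 * diff.std(ddof=1)
        print(f"bias = {bias:.2f} cm H2O")
        print(f"limits of agreement = ({bias - half_width:.2f}, {bias + half_width:.2f}) cm H2O")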

  11. A geostatistical algorithm to reproduce lateral gradual facies transitions: Description and implementation

    NASA Astrophysics Data System (ADS)

    Falivene, Oriol; Cabello, Patricia; Arbués, Pau; Muñoz, Josep Anton; Cabrera, Lluís

    2009-08-01

    Valid representations of geological heterogeneity are fundamental inputs for quantitative models used in managing subsurface activities. Consequently, the simulation of realistic facies distributions is a significant aim. Realistic facies distributions are typically obtained by pixel-based, object-based or process-based methods. This work presents a pixel-based geostatistical algorithm suitable for reproducing lateral gradual facies transitions (LGFT) between two adjacent sedimentary bodies. Lateral contact (i.e. interfingering) between distinct depositional facies is a widespread geometric relationship that occurs at different scales in any depositional system. The algorithm is based on the truncation of the sum of a linear expectation trend and a random Gaussian field, and can be conditioned to well data. The implementation introduced herein also includes subroutines to clean and geometrically characterize the obtained LGFT. The cleaned sedimentary body transition provides a more appropriate and realistic facies distribution for some depositional settings. The geometric measures of the LGFT yield an intuitive measure of the morphology of the sedimentary body boundary, which can be compared to analogue data. An example of an LGFT obtained by the algorithm is also flow-simulated, quantitatively demonstrating the importance of realistically reproducing such transitions in subsurface models if accurate flow-related predictions are to be made.
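
    The core of the method is the truncation of the sum of a linear expectation trend and a random Gaussian field. The following is a minimal 2D sketch of that idea, with a smoothed white-noise field standing in for a proper variogram-based Gaussian simulation and with arbitrary trend and threshold values; it is not the authors' conditional implementation.

        # Minimal sketch of a lateral gradual facies transition (LGFT):
        # facies 1 wherever (linear trend along x + Gaussian random field) exceeds a threshold.
        import numpy as np
        from scipy.ndimage import gaussian_filter

        rng = np.random.default_rng(0)
        ny, nx = 100, 200

        # Linear expectation trend increasing in the x (lateral) direction.
        trend = np.linspace(-1.0, 1.0, nx)[None, :] * np.ones((ny, 1))

        # Stand-in Gaussian random field: smoothed, re-standardized white noise.
        field = gaussian_filter(rng.standard_normal((ny, nx)), sigma=8.0)
        field = (field - field.mean()) / field.std()

        # Truncation: two facies separated by an interfingering transition zone.
        facies = (trend + 0.7 * field > 0.0).astype(int)
        print(facies.shape, facies.mean())  # fraction of cells assigned to facies 1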

  12. Dissecting the determinants of malaria chronicity: why within-host models struggle to reproduce infection dynamics.

    PubMed

    Childs, Lauren M; Buckee, Caroline O

    2015-03-01

    The duration of infection is fundamental to the epidemiological behaviour of any infectious disease, but remains one of the most poorly understood aspects of malaria. In endemic areas, the malaria parasite Plasmodium falciparum can cause both acute, severe infections and asymptomatic, chronic infections through its interaction with the host immune system. Frequent superinfection and massive parasite genetic diversity make it extremely difficult to accurately measure the distribution of infection lengths, complicating the estimation of basic epidemiological parameters and the prediction of the impact of interventions. Mathematical models have qualitatively reproduced parasite dynamics early during infection, but reproducing long-lived chronic infections remains much more challenging. Here, we construct a model of infection dynamics to examine the consequences of common biological assumptions for the generation of chronicity and the impact of co-infection. We find that although a combination of host and parasite heterogeneities are capable of generating chronic infections, they do so only under restricted parameter choices. Furthermore, under biologically plausible assumptions, co-infection of parasite genotypes can alter the course of infection of both the resident and co-infecting strain in complex non-intuitive ways. We outline the most important puzzles for within-host models of malaria arising from our analysis, and their implications for malaria epidemiology and control. PMID:25673299

  13. An R package that automatically collects and archives details for reproducible computing

    PubMed Central

    2014-01-01

    Background It is scientifically and ethically imperative that the results of statistical analysis of biomedical research data be computationally reproducible in the sense that the reported results can be easily recapitulated from the study data. Some statistical analyses are computationally a function of many data files, program files, and other details that are updated or corrected over time. In many applications, it is infeasible to manually maintain an accurate and complete record of all these details about a particular analysis. Results Therefore, we developed the rctrack package that automatically collects and archives read only copies of program files, data files, and other details needed to computationally reproduce an analysis. Conclusions The rctrack package uses the trace function to temporarily embed detail collection procedures into functions that read files, write files, or generate random numbers so that no special modifications of the primary R program are necessary. At the conclusion of the analysis, rctrack uses these details to automatically generate a read only archive of data files, program files, result files, and other details needed to recapitulate the analysis results. Information about this archive may be included as an appendix of a report generated by Sweave or knitR. Here, we describe the usage, implementation, and other features of the rctrack package. The rctrack package is freely available from http://www.stjuderesearch.org/site/depts/biostats/rctrack under the GPL license. PMID:24886202
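
    rctrack works by temporarily tracing R's file-access functions so that every file touched during the analysis is recorded without modifying the analysis script. The sketch below is a Python analogue of that idea, not the rctrack API; the function names are invented for illustration.

        # Python analogue of temporary tracing: patch the built-in open() so that
        # every file opened during the analysis is logged, then restore it.
        import builtins
        import contextlib

        @contextlib.contextmanager
        def track_file_access(log: list):
            original_open = builtins.open
            def tracking_open(file, *args, **kwargs):
                log.append(str(file))
                return original_open(file, *args, **kwargs)
            builtins.open = tracking_open
            try:
                yield log
            finally:
                builtins.open = original_open  # restore, mirroring R's untrace()

        # Usage (hypothetical analysis entry point):
        # accessed = []
        # with track_file_access(accessed):
        #     run_analysis()
        # print(accessed)  # files to copy into a read-only archive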

  14. Inter-examiner reproducibility of tests for lumbar motor control

    PubMed Central

    2011-01-01

    Background Many studies show a relation between reduced lumbar motor control (LMC) and low back pain (LBP). However, test circumstances vary, and during test performance subjects may change position. In other words, the reliability - i.e. reproducibility and validity - of tests for LMC should be based on quantitative data. This has not been considered before. The aim was to analyse the reproducibility of five different quantitative tests for LMC commonly used in daily clinical practice. Methods The five tests for LMC were: repositioning (RPS), sitting forward lean (SFL), sitting knee extension (SKE), and bent knee fall out (BKFO), all measured in cm, and leg lowering (LL), measured in mm Hg. A total of 40 subjects (14 males, 26 females), 25 with and 15 without LBP, with a mean age of 46.5 years (SD 14.8), were examined independently and in random order by two examiners on the same day. LBP subjects were recruited from three physiotherapy clinics with a connection to the clinic's gym or back-school. Non-LBP subjects were recruited from the clinic's staff acquaintances, and from patients without LBP. Results The means and standard deviations for each of the tests were 0.36 (0.27) cm for RPS, 1.01 (0.62) cm for SFL, 0.40 (0.29) cm for SKE, 1.07 (0.52) cm for BKFO, and 32.9 (7.1) mm Hg for LL. All five tests for LMC showed high reproducibility, with the following ICCs: 0.90 for RPS, 0.96 for SFL, 0.96 for SKE, 0.94 for BKFO, and 0.98 for LL. Bland and Altman plots showed that most of the differences between examiners A and B were less than 0.20 cm. Conclusion These five tests for LMC displayed excellent reproducibility. However, the diagnostic accuracy of these tests needs to be addressed in larger cohorts of subjects, establishing values for the normal population. Also cut-points between subjects with and without LBP must be determined, taking into account age, level of activity, degree of impairment and participation in sports. Whether reproducibility of these tests is as good

  15. Accurate thermoelastic tensor and acoustic velocities of NaCl

    NASA Astrophysics Data System (ADS)

    Marcondes, Michel L.; Shukla, Gaurav; da Silveira, Pedro; Wentzcovitch, Renata M.

    2015-12-01

    Despite the importance of thermoelastic properties of minerals in geology and geophysics, their measurement at high pressures and temperatures is still challenging. Thus, ab initio calculations are an essential tool for predicting these properties at extreme conditions. Owing to the approximate description of the exchange-correlation energy, approximations used in calculations of vibrational effects, and numerical/methodological approximations, these methods produce systematic deviations. Hybrid schemes combining experimental data and theoretical results have emerged as a way to reconcile available information and offer more reliable predictions at experimentally inaccessible thermodynamic conditions. Here we introduce a method to improve the calculated thermoelastic tensor by using a highly accurate thermal equation of state (EoS). The corrective scheme is general, applicable to crystalline solids with any symmetry, and can produce accurate results at conditions where experimental data may not exist. We apply it to rock-salt-type NaCl, a material whose structural properties have been challenging to describe accurately by standard ab initio methods and whose acoustic/seismic properties are important for the gas and oil industry.

  16. Accurate thermoelastic tensor and acoustic velocities of NaCl

    SciTech Connect

    Marcondes, Michel L.; Shukla, Gaurav; Silveira, Pedro da; Wentzcovitch, Renata M.

    2015-12-15

    Despite the importance of thermoelastic properties of minerals in geology and geophysics, their measurement at high pressures and temperatures is still challenging. Thus, ab initio calculations are an essential tool for predicting these properties at extreme conditions. Owing to the approximate description of the exchange-correlation energy, approximations used in calculations of vibrational effects, and numerical/methodological approximations, these methods produce systematic deviations. Hybrid schemes combining experimental data and theoretical results have emerged as a way to reconcile available information and offer more reliable predictions at experimentally inaccessible thermodynamic conditions. Here we introduce a method to improve the calculated thermoelastic tensor by using a highly accurate thermal equation of state (EoS). The corrective scheme is general, applicable to crystalline solids with any symmetry, and can produce accurate results at conditions where experimental data may not exist. We apply it to rock-salt-type NaCl, a material whose structural properties have been challenging to describe accurately by standard ab initio methods and whose acoustic/seismic properties are important for the gas and oil industry.

  17. Measuring the stagnation phase of NIF implosions: reproducibility and intentional asymmetry

    NASA Astrophysics Data System (ADS)

    Spears, Brian; Benedetti, R.; Callahan, D.; Casey, D.; Eder, D.; Gaffney, J.; Ma, T.; Munro, D.; Knauer, J.; Kilkenny, J.

    2015-11-01

    We report here data from a 5-shot sequence of cryogenic DT layered implosions designed to measure NIF implosion stagnation, the reproducibility of stagnation, and the response of the stagnation phase to intentional perturbation. We emphasize new analysis of the neutron spectral moments. These features provide an experimental measurement of hot spot thermal (temperature) and fluid (residual flow) processes. They also provide strong constraints for code validation. In implosions that were intentionally perturbed by laser drive and DT layer asymmetry, the experimental measurements show clear signs of the damaged stagnation. These signatures also match well our expectations from simulation, reproducing the variation of apparent temperature with line of sight and the down scattered neutron ratio, among others. The suite of implosions provides a demonstration of our ability to measure stagnated flow performance and highlights the several precision diagnostic signatures that are correctly captured by radhydro codes. This work was performed by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  18. Tackling the Reproducibility Problem in Systems Research with Declarative Experiment Specifications

    SciTech Connect

    Jimenez, Ivo; Maltzahn, Carlos; Lofstead, Jay; Moody, Adam; Mohror, Kathryn; Arpaci-Dusseau, Remzi; Arpaci-Dusseau, Andrea

    2015-05-04

    Validating experimental results in the field of computer systems is a challenging task, mainly due to the many changes in software and hardware that computational environments go through. Determining if an experiment is reproducible entails two separate tasks: re-executing the experiment and validating the results. Existing reproducibility efforts have focused on the former, envisioning techniques and infrastructures that make it easier to re-execute an experiment. In this work we focus on the latter by analyzing the validation workflow that an experiment re-executioner goes through. We notice that validating results is done on the basis of experiment design and high-level goals, rather than exact quantitative metrics. Based on this insight, we introduce a declarative format for specifying the high-level components of an experiment as well as describing generic, testable conditions that serve as the basis for validation. We present a use case in the area of storage systems to illustrate the usefulness of this approach. We also discuss limitations and potential benefits of using this approach in other areas of experimental systems research.
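
    To make the idea of a declarative specification concrete, the structure below is a purely hypothetical sketch of high-level experiment components paired with generic, testable validation conditions, expressed here as a Python dictionary plus a trivial checker; it is not the format defined in the paper.

        # Hypothetical declarative experiment specification: components plus
        # testable conditions phrased against summary metrics, not exact numbers.
        experiment = {
            "components": {
                "workload": "sequential-write benchmark",   # invented example values
                "storage_system": "ceph",
                "baseline": "raw block device",
            },
            "validations": [
                ("throughput_within_10pct_of_baseline",
                 lambda r: r["ceph_mb_s"] >= 0.9 * r["raw_mb_s"]),
            ],
        }

        def validate(results: dict) -> None:
            for name, predicate in experiment["validations"]:
                print(f"{name}: {'PASS' if predicate(results) else 'FAIL'}")

        # Made-up re-execution results:
        validate({"ceph_mb_s": 870.0, "raw_mb_s": 940.0})  # PASS (870 >= 846)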

  19. Communication: An accurate full 15 dimensional permutationally invariant potential energy surface for the OH + CH4 → H2O + CH3 reaction.

    PubMed

    Li, Jun; Guo, Hua

    2015-12-14

    A globally accurate full-dimensional potential energy surface (PES) for the OH + CH4 → H2O + CH3 reaction is developed using the permutation invariant polynomial-neural network approach, based on ∼135,000 points computed at the coupled cluster singles, doubles, and perturbative triples [CCSD(T)] level with the augmented correlation consistent polarized valence triple-zeta basis set. The total root mean square fitting error is only 3.9 meV or 0.09 kcal/mol. This PES is shown to reproduce energies, geometries, and harmonic frequencies of stationary points along the reaction path. Kinetic and dynamical calculations on the PES indicated good agreement with the available experimental data. PMID:26671351

  20. Communication: An accurate full 15 dimensional permutationally invariant potential energy surface for the OH + CH4 → H2O + CH3 reaction

    NASA Astrophysics Data System (ADS)

    Li, Jun; Guo, Hua

    2015-12-01

    A globally accurate full-dimensional potential energy surface (PES) for the OH + CH4 → H2O + CH3 reaction is developed using the permutation invariant polynomial-neural network approach, based on ˜135 000 points computed at the coupled cluster singles, doubles, and perturbative triples [CCSD(T)] level with the augmented correlation consistent polarized valence triple-zeta basis set. The total root mean square fitting error is only 3.9 meV or 0.09 kcal/mol. This PES is shown to reproduce energies, geometries, and harmonic frequencies of stationary points along the reaction path. Kinetic and dynamical calculations on the PES indicated good agreement with the available experimental data.

  1. Limit analysis assessment of experimental behavior of arches reinforced with GFRP materials

    NASA Astrophysics Data System (ADS)

    Basilio, Ismael; Fedele, Roberto; Lourenço, Paulo B.; Milani, Gabriele

    2014-10-01

    In this paper, a comparison between results furnished by a 3D FE upper bound limit analysis and experimental results for some reinforced masonry arches tested at the University of Minho (Portugal) is provided. While the delamination from the arch supports can be modelled only in an approximate way within limit analysis, the aim of the paper is to accurately reproduce the change in the failure mechanism observed experimentally due to the introduction of strengthening elements. Both experimental and numerical results show a clear change in the failure mechanism and in the corresponding ultimate peak load. A set of simulations is also performed on reinforced arches previously damaged, to investigate the role played by the reinforcement within a proper repairing procedure. Good correlation between the experimental work and the numerical simulations is achieved.

  2. Reproducibility and Transparency in Ocean-Climate Modeling

    NASA Astrophysics Data System (ADS)

    Hannah, N.; Adcroft, A.; Hallberg, R.; Griffies, S. M.

    2015-12-01

    Reproducibility is a cornerstone of the scientific method. Within geophysical modeling and simulation, achieving reproducibility can be difficult, especially given the complexity of numerical codes, enormous and disparate data sets, and variety of supercomputing technology. We have made progress on this problem in the context of a large project - the development of new ocean and sea ice models, MOM6 and SIS2. Here we present useful techniques and experience. We use version control not only for code but the entire experiment working directory, including configuration (run-time parameters, component versions), input data and checksums on experiment output. This allows us to document when the solutions to experiments change, whether due to code updates or changes in input data. To avoid distributing large input datasets we provide the tools for generating these from the sources, rather than provide raw input data. Bugs can be a source of non-determinism and hence irreproducibility, e.g. reading from or branching on uninitialized memory. To expose these we routinely run system tests, using a memory debugger, multiple compilers and different machines. Additional confidence in the code comes from specialised tests, for example automated dimensional analysis and domain transformations. This has entailed adopting a code style where we deliberately restrict what a compiler can do when re-arranging mathematical expressions. In the spirit of open science, all development is in the public domain. This leads to a positive feedback, where increased transparency and reproducibility make using the model easier for external collaborators, who in turn provide valuable contributions. To facilitate users installing and running the model we provide (version controlled) digital notebooks that illustrate and record analysis of output. This has the dual role of providing a gross, platform-independent testing capability and a means to document model output and analysis.
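
    One practice described above is keeping checksums of experiment output under version control so that answer changes are detected whether they come from code updates or from input data. The snippet below is a generic sketch of that check; the file names and manifest layout are hypothetical and it is not the MOM6/SIS2 tooling.

        # Compare checksums of current model output against a version-controlled
        # manifest to detect when experiment solutions change.
        import hashlib
        import json
        from pathlib import Path

        def sha256_of(path: Path) -> str:
            return hashlib.sha256(path.read_bytes()).hexdigest()

        def check_outputs(output_dir: str, manifest_file: str) -> bool:
            expected = json.loads(Path(manifest_file).read_text())  # e.g. {"ocean.stats": "ab12..."}
            ok = True
            for name, expected_hash in expected.items():
                if sha256_of(Path(output_dir) / name) != expected_hash:
                    print(f"solution changed: {name}")
                    ok = False
            return ok

        # Usage: check_outputs("run_output/", "expected_checksums.json")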

  3. Validity and Reproducibility of a Spanish Dietary History

    PubMed Central

    Guallar-Castillón, Pilar; Sagardui-Villamor, Jon; Balboa-Castillo, Teresa; Sala-Vila, Aleix; Ariza Astolfi, Mª José; Sarrión Pelous, Mª Dolores; León-Muñoz, Luz María; Graciani, Auxiliadora; Laclaustra, Martín; Benito, Cristina; Banegas, José Ramón; Artalejo, Fernando Rodríguez

    2014-01-01

    Objective To assess the validity and reproducibility of food and nutrient intake estimated with the electronic diet history of ENRICA (DH-E), which collects information on numerous aspects of the Spanish diet. Methods The validity of food and nutrient intake was estimated using Pearson correlation coefficients between the DH-E and the mean of seven 24-hour recalls collected every 2 months over the previous year. The reproducibility was estimated using intraclass correlation coefficients between two DH-E assessments made one year apart. Results The correlation coefficients between the DH-E and the mean of seven 24-hour recalls for the main food groups were cereals (r = 0.66), meat (r = 0.66), fish (r = 0.42), vegetables (r = 0.62) and fruits (r = 0.44). The mean correlation coefficient for all 15 food groups considered was 0.53. The correlations for macronutrients were: energy (r = 0.76), proteins (r = 0.58), lipids (r = 0.73), saturated fat (r = 0.73), monounsaturated fat (r = 0.59), polyunsaturated fat (r = 0.57), and carbohydrates (r = 0.66). The mean correlation coefficient for all 41 nutrients studied was 0.55. The intraclass correlation coefficient between the two DH-E assessments was greater than 0.40 for most foods and nutrients. Conclusions The DH-E shows good validity and reproducibility for estimating usual intake of foods and nutrients. PMID:24465878
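
    Validity here is a Pearson correlation between the diet-history estimate and the mean of seven 24-hour recalls, and reproducibility is an intraclass correlation between two administrations. The snippet below sketches both statistics on hypothetical intake vectors (a one-way ICC is used for simplicity); it is not the study's analysis code.

        # Pearson r for validity and a one-way random-effects ICC(1,1) for reproducibility.
        import numpy as np
        from scipy.stats import pearsonr

        dh_e      = np.array([210., 180., 250., 160., 230., 205.])  # hypothetical intakes
        recall_24 = np.array([200., 190., 240., 150., 220., 215.])  # mean of seven 24-h recalls
        dh_e_yr2  = np.array([215., 175., 245., 170., 225., 200.])  # repeat DH-E one year later

        r, _ = pearsonr(dh_e, recall_24)
        print(f"validity (Pearson r) = {r:.2f}")

        scores = np.vstack([dh_e, dh_e_yr2]).T        # shape (subjects, 2 administrations)
        n, k = scores.shape
        subject_means = scores.mean(axis=1)
        msb = k * np.sum((subject_means - scores.mean()) ** 2) / (n - 1)
        msw = np.sum((scores - subject_means[:, None]) ** 2) / (n * (k - 1))
        icc = (msb - msw) / (msb + (k - 1) * msw)
        print(f"reproducibility (ICC) = {icc:.2f}")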

  4. Repeatability and Reproducibility of Decisions by Latent Fingerprint Examiners

    PubMed Central

    Ulery, Bradford T.; Hicklin, R. Austin; Buscaglia, JoAnn; Roberts, Maria Antonia

    2012-01-01

    The interpretation of forensic fingerprint evidence relies on the expertise of latent print examiners. We tested latent print examiners on the extent to which they reached consistent decisions. This study assessed intra-examiner repeatability by retesting 72 examiners on comparisons of latent and exemplar fingerprints, after an interval of approximately seven months; each examiner was reassigned 25 image pairs for comparison, out of a total pool of 744 image pairs. We compare these repeatability results with reproducibility (inter-examiner) results derived from our previous study. Examiners repeated 89.1% of their individualization decisions, and 90.1% of their exclusion decisions; most of the changed decisions resulted in inconclusive decisions. Repeatability of comparison decisions (individualization, exclusion, inconclusive) was 90.0% for mated pairs, and 85.9% for nonmated pairs. Repeatability and reproducibility were notably lower for comparisons assessed by the examiners as “difficult” than for “easy” or “moderate” comparisons, indicating that examiners' assessments of difficulty may be useful for quality assurance. No false positive errors were repeated (n = 4); 30% of false negative errors were repeated. One percent of latent value decisions were completely reversed (no value even for exclusion vs. of value for individualization). Most of the inter- and intra-examiner variability concerned whether the examiners considered the information available to be sufficient to reach a conclusion; this variability was concentrated on specific image pairs such that repeatability and reproducibility were very high on some comparisons and very low on others. Much of the variability appears to be due to making categorical decisions in borderline cases. PMID:22427888

  5. Reproducing continuous radio blackout using glow discharge plasma

    SciTech Connect

    Xie, Kai; Li, Xiaoping; Liu, Donglin; Shao, Mingxu; Zhang, Hanlu

    2013-10-15

    A novel plasma generator is described that offers large-scale, continuous, non-magnetized plasma with a 30-cm-diameter hollow structure, which provides a path for an electromagnetic wave. The plasma is excited by a low-pressure glow discharge, with varying electron densities ranging from 10^9 to 2.5 × 10^11 cm^−3. An electromagnetic wave propagation experiment reproduced a continuous radio blackout in UHF-, L-, and S-bands. The results are consistent with theoretical expectations. The proposed method is suitable for simulating a plasma sheath and for researching communications, navigation, electromagnetic mitigation, and antenna compensation in plasma sheaths.
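
    Whether a wave is blocked is set by the electron plasma frequency, which for the densities quoted above reaches well into the microwave range. The short calculation below uses the standard approximation f_pe ≈ 8980 × sqrt(n_e [cm^-3]) Hz as an illustrative check; it is not taken from the paper.

        # Plasma cutoff frequency for the quoted electron densities: waves below
        # f_pe are strongly reflected/attenuated, reproducing "radio blackout".
        import math

        def plasma_frequency_hz(n_e_cm3: float) -> float:
            """Electron plasma frequency, f_pe ~= 8980 * sqrt(n_e [cm^-3]) Hz."""
            return 8980.0 * math.sqrt(n_e_cm3)

        for n_e in (1e9, 2.5e11):
            print(f"n_e = {n_e:.1e} cm^-3  ->  f_pe = {plasma_frequency_hz(n_e) / 1e9:.2f} GHz")
        # ~0.28 GHz at 1e9 cm^-3 and ~4.5 GHz at 2.5e11 cm^-3, spanning the UHF, L and S bands.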

  6. Data quality in predictive toxicology: reproducibility of rodent carcinogenicity experiments.

    PubMed Central

    Gottmann, E; Kramer, S; Pfahringer, B; Helma, C

    2001-01-01

    We compared 121 replicate rodent carcinogenicity assays from the two parts (National Cancer Institute/National Toxicology Program and literature) of the Carcinogenic Potency Database (CPDB) to estimate the reliability of these experiments. We estimated a concordance of 57% between the overall rodent carcinogenicity classifications from both sources. This value did not improve substantially when additional biologic information (species, sex, strain, target organs) was considered. These results indicate that rodent carcinogenicity assays are much less reproducible than previously expected, an effect that should be considered in the development of structure-activity relationship models and the risk assessment process. PMID:11401763

  7. Multi-Parametric Neuroimaging Reproducibility: A 3T Resource Study

    PubMed Central

    Landman, Bennett A.; Huang, Alan J.; Gifford, Aliya; Vikram, Deepti S.; Lim, Issel Anne L.; Farrell, Jonathan A.D.; Bogovic, John A.; Hua, Jun; Chen, Min; Jarso, Samson; Smith, Seth A.; Joel, Suresh; Mori, Susumu; Pekar, James J.; Barker, Peter B.; Prince, Jerry L.; van Zijl, Peter C.M.

    2010-01-01

    Modern MRI image processing methods have yielded quantitative, morphometric, functional, and structural assessments of the human brain. These analyses typically exploit carefully optimized protocols for specific imaging targets. Algorithm investigators have several excellent public data resources to use to test, develop, and optimize their methods. Recently, there has been an increasing focus on combining MRI protocols in multi-parametric studies. Notably, these have included innovative approaches for fusing connectivity inferences with functional and/or anatomical characterizations. Yet, validation of the reproducibility of these interesting and novel methods has been severely hampered by the limited availability of appropriate multi-parametric data. We present an imaging protocol optimized to include state-of-the-art assessment of brain function, structure, micro-architecture, and quantitative parameters within a clinically feasible 60 minute protocol on a 3T MRI scanner. We present scan-rescan reproducibility of these imaging contrasts based on 21 healthy volunteers (11 M/10 F, 22–61 y/o). The cortical gray matter, cortical white matter, ventricular cerebrospinal fluid, thalamus, putamen, caudate, cerebellar gray matter, cerebellar white matter, and brainstem were identified with mean volume-wise reproducibility of 3.5%. We tabulate the mean intensity, variability and reproducibility of each contrast in a region of interest approach, which is essential for prospective study planning and retrospective power analysis considerations. Anatomy was highly consistent on structural acquisition (~1–5% variability), while variation on diffusion and several other quantitative scans was higher (~<10%). Some sequences are particularly variable in specific structures (ASL exhibited variation of 28% in the cerebral white matter) or in thin structures (quantitative T2 varied by up to 73% in the caudate) due, in large part, to variability in automated ROI placement. The

  8. Accurate equilibrium structures of fluoro- and chloroderivatives of methane

    NASA Astrophysics Data System (ADS)

    Vogt, Natalja; Demaison, Jean; Rudolph, Heinz Dieter

    2014-11-01

    This work is a systematic study of the molecular structures of fluoro-, chloro-, and fluorochloromethanes. For the first time, the accurate ab initio structure is computed for 10 molecules (CF4, CClF3, CCl2F2, CCl3F, CHClF2, CHCl2F, CH2F2, CH2ClF, CH2Cl2, and CCl4) at the coupled cluster level of electronic structure theory including single and double excitations augmented by a perturbational estimate of the effects of connected triple excitations [CCSD(T)], with all electrons being correlated and Gaussian basis sets of at least quadruple-ζ quality. Furthermore, when possible, namely for the molecules CH2F2, CH2Cl2, CH2ClF, CHClF2, and CCl2F2, accurate semi-experimental equilibrium (rSEe) structures have also been determined. This is achieved through a least-squares structural refinement procedure based on the equilibrium rotational constants of all available isotopomers, determined by correcting the experimental ground-state rotational constants with computed ab initio vibration-rotation interaction constants and electronic g-factors. The computed and semi-experimental equilibrium structures are in excellent agreement with each other, but the rSEe structure is generally more accurate, in particular for the CF and CCl bond lengths. The carbon-halogen bond length is discussed within the framework of the ligand close-packing model as a function of the atomic charges. For this purpose, the accurate equilibrium structures of some other molecules with alternative ligands, such as CH3Li, CF3CCH, and CF3CN, are also computed.
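
    The semi-experimental equilibrium structure mentioned above rests on equilibrium rotational constants obtained by correcting the measured ground-state constants with computed rovibrational and electronic contributions. A schematic form of that correction for a generic rotational constant B is sketched below in LaTeX; the notation is standard, but the expression is an illustration, not a line from the paper.

        % Ground-state rotational constant corrected to equilibrium with ab initio
        % vibration-rotation interaction constants \alpha_i^B (d_i = mode degeneracy)
        % and a small electronic term from the rotational g-factor:
        B_e^{\mathrm{SE}} \;\approx\; B_0^{\mathrm{exp}}
            \;+\; \tfrac{1}{2}\sum_i d_i\,\alpha_i^{B}
            \;+\; \Delta B_{\mathrm{el}} .

    The rSEe geometry is then the structure whose computed rotational constants best reproduce these semi-experimental values for all isotopomers in a least-squares sense.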

  9. On the reproducibility of protein crystal structures: five atomic resolution structures of trypsin

    SciTech Connect

    Liebschner, Dorothee; Dauter, Miroslawa; Brzuszkiewicz, Anna; Dauter, Zbigniew

    2013-08-01

    Details of five very high-resolution accurate structures of bovine trypsin are compared in the context of the reproducibility of models obtained from crystals grown under identical conditions. Structural studies of proteins usually rely on a model obtained from one crystal. By investigating the details of this model, crystallographers seek to obtain insight into the function of the macromolecule. It is therefore important to know which details of a protein structure are reproducible or to what extent they might differ. To address this question, the high-resolution structures of five crystals of bovine trypsin obtained under analogous conditions were compared. Global parameters and structural details were investigated. All of the models were of similar quality and the pairwise merged intensities had large correlation coefficients. The Cα and backbone atoms of the structures superposed very well. The occupancy of ligands in regions of low thermal motion was reproducible, whereas solvent molecules containing heavier atoms (such as sulfur) or those located on the surface could differ significantly. The coordination lengths of the calcium ion were conserved. A large proportion of the multiple conformations refined to similar occupancies and the residues adopted similar orientations. More than three quarters of the water-molecule sites were conserved within 0.5 Å and more than one third were conserved within 0.1 Å. An investigation of the protonation states of histidine residues and carboxylate moieties was consistent for all of the models. Radiation-damage effects to disulfide bridges were observed for the same residues and to similar extents. Main-chain bond lengths and angles averaged to similar values and were in agreement with the Engh and Huber targets. Other features, such as peptide flips and the double conformation of the inhibitor molecule, were also reproducible in all of the trypsin structures. Therefore, many details are similar in models obtained

  10. Quantum theory as the most robust description of reproducible experiments

    NASA Astrophysics Data System (ADS)

    De Raedt, Hans; Katsnelson, Mikhail I.; Michielsen, Kristel

    2014-08-01

    suggests that quantum theory is a powerful language to describe a certain class of statistical experiments but remains vague about the properties of the class. Similar views were expressed by other fathers of quantum mechanics, e.g., Max Born and Wolfgang Pauli [50]. They can be summarized as "Quantum theory describes our knowledge of the atomic phenomena rather than the atomic phenomena themselves". Our aim is, in a sense, to replace the philosophical components of these statements by well-defined mathematical concepts and to carefully study their relevance for physical phenomena. Specifically, by applying the general formalism of logical inference to a well-defined class of statistical experiments, the present paper shows that quantum theory is indeed the kind of language envisaged by Bohr. Theories such as Newtonian mechanics, Maxwell's electrodynamics, and Einstein's (general) relativity are deductive in character. Starting from a few axioms, abstracted from experimental observations and additional assumptions about the irrelevance of a large number of factors for the description of the phenomena of interest, deductive reasoning is used to prove or disprove unambiguous statements, propositions, about the mathematical objects which appear in the theory. The method of deductive reasoning conforms to the Boolean algebra of propositions. The deductive, reductionist methodology has the appealing feature that one can be sure that the propositions are either right or wrong, and disregarding the possibility that some of the premises on which the deduction is built may not apply, there is no doubt that the conclusions are correct. Clearly, these theories successfully describe a wide range of physical phenomena in a manner and language which is unambiguous and independent of the individual. At the same time, the construction of a physical theory, and a scientific theory in general, from "first principles" is, for sure, not something self-evident, and not even safe. Our basic

  11. D-BRAIN: Anatomically Accurate Simulated Diffusion MRI Brain Data.

    PubMed

    Perrone, Daniele; Jeurissen, Ben; Aelterman, Jan; Roine, Timo; Sijbers, Jan; Pizurica, Aleksandra; Leemans, Alexander; Philips, Wilfried

    2016-01-01

    Diffusion Weighted (DW) MRI allows for the non-invasive study of water diffusion inside living tissues. As such, it is useful for the investigation of human brain white matter (WM) connectivity in vivo through fiber tractography (FT) algorithms. Many DW-MRI tailored restoration techniques and FT algorithms have been developed. However, it is not clear how accurately these methods reproduce the WM bundle characteristics in real-world conditions, such as in the presence of noise, partial volume effect, and a limited spatial and angular resolution. The difficulty lies in the lack of a realistic brain phantom on the one hand, and a sufficiently accurate way of modeling the acquisition-related degradation on the other. This paper proposes a software phantom that approximates a human brain to a high degree of realism and that can incorporate complex brain-like structural features. We refer to it as a Diffusion BRAIN (D-BRAIN) phantom. Also, we propose an accurate model of a (DW) MRI acquisition protocol to allow for validation of methods in realistic conditions with data imperfections. The phantom model simulates anatomical and diffusion properties for multiple brain tissue components, and can serve as a ground-truth to evaluate FT algorithms, among others. The simulation of the acquisition process allows one to include noise, partial volume effects, and limited spatial and angular resolution in the images. In this way, the effect of image artifacts on, for instance, fiber tractography can be investigated with great detail. The proposed framework enables reliable and quantitative evaluation of DW-MR image processing and FT algorithms at the level of large-scale WM structures. The effect of noise levels and other data characteristics on cortico-cortical connectivity and tractography-based grey matter parcellation can be investigated as well. PMID:26930054

  12. D-BRAIN: Anatomically Accurate Simulated Diffusion MRI Brain Data

    PubMed Central

    Perrone, Daniele; Jeurissen, Ben; Aelterman, Jan; Roine, Timo; Sijbers, Jan; Pizurica, Aleksandra; Leemans, Alexander; Philips, Wilfried

    2016-01-01

    Diffusion Weighted (DW) MRI allows for the non-invasive study of water diffusion inside living tissues. As such, it is useful for the investigation of human brain white matter (WM) connectivity in vivo through fiber tractography (FT) algorithms. Many DW-MRI tailored restoration techniques and FT algorithms have been developed. However, it is not clear how accurately these methods reproduce the WM bundle characteristics in real-world conditions, such as in the presence of noise, partial volume effect, and a limited spatial and angular resolution. The difficulty lies in the lack of a realistic brain phantom on the one hand, and a sufficiently accurate way of modeling the acquisition-related degradation on the other. This paper proposes a software phantom that approximates a human brain to a high degree of realism and that can incorporate complex brain-like structural features. We refer to it as a Diffusion BRAIN (D-BRAIN) phantom. Also, we propose an accurate model of a (DW) MRI acquisition protocol to allow for validation of methods in realistic conditions with data imperfections. The phantom model simulates anatomical and diffusion properties for multiple brain tissue components, and can serve as a ground-truth to evaluate FT algorithms, among others. The simulation of the acquisition process allows one to include noise, partial volume effects, and limited spatial and angular resolution in the images. In this way, the effect of image artifacts on, for instance, fiber tractography can be investigated with great detail. The proposed framework enables reliable and quantitative evaluation of DW-MR image processing and FT algorithms at the level of large-scale WM structures. The effect of noise levels and other data characteristics on cortico-cortical connectivity and tractography-based grey matter parcellation can be investigated as well. PMID:26930054
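
    A standard way to make simulated MR data realistic, and the kind of degradation the phantom framework above is meant to include, is to corrupt the noise-free magnitude signal with Rician noise: Gaussian noise on the real and imaginary channels followed by taking the magnitude. The sketch below illustrates that step on a hypothetical signal array; it is not the D-BRAIN code itself.

        # Add Rician noise to a noise-free (DW-)MRI magnitude signal.
        import numpy as np

        def add_rician_noise(signal: np.ndarray, sigma: float, rng=None) -> np.ndarray:
            """Gaussian noise on real and imaginary channels, then take the magnitude."""
            rng = np.random.default_rng() if rng is None else rng
            real = signal + rng.normal(0.0, sigma, signal.shape)
            imag = rng.normal(0.0, sigma, signal.shape)
            return np.sqrt(real**2 + imag**2)

        clean = np.full((64, 64), 100.0)                     # hypothetical noise-free slice
        noisy = add_rician_noise(clean, sigma=100.0 / 20.0)  # roughly SNR 20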

  13. Mill profiler machines soft materials accurately

    NASA Technical Reports Server (NTRS)

    Rauschl, J. A.

    1966-01-01

    Mill profiler machines bevels, slots, and grooves in soft materials, such as styrofoam phenolic-filled cores, to any desired thickness. A single operator can accurately control cutting depths in contour or straight line work.

  14. Remote balance weighs accurately amid high radiation

    NASA Technical Reports Server (NTRS)

    Eggenberger, D. N.; Shuck, A. B.

    1969-01-01

    Commercial beam-type balance, modified and outfitted with electronic controls and digital readout, can be remotely controlled for use in high radiation environments. This allows accurate weighing of breeder-reactor fuel pieces when they are radioactively hot.

  15. Understanding the Code: keeping accurate records.

    PubMed

    Griffith, Richard

    2015-10-01

    In his continuing series looking at the legal and professional implications of the Nursing and Midwifery Council's revised Code of Conduct, Richard Griffith discusses the elements of accurate record keeping under Standard 10 of the Code. This article considers the importance of accurate record keeping for the safety of patients and protection of district nurses. The legal implications of records are explained along with how district nurses should write records to ensure these legal requirements are met. PMID:26418404

  16. A Reproducible Oral Microcosm Biofilm Model for Testing Dental Materials

    PubMed Central

    Rudney, J.D.; Chen, R.; Lenton, P.; Li, J.; Li, Y.; Jones, R.S.; Reilly, C.; Fok, A.S.; Aparicio, C.

    2012-01-01

    Aims Most studies of biofilm effects on dental materials use single-species biofilms, or consortia. Microcosm biofilms grown directly from saliva or plaque are much more diverse, but difficult to characterize. We used the Human Oral Microbial Identification Microarray (HOMIM) to validate a reproducible oral microcosm model. Methods and Results Saliva and dental plaque were collected from adults and children. Hydroxyapatite and dental composite disks were inoculated with either saliva or plaque, and microcosm biofilms were grown in a CDC biofilm reactor. In later experiments, the reactor was pulsed with sucrose. DNA from the inocula and microcosms was analyzed by HOMIM for 272 species. Microcosms included about 60% of species from the original inoculum. Biofilms grown on hydroxyapatite and composites were extremely similar. Sucrose-pulsing decreased diversity and pH, but increased the abundance of Streptococcus and Veillonella. Biofilms from the same donor, grown at different times, clustered together. Conclusions This model produced reproducible microcosm biofilms that were representative of the oral microbiota. Sucrose induced changes associated with dental caries. Significance and Impact of the Study This is the first use of HOMIM to validate an oral microcosm model that can be used to study the effects of complex biofilms on dental materials. PMID:22925110

  17. A precision translation stage for reproducing measured target volume motions.

    PubMed

    Litzenberg, Dale W; Hadley, Scott W; Lam, Kwok L; Balter, James M

    2007-01-01

    The development of 4D imaging, treatment planning and treatment delivery methods for radiation therapy requires the use of a high-precision translation stage for testing and validation. These technologies may require spatial resolutions of 1 mm, and temporal resolutions of 2-30 Hz for CT imaging, electromagnetic tracking, and fluoroscopic imaging. A 1D programmable translation stage capable of reproducing idealized and measured anatomic motions common to the thorax has been designed and built to meet these spatial and temporal resolution requirements with phantoms weighing up to 27 kg. The stage consists of a polycarbonate base and table, driven by an AC servo motor with encoder feedback by means of a belt-coupled precision screw. Complex motions are possible through a programmable motion controller that is capable of running multiple independent control and monitoring programs concurrently. Programmable input and output ports allow motion to be synchronized with beam delivery and other imaging and treatment delivery devices to within 2.0 ms. Average deviations from the programmed positions are typically 0.2 mm or less, while the average maximum positional errors are typically 0.5 mm, both for an indefinite number of idealized breathing motion cycles and while reproducing measured target volume motions for several minutes. PMID:17712294

  18. Fluctuation-Driven Neural Dynamics Reproduce Drosophila Locomotor Patterns

    PubMed Central

    Cruchet, Steeve; Gustafson, Kyle; Benton, Richard; Floreano, Dario

    2015-01-01

    The neural mechanisms determining the timing of even simple actions, such as when to walk or rest, are largely mysterious. One intriguing, but untested, hypothesis posits a role for ongoing activity fluctuations in neurons of central action selection circuits that drive animal behavior from moment to moment. To examine how fluctuating activity can contribute to action timing, we paired high-resolution measurements of freely walking Drosophila melanogaster with data-driven neural network modeling and dynamical systems analysis. We generated fluctuation-driven network models whose outputs—locomotor bouts—matched those measured from sensory-deprived Drosophila. From these models, we identified those that could also reproduce a second, unrelated dataset: the complex time-course of odor-evoked walking for genetically diverse Drosophila strains. Dynamical models that best reproduced both Drosophila basal and odor-evoked locomotor patterns exhibited specific characteristics. First, ongoing fluctuations were required. In a stochastic resonance-like manner, these fluctuations allowed neural activity to escape stable equilibria and to exceed a threshold for locomotion. Second, odor-induced shifts of equilibria in these models caused a depression in locomotor frequency following olfactory stimulation. Our models predict that activity fluctuations in action selection circuits cause behavioral output to more closely match sensory drive and may therefore enhance navigation in complex sensory environments. Together these data reveal how simple neural dynamics, when coupled with activity fluctuations, can give rise to complex patterns of animal behavior. PMID:26600381
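
    The central mechanism above is that ongoing activity fluctuations occasionally push the network state across a locomotion threshold. A minimal, generic illustration of that idea is an Ornstein-Uhlenbeck process with a fixed threshold, as sketched below; the parameters are arbitrary and the model is a caricature of the concept, not the authors' data-driven network.

        # Fluctuation-driven bout generation: an Ornstein-Uhlenbeck process relaxes
        # toward a stable equilibrium below threshold; noise occasionally drives it
        # above threshold, producing "locomotor bouts".
        import numpy as np

        rng = np.random.default_rng(1)
        dt, t_end = 0.01, 600.0            # seconds (arbitrary)
        tau, mu, sigma = 2.0, 0.0, 0.6     # relaxation time, equilibrium, noise strength
        threshold = 1.0

        n = int(t_end / dt)
        x = np.empty(n)
        x[0] = mu
        for i in range(1, n):
            x[i] = x[i - 1] + dt * (mu - x[i - 1]) / tau + sigma * np.sqrt(dt) * rng.standard_normal()

        walking = x > threshold
        print(f"fraction of time in a locomotor bout: {walking.mean():.2f}")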

  19. ASSESSMENT OF REPRODUCIBILITY OF SANDERS CLASSIFICATION FOR CALCANEAL FRACTURES

    PubMed Central

    Piovesana, Lucas Gonzaga; Lopes, Hériston Cristovam; Pacca, Daniel Moreira; Ninomiya, André Felipe; Dinato, Mauro César Mattos e; Pagnano, Rodrigo Gonçalves

    2016-01-01

    Objective : To assess intra- and interobserver reproducibility of Sanders Classification System of calcaneal fractures among experienced and less experienced observers. Methods : Forty-six CT scans of intra-articular calcaneal fractures were reviewed. Four observers, two with ten years of experience in foot and ankle surgery and two third-year residents in Orthopedics and Traumatology classified the fractures on two separate occasions three weeks apart from each other. The intra and inter-observer reliability was analyzed using the Kappa index. Results : There was good intraobserver reliability for the two experienced observers and one less experienced observer (Kappa values 0.640, 0.632 and 0.629, respectively). The interobserver reliability was fair between the experienced observers (Kappa = 0.289) and moderate among the less experienced observers (Kappa = 0.527). Conclusions : The Sanders Classification System showed good intraobserver reliability, but interobserver reproducibility below the ideal level, both among experienced and less experienced observers. Level of Evidence III, Diagnostic Studies. PMID:26981043
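
    Agreement in this and several of the other grading studies above is summarized with the kappa statistic, which corrects raw percent agreement for the agreement expected by chance. Below is a generic sketch of Cohen's kappa for two observers; the ratings are hypothetical, not the study's data.

        # Cohen's kappa: (observed agreement - chance agreement) / (1 - chance agreement).
        from collections import Counter

        obs_a = ["II", "III", "II", "IV", "III", "II", "III", "IV", "II", "III"]  # hypothetical grades
        obs_b = ["II", "III", "III", "IV", "III", "II", "II", "IV", "II", "III"]

        n = len(obs_a)
        p_observed = sum(a == b for a, b in zip(obs_a, obs_b)) / n

        count_a, count_b = Counter(obs_a), Counter(obs_b)
        p_chance = sum((count_a[c] / n) * (count_b[c] / n) for c in set(obs_a) | set(obs_b))

        kappa = (p_observed - p_chance) / (1 - p_chance)
        print(f"kappa = {kappa:.2f}")  # ~0.69 for these example ratings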

  20. Reproducing elder male power through ritual performance in Japan.

    PubMed

    Traphagan, J W

    2000-01-01

    Most research by gerontologists into the relationship between religion and aging has focused upon the potential health benefits of religious participation among Americans who follow Judeo-Christian oriented forms of worship and belief. This research has shown that both as a social institution and source of existential meaning, religion provides an important resource for older people in terms of fellowship and as a means of coping and adapting to social change and personal loss. Other religious traditions and other aspects of salience of religious participation for older people have been less thoroughly considered. This article investigates a religious ritual in Japan, that, rather than being a source of consolation, is an expression of symbolic capital associated with elder status and, thus, gerontocratic power. The ritual contributes to representing and reproducing the power of older residents in a rural Japanese community, partly due to its being administratively situated within an age-grade system that is a part of neighborhood political organization. Through its performance, the ritual visually reproduces and represents stratified social structures that concentrate power in the hands of male members of the senior age grade. PMID:14618004

  1. Reproducible Research Practices and Transparency across the Biomedical Literature

    PubMed Central

    Khoury, Muin J.; Schully, Sheri D.; Ioannidis, John P. A.

    2016-01-01

    There is a growing movement to encourage reproducibility and transparency practices in the scientific community, including public access to raw data and protocols, the conduct of replication studies, systematic integration of evidence in systematic reviews, and the documentation of funding and potential conflicts of interest. In this survey, we assessed the current status of reproducibility and transparency addressing these indicators in a random sample of 441 biomedical journal articles published in 2000–2014. Only one study provided a full protocol and none made all raw data directly available. Replication studies were rare (n = 4), and only 16 studies had their data included in a subsequent systematic review or meta-analysis. The majority of studies did not mention anything about funding or conflicts of interest. The percentage of articles with no statement of conflict decreased substantially between 2000 and 2014 (94.4% in 2000 to 34.6% in 2014); the percentage of articles reporting statements of conflicts (0% in 2000, 15.4% in 2014) or no conflicts (5.6% in 2000, 50.0% in 2014) increased. Articles published in journals in the clinical medicine category versus other fields were almost twice as likely to not include any information on funding and to have private funding. This study provides baseline data to compare future progress in improving these indicators in the scientific literature. PMID:26726926

  2. Interobserver Reproducibility of Histological Grading of Canine Simple Mammary Carcinomas.

    PubMed

    Santos, M; Correia-Gomes, C; Santos, A; de Matos, A; Dias-Pereira, P; Lopes, C

    2015-07-01

    Histological grading of canine mammary carcinomas (CMCs) has been performed using an adaptation of the human Nottingham method. The histological grade could be a prognostic factor in CMC; however, no data are available concerning interobserver variability in grading. In this study we analyzed the interobserver reproducibility between three observers when assigning individual parameter scores and grade to 46 CMCs. The influence of tumour size and vascular invasion and/or lymph node metastases on the odds of grading disagreement was also evaluated. The mean kappa values were 0.71, 0.51, 0.69 and 0.70 for tubule formation, nuclear pleomorphism, mitotic counts and grade, respectively. There was moderate to good agreement in scoring parameters and tumour grading, with nuclear pleomorphism being least reproducible. These findings are similar to those of human studies. The odds of grading disagreement increased with tumour size, but decreased with the presence of vascular invasion and/or lymph node metastases. Individual scoring differences were moderated by reaching a consensus between two observers. PMID:25979682

  3. Reproducible Research Practices and Transparency across the Biomedical Literature.

    PubMed

    Iqbal, Shareen A; Wallach, Joshua D; Khoury, Muin J; Schully, Sheri D; Ioannidis, John P A

    2016-01-01

    There is a growing movement to encourage reproducibility and transparency practices in the scientific community, including public access to raw data and protocols, the conduct of replication studies, systematic integration of evidence in systematic reviews, and the documentation of funding and potential conflicts of interest. In this survey, we assessed the current status of reproducibility and transparency addressing these indicators in a random sample of 441 biomedical journal articles published in 2000-2014. Only one study provided a full protocol and none made all raw data directly available. Replication studies were rare (n = 4), and only 16 studies had their data included in a subsequent systematic review or meta-analysis. The majority of studies did not mention anything about funding or conflicts of interest. The percentage of articles with no statement of conflict decreased substantially between 2000 and 2014 (94.4% in 2000 to 34.6% in 2014); the percentage of articles reporting statements of conflicts (0% in 2000, 15.4% in 2014) or no conflicts (5.6% in 2000, 50.0% in 2014) increased. Articles published in journals in the clinical medicine category versus other fields were almost twice as likely to not include any information on funding and to have private funding. This study provides baseline data to compare future progress in improving these indicators in the scientific literature. PMID:26726926

  4. Reproducibility in Nerve Morphometry: Comparison between Methods and among Observers

    PubMed Central

    Bilego Neto, Antônio Paulo da Costa; Silveira, Fernando Braga Cassiano; Rodrigues da Silva, Greice Anne; Sanada, Luciana Sayuri; Fazan, Valéria Paula Sassoli

    2013-01-01

    We investigated the reproducibility of a semiautomated method (computerized with manual intervention) for nerve morphometry (counting and measuring myelinated fibers) between three observers with different levels of expertise and experience with the method. Comparisons between the automatic (fully computerized) and semiautomated morphometric methods performed by the same computer software using the same nerve images were also performed. Sural nerves of normal adult rats were used. Automatic and semiautomated morphometry of the myelinated fibers were performed with the computer software KS-400. Semiautomated morphometry was conducted independently by the three observers on the same images. Automatic morphometry overestimated the myelin sheath area, thus overestimating the myelinated fiber size and underestimating the axon size. The overestimation in the fiber size distributions was 0.5 μm. For the semiautomated morphometry, no differences were found between observers for myelinated fiber and axon size distributions. Overestimation of the myelin sheath size of normal fibers by the fully automatic method might have an impact when morphometry is used for diagnostic purposes. We suggest that not only can semiautomated morphometry results be compared between different centers in clinical trials, but the method can also be performed by more than one investigator in a single experiment, being reliable and reproducible. PMID:23841086

  5. Robust Reproducible Resting State Networks in the Awake Rodent Brain

    PubMed Central

    Becerra, Lino; Pendse, Gautam; Chang, Pei-Ching; Bishop, James; Borsook, David

    2011-01-01

    Resting state networks (RSNs) have been studied extensively with functional MRI in humans in health and disease to reflect brain function in the un-stimulated state as well as reveal how the brain is altered with disease. Rodent models of disease have been used comprehensively to understand the biology of the disease as well as in the development of new therapies. Reported RSN studies in rodents, however, are few, and most studies are performed with anesthetized rodents, in which anesthesia might alter networks and differ from the non-anesthetized state. Acquiring RSN data in the awake rodent avoids the issues of anesthesia effects on brain function. Using high field fMRI we determined RSNs in awake rats using an independent component analysis (ICA) approach; however, ICA can produce a large number of components, some with biological relevance (networks). We further applied a novel method to determine networks that are robust and reproducible among all the components found with ICA. This analysis indicates that 7 networks are robust and reproducible in the rat, and their putative roles are discussed. PMID:22028788

  6. Modified Taylor reproducing formulas and h-p clouds

    NASA Astrophysics Data System (ADS)

    Zuppa, Carlos

    2008-03-01

    We study two different approximations of a multivariate function $f$ by operators of the form $\sum_{i=1}^{N}\mathcal{T}_{r}[f,x_{i}](x)\,\mathcal{W}_{i}(x)$, where $\{\mathcal{W}_{i}\}$ is an $m$-reproducing partition of unity and $\mathcal{T}_{r}[f,x_{i}](x)$ are modified Taylor polynomials of degree $r$ expanded at $x_{i}$. The first approximation was introduced by Xuli (2003) in the univariate case and generalized to convex domains by Guessab et al. (2005). The second was introduced by Duarte (1995) and proved in the univariate case. In this paper, we first relax Guessab's convexity assumption and prove Duarte's reproduction formula in the multivariate case. Then, we introduce two related reproducing quasi-interpolation operators in Sobolev spaces. A weighted error estimate and Jackson-type inequalities for h-p cloud function spaces are obtained. Finally, numerical examples are analyzed to show the approximation power of the method.

  7. Data reproducibility of pace strategy in a laboratory test run

    PubMed Central

    de França, Elias; Xavier, Ana Paula; Hirota, Vinicius Barroso; Côrrea, Sônia Cavalcanti; Caperuto, Érico Chagas

    2016-01-01

    This data paper contains data related to a reproducibility test for running pacing strategy in an intermittent running test until exhaustion. Ten participants underwent a crossover study (test and retest) with an intermittent running test. The test was composed of three-minute sets (at 1 km/h above Onset Blood Lactate Accumulation) until volitional exhaustion. To assess changes in pacing strategy, participants chose the rest time interval (RTI) between sets (ranging from 30 to 60 s) in the first test; in the second test, the maximum allowed RTI was the value chosen in the first test, and participants could rest less if desired. To verify the reproducibility of the test, rating of perceived exertion (RPE), heart rate (HR) and blood plasma lactate concentration ([La]p) were collected at rest, immediately after each set and at the end of the tests. RTI, RPE, HR, [La]p and time to exhaustion were not statistically different (p>0.05) between test and retest, and all demonstrated good intraclass correlation. PMID:27081672

  8. Reproducibility of Neonate Ocular Circulation Measurements Using Laser Speckle Flowgraphy.

    PubMed

    Matsumoto, Tadashi; Itokawa, Takashi; Shiba, Tomoaki; Katayama, Yuji; Arimura, Tetsushi; Mizukaki, Norio; Yoda, Hitoshi; Hori, Yuichi

    2015-01-01

    Measuring the ocular blood flow in neonates may clarify the relationships between eye diseases and ocular circulation abnormalities. However, no method for noninvasively measuring ocular circulation in neonates is established. We used laser speckle flowgraphy (LSFG) modified for neonates to measure their ocular circulation and investigated whether this method is reproducible. During their normal sleep, we studied 16 subjects (adjusted age of 34-48 weeks) whose blood flow could be measured three consecutive times. While the subjects slept in the supine position, three mean blur rate (MBR) values of the optic nerve head (ONH) were obtained: the MBR-A (mean of all values), MBR-V (vessel mean), and MBR-T (tissue mean), and nine blood flow pulse waveform parameters in the ONH were examined. We analyzed the coefficient of variation (COV) and the intraclass correlation coefficient (ICC) for each parameter. The COVs of the MBR values were all ≤ 10%. The ICCs of the MBR values were all >0.8. Good COVs were observed for the blowout score, blowout time, rising rate, falling rate, and acceleration time index. Although the measurement of ocular circulation in the neonates was difficult, our results exhibited reproducibility, suggesting that this method could be used in clinical research. PMID:26557689

  9. Reproducibility of Neonate Ocular Circulation Measurements Using Laser Speckle Flowgraphy

    PubMed Central

    Matsumoto, Tadashi; Itokawa, Takashi; Shiba, Tomoaki; Katayama, Yuji; Arimura, Tetsushi; Mizukaki, Norio; Yoda, Hitoshi; Hori, Yuichi

    2015-01-01

    Measuring the ocular blood flow in neonates may clarify the relationships between eye diseases and ocular circulation abnormalities. However, no method for noninvasively measuring ocular circulation in neonates is established. We used laser speckle flowgraphy (LSFG) modified for neonates to measure their ocular circulation and investigated whether this method is reproducible. During their normal sleep, we studied 16 subjects (adjusted age of 34–48 weeks) whose blood flow could be measured three consecutive times. While the subjects slept in the supine position, three mean blur rate (MBR) values of the optic nerve head (ONH) were obtained: the MBR-A (mean of all values), MBR-V (vessel mean), and MBR-T (tissue mean), and nine blood flow pulse waveform parameters in the ONH were examined. We analyzed the coefficient of variation (COV) and the intraclass correlation coefficient (ICC) for each parameter. The COVs of the MBR values were all ≤10%. The ICCs of the MBR values were all >0.8. Good COVs were observed for the blowout score, blowout time, rising rate, falling rate, and acceleration time index. Although the measurement of ocular circulation in the neonates was difficult, our results exhibited reproducibility, suggesting that this method could be used in clinical research. PMID:26557689

  10. A Telescope Inventor's Spyglass Possibly Reproduced in a Brueghel's Painting

    NASA Astrophysics Data System (ADS)

    Molaro, P.; Selvelli, P.

    2011-06-01

    Jan Brueghel the Elder depicted spyglasses belonging to the Archduke Albert VII of Habsburg in at least five paintings in the period between 1608 and 1625. Albert VII was fascinated by art and science and he obtained spyglasses directly from Lipperhey and Sacharias Janssen approximately at the time when the telescope was first shown at The Hague at the end of 1608. In the Extensive Landscape with View of the Castle of Mariemont, dated 1608-1612, the Archduke is looking at his Mariemont castle through an optical tube; this is the earliest known painting of a spyglass. It is quite possible that the painting reproduces one of the first telescopes ever made. Two other of Albert VII's telescopes are prominently reproduced in two Allegories of Sight painted a few years later (1617-1618). They are sophisticated instruments, and their structure, in particular the shape of the eyepiece, suggests that they are composed of two convex lenses in a Keplerian optical configuration, which came into common use only more than two decades later. If this is the case, these paintings are the first available record of a Keplerian telescope.

  11. Reproducibility of UAV-based photogrammetric surface models

    NASA Astrophysics Data System (ADS)

    Anders, Niels; Smith, Mike; Cammeraat, Erik; Keesstra, Saskia

    2016-04-01

    Soil erosion, rapid geomorphological change and vegetation degradation are major threats to the human and natural environment in many regions. Unmanned Aerial Vehicles (UAVs) and Structure-from-Motion (SfM) photogrammetry are invaluable tools for the collection of highly detailed aerial imagery and subsequent low cost production of 3D landscapes for an assessment of landscape change. Despite the widespread use of UAVs for image acquisition in monitoring applications, the reproducibility of UAV data products has not been explored in detail. This paper investigates this reproducibility by comparing the surface models and orthophotos derived from different UAV flights that vary in flight direction and altitude. The study area is located near Lorca, Murcia, SE Spain, which is a semi-arid medium-relief locale. The area is comprised of terraced agricultural fields that have been abandoned for about 40 years and have suffered subsequent damage through piping and gully erosion. In this work we focused upon variation in cell size, vertical and horizontal accuracy, and horizontal positioning of recognizable landscape features. The results suggest that flight altitude has a significant impact on reconstructed point density and related cell size, whilst flight direction affects the spatial distribution of vertical accuracy. The horizontal positioning of landscape features is relatively consistent between the different flights. We conclude that UAV data products are suitable for monitoring campaigns for land cover purposes or geomorphological mapping, but special care is required when used for monitoring changes in elevation.

  12. Assessment of Modeling Capability for Reproducing Storm Impacts on TEC

    NASA Astrophysics Data System (ADS)

    Shim, J. S.; Kuznetsova, M. M.; Rastaetter, L.; Bilitza, D.; Codrescu, M.; Coster, A. J.; Emery, B. A.; Foerster, M.; Foster, B.; Fuller-Rowell, T. J.; Huba, J. D.; Goncharenko, L. P.; Mannucci, A. J.; Namgaladze, A. A.; Pi, X.; Prokhorov, B. E.; Ridley, A. J.; Scherliess, L.; Schunk, R. W.; Sojka, J. J.; Zhu, L.

    2014-12-01

    During a geomagnetic storm, the energy transferred from the solar wind to the magnetosphere-ionosphere system adversely affects communication and navigation systems. Quantifying storm impacts on TEC (Total Electron Content) and assessing the capability of models to reproduce those impacts are important for specifying and forecasting space weather. In order to quantify storm impacts on TEC, we considered several parameters: TEC changes compared to quiet time (the day before the storm), TEC differences between 24-hour intervals, and the maximum increase/decrease during the storm. We investigated the spatial and temporal variations of the parameters during the 2006 AGU storm event (14-15 Dec. 2006) using ground-based GPS TEC measurements in eight selected 5-degree longitude sectors. The latitudinal variations were also studied in the two of the eight sectors with relatively better data coverage. We obtained modeled TEC from various ionosphere/thermosphere (IT) models. The parameters from the models were compared with each other and with the observed values. We quantified the performance of the models in reproducing the TEC variations during the storm using skill scores. This study has been supported by the Community Coordinated Modeling Center (CCMC) at the Goddard Space Flight Center. Model outputs and observational data used for the study will be permanently posted at the CCMC website (http://ccmc.gsfc.nasa.gov) for the space science communities to use.

  13. A Bayesian Perspective on the Reproducibility Project: Psychology

    PubMed Central

    Etz, Alexander; Vandekerckhove, Joachim

    2016-01-01

    We revisit the results of the recent Reproducibility Project: Psychology by the Open Science Collaboration. We compute Bayes factors—a quantity that can be used to express comparative evidence for a hypothesis as well as for the null hypothesis—for a large subset (N = 72) of the original papers and their corresponding replication attempts. In our computation, we take into account the likely scenario that publication bias had distorted the originally published results. Overall, 75% of studies gave qualitatively similar results in terms of the amount of evidence provided. However, the evidence was often weak (i.e., Bayes factor < 10). The majority of the studies (64%) did not provide strong evidence for either the null or the alternative hypothesis in either the original or the replication, and no replication attempts provided strong evidence in favor of the null. In all cases where the original paper provided strong evidence but the replication did not (15%), the sample size in the replication was smaller than the original. Where the replication provided strong evidence but the original did not (10%), the replication sample size was larger. We conclude that the apparent failure of the Reproducibility Project to replicate many target effects can be adequately explained by overestimation of effect sizes (or overestimation of evidence against the null hypothesis) due to small sample sizes and publication bias in the psychological literature. We further conclude that traditional sample sizes are insufficient and that a more widespread adoption of Bayesian methods is desirable. PMID:26919473
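
    As a hedged illustration of the kind of quantity involved, the sketch below computes a default JZS Bayes factor for a one-sample (or paired) t statistic following Rouder et al. (2009); it is not the publication-bias-corrected model used in the study, and the t value, sample size, and prior scale are assumptions.

      # Sketch: default JZS Bayes factor (BF10) for a one-sample/paired t statistic,
      # after Rouder et al. (2009). Not the bias-corrected model from the study above;
      # t, n, and the Cauchy prior scale r are illustrative assumptions.
      import numpy as np
      from scipy import integrate

      def jzs_bf10(t, n, r=np.sqrt(2) / 2):
          """Bayes factor for H1 (Cauchy(0, r) prior on effect size) vs H0 (effect = 0)."""
          nu = n - 1
          # Marginal likelihood under H0 (up to a constant shared with H1).
          m0 = (1 + t**2 / nu) ** (-(nu + 1) / 2)
          # Marginal likelihood under H1: integrate over the JZS prior on g.
          def integrand(g):
              return ((1 + n * g * r**2) ** -0.5
                      * (1 + t**2 / ((1 + n * g * r**2) * nu)) ** (-(nu + 1) / 2)
                      * (2 * np.pi) ** -0.5 * g ** -1.5 * np.exp(-1 / (2 * g)))
          m1, _ = integrate.quad(integrand, 0, np.inf)
          return m1 / m0

      print(round(jzs_bf10(t=2.5, n=30), 2))   # moderate evidence for H1
      print(round(jzs_bf10(t=0.4, n=30), 2))   # BF10 < 1: evidence leaning toward H0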

  14. Robust reproducible resting state networks in the awake rodent brain.

    PubMed

    Becerra, Lino; Pendse, Gautam; Chang, Pei-Ching; Bishop, James; Borsook, David

    2011-01-01

    Resting state networks (RSNs) have been studied extensively with functional MRI in humans in health and disease to reflect brain function in the un-stimulated state as well as reveal how the brain is altered with disease. Rodent models of disease have been used comprehensively to understand the biology of the disease as well as in the development of new therapies. Reported RSN studies in rodents, however, are few, and most are performed with anesthetized rodents, which might alter networks relative to the non-anesthetized state. Acquiring RSN data in the awake rodent avoids the issues of anesthesia effects on brain function. Using high field fMRI we determined RSNs in awake rats using an independent component analysis (ICA) approach; however, ICA can produce a large number of components, only some of which have biological relevance (networks). We further applied a novel method to determine which networks are robust and reproducible among all the components found with ICA. This analysis indicates that 7 networks are robust and reproducible in the rat, and their putative roles are discussed. PMID:22028788

  15. Effect of Soil Moisture Content on the Splash Phenomenon Reproducibility

    PubMed Central

    Ryżak, Magdalena; Bieganowski, Andrzej; Polakowski, Cezary

    2015-01-01

    One of the methods for testing splash (the first phase of water erosion) may be an analysis of photos taken using so-called high-speed cameras. The aim of this study was to determine the reproducibility of measurements using a single-drop splash of simulated precipitation. The drops producing the splash fell from a height of 1.5 m. Tests were carried out using two types of soil: Eutric Cambisol (loamy silt) and Orthic Luvisol (sandy loam); three initial pressure heads were applied, equal to 16 kPa, 3.1 kPa, and 0.1 kPa. Images for one, five, and 10 drops were recorded at a rate of 2000 frames per second. It was found that (i) the dispersion of soil caused by the striking of the 1st drop was significantly different from the splash impact caused by subsequent drops; (ii) with every drop, the splash phenomenon proceeded more reproducibly, that is, the numbers of particles of soil and/or water that splashed were increasingly close to each other; (iii) the number of particles detached during the splash was strongly correlated with its surface area; and (iv) the thicker the water film on the surface, the smaller the width of the crown. PMID:25785859

  16. Data reproducibility of pace strategy in a laboratory test run.

    PubMed

    de França, Elias; Xavier, Ana Paula; Hirota, Vinicius Barroso; Côrrea, Sônia Cavalcanti; Caperuto, Érico Chagas

    2016-06-01

    This data paper contains data related to a reproducibility test for running pacing strategy in an intermittent running test until exhaustion. Ten participants underwent a crossover study (test and retest) with an intermittent running test. The test was composed of three-minute sets (at 1 km/h above Onset Blood Lactate Accumulation) until volitional exhaustion. To assess changes in pacing strategy, participants chose the rest time interval (RTI) between sets (ranging from 30 to 60 s) in the first test; in the second test, the maximum allowed RTI was the value chosen in the first test, and participants could rest less if desired. To verify the reproducibility of the test, rating of perceived exertion (RPE), heart rate (HR) and blood plasma lactate concentration ([La]p) were collected at rest, immediately after each set and at the end of the tests. RTI, RPE, HR, [La]p and time to exhaustion were not statistically different (p>0.05) between test and retest, and all demonstrated good intraclass correlation. PMID:27081672

  17. Nonperturbative amplification of inhomogeneities in a self-reproducing universe

    NASA Astrophysics Data System (ADS)

    Linde, Andrei; Linde, Dmitri; Mezhlumian, Arthur

    1996-08-01

    We investigate the distribution of energy density in a stationary self-reproducing inflationary universe. We show that the main fraction of volume of the Universe in a state with a given density ρ at any given moment of proper time t is concentrated near the centers of deep exponentially wide spherically symmetric wells in the density distribution. Since this statement is very surprising and counterintuitive, we perform our investigation by three different analytical methods to verify our conclusions, and then confirm our analytical results by computer simulations. If one assumes that we are typical observers living in the Universe at a given moment of time, then our results may imply that we should live near the center of a deep and exponentially large void, which we will call an infloid. The validity of this particular interpretation of our results is not quite clear since it depends on the as-yet unsolved problem of measure in quantum cosmology. Therefore, at the moment we would prefer to consider our results simply as a demonstration of nontrivial properties of the hypersurface of a given time in the fractal self-reproducing universe, without making any far-reaching conclusions concerning the structure of our own part of the Universe. Still we believe that our results may be of some importance since they demonstrate that nonperturbative effects in quantum cosmology, at least in principle, may have significant observational consequences, including an apparent violation of the Copernican principle.

  18. Accurate 12D dipole moment surfaces of ethylene

    NASA Astrophysics Data System (ADS)

    Delahaye, Thibault; Nikitin, Andrei V.; Rey, Michael; Szalay, Péter G.; Tyuterev, Vladimir G.

    2015-10-01

    Accurate ab initio full-dimensional dipole moment surfaces of ethylene are computed using the coupled-cluster approach and its explicitly correlated counterpart CCSD(T)-F12, combined with the cc-pVQZ and cc-pVTZ-F12 basis sets, respectively. Their analytical representations are provided through 4th-order normal mode expansions. First-principles predictions of the line intensities using a variational method up to J = 30 are in excellent agreement with the experimental data in the range of 0-3200 cm-1. Errors of 0.25-6.75% in integrated intensities for fundamental bands are comparable with experimental uncertainties. The overall calculated C2H4 opacity in the 600-3300 cm-1 range agrees with the experimental determination to better than 0.5%.

  19. Reproducibility of an aerobic endurance test for nonexpert swimmers

    PubMed Central

    Veronese da Costa, Adalberto; Costa, Manoel da Cunha; Carlos, Daniel Medeiros; Guerra, Luis Marcos de Medeiros; Silva, Antônio José; Barbosa, Tiago Manoel Cabral dos Santos

    2012-01-01

    Background: This study aimed to verify the reproducibility of an aerobic test for determining nonexpert swimmers’ endurance. Methods: The sample consisted of 24 male swimmers (age: 22.79 ± 3.90 years; weight: 74.72 ± 11.44 kg; height: 172.58 ± 4.99 cm; and fat percentage: 15.19% ± 3.21%), who swim for 1 hour three times a week. A new instrument was used in this study (a Progressive Swim Test): the swimmer wore an underwater MP3 player and increased their swimming speed on hearing a beep after every 25 meters. Each swimmer’s heart rate was recorded before the test (BHR) and again after the test (AHR). The rating of perceived exertion (RPE) and the number of laps performed (NLP) were also recorded. The sample size was estimated using G*Power software (v 3.0.10; Franz Faul, Kiel University, Kiel, Germany). The descriptive values were expressed as mean and standard deviation. After confirming the normality of the data using both the Shapiro–Wilk and Levene tests, a paired t-test was performed to compare the data. The Pearson’s linear correlation (r) and intraclass correlation coefficient (ICC) tests were used to determine relative reproducibility. The standard error of measurement (SEM) and the coefficient of variation (CV) were used to determine absolute reproducibility. The limits of agreement and the bias of the absolute and relative values between days were determined by Bland–Altman plots. All values had a significance level of P < 0.05. Results: There were significant differences in AHR (P = 0.03) and NLP (P = 0.01) between the 2 days of testing. The obtained values were r > 0.50 and ICC > 0.66. The SEM had a variation of ±2% and the CV was <10%. Most cases were within the upper and lower limits of the Bland–Altman plots, suggesting agreement between the results. The applicability of NLP showed greater robustness (r and ICC > 0.90; SEM < 1%; CV < 3%), indicating that the other variables can be used to predict incremental changes in the physiological condition
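
    Since the record above (and several others in this list) summarizes absolute reproducibility with Bland–Altman limits of agreement, a minimal sketch of that computation follows; the test/retest arrays are placeholders, not data from the study.

      # Minimal sketch: Bland-Altman bias and 95% limits of agreement for a
      # test-retest design. The arrays are placeholders, not the study's data.
      import numpy as np

      test = np.array([42.0, 55.0, 47.0, 60.0, 52.0, 49.0, 58.0, 45.0])
      retest = np.array([44.0, 53.0, 49.0, 61.0, 50.0, 50.0, 57.0, 47.0])

      diff = retest - test
      bias = diff.mean()                      # systematic difference between days
      sd = diff.std(ddof=1)
      loa = (bias - 1.96 * sd, bias + 1.96 * sd)

      print(f"bias = {bias:.2f}, 95% limits of agreement = [{loa[0]:.2f}, {loa[1]:.2f}]")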

  20. Building Consensus on Community Standards for Reproducible Science

    NASA Astrophysics Data System (ADS)

    Lehnert, K. A.; Nielsen, R. L.

    2015-12-01

    As geochemists, we have traditionally relied on a model in which standard methods for generating, presenting, and using data were established through input from the community, the results of seminal studies, and a variety of authoritative bodies, a process that required a great deal of time. The rate of technological and related policy change has accelerated to the point that this historical model does not satisfy the needs of the community, publishers, or funders. The development of a new mechanism for building consensus raises a number of questions: Which aspects of our data are the focus of reproducibility standards? Who sets the standards? How do we subdivide the development of the consensus? We propose an open, transparent, and inclusive approach to the development of data and reproducibility standards that is organized around specific sub-disciplines and driven by the community of practitioners in those sub-disciplines. It should involve editors, program managers, and representatives of domain data facilities as well as professional societies, but avoid making any single group the final authority. A successful example of this model is the Editors Roundtable, a cross section of editors, funders, and data facility managers that discussed and agreed on leading practices for the reporting of geochemical data in publications, including accessibility and format of the data, data quality information, and metadata and identifiers for samples (Goldstein et al., 2014). We argue that the development of data and reproducibility standards needs to rely heavily on representatives from the community of practitioners to set priorities and provide perspective. Groups of editors, practicing scientists, and other stakeholders would be assigned the task of reviewing existing practices and recommending changes as deemed necessary. They would weigh the costs and benefits of changing the standards for that community, propose appropriate tools to facilitate those changes, work through the professional societies

  1. Data management routines for reproducible research using the G-Node Python Client library.

    PubMed

    Sobolev, Andrey; Stoewer, Adrian; Pereira, Michael; Kellner, Christian J; Garbers, Christian; Rautenberg, Philipp L; Wachtler, Thomas

    2014-01-01

    Structured, efficient, and secure storage of experimental data and associated meta-information constitutes one of the most pressing technical challenges in modern neuroscience, particularly in electrophysiology. The German INCF Node aims to provide open-source solutions for this domain that support the scientific data management and analysis workflow, and thus facilitate future data access and reproducible research. G-Node provides a data management system, accessible through an application interface, that is based on a combination of standardized data representation and flexible data annotation to account for the variety of experimental paradigms in electrophysiology. The G-Node Python Library exposes these services to the Python environment, enabling researchers to organize and access their experimental data using their familiar tools while gaining the advantages that centralized storage entails. The library provides powerful query features, including data slicing and selection by metadata, as well as fine-grained permission control for collaboration and data sharing. Here we demonstrate key actions in working with experimental neuroscience data, such as building a metadata structure, organizing recorded data in datasets, annotating data, or selecting data regions of interest, that can be automated to a large degree using the library. Compliant with existing de facto standards, the G-Node Python Library is compatible with many Python tools in the field of neurophysiology and thus enables seamless integration of data organization into the scientific data workflow. PMID:24634654

  2. Light Field Imaging Based Accurate Image Specular Highlight Removal

    PubMed Central

    Wang, Haoqian; Xu, Chenxue; Wang, Xingzheng; Zhang, Yongbing; Peng, Bo

    2016-01-01

    Specular reflection removal is indispensable to many computer vision tasks. However, most existing methods fail or degrade in complex real scenarios because of their individual drawbacks. Benefiting from light field imaging technology, this paper proposes a novel and accurate approach to remove specularity and improve image quality. We first capture images with specularity using a light field camera (Lytro ILLUM). After accurately estimating the image depth, a simple and concise threshold strategy is adopted to cluster the specular pixels into “unsaturated” and “saturated” categories. Finally, a color variance analysis of multiple views and a local color refinement are conducted on the two categories, respectively, to recover diffuse color information. Experimental evaluation against existing methods, based on our light field dataset together with the Stanford light field archive, verifies the effectiveness of the proposed algorithm. PMID:27253083
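
    Purely to illustrate the thresholding idea sketched above (separating candidate specular pixels into saturated and unsaturated groups before applying different recovery strategies), here is a small Python example; the image, masks, and threshold values are assumptions, not the paper's algorithm.

      # Illustrative only: split candidate specular pixels into "saturated" vs
      # "unsaturated" groups with simple intensity thresholds. The synthetic image
      # and the threshold values are assumptions, not the paper's actual method.
      import numpy as np

      rng = np.random.default_rng(1)
      img = rng.integers(0, 256, size=(64, 64, 3)).astype(np.uint8)   # stand-in RGB image

      intensity = img.mean(axis=2)      # per-pixel brightness
      min_channel = img.min(axis=2)     # specular pixels tend to be bright in all channels

      specular = (intensity > 200) & (min_channel > 150)   # candidate specular pixels
      saturated = specular & (img.max(axis=2) >= 250)      # clipped highlights
      unsaturated = specular & ~saturated                  # bright but not clipped

      print("specular:", int(specular.sum()),
            "saturated:", int(saturated.sum()),
            "unsaturated:", int(unsaturated.sum()))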

  3. Multimodal Spatial Calibration for Accurately Registering EEG Sensor Positions

    PubMed Central

    Chen, Shengyong; Xiao, Gang; Li, Xiaoli

    2014-01-01

    This paper proposes a fast and accurate calibration method to calibrate multiple multimodal sensors using a novel photogrammetry system for fast localization of EEG sensors. The EEG sensors are placed on human head and multimodal sensors are installed around the head to simultaneously obtain all EEG sensor positions. A multiple views' calibration process is implemented to obtain the transformations of multiple views. We first develop an efficient local repair algorithm to improve the depth map, and then a special calibration body is designed. Based on them, accurate and robust calibration results can be achieved. We evaluate the proposed method by corners of a chessboard calibration plate. Experimental results demonstrate that the proposed method can achieve good performance, which can be further applied to EEG source localization applications on human brain. PMID:24803954

  4. Multimodal spatial calibration for accurately registering EEG sensor positions.

    PubMed

    Zhang, Jianhua; Chen, Jian; Chen, Shengyong; Xiao, Gang; Li, Xiaoli

    2014-01-01

    This paper proposes a fast and accurate calibration method to calibrate multiple multimodal sensors using a novel photogrammetry system for fast localization of EEG sensors. The EEG sensors are placed on human head and multimodal sensors are installed around the head to simultaneously obtain all EEG sensor positions. A multiple views' calibration process is implemented to obtain the transformations of multiple views. We first develop an efficient local repair algorithm to improve the depth map, and then a special calibration body is designed. Based on them, accurate and robust calibration results can be achieved. We evaluate the proposed method by corners of a chessboard calibration plate. Experimental results demonstrate that the proposed method can achieve good performance, which can be further applied to EEG source localization applications on human brain. PMID:24803954

  5. Reproducible, Scalable Fusion Gene Detection from RNA-Seq.

    PubMed

    Arsenijevic, Vladan; Davis-Dusenbery, Brandi N

    2016-01-01

    Chromosomal rearrangements resulting in the creation of novel gene products, termed fusion genes, have been identified as driving events in the development of multiple types of cancer. As these gene products typically do not exist in normal cells, they represent valuable prognostic and therapeutic targets. Advances in next-generation sequencing and computational approaches have greatly improved our ability to detect and identify fusion genes. Nevertheless, these approaches require significant computational resources. Here we describe an approach which leverages cloud computing technologies to perform fusion gene detection from RNA sequencing data at any scale. We additionally highlight methods to enhance reproducibility of bioinformatics analyses which may be applied to any next-generation sequencing experiment. PMID:26667464

  6. GigaDB: promoting data dissemination and reproducibility.

    PubMed

    Sneddon, Tam P; Zhe, Xiao Si; Edmunds, Scott C; Li, Peter; Goodman, Laurie; Hunter, Christopher I

    2014-01-01

    Often papers are published where the underlying data supporting the research are not made available because of the limitations of making such large data sets publicly and permanently accessible. Even if the raw data are deposited in public archives, the essential analysis intermediaries, scripts or software are frequently not made available, meaning the science is not reproducible. The GigaScience journal is attempting to address this issue with the associated data storage and dissemination portal, the GigaScience database (GigaDB). Here we present the current version of GigaDB and reveal plans for the next generation of improvements. However, most importantly, we are soliciting responses from you, the users, to ensure that future developments are focused on the data storage and dissemination issues that still need resolving. Database URL: http://www.gigadb.org. PMID:24622612

  7. Investigating the reproducibility of a complex multifocal radiosurgery treatment

    NASA Astrophysics Data System (ADS)

    Niebanck, M.; Juang, T.; Newton, J.; Adamovics, J.; Wang, Z.; Oldham, M.

    2013-06-01

    Stereotactic radiosurgery has become a widely used technique to treat solid tumors and secondary metastases of the brain. Multiple targets can be treated simultaneously with a single isocenter to reduce set-up time and improve patient comfort and workflow. In this study, a 5-arc multifocal RapidArc treatment was delivered to multiple PRESAGE® dosimeters in order to explore the repeatability of the treatment. The three delivery measurements agreed well with each other, with less than 3% standard deviation of dose in the target. The deliveries also agreed well with the treatment plan, with gamma passing rates greater than 90% (5% dose-difference and 2 mm distance-to-agreement criteria). The optical-CT PRESAGE® system provided a reproducible measurement for treatment verification, provided measurements were made immediately following treatment.

  8. Reproducing the kinematics of damped Lyman α systems

    NASA Astrophysics Data System (ADS)

    Bird, Simeon; Haehnelt, Martin; Neeleman, Marcel; Genel, Shy; Vogelsberger, Mark; Hernquist, Lars

    2015-02-01

    We examine the kinematic structure of damped Lyman α systems (DLAs) in a series of cosmological hydrodynamic simulations using the AREPO code. We are able to match the distribution of velocity widths of associated low-ionization metal absorbers substantially better than earlier work. Our simulations produce a population of DLAs dominated by haloes with virial velocities around 70 km s-1, consistent with a picture of relatively small, faint objects. In addition, we reproduce the observed correlation between velocity width and metallicity and the equivalent width distribution of Si II. Some discrepancies of moderate statistical significance remain; too many of our spectra show absorption concentrated at the edge of the profile and there are slight differences in the exact shape of the velocity width distribution. We show that the improvement over previous work is mostly due to our strong feedback from star formation and our detailed modelling of the metal ionization state.

  9. Reproducibility in density functional theory calculations of solids.

    PubMed

    Lejaeghere, Kurt; Bihlmayer, Gustav; Björkman, Torbjörn; Blaha, Peter; Blügel, Stefan; Blum, Volker; Caliste, Damien; Castelli, Ivano E; Clark, Stewart J; Dal Corso, Andrea; de Gironcoli, Stefano; Deutsch, Thierry; Dewhurst, John Kay; Di Marco, Igor; Draxl, Claudia; Dułak, Marcin; Eriksson, Olle; Flores-Livas, José A; Garrity, Kevin F; Genovese, Luigi; Giannozzi, Paolo; Giantomassi, Matteo; Goedecker, Stefan; Gonze, Xavier; Grånäs, Oscar; Gross, E K U; Gulans, Andris; Gygi, François; Hamann, D R; Hasnip, Phil J; Holzwarth, N A W; Iuşan, Diana; Jochym, Dominik B; Jollet, François; Jones, Daniel; Kresse, Georg; Koepernik, Klaus; Küçükbenli, Emine; Kvashnin, Yaroslav O; Locht, Inka L M; Lubeck, Sven; Marsman, Martijn; Marzari, Nicola; Nitzsche, Ulrike; Nordström, Lars; Ozaki, Taisuke; Paulatto, Lorenzo; Pickard, Chris J; Poelmans, Ward; Probert, Matt I J; Refson, Keith; Richter, Manuel; Rignanese, Gian-Marco; Saha, Santanu; Scheffler, Matthias; Schlipf, Martin; Schwarz, Karlheinz; Sharma, Sangeeta; Tavazza, Francesca; Thunström, Patrik; Tkatchenko, Alexandre; Torrent, Marc; Vanderbilt, David; van Setten, Michiel J; Van Speybroeck, Veronique; Wills, John M; Yates, Jonathan R; Zhang, Guo-Xu; Cottenier, Stefaan

    2016-03-25

    The widespread popularity of density functional theory has given rise to an extensive range of dedicated codes for predicting molecular and crystalline properties. However, each code implements the formalism in a different way, raising questions about the reproducibility of such predictions. We report the results of a community-wide effort that compared 15 solid-state codes, using 40 different potentials or basis set types, to assess the quality of the Perdew-Burke-Ernzerhof equations of state for 71 elemental crystals. We conclude that predictions from recent codes and pseudopotentials agree very well, with pairwise differences that are comparable to those between different high-precision experiments. Older methods, however, have less precise agreement. Our benchmark provides a framework for users and developers to document the precision of new applications and methodological improvements. PMID:27013736
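
    To give a concrete sense of how two codes' equations of state can be compared numerically, the sketch below evaluates a Δ-style gauge, i.e. the RMS energy difference per atom between two E(V) curves over a volume window; the toy equation-of-state parameters are placeholders, and this is an illustration of the idea rather than the paper's exact protocol.

      # Sketch of a Delta-style comparison between two equation-of-state curves
      # E1(V) and E2(V): the RMS energy difference per atom over a volume window.
      # The toy EOS parameters below are placeholders, not values from the study.
      import numpy as np

      def toy_eos(V, E0, V0, B0):
          """Simple harmonic-in-volume toy EOS around the equilibrium volume V0."""
          return E0 + 0.5 * B0 * (V - V0) ** 2 / V0

      V = np.linspace(17.0, 23.0, 401)                    # volume grid (A^3/atom)
      E1 = toy_eos(V, E0=0.000, V0=20.0, B0=0.50)         # "code 1" fit (eV/atom)
      E2 = toy_eos(V, E0=0.001, V0=20.1, B0=0.52)         # "code 2" fit, slightly different

      # Uniform grid, so the mean of the squared difference approximates the
      # integral of (E1 - E2)^2 divided by the width of the volume window.
      delta = np.sqrt(np.mean((E1 - E2) ** 2))
      print(f"Delta = {delta * 1000:.2f} meV/atom")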

  10. geoknife: Reproducible web-processing of large gridded datasets

    USGS Publications Warehouse

    Read, Jordan S.; Walker, Jordan I.; Appling, Alison P.; Blodgett, David L.; Read, Emily Kara; Winslow, Luke A.

    2016-01-01

    Geoprocessing of large gridded data according to overlap with irregular landscape features is common to many large-scale ecological analyses. The geoknife R package was created to facilitate reproducible analyses of gridded datasets found on the U.S. Geological Survey Geo Data Portal web application or elsewhere, using a web-enabled workflow that eliminates the need to download and store large datasets that are reliably hosted on the Internet. The package provides access to several data subset and summarization algorithms that are available on remote web processing servers. Outputs from geoknife include spatial and temporal data subsets, spatially-averaged time series values filtered by user-specified areas of interest, and categorical coverage fractions for various land-use types.

  11. Reproducibility of cardiac biomarkers response to prolonged treadmill exercise.

    PubMed

    Tian, Ye; Nie, Jinlei; George, Keith P; Huang, Chuanye

    2014-03-01

    We examined the reproducibility of alterations in cardiac biomarkers after two identical bouts of prolonged exercise in young athletes. Serum high-sensitivity cardiac troponin T (hs-cTnT) and N-terminal pro-brain natriuretic peptide (NT-proBNP) levels were assessed before and after exercise. Significant rises in median hs-cTnT and NT-proBNP occurred in both trials. While the absolute changes in hs-cTnT were smaller after trial 2, the pattern of change was similar and the delta scores were significantly related. However, the change in NT-proBNP was not correlated between trials. The hs-cTnT release demonstrates some consistency after exercise, although the blunted hs-cTnT response requires further study. PMID:24451016

  12. Reproducibility of pre-syncopal responses to repeated orthostatic challenge

    NASA Astrophysics Data System (ADS)

    Goswami, Nandu; Grasser, Erik; Roessler, Andreas; Hinghofer-Szalkay, Helmut

    Aims: To study individual patterns of hemodynamic adjustments in subjects reaching orthostatically induced presyncope and to observe whether these are reproducible across three runs. Procedures and methods: 10 healthy young males were subjected to extreme cardiovascular stress three times: Graded orthostatic stress (GOS), consisting of head-up tilt combined with lower body negative pressure, was used to achieve a pre-syncopal end-point. All test runs were separated by two-week intervals. Orthostatic effects on cardiac and vascular function were continuously monitored and standing times noted. Results: Across the group, heart rate (HR) increased 112 percent, while mean arterial blood pressure dropped by 15 percent, pulse pressure by 36 percent, and stroke volume index by 51 percent on average from supine control to presyncope. Repetitions of the orthostatic protocols did not influence the standing times of the test persons from the 1st to the 3rd trial (15 ± 6 to 17 ± 7 min). Some individuals responded with an increase in HR only, while others responded with a combined, albeit brief, increase in HR and total peripheral resistance, and this individual-specific pattern was observed across the three runs of combined GOS. Conclusion: Strategies for maintaining blood pressure in response to orthostatically induced central hypovolemia differ between subjects. However, the same individual-specific hemodynamic mechanism is employed each time the subject is reconfronted with this stress. Individual patterns of hemodynamic adjustments to orthostatic stress are highly reproducible when these subjects reach pre-syncope three times.

  13. An International Ki67 Reproducibility Study in Adrenal Cortical Carcinoma.

    PubMed

    Papathomas, Thomas G; Pucci, Eugenio; Giordano, Thomas J; Lu, Hao; Duregon, Eleonora; Volante, Marco; Papotti, Mauro; Lloyd, Ricardo V; Tischler, Arthur S; van Nederveen, Francien H; Nose, Vania; Erickson, Lori; Mete, Ozgur; Asa, Sylvia L; Turchini, John; Gill, Anthony J; Matias-Guiu, Xavier; Skordilis, Kassiani; Stephenson, Timothy J; Tissier, Frédérique; Feelders, Richard A; Smid, Marcel; Nigg, Alex; Korpershoek, Esther; van der Spek, Peter J; Dinjens, Winand N M; Stubbs, Andrew P; de Krijger, Ronald R

    2016-04-01

    Despite the established role of Ki67 labeling index in prognostic stratification of adrenocortical carcinomas and its recent integration into treatment flow charts, the reproducibility of the assessment method has not been determined. The aim of this study was to investigate interobserver variability among endocrine pathologists using a web-based virtual microscopy approach. Ki67-stained slides of 76 adrenocortical carcinomas were analyzed independently by 14 observers, each according to their method of preference including eyeballing, formal manual counting, and digital image analysis. The interobserver variation was statistically significant (P<0.001) in the absence of any correlation between the various methods. Subsequently, 61 static images were distributed among 15 observers who were instructed to follow a category-based scoring approach. Low levels of interobserver (F=6.99; Fcrit=1.70; P<0.001) as well as intraobserver concordance (n=11; Cohen κ ranging from -0.057 to 0.361) were detected. To improve harmonization of Ki67 analysis, we tested the utility of an open-source Galaxy virtual machine application, namely Automated Selection of Hotspots, in 61 virtual slides. The software-provided Ki67 values were validated by digital image analysis in identical images, displaying a strong correlation of 0.96 (P<0.0001) and dividing the cases into 3 classes (cutoffs of 0%-15%-30% and/or 0%-10%-20%) with significantly different overall survivals (P<0.05). We conclude that current practices in Ki67 scoring assessment vary greatly, and interobserver variation sets particular limitations to its clinical utility, especially around clinically relevant cutoff values. Novel digital microscopy-enabled methods could provide critical aid in reducing variation, increasing reproducibility, and improving reliability in the clinical setting. PMID:26685085
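
    The intraobserver agreement above is summarized with Cohen's κ; as a brief reminder of how such a category-based agreement statistic is computed, the Python sketch below uses scikit-learn, with two invented score vectors standing in for repeated reads (they are not the study's data).

      # Sketch: Cohen's kappa for category-based Ki67 scores from two reads.
      # The score vectors are invented placeholders, not data from the study above.
      from sklearn.metrics import cohen_kappa_score

      # Ki67 classes, e.g. 0: <15%, 1: 15-30%, 2: >30% (cutoffs as in the abstract).
      read_1 = [0, 1, 1, 2, 0, 2, 1, 0, 2, 1]
      read_2 = [0, 1, 2, 2, 0, 1, 1, 0, 2, 0]

      kappa = cohen_kappa_score(read_1, read_2)
      print(f"Cohen's kappa = {kappa:.2f}")   # 1.0 = perfect agreement, 0 = chance level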

  14. The effect of saline iontophoresis on skin integrity in human volunteers. I. Methodology and reproducibility.

    PubMed

    Camel, E; O'Connell, M; Sage, B; Gross, M; Maibach, H

    1996-08-01

    This study, conducted in 36 human volunteers, was an evaluation of the effects of saline iontophoresis on skin temperature, irritation, and barrier function. The major objectives were to assess the effects of low-level ionic currents, to validate the proposed methodology of assessment, and to establish reproducibility in repeated saline iontophoresis applications. This was the first of a multistage study designed to assess the safety of 24-hr saline iontophoresis episodes at selected currents and current densities. Since an iontophoresis patch challenges the skin barrier both by occluding the skin surface and by passing ionic current through the skin, the experimental protocol was designed to permit measurement of the contribution of each of these processes to the overall response. In this first stage we investigated the effect of 10 min of current delivery, at 0.1 mA/cm2 on a 1-cm2 area patch and 0.2 mA/cm2 on a 6.5-cm2 area patch compared to unpowered control patches. Twelve subjects were tested under each condition on two separate occasions to examine reproducibility of the response variable measurements. A further 12 subjects were tested once under the 0.2 mA/cm2, 6.5-cm2 condition. Skin irritation was evaluated via repeated measurements of transepidermal water loss, capacitance, skin temperature, skin color, and a visual scoring system, before the iontophoresis episode and after patch removal. No damage to skin barrier function in terms of skin-water loss or skin-water content was detected. Slight, subclinical, short-lasting erythema was observed for both conditions. Assessment of correlation coefficients showed highly statistically significant indications of reproducibility for all five response variables measured. The experimental design, in combination with a repeated measures analysis, provided clear separation of the occlusion and ionic current components of the iontophoretic patch challenge. Further, the repeated measures analysis gave a highly sensitive

  15. Reproducibility of measuring cerebral blood flow by laser-Doppler flowmetry in mice.

    PubMed

    Tajima, Yosuke; Takuwa, Hiroyuki; Kawaguchi, Hiroshi; Masamoto, Kazuto; Ikoma, Yoko; Seki, Chie; Taniguchi, Junko; Kanno, Iwao; Saeki, Naokatsu; Ito, Hiroshi

    2014-01-01

    Laser-Doppler flowmetry has been widely used to trace hemodynamic changes in experimental stroke research. The purpose of the present study was to evaluate the day-to-day test-retest reproducibility of measuring cerebral blood flow by LDF in awake mice. The flux indicating cerebral blood flow (CBF), red blood cell (RBC) velocity, and RBC concentration were measured with LDF via cranial windows for the bilateral somatosensory cortex in awake mice. LDF measurements were performed three times, at baseline, 1 hour after, and 7 days after the baseline measurement. Moreover, breathing rate (BR) and partial pressure of transcutaneous CO₂ (PtCO₂) were measured simultaneously with LDF measurement. Intraclass correlation coefficient (ICC) and within-subject coefficient of variation (CVw) were calculated. CBF, RBC velocity, and RBC concentration showed good day-to-day test-retest reproducibility (ICC: 0.61 - 0.95, CVw: 8.3% - 15.4%). BR and PtCO₂ in awake mice were stable during the course of the experiments. The evaluation of cerebral microcirculation using LDF appears to be applicable to long-term studies. PMID:24389142
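
    As a hedged sketch of the two reproducibility statistics reported above (a one-way intraclass correlation coefficient and the within-subject coefficient of variation), the following Python code computes them from a subjects × sessions matrix of repeated measurements; the matrix is a synthetic placeholder, and the study may have used a different ICC variant.

      # Sketch: one-way ICC(1,1) and within-subject coefficient of variation (CVw)
      # for repeated measurements. Rows = subjects, columns = repeated sessions.
      # The data matrix is a synthetic placeholder, not the study's measurements.
      import numpy as np

      x = np.array([[98.0, 101.0, 97.0],
                    [110.0, 108.0, 113.0],
                    [85.0, 88.0, 84.0],
                    [102.0, 99.0, 104.0],
                    [93.0, 95.0, 92.0]])
      n, k = x.shape

      subject_means = x.mean(axis=1)

      ms_between = k * subject_means.var(ddof=1)               # between-subject mean square
      ms_within = ((x - subject_means[:, None]) ** 2).sum() / (n * (k - 1))

      icc_1_1 = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

      # CVw: root mean square of each subject's SD relative to its own mean, in percent.
      cv_w = 100 * np.sqrt(np.mean((x.std(axis=1, ddof=1) / subject_means) ** 2))

      print(f"ICC(1,1) = {icc_1_1:.2f}, CVw = {cv_w:.1f}%")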

  16. PH Tester Gauge Repeatability and Reproducibility Study for WO3 Nanostructure Hydrothermal Growth Process

    NASA Astrophysics Data System (ADS)

    Abd Rashid, Amirul; Hayati Saad, Nor; Bien Chia Sheng, Daniel; Yee, Lee Wai

    2014-06-01

    The pH value is one of the important variables in the tungsten trioxide (WO3) nanostructure hydrothermal synthesis process. The morphology of the synthesized nanostructure can be properly controlled by measuring and controlling the pH value of the solution used in this facile synthesis route. Therefore, it is crucial to ensure that the gauge used for pH measurement is reliable in order to achieve the expected result. In this study, the gauge repeatability and reproducibility (GR&R) method was used to assess the repeatability and reproducibility of the pH tester. Based on the ANOVA method, the experimental design and results were analyzed using Minitab software. It was found that the initial GR&R value for the tester was 17.55%, which is considered acceptable. To further improve the GR&R level, a new pH measuring procedure was introduced. With the new procedure, the GR&R value was reduced to 2.05%, which means the tester is statistically well suited to measuring the pH of the solution prepared for the WO3 hydrothermal synthesis process.
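
    To make the %GR&R figure above concrete, here is a hedged Python sketch of the standard ANOVA-based gauge R&R decomposition for a balanced parts × operators study, reporting repeatability plus reproducibility as a share of total variation; the measurement array is a made-up placeholder, and this is not the study's Minitab analysis.

      # Sketch: ANOVA-based gauge R&R for a balanced crossed design
      # (parts x operators x replicates). The data are made-up placeholders;
      # this is not the Minitab analysis from the study above.
      import numpy as np

      x = np.array([                      # shape: (parts, operators, replicates)
          [[5.01, 5.03], [5.02, 5.05], [5.00, 5.02]],
          [[6.10, 6.12], [6.15, 6.13], [6.11, 6.14]],
          [[4.52, 4.50], [4.55, 4.53], [4.51, 4.54]],
          [[5.75, 5.78], [5.80, 5.77], [5.76, 5.79]],
      ])
      p, o, r = x.shape
      grand = x.mean()

      # Sums of squares for the two-way crossed ANOVA with replication.
      part_means = x.mean(axis=(1, 2))
      oper_means = x.mean(axis=(0, 2))
      cell_means = x.mean(axis=2)
      ss_part = o * r * ((part_means - grand) ** 2).sum()
      ss_oper = p * r * ((oper_means - grand) ** 2).sum()
      ss_inter = r * ((cell_means - part_means[:, None] - oper_means[None, :] + grand) ** 2).sum()
      ss_error = ((x - cell_means[:, :, None]) ** 2).sum()

      ms_part = ss_part / (p - 1)
      ms_oper = ss_oper / (o - 1)
      ms_inter = ss_inter / ((p - 1) * (o - 1))
      ms_error = ss_error / (p * o * (r - 1))

      # Variance components (negative estimates clipped to zero).
      var_repeat = ms_error
      var_inter = max((ms_inter - ms_error) / r, 0.0)
      var_oper = max((ms_oper - ms_inter) / (p * r), 0.0)
      var_part = max((ms_part - ms_inter) / (o * r), 0.0)

      var_grr = var_repeat + var_oper + var_inter       # repeatability + reproducibility
      pct_grr = 100 * np.sqrt(var_grr / (var_grr + var_part))
      print(f"%GR&R = {pct_grr:.1f}%")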

  17. Chimeric Mice with Competent Hematopoietic Immunity Reproduce Key Features of Severe Lassa Fever

    PubMed Central

    Oestereich, Lisa; Lüdtke, Anja; Ruibal, Paula; Pallasch, Elisa; Kerber, Romy; Rieger, Toni; Wurr, Stephanie; Bockholt, Sabrina; Krasemann, Susanne

    2016-01-01

    Lassa fever (LASF) is a highly severe viral syndrome endemic to West African countries. Despite the annual high morbidity and mortality caused by LASF, very little is known about the pathophysiology of the disease. Basic research on LASF has been precluded due to the lack of relevant small animal models that reproduce the human disease. Immunocompetent laboratory mice are resistant to infection with Lassa virus (LASV) and, to date, only immunodeficient mice, or mice expressing human HLA, have shown some degree of susceptibility to experimental infection. Here, transplantation of wild-type bone marrow cells into irradiated type I interferon receptor knockout mice (IFNAR-/-) was used to generate chimeric mice that reproduced important features of severe LASF in humans. This included high lethality, liver damage, vascular leakage and systemic virus dissemination. In addition, this model indicated that T cell-mediated immunopathology was an important component of LASF pathogenesis that was directly correlated with vascular leakage. Our strategy allows easy generation of a suitable small animal model to test new vaccines and antivirals and to dissect the basic components of LASF pathophysiology. PMID:27191716

  18. Spatial mapping and statistical reproducibility of an array of 256 one-dimensional quantum wires

    SciTech Connect

    Al-Taie, H. Kelly, M. J.; Smith, L. W.; Lesage, A. A. J.; Griffiths, J. P.; Beere, H. E.; Jones, G. A. C.; Ritchie, D. A.; Smith, C. G.; See, P.

    2015-08-21

    We utilize a multiplexing architecture to measure the conductance properties of an array of 256 split gates. We investigate the reproducibility of the pinch-off and one-dimensional definition voltages as a function of spatial location over two different cooldowns, and after illuminating the device. The reproducibility of both properties across the two cooldowns is high, a result of the density of the two-dimensional electron gas returning to a similar state after thermal cycling. The spatial variation of the pinch-off voltage decreases after illumination; however, the variation of the one-dimensional definition voltage increases due to an anomalous feature in the center of the array. A technique which quantifies the homogeneity of split-gate properties across the array is developed which captures the experimentally observed trends. In addition, the one-dimensional definition voltage is used to probe the density of the wafer at each split gate in the array on a micron scale using a capacitive model.

  19. Chimeric Mice with Competent Hematopoietic Immunity Reproduce Key Features of Severe Lassa Fever.

    PubMed

    Oestereich, Lisa; Lüdtke, Anja; Ruibal, Paula; Pallasch, Elisa; Kerber, Romy; Rieger, Toni; Wurr, Stephanie; Bockholt, Sabrina; Pérez-Girón, José V; Krasemann, Susanne; Günther, Stephan; Muñoz-Fontela, César

    2016-05-01

    Lassa fever (LASF) is a highly severe viral syndrome endemic to West African countries. Despite the annual high morbidity and mortality caused by LASF, very little is known about the pathophysiology of the disease. Basic research on LASF has been precluded due to the lack of relevant small animal models that reproduce the human disease. Immunocompetent laboratory mice are resistant to infection with Lassa virus (LASV) and, to date, only immunodeficient mice, or mice expressing human HLA, have shown some degree of susceptibility to experimental infection. Here, transplantation of wild-type bone marrow cells into irradiated type I interferon receptor knockout mice (IFNAR-/-) was used to generate chimeric mice that reproduced important features of severe LASF in humans. This included high lethality, liver damage, vascular leakage and systemic virus dissemination. In addition, this model indicated that T cell-mediated immunopathology was an important component of LASF pathogenesis that was directly correlated with vascular leakage. Our strategy allows easy generation of a suitable small animal model to test new vaccines and antivirals and to dissect the basic components of LASF pathophysiology. PMID:27191716

  20. Novel TPLO Alignment Jig/Saw Guide Reproduces Freehand and Ideal Osteotomy Positions

    PubMed Central

    2016-01-01

    Objectives To evaluate the ability of an alignment jig/saw guide to reproduce appropriate osteotomy positions in the tibial plateau leveling osteotomy (TPLO) in the dog. Methods Lateral radiographs of 65 clinical TPLO procedures using an alignment jig and freehand osteotomy performed by experienced TPLO surgeons using a 24 mm radial saw blade between Dec 2005–Dec 2007 and Nov 2013–Nov 2015 were reviewed. The freehand osteotomy position was compared to potential osteotomy positions using the alignment jig/saw guide. The proximal and distal jig pin holes on postoperative radiographs were used to align the jig to the bone; saw guide position was selected to most closely match the osteotomy performed. The guide-to-osteotomy fit was categorized by the distance between the actual osteotomy and proposed saw guide osteotomy at its greatest offset (≤1 mm = excellent; ≤2 mm = good; ≤3 mm = satisfactory; >3 mm = poor). Results Sixty-four of 65 TPLO osteotomies could be matched satisfactorily by the saw guide. Proximal jig pin placement 3–4 mm from the joint surface and pin location in a craniocaudal plane on the proximal tibia were significantly associated with the guide-to-osteotomy fit (P = 0.021 and P = 0.047, respectively). Clinical Significance The alignment jig/saw guide can be used to reproduce appropriate freehand osteotomy position for TPLO. Furthermore, an ideal osteotomy position centered on the tibial intercondylar tubercles also is possible. Accurate placement of the proximal jig pin is a crucial step for correct positioning of the saw guide in either instance. PMID:27556230

  1. Reproducibility of corneal astigmatism measurements with a hand held keratometer in preschool children.

    PubMed Central

    Harvey, E M; Miller, J M; Dobson, V

    1995-01-01

    AIMS--To evaluate the overall accuracy and reproducibility of the Alcon portable autokeratometer (PAK) measurements in infants and young children. METHODS--The accuracy of the Alcon PAK in measuring toric reference surfaces (1, 3, 5, and 7 D) under various suboptimal measurement conditions was assessed, and the reproducibility of PAK measurements of corneal astigmatism in newborn infants (n = 5), children (n = 19, age 3-5 years), and adults (n = 14) was evaluated. RESULTS--Measurements of toric reference surfaces indicated (a) no significant effect of distance (17-30 mm) on accuracy of measurements, (b) no systematic relation between amount of toricity and accuracy of measurements, (c) no systematic relation between angle of measurement and accuracy, (d) no difference in accuracy of measurements when the PAK is hand held in comparison with when it is mounted, (e) no difference in accuracy of measurements when axis of toricity is oriented obliquely than when it is oriented horizontally, with respect to the PAK, and (f) a small positive bias (+0.16 D) in measurement of spherical equivalent. The PAK did not prove useful for screening newborns. However, measurements were successfully obtained from 18/19 children and 14/14 adults. There was no significant difference in median measurement deviation (deviation of a subject's five measurements from his/her mean) between children (0.21 D) and adults (0.13 D). CONCLUSIONS--The PAK produces accurate measurements of surface curvature under a variety of suboptimal conditions. Variability of PAK measurements in preschool children is small enough to suggest that it would be useful for screening for corneal astigmatism in young children. PMID:8534668

  2. Extreme Rainfall Events Over Southern Africa: Assessment of a Climate Model to Reproduce Daily Extremes

    NASA Astrophysics Data System (ADS)

    Williams, C.; Kniveton, D.; Layberry, R.

    2007-12-01

    It is increasingly accepted that any possible climate change will not only have an influence on mean climate but may also significantly alter climatic variability. This issue is of particular importance for environmentally vulnerable regions such as southern Africa. The subcontinent is considered especially vulnerable to extreme events, due to a number of factors including extensive poverty, disease and political instability. Rainfall variability and the identification of rainfall extremes are a function of scale, so high spatial and temporal resolution data are preferred to identify extreme events and accurately predict future variability. The majority of previous climate model verification studies have compared model output with observational data at monthly timescales. In this research, the assessment of a state-of-the-art climate model to simulate climate at daily timescales is carried out using satellite-derived rainfall data from the Microwave Infra-Red Algorithm (MIRA). This dataset covers the period from 1993-2002 and the whole of southern Africa at a spatial resolution of 0.1 degree longitude/latitude. Once the model's ability to reproduce extremes has been assessed, idealised regions of SST anomalies are used to force the model, with the overall aim of investigating the ways in which SST anomalies influence rainfall extremes over southern Africa. In this paper, results from sensitivity testing of the domain size of the UK Meteorological Office Hadley Centre's climate model are presented first. Then simulations of current climate from the model, operating in both regional and global mode, are compared to the MIRA dataset at daily timescales. Thirdly, the ability of the model to reproduce daily rainfall extremes will be assessed, again by a comparison with extremes from the MIRA dataset. Finally, the results from the idealised SST experiments are briefly presented, suggesting associations between rainfall extremes and both local and remote SST anomalies.

  3. Reproducibility of UAV-based earth surface topography based on structure-from-motion algorithms.

    NASA Astrophysics Data System (ADS)

    Clapuyt, François; Vanacker, Veerle; Van Oost, Kristof

    2014-05-01

    A representation of the earth surface at very high spatial resolution is crucial to accurately map small geomorphic landforms with high precision. Very high resolution digital surface models (DSM) can then be used to quantify changes in earth surface topography over time, based on differencing of DSMs taken at various moments in time. However, it is compulsory to have both high accuracy for each topographic representation and consistency between measurements over time, as DSM differencing automatically leads to error propagation. This study investigates the reproducibility of reconstructions of earth surface topography based on structure-from-motion (SFM) algorithms. To this end, we equipped an eight-propeller drone with a standard reflex camera. This equipment can easily be deployed in the field, as it is a lightweight, low-cost system in comparison with classic aerial photo surveys and terrestrial or airborne LiDAR scanning. Four sets of aerial photographs were created for one test field. The sets of airphotos differ in focal length and viewing angle, i.e. nadir view versus ground-level view. In addition, the importance of the accuracy of ground control points for the construction of a georeferenced point cloud was assessed using two different GPS devices with horizontal accuracies at the sub-meter and sub-decimeter level, respectively. Airphoto datasets were processed with the SFM algorithm and the resulting point clouds were georeferenced. Then, the surface representations were compared with each other to assess the reproducibility of the earth surface topography. Finally, consistency between independent datasets is discussed.
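
    A minimal sketch of the DSM-differencing step described above, assuming two co-registered elevation grids and per-survey vertical uncertainties (all values hypothetical); changes smaller than the propagated error are masked as noise.

      import numpy as np

      def dsm_difference(dsm_a, dsm_b, sigma_a, sigma_b, t=1.96):
          """DSM of difference plus a per-pixel significance mask.

          The propagated uncertainty of the difference is sqrt(sigma_a^2 + sigma_b^2);
          changes smaller than t times that value are treated as noise.
          """
          dod = dsm_b - dsm_a
          sigma_dod = np.sqrt(sigma_a ** 2 + sigma_b ** 2)
          return dod, np.abs(dod) > t * sigma_dod

      # Hypothetical co-registered elevation grids (metres) from two SFM surveys.
      dsm_t1 = np.random.default_rng(1).normal(100.0, 0.05, size=(200, 200))
      dsm_t2 = dsm_t1 + 0.02                     # imposed 2 cm surface change
      dod, mask = dsm_difference(dsm_t1, dsm_t2, sigma_a=0.03, sigma_b=0.03)
      print(f"mean change {dod.mean():.3f} m, significant pixels {100 * mask.mean():.1f}%")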

  4. Accurate strain measurements in highly strained Ge microbridges

    NASA Astrophysics Data System (ADS)

    Gassenq, A.; Tardif, S.; Guilloy, K.; Osvaldo Dias, G.; Pauc, N.; Duchemin, I.; Rouchon, D.; Hartmann, J.-M.; Widiez, J.; Escalante, J.; Niquet, Y.-M.; Geiger, R.; Zabel, T.; Sigg, H.; Faist, J.; Chelnokov, A.; Rieutord, F.; Reboud, V.; Calvo, V.

    2016-06-01

    Ge under high strain is predicted to become a direct bandgap semiconductor. Very large deformations can be introduced using microbridge devices. However, at the microscale, strain values are commonly deduced from Raman spectroscopy using empirical linear models only established up to ɛ100 = 1.2% for uniaxial stress. In this work, we calibrate the Raman-strain relation at higher strain using synchrotron-based microdiffraction. The Ge microbridges show unprecedented high tensile strain up to 4.9% corresponding to an unexpected Δω = 9.9 cm-1 Raman shift. We demonstrate experimentally and theoretically that the Raman-strain relation is not linear and we provide a more accurate expression.
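
    As an illustration of moving from a linear to a nonlinear Raman-strain calibration, the sketch below fits a quadratic shift-versus-strain relation to placeholder calibration points and inverts it; the coefficients are not the values reported in the paper.

      import numpy as np

      # Hypothetical calibration points: uniaxial <100> strain (%) vs Raman shift (cm-1).
      strain = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 4.9])
      shift = np.array([0.0, 2.2, 4.3, 6.3, 8.2, 9.9])     # placeholder values

      # Quadratic fit: delta_omega ~ a*eps^2 + b*eps + c (np.polyfit returns a, b, c).
      a, b, c = np.polyfit(strain, shift, deg=2)
      print(f"delta_omega = {a:.3f}*eps^2 + {b:.3f}*eps + {c:.3f}")

      # Invert the fitted relation to estimate strain from a measured shift.
      measured_shift = 5.0
      roots = np.roots([a, b, c - measured_shift])
      strains = [float(r.real) for r in roots if abs(r.imag) < 1e-9 and r.real > 0]
      print("estimated strain (%):", strains)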

  5. Fourier modeling of the BOLD response to a breath-hold task: Optimization and reproducibility.

    PubMed

    Pinto, Joana; Jorge, João; Sousa, Inês; Vilela, Pedro; Figueiredo, Patrícia

    2016-07-15

    Cerebrovascular reactivity (CVR) reflects the capacity of blood vessels to adjust their caliber in order to maintain a steady supply of brain perfusion, and it may provide a sensitive disease biomarker. Measurement of the blood oxygen level dependent (BOLD) response to a hypercapnia-inducing breath-hold (BH) task has been frequently used to map CVR noninvasively using functional magnetic resonance imaging (fMRI). However, the best modeling approach for the accurate quantification of CVR maps remains an open issue. Here, we compare and optimize Fourier models of the BOLD response to a BH task with a preparatory inspiration, and assess the test-retest reproducibility of the associated CVR measurements, in a group of 10 healthy volunteers studied over two fMRI sessions. Linear combinations of sine-cosine pairs at the BH task frequency and its successive harmonics were added sequentially in a nested models approach, and were compared in terms of the adjusted coefficient of determination and corresponding variance explained (VE) of the BOLD signal, as well as the number of voxels exhibiting significant BOLD responses, the estimated CVR values, and their test-retest reproducibility. The brain average VE increased significantly with the Fourier model order, up to the 3rd order. However, the number of responsive voxels increased significantly only up to the 2nd order, and started to decrease from the 3rd order onwards. Moreover, no significant relative underestimation of CVR values was observed beyond the 2nd order. Hence, the 2nd order model was concluded to be the optimal choice for the studied paradigm. This model also yielded the best test-retest reproducibility results, with intra-subject coefficients of variation of 12 and 16% and an intra-class correlation coefficient of 0.74. In conclusion, our results indicate that a Fourier series set consisting of a sine-cosine pair at the BH task frequency and its two harmonics is a suitable model for BOLD-fMRI CVR measurements.
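
    A minimal sketch of the 2nd-order Fourier model described above: a design matrix with sine-cosine pairs at the BH task frequency and its harmonics is fitted by ordinary least squares and scored with the adjusted coefficient of determination; scan count, TR and signal values are hypothetical.

      import numpy as np

      def fourier_design(n_scans, tr, task_period_s, order=2):
          """Sine-cosine pairs at the task frequency and its harmonics, plus an intercept."""
          t = np.arange(n_scans) * tr
          f0 = 1.0 / task_period_s
          cols = [np.ones(n_scans)]
          for h in range(1, order + 1):
              cols += [np.sin(2 * np.pi * h * f0 * t), np.cos(2 * np.pi * h * f0 * t)]
          return np.column_stack(cols)

      def fit_adjusted_r2(X, y):
          beta, *_ = np.linalg.lstsq(X, y, rcond=None)
          resid = y - X @ beta
          n, p = X.shape
          r2 = 1.0 - resid.var() / y.var()
          return beta, 1.0 - (1.0 - r2) * (n - 1) / (n - p)

      # Hypothetical BOLD time course: 200 scans, TR = 1.5 s, 60 s breath-hold cycle.
      rng = np.random.default_rng(2)
      X = fourier_design(200, tr=1.5, task_period_s=60.0, order=2)
      y = X @ np.array([100.0, 1.2, 0.8, 0.4, -0.3]) + rng.normal(0, 0.5, 200)
      beta, adj_r2 = fit_adjusted_r2(X, y)
      print("adjusted R^2:", round(float(adj_r2), 3))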

  6. Reproducibility test on a children's insole for measuring the dynamic plantar pressure distribution.

    PubMed

    Hayes, A; Seitz, P

    1997-04-01

    the treadmill and free walking data. There was no significant difference between these data. The 95% confidence intervals for the mean of the peak pressure under the big toe were 9% for the treadmill data and 16% for the free walking data. CONCLUSIONS: These results, together with other published data (McPoil et al. 1995) show that the pedar children's insole system provides accurate and reproducible measurements of the dynamic plantar pressure distribution. The clinician can therefore use this system with confidence as a therapeutic or rehabilitative tool. PMID:11415700
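
    For reference, the confidence-interval figure quoted above can be computed as follows (a normal-approximation sketch with hypothetical peak-pressure readings, not the study's data).

      import numpy as np

      def ci95_percent_of_mean(values):
          """Half-width of the 95% CI of the mean, as a percentage of the mean.

          Uses the normal approximation (1.96 * SEM); a t-quantile would be more
          exact for small samples.
          """
          values = np.asarray(values, dtype=float)
          sem = values.std(ddof=1) / np.sqrt(values.size)
          return 100.0 * 1.96 * sem / values.mean()

      # Hypothetical repeated peak-pressure readings under the big toe (kPa).
      treadmill = [215, 230, 221, 240, 226, 233, 219, 228]
      print(f"95% CI half-width: {ci95_percent_of_mean(treadmill):.1f}% of the mean")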

  7. Accurate Calculation of Hydration Free Energies using Pair-Specific Lennard-Jones Parameters in the CHARMM Drude Polarizable Force Field

    PubMed Central

    Baker, Christopher M.; Lopes, Pedro E. M.; Zhu, Xiao; Roux, Benoît; MacKerell, Alexander D.

    2010-01-01

    Lennard-Jones (LJ) parameters for a variety of model compounds have previously been optimized within the CHARMM Drude polarizable force field to reproduce accurately pure liquid phase thermodynamic properties as well as additional target data. While the polarizable force field resulting from this optimization procedure has been shown to satisfactorily reproduce a wide range of experimental reference data across numerous series of small molecules, a slight but systematic overestimate of the hydration free energies has also been noted. Here, the reproduction of experimental hydration free energies is greatly improved by the introduction of pair-specific LJ parameters between solute heavy atoms and water oxygen atoms that override the standard LJ parameters obtained from combining rules. The changes are small and a systematic protocol is developed for the optimization of pair-specific LJ parameters and applied to the development of pair-specific LJ parameters for alkanes, alcohols and ethers. The resulting parameters not only yield hydration free energies in good agreement with experimental values, but also provide a framework upon which other pair-specific LJ parameters can be added as new compounds are parametrized within the CHARMM Drude polarizable force field. Detailed analysis of the contributions to the hydration free energies reveals that the dispersion interaction is the main source of the systematic errors in the hydration free energies. This information suggests that the systematic error may result from problems with the LJ combining rules and is combined with analysis of the pair-specific LJ parameters obtained in this work to identify a preliminary improved combining rule. PMID:20401166
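
    A minimal sketch of the idea of overriding combining-rule LJ parameters with pair-specific values; the atom types and numbers below are placeholders, not CHARMM Drude parameters.

      import math

      # Per-atom-type LJ parameters (epsilon in kcal/mol, Rmin/2 in Angstrom) -- placeholders.
      lj_params = {
          "CT": (-0.078, 2.04),   # hypothetical alkane carbon
          "OW": (-0.152, 1.77),   # hypothetical water oxygen
      }

      # Pair-specific values that replace the combining-rule result for selected pairs.
      pair_overrides = {
          ("CT", "OW"): (-0.095, 3.85),   # hypothetical (epsilon_ij, Rmin_ij)
      }

      def lj_pair(type_i, type_j):
          """Return (epsilon_ij, Rmin_ij): the override if present, else Lorentz-Berthelot."""
          key = tuple(sorted((type_i, type_j)))
          if key in pair_overrides:
              return pair_overrides[key]
          eps_i, rmin_half_i = lj_params[type_i]
          eps_j, rmin_half_j = lj_params[type_j]
          # epsilon_ij = -sqrt(eps_i * eps_j); Rmin_ij = Rmin_i/2 + Rmin_j/2
          return -math.sqrt(eps_i * eps_j), rmin_half_i + rmin_half_j

      print(lj_pair("CT", "OW"))   # uses the pair-specific values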

  8. Question 8: From a Set of Chemical Reactions to Reproducing Cells

    NASA Astrophysics Data System (ADS)

    Kaneko, Kunihiko

    2007-10-01

    As a prerequisite for a reproducing cell, we first discuss how non-equilibrium states are sustained endogenously in a catalytic reaction network. Negative correlation between abundances of resource chemicals and of catalysts for their consumption is shown to lead to hindrance of relaxation to equilibrium. Mutual reinforcement among such sustainment of the non-equilibrium state, spatial structure formation, and reproduction of a compartment is discussed as a mechanism to further suppress the relaxation to equilibrium. As a next step toward a protocell, consistency between cell reproduction and replication of constituent molecules is theoretically studied, which leads to a power law in the abundance of molecules as well as a log-normal distribution over cells; both are shown to be universal, and are also confirmed experimentally in present-day cells.

  9. ENVIRONMENT: a computational platform to stochastically simulate reacting and self-reproducing lipid compartments

    NASA Astrophysics Data System (ADS)

    Mavelli, Fabio; Ruiz-Mirazo, Kepa

    2010-09-01

    'ENVIRONMENT' is a computational platform that has been developed in the last few years with the aim to simulate stochastically the dynamics and stability of chemically reacting protocellular systems. Here we present and describe some of its main features, showing how the stochastic kinetics approach can be applied to study the time evolution of reaction networks in heterogeneous conditions, particularly when supramolecular lipid structures (micelles, vesicles, etc) coexist with aqueous domains. These conditions are of special relevance to understand the origins of cellular, self-reproducing compartments, in the context of prebiotic chemistry and evolution. We contrast our simulation results with real lab experiments, with the aim to bring together theoretical and experimental research on protocell and minimal artificial cell systems.
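
    The stochastic kinetics approach referred to above is commonly implemented with a Gillespie-type algorithm; the toy sketch below simulates a single autocatalytic channel (R + X -> 2X) and is not the ENVIRONMENT code itself.

      import math
      import random

      def gillespie_autocatalysis(x0=10, resource0=1000, k=0.001, t_end=50.0, seed=1):
          """Stochastic simulation of the single channel R + X -> 2X."""
          random.seed(seed)
          t, x, r = 0.0, x0, resource0
          trajectory = [(t, x, r)]
          while t < t_end and r > 0:
              a = k * r * x                              # propensity of the reaction
              if a <= 0:
                  break
              t += -math.log(1.0 - random.random()) / a  # exponential waiting time
              r -= 1
              x += 1
              trajectory.append((t, x, r))
          return trajectory

      traj = gillespie_autocatalysis()
      print("final state (t, X, R):", traj[-1])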

  10. Bipedal spring-damper-mass model reproduces external mechanical power of human walking.

    PubMed

    Etenzi, Ettore; Monaco, Vito

    2015-08-01

    Previous authors have long investigated the behavior of different models of passive walkers with stiff or compliant limbs. We investigated a model of a bipedal mechanism whose limbs are provided with damping and elastic elements. This model is designed for walking along an inclined plane, in order to balance the energy lost in the damping elements with that gained from the lowering of the CoM. The proposed model is hence able to walk steadily. In particular, we investigated the stability of this model by using the Poincaré return map for different dynamical configurations. Then we compared the estimated external mechanical power with experimental data from the literature in order to validate the model. Results show that the model is able to reproduce the main features of the time course of the external mechanical power during the gait cycle. Accordingly, dissipative elements coupled with the limbs' compliant behavior represent a suitable paradigm for mimicking human locomotion. PMID:26736788
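
    A minimal sketch of a single spring-damper leg in the spirit of the model described (not the full bipedal mechanism): a point mass on a spring-damper integrated with explicit Euler, tracking the energy dissipated by the damper; all parameters are hypothetical.

      # Hypothetical parameters: body mass (kg), leg stiffness (N/m), leg damping (N s/m).
      m, k, c, g = 70.0, 15000.0, 300.0, 9.81
      L0 = 1.0                       # rest length of the leg spring (m)

      dt, T = 1e-4, 1.0
      y, v = 0.98, 0.0               # initial CoM height (m) and vertical velocity (m/s)
      dissipated = 0.0

      for _ in range(int(T / dt)):
          compression = max(L0 - y, 0.0)             # spring engages only in contact
          f_spring = k * compression
          f_damper = -c * v if compression > 0 else 0.0
          a = (f_spring + f_damper) / m - g          # vertical acceleration
          dissipated += abs(f_damper * v) * dt       # energy lost in the damper
          v += a * dt
          y += v * dt

      print(f"final CoM height {y:.3f} m, energy dissipated {dissipated:.2f} J")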

  11. Accurate pointing of tungsten welding electrodes

    NASA Technical Reports Server (NTRS)

    Ziegelmeier, P.

    1971-01-01

    Thoriated tungsten is pointed accurately and quickly by using sodium nitrite. The point produced is smooth and no effort is necessary to hold the tungsten rod concentric. The chemically produced point can be used several times longer than ground points. This method reduces the time and cost of preparing tungsten electrodes.

  12. Scallops skeletons as tools for accurate proxy calibration

    NASA Astrophysics Data System (ADS)

    Lorrain, A.; Paulet, Y.-M.; Chauvaud, L.; Dunbar, R.; Mucciarone, D.; Pécheyran, C.; Amouroux, D.; Fontugne, M.

    2003-04-01

    Bivalve skeletons can provide excellent geochemical proxies. However, general calibrations of those proxies rest on an approximate time basis because growth rhythms are poorly understood. In this context, the Great scallop, Pecten maximus, appears to be a powerful tool, as a daily growth deposit has been clearly identified for this species (Chauvaud et al, 1998; Lorrain et al, 2000), allowing accurate environmental calibration. Indeed, using this species, a date can be assigned to each growth increment, and as a consequence environmental parameters can be closely compared (at a daily scale) to observed chemical and structural shell variations. This daily record provides an unequivocal basis for calibrating proxies. Isotopic (Delta-13C and Delta-15N) and trace element analyses (LA-ICP-MS) have been performed on several individuals and different years, depending on the analysed parameter. Seawater parameters measured one meter above the sea bottom were compared to chemical variations in the calcitic shell. Their comparison showed that even with a daily basis for data interpretation, calibration is still a challenge. Inter-individual variations are found and correlations are not always reproducible from one year to the next. The first explanation could be an inaccurate appreciation of the proximate environment of the animal; notably, the water-sediment interface could best represent the Pecten maximus environment. Secondly, physiological factors could be invoked to explain those discrepancies. In particular, calcification takes place in the extrapallial fluid, whose composition might be very different from that of the external environment. Accurate calibration of chemical proxies should consider biological aspects to gain better insights into the processes controlling the incorporation of those chemical elements. Characterisation of the isotopic and trace element composition of the extrapallial fluid and hemolymph could greatly help our understanding of chemical shell variations.

  13. Developing simulations to reproduce in vivo fluoroscopy kinematics in total knee replacement patients.

    PubMed

    Fitzpatrick, Clare K; Komistek, Richard D; Rullkoetter, Paul J

    2014-07-18

    For clinically predictive testing and design-phase evaluation of prospective total knee replacement (TKR) implants, devices should ideally be evaluated under physiological loading conditions which incorporate population-level variability. A challenge exists for experimental and computational researchers in determining appropriate loading conditions for wear and kinematic knee simulators which reflect in vivo joint loading conditions. There is a great deal of kinematic data available from fluoroscopy studies. The purpose of this work was to develop computational methods to derive anterior-posterior (A-P) and internal-external (I-E) tibiofemoral (TF) joint loading conditions from in vivo kinematic data. Two computational models were developed, a simple TF model, and a more complex lower limb model. These models were driven through external loads applied to the tibia and femur in the TF model, and applied to the hip, ankle and muscles in the lower limb model. A custom feedback controller was integrated with the finite element environment and used to determine the external loads required to reproduce target kinematics at the TF joint. The computational platform was evaluated using in vivo kinematic data from four fluoroscopy patients, and reproduced in vivo A-P and I-E motions and compressive force with a root-mean-square (RMS) accuracy of less than 1 mm, 0.1°, and 40 N in the TF model and in vivo A-P and I-E motions, TF flexion, and compressive loads with a RMS accuracy of less than 1 mm, 0.1°, 1.4°, and 48 N in the lower limb model. The external loading conditions derived from these models can ultimately be used to establish population variability in loading conditions, for eventual use in computational as well as experimental activity simulations. PMID:24845696
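
    The feedback-control idea can be illustrated with a simple PID loop on a 1-D point mass that finds the external force reproducing a target displacement profile; this is only a sketch of the concept, not the authors' finite-element controller, and all gains and signals are hypothetical.

      import numpy as np

      def track_with_pid(target, dt=0.001, m=1.0, kp=400.0, ki=50.0, kd=40.0):
          """Drive a 1-D point mass so its position follows `target` (metres)."""
          x, v, integral, prev_err = 0.0, 0.0, 0.0, 0.0
          forces, positions = [], []
          for x_ref in target:
              err = x_ref - x
              integral += err * dt
              deriv = (err - prev_err) / dt
              f = kp * err + ki * integral + kd * deriv   # external load this step
              v += (f / m) * dt
              x += v * dt
              prev_err = err
              forces.append(f)
              positions.append(x)
          return np.array(forces), np.array(positions)

      # Hypothetical target A-P translation profile (5 mm sinusoid at 1 Hz).
      t = np.arange(0, 1, 0.001)
      target = 0.005 * np.sin(2 * np.pi * 1.0 * t)
      forces, achieved = track_with_pid(target)
      rms_err_mm = 1000 * np.sqrt(np.mean((achieved - target) ** 2))
      print(f"RMS tracking error: {rms_err_mm:.3f} mm")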

  14. Diet rapidly and reproducibly alters the human gut microbiome

    PubMed Central

    David, Lawrence A.; Maurice, Corinne F.; Carmody, Rachel N.; Gootenberg, David B.; Button, Julie E.; Wolfe, Benjamin E.; Ling, Alisha V.; Devlin, A. Sloan; Varma, Yug; Fischbach, Michael A.; Biddinger, Sudha B.; Dutton, Rachel J.; Turnbaugh, Peter J.

    2013-01-01

    Long-term diet influences the structure and activity of the trillions of microorganisms residing in the human gut1–5, but it remains unclear how rapidly and reproducibly the human gut microbiome responds to short-term macronutrient change. Here, we show that the short-term consumption of diets composed entirely of animal or plant products alters microbial community structure and overwhelms inter-individual differences in microbial gene expression. The animal-based diet increased the abundance of bile-tolerant microorganisms (Alistipes, Bilophila, and Bacteroides) and decreased the levels of Firmicutes that metabolize dietary plant polysaccharides (Roseburia, Eubacterium rectale, and Ruminococcus bromii). Microbial activity mirrored differences between herbivorous and carnivorous mammals2, reflecting trade-offs between carbohydrate and protein fermentation. Foodborne microbes from both diets transiently colonized the gut, including bacteria, fungi, and even viruses. Finally, increases in the abundance and activity of Bilophila wadsworthia on the animal-based diet support a link between dietary fat, bile acids, and the outgrowth of microorganisms capable of triggering inflammatory bowel disease6. In concert, these results demonstrate that the gut microbiome can rapidly respond to altered diet, potentially facilitating the diversity of human dietary lifestyles. PMID:24336217

  15. Diet rapidly and reproducibly alters the human gut microbiome.

    PubMed

    David, Lawrence A; Maurice, Corinne F; Carmody, Rachel N; Gootenberg, David B; Button, Julie E; Wolfe, Benjamin E; Ling, Alisha V; Devlin, A Sloan; Varma, Yug; Fischbach, Michael A; Biddinger, Sudha B; Dutton, Rachel J; Turnbaugh, Peter J

    2014-01-23

    Long-term dietary intake influences the structure and activity of the trillions of microorganisms residing in the human gut, but it remains unclear how rapidly and reproducibly the human gut microbiome responds to short-term macronutrient change. Here we show that the short-term consumption of diets composed entirely of animal or plant products alters microbial community structure and overwhelms inter-individual differences in microbial gene expression. The animal-based diet increased the abundance of bile-tolerant microorganisms (Alistipes, Bilophila and Bacteroides) and decreased the levels of Firmicutes that metabolize dietary plant polysaccharides (Roseburia, Eubacterium rectale and Ruminococcus bromii). Microbial activity mirrored differences between herbivorous and carnivorous mammals, reflecting trade-offs between carbohydrate and protein fermentation. Foodborne microbes from both diets transiently colonized the gut, including bacteria, fungi and even viruses. Finally, increases in the abundance and activity of Bilophila wadsworthia on the animal-based diet support a link between dietary fat, bile acids and the outgrowth of microorganisms capable of triggering inflammatory bowel disease. In concert, these results demonstrate that the gut microbiome can rapidly respond to altered diet, potentially facilitating the diversity of human dietary lifestyles. PMID:24336217

  16. Resting Functional Connectivity of Language Networks: Characterization and Reproducibility

    PubMed Central

    Tomasi, Dardo; Volkow, Nora D.

    2011-01-01

    The neural basis of language comprehension and production has been associated with superior temporal (Wernicke's) and inferior frontal (Broca's) cortical areas respectively. However, recent resting state functional connectivity (RSFC) and lesion studies implicate a more extended network in language processing. Using a large RSFC dataset from 970 healthy subjects and seed regions in Broca's and Wernicke's areas, we recapitulate this extended network that includes adjoining prefrontal, temporal and parietal regions but also bilateral caudate and left putamen/globus pallidus and subthalamic nucleus. We also show that the language network has predominance of short-range functional connectivity (except posterior Wernicke's area that exhibited predominant long-range connectivity), which is consistent with reliance on local processing. Predominantly, the long-range connectivity was left lateralized (except anterior Wernicke's area that exhibited rightward lateralization). The language network also exhibited anticorrelated activity with auditory (only for Wernicke's area) and visual cortices that suggests integrated sequential activity with regions involved with listening or reading words. Assessment of the intra-subject reproducibility of this network and its characterization in individuals with language dysfunction is needed to determine its potential as a biomarker for language disorders. PMID:22212597
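
    Seed-based RSFC of the kind used above reduces to correlating a seed region's mean time series with every voxel; a minimal sketch with random placeholder data follows.

      import numpy as np

      def seed_connectivity(data, seed_mask):
          """Pearson correlation between the mean seed time series and every voxel.

          data: (n_voxels, n_timepoints) array; seed_mask: boolean (n_voxels,).
          """
          seed_ts = data[seed_mask].mean(axis=0)
          data_c = data - data.mean(axis=1, keepdims=True)
          seed_c = seed_ts - seed_ts.mean()
          num = data_c @ seed_c
          den = np.sqrt((data_c ** 2).sum(axis=1) * (seed_c ** 2).sum())
          return num / den

      # Hypothetical resting-state data: 500 voxels x 240 timepoints.
      rng = np.random.default_rng(3)
      data = rng.normal(size=(500, 240))
      seed_mask = np.zeros(500, dtype=bool)
      seed_mask[:10] = True                   # pretend the first 10 voxels form the seed
      rsfc = seed_connectivity(data, seed_mask)
      print("connectivity range:", round(float(rsfc.min()), 2), "to", round(float(rsfc.max()), 2))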

  17. Repeatability and reproducibility of aquatic testing with zinc dithiophosphate

    SciTech Connect

    Hooter, D.L.; Hoke, D.I.; Kraska, R.C.; Wojewodka, R.A.

    1994-12-31

    This testing program was designed to characterize the repeatability and reproducibility of aquatic screening studies with a water insoluble chemical substance. Zinc dithiophosphate was selected for its limited water solubility and moderate aquatic toxicity. Acute tests were conducted using fathead minnows and Daphnia magna, according to guidelines developed to minimize random sources of non-repeatability. The organisms were exposed to zinc dithiophosphate in static tests, using an oil-water dispersion method for the fathead minnows and a water-accommodated-fraction method for the Daphnia magna. Testing was conducted in moderately hard water with pre-determined nominal concentrations of 0.1, 1.0, 10.0, 100.0, and 1000.0 ppm or ppm WAF. Twenty-four studies were contracted among three separate commercial contract laboratories. The program results demonstrate the diverse range of intralaboratory and interlaboratory variability based on the organism type, and emphasize the need for further study and caution in the design and implementation of aquatic testing for insoluble materials.

  18. Finding reproducible cluster partitions for the k-means algorithm

    PubMed Central

    2013-01-01

    K-means clustering is widely used for exploratory data analysis. While its dependence on initialisation is well-known, it is common practice to assume that the partition with the lowest total sum-of-squares (SSQ), i.e. within-cluster variance, is both reproducible under repeated initialisations and also the closest that k-means can provide to true structure, when applied to synthetic data. We show that this is generally the case for small numbers of clusters, but for values of k that are still of theoretical and practical interest, similar values of SSQ can correspond to markedly different cluster partitions. This paper extends stability measures previously presented in the context of finding optimal values of cluster number, into a component of a 2-d map of the local minima found by the k-means algorithm, from which not only can values of k be identified for further analysis but, more importantly, it is made clear whether the best SSQ is a suitable solution or whether obtaining a consistently good partition requires further application of the stability index. The proposed method is illustrated by application to five synthetic datasets replicating a real world breast cancer dataset with varying data density, and a large bioinformatics dataset. PMID:23369085
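
    A minimal sketch of the stability question raised above, using scikit-learn: k-means is repeated from different initialisations and the resulting partitions are compared by SSQ (inertia) and by the adjusted Rand index against the lowest-SSQ run; the synthetic data are placeholders.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.metrics import adjusted_rand_score

      # Synthetic placeholder data: four well-separated 2-D clusters.
      rng = np.random.default_rng(4)
      X = np.vstack([rng.normal(loc=c, scale=0.6, size=(100, 2)) for c in (0, 4, 8, 12)])

      k = 4
      runs = [KMeans(n_clusters=k, n_init=1, random_state=s).fit(X) for s in range(10)]
      labels = [r.labels_ for r in runs]
      inertia = [r.inertia_ for r in runs]

      best = int(np.argmin(inertia))   # lowest-SSQ partition
      agreement = [adjusted_rand_score(labels[best], l) for l in labels]
      print("SSQ range:", round(min(inertia), 1), "to", round(max(inertia), 1))
      print("mean ARI against the lowest-SSQ partition:", round(float(np.mean(agreement)), 3))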

  19. Interlaboratory Reproducibility of Blood Morphology Using the Digital Microscope.

    PubMed

    Riedl, Jurgen A; Stouten, Karlijn; Ceelie, Huib; Boonstra, Joke; Levin, Mark-David; van Gelder, Warry

    2015-12-01

    Differential counting of peripheral blood cells is an important diagnostic tool. However, manual morphological analysis using the microscope is time-consuming and requires highly trained personnel. The digital microscope is capable of performing an automated peripheral blood cell differential, which is as reliable as manual classification by experienced laboratory technicians. To date, information concerning the interlaboratory variation and quality of cell classification by independently operated digital microscopy systems is limited. We compared four independently operated digital microscope systems for their ability in classifying the five main peripheral blood cell classes and detection of blast cells in 200 randomly selected samples. Set against the averaged results, the R(2) values for neutrophils ranged between 0.90 and 0.96, for lymphocytes between 0.83 and 0.94, for monocytes between 0.77 and 0.82, for eosinophils between 0.70 and 0.78, and for blast cells between 0.94 and 0.99. The R(2) values for the basophils were between 0.28 and 0.34. This study shows that independently operated digital microscopy systems yield reproducible preclassification results when determining the percentages of neutrophils, eosinophils, lymphocytes, monocytes, and blast cells in a peripheral blood smear. Detection of basophils was hampered by the low incidence of this cell class in the samples. PMID:25925737

  20. Reproducibility problems with the AMPLICOR PCR Chlamydia trachomatis test.

    PubMed Central

    Peterson, E M; Darrow, V; Blanding, J; Aarnaes, S; de la Maza, L M

    1997-01-01

    In an attempt to use an expanded "gold standard" in an evaluation of an antigen detection test for Chlamydia trachomatis, the AMPLICOR (Roche Diagnostics Systems, Inc., Branchburg, N.J.) PCR Chlamydia trachomatis test and culture were used with 591 sets of cervical specimens. Of the 591 specimens assayed, 35 were retested due to either an equivocal result by the PCR (19 samples) or a discrepancy between the results of culture, PCR, and the antigen detection method. During the repeat testing of the samples with equivocal and discrepant results, all but one interpretation change was due to the PCR result. In addition, upon repeat testing the PCR assay value measured in optical density units varied widely for 13 of these specimens. These 13 specimens were then tested in triplicate by the manufacturer with primers to the chlamydia plasmid and in duplicate with primers to the major outer membrane protein. Only 3 of the 13 specimens gave the same interpretation with these five replicates. In summary, reproducibility problems with the AMPLICOR test should be considered before it is incorporated as part of routine testing or used as an expanded gold standard for chlamydia testing. PMID:9157161

  1. Reproducibility of cardioventilatory measurements using a respiratory mass spectrometer.

    PubMed

    Narang, Indra; Rosenthal, Mark; Bush, Andrew

    2007-08-01

    The aim of this study was to assess the within-subject reproducibility of cardioventilatory measurements and the maximum permitted 'normal' variability over time at rest and exercise using the respiratory mass spectrometer (RMS). Ten subjects underwent an incremental exercise test on three separate occasions utilising rebreathing (RB) and helium dilution mixed expired gas analysis (HME) functions of the RMS. Measurements included heart rate (HR), oxygen consumption (V(O2)), carbon dioxide excretion (V(CO2)), effective pulmonary blood flow (Q(eff)), stroke volume (SV), arteriovenous oxygen content difference (AVO), transfer factor (Dl(CO)), functional residual capacity (FRC), minute ventilation (VE), tidal volume (VT) and respiratory quotient (RQ). The coefficients of variation for each variable for the 10 subjects were calculated. At rest, the 90th centile variability for measured cardiopulmonary variables (RB only) was <35%. During exercise, the 90th centile for variability for measured cardiopulmonary variables for HME and RB were < or =20 and <40%, respectively. These measurements in healthy adults should inform sample size in research studies. PMID:17188945
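
    The within-subject coefficient of variation used above is straightforward to compute; a sketch with hypothetical repeated readings follows.

      import numpy as np

      def coefficient_of_variation(repeats):
          """Within-subject CV (%) over repeated measurements of one variable."""
          repeats = np.asarray(repeats, dtype=float)
          return 100.0 * repeats.std(ddof=1) / repeats.mean()

      # Hypothetical V(O2) readings (ml/min) for three subjects over three occasions.
      cvs = [coefficient_of_variation(subject) for subject in
             ([310, 295, 322], [280, 301, 290], [265, 259, 274])]
      print("per-subject CV (%):", [round(c, 1) for c in cvs])
      print("90th centile of variability:", round(float(np.percentile(cvs, 90)), 1), "%")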

  2. Reproducing Natural Spider Silks' Copolymer Behavior in Synthetic Silk Mimics

    SciTech Connect

    An, Bo; Jenkins, Janelle E; Sampath, Sujatha; Holland, Gregory P; Hinman, Mike; Yarger, Jeffery L; Lewis, Randolph

    2012-10-30

    Dragline silk from orb-weaving spiders is a copolymer of two large proteins, major ampullate spidroin 1 (MaSp1) and 2 (MaSp2). The ratio of these proteins is known to have a large variation across different species of orb-weaving spiders. NMR results from gland material of two different species of spiders, N. clavipes and A. aurantia, indicate that MaSp1 proteins are more easily formed into β-sheet nanostructures, while MaSp2 proteins form random coil and helical structures. To test if this behavior of natural silk proteins could be reproduced by recombinantly produced spider silk mimic protein, recombinant MaSp1/MaSp2 mixed fibers as well as chimeric silk fibers from MaSp1 and MaSp2 sequences in a single protein were produced based on the variable ratio and conserved motifs of MaSp1 and MaSp2 in native silk fiber. Mechanical properties, solid-state NMR, and XRD results of tested synthetic fibers indicate the differing roles of MaSp1 and MaSp2 in the fiber and verify the importance of postspin stretching treatment in helping the fiber to form the proper spatial structure.

  3. Magnetofection: a reproducible method for gene delivery to melanoma cells.

    PubMed

    Prosen, Lara; Prijic, Sara; Music, Branka; Lavrencak, Jaka; Cemazar, Maja; Sersa, Gregor

    2013-01-01

    Magnetofection is a nanoparticle-mediated approach for transfection of cells, tissues, and tumors. Specific interest is in using superparamagnetic iron oxide nanoparticles (SPIONs) as a delivery system for therapeutic genes. Magnetofection has already been described in some proof-of-principle studies; however, fine tuning of the synthesis of SPIONs is necessary for its broader application. Physicochemical properties of SPIONs, synthesized by co-precipitation in an alkaline aqueous medium, were tested after varying different parameters of the synthesis procedure. The storage time of the iron(II) sulfate salt, the type of purified water, and the synthesis temperature did not affect the physicochemical properties of SPIONs. Also, varying the parameters of the synthesis procedure did not influence magnetofection efficacy. However, for pronounced expression of the gene encoded by the plasmid DNA it was crucial to functionalize poly(acrylic) acid-stabilized SPIONs (SPIONs-PAA) with polyethyleneimine (PEI) without adjusting the inherently alkaline pH of its water solution to the physiological pH. In conclusion, the co-precipitation of iron(II) and iron(III) sulfate salts with subsequent PAA stabilization, PEI functionalization, and plasmid DNA binding is a robust method resulting in reproducible and efficient magnetofection. Important for achieving high gene expression, however, is the pH of the PEI water solution used for SPIONs-PAA functionalization, which should be in the alkaline range. PMID:23862136

  4. How to Obtain Reproducible Results for Lithium Sulfur Batteries

    SciTech Connect

    Zheng, Jianming; Lu, Dongping; Gu, Meng; Wang, Chong M.; Zhang, Jiguang; Liu, Jun; Xiao, Jie

    2013-01-01

    The basic requirements for obtaining reliable Li-S battery data are discussed in this work. Unlike Li-ion batteries, the electrolyte-rich environment significantly affects the cycling stability of Li-S batteries prepared and tested under the same conditions. The reason has been assigned to the different concentrations of polysulfide-containing electrolytes in the cells, which have profound influences on both the sulfur cathode and the lithium anode. At the optimized S/E ratio of 50 g L-1, a good balance among electrolyte viscosity, wetting ability, diffusion rate of dissolved polysulfide, and nucleation/growth of short-chain Li2S/Li2S2 has been established, along with largely reduced contamination on the lithium anode side. Accordingly, good cyclability, high reversible capacity and Coulombic efficiency are achieved in a Li-S cell with a controlled S/E ratio without any additive. Other factors such as the sulfur content in the composite and the sulfur loading on the electrode also require careful attention in the Li-S system in order to generate reproducible results and gauge the various methods used to improve Li-S battery technology.

  5. Virtual Raters for Reproducible and Objective Assessments in Radiology.

    PubMed

    Kleesiek, Jens; Petersen, Jens; Döring, Markus; Maier-Hein, Klaus; Köthe, Ullrich; Wick, Wolfgang; Hamprecht, Fred A; Bendszus, Martin; Biller, Armin

    2016-01-01

    Volumetric measurements in radiologic images are important for monitoring tumor growth and treatment response. To make these more reproducible and objective we introduce the concept of virtual raters (VRs). A virtual rater is obtained by combining knowledge of machine-learning algorithms trained with past annotations of multiple human raters with the instantaneous rating of one human expert. Thus, he is virtually guided by several experts. To evaluate the approach we perform experiments with multi-channel magnetic resonance imaging (MRI) data sets. Next to gross tumor volume (GTV) we also investigate subcategories like edema, contrast-enhancing and non-enhancing tumor. The first data set consists of N = 71 longitudinal follow-up scans of 15 patients suffering from glioblastoma (GB). The second data set comprises N = 30 scans of low- and high-grade gliomas. For comparison we computed Pearson Correlation, Intra-class Correlation Coefficient (ICC) and Dice score. Virtual raters always lead to an improvement w.r.t. inter- and intra-rater agreement. Comparing the 2D Response Assessment in Neuro-Oncology (RANO) measurements to the volumetric measurements of the virtual raters results in one-third of the cases in a deviating rating. Hence, we believe that our approach will have an impact on the evaluation of clinical studies as well as on routine imaging diagnostics. PMID:27118379
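
    The Dice score mentioned above compares two binary segmentation masks; a minimal sketch with hypothetical masks follows.

      import numpy as np

      def dice_score(mask_a, mask_b):
          """Dice coefficient 2|A n B| / (|A| + |B|) between two binary masks."""
          a, b = mask_a.astype(bool), mask_b.astype(bool)
          denom = a.sum() + b.sum()
          return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

      # Two hypothetical tumor segmentations on a 64 x 64 x 10 volume.
      rng = np.random.default_rng(5)
      rater_1 = rng.random((64, 64, 10)) > 0.7
      rater_2 = rater_1.copy()
      rater_2[:5] = rng.random((5, 64, 10)) > 0.7     # perturb a few slices
      print("Dice:", round(dice_score(rater_1, rater_2), 3))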

  6. Reproducing Natural Spider Silks’ Copolymer Behavior in Synthetic Silk Mimics

    PubMed Central

    An, Bo; Jenkins, Janelle E.; Sampath, Sujatha; Holland, Gregory P.; Hinman, Mike; Yarger, Jeffery L.; Lewis, Randolph

    2012-01-01

    Dragline silk from orb-weaving spiders is a copolymer of two large proteins, major ampullate spidroin 1 (MaSp1) and 2 (MaSp2). The ratio of these proteins is known to have a large variation across different species of orb-weaving spiders. NMR results from gland material of two different species of spiders, N. clavipes and A. aurantia, indicate that MaSp1 proteins are more easily formed into β-sheet nanostructures, while MaSp2 proteins form random coil and helical structures. To test if this behavior of natural silk proteins could be reproduced by recombinantly produced spider silk mimic protein, recombinant MaSp1/MaSp2 mixed fibers as well as chimeric silk fibers from MaSp1 and MaSp2 sequences in a single protein were produced based on the variable ratio and conserved motifs of MaSp1 and MaSp2 in native silk fiber. Mechanical properties, solid-state NMR, and XRD results of tested synthetic fibers indicate the differing roles of MaSp1 and MaSp2 in the fiber and verify the importance of postspin stretching treatment in helping the fiber to form the proper spatial structure. PMID:23110450

  7. Stochastic simulations of minimal self-reproducing cellular systems.

    PubMed

    Mavelli, Fabio; Ruiz-Mirazo, Kepa

    2007-10-29

    This paper is a theoretical attempt to gain insight into the problem of how self-assembling vesicles (closed bilayer structures) could progressively turn into minimal self-producing and self-reproducing cells, i.e. into interesting candidates for (proto)biological systems. With this aim, we make use of a recently developed object-oriented platform to carry out stochastic simulations of chemical reaction networks that take place in dynamic cellular compartments. We apply this new tool to study the behaviour of different minimal cell models, making realistic assumptions about the physico-chemical processes and conditions involved (e.g. thermodynamic equilibrium/non-equilibrium, variable volume-to-surface relationship, osmotic pressure, solute diffusion across the membrane due to concentration gradients, buffering effect). The new programming platform has been designed to analyse not only how a single protometabolic cell could maintain itself, grow or divide, but also how a collection of these cells could 'evolve' as a result of their mutual interactions in a common environment. PMID:17510021

  8. Reproducibility of Differential Proteomic Technologies in CPTAC Fractionated Xenografts

    PubMed Central

    2015-01-01

    The NCI Clinical Proteomic Tumor Analysis Consortium (CPTAC) employed a pair of reference xenograft proteomes for initial platform validation and ongoing quality control of its data collection for The Cancer Genome Atlas (TCGA) tumors. These two xenografts, representing basal and luminal-B human breast cancer, were fractionated and analyzed on six mass spectrometers in a total of 46 replicates divided between iTRAQ and label-free technologies, spanning a total of 1095 LC–MS/MS experiments. These data represent a unique opportunity to evaluate the stability of proteomic differentiation by mass spectrometry over many months of time for individual instruments or across instruments running dissimilar workflows. We evaluated iTRAQ reporter ions, label-free spectral counts, and label-free extracted ion chromatograms as strategies for data interpretation (source code is available from http://homepages.uc.edu/~wang2x7/Research.htm). From these assessments, we found that differential genes from a single replicate were confirmed by other replicates on the same instrument from 61 to 93% of the time. When comparing across different instruments and quantitative technologies, using multiple replicates, differential genes were reproduced by other data sets from 67 to 99% of the time. Projecting gene differences to biological pathways and networks increased the degree of similarity. These overlaps send an encouraging message about the maturity of technologies for proteomic differentiation. PMID:26653538

  9. Automated and Reproducible Full Waveform Inversion with Multiple Data Sets

    NASA Astrophysics Data System (ADS)

    Fichtner, A.; Villasenor, A.; Krischer, L.; Ermert, L. A.; Afanasiev, M.

    2014-12-01

    We present a series of methodological developments intended to (1) accelerate and automate full seismic waveform inversion from local to global scales, and (2) improve tomographic resolution and its quantification. Our developments include an open-source framework for the management of seismic data and iterative non-linear inversions. This ensures that information on provenance, processing, modelling and inversion is systematically archived, thus facilitating reproducibility. Furthermore, tools for automated window selection, misfit measurements and input file generation for various forward solvers are provided. To enhance resolution in regions poorly covered by earthquake data, we incorporate ambient noise correlations in the inversion. Since correlations are affected by the distribution of noise sources, we only measure the more robust traveltime differences of narrow-band surface waves; disregarding waveform details that would be exploitable in the case of earthquake data. To quantify resolution of full waveform inversion models at minimal computational cost, we employ a newly developed stochastic sampling technique that extracts various resolution proxies from the Hessian through the application of quasi-random test models. The Western Mediterranean serves as the real-data testing ground for our developments. Data from the IberArray project combined with noise and earthquake recordings from nearly 1000 stations throughout Europe provide exceptional coverage. Embedded within a multi-scale model of the Globe, our tomographic images provide a detailed snapshot of Western Mediterranean geodynamics, including, for instance, the lateral extent and fine-scale details of subducted lithospheric slabs in the region.
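
    The traveltime differences mentioned above are commonly measured from the peak of a cross-correlation; the sketch below applies that standard measurement to two synthetic narrow-band waveforms and is not the authors' code.

      import numpy as np

      def traveltime_shift(trace_a, trace_b, dt):
          """Time shift (s) of trace_b relative to trace_a from the cross-correlation peak."""
          cc = np.correlate(trace_b, trace_a, mode="full")
          lag = np.argmax(cc) - (len(trace_a) - 1)
          return lag * dt

      # Two hypothetical narrow-band waveforms, the second delayed by 0.8 s.
      dt = 0.05
      t = np.arange(0, 60, dt)
      wavelet = np.exp(-((t - 30.0) / 3.0) ** 2) * np.sin(2 * np.pi * 0.1 * t)
      delayed = np.exp(-((t - 30.8) / 3.0) ** 2) * np.sin(2 * np.pi * 0.1 * (t - 0.8))
      print("measured shift:", traveltime_shift(wavelet, delayed, dt), "s")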

  10. Virtual Raters for Reproducible and Objective Assessments in Radiology

    PubMed Central

    Kleesiek, Jens; Petersen, Jens; Döring, Markus; Maier-Hein, Klaus; Köthe, Ullrich; Wick, Wolfgang; Hamprecht, Fred A.; Bendszus, Martin; Biller, Armin

    2016-01-01

    Volumetric measurements in radiologic images are important for monitoring tumor growth and treatment response. To make these more reproducible and objective we introduce the concept of virtual raters (VRs). A virtual rater is obtained by combining knowledge of machine-learning algorithms trained with past annotations of multiple human raters with the instantaneous rating of one human expert. Thus, he is virtually guided by several experts. To evaluate the approach we perform experiments with multi-channel magnetic resonance imaging (MRI) data sets. Next to gross tumor volume (GTV) we also investigate subcategories like edema, contrast-enhancing and non-enhancing tumor. The first data set consists of N = 71 longitudinal follow-up scans of 15 patients suffering from glioblastoma (GB). The second data set comprises N = 30 scans of low- and high-grade gliomas. For comparison we computed Pearson Correlation, Intra-class Correlation Coefficient (ICC) and Dice score. Virtual raters always lead to an improvement w.r.t. inter- and intra-rater agreement. Comparing the 2D Response Assessment in Neuro-Oncology (RANO) measurements to the volumetric measurements of the virtual raters results in one-third of the cases in a deviating rating. Hence, we believe that our approach will have an impact on the evaluation of clinical studies as well as on routine imaging diagnostics. PMID:27118379

  11. Histopathologic reproducibility of thyroid disease in an epidemiologic study

    SciTech Connect

    Ron, E.; Griffel, B.; Liban, E.; Modan, B.

    1986-03-01

    An investigation of the long-term effects of childhood scalp irradiation demonstrated a significantly increased risk of thyroid tumors in the irradiated population. Because of the complexity of thyroid cancer diagnosis, a histopathologic slide review of 59 of the 68 patients (irradiated and nonirradiated) with thyroid disease was undertaken. The review revealed 90% agreement (kappa = +0.85, P less than 0.01) between the original and review diagnosis. Four of 27 cases previously diagnosed as malignant were reclassified as benign, yielding a cancer misdiagnosis rate of 14.8%. All four of the misdiagnosed cancers were of follicular or mixed papillary-follicular type. As a result of the histologic review, the ratio of malignant to benign tumors decreased from 2.55 to 1.75. Since disagreement in diagnosis was similar in the irradiated and nonirradiated groups, the relative risk of radiation-associated neoplasms did not change substantially. The histopathologic review shows that although there were some problems in diagnostic reproducibility, they were not statistically significant and did not alter our previous conclusions regarding radiation exposure. However, a 15% reduction in the number of malignancies might affect epidemiologic studies with an external comparison as well as geographic or temporal comparisons.
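
    The agreement statistics quoted above (percent agreement and kappa) can be computed from a two-rater confusion table; the sketch below uses a hypothetical 59-slide table, not the study's actual counts.

      import numpy as np

      def cohens_kappa(confusion):
          """Cohen's kappa from a square confusion matrix of two raters' diagnoses."""
          confusion = np.asarray(confusion, dtype=float)
          n = confusion.sum()
          p_obs = np.trace(confusion) / n
          p_exp = (confusion.sum(axis=0) * confusion.sum(axis=1)).sum() / n ** 2
          return (p_obs - p_exp) / (1.0 - p_exp)

      # Hypothetical original-vs-review table (rows: original; columns: review) for
      # 59 slides classified malignant / benign -- not the study's actual counts.
      table = [[23, 4],
               [2, 30]]
      print("percent agreement:", round(100 * np.trace(table) / np.sum(table), 1))
      print("kappa:", round(cohens_kappa(table), 2))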

  12. Reproducibility of Vibrionaceae population structure in coastal bacterioplankton.

    PubMed

    Szabo, Gitta; Preheim, Sarah P; Kauffman, Kathryn M; David, Lawrence A; Shapiro, Jesse; Alm, Eric J; Polz, Martin F

    2013-03-01

    How reproducibly microbial populations assemble in the wild remains poorly understood. Here, we assess evidence for ecological specialization and predictability of fine-scale population structure and habitat association in coastal ocean Vibrionaceae across years. We compare Vibrionaceae lifestyles in the bacterioplankton (combinations of free-living, particle, or zooplankton associations) measured using the same sampling scheme in 2006 and 2009 to assess whether the same groups show the same environmental association year after year. This reveals complex dynamics with populations falling primarily into two categories: (i) nearly equally represented in each of the two samplings and (ii) highly skewed, often to an extent that they appear exclusive to one or the other sampling times. Importantly, populations recovered at the same abundance in both samplings occupied highly similar habitats suggesting predictable and robust environmental association while skewed abundances of some populations may be triggered by shifts in ecological conditions. The latter is supported by difference in the composition of large eukaryotic plankton between years, with samples in 2006 being dominated by copepods, and those in 2009 by diatoms. Overall, the comparison supports highly predictable population-habitat linkage but highlights the fact that complex, and often unmeasured, environmental dynamics in habitat occurrence may have strong effects on population dynamics. PMID:23178668

  13. The determination of accurate dipole polarizabilities alpha and gamma for the noble gases

    NASA Technical Reports Server (NTRS)

    Rice, Julia E.; Taylor, Peter R.; Lee, Timothy J.; Almlof, Jan

    1991-01-01

    Accurate static dipole polarizabilities alpha and gamma of the noble gases He through Xe were determined using wave functions of similar quality for each system. Good agreement with experimental data for the static polarizability gamma was obtained for Ne and Xe, but not for Ar and Kr. Calculations suggest that the experimental values for these latter systems are too low.

  14. Accurate and occlusion-robust multi-view stereo

    NASA Astrophysics Data System (ADS)

    Zhu, Zhaokun; Stamatopoulos, Christos; Fraser, Clive S.

    2015-11-01

    This paper proposes an accurate multi-view stereo method for image-based 3D reconstruction that features robustness in the presence of occlusions. The new method offers improvements in dealing with two fundamental image matching problems. The first concerns the selection of the support window model, while the second centers upon accurate visibility estimation for each pixel. The support window model is based on an approximate 3D support plane described by a depth and two per-pixel depth offsets. For the visibility estimation, the multi-view constraint is initially relaxed by generating separate support plane maps for each support image using a modified PatchMatch algorithm. Then the most likely visible support image, which represents the minimum visibility of each pixel, is extracted via a discrete Markov Random Field model and it is further augmented by parameter clustering. Once the visibility is estimated, multi-view optimization taking into account all redundant observations is conducted to achieve optimal accuracy in the 3D surface generation for both depth and surface normal estimates. Finally, multi-view consistency is utilized to eliminate any remaining observational outliers. The proposed method is experimentally evaluated using well-known Middlebury datasets, and results obtained demonstrate that it is amongst the most accurate of the methods thus far reported via the Middlebury MVS website. Moreover, the new method exhibits a high completeness rate.

  15. Accurate projector calibration method by using an optical coaxial camera.

    PubMed

    Huang, Shujun; Xie, Lili; Wang, Zhangying; Zhang, Zonghua; Gao, Feng; Jiang, Xiangqian

    2015-02-01

    Digital light processing (DLP) projectors have been widely utilized to project digital structured-light patterns in 3D imaging systems. In order to obtain accurate 3D shape data, it is important to calibrate DLP projectors to obtain the internal parameters. The existing projector calibration methods have complicated procedures or low accuracy of the obtained parameters. This paper presents a novel method to accurately calibrate a DLP projector by using an optical coaxial camera. The optical coaxial geometry is realized by a plate beam splitter, so the DLP projector can be treated as a true inverse camera. A plate having discrete markers on the surface is used to calibrate the projector. The corresponding projector pixel coordinate of each marker on the plate is determined by projecting vertical and horizontal sinusoidal fringe patterns on the plate surface and calculating the absolute phase. The internal parameters of the DLP projector are then obtained from the corresponding point pairs between the projector pixel coordinates and the world coordinates of the discrete markers. Experimental results show that the proposed method can accurately calibrate the internal parameters of a DLP projector. PMID:25967789
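
    One standard way to obtain the per-pixel phase from projected sinusoidal fringes is the four-step phase-shifting formula; the sketch below demonstrates it on synthetic fringe images and is not the authors' full absolute-phase pipeline.

      import numpy as np

      def four_step_phase(i0, i1, i2, i3):
          """Wrapped phase from four fringe images shifted by 0, 90, 180 and 270 degrees."""
          return np.arctan2(i3 - i1, i0 - i2)

      # Synthetic 256 x 256 fringe images with an underlying linear phase ramp.
      x = np.linspace(0, 8 * np.pi, 256)
      phase_true = np.tile(x, (256, 1))
      imgs = [128 + 100 * np.cos(phase_true + k * np.pi / 2) for k in range(4)]
      phase_wrapped = four_step_phase(*imgs)

      # The wrapped phase lies in (-pi, pi]; unwrapping along rows recovers the ramp.
      phase_unwrapped = np.unwrap(phase_wrapped, axis=1)
      err = (phase_unwrapped - phase_unwrapped[:, :1]) - (phase_true - phase_true[:, :1])
      print("max abs reconstruction error (rad):", float(np.max(np.abs(err))))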

  16. Symphony: a framework for accurate and holistic WSN simulation.

    PubMed

    Riliskis, Laurynas; Osipov, Evgeny

    2015-01-01

    Research on wireless sensor networks has progressed rapidly over the last decade, and these technologies have been widely adopted for both industrial and domestic uses. Several operating systems have been developed, along with a multitude of network protocols for all layers of the communication stack. Industrial Wireless Sensor Network (WSN) systems must satisfy strict criteria and are typically more complex and larger in scale than domestic systems. Together with the non-deterministic behavior of network hardware in real settings, this greatly complicates the debugging and testing of WSN functionality. To facilitate the testing, validation, and debugging of large-scale WSN systems, we have developed a simulation framework that accurately reproduces the processes that occur inside real equipment, including both hardware- and software-induced delays. The core of the framework consists of a virtualized operating system and an emulated hardware platform that is integrated with the general purpose network simulator ns-3. Our framework enables the user to adjust the real code base as would be done in real deployments and also to test the boundary effects of different hardware components on the performance of distributed applications and protocols. Additionally we have developed a clock emulator with several different skew models and a component that handles sensory data feeds. The new framework should substantially shorten WSN application development cycles. PMID:25723144

  17. Symphony: A Framework for Accurate and Holistic WSN Simulation

    PubMed Central

    Riliskis, Laurynas; Osipov, Evgeny

    2015-01-01

    Research on wireless sensor networks has progressed rapidly over the last decade, and these technologies have been widely adopted for both industrial and domestic uses. Several operating systems have been developed, along with a multitude of network protocols for all layers of the communication stack. Industrial Wireless Sensor Network (WSN) systems must satisfy strict criteria and are typically more complex and larger in scale than domestic systems. Together with the non-deterministic behavior of network hardware in real settings, this greatly complicates the debugging and testing of WSN functionality. To facilitate the testing, validation, and debugging of large-scale WSN systems, we have developed a simulation framework that accurately reproduces the processes that occur inside real equipment, including both hardware- and software-induced delays. The core of the framework consists of a virtualized operating system and an emulated hardware platform that is integrated with the general purpose network simulator ns-3. Our framework enables the user to adjust the real code base as would be done in real deployments and also to test the boundary effects of different hardware components on the performance of distributed applications and protocols. Additionally we have developed a clock emulator with several different skew models and a component that handles sensory data feeds. The new framework should substantially shorten WSN application development cycles. PMID:25723144

  18. Accurate ab initio vibrational energies of methyl chloride

    SciTech Connect

    Owens, Alec; Yurchenko, Sergei N.; Yachmenev, Andrey; Tennyson, Jonathan; Thiel, Walter

    2015-06-28

    Two new nine-dimensional potential energy surfaces (PESs) have been generated using high-level ab initio theory for the two main isotopologues of methyl chloride, CH3(35)Cl and CH3(37)Cl. The respective PESs, CBS-35(HL), and CBS-37(HL), are based on explicitly correlated coupled cluster calculations with extrapolation to the complete basis set (CBS) limit, and incorporate a range of higher-level (HL) additive energy corrections to account for core-valence electron correlation, higher-order coupled cluster terms, scalar relativistic effects, and diagonal Born-Oppenheimer corrections. Variational calculations of the vibrational energy levels were performed using the computer program TROVE, whose functionality has been extended to handle molecules of the form XY3Z. Fully converged energies were obtained by means of a complete vibrational basis set extrapolation. The CBS-35(HL) and CBS-37(HL) PESs reproduce the fundamental term values with root-mean-square errors of 0.75 and 1.00 cm-1, respectively. An analysis of the combined effect of the HL corrections and CBS extrapolation on the vibrational wavenumbers indicates that both are needed to compute accurate theoretical results for methyl chloride. We believe that it would be extremely challenging to go beyond the accuracy currently achieved for CH3Cl without empirical refinement of the respective PESs.
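
    The CBS extrapolation mentioned above is often done with a two-point inverse-cube formula; a sketch with placeholder correlation energies follows (this is the generic scheme, not necessarily the exact one used in the paper).

      def cbs_two_point(e_x, e_y, x, y):
          """Two-point CBS extrapolation assuming E(n) = E_CBS + A * n**-3."""
          return (x ** 3 * e_x - y ** 3 * e_y) / (x ** 3 - y ** 3)

      # Hypothetical correlation energies (hartree) with triple- and quadruple-zeta bases.
      e_tz, e_qz = -0.4271, -0.4398
      print("E_CBS ~", round(cbs_two_point(e_qz, e_tz, 4, 3), 4), "hartree")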

  19. Accurate ab initio vibrational energies of methyl chloride.

    PubMed

    Owens, Alec; Yurchenko, Sergei N; Yachmenev, Andrey; Tennyson, Jonathan; Thiel, Walter

    2015-06-28

    Two new nine-dimensional potential energy surfaces (PESs) have been generated using high-level ab initio theory for the two main isotopologues of methyl chloride, CH3 (35)Cl and CH3 (37)Cl. The respective PESs, CBS-35( HL), and CBS-37( HL), are based on explicitly correlated coupled cluster calculations with extrapolation to the complete basis set (CBS) limit, and incorporate a range of higher-level (HL) additive energy corrections to account for core-valence electron correlation, higher-order coupled cluster terms, scalar relativistic effects, and diagonal Born-Oppenheimer corrections. Variational calculations of the vibrational energy levels were performed using the computer program TROVE, whose functionality has been extended to handle molecules of the form XY 3Z. Fully converged energies were obtained by means of a complete vibrational basis set extrapolation. The CBS-35( HL) and CBS-37( HL) PESs reproduce the fundamental term values with root-mean-square errors of 0.75 and 1.00 cm(-1), respectively. An analysis of the combined effect of the HL corrections and CBS extrapolation on the vibrational wavenumbers indicates that both are needed to compute accurate theoretical results for methyl chloride. We believe that it would be extremely challenging to go beyond the accuracy currently achieved for CH3Cl without empirical refinement of the respective PESs. PMID:26133427

  20. Accurate ab initio vibrational energies of methyl chloride

    NASA Astrophysics Data System (ADS)

    Owens, Alec; Yurchenko, Sergei N.; Yachmenev, Andrey; Tennyson, Jonathan; Thiel, Walter

    2015-06-01

    Two new nine-dimensional potential energy surfaces (PESs) have been generated using high-level ab initio theory for the two main isotopologues of methyl chloride, CH3(35)Cl and CH3(37)Cl. The respective PESs, CBS-35(HL), and CBS-37(HL), are based on explicitly correlated coupled cluster calculations with extrapolation to the complete basis set (CBS) limit, and incorporate a range of higher-level (HL) additive energy corrections to account for core-valence electron correlation, higher-order coupled cluster terms, scalar relativistic effects, and diagonal Born-Oppenheimer corrections. Variational calculations of the vibrational energy levels were performed using the computer program TROVE, whose functionality has been extended to handle molecules of the form XY3Z. Fully converged energies were obtained by means of a complete vibrational basis set extrapolation. The CBS-35(HL) and CBS-37(HL) PESs reproduce the fundamental term values with root-mean-square errors of 0.75 and 1.00 cm-1, respectively. An analysis of the combined effect of the HL corrections and CBS extrapolation on the vibrational wavenumbers indicates that both are needed to compute accurate theoretical results for methyl chloride. We believe that it would be extremely challenging to go beyond the accuracy currently achieved for CH3Cl without empirical refinement of the respective PESs.

  1. Feedback about More Accurate versus Less Accurate Trials: Differential Effects on Self-Confidence and Activation

    ERIC Educational Resources Information Center

    Badami, Rokhsareh; VaezMousavi, Mohammad; Wulf, Gabriele; Namazizadeh, Mahdi

    2012-01-01

    One purpose of the present study was to examine whether self-confidence or anxiety would be differentially affected by feedback from more accurate rather than less accurate trials. The second purpose was to determine whether arousal variations (activation) would predict performance. On Day 1, participants performed a golf putting task under one of…

  2. Visceral leishmaniasis: experimental models for drug discovery.

    PubMed

    Gupta, Suman

    2011-01-01

    Visceral leishmaniasis (VL) or kala-azar is a chronic protozoan infection in humans associated with significant global morbidity and mortality. The causative agent is the haemoflagellate protozoan Leishmania donovani, an obligate intracellular parasite that resides and multiplies within macrophages of the reticulo-endothelial system. Most of the existing anti-leishmanial drugs have serious side effects that limit their clinical application. As an alternate strategy, vaccination is also under experimental and clinical trials. The in vitro evaluation designed to facilitate rapid testing of a large number of drugs has been focussed on the promastigote stage, with little attention to the clinically relevant parasite stage, amastigotes. Screening designed to closely reflect the situation in vivo is currently time consuming, laborious, and expensive, since it requires intracellular amastigotes and an animal model. The ability to select transgenic Leishmania expressing reporter proteins, such as green fluorescent protein (GFP) or luciferase, opened up new possibilities for the development of drug screening models. Many experimental animal models like rodents, dogs and monkeys have been developed, each with specific features, but none accurately reproduces what happens in humans. Available in vitro and in vivo methodologies for antileishmanial drug screening and their respective advantages and disadvantages are reviewed. PMID:21321417

  3. Feedback about more accurate versus less accurate trials: differential effects on self-confidence and activation.

    PubMed

    Badami, Rokhsareh; VaezMousavi, Mohammad; Wulf, Gabriele; Namazizadeh, Mahdi

    2012-06-01

    One purpose of the present study was to examine whether self-confidence or anxiety would be differentially affected by feedback from more accurate rather than less accurate trials. The second purpose was to determine whether arousal variations (activation) would predict performance. On day 1, participants performed a golf putting task under one of two conditions: one group received feedback on the most accurate trials, whereas another group received feedback on the least accurate trials. On day 2, participants completed an anxiety questionnaire and performed a retention test. Skin conductance level, as a measure of arousal, was determined. The results indicated that feedback about more accurate trials resulted in more effective learning as well as increased self-confidence. Also, activation was a predictor of performance. PMID:22808705

  4. New model accurately predicts reformate composition

    SciTech Connect

    Ancheyta-Juarez, J.; Aguilar-Rodriguez, E.

    1994-01-31

    Although naphtha reforming is a well-known process, the evolution of catalyst formulation, as well as new trends in gasoline specifications, have led to rapid evolution of the process, including: reactor design, regeneration mode, and operating conditions. Mathematical modeling of the reforming process is an increasingly important tool. It is fundamental to the proper design of new reactors and revamp of existing ones. Modeling can be used to optimize operating conditions, analyze the effects of process variables, and enhance unit performance. Instituto Mexicano del Petroleo has developed a model of the catalytic reforming process that accurately predicts reformate composition at the higher-severity conditions at which new reformers are being designed. The new AA model is more accurate than previous proposals because it takes into account the effects of temperature and pressure on the rate constants of each chemical reaction.

  5. Accurate colorimetric feedback for RGB LED clusters

    NASA Astrophysics Data System (ADS)

    Man, Kwong; Ashdown, Ian

    2006-08-01

    We present an empirical model of LED emission spectra that is applicable to both InGaN and AlInGaP high-flux LEDs, and which accurately predicts their relative spectral power distributions over a wide range of LED junction temperatures. We further demonstrate with laboratory measurements that changes in LED spectral power distribution with temperature can be accurately predicted with first- or second-order equations. This provides the basis for a real-time colorimetric feedback system for RGB LED clusters that can maintain the chromaticity of white light at constant intensity to within +/-0.003 Δuv over a range of 45 degrees Celsius, and to within 0.01 Δuv when dimmed over an intensity range of 10:1.

  6. Accurate mask model for advanced nodes

    NASA Astrophysics Data System (ADS)

    Zine El Abidine, Nacer; Sundermann, Frank; Yesilada, Emek; Ndiaye, El Hadji Omar; Mishra, Kushlendra; Paninjath, Sankaranarayanan; Bork, Ingo; Buck, Peter; Toublan, Olivier; Schanen, Isabelle

    2014-07-01

    Standard OPC models consist of a physical optical model and an empirical resist model. The resist model compensates for the optical model imprecision on top of modeling resist development. The optical model imprecision may result from mask topography effects and real mask information including mask ebeam writing and mask process contributions. For advanced technology nodes, significant progress has been made to model mask topography to improve optical model accuracy. However, mask information is difficult to decorrelate from the standard OPC model. Our goal is to establish an accurate mask model through a dedicated calibration exercise. In this paper, we present a flow to calibrate an accurate mask model, enabling its implementation. The study covers the different effects that should be embedded in the mask model as well as the experiment required to model them.

  7. Accurate guitar tuning by cochlear implant musicians.

    PubMed

    Lu, Thomas; Huang, Juan; Zeng, Fan-Gang

    2014-01-01

    Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show unexpected results that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with CI than his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that tuning error was larger at ∼ 30 Hz for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task. PMID:24651081

  8. Two highly accurate methods for pitch calibration

    NASA Astrophysics Data System (ADS)

    Kniel, K.; Härtig, F.; Osawa, S.; Sato, O.

    2009-11-01

    Among profile, helix and tooth thickness, pitch is one of the most important parameters in the evaluation of an involute gear measurement. In principle, coordinate measuring machines (CMMs) and CNC-controlled gear measuring machines, as a variant of a CMM, are suited for these kinds of gear measurements. Now the Japan National Institute of Advanced Industrial Science and Technology (NMIJ/AIST) and the German national metrology institute, the Physikalisch-Technische Bundesanstalt (PTB), have each independently developed highly accurate pitch calibration methods applicable to CMMs or gear measuring machines. Both calibration methods are based on the so-called closure technique, which allows the separation of the systematic errors of the measurement device and the errors of the gear. For the verification of both calibration methods, NMIJ/AIST and PTB performed measurements on a specially designed pitch artifact. The comparison of the results shows that both methods can be used for highly accurate calibrations of pitch standards.

  9. Accurate modeling of parallel scientific computations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Townsend, James C.

    1988-01-01

    Scientific codes are usually parallelized by partitioning a grid among processors. To achieve top performance it is necessary to partition the grid so as to balance workload and minimize communication/synchronization costs. This problem is particularly acute when the grid is irregular, changes over the course of the computation, and is not known until load time. Critical mapping and remapping decisions rest on the ability to accurately predict performance, given a description of a grid and its partition. This paper discusses one approach to this problem, and illustrates its use on a one-dimensional fluids code. The models constructed are shown to be accurate, and are used to find optimal remapping schedules.
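
    The remapping argument above hinges on being able to predict execution time from a grid partition. A minimal sketch of such a cost model, assuming per-cell compute and per-word communication constants that are purely illustrative (not the model of the paper): the predicted step time is the maximum over processors of local work plus halo-exchange cost.

    def predicted_step_time(partition_cells, boundary_cells,
                            t_cell=2.0e-6, t_word=8.0e-8, t_latency=1.0e-4):
        """partition_cells[p]: cells owned by processor p;
        boundary_cells[p]: cells on p's partition boundary (communicated)."""
        times = []
        for work, halo in zip(partition_cells, boundary_cells):
            compute = work * t_cell
            communicate = t_latency + halo * t_word
            times.append(compute + communicate)
        return max(times)   # the slowest processor dictates the step time

    balanced   = predicted_step_time([25000] * 4, [400] * 4)
    imbalanced = predicted_step_time([40000, 20000, 20000, 20000], [700, 300, 300, 300])
    print(f"balanced: {balanced * 1e3:.2f} ms, imbalanced: {imbalanced * 1e3:.2f} ms")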

  10. Accurate Guitar Tuning by Cochlear Implant Musicians

    PubMed Central

    Lu, Thomas; Huang, Juan; Zeng, Fan-Gang

    2014-01-01

    Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show unexpected results that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with CI than his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that tuning error was larger at ∼30 Hz for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task. PMID:24651081

  11. An accurate registration technique for distorted images

    NASA Technical Reports Server (NTRS)

    Delapena, Michele; Shaw, Richard A.; Linde, Peter; Dravins, Dainis

    1990-01-01

    Accurate registration of International Ultraviolet Explorer (IUE) images is crucial because the variability of the geometrical distortions that are introduced by the SEC-Vidicon cameras ensures that raw science images are never perfectly aligned with the Intensity Transfer Functions (ITFs) (i.e., graded floodlamp exposures that are used to linearize and normalize the camera response). A technique for precisely registering IUE images which uses a cross correlation of the fixed pattern that exists in all raw IUE images is described.
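
    The fixed-pattern registration idea can be illustrated with an FFT-based cross-correlation peak search; the snippet below is a generic sketch on synthetic data, not the IUE processing pipeline.

    import numpy as np

    def cross_correlation_shift(ref, img):
        """Estimate the integer (dy, dx) translation of 'ref' relative to 'img'
        from the peak of their FFT-based cross-correlation."""
        xcorr = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(img))).real
        peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
        shape = np.array(ref.shape)
        shift = np.array(peak, dtype=float)
        shift[shift > shape / 2] -= shape[shift > shape / 2]   # wrap to signed offsets
        return tuple(shift)

    # Synthetic test: a noise "fixed pattern" shifted by (3, -5) pixels.
    rng = np.random.default_rng(0)
    pattern = rng.normal(size=(128, 128))
    shifted = np.roll(pattern, shift=(3, -5), axis=(0, 1))
    print(cross_correlation_shift(shifted, pattern))   # approximately (3.0, -5.0)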

  12. Accurate maser positions for MALT-45

    NASA Astrophysics Data System (ADS)

    Jordan, Christopher; Bains, Indra; Voronkov, Maxim; Lo, Nadia; Jones, Paul; Muller, Erik; Cunningham, Maria; Burton, Michael; Brooks, Kate; Green, James; Fuller, Gary; Barnes, Peter; Ellingsen, Simon; Urquhart, James; Morgan, Larry; Rowell, Gavin; Walsh, Andrew; Loenen, Edo; Baan, Willem; Hill, Tracey; Purcell, Cormac; Breen, Shari; Peretto, Nicolas; Jackson, James; Lowe, Vicki; Longmore, Steven

    2013-10-01

    MALT-45 is an untargeted survey, mapping the Galactic plane in CS (1-0), Class I methanol masers, SiO masers and thermal emission, and high frequency continuum emission. A number of masers were detected in the survey images, but without accurate positions. This project seeks to resolve each maser and its environment, with the ultimate goal of placing the Class I methanol maser into a timeline of high mass star formation.

  13. Detection of infectious laryngotracheitis virus by real-time PCR in naturally and experimentally infected chickens.

    PubMed

    Zhao, Yan; Kong, Congcong; Cui, Xianlan; Cui, Hongyu; Shi, Xingming; Zhang, Xiaomin; Hu, Shunlei; Hao, Lianwei; Wang, Yunfeng

    2013-01-01

    Infectious laryngotracheitis (ILT) is an acute, highly contagious upper-respiratory infectious disease of chickens. In this study, a real-time PCR method was developed for fast and accurate detection and quantitation of ILTV DNA of chickens experimentally infected with ILTV strain LJS09 and naturally infected chickens. The detection lower limit of the assay was 10 copies of DNA. There were no cross reactions with the DNA and RNA of infectious bursal disease virus, chicken anemia virus, reticuloendotheliosis virus, avian reovirus, Newcastle disease virus, and Marek's disease virus. The real-time PCR was reproducible as the coefficients of variation of reproducibility of the intra-assay and the inter-assay were less than 2%. The real-time PCR was used to detect the levels of the ILTV DNA in the tissues of specific pathogen free (SPF) chickens infected with ILTV at different times post infection. ILTV DNA was detected by real-time PCR in the heart, liver, spleen, lung, kidney, larynx, tongue, thymus, glandular stomach, duodenum, pancreatic gland, small intestine, large intestine, cecum, cecal tonsil, bursa of Fabricius, and brain of chickens in the infection group and the contact-exposure group. The sensitivity, specificity, and reproducibility of the ILTV real-time PCR assay revealed its suitability for detection and quantitation of ILTV in the samples from clinically and experimentally ILTV infected chickens. PMID:23840745

  14. Detection of Infectious Laryngotracheitis Virus by Real-Time PCR in Naturally and Experimentally Infected Chickens

    PubMed Central

    Zhao, Yan; Kong, Congcong; Cui, Xianlan; Cui, Hongyu; Shi, Xingming; Zhang, Xiaomin; Hu, Shunlei; Hao, Lianwei; Wang, Yunfeng

    2013-01-01

    Infectious laryngotracheitis (ILT) is an acute, highly contagious upper-respiratory infectious disease of chickens. In this study, a real-time PCR method was developed for fast and accurate detection and quantitation of ILTV DNA of chickens experimentally infected with ILTV strain LJS09 and naturally infected chickens. The detection lower limit of the assay was 10 copies of DNA. There were no cross reactions with the DNA and RNA of infectious bursal disease virus, chicken anemia virus, reticuloendotheliosis virus, avian reovirus, Newcastle disease virus, and Marek's disease virus. The real-time PCR was reproducible as the coefficients of variation of reproducibility of the intra-assay and the inter-assay were less than 2%. The real-time PCR was used to detect the levels of the ILTV DNA in the tissues of specific pathogen free (SPF) chickens infected with ILTV at different times post infection. ILTV DNA was detected by real-time PCR in the heart, liver, spleen, lung, kidney, larynx, tongue, thymus, glandular stomach, duodenum, pancreatic gland, small intestine, large intestine, cecum, cecal tonsil, bursa of Fabricius, and brain of chickens in the infection group and the contact-exposure group. The sensitivity, specificity, and reproducibility of the ILTV real-time PCR assay revealed its suitability for detection and quantitation of ILTV in the samples from clinically and experimentally ILTV infected chickens. PMID:23840745
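
    As a generic sketch of how real-time PCR copy numbers and intra-assay variation are usually derived (standard-curve quantitation and coefficient of variation), with entirely hypothetical Ct values rather than the assay data of the paper:

    import numpy as np

    # Standard curve: Ct is linear in log10(copy number), Ct = slope*log10(N) + intercept.
    std_copies = np.array([1e1, 1e2, 1e3, 1e4, 1e5, 1e6])
    std_ct     = np.array([33.1, 29.8, 26.4, 23.1, 19.7, 16.4])   # illustrative
    slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0          # ~1.0 means 100% amplification

    def copies_from_ct(ct):
        return 10 ** ((ct - intercept) / slope)

    # Intra-assay reproducibility: coefficient of variation of replicate Ct values.
    replicate_ct = np.array([24.10, 24.22, 24.05])
    cv_percent = 100.0 * replicate_ct.std(ddof=1) / replicate_ct.mean()

    print(f"slope {slope:.2f}, efficiency {efficiency:.1%}")
    print(f"{copies_from_ct(replicate_ct.mean()):.0f} copies, intra-assay CV {cv_percent:.2f}%")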

  15. Accurate phase-shift velocimetry in rock.

    PubMed

    Shukla, Matsyendra Nath; Vallatos, Antoine; Phoenix, Vernon R; Holmes, William M

    2016-06-01

    Spatially resolved Pulsed Field Gradient (PFG) velocimetry techniques can provide precious information concerning flow through opaque systems, including rocks. This velocimetry data is used to enhance flow models in a wide range of systems, from oil behaviour in reservoir rocks to contaminant transport in aquifers. Phase-shift velocimetry is the fastest way to produce velocity maps but critical issues have been reported when studying flow through rocks and porous media, leading to inaccurate results. Combining PFG measurements for flow through Bentheimer sandstone with simulations, we demonstrate that asymmetries in the molecular displacement distributions within each voxel are the main source of phase-shift velocimetry errors. We show that when flow-related average molecular displacements are negligible compared to self-diffusion ones, symmetric displacement distributions can be obtained while phase measurement noise is minimised. We elaborate a complete method for the production of accurate phase-shift velocimetry maps in rocks and low porosity media and demonstrate its validity for a range of flow rates. This development of accurate phase-shift velocimetry now enables more rapid and accurate velocity analysis, potentially helping to inform both industrial applications and theoretical models. PMID:27111139

  16. Accurate phase-shift velocimetry in rock

    NASA Astrophysics Data System (ADS)

    Shukla, Matsyendra Nath; Vallatos, Antoine; Phoenix, Vernon R.; Holmes, William M.

    2016-06-01

    Spatially resolved Pulsed Field Gradient (PFG) velocimetry techniques can provide precious information concerning flow through opaque systems, including rocks. This velocimetry data is used to enhance flow models in a wide range of systems, from oil behaviour in reservoir rocks to contaminant transport in aquifers. Phase-shift velocimetry is the fastest way to produce velocity maps but critical issues have been reported when studying flow through rocks and porous media, leading to inaccurate results. Combining PFG measurements for flow through Bentheimer sandstone with simulations, we demonstrate that asymmetries in the molecular displacement distributions within each voxel are the main source of phase-shift velocimetry errors. We show that when flow-related average molecular displacements are negligible compared to self-diffusion ones, symmetric displacement distributions can be obtained while phase measurement noise is minimised. We elaborate a complete method for the production of accurate phase-shift velocimetry maps in rocks and low porosity media and demonstrate its validity for a range of flow rates. This development of accurate phase-shift velocimetry now enables more rapid and accurate velocity analysis, potentially helping to inform both industrial applications and theoretical models.
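
    The conversion at the heart of phase-shift velocimetry is the linear relation between flow velocity and the phase accumulated under a bipolar gradient pair. A minimal sketch with hypothetical gradient parameters (the paper's acquisition settings are not given in the abstract):

    import numpy as np

    GAMMA = 2.675e8          # 1H gyromagnetic ratio, rad s^-1 T^-1

    def velocity_map(phase_pos, phase_neg, g=0.05, delta=2e-3, Delta=20e-3):
        """Velocity map (m/s) from two phase images acquired with opposite
        gradient polarity, using phi = GAMMA * g * delta * Delta * v, so the
        polarity difference carries 2*phi."""
        dphi = np.angle(np.exp(1j * (phase_pos - phase_neg)))   # wrap to (-pi, pi]
        return dphi / (2.0 * GAMMA * g * delta * Delta)

    # Synthetic check: a 1 mm/s flow gives a ~0.54 rad phase offset per polarity.
    phi = GAMMA * 0.05 * 2e-3 * 20e-3 * 1.0e-3
    print(velocity_map(np.array([phi]), np.array([-phi])))      # ~[0.001] m/s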

  17. Scan-rescan reproducibility of CT densitometric measures of emphysema

    NASA Astrophysics Data System (ADS)

    Chong, D.; van Rikxoort, E. M.; Kim, H. J.; Goldin, J. G.; Brown, M. S.

    2011-03-01

    This study investigated the reproducibility of HRCT densitometric measures of emphysema in patients scanned twice one week apart. 24 emphysema patients from a multicenter study were scanned at full inspiration (TLC) and expiration (RV), then again a week later for four scans total. Scans for each patient used the same scanner and protocol, except for tube current in three patients. Lung segmentation with gross airway removal was performed on the scans. Volume, weight, mean lung density (MLD), relative area under -950 HU (RA-950), and 15th percentile (PD-15) were calculated for TLC, and volume and an airtrapping mask (RA-air) between -950 and -850 HU for RV. For each measure, absolute differences were computed for each scan pair, and linear regression was performed against volume difference in a subgroup with volume difference <500 mL. Two TLC scan pairs were excluded due to segmentation failure. The mean lung volumes were 5802 +/- 1420 mL for TLC, 3878 +/- 1077 mL for RV. The mean absolute differences were 169 mL for TLC volume, 316 mL for RV volume, 14.5 g for weight, 5.0 HU for MLD, 0.66 p.p. for RA-950, 2.4 HU for PD-15, and 3.1 p.p. for RA-air. The <500 mL subgroup had 20 scan pairs for TLC and RV. The R2 values were 0.8 for weight, 0.60 for MLD, 0.29 for RA-950, 0.31 for PD-15, and 0.64 for RA-air. Our results indicate that considerable variability exists in densitometric measures over one week that cannot be attributed to breathhold or physiology. This has implications for clinical trials relying on these measures to assess emphysema treatment efficacy.
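
    The densitometric indices named above (MLD, RA-950, PD-15 and the expiratory air-trapping mask) follow directly from the segmented-lung HU histograms. A minimal sketch on synthetic voxel samples (the thresholds are the standard ones quoted in the abstract; the voxel data are illustrative):

    import numpy as np

    def densitometry(hu_inspiration, hu_expiration):
        """Standard CT densitometric indices from segmented-lung voxel HU values."""
        mld    = hu_inspiration.mean()                          # mean lung density, HU
        ra950  = 100.0 * np.mean(hu_inspiration < -950)         # emphysema index, % voxels
        pd15   = np.percentile(hu_inspiration, 15)              # 15th percentile density, HU
        ra_air = 100.0 * np.mean((hu_expiration > -950) &       # air-trapping mask, %
                                 (hu_expiration < -850))
        return mld, ra950, pd15, ra_air

    rng = np.random.default_rng(1)
    tlc = rng.normal(-870, 60, size=500_000)                    # synthetic TLC voxels
    rv  = rng.normal(-820, 70, size=400_000)                    # synthetic RV voxels
    mld, ra950, pd15, ra_air = densitometry(tlc, rv)
    print(f"MLD {mld:.1f} HU, RA-950 {ra950:.2f}%, PD-15 {pd15:.1f} HU, RA-air {ra_air:.1f}%")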

  18. Development of a Consistent and Reproducible Porcine Scald Burn Model.

    PubMed

    Andrews, Christine J; Kempf, Margit; Kimble, Roy; Cuttle, Leila

    2016-01-01

    There are very few porcine burn models that replicate scald injuries similar to those encountered by children. We have developed a robust porcine burn model capable of creating reproducible scald burns for a wide range of burn conditions. The study was conducted with juvenile Large White pigs, creating replicates of burn combinations: 50°C for 1, 2, 5 and 10 minutes, and 60°C, 70°C, 80°C and 90°C for 5 seconds. Visual wound examination, biopsies and Laser Doppler Imaging were performed at 1 and 24 hours and at 3 and 7 days post-burn. A consistent water temperature was maintained within the scald device for long durations (49.8 ± 0.1°C when set at 50°C). The macroscopic and histologic appearance was consistent between replicates of burn conditions. For 50°C water, 10 minute duration burns showed significantly deeper tissue injury than all shorter durations at 24 hours post-burn (p ≤ 0.0001), with damage seen to increase until day 3 post-burn. For 5 second duration burns, by day 7 post-burn the 80°C and 90°C scalds had damage detected significantly deeper in the tissue than the 70°C scalds (p ≤ 0.001). A reliable and safe model of porcine scald burn injury has been successfully developed. The novel apparatus with continually refreshed water improves consistency of scald creation for long exposure times. This model allows the pathophysiology of scald burn wound creation and progression to be examined. PMID:27612153

  19. A reproducible method to determine the meteoroid mass index

    NASA Astrophysics Data System (ADS)

    Pokorný, P.; Brown, P. G.

    2016-08-01

    Context. The determination of meteoroid mass indices is central to flux measurements and evolutionary studies of meteoroid populations. However, different authors use different approaches to fit observed data, making results difficult to reproduce and the resulting uncertainties difficult to justify. The real, physical, uncertainties are usually an order of magnitude higher than the reported values. Aims: We aim to develop a fully automated method that will measure meteoroid mass indices and associated uncertainty. We validate our method on large radar and optical datasets and compare results to obtain a best estimate of the true meteoroid mass index. Methods: Using MultiNest, a Bayesian inference tool that calculates the evidence and explores the parameter space, we search for the best fit of cumulative number vs. mass distributions in a four-dimensional space of variables (a,b,X1,X2). We explore biases in meteor echo distributions using optical meteor data as a calibration dataset to establish the systematic offset in measured mass index values. Results: Our best estimate for the average de-biased mass index for the sporadic meteoroid complex, as measured by radar appropriate to the mass range 10^-3 > m > 10^-5 g, was s = -2.10 ± 0.08. Optical data in the 10^-1 > m > 10^-3 g range, with the shower meteors removed, produced s = -2.08 ± 0.08. We find the mass index used by Grün et al. (1985) is substantially larger than we measure in the 10^-4 < m < 10^-1 g range. Our own code with a simple manual and a sample dataset can be found here: ftp://aquarid.physics.uwo.ca/pub/peter/MassIndexCode/
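
    The paper fits a four-parameter model with MultiNest; as a much simpler illustration of the quantity being estimated, the sketch below gives the single power-law maximum-likelihood estimate of the differential mass index on synthetic data (this is not the authors' method or their published code).

    import numpy as np

    def mass_index_mle(masses, m_min):
        """Maximum-likelihood differential mass index s for dN/dm ~ m**(-s)
        above a completeness limit m_min (standard Pareto-tail estimator)."""
        m = np.asarray(masses, float)
        m = m[m >= m_min]
        s_hat = 1.0 + m.size / np.sum(np.log(m / m_min))
        s_err = (s_hat - 1.0) / np.sqrt(m.size)      # asymptotic 1-sigma uncertainty
        return s_hat, s_err

    # Synthetic population drawn with s = 2.10 via inverse-CDF sampling.
    rng = np.random.default_rng(2)
    s_true, m_min = 2.10, 1e-5                       # grams
    u = rng.uniform(size=20_000)
    masses = m_min * (1.0 - u) ** (-1.0 / (s_true - 1.0))
    print(mass_index_mle(masses, m_min))             # roughly (2.10, 0.008)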

  20. Towards Principled Experimental Study of Autonomous Mobile Robots

    NASA Technical Reports Server (NTRS)

    Gat, Erann

    1995-01-01

    We review the current state of research in autonomous mobile robots and conclude that there is an inadequate basis for predicting the reliability and behavior of robots operating in unengineered environments. We present a new approach to the study of autonomous mobile robot performance based on formal statistical analysis of independently reproducible experiments conducted on real robots. Simulators serve as models rather than experimental surrogates. We demonstrate three new results: 1) Two commonly used performance metrics (time and distance) are not as well correlated as is often tacitly assumed. 2) The probability distributions of these performance metrics are exponential rather than normal, and 3) a modular, object-oriented simulation accurately predicts the behavior of the real robot in a statistically significant manner.

  1. Using Copula Distributions to Support More Accurate Imaging-Based Diagnostic Classifiers for Neuropsychiatric Disorders

    PubMed Central

    Bansal, Ravi; Hao, Xuejun; Liu, Jun; Peterson, Bradley S.

    2014-01-01

    Many investigators have tried to apply machine learning techniques to magnetic resonance images (MRIs) of the brain in order to diagnose neuropsychiatric disorders. Usually the number of brain imaging measures (such as measures of cortical thickness and measures of local surface morphology) derived from the MRIs (i.e., their dimensionality) has been large (e.g. >10) relative to the number of participants who provide the MRI data (<100). Sparse data in a high dimensional space increases the variability of the classification rules that machine learning algorithms generate, thereby limiting the validity, reproducibility, and generalizability of those classifiers. The accuracy and stability of the classifiers can improve significantly if the multivariate distributions of the imaging measures can be estimated accurately. To accurately estimate the multivariate distributions using sparse data, we propose to estimate first the univariate distributions of imaging data and then combine them using a Copula to generate more accurate estimates of their multivariate distributions. We then sample the estimated Copula distributions to generate dense sets of imaging measures and use those measures to train classifiers. We hypothesize that the dense sets of brain imaging measures will generate classifiers that are stable to variations in brain imaging measures, thereby improving the reproducibility, validity, and generalizability of diagnostic classification algorithms in imaging datasets from clinical populations. In our experiments, we used both computer-generated and real-world brain imaging datasets to assess the accuracy of multivariate Copula distributions in estimating the corresponding multivariate distributions of real-world imaging data. Our experiments showed that diagnostic classifiers generated using imaging measures sampled from the Copula were significantly more accurate and more reproducible than were the classifiers generated using either the real-world imaging
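
    A minimal sketch of the Gaussian-copula idea described above: transform each imaging measure to normal scores through its empirical distribution, estimate the correlation of those scores, then sample a dense synthetic set with the same marginals and dependence structure. The marginal model and copula family here are assumptions for illustration, not the authors' implementation.

    import numpy as np
    from scipy.stats import norm

    def gaussian_copula_sample(data, n_samples, rng=None):
        """data: (n_subjects, n_measures) array of imaging measures."""
        rng = rng or np.random.default_rng(0)
        n, d = data.shape
        ranks = data.argsort(axis=0).argsort(axis=0) + 1.0        # 1..n per column
        z = norm.ppf(ranks / (n + 1.0))                           # normal scores
        corr = np.corrcoef(z, rowvar=False)                       # copula correlation
        samples_z = rng.multivariate_normal(np.zeros(d), corr, size=n_samples)
        u = norm.cdf(samples_z)
        # map back through the empirical quantiles of each measure
        return np.column_stack([np.quantile(data[:, j], u[:, j]) for j in range(d)])

    rng = np.random.default_rng(3)
    real = rng.multivariate_normal([2.5, 3.0], [[1.0, 0.7], [0.7, 1.0]], size=80)
    dense = gaussian_copula_sample(real, n_samples=5000, rng=rng)
    print(np.corrcoef(dense, rowvar=False)[0, 1])                 # roughly 0.7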

  2. Reproducibility of magnetic resonance spectroscopy in correlation with signal-to-noise ratio.

    PubMed

    Okada, Tomohisa; Sakamoto, Setsu; Nakamoto, Yuji; Kohara, Nobuo; Senda, Michio

    2007-11-15

    An increased amount of myoinositol (mI) relative to creatine (Cr) by proton MR spectroscopy ((1)H-MRS) measurement gives a useful aid for the diagnosis of Alzheimer's disease (AD). Previous results of test-retest measurement of mI, however, have shown variability more than twice as large as for other metabolites. The aims of this study were to analyze test-retest variability of (1)H-MRS measurements in correlation with signal-to-noise ratio (SNR). Ten subjects clinically suspected of mild AD were examined twice (2-14 days apart) with (1)H-MRS measurements of voxels placed at anterior and posterior cingulate cortex. The percent differences between two measurements (%differences) of mI/Cr showed a significant linear trend to decrease as average SNR increased, but %differences of N-acetylaspartate (NAA)/Cr and choline (Cho)/Cr did not. The average of %differences was 10.5, 15.0 and 20.8 for NAA/Cr, Cho/Cr, and mI/Cr, respectively, indicating a prominent deterioration of mI/Cr measurement reproducibility, which decreased to 6.96, 15.4 and 9.87, respectively, when the analysis was limited to measurements with SNR over 25. The results indicate that MRS measurements with high SNR should be used to obtain reliable assessments of mI/Cr as accurate diagnostic indicator of AD in clinical MR examinations. PMID:17900878

  3. TRIC: an automated alignment strategy for reproducible protein quantification in targeted proteomics.

    PubMed

    Röst, Hannes L; Liu, Yansheng; D'Agostino, Giuseppe; Zanella, Matteo; Navarro, Pedro; Rosenberger, George; Collins, Ben C; Gillet, Ludovic; Testa, Giuseppe; Malmström, Lars; Aebersold, Ruedi

    2016-09-01

    Next-generation mass spectrometric (MS) techniques such as SWATH-MS have substantially increased the throughput and reproducibility of proteomic analysis, but ensuring consistent quantification of thousands of peptide analytes across multiple liquid chromatography-tandem MS (LC-MS/MS) runs remains a challenging and laborious manual process. To produce highly consistent and quantitatively accurate proteomics data matrices in an automated fashion, we developed TRIC (http://proteomics.ethz.ch/tric/), a software tool that utilizes fragment-ion data to perform cross-run alignment, consistent peak-picking and quantification for high-throughput targeted proteomics. TRIC reduced the identification error compared to a state-of-the-art SWATH-MS analysis without alignment by more than threefold at constant recall while correcting for highly nonlinear chromatographic effects. On a pulsed-SILAC experiment performed on human induced pluripotent stem cells, TRIC was able to automatically align and quantify thousands of light and heavy isotopic peak groups. Thus, TRIC fills a gap in the pipeline for automated analysis of massively parallel targeted proteomics data sets. PMID:27479329

  4. Optimized PCR Conditions and Increased shRNA Fold Representation Improve Reproducibility of Pooled shRNA Screens

    PubMed Central

    Strezoska, Žaklina; Licon, Abel; Haimes, Josh; Spayd, Katie Jansen; Patel, Kruti M.; Sullivan, Kevin; Jastrzebski, Katarzyna; Simpson, Kaylene J.; Leake, Devin; van Brabant Smith, Anja; Vermeulen, Annaleen

    2012-01-01

    RNAi screening using pooled shRNA libraries is a valuable tool for identifying genetic regulators of biological processes. However, for a successful pooled shRNA screen, it is imperative to thoroughly optimize experimental conditions to obtain reproducible data. Here we performed viability screens with a library of ∼10 000 shRNAs at two different fold representations (100- and 500-fold at transduction) and report the reproducibility of shRNA abundance changes between screening replicates determined by microarray and next generation sequencing analyses. We show that the technical reproducibility between PCR replicates from a pooled screen can be drastically improved by ensuring that PCR amplification steps are kept within the exponential phase and by using an amount of genomic DNA input in the reaction that maintains the average template copies per shRNA used during library transduction. Using these optimized PCR conditions, we then show that higher reproducibility of biological replicates is obtained by both microarray and next generation sequencing when screening with higher average shRNA fold representation. shRNAs that change abundance reproducibly in biological replicates (primary hits) are identified from screens performed with both 100- and 500-fold shRNA representation, however a higher percentage of primary hit overlap between screening replicates is obtained from 500-fold shRNA representation screens. While strong hits with larger changes in relative abundance were generally identified in both screens, hits with smaller changes were identified only in the screens performed with the higher shRNA fold representation at transduction. PMID:22870320
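
    The fold-representation bookkeeping behind the genomic DNA input recommendation reduces to simple arithmetic, assuming roughly one integrated shRNA copy per transduced diploid genome and about 6.6 pg of genomic DNA per cell (both standard approximations, not figures from the paper):

    def gdna_input_needed(n_shrnas, fold_representation, pg_per_genome=6.6):
        """Micrograms of genomic DNA per PCR needed so that the template pool
        keeps the same average copies-per-shRNA as the transduction fold
        representation (assumes ~one provirus per diploid genome)."""
        genomes_needed = n_shrnas * fold_representation
        return genomes_needed * pg_per_genome / 1e6          # pg -> ug

    for fold in (100, 500):
        print(f"{fold:>3}-fold representation of 10,000 shRNAs: "
              f"{gdna_input_needed(10_000, fold):.1f} ug gDNA per PCR")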

  5. High Frequency QRS ECG Accurately Detects Cardiomyopathy

    NASA Technical Reports Server (NTRS)

    Schlegel, Todd T.; Arenare, Brian; Poulin, Gregory; Moser, Daniel R.; Delgado, Reynolds

    2005-01-01

    High frequency (HF, 150-250 Hz) analysis over the entire QRS interval of the ECG is more sensitive than conventional ECG for detecting myocardial ischemia. However, the accuracy of HF QRS ECG for detecting cardiomyopathy is unknown. We obtained simultaneous resting conventional and HF QRS 12-lead ECGs in 66 patients with cardiomyopathy (EF = 23.2 plus or minus 6.1%, mean plus or minus SD) and in 66 age- and gender-matched healthy controls using PC-based ECG software recently developed at NASA. The single most accurate ECG parameter for detecting cardiomyopathy was an HF QRS morphological score that takes into consideration the total number and severity of reduced amplitude zones (RAZs) present plus the clustering of RAZs together in contiguous leads. This RAZ score had an area under the receiver operator curve (ROC) of 0.91, and was 88% sensitive, 82% specific and 85% accurate for identifying cardiomyopathy at optimum score cut-off of 140 points. Although conventional ECG parameters such as the QRS and QTc intervals were also significantly longer in patients than controls (P less than 0.001, BBBs excluded), these conventional parameters were less accurate (area under the ROC = 0.77 and 0.77, respectively) than HF QRS morphological parameters for identifying underlying cardiomyopathy. The total amplitude of the HF QRS complexes, as measured by summed root mean square voltages (RMSVs), also differed between patients and controls (33.8 plus or minus 11.5 vs. 41.5 plus or minus 13.6 mV, respectively, P less than 0.003), but this parameter was even less accurate in distinguishing the two groups (area under ROC = 0.67) than the HF QRS morphologic and conventional ECG parameters. Diagnostic accuracy was optimal (86%) when the RAZ score from the HF QRS ECG and the QTc interval from the conventional ECG were used simultaneously with cut-offs of greater than or equal to 40 points and greater than or equal to 445 ms, respectively. In conclusion 12-lead HF QRS ECG employing

  6. Ordered array of Ag semishells on different diameter monolayer polystyrene colloidal crystals: An ultrasensitive and reproducible SERS substrate

    PubMed Central

    Yi, Zao; Niu, Gao; Luo, Jiangshan; Kang, Xiaoli; Yao, Weitang; Zhang, Weibin; Yi, Yougen; Yi, Yong; Ye, Xin; Duan, Tao; Tang, Yongjian

    2016-01-01

    Ag semishells (AgSS) ordered arrays for surface-enhanced Raman scattering (SERS) spectroscopy have been prepared by depositing Ag film onto polystyrene colloidal particle (PSCP) monolayer template arrays. The diversified SERS activity of the ordered AgSS arrays mainly depends on the PSCP diameter and Ag film thickness. The high SERS sensitivity and reproducibility are proved by the detection of rhodamine 6G (R6G) and 4-aminothiophenol (4-ATP) molecules. The prominent enhancements of SERS are mainly from the “V”-shaped or “U”-shaped nanogaps on AgSS, which are experimentally and theoretically investigated. The high SERS activity, stability and reproducibility make the ordered AgSS a promising choice for practical SERS low concentration detection applications. PMID:27586562

  7. Ordered array of Ag semishells on different diameter monolayer polystyrene colloidal crystals: An ultrasensitive and reproducible SERS substrate.

    PubMed

    Yi, Zao; Niu, Gao; Luo, Jiangshan; Kang, Xiaoli; Yao, Weitang; Zhang, Weibin; Yi, Yougen; Yi, Yong; Ye, Xin; Duan, Tao; Tang, Yongjian

    2016-01-01

    Ag semishells (AgSS) ordered arrays for surface-enhanced Raman scattering (SERS) spectroscopy have been prepared by depositing Ag film onto polystyrene colloidal particle (PSCP) monolayer template arrays. The diversified SERS activity of the ordered AgSS arrays mainly depends on the PSCP diameter and Ag film thickness. The high SERS sensitivity and reproducibility are proved by the detection of rhodamine 6G (R6G) and 4-aminothiophenol (4-ATP) molecules. The prominent enhancements of SERS are mainly from the "V"-shaped or "U"-shaped nanogaps on AgSS, which are experimentally and theoretically investigated. The high SERS activity, stability and reproducibility make the ordered AgSS a promising choice for practical SERS low concentration detection applications. PMID:27586562

  8. Accurate Molecular Dimensions from Stearic Acid Monolayers.

    ERIC Educational Resources Information Center

    Lane, Charles A.; And Others

    1984-01-01

    Discusses modifications in the fatty acid monolayer experiment to reduce the inaccurate moleculary data students usually obtain. Copies of the experimental procedure used and a Pascal computer program to work up the data are available from the authors. (JN)

  9. Procedure for accurate fabrication of tissue compensators with high-density material

    NASA Astrophysics Data System (ADS)

    Mejaddem, Younes; Lax, Ingmar; Adakkai K, Shamsuddin

    1997-02-01

    An accurate method for producing compensating filters using high-density material (Cerrobend) is described. The procedure consists of two cutting steps in a Styrofoam block: (i) levelling a surface of the block to a reference level; (ii) depth-modulated milling of the levelled block in accordance with pre-calculated thickness profiles of the compensator. The calculated thickness (generated by a dose planning system) can be reproduced within acceptable accuracy. The desired compensator thickness manufactured according to this procedure is reproduced to within 0.1 mm, corresponding to a 0.5% change in dose at a beam quality of 6 MV. The results of our quality control checks performed with the technique of stylus profiling measurements show an accuracy of 0.04 mm in the milling process over an arbitrary profile along the milled-out Styrofoam block.

  10. Research Reproducibility in Longitudinal Multi-Center Studies Using Data from Electronic Health Records

    PubMed Central

    Zozus, Meredith N.; Richesson, Rachel L.; Walden, Anita; Tenenbaum, Jessie D.; Hammond, W.E.

    2016-01-01

    A fundamental premise of scientific research is that it should be reproducible. However, the specific requirements for reproducibility of research using electronic health record (EHR) data have not been sufficiently articulated. There is no guidance for researchers about how to assess a given project and identify provisions for reproducibility. We analyze three different clinical research initiatives that use EHR data in order to define a set of requirements to reproduce the research using the original or other datasets. We identify specific project features that drive these requirements. The resulting framework will support the much-needed discussion of strategies to ensure the reproducibility of research that uses data from EHRs. PMID:27570682

  11. Research Reproducibility in Longitudinal Multi-Center Studies Using Data from Electronic Health Records.

    PubMed

    Zozus, Meredith N; Richesson, Rachel L; Walden, Anita; Tenenbaum, Jessie D; Hammond, W E

    2016-01-01

    A fundamental premise of scientific research is that it should be reproducible. However, the specific requirements for reproducibility of research using electronic health record (EHR) data have not been sufficiently articulated. There is no guidance for researchers about how to assess a given project and identify provisions for reproducibility. We analyze three different clinical research initiatives that use EHR data in order to define a set of requirements to reproduce the research using the original or other datasets. We identify specific project features that drive these requirements. The resulting framework will support the much-needed discussion of strategies to ensure the reproducibility of research that uses data from EHRs. PMID:27570682

  12. Research Elements: new article types by Elsevier to facilitate reproducibility in science

    NASA Astrophysics Data System (ADS)

    Zudilova-Seinstra, Elena; van Hensbergen, Kitty; Wacek, Bart

    2016-04-01

    When researchers start to make plans for new experiments, this is the beginning of a whole cycle of work, including experimental designs, tweaking of existing methods, developing protocols, writing code, collecting and processing experimental data, etc. A large part of this very useful information rarely gets published, which makes experiments difficult to reproduce. The same holds for experimental data, which is not always provided in a reusable format and lacks descriptive information. Furthermore, many types of data, such as a replication data, negative datasets or data from "intermediate experiments" often don't get published because they have no place in a research journal. To address this concern, Elsevier launched a series of peer-reviewed journal titles grouped under the umbrella of Research Elements (https://www.elsevier.com/books-and-journals/research-elements) that allow researchers to publish their data, software, materials and methods and other elements of the research cycle in a brief article format. To facilitate reproducibility, Research Elements have thoroughly thought out submission templates that include all necessary information and metadata as well as peer-review criteria defined per article type. Research Elements can be applicable to multiple research areas; for example, a number of multidisciplinary journals (Data in Brief, SoftwareX, MethodsX) welcome submissions from a large number of subject areas. At other times, these elements are better served within a single field; therefore, a number of domain-specific journals (e.g.: Genomics Data, Chemical Data Collections, Neurocomputing) support the new article formats, too. Upon publication, all Research Elements are assigned with persistent identifiers for direct citation and easy discoverability. Persistent identifiers are also used for interlinking Research Elements and relevant research papers published in traditional journals. Some Research Elements allow post-publication article updates

  13. Can global hydrological models reproduce large scale river flood regimes?

    NASA Astrophysics Data System (ADS)

    Eisner, Stephanie; Flörke, Martina

    2013-04-01

    River flooding remains one of the most severe natural hazards. On the one hand, major flood events pose a serious threat to human well-being, causing deaths and considerable economic damage. On the other hand, the periodic occurrence of flood pulses is crucial to maintain the functioning of riverine floodplains and wetlands, and to preserve the ecosystem services the latter provide. In many regions, river floods reveal a distinct seasonality, i.e. they occur at a particular time during the year. This seasonality is related to regionally dominant flood generating processes which can be expressed in river flood types. While in data-rich regions (esp. Europe and North America) the analysis of flood regimes can be based on observed river discharge time series, this data is sparse or lacking in many other regions of the world. This gap of knowledge can be filled by global modeling approaches. However, to date most global modeling studies have focused on mean annual or monthly water availability and their change over time while simulating discharge extremes, both floods and droughts, still remains a challenge for large scale hydrological models. This study will explore the ability of the global hydrological model WaterGAP3 to simulate the large scale patterns of river flood regimes, represented by seasonal pattern and the dominant flood type. WaterGAP3 simulates the global terrestrial water balance on a 5 arc minute spatial grid (excluding Greenland and Antarctica) at a daily time step. The model accounts for human interference on river flow, i.e. water abstraction for various purposes, e.g. irrigation, and flow regulation by large dams and reservoirs. Our analysis will provide insight in the general ability of global hydrological models to reproduce river flood regimes and thus will promote the creation of a global map of river flood regimes to provide a spatially inclusive and comprehensive picture. Understanding present-day flood regimes can support both flood risk

  14. Reproducing kernel potential energy surfaces in biomolecular simulations: Nitric oxide binding to myoglobin

    SciTech Connect

    Soloviov, Maksym; Meuwly, Markus

    2015-09-14

    Multidimensional potential energy surfaces based on reproducing kernel interpolation are employed to explore the energetics and dynamics of free and bound nitric oxide in myoglobin (Mb). Combining a force field description for the majority of degrees of freedom and the higher-accuracy representation for the NO ligand and the Fe out-of-plane motion allows for a simulation approach akin to a mixed quantum mechanics/molecular mechanics treatment. However, the kernel-representation can be evaluated at conventional force-field speed. With the explicit inclusion of the Fe-out-of-plane (Fe-oop) coordinate, the dynamics and structural equilibrium after photodissociation of the ligand are correctly described compared to experiment. Experimentally, the Fe-oop coordinate plays an important role for the ligand dynamics. This is also found here where the isomerization dynamics between the Fe–ON and Fe–NO states is significantly affected by whether or not this coordinate is explicitly included. Although the Fe–ON conformation is metastable when considering only the bound ²A state, it may disappear once the ⁴A state is included. This explains the absence of the Fe–ON state in previous experimental investigations of MbNO.
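
    As a rough illustration of kernel-based interpolation of ab initio energies (the sketch uses a Gaussian kernel on a hypothetical one-dimensional cut; the Mb/NO surfaces themselves use reciprocal-power reproducing kernels over several coordinates):

    import numpy as np

    def rkhs_interpolate(x_train, e_train, x_query, sigma=0.35, reg=1e-9):
        """Kernel interpolation: solve K c = E on the training grid, then
        evaluate E(x) = sum_i c_i k(x, x_i)."""
        def kernel(a, b):
            return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2.0 * sigma ** 2))
        K = kernel(x_train, x_train) + reg * np.eye(x_train.size)
        coef = np.linalg.solve(K, e_train)
        return kernel(x_query, x_train) @ coef

    # Hypothetical Morse-like 1-D cut sampled on a coarse grid.
    x = np.linspace(1.2, 4.0, 15)
    e = (1.0 - np.exp(-1.5 * (x - 1.9))) ** 2
    x_fine = np.linspace(1.2, 4.0, 200)
    e_exact = (1.0 - np.exp(-1.5 * (x_fine - 1.9))) ** 2
    print(f"max interpolation error: {np.abs(rkhs_interpolate(x, e, x_fine) - e_exact).max():.1e}")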

  15. Accurately Mapping M31's Microlensing Population

    NASA Astrophysics Data System (ADS)

    Crotts, Arlin

    2004-07-01

    We propose to augment an existing microlensing survey of M31 with source identifications provided by a modest amount of ACS {and WFPC2 parallel} observations to yield an accurate measurement of the masses responsible for microlensing in M31, and presumably much of its dark matter. The main benefit of these data is the determination of the physical {or "einstein"} timescale of each microlensing event, rather than an effective {"FWHM"} timescale, allowing masses to be determined more than twice as accurately as without HST data. The einstein timescale is the ratio of the lensing cross-sectional radius and relative velocities. Velocities are known from kinematics, and the cross-section is directly proportional to the {unknown} lensing mass. We cannot easily measure these quantities without knowing the amplification, hence the baseline magnitude, which requires the resolution of HST to find the source star. This makes a crucial difference because M31 lens mass determinations can be more accurate than those towards the Magellanic Clouds through our Galaxy's halo {for the same number of microlensing events} due to the better constrained geometry in the M31 microlensing situation. Furthermore, our larger survey, just completed, should yield at least 100 M31 microlensing events, more than any Magellanic survey. A small amount of ACS+WFPC2 imaging will deliver the potential of this large database {about 350 nights}. For the whole survey {and a delta-function mass distribution} the mass error should approach only about 15%, or about 6% error in slope for a power-law distribution. These results will better allow us to pinpoint the lens halo fraction, and the shape of the halo lens spatial distribution, and allow generalization/comparison of the nature of halo dark matter in spiral galaxies. In addition, we will be able to establish the baseline magnitude for about 50,000 variable stars, as well as measure an unprecedentedly detailed color-magnitude diagram and luminosity
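
    The Einstein-timescale argument can be illustrated with the standard point-lens relations, t_E = R_E / v_rel with R_E = sqrt(4GM/c^2 * D_L*D_LS/D_S); the lens mass, distances and transverse velocity below are purely illustrative, not survey values. Since the inferred mass scales as t_E squared for fixed geometry and kinematics, the fractional mass error is roughly twice the fractional timescale error, which is why pinning down the physical timescale matters.

    import numpy as np

    G, C, MSUN = 6.674e-11, 2.998e8, 1.989e30      # SI units
    KPC, AU, DAY = 3.086e19, 1.496e11, 86400.0

    def einstein_timescale(m_lens_msun, d_l_kpc, d_s_kpc, v_rel_kms):
        """Einstein radius (m) and crossing timescale (s) for a point lens."""
        d_l, d_s = d_l_kpc * KPC, d_s_kpc * KPC
        d_ls = d_s - d_l
        r_e = np.sqrt(4.0 * G * m_lens_msun * MSUN / C**2 * d_l * d_ls / d_s)
        return r_e, r_e / (v_rel_kms * 1e3)

    # Illustrative M31 self-lensing geometry: a 0.5 solar-mass halo lens
    # 10 kpc in front of the source, 200 km/s relative transverse velocity.
    r_e, t_e = einstein_timescale(0.5, 770.0, 780.0, 200.0)
    print(f"R_E = {r_e / AU:.1f} AU, t_E = {t_e / DAY:.0f} days")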

  16. Accurate measurement of unsteady state fluid temperature

    NASA Astrophysics Data System (ADS)

    Jaremkiewicz, Magdalena

    2016-07-01

    In this paper, two accurate methods for determining the transient fluid temperature were presented. Measurements were conducted for boiling water since its temperature is known. At the beginning the thermometers are at the ambient temperature and next they are immediately immersed into saturated water. The measurements were carried out with two thermometers of different construction but with the same housing outer diameter equal to 15 mm. One of them is a K-type industrial thermometer widely available commercially. The temperature indicated by the thermometer was corrected considering the thermometers as the first or second order inertia devices. The new design of a thermometer was proposed and also used to measure the temperature of boiling water. Its characteristic feature is a cylinder-shaped housing with the sheath thermocouple located in its center. The temperature of the fluid was determined based on measurements taken in the axis of the solid cylindrical element (housing) using the inverse space marching method. Measurements of the transient temperature of the air flowing through the wind tunnel using the same thermometers were also carried out. The proposed measurement technique provides more accurate results compared with measurements using industrial thermometers in conjunction with simple temperature correction using the inertial thermometer model of the first or second order. By comparing the results, it was demonstrated that the new thermometer allows obtaining the fluid temperature much faster and with higher accuracy in comparison to the industrial thermometer. Accurate measurements of the fast changing fluid temperature are possible due to the low inertia thermometer and fast space marching method applied for solving the inverse heat conduction problem.
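
    For the first-order inertia model mentioned above, the reconstruction amounts to adding the thermometer time constant times the derivative of the indicated temperature. A sketch on a synthetic immersion test (the time constant and temperatures are hypothetical, not the thermometers of the paper):

    import numpy as np

    def correct_first_order(t, t_indicated, tau):
        """Fluid temperature from a first-order thermometer:
        T_fluid = T_indicated + tau * dT_indicated/dt."""
        return t_indicated + tau * np.gradient(t_indicated, t)

    # Thermometer (tau = 8 s) plunged from 20 C ambient into water at 100 C;
    # the indicated temperature follows the first-order step response.
    tau, t_amb, t_fluid = 8.0, 20.0, 100.0
    t = np.linspace(0.0, 30.0, 601)
    indicated = t_fluid + (t_amb - t_fluid) * np.exp(-t / tau)
    corrected = correct_first_order(t, indicated, tau)
    print(f"after 5 s: indicated {np.interp(5.0, t, indicated):.1f} C, "
          f"corrected {np.interp(5.0, t, corrected):.1f} C")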

  17. Accurate upwind methods for the Euler equations

    NASA Technical Reports Server (NTRS)

    Huynh, Hung T.

    1993-01-01

    A new class of piecewise linear methods for the numerical solution of the one-dimensional Euler equations of gas dynamics is presented. These methods are uniformly second-order accurate, and can be considered as extensions of Godunov's scheme. With an appropriate definition of monotonicity preservation for the case of linear convection, it can be shown that they preserve monotonicity. Similar to Van Leer's MUSCL scheme, they consist of two key steps: a reconstruction step followed by an upwind step. For the reconstruction step, a monotonicity constraint that preserves uniform second-order accuracy is introduced. Computational efficiency is enhanced by devising a criterion that detects the 'smooth' part of the data where the constraint is redundant. The concept and coding of the constraint are simplified by the use of the median function. A slope steepening technique, which has no effect at smooth regions and can resolve a contact discontinuity in four cells, is described. As for the upwind step, existing and new methods are applied in a manner slightly different from those in the literature. These methods are derived by approximating the Euler equations via linearization and diagonalization. At a 'smooth' interface, Harten, Lax, and Van Leer's one intermediate state model is employed. A modification for this model that can resolve contact discontinuities is presented. Near a discontinuity, either this modified model or a more accurate one, namely, Roe's flux-difference splitting, is used. The current presentation of Roe's method, via the conceptually simple flux-vector splitting, not only establishes a connection between the two splittings, but also leads to an admissibility correction with no conditional statement, and an efficient approximation to Osher's approximate Riemann solver. These reconstruction and upwind steps result in schemes that are uniformly second-order accurate and economical at smooth regions, and yield high resolution at discontinuities.
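
    The role of the median function in the monotonicity constraint can be illustrated with a standard MC-type limited slope for a piecewise-linear reconstruction; this is a generic sketch of the ingredient, not the scheme of the paper.

    import numpy as np

    def minmod(a, b):
        """Zero when the arguments disagree in sign, else the smaller magnitude."""
        return 0.5 * (np.sign(a) + np.sign(b)) * np.minimum(np.abs(a), np.abs(b))

    def median3(a, b, c):
        """Median of three values, written via minmod as in slope-limiting practice."""
        return a + minmod(b - a, c - a)

    def limited_slopes(u):
        """Monotonicity-preserving cell slopes for cell averages u (periodic grid):
        median of zero, the central difference, and twice the minmod of the
        one-sided differences."""
        du_minus = u - np.roll(u, 1)
        du_plus  = np.roll(u, -1) - u
        central  = 0.5 * (du_minus + du_plus)
        return median3(np.zeros_like(u), central, 2.0 * minmod(du_minus, du_plus))

    x = np.linspace(0.0, 1.0, 50, endpoint=False)
    u = np.where(x < 0.5, 1.0 + 0.2 * np.sin(2.0 * np.pi * x), 0.0)
    slopes = limited_slopes(u)
    # the limiter zeroes the slope at the jump and keeps the central difference
    # in smooth monotone regions
    print(f"slope at the jump: {slopes[25]:.3f}, in the smooth region: {slopes[10]:.4f}")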

  18. The first accurate description of an aurora

    NASA Astrophysics Data System (ADS)

    Schröder, Wilfried

    2006-12-01

    As technology has advanced, the scientific study of auroral phenomena has increased by leaps and bounds. A look back at the earliest descriptions of aurorae offers an interesting look into how medieval scholars viewed the subjects that we study.Although there are earlier fragmentary references in the literature, the first accurate description of the aurora borealis appears to be that published by the German Catholic scholar Konrad von Megenberg (1309-1374) in his book Das Buch der Natur (The Book of Nature). The book was written between 1349 and 1350.

  19. Are Kohn-Sham conductances accurate?

    PubMed

    Mera, H; Niquet, Y M

    2010-11-19

    We use Fermi-liquid relations to address the accuracy of conductances calculated from the single-particle states of exact Kohn-Sham (KS) density functional theory. We demonstrate a systematic failure of this procedure for the calculation of the conductance, and show how it originates from the lack of renormalization in the KS spectral function. In certain limits this failure can lead to a large overestimation of the true conductance. We also show, however, that the KS conductances can be accurate for single-channel molecular junctions and systems where direct Coulomb interactions are strongly dominant. PMID:21231333

  20. Accurate density functional thermochemistry for larger molecules.

    SciTech Connect

    Raghavachari, K.; Stefanov, B. B.; Curtiss, L. A.; Lucent Tech.

    1997-06-20

    Density functional methods are combined with isodesmic bond separation reaction energies to yield accurate thermochemistry for larger molecules. Seven different density functionals are assessed for the evaluation of heats of formation, ΔHf(298 K), for a test set of 40 molecules composed of H, C, O and N. The use of bond separation energies results in a dramatic improvement in the accuracy of all the density functionals. The B3-LYP functional has the smallest mean absolute deviation from experiment (1.5 kcal/mol).
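
    The bond separation idea can be made concrete for propane, C3H8 + CH4 -> 2 C2H6: combine a computed isodesmic reaction enthalpy with experimental heats of formation of the small reference molecules. The reaction enthalpy below is a hypothetical illustrative value and the reference data are approximate (kcal/mol, 298 K); neither is taken from the paper's test set.

    # Delta_H_rxn = 2*dHf(C2H6) - dHf(C3H8) - dHf(CH4)
    DHF_EXP = {"CH4": -17.8, "C2H6": -20.0}     # approximate experimental values

    def dhf_propane_from_bond_separation(dh_rxn_calc):
        """Heat of formation of propane inferred from its bond separation reaction."""
        return 2.0 * DHF_EXP["C2H6"] - DHF_EXP["CH4"] - dh_rxn_calc

    dh_rxn_dft = 2.6     # hypothetical computed reaction enthalpy, kcal/mol
    print(f"dHf(propane) = {dhf_propane_from_bond_separation(dh_rxn_dft):.1f} kcal/mol "
          f"(experiment is about -25.0)")

    The error cancellation built into the isodesmic reaction is what lets modest density functionals reach the accuracy quoted above.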

  1. New law requires 'medically accurate' lesson plans.

    PubMed

    1999-09-17

    The California Legislature has passed a bill requiring all textbooks and materials used to teach about AIDS be medically accurate and objective. Statements made within the curriculum must be supported by research conducted in compliance with scientific methods, and published in peer-reviewed journals. Some of the current lesson plans were found to contain scientifically unsupported and biased information. In addition, the bill requires material to be "free of racial, ethnic, or gender biases." The legislation is supported by a wide range of interests, but opposed by the California Right to Life Education Fund, because they believe it discredits abstinence-only material. PMID:11366835

  2. Comprehensive and Reproducible Phosphopeptide Enrichment Using Iron Immobilized Metal Ion Affinity Chromatography (Fe-IMAC) Columns

    PubMed Central

    Ruprecht, Benjamin; Koch, Heiner; Medard, Guillaume; Mundt, Max; Kuster, Bernhard; Lemeer, Simone

    2015-01-01

    Advances in phosphopeptide enrichment methods enable the identification of thousands of phosphopeptides from complex samples. Current offline enrichment approaches using TiO2, Ti, and Fe immobilized metal ion affinity chromatography (IMAC) material in batch or microtip format are widely used, but they suffer from irreproducibility and compromised selectivity. To address these shortcomings, we revisited the merits of performing phosphopeptide enrichment in an HPLC column format. We found that Fe-IMAC columns enabled the selective, comprehensive, and reproducible enrichment of phosphopeptides out of complex lysates. Column enrichment did not suffer from bead-to-sample ratio issues and scaled linearly from 100 μg to 5 mg of digest. Direct measurements on an Orbitrap Velos mass spectrometer identified >7500 unique phosphopeptides with 90% selectivity and good quantitative reproducibility (median cv of 15%). The number of unique phosphopeptides could be increased to more than 14,000 when the IMAC eluate was subjected to a subsequent hydrophilic strong anion exchange separation. Fe-IMAC columns outperformed Ti-IMAC and TiO2 in batch or tip mode in terms of phosphopeptide identification and intensity. Permutation enrichments of flow-throughs showed that all materials largely bound the same phosphopeptide species, independent of physicochemical characteristics. However, binding capacity and elution efficiency did profoundly differ among the enrichment materials and formats. As a result, the often quoted orthogonality of the materials has to be called into question. Our results strongly suggest that insufficient capacity, inefficient elution, and the stochastic nature of data-dependent acquisition in mass spectrometry are the causes of the experimentally observed complementarity. The Fe-IMAC enrichment workflow using an HPLC format developed here enables rapid and comprehensive phosphoproteome analysis that can be applied to a wide range of biological systems. PMID

  3. Accurate Calculation of Solvation Free Energies in Supercritical Fluids by Fully Atomistic Simulations: Probing the Theory of Solutions in Energy Representation.

    PubMed

    Frolov, Andrey I

    2015-05-12

    Accurate calculation of solvation free energies (SFEs) is a fundamental problem of theoretical chemistry. In this work we perform a careful validation of the theory of solutions in energy representation (ER method) developed by Matubayasi et al. [J. Chem. Phys. 2000, 113, 6070-6081] for SFE calculations in supercritical solvents. This method can be seen as a bridge between molecular simulations and the classical (not quantum) density functional theory (DFT) formulated in energy representation. We performed extensive calculations of SFEs of organic molecules of different chemical natures in pure supercritical CO2 (sc-CO2) and in sc-CO2 with addition of 6 mol % of ethanol, acetone, and n-hexane as cosolvents. We show that the ER method reproduces SFE data calculated by a method free of theoretical approximations (Bennett's acceptance ratio) with a mean absolute error of only 0.05 kcal/mol, while requiring an order of magnitude less computational resources. We also show that the quality of ER calculations should be carefully monitored, since insufficient sampling can result in a considerable bias in the predictions. The present calculations reproduce the trends in the cosolvent-induced solubility enhancement factors observed in experimental data. Thus, we think that molecular simulations coupled with the ER method can be used for quick calculations of the effect of variation of temperature, pressure, and cosolvent concentration on SFE and hence on the solubility of bioactive compounds in supercritical fluids. This should dramatically reduce the burden of experimental work on optimizing the solvency of supercritical solvents. PMID:26574423
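
    The connection between the computed SFEs and the cosolvent-induced solubility enhancement mentioned above can be made explicit. The sketch below (function and variable names are ours, not from the paper) assumes the chemical potential of the pure solid solute is unaffected by the cosolvent, so the enhancement factor reduces to a Boltzmann factor of the SFE difference.

```python
import math

R = 8.314462618e-3  # gas constant, kJ/(mol K)

def solubility_enhancement(dG_pure, dG_cosolvent, T):
    """Approximate ratio of solubilities (cosolvent-modified / pure solvent)
    implied by two solvation free energies in kJ/mol, assuming the solid
    phase is unchanged by the cosolvent."""
    return math.exp(-(dG_cosolvent - dG_pure) / (R * T))

# Hypothetical numbers: an SFE that is 2 kJ/mol more favorable at 318 K
print(solubility_enhancement(dG_pure=-10.0, dG_cosolvent=-12.0, T=318.0))
# ~2.1-fold solubility enhancement
```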

  4. Leveraging Two Kinect Sensors for Accurate Full-Body Motion Capture

    PubMed Central

    Gao, Zhiquan; Yu, Yao; Zhou, Yu; Du, Sidan

    2015-01-01

    Accurate motion capture plays an important role in sports analysis, the medical field and virtual reality. Current methods for motion capture often suffer from occlusions, which limits the accuracy of their pose estimation. In this paper, we propose a complete system to measure the pose parameters of the human body accurately. Different from previous monocular depth camera systems, we leverage two Kinect sensors to acquire more information about human movements, which ensures that we can still get an accurate estimation even when significant occlusion occurs. Because human motion is temporally constant, we adopt a learning analysis to mine the temporal information across the posture variations. Using this information, we estimate human pose parameters accurately, regardless of rapid movement. Our experimental results show that our system can perform an accurate pose estimation of the human body with the constraint of information from the temporal domain. PMID:26402681

  5. Leveraging Two Kinect Sensors for Accurate Full-Body Motion Capture.

    PubMed

    Gao, Zhiquan; Yu, Yao; Zhou, Yu; Du, Sidan

    2015-01-01

    Accurate motion capture plays an important role in sports analysis, the medical field and virtual reality. Current methods for motion capture often suffer from occlusions, which limits the accuracy of their pose estimation. In this paper, we propose a complete system to measure the pose parameters of the human body accurately. Different from previous monocular depth camera systems, we leverage two Kinect sensors to acquire more information about human movements, which ensures that we can still get an accurate estimation even when significant occlusion occurs. Because human motion is temporally constant, we adopt a learning analysis to mine the temporal information across the posture variations. Using this information, we estimate human pose parameters accurately, regardless of rapid movement. Our experimental results show that our system can perform an accurate pose estimation of the human body with the constraint of information from the temporal domain. PMID:26402681

  6. Accurate basis set truncation for wavefunction embedding

    NASA Astrophysics Data System (ADS)

    Barnes, Taylor A.; Goodpaster, Jason D.; Manby, Frederick R.; Miller, Thomas F.

    2013-07-01

    Density functional theory (DFT) provides a formally exact framework for performing embedded subsystem electronic structure calculations, including DFT-in-DFT and wavefunction theory-in-DFT descriptions. In the interest of efficiency, it is desirable to truncate the atomic orbital basis set in which the subsystem calculation is performed, thus avoiding high-order scaling with respect to the size of the MO virtual space. In this study, we extend a recently introduced projection-based embedding method [F. R. Manby, M. Stella, J. D. Goodpaster, and T. F. Miller III, J. Chem. Theory Comput. 8, 2564 (2012)], 10.1021/ct300544e to allow for the systematic and accurate truncation of the embedded subsystem basis set. The approach is applied to both covalently and non-covalently bound test cases, including water clusters and polypeptide chains, and it is demonstrated that errors associated with basis set truncation are controllable to well within chemical accuracy. Furthermore, we show that this approach allows for switching between accurate projection-based embedding and DFT embedding with approximate kinetic energy (KE) functionals; in this sense, the approach provides a means of systematically improving upon the use of approximate KE functionals in DFT embedding.
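
    For orientation, the projection that keeps the embedded subsystem orthogonal to its environment is typically built from the environment density matrix and the atomic-orbital overlap matrix. The toy numpy sketch below illustrates that level-shift construction with random placeholder matrices; mu, S, D_B, and F_A_in_B are generic symbols and the example is not a real electronic-structure calculation.

```python
import numpy as np

def embedded_fock(F_A_in_B, S, D_B, mu=1.0e6):
    """Add a level-shift projector mu * S @ D_B @ S to the Fock matrix of
    subsystem A embedded in environment B, pushing B's occupied orbitals
    to very high energy so they stay out of A's variational space."""
    P_B = S @ D_B @ S
    return F_A_in_B + mu * P_B

# Placeholder symmetric matrices standing in for real AO-basis quantities
n = 4
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
S = np.eye(n) + 0.01 * (A + A.T)        # overlap-like matrix
D_B = np.diag([0.0, 0.0, 1.0, 1.0])     # mock density of B's occupied orbitals
F = 0.5 * (A + A.T)                     # mock embedded Fock matrix
print(embedded_fock(F, S, D_B).shape)   # (4, 4)
```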

  7. Accurate radiative transfer calculations for layered media.

    PubMed

    Selden, Adrian C

    2016-07-01

    Simple yet accurate results for radiative transfer in layered media with discontinuous refractive index are obtained by the method of K-integrals. These are certain weighted integrals applied to the angular intensity distribution at the refracting boundaries. The radiative intensity is expressed as the sum of the asymptotic angular intensity distribution valid in the depth of the scattering medium and a transient term valid near the boundary. Integrated boundary equations are obtained, yielding simple linear equations for the intensity coefficients, enabling the angular emission intensity and the diffuse reflectance (albedo) and transmittance of the scattering layer to be calculated without solving the radiative transfer equation directly. Examples are given of half-space, slab, interface, and double-layer calculations, and extensions to multilayer systems are indicated. The K-integral method is orders of magnitude more accurate than diffusion theory and can be applied to layered scattering media with a wide range of scattering albedos, with potential applications to biomedical and ocean optics. PMID:27409700

  8. Fast and accurate propagation of coherent light

    PubMed Central

    Lewis, R. D.; Beylkin, G.; Monzón, L.

    2013-01-01

    We describe a fast algorithm to propagate, for any user-specified accuracy, a time-harmonic electromagnetic field between two parallel planes separated by a linear, isotropic and homogeneous medium. The analytical formulation of this problem (ca 1897) requires the evaluation of the so-called Rayleigh–Sommerfeld integral. If the distance between the planes is small, this integral can be accurately evaluated in the Fourier domain; if the distance is very large, it can be accurately approximated by asymptotic methods. In the large intermediate region of practical interest, where the oscillatory Rayleigh–Sommerfeld kernel must be applied directly, current numerical methods can be highly inaccurate without indicating this fact to the user. In our approach, for any user-specified accuracy ϵ>0, we approximate the kernel by a short sum of Gaussians with complex-valued exponents, and then efficiently apply the result to the input data using the unequally spaced fast Fourier transform. The resulting algorithm has computational complexity , where we evaluate the solution on an N×N grid of output points given an M×M grid of input samples. Our algorithm maintains its accuracy throughout the computational domain. PMID:24204184
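
    The Fourier-domain evaluation that works when the planes are close is essentially the angular spectrum method. The sketch below illustrates only that baseline (it does not implement the paper's Gaussian-sum kernel or the unequally spaced FFT); grid size, spacing, and wavelength are arbitrary.

```python
import numpy as np

def angular_spectrum_propagate(u0, wavelength, dx, z):
    """Propagate a sampled complex field u0 (N x N, sample spacing dx) a
    distance z between parallel planes via the angular spectrum method."""
    n = u0.shape[0]
    k = 2 * np.pi / wavelength
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx, indexing="ij")
    kz_sq = k**2 - (2 * np.pi * FX)**2 - (2 * np.pi * FY)**2
    kz = np.sqrt(kz_sq.astype(complex))     # imaginary for evanescent waves
    H = np.exp(1j * kz * z)                 # transfer function of free space
    return np.fft.ifft2(np.fft.fft2(u0) * H)

# Example: propagate a 20 um Gaussian spot by 1 mm at 633 nm
n, dx, wl = 256, 2e-6, 633e-9
x = (np.arange(n) - n / 2) * dx
X, Y = np.meshgrid(x, x, indexing="ij")
u0 = np.exp(-(X**2 + Y**2) / (20e-6)**2)
u1 = angular_spectrum_propagate(u0, wl, dx, z=1e-3)
print(abs(u1).max())
```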

  9. How Accurately can we Calculate Thermal Systems?

    SciTech Connect

    Cullen, D; Blomquist, R N; Dean, C; Heinrichs, D; Kalugin, M A; Lee, M; Lee, Y; MacFarlan, R; Nagaya, Y; Trkov, A

    2004-04-20

    I would like to determine how accurately a variety of neutron transport code packages (code and cross section libraries) can calculate simple integral parameters, such as k_eff, for systems that are sensitive to thermal neutron scattering. Since we will only consider theoretical systems, we cannot really determine absolute accuracy compared to any real system. Therefore rather than accuracy, it would be more precise to say that I would like to determine the spread in answers that we obtain from a variety of code packages. This spread should serve as an excellent indicator of how accurately we can really model and calculate such systems today. Hopefully, eventually this will lead to improvements in both our codes and the thermal scattering models that they use in the future. In order to accomplish this I propose a number of extremely simple systems that involve thermal neutron scattering that can be easily modeled and calculated by a variety of neutron transport codes. These are theoretical systems designed to emphasize the effects of thermal scattering, since that is what we are interested in studying. I have attempted to keep these systems very simple, and yet at the same time they include most, if not all, of the important thermal scattering effects encountered in a large, water-moderated, uranium fueled thermal system, i.e., our typical thermal reactors.

  10. Accurate shear measurement with faint sources

    SciTech Connect

    Zhang, Jun; Foucaud, Sebastien; Luo, Wentao E-mail: walt@shao.ac.cn

    2015-01-01

    For cosmic shear to become an accurate cosmological probe, systematic errors in the shear measurement method must be unambiguously identified and corrected for. Previous work of this series has demonstrated that cosmic shears can be measured accurately in Fourier space in the presence of background noise and finite pixel size, without assumptions on the morphologies of galaxy and PSF. The remaining major source of error is source Poisson noise, due to the finiteness of source photon number. This problem is particularly important for faint galaxies in space-based weak lensing measurements, and for ground-based images of short exposure times. In this work, we propose a simple and rigorous way of removing the shear bias from the source Poisson noise. Our noise treatment can be generalized for images made of multiple exposures through MultiDrizzle. This is demonstrated with the SDSS and COSMOS/ACS data. With a large ensemble of mock galaxy images of unrestricted morphologies, we show that our shear measurement method can achieve sub-percent level accuracy even for images of signal-to-noise ratio less than 5 in general, making it the most promising technique for cosmic shear measurement in the ongoing and upcoming large scale galaxy surveys.

  11. Accurate pose estimation for forensic identification

    NASA Astrophysics Data System (ADS)

    Merckx, Gert; Hermans, Jeroen; Vandermeulen, Dirk

    2010-04-01

    In forensic authentication, one aims to identify the perpetrator among a series of suspects or distractors. A fundamental problem in any recognition system that aims for identification of subjects in a natural scene is the lack of constraints on viewing and imaging conditions. In forensic applications, identification proves even more challenging, since most surveillance footage is of abysmal quality. In this context, robust methods for pose estimation are paramount. In this paper we therefore present a new pose estimation strategy for very low quality footage. Our approach uses 3D-2D registration of a textured 3D face model with the surveillance image to obtain accurate far field pose alignment. Starting from an inaccurate initial estimate, the technique uses novel similarity measures based on the monogenic signal to guide a pose optimization process. We illustrate the descriptive strength of the introduced similarity measures by using them directly as a recognition metric. Through validation, using both real and synthetic surveillance footage, our pose estimation method is shown to be accurate, and robust to lighting changes and image degradation.

  12. Accurate determination of characteristic relative permeability curves

    NASA Astrophysics Data System (ADS)

    Krause, Michael H.; Benson, Sally M.

    2015-09-01

    A recently developed technique to accurately characterize sub-core scale heterogeneity is applied to investigate the factors responsible for flowrate-dependent effective relative permeability curves measured on core samples in the laboratory. The dependency of laboratory measured relative permeability on flowrate has long been both supported and challenged by a number of investigators. Studies have shown that this apparent flowrate dependency is a result of both sub-core scale heterogeneity and outlet boundary effects. However this has only been demonstrated numerically for highly simplified models of porous media. In this paper, flowrate dependency of effective relative permeability is demonstrated using two rock cores, a Berea Sandstone and a heterogeneous sandstone from the Otway Basin Pilot Project in Australia. Numerical simulations of steady-state coreflooding experiments are conducted at a number of injection rates using a single set of input characteristic relative permeability curves. Effective relative permeability is then calculated from the simulation data using standard interpretation methods for calculating relative permeability from steady-state tests. Results show that simplified approaches may be used to determine flowrate-independent characteristic relative permeability provided flow rate is sufficiently high, and the core heterogeneity is relatively low. It is also shown that characteristic relative permeability can be determined at any typical flowrate, and even for geologically complex models, when using accurate three-dimensional models.
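
    The "standard interpretation methods" for steady-state tests referred to above amount to applying Darcy's law to each phase once pressures and rates have stabilized. The sketch below shows that arithmetic for one phase at one flow rate; the symbols and numbers are generic placeholders, not values from the Berea or Otway cores.

```python
def relative_permeability(q, mu, L, k_abs, A, dP):
    """Effective relative permeability of one phase from a steady-state
    coreflood via Darcy's law: kr = q * mu * L / (k_abs * A * dP). SI units."""
    return q * mu * L / (k_abs * A * dP)

# Hypothetical brine phase: 1 mL/min through a 5 cm, 100 mD core
q = 1e-6 / 60            # m^3/s
mu = 1.0e-3              # Pa s
L = 0.05                 # m
k_abs = 100 * 9.869e-16  # 100 mD in m^2
A = 11.4e-4              # m^2 cross-section
dP = 2.0e5               # Pa pressure drop
print(relative_permeability(q, mu, L, k_abs, A, dP))
```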

  13. NEAMS Experimental Support for Code Validation, INL FY2009

    SciTech Connect

    G. Youinou; G. Palmiotti; M. Salvatore; C. Rabiti

    2009-09-01

    The goal is for all modeling and simulation tools to be demonstrated accurate and reliable through a formal Verification and Validation (V&V) process, especially where such tools are to be used to establish safety margins and support regulatory compliance, or to design a system in a manner that reduces the role of expensive mockups and prototypes. Whereas the Verification part of the process does not rely on experiment, the Validation part, on the contrary, necessitates as many relevant and precise experimental data as possible to make sure the models reproduce reality as closely as possible. Hence, this report presents a limited selection of experimental data that could be used to validate the codes devoted mainly to Fast Neutron Reactor calculations in the US. Emphasis has been put on existing data for thermal-hydraulics, fuel and reactor physics. The principles of a new “smart” experiment that could be used to improve our knowledge of neutron cross-sections are presented as well. In short, it consists in irradiating a few milligrams of actinides and analyzing the results with Accelerator Mass Spectroscopy to infer the neutron cross-sections. Finally, the wealth of experimental data relevant to Fast Neutron Reactors in the US should not be taken for granted and efforts should be put on saving these 30-40 years old data and on making sure they are validation-worthy, i.e. that the experimental conditions and uncertainties are well documented.

  14. Magnetohydrodynamic generator experimental studies

    NASA Technical Reports Server (NTRS)

    Pierson, E. S.

    1972-01-01

    The results for an experimental study of a one wavelength MHD induction generator operating on a liquid flow are presented. First the design philosophy and the experimental generator design are summarized, including a description of the flow loop and instrumentation. Next a Fourier series method of treating the fact that the magnetic flux density produced by the stator is not a pure traveling sinusoid is described and some results summarized. This approach appears to be of interest after revisions are made, but the initial results are not accurate. Finally, some of the experimental data is summarized for various methods of excitation.

  15. Accurate spectral modeling for infrared radiation

    NASA Technical Reports Server (NTRS)

    Tiwari, S. N.; Gupta, S. K.

    1977-01-01

    Direct line-by-line integration and quasi-random band model techniques are employed to calculate the spectral transmittance and total band absorptance of the 4.7 micron CO, 4.3 micron CO2, 15 micron CO2, and 5.35 micron NO bands. Results are obtained for different pressures, temperatures, and path lengths, and are compared with available theoretical and experimental investigations. For each gas, extensive tabulations of results are presented for comparative purposes. In almost all cases, line-by-line results are found to be in excellent agreement with the experimental values. The range of validity of the other models and correlations is discussed.
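
    In practice, a line-by-line calculation reduces to summing individual line absorption coefficients on a fine wavenumber grid and applying Beer's law; the band absorptance then follows by integrating (1 - transmittance) over the band. The sketch below is a minimal illustration with Lorentz line shapes and made-up line parameters; it does not reproduce the quasi-random band model or the actual CO/CO2/NO line data.

```python
import numpy as np

def lorentz(nu, nu0, S, gamma):
    """Absorption coefficient of a single line: strength S, half-width gamma."""
    return S * gamma / (np.pi * ((nu - nu0)**2 + gamma**2))

def transmittance(nu, lines, path_amount):
    """Spectral transmittance exp(-k(nu)*u) from a list of (nu0, S, gamma) lines."""
    k = sum(lorentz(nu, nu0, S, gamma) for nu0, S, gamma in lines)
    return np.exp(-k * path_amount)

nu = np.linspace(2100.0, 2200.0, 2000)               # wavenumber grid, cm^-1
lines = [(2143.3, 1.0, 0.07), (2150.9, 0.6, 0.07)]   # illustrative parameters
tau = transmittance(nu, lines, path_amount=0.5)
band_absorptance = np.trapz(1.0 - tau, nu)           # cm^-1
print(tau.min(), band_absorptance)
```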

  16. SOPROLIFE System: An Accurate Diagnostic Enhancer

    PubMed Central

    Zeitouny, Mona; Feghali, Mireille; Nasr, Assaad; Abou-Samra, Philippe; Saleh, Nadine; Bourgeois, Denis; Farge, Pierre

    2014-01-01

    Objectives. The aim of this study was to evaluate a light-emitting diode fluorescence tool, the SOPROLIFE light-induced fluorescence evaluator, and compare it to the international caries detection and assessment system-II (ICDAS-II) in the detection of occlusal caries. Methods. A total of 219 permanent posterior teeth in 21 subjects, with ages ranging from 15 to 65 years, were examined. An intraclass correlation coefficient (ICC) was computed to assess the reliability between the two diagnostic methods. Results. The results showed a high reliability between the two methods (ICC = 0.92; IC = 0.901–0.940; P < 0.001). The SOPROLIFE blue fluorescence mode had a high sensitivity (87%) and a high specificity (99%) when compared to ICDAS-II. Conclusion. Compared to the most widely used visual method for diagnosing occlusal caries lesions, the findings from this study suggest that SOPROLIFE can be used as a reproducible and reliable assessment tool. At a cut-off point separating noncarious lesions from visual changes in enamel, SOPROLIFE shows a high sensitivity and specificity. In terms of cost, ICDAS-II is preferable to SOPROLIFE; however, SOPROLIFE is easier for clinicians to use, since it relies on a simple evaluation of images. In terms of efficiency, SOPROLIFE is not superior to ICDAS-II but tends to be equivalent, with the same advantages. PMID:25401161

  17. An accurate equation of state for fluids and solids.

    PubMed

    Parsafar, G A; Spohr, H V; Patey, G N

    2009-09-01

    A simple functional form for a general equation of state based on an effective near-neighbor pair interaction of an extended Lennard-Jones (12,6,3) type is given and tested against experimental data for a wide variety of fluids and solids. Computer simulation results for ionic liquids are used for further evaluation. For fluids, there appears to be no upper density limitation on the equation of state. The lower density limit for isotherms near the critical temperature is the critical density. The equation of state gives a good description of all types of fluids, nonpolar (including long-chain hydrocarbons), polar, hydrogen-bonded, and metallic, at temperatures ranging from the triple point to the highest temperature for which there is experimental data. For solids, the equation of state is very accurate for all types considered, including covalent, molecular, metallic, and ionic systems. The experimental pvT data available for solids does not reveal any pressure or temperature limitations. An analysis of the importance and possible underlying physical significance of the terms in the equation of state is given. PMID:19678647
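
    The "(12,6,3)" label presumably denotes a pair interaction containing inverse twelfth-, sixth-, and third-power terms. A generic form of such an extended Lennard-Jones interaction, with the coefficients left as adjustable parameters (this is our reading of the label, not the fitted expressions of the paper), is:

```latex
% Generic extended Lennard-Jones (12,6,3) pair interaction;
% A, B and C are adjustable (possibly temperature-dependent) coefficients.
u(r) \;=\; \frac{A}{r^{12}} \;+\; \frac{B}{r^{6}} \;+\; \frac{C}{r^{3}}
```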

  18. Assessment of a climate model to reproduce rainfall variability and extremes over Southern Africa

    NASA Astrophysics Data System (ADS)

    Williams, C. J. R.; Kniveton, D. R.; Layberry, R.

    2010-01-01

    It is increasingly accepted that any possible climate change will not only have an influence on mean climate but may also significantly alter climatic variability. A change in the distribution and magnitude of extreme rainfall events (associated with changing variability), such as droughts or flooding, may have a far greater impact on human and natural systems than a changing mean. This issue is of particular importance for environmentally vulnerable regions such as southern Africa. The sub-continent is considered especially vulnerable to and ill-equipped (in terms of adaptation) for extreme events, due to a number of factors including extensive poverty, famine, disease and political instability. Rainfall variability and the identification of rainfall extremes is a function of scale, so high spatial and temporal resolution data are preferred to identify extreme events and accurately predict future variability. The majority of previous climate model verification studies have compared model output with observational data at monthly timescales. In this research, the assessment of ability of a state of the art climate model to simulate climate at daily timescales is carried out using satellite-derived rainfall data from the Microwave Infrared Rainfall Algorithm (MIRA). This dataset covers the period from 1993 to 2002 and the whole of southern Africa at a spatial resolution of 0.1° longitude/latitude. This paper concentrates primarily on the ability of the model to simulate the spatial and temporal patterns of present-day rainfall variability over southern Africa and is not intended to discuss possible future changes in climate as these have been documented elsewhere. Simulations of current climate from the UK Meteorological Office Hadley Centre's climate model, in both regional and global mode, are firstly compared to the MIRA dataset at daily timescales. Secondly, the ability of the model to reproduce daily rainfall extremes is assessed, again by a comparison with

  19. Accurate bulk density determination of irregularly shaped translucent and opaque aerogels

    NASA Astrophysics Data System (ADS)

    Petkov, M. P.; Jones, S. M.

    2016-05-01

    We present a volumetric method for accurate determination of the bulk density of aerogels, calculated from the extrapolated weight of the dry pure solid and volume estimates based on Archimedes' principle of volume displacement, using packed 100 μm-sized monodispersed glass spheres as a "quasi-fluid" medium. Hard-particle packing theory is invoked to demonstrate the reproducibility of the apparent density of the quasi-fluid. Accuracy rivaling that of the refractive index method is demonstrated for both translucent and opaque aerogels with different absorptive properties, as well as for aerogels with regular and irregular shapes.
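
    Once the apparent (packed) density of the glass-sphere quasi-fluid is known, the displacement arithmetic is straightforward. The sketch below is illustrative only; the function name and the numbers are hypothetical, and the extrapolation of the dry solid weight is not shown.

```python
def bulk_density(m_sample, m_spheres_with_sample, V_container, rho_spheres_packed):
    """Bulk density (g/cm^3) of an irregular sample from quasi-fluid displacement:
    the mass of packed spheres that still fits in the container with the sample
    present reveals how much of the container volume the sample occupies."""
    V_spheres = m_spheres_with_sample / rho_spheres_packed
    V_sample = V_container - V_spheres
    return m_sample / V_sample

# Hypothetical aerogel piece: 0.12 g sample in a 10.0 cm^3 cup,
# packed-sphere apparent density 1.50 g/cm^3
print(bulk_density(m_sample=0.12, m_spheres_with_sample=13.5,
                   V_container=10.0, rho_spheres_packed=1.50))   # ~0.12 g/cm^3
```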

  20. On the Possibility to Combine the Order Effect with Sequential Reproducibility for Quantum Measurements

    NASA Astrophysics Data System (ADS)

    Basieva, Irina; Khrennikov, Andrei

    2015-10-01

    In this paper we study whether quantum observables can describe a possible combination of the order effect with sequential reproducibility for quantum measurements. By the order effect we mean a dependence of probability distributions (of measurement results) on the order of measurements. We consider two types of sequential reproducibility: adjacent reproducibility (A-A) (the standard perfect repeatability) and separated reproducibility (A-B-A). The first one is reproducibility with probability 1 of a result of measurement of some observable A measured twice, one A measurement after the other. The second one, A-B-A, is reproducibility with probability 1 of a result of A measurement when another quantum observable B is measured between the two A's. Heuristically, it is clear that the second type of reproducibility is complementary to the order effect. We show that, surprisingly, this may not be the case. The order effect can coexist with separated reproducibility as well as adjacent reproducibility for both observables A and B. However, the additional constraint in the form of separated reproducibility of the B-A-B type makes this coexistence impossible. The problem under consideration was motivated by attempts to apply the quantum formalism outside of physics, especially in cognitive psychology and psychophysics. However, it is also important for the foundations of quantum physics as a part of the problem of the structure of sequential quantum measurements.

  1. The FLUKA Code: An Accurate Simulation Tool for Particle Therapy

    PubMed Central

    Battistoni, Giuseppe; Bauer, Julia; Boehlen, Till T.; Cerutti, Francesco; Chin, Mary P. W.; Dos Santos Augusto, Ricardo; Ferrari, Alfredo; Ortega, Pablo G.; Kozłowska, Wioletta; Magro, Giuseppe; Mairani, Andrea; Parodi, Katia; Sala, Paola R.; Schoofs, Philippe; Tessonnier, Thomas; Vlachoudis, Vasilis

    2016-01-01

    Monte Carlo (MC) codes are increasingly spreading in the hadrontherapy community due to their detailed description of radiation transport and interaction with matter. The suitability of a MC code for application to hadrontherapy demands accurate and reliable physical models capable of handling all components of the expected radiation field. This becomes extremely important for correctly performing not only physical but also biologically based dose calculations, especially in cases where ions heavier than protons are involved. In addition, accurate prediction of emerging secondary radiation is of utmost importance in innovative areas of research aiming at in vivo treatment verification. This contribution will address the recent developments of the FLUKA MC code and its practical applications in this field. Refinements of the FLUKA nuclear models in the therapeutic energy interval lead to an improved description of the mixed radiation field as shown in the presented benchmarks against experimental data with both 4He and 12C ion beams. Accurate description of ionization energy losses and of particle scattering and interactions lead to the excellent agreement of calculated depth–dose profiles with those measured at leading European hadron therapy centers, both with proton and ion beams. In order to support the application of FLUKA in hospital-based environments, Flair, the FLUKA graphical interface, has been enhanced with the capability of translating CT DICOM images into voxel-based computational phantoms in a fast and well-structured way. The interface is capable of importing also radiotherapy treatment data described in DICOM RT standard. In addition, the interface is equipped with an intuitive PET scanner geometry generator and automatic recording of coincidence events. Clinically, similar cases will be presented both in terms of absorbed dose and biological dose calculations describing the various available features. PMID:27242956

  2. The FLUKA Code: An Accurate Simulation Tool for Particle Therapy.

    PubMed

    Battistoni, Giuseppe; Bauer, Julia; Boehlen, Till T; Cerutti, Francesco; Chin, Mary P W; Dos Santos Augusto, Ricardo; Ferrari, Alfredo; Ortega, Pablo G; Kozłowska, Wioletta; Magro, Giuseppe; Mairani, Andrea; Parodi, Katia; Sala, Paola R; Schoofs, Philippe; Tessonnier, Thomas; Vlachoudis, Vasilis

    2016-01-01

    Monte Carlo (MC) codes are increasingly spreading in the hadrontherapy community due to their detailed description of radiation transport and interaction with matter. The suitability of a MC code for application to hadrontherapy demands accurate and reliable physical models capable of handling all components of the expected radiation field. This becomes extremely important for correctly performing not only physical but also biologically based dose calculations, especially in cases where ions heavier than protons are involved. In addition, accurate prediction of emerging secondary radiation is of utmost importance in innovative areas of research aiming at in vivo treatment verification. This contribution will address the recent developments of the FLUKA MC code and its practical applications in this field. Refinements of the FLUKA nuclear models in the therapeutic energy interval lead to an improved description of the mixed radiation field as shown in the presented benchmarks against experimental data with both (4)He and (12)C ion beams. Accurate description of ionization energy losses and of particle scattering and interactions lead to the excellent agreement of calculated depth-dose profiles with those measured at leading European hadron therapy centers, both with proton and ion beams. In order to support the application of FLUKA in hospital-based environments, Flair, the FLUKA graphical interface, has been enhanced with the capability of translating CT DICOM images into voxel-based computational phantoms in a fast and well-structured way. The interface is capable of importing also radiotherapy treatment data described in DICOM RT standard. In addition, the interface is equipped with an intuitive PET scanner geometry generator and automatic recording of coincidence events. Clinically, similar cases will be presented both in terms of absorbed dose and biological dose calculations describing the various available features. PMID:27242956

  3. Reproducibility and Accuracy of Quantitative Myocardial Blood Flow Using 82Rb-PET: Comparison with 13N-Ammonia

    PubMed Central

    Fakhri, Georges El

    2011-01-01

    =0.843) and stress (r2=0.761). The Bland-Altman plots show no significant presence of proportional error at rest or stress, nor a dependence of the variations on the amplitude of the myocardial blood flow at rest or stress. A small systematic overestimation of 13N-ammonia MBF was observed with 82Rb at rest (0.129 ml/g/min) and the opposite, i.e., underestimation, at stress (0.22 ml/g/min). Conclusions Our results show that absolute quantitation of myocardial blood flow is reproducible and accurate with 82Rb dynamic cardiac PET as compared to 13N-ammonia. The reproducibility of the quantitation approach itself was very good, as was inter-observer reproducibility. PMID:19525467
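
    For readers unfamiliar with the Bland-Altman analysis cited above, the sketch below shows how the bias and 95% limits of agreement between two paired sets of MBF estimates are computed; the arrays are placeholders, not patient data from the study.

```python
import numpy as np

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two paired measurement sets."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Placeholder rest MBF values (ml/g/min) from two tracers
mbf_rb82 = [0.95, 1.10, 0.88, 1.02, 0.97]
mbf_nh3  = [0.83, 0.98, 0.80, 0.90, 0.85]
bias, loa = bland_altman(mbf_rb82, mbf_nh3)
print(f"bias = {bias:.3f} ml/g/min, limits of agreement = {loa}")
```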

  4. SU-E-J-236: Audiovisual Biofeedback Improves Breath-Hold Lung Tumor Position Reproducibility Measured with 4D MRI

    SciTech Connect

    Lee, D; Pollock, S; Keall, P; Greer, P; Lapuz, C; Ludbrook, J; Kim, T

    2015-06-15

    Purpose: Audiovisual biofeedback breath-hold (AVBH) was employed to reproduce tumor position on inhale and exhale breath-holds for 4D tumor information. We hypothesize that lung tumor position will be more consistent using AVBH compared with conventional breath-hold (CBH). Methods: Lung tumor positions were determined for seven lung cancer patients (age: 25 – 74) during two separate 3T MRI sessions. A breath-hold training session was performed prior to the MRI sessions to allow patients to become comfortable with AVBH and their exhale and inhale target positions. CBH and AVBH 4D image datasets were obtained in the first MRI session (pre-treatment) and the second MRI session (mid-treatment) within six weeks of the first session. Audio instruction (MRI: Siemens Skyra) was used in CBH and verbal instruction (radiographer) in AVBH. A radiation oncologist contoured the lung tumor using Eclipse (Varian Medical Systems); tumor position was quantified as the centroid of the contoured tumor after rigid registration based on vertebral anatomy across the two MRI sessions. CBH and AVBH were compared in terms of reproducibility, assessed via (1) the difference between the two exhale positions and the two inhale positions across the two sessions, and (2) the difference in amplitude (exhale to inhale) between the two sessions. Results: Compared to CBH, AVBH improved the reproducibility of the two exhale (or inhale) lung tumor positions relative to each other by 33%, from 6.4±5.3 mm to 4.3±3.0 mm (p=0.005). Compared to CBH, AVBH improved the reproducibility of exhale-to-inhale amplitude by 66%, from 5.6±5.9 mm to 1.9±1.4 mm (p=0.005). Conclusions: This study demonstrated that audiovisual biofeedback can be utilized to improve the reproducibility of breath-hold lung tumor position. These results are advantageous towards achieving more accurate emerging radiation treatment planning methods, in addition to imaging and treatment modalities utilizing breath

  5. Anatomical Brain Images Alone Can Accurately Diagnose Chronic Neuropsychiatric Illnesses

    PubMed Central

    Bansal, Ravi; Staib, Lawrence H.; Laine, Andrew F.; Hao, Xuejun; Xu, Dongrong; Liu, Jun; Weissman, Myrna; Peterson, Bradley S.

    2012-01-01

    Objective Diagnoses using imaging-based measures alone offer the hope of improving the accuracy of clinical diagnosis, thereby reducing the costs associated with incorrect treatments. Previous attempts to use brain imaging for diagnosis, however, have had only limited success in diagnosing patients who are independent of the samples used to derive the diagnostic algorithms. We aimed to develop a classification algorithm that can accurately diagnose chronic, well-characterized neuropsychiatric illness in single individuals, given the availability of sufficiently precise delineations of brain regions across several neural systems in anatomical MR images of the brain. Methods We have developed an automated method to diagnose individuals as having one of various neuropsychiatric illnesses using only anatomical MRI scans. The method employs a semi-supervised learning algorithm that discovers natural groupings of brains based on the spatial patterns of variation in the morphology of the cerebral cortex and other brain regions. We used split-half and leave-one-out cross-validation analyses in large MRI datasets to assess the reproducibility and diagnostic accuracy of those groupings. Results In MRI datasets from persons with Attention-Deficit/Hyperactivity Disorder, Schizophrenia, Tourette Syndrome, Bipolar Disorder, or persons at high or low familial risk for Major Depressive Disorder, our method discriminated with high specificity and nearly perfect sensitivity the brains of persons who had one specific neuropsychiatric disorder from the brains of healthy participants and the brains of persons who had a different neuropsychiatric disorder. Conclusions Although the classification algorithm presupposes the availability of precisely delineated brain regions, our findings suggest that patterns of morphological variation across brain surfaces, extracted from MRI scans alone, can successfully diagnose the presence of chronic neuropsychiatric disorders. Extensions of these

  6. Accurate Thermal Stresses for Beams: Normal Stress

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore F.; Pilkey, Walter D.

    2003-01-01

    Formulations for a general theory of thermoelasticity to generate accurate thermal stresses for structural members of aeronautical vehicles were developed in 1954 by Boley. The formulation also provides three normal stresses and a shear stress along the entire length of the beam. The Poisson effect of the lateral and transverse normal stresses on a thermally loaded beam is taken into account in this theory by employing an Airy stress function. The Airy stress function enables the reduction of the three-dimensional thermal stress problem to a two-dimensional one. Numerical results from the general theory of thermoelasticity are compared to those obtained from strength of materials. It is concluded that the theory of thermoelasticity for prismatic beams proposed in this paper can be used instead of strength of materials when precise stress results are desired.

  7. Accurate Thermal Stresses for Beams: Normal Stress

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore F.; Pilkey, Walter D.

    2002-01-01

    Formulations for a general theory of thermoelasticity to generate accurate thermal stresses for structural members of aeronautical vehicles were developed in 1954 by Boley. The formulation also provides three normal stresses and a shear stress along the entire length of the beam. The Poisson effect of the lateral and transverse normal stresses on a thermally loaded beam is taken into account in this theory by employing an Airy stress function. The Airy stress function enables the reduction of the three-dimensional thermal stress problem to a two-dimensional one. Numerical results from the general theory of thermoelasticity are compared to those obtained from strength of materials. It is concluded that the theory of thermoelasticity for prismatic beams proposed in this paper can be used instead of strength of materials when precise stress results are desired.

  8. Highly accurate articulated coordinate measuring machine

    DOEpatents

    Bieg, Lothar F.; Jokiel, Jr., Bernhard; Ensz, Mark T.; Watson, Robert D.

    2003-12-30

    Disclosed is a highly accurate articulated coordinate measuring machine, comprising a revolute joint, comprising a circular encoder wheel, having an axis of rotation; a plurality of marks disposed around at least a portion of the circumference of the encoder wheel; bearing means for supporting the encoder wheel, while permitting free rotation of the encoder wheel about the wheel's axis of rotation; and a sensor, rigidly attached to the bearing means, for detecting the motion of at least some of the marks as the encoder wheel rotates; a probe arm, having a proximal end rigidly attached to the encoder wheel, and having a distal end with a probe tip attached thereto; and coordinate processing means, operatively connected to the sensor, for converting the output of the sensor into a set of cylindrical coordinates representing the position of the probe tip relative to a reference cylindrical coordinate system.

  9. Practical aspects of spatially high accurate methods

    NASA Technical Reports Server (NTRS)

    Godfrey, Andrew G.; Mitchell, Curtis R.; Walters, Robert W.

    1992-01-01

    The computational qualities of high order spatially accurate methods for the finite volume solution of the Euler equations are presented. Two dimensional essentially non-oscillatory (ENO), k-exact, and 'dimension by dimension' ENO reconstruction operators are discussed and compared in terms of reconstruction and solution accuracy, computational cost and oscillatory behavior in supersonic flows with shocks. Inherent steady state convergence difficulties are demonstrated for adaptive stencil algorithms. An exact solution to the heat equation is used to determine reconstruction error, and the computational intensity is reflected in operation counts. Standard MUSCL differencing is included for comparison. Numerical experiments presented include the Ringleb flow for numerical accuracy and a shock reflection problem. A vortex-shock interaction demonstrates the ability of the ENO scheme to excel in simulating unsteady high-frequency flow physics.
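
    The defining step of an ENO scheme, adaptively selecting the smoothest stencil, can be shown in a few lines. The sketch below is a minimal second-order, one-dimensional illustration and not the k-exact or dimension-by-dimension reconstruction operators compared in the paper.

```python
import numpy as np

def eno2_left_states(u):
    """Second-order ENO reconstruction of interface states u_{i+1/2}^- from 1D
    cell averages: each cell uses the one-sided difference of smaller magnitude,
    so the stencil avoids crossing a discontinuity."""
    u = np.asarray(u, float)
    left = u.copy()
    for i in range(1, len(u) - 1):
        d_minus = u[i] - u[i - 1]
        d_plus = u[i + 1] - u[i]
        slope = d_minus if abs(d_minus) < abs(d_plus) else d_plus
        left[i] = u[i] + 0.5 * slope
    return left          # first and last cells left as constant states

# A step profile: the reconstructed states remain monotone near the jump
u = np.where(np.arange(20) < 10, 1.0, 0.0)
print(eno2_left_states(u))
```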

  10. The thermodynamic cost of accurate sensory adaptation

    NASA Astrophysics Data System (ADS)

    Tu, Yuhai

    2015-03-01

    Living organisms need to obtain and process environmental information accurately in order to make decisions critical for their survival. Much progress has been made in identifying key components responsible for various biological functions; however, major challenges remain in understanding system-level behaviors from molecular-level knowledge of biology and in unraveling possible physical principles for the underlying biochemical circuits. In this talk, we will present some recent work on understanding the chemical sensory system of E. coli by combining theoretical approaches with quantitative experiments. We focus on addressing the questions of how cells process chemical information and adapt to a varying environment, and what the thermodynamic limits of key regulatory functions, such as adaptation, are.

  11. Accurate numerical solutions of conservative nonlinear oscillators

    NASA Astrophysics Data System (ADS)

    Khan, Najeeb Alam; Nasir Uddin, Khan; Nadeem Alam, Khan

    2014-12-01

    The objective of this paper is to analyze the vibration of a conservative nonlinear oscillator of the form u'' + λu + u^(2n-1) + (1 + ε²u^(4m))^(1/2) = 0 for arbitrary powers n and m. The method converts the differential equation into sets of algebraic equations that are solved numerically. Results are presented for three different cases: a higher-order Duffing equation, an equation with an irrational restoring force, and a plasma physics equation. It is also found that the method is valid for any arbitrary order of n and m. Comparisons with results found in the literature show that the method gives accurate results.
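
    As a quick numerical baseline against which such approximate solutions can be checked, the sketch below integrates the cubic Duffing case (n = 2, irrational term omitted) with scipy and monitors the conserved energy; it is not the algebraic-equation method of the paper, and the parameter values are arbitrary.

```python
import numpy as np
from scipy.integrate import solve_ivp

lam = 1.0  # linear stiffness (lambda)

def duffing(t, y):
    """Conservative cubic Duffing oscillator: u'' + lam*u + u**3 = 0."""
    u, v = y
    return [v, -(lam * u + u**3)]

def energy(u, v):
    return 0.5 * v**2 + 0.5 * lam * u**2 + 0.25 * u**4

sol = solve_ivp(duffing, (0.0, 50.0), [1.0, 0.0], rtol=1e-10, atol=1e-12)
u, v = sol.y
print("max energy drift:", np.max(np.abs(energy(u, v) - energy(1.0, 0.0))))
```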

  12. Accurate Telescope Mount Positioning with MEMS Accelerometers

    NASA Astrophysics Data System (ADS)

    Mészáros, L.; Jaskó, A.; Pál, A.; Csépány, G.

    2014-08-01

    This paper describes the advantages and challenges of applying microelectromechanical accelerometer systems (MEMS accelerometers) in order to attain precise, accurate, and stateless positioning of telescope mounts. This provides a method completely independent of other forms of electronic, optical, mechanical or magnetic feedback or real-time astrometry. Our goal is to reach the subarcminute range, which is considerably smaller than the field of view of conventional imaging telescope systems. Here we present how this subarcminute accuracy can be achieved with very cheap MEMS sensors and we also detail how our procedures can be extended in order to attain even finer measurements. In addition, our paper discusses how a complete system design can be implemented in order to be part of a telescope control system.
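
    The basic principle is that a static MEMS accelerometer senses only the gravity vector, so mount tilt follows from simple trigonometry on the three axis readings. The sketch below shows that conversion for a hypothetical reading; axis conventions, calibration offsets, and averaging against vibration are instrument-specific and omitted here.

```python
import math

def tilt_from_accel(ax, ay, az):
    """Pitch and roll (degrees) of a static sensor from a 3-axis accelerometer
    reading in units of g; only the direction of gravity matters."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Hypothetical reading with the tube elevated by ~30 degrees
print(tilt_from_accel(ax=-0.5, ay=0.0, az=0.866))   # ~(30.0, 0.0)
```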

  13. Accurate metacognition for visual sensory memory representations.

    PubMed

    Vandenbroucke, Annelinde R E; Sligte, Ilja G; Barrett, Adam B; Seth, Anil K; Fahrenfort, Johannes J; Lamme, Victor A F

    2014-04-01

    The capacity to attend to multiple objects in the visual field is limited. However, introspectively, people feel that they see the whole visual world at once. Some scholars suggest that this introspective feeling is based on short-lived sensory memory representations, whereas others argue that the feeling of seeing more than can be attended to is illusory. Here, we investigated this phenomenon by combining objective memory performance with subjective confidence ratings during a change-detection task. This allowed us to compute a measure of metacognition--the degree of knowledge that subjects have about the correctness of their decisions--for different stages of memory. We show that subjects store more objects in sensory memory than they can attend to but, at the same time, have similar metacognition for sensory memory and working memory representations. This suggests that these subjective impressions are not an illusion but accurate reflections of the richness of visual perception. PMID:24549293

  14. Apparatus for accurately measuring high temperatures

    DOEpatents

    Smith, Douglas D.

    1985-01-01

    The present invention is a thermometer used for measuring furnace temperatures in the range of about 1800° to 2700°C. The thermometer comprises a broadband multicolor thermal radiation sensor positioned to be in optical alignment with the end of a blackbody sight tube extending into the furnace. A valve-shutter arrangement is positioned between the radiation sensor and the sight tube and a chamber for containing a charge of high pressure gas is positioned between the valve-shutter arrangement and the radiation sensor. A momentary opening of the valve-shutter arrangement allows a pulse of the high pressure gas to purge the sight tube of air-borne thermal radiation contaminants, which permits the radiation sensor to accurately measure the thermal radiation emanating from the end of the sight tube.

  15. Apparatus for accurately measuring high temperatures

    DOEpatents

    Smith, D.D.

    The present invention is a thermometer used for measuring furnace temperatures in the range of about 1800° to 2700°C. The thermometer comprises a broadband multicolor thermal radiation sensor positioned to be in optical alignment with the end of a blackbody sight tube extending into the furnace. A valve-shutter arrangement is positioned between the radiation sensor and the sight tube and a chamber for containing a charge of high pressure gas is positioned between the valve-shutter arrangement and the radiation sensor. A momentary opening of the valve-shutter arrangement allows a pulse of the high pressure gas to purge the sight tube of air-borne thermal radiation contaminants, which permits the radiation sensor to accurately measure the thermal radiation emanating from the end of the sight tube.

  16. The importance of accurate atmospheric modeling

    NASA Astrophysics Data System (ADS)

    Payne, Dylan; Schroeder, John; Liang, Pang

    2014-11-01

    This paper will focus on the effect of atmospheric conditions on EO sensor performance using computer models. We have shown the importance of accurately modeling atmospheric effects for predicting the performance of an EO sensor. A simple example will demonstrate how real conditions for several sites in China can significantly impact image correction, hyperspectral imaging, and remote sensing. The current state-of-the-art model for computing atmospheric transmission and radiance is MODTRAN® 5, developed by the US Air Force Research Laboratory and Spectral Science, Inc. Research by the US Air Force, Navy and Army resulted in the public release of LOWTRAN 2 in the early 1970's. Subsequent releases of LOWTRAN and MODTRAN® have continued until the present. The paper will demonstrate the importance of using validated models and locally measured meteorological, atmospheric and aerosol conditions to accurately simulate atmospheric transmission and radiance. Frequently, default conditions are used, which can produce errors of as much as 75% in these values. This can have a significant impact on remote sensing applications.

  17. The high cost of accurate knowledge.

    PubMed

    Sutcliffe, Kathleen M; Weber, Klaus

    2003-05-01

    Many business thinkers believe it's the role of senior managers to scan the external environment to monitor contingencies and constraints, and to use that precise knowledge to modify the company's strategy and design. As these thinkers see it, managers need accurate and abundant information to carry out that role. According to that logic, it makes sense to invest heavily in systems for collecting and organizing competitive information. Another school of pundits contends that, since today's complex information often isn't precise anyway, it's not worth going overboard with such investments. In other words, it's not the accuracy and abundance of information that should matter most to top executives--rather, it's how that information is interpreted. After all, the role of senior managers isn't just to make decisions; it's to set direction and motivate others in the face of ambiguities and conflicting demands. Top executives must interpret information and communicate those interpretations--they must manage meaning more than they must manage information. So which of these competing views is the right one? Research conducted by academics Sutcliffe and Weber found that how accurate senior executives are about their competitive environments is indeed less important for strategy and corresponding organizational changes than the way in which they interpret information about their environments. Investments in shaping those interpretations, therefore, may create a more durable competitive advantage than investments in obtaining and organizing more information. And what kinds of interpretations are most closely linked with high performance? Their research suggests that high performers respond positively to opportunities, yet they aren't overconfident in their abilities to take advantage of those opportunities. PMID:12747164

  18. Accurate Weather Forecasting for Radio Astronomy

    NASA Astrophysics Data System (ADS)

    Maddalena, Ronald J.

    2010-01-01

    The NRAO Green Bank Telescope routinely observes at wavelengths from 3 mm to 1 m. As with all mm-wave telescopes, observing conditions depend upon the variable atmospheric water content. The site provides over 100 days/yr when opacities are low enough for good observing at 3 mm, but winds on the open-air structure reduce the time suitable for 3-mm observing where pointing is critical. Thus, to maximize productivity the observing wavelength needs to match weather conditions. For 6 years the telescope has used a dynamic scheduling system (recently upgraded; www.gb.nrao.edu/DSS) that requires accurate multi-day forecasts for winds and opacities. Since opacity forecasts are not provided by the National Weather Service (NWS), I have developed an automated system that takes available forecasts, derives forecasted opacities, and deploys the results on the web in user-friendly graphical overviews (www.gb.nrao.edu/~rmaddale/Weather). The system relies on the "North American Mesoscale" models, which are updated by the NWS every 6 hrs, have a 12 km horizontal resolution, 1 hr temporal resolution, run to 84 hrs, and have 60 vertical layers that extend to 20 km. Each forecast consists of a time series of ground conditions, cloud coverage, etc., and, most importantly, temperature, pressure, and humidity as a function of height. I use Liebe's MWP model (Radio Science, 20, 1069, 1985) to determine the absorption in each layer for each hour for 30 observing wavelengths. Radiative transfer provides, for each hour and wavelength, the total opacity and the radio brightness of the atmosphere, which contributes substantially at some wavelengths to Tsys and the observational noise. Comparisons of measured and forecasted Tsys at 22.2 and 44 GHz imply that the forecasted opacities are good to about 0.01 nepers, which is sufficient for forecasting and accurate calibration. Reliability is high out to 2 days and degrades slowly for longer-range forecasts.
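
    The radiative-transfer step that turns a forecasted opacity into an atmospheric contribution to Tsys can be summarized with the usual single-slab approximation; the sketch below is illustrative only, whereas the actual system integrates the absorption layer by layer from the forecast profiles.

```python
import math

def sky_brightness(tau_zenith, elevation_deg, T_atm=270.0, T_cmb=2.73):
    """Atmospheric brightness temperature (K) toward a given elevation for a
    zenith opacity in nepers, single-slab plane-parallel approximation."""
    airmass = 1.0 / math.sin(math.radians(elevation_deg))
    tau = tau_zenith * airmass
    return T_atm * (1.0 - math.exp(-tau)) + T_cmb * math.exp(-tau)

# A zenith opacity of 0.08 nepers observed at 40 degrees elevation
print(sky_brightness(tau_zenith=0.08, elevation_deg=40.0))   # ~34 K added to Tsys
```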

  19. Strategy for accurate liver intervention by an optical tracking system

    PubMed Central

    Lin, Qinyong; Yang, Rongqian; Cai, Ken; Guan, Peifeng; Xiao, Weihu; Wu, Xiaoming

    2015-01-01

    Image-guided navigation for radiofrequency ablation of liver tumors requires the accurate guidance of needle insertion into a tumor target. The main challenge of image-guided navigation for radiofrequency ablation of liver tumors is the occurrence of liver deformations caused by respiratory motion. This study reports a strategy of real-time automatic registration that tracks custom fiducial markers glued onto the surface of a patient’s abdomen to find the respiratory phase in which the static preoperative CT was performed. Custom fiducial markers are designed. The real-time automatic registration method consists of the automatic localization of the custom fiducial markers in the patient and image spaces. The fiducial registration error is calculated in real time and indicates whether the current respiratory phase corresponds to the phase of the static preoperative CT. To demonstrate the feasibility of the proposed strategy, a liver simulator was constructed and two volunteers were involved in the preliminary experiments. An ex-vivo porcine liver model was employed to further verify the strategy for liver intervention. Experimental results demonstrate that the real-time automatic registration method is rapid, accurate, and feasible for capturing the respiratory phase from which the static preoperative CT anatomical model is generated by tracking the movement of the skin-adhered custom fiducial markers. PMID:26417501

  20. An accurate model potential for alkali neon systems.

    PubMed

    Zanuttini, D; Jacquet, E; Giglio, E; Douady, J; Gervais, B

    2009-12-01

    We present a detailed investigation of the ground and lowest excited states of M-Ne dimers, for M=Li, Na, and K. We show that the potential energy curves of these Van der Waals dimers can be obtained accurately by considering the alkali neon systems as one-electron systems. Following previous authors, the model describes the evolution of the alkali valence electron in the combined potentials of the alkali and neon cores by means of core polarization pseudopotentials. The key parameter for an accurate model is the M(+)-Ne potential energy curve, which was obtained by means of ab initio CCSD(T) calculation using a large basis set. For each MNe dimer, a systematic comparison with ab initio computation of the potential energy curve for the X, A, and B states shows the remarkable accuracy of the model. The vibrational analysis and the comparison with existing experimental data strengthens this conclusion and allows for a precise assignment of the vibrational levels. PMID:19968334

  1. Accurate Evaluation Method of Molecular Binding Affinity from Fluctuation Frequency

    NASA Astrophysics Data System (ADS)

    Hoshino, Tyuji; Iwamoto, Koji; Ode, Hirotaka; Ohdomari, Iwao

    2008-05-01

    Exact estimation of molecular binding affinity is significantly important for drug discovery. The energy calculation is a direct method to compute the strength of the interaction between two molecules. This energetic approach is, however, not accurate enough to evaluate a slight difference in binding affinity when distinguishing a prospective substance from dozens of candidates for medicine. Hence, more accurate computational estimation of drug efficacy is currently in demand. Previously we proposed a concept for estimating molecular binding affinity that focuses on the fluctuation at the interface between two molecules. The aim of this paper is to demonstrate the compatibility between the proposed computational technique and experimental measurements, through several examples of computer simulations: an association of human immunodeficiency virus type-1 (HIV-1) protease and its inhibitor (an example of drug-enzyme binding), a complexation of an antigen and its antibody (an example of protein-protein binding), and a combination of an estrogen receptor and its ligand chemicals (an example of ligand-receptor binding). The proposed affinity estimation has proven to be a promising technique in the advanced stage of the discovery and the design of drugs.

  2. Isomerism of Cyanomethanimine: Accurate Structural, Energetic, and Spectroscopic Characterization.

    PubMed

    Puzzarini, Cristina

    2015-11-25

    The structures, relative stabilities, and rotational and vibrational parameters of the Z-C-, E-C-, and N-cyanomethanimine isomers have been evaluated using state-of-the-art quantum-chemical approaches. Equilibrium geometries have been calculated by means of a composite scheme based on coupled-cluster calculations that accounts for the extrapolation to the complete basis set limit and core-correlation effects. The latter approach is proved to provide molecular structures with an accuracy of 0.001-0.002 Å and 0.05-0.1° for bond lengths and angles, respectively. Systematically extrapolated ab initio energies, accounting for electron correlation through coupled-cluster theory, including up to single, double, triple, and quadruple excitations, and corrected for core-electron correlation and anharmonic zero-point vibrational energy, have been used to accurately determine relative energies and the Z-E isomerization barrier with an accuracy of about 1 kJ/mol. Vibrational and rotational spectroscopic parameters have been investigated by means of hybrid schemes that allow us to obtain rotational constants accurate to about a few megahertz and vibrational frequencies with a mean absolute error of ∼1%. Where available, for all properties considered, a very good agreement with experimental data has been observed. PMID:26529434

  3. Accurate oscillator strengths for interstellar ultraviolet lines of Cl I

    NASA Technical Reports Server (NTRS)

    Schectman, R. M.; Federman, S. R.; Beideck, D. J.; Ellis, D. J.

    1993-01-01

    Analyses of the abundance of interstellar chlorine rely on accurate oscillator strengths for ultraviolet transitions. Beam-foil spectroscopy was used to obtain f-values for the astrophysically important lines of Cl I at 1088, 1097, and 1347 A. In addition, the line at 1363 A was studied. Our f-values for the 1088 and 1097 A lines represent the first laboratory measurements for these lines; the values are f(1088) = 0.081 +/- 0.007 (1 sigma) and f(1097) = 0.0088 +/- 0.0013 (1 sigma). These results resolve the issue regarding the relative strengths of the 1088 and 1097 A lines in favor of those suggested by astronomical measurements. For the other lines, our results of f(1347) = 0.153 +/- 0.011 (1 sigma) and f(1363) = 0.055 +/- 0.004 (1 sigma) are the most precisely measured values available. The f-values are somewhat greater than previous experimental and theoretical determinations.
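    Beam-foil spectroscopy yields level lifetimes rather than f-values directly; the absorption oscillator strength then follows from the standard relations (SI units)

      \tau_{k} = \Big(\sum_{i} A_{ki}\Big)^{-1},
      \qquad
      A_{ki} = \frac{2\pi e^{2}}{\varepsilon_{0} m_{e} c\,\lambda^{2}}\,\frac{g_{i}}{g_{k}}\,f_{ik},

    where g_{i} and g_{k} are the statistical weights of the lower and upper levels and \lambda is the transition wavelength.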

  4. Efficient and Accurate Indoor Localization Using Landmark Graphs

    NASA Astrophysics Data System (ADS)

    Gu, F.; Kealy, A.; Khoshelham, K.; Shang, J.

    2016-06-01

    Indoor localization is important for a variety of applications such as location-based services, mobile social networks, and emergency response. Fusing spatial information is an effective way to achieve accurate indoor localization with little or no need for extra hardware. However, existing indoor localization methods that make use of spatial information are either too computationally expensive or too sensitive to the completeness of landmark detection. In this paper, we address this problem with the proposed landmark graph: a directed graph whose nodes are landmarks (e.g., doors, staircases, and turns) and whose edges are accessible paths with heading information. We compared the proposed method with two common Dead Reckoning (DR)-based methods (namely, Compass + Accelerometer + Landmarks and Gyroscope + Accelerometer + Landmarks) in a series of experiments. Experimental results show that the proposed method achieves 73% accuracy with a positioning error of less than 2.5 meters, outperforming the other two DR-based methods.
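    A minimal sketch of such a landmark graph, with hypothetical node names and a heading-matching query (the authors' implementation details are not reproduced here), might look as follows:

      from dataclasses import dataclass, field

      @dataclass
      class Edge:
          target: str         # landmark this path leads to
          heading_deg: float  # walking heading along the path
          length_m: float     # approximate path length

      @dataclass
      class LandmarkGraph:
          """Directed graph: nodes are landmarks, edges are accessible paths with headings."""
          edges: dict = field(default_factory=dict)  # node -> list[Edge]

          def add_edge(self, src, dst, heading_deg, length_m):
              self.edges.setdefault(src, []).append(Edge(dst, heading_deg, length_m))

          def candidates(self, node, observed_heading_deg, tol_deg=30.0):
              """Outgoing edges whose heading matches the observed walking heading."""
              def ang_diff(a, b):
                  d = abs(a - b) % 360.0
                  return min(d, 360.0 - d)
              return [e for e in self.edges.get(node, [])
                      if ang_diff(e.heading_deg, observed_heading_deg) <= tol_deg]

      g = LandmarkGraph()
      g.add_edge("door_A", "stairs_1", heading_deg=90.0, length_m=12.0)
      g.add_edge("door_A", "turn_3", heading_deg=180.0, length_m=7.5)
      print(g.candidates("door_A", observed_heading_deg=85.0))  # -> the edge toward stairs_1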

  5. Accurate object tracking system by integrating texture and depth cues

    NASA Astrophysics Data System (ADS)

    Chen, Ju-Chin; Lin, Yu-Hang

    2016-03-01

    A robust object tracking system that is invariant to object appearance variations and background clutter is proposed. Multiple instance learning with a boosting algorithm is applied to select discriminant texture information between the object and background data. Additionally, depth information, which is important for distinguishing the object from a complicated background, is integrated. We propose two depth-based models that can complement the texture information to cope with both appearance variations and background clutter. Moreover, in order to reduce the risk of drift, which increases for textureless depth templates, an update mechanism is proposed that selects more precise tracking results and avoids incorrect model updates. In the experiments, the robustness of the proposed system is evaluated and quantitative results are provided for performance analysis. Experimental results show that the proposed system provides the best success rate and more accurate tracking results than other well-known algorithms.
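    As a hypothetical illustration of the fusion idea (not the authors' actual scoring rule), texture-classifier confidences and depth-model similarities for candidate windows could be combined by a simple weighted sum, with the tracked window taken as the argmax:

      import numpy as np

      def fuse_scores(texture_conf, depth_sim, alpha=0.6):
          """Weighted fusion of per-candidate texture confidence and depth similarity."""
          texture_conf = np.asarray(texture_conf, dtype=float)
          depth_sim = np.asarray(depth_sim, dtype=float)
          score = alpha * texture_conf + (1.0 - alpha) * depth_sim
          return int(np.argmax(score)), score

      best, score = fuse_scores([0.4, 0.9, 0.7], [0.8, 0.5, 0.9])
      print(best, score)  # index of the best candidate window and the fused scores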

  6. Raising the bar for reproducible science at the U.S. Environmental Protection Agency Office of Research and Development.

    PubMed

    George, Barbara Jane; Sobus, Jon R; Phelps, Lara P; Rashleigh, Brenda; Simmons, Jane Ellen; Hines, Ronald N

    2015-05-01

    Considerable concern has been raised regarding research reproducibility both within and outside the scientific community. Several factors possibly contribute to a lack of reproducibility, including a failure to adequately employ statistical considerations during study design, bias in sample selection or subject recruitment, errors in developing data inclusion/exclusion criteria, and flawed statistical analysis. To address some of these issues, several publishers have developed checklists that authors must complete. Others have either enhanced statistical expertise on existing editorial boards, or formed distinct statistics editorial boards. Although the U.S. Environmental Protection Agency, Office of Research and Development, already has a strong Quality Assurance Program, an initiative was undertaken to further strengthen statistics consideration and other factors in study design and also to ensure these same factors are evaluated during the review and approval of study protocols. To raise awareness of the importance of statistical issues and provide a forum for robust discussion, a Community of Practice for Statistics was formed in January 2014. In addition, three working groups were established to develop a series of questions or criteria that should be considered when designing or reviewing experimental, observational, or modeling focused research. This article describes the process used to develop these study design guidance documents, their contents, how they are being employed by the Agency's research enterprise, and expected benefits to Agency science. The process and guidance documents presented here may be of utility for any research enterprise interested in enhancing the reproducibility of its science. PMID:25795653

  7. Raising the Bar for Reproducible Science at the U.S. Environmental Protection Agency Office of Research and Development

    PubMed Central

    George, Barbara Jane; Sobus, Jon R.; Phelps, Lara P.; Rashleigh, Brenda; Simmons, Jane Ellen; Hines, Ronald N.

    2015-01-01

    Considerable concern has been raised regarding research reproducibility both within and outside the scientific community. Several factors possibly contribute to a lack of reproducibility, including a failure to adequately employ statistical considerations during study design, bias in sample selection or subject recruitment, errors in developing data inclusion/exclusion criteria, and flawed statistical analysis. To address some of these issues, several publishers have developed checklists that authors must complete. Others have either enhanced statistical expertise on existing editorial boards, or formed distinct statistics editorial boards. Although the U.S. Environmental Protection Agency, Office of Research and Development, already has a strong Quality Assurance Program, an initiative was undertaken to further strengthen statistics consideration and other factors in study design and also to ensure these same factors are evaluated during the review and approval of study protocols. To raise awareness of the importance of statistical issues and provide a forum for robust discussion, a Community of Practice for Statistics was formed in January 2014. In addition, three working groups were established to develop a series of questions or criteria that should be considered when designing or reviewing experimental, observational, or modeling focused research. This article describes the process used to develop these study design guidance documents, their contents, how they are being employed by the Agency’s research enterprise, and expected benefits to Agency science. The process and guidance documents presented here may be of utility for any research enterprise interested in enhancing the reproducibility of its science. PMID:25795653

  8. An isotopic-independent highly accurate potential energy surface for CO2 isotopologues and an initial (12)C(16)O2 infrared line list.

    PubMed

    Huang, Xinchuan; Schwenke, David W; Tashkun, Sergey A; Lee, Timothy J

    2012-03-28

    An isotopic-independent, highly accurate potential energy surface (PES) has been determined for CO(2) by refining a purely ab initio PES with selected, purely experimentally determined rovibrational energy levels. The purely ab initio PES is denoted Ames-0, while the refined PES is denoted Ames-1. Detailed tests are performed to demonstrate the spectroscopic accuracy of the Ames-1 PES. It is shown that Ames-1 yields σ(rms) (root-mean-square error) = 0.0156 cm(-1) for 6873 J = 0-117 (12)C(16)O(2) experimental energy levels, even though less than 500 (12)C(16)O(2) energy levels were included in the refinement procedure. It is also demonstrated that, without any additional refinement, Ames-1 yields very good agreement for isotopologues. Specifically, for the (12)C(16)O(2) and (13)C(16)O(2) isotopologues, spectroscopic constants G(v) computed from Ames-1 are within ±0.01 and 0.02 cm(-1) of reliable experimentally derived values, while for the (16)O(12)C(18)O, (16)O(12)C(17)O, (16)O(13)C(18)O, (16)O(13)C(17)O, (12)C(18)O(2), (17)O(12)C(18)O, (12)C(17)O(2), (13)C(18)O(2), (13)C(17)O(2), (17)O(13)C(18)O, and (14)C(16)O(2) isotopologues, the differences are between ±0.10 and 0.15 cm(-1). To our knowledge, this is the first time a polyatomic PES has been refined using such high J values, and this has led to new challenges in the refinement procedure. An initial high-quality, purely ab initio dipole moment surface (DMS) is constructed and used to generate a 296 K line list. For most bands, experimental IR intensities are well reproduced for (12)C(16)O(2) using Ames-1 and the DMS. For more than 80% of the bands, the experimental intensities are reproduced with σ(rms)(ΔI) < 20% or σ(rms)(ΔI∕δ(obs)) < 5. A few exceptions are analyzed and discussed. Directions for future improvements are discussed, though it is concluded that the current Ames-1 and the DMS should be useful in analyzing and assigning high-resolution laboratory or astronomical spectra. PMID:22462861
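    A minimal sketch of the kind of statistic quoted above, computed for a computed-versus-experimental level list (with hypothetical input values, not actual Ames-1 or experimental data):

      import numpy as np

      def sigma_rms(computed_cm1, experimental_cm1):
          """Root-mean-square deviation (in cm^-1) between computed and experimental levels."""
          d = np.asarray(computed_cm1, dtype=float) - np.asarray(experimental_cm1, dtype=float)
          return float(np.sqrt(np.mean(d ** 2)))

      print(sigma_rms([1388.19, 2349.15, 667.40], [1388.18, 2349.14, 667.38]))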

  9. Development of an XYZ Digital Camera with Embedded Color Calibration System for Accurate Color Acquisition

    NASA Astrophysics Data System (ADS)

    Kretkowski, Maciej; Jablonski, Ryszard; Shimodaira, Yoshifumi

    Acquisition of accurate colors is important in the modern era of widespread exchange of electronic multimedia. The variety of device-dependent color spaces causes trouble with accurate color reproduction. In this paper we present an overview of our completed digital camera system with device-independent output formed from tristimulus XYZ values. The outstanding accuracy and fidelity of the acquired color are achieved in our system by employing an embedded color calibration system based on an emissive device that generates reference calibration colors with user-defined spectral distribution and chromaticity coordinates. The system was tested by calibrating the camera using 24 reference colors spectrally reproduced from the 24 color patches of the Macbeth Chart. The average color difference (CIEDE2000) was found to be ΔE = 0.83, which is an outstanding result compared to commercially available digital cameras.
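    As a simplified stand-in for the CIEDE2000 metric used in the paper, the comparison principle can be illustrated with an XYZ-to-CIELAB conversion followed by the simpler CIE76 color difference (D65 white point assumed; input values are hypothetical):

      import math

      def xyz_to_lab(X, Y, Z, white=(95.047, 100.0, 108.883)):  # D65 reference white
          def f(t):
              d = 6.0 / 29.0
              return t ** (1.0 / 3.0) if t > d ** 3 else t / (3.0 * d * d) + 4.0 / 29.0
          xr, yr, zr = X / white[0], Y / white[1], Z / white[2]
          return (116.0 * f(yr) - 16.0,        # L*
                  500.0 * (f(xr) - f(yr)),     # a*
                  200.0 * (f(yr) - f(zr)))     # b*

      def delta_e76(xyz1, xyz2):
          """CIE76 color difference between two XYZ triplets (0-100 scale)."""
          L1, a1, b1 = xyz_to_lab(*xyz1)
          L2, a2, b2 = xyz_to_lab(*xyz2)
          return math.sqrt((L1 - L2) ** 2 + (a1 - a2) ** 2 + (b1 - b2) ** 2)

      print(delta_e76((41.24, 21.26, 1.93), (40.00, 21.00, 2.00)))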

  10. Accurate Nanoscale Crystallography in Real-Space Using Scanning Transmission Electron Microscopy.

    PubMed

    Dycus, J Houston; Harris, Joshua S; Sang, Xiahan; Fancher, Chris M; Findlay, Scott D; Oni, Adedapo A; Chan, Tsung-Ta E; Koch, Carl C; Jones, Jacob L; Allen, Leslie J; Irving, Douglas L; LeBeau, James M

    2015-08-01

    Here, we report reproducible and accurate measurement of crystallographic parameters using scanning transmission electron microscopy. This is made possible by removing drift and residual scan distortion. We demonstrate real-space lattice parameter measurements with <0.1% error for complex-layered chalcogenides Bi2Te3, Bi2Se3, and a Bi2Te2.7Se0.3 nanostructured alloy. Pairing the technique with atomic resolution spectroscopy, we connect local structure with chemistry and bonding. Combining these results with density functional theory, we show that the incorporation of Se into Bi2Te3 causes charge redistribution that anomalously increases the van der Waals gap between building blocks of the layered structure. The results show that atomic resolution imaging with electrons can accurately and robustly quantify crystallography at the nanoscale. PMID:26169835

  11. Accurate definition of brain regions position through the functional landmark approach.

    PubMed

    Thirion, Bertrand; Varoquaux, Gaël; Poline, Jean-Baptiste

    2010-01-01

    In many applications of functional Magnetic Resonance Imaging (fMRI), including clinical and pharmacological studies, defining the location of functional activity across subjects is crucial. While current acquisition and normalization procedures improve the accuracy of functional signal localization, it is also important to ensure that the detection of functional foci yields accurate results and reflects between-subject variability. Here we introduce a fast functional landmark detection procedure that explicitly models the spatial variability of activation foci in the observed population. We compare this detection approach to standard statistical-map peak extraction procedures: we show that it yields more accurate results on simulations and more reproducible results on a large cohort of subjects. These results demonstrate that explicit functional landmark modeling approaches are more effective than standard statistical mapping for detecting brain functional foci. PMID:20879321

  12. Comparison of experimental and Dirac-Fock calculated high-multipole-order internal conversion coefficients

    NASA Astrophysics Data System (ADS)

    Németh, Zsolt

    1992-02-01

    A large set of accurately measured E3, M3, E4 and M4 internal conversion coefficients (ICCs) has been compared with various theoretical values. ICCs calculated using Dirac-Fock wave functions are found to be in best agreement with the experimental values, although a dependence of their discrepancies on transition energy, multipolarity and parity, as well as on nuclear charge and shell, has been revealed. The ICCs of Rösel et al., after the adjustment of Németh and Veres, proved to be the most successful in reproducing the experimental values. The adjusted ICCs of Rösel et al. are recommended, and revision of the ICCs and γ-emission probabilities of isomeric transitions in evaluated data compilations such as Nuclear Data Sheets is suggested. A selection from the contradicting K and total ICCs of the 661.66 keV transition of 137Ba is proposed.

  13. Approaching system equilibrium with accurate or not accurate feedback information in a two-route system

    NASA Astrophysics Data System (ADS)

    Zhao, Xiao-mei; Xie, Dong-fan; Li, Qi

    2015-02-01

    With the development of intelligent transport systems, advanced information feedback strategies have been developed to reduce traffic congestion and enhance capacity. However, previous strategies provide accurate information to travelers, and our simulation results show that accurate information brings negative effects, especially in the delayed case. Because travelers prefer the route reported to be in the best condition when given accurate information, and delayed information reflects past rather than current traffic conditions, travelers make wrong routing decisions, causing capacity to decrease, oscillations to increase, and the system to deviate from equilibrium. To avoid this negative effect, bounded rationality is taken into account by introducing a boundedly rational threshold BR. When the difference between the two routes is less than BR, the routes are chosen with equal probability. Bounded rationality helps improve efficiency in terms of capacity, oscillation, and the gap from system equilibrium.
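    The boundedly rational route-choice rule described above reduces to a few lines (a sketch with hypothetical travel-cost feedback values):

      import random

      def choose_route(feedback_a, feedback_b, br_threshold):
          """If the reported difference between the two routes is within BR, choose either
          route with probability 0.5; otherwise take the route reported as better (lower cost)."""
          if abs(feedback_a - feedback_b) < br_threshold:
              return random.choice(("A", "B"))
          return "A" if feedback_a < feedback_b else "B"

      print([choose_route(10.0, 10.5, br_threshold=1.0) for _ in range(5)])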

  14. [Myasthenia gravis - optimal treatment and accurate diagnosis].

    PubMed

    Gilhus, Nils Erik; Kerty, Emilia; Løseth, Sissel; Mygland, Åse; Tallaksen, Chantal

    2016-07-01

    Around 700 people in Norway have myasthenia gravis, an autoimmune disease that affects neuromuscular transmission and results in fluctuating weakness in some muscles as its sole symptom. The diagnosis is based on typical symptoms and findings, detection of antibodies and neurophysiological examination. Symptomatic treatment with acetylcholinesterase inhibitors is generally effective, but most patients also require immunosuppressive drug treatment. Antigen-specific therapy is being tested in experimental disease models. PMID:27381787

  15. Higher order accurate partial implicitization: An unconditionally stable fourth-order-accurate explicit numerical technique

    NASA Technical Reports Server (NTRS)

    Graves, R. A., Jr.

    1975-01-01

    The previously obtained second-order-accurate partial implicitization numerical technique used in the solution of fluid dynamic problems was modified with little complication to achieve fourth-order accuracy. The Von Neumann stability analysis demonstrated the unconditional linear stability of the technique. The order of the truncation error was deduced from the Taylor series expansions of the linearized difference equations and was verified by numerical solutions to Burgers' equation. For comparison, results were also obtained for Burgers' equation using a second-order-accurate partial-implicitization scheme, as well as the fourth-order scheme of Kreiss.
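    For reference, the test problem is Burgers' equation, here in its common viscous form

      \frac{\partial u}{\partial t} + u\,\frac{\partial u}{\partial x} = \nu\,\frac{\partial^{2} u}{\partial x^{2}},

    whose combination of nonlinear advection and diffusion makes it a standard benchmark for verifying the formal order of accuracy of difference schemes.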

  16. A New Cecal Slurry Preparation Protocol with Improved Long-Term Reproducibility for Animal Models of Sepsis

    PubMed Central

    Starr, Marlene E.; Steele, Allison M.; Saito, Mizuki; Hacker, Bill J.; Evers, B. Mark; Saito, Hiroshi

    2014-01-01

    Sepsis, a life-threatening systemic inflammatory response syndrome induced by infection, is widely studied using laboratory animal models. While cecal ligation and puncture (CLP) is considered the gold-standard model for sepsis research, it may not be preferable for experiments comparing animals of different size or under different dietary regimens. By comparing cecum size, shape, and cecal content characteristics in mice under different experimental conditions (aging, diabetes, pancreatitis), we show that cecum variability could be problematic for some CLP experiments. The cecal slurry (CS) injection model, in which the cecal contents of a laboratory animal are injected intraperitoneally into other animals, is an alternative method for inducing polymicrobial sepsis; however, the CS must be freshly prepared under conventional protocols, which is a major disadvantage with respect to reproducibility and convenience. The objective of this study was to develop an improved CS preparation protocol that allows for long-term storage of CS with reproducible results. Using our new CS preparation protocol we found that bacterial viability is maintained for at least 6 months when the CS is prepared in 15% glycerol-PBS and stored at -80°C. To test the sepsis-inducing efficacy of stored CS stocks, various amounts of CS were injected into young (4–6 months old), middle-aged (12–14 months old), and aged (24–26 months old) male C57BL/6 mice. Dose- and age-dependent mortality was observed with high reproducibility. Circulating bacteria levels strongly correlated with mortality, suggesting an infection-mediated death. Further, injection with heat-inactivated CS resulted in acute hypothermia without mortality, indicating that CS-mediated death is not due to endotoxic shock. This new CS preparation protocol yields CS stocks that withstand frozen storage without loss of bacterial viability, allowing experiments to be performed more conveniently and with higher reproducibility.

  17. Rainfall variability and extremes over southern Africa: assessment of a climate model to reproduce daily extremes

    NASA Astrophysics Data System (ADS)

    Williams, C.; Kniveton, D.; Layberry, R.

    2009-04-01

    It is increasingly accepted that any possible climate change will not only have an influence on mean climate but may also significantly alter climatic variability. A change in the distribution and magnitude of extreme rainfall events (associated with changing variability), such as droughts or flooding, may have a far greater impact on human and natural systems than a changing mean. This issue is of particular importance for environmentally vulnerable regions such as southern Africa. The subcontinent is considered especially vulnerable to, and ill-equipped (in terms of adaptation) for, extreme events, due to a number of factors including extensive poverty, famine, disease and political instability. Rainfall variability and the identification of rainfall extremes are a function of scale, so high spatial and temporal resolution data are preferred to identify extreme events and accurately predict future variability. The majority of previous climate model verification studies have compared model output with observational data at monthly timescales. In this research, the ability of a state-of-the-art climate model to simulate climate at daily timescales is assessed using satellite-derived rainfall data from the Microwave Infra-Red Algorithm (MIRA). This dataset covers the period 1993-2002 and the whole of southern Africa at a spatial resolution of 0.1 degree longitude/latitude. The ability of a climate model to simulate current climate provides some indication of how much confidence can be placed in its future predictions. In this paper, simulations of current climate from the UK Meteorological Office Hadley Centre's climate model, in both regional and global mode, are firstly compared to the MIRA dataset at daily timescales. This concentrates primarily on the ability of the model to simulate the spatial and temporal patterns of rainfall variability over southern Africa. Secondly, the ability of the model to reproduce daily rainfall extremes will also be assessed.

  18. Repeatability and reproducibility of intracellular molar concentration assessed by synchrotron-based x-ray fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Merolle, L.; Malucelli, E.; Fratini, M.; Gianoncelli, A.; Notargiacomo, A.; Cappadone, C.; Farruggia, G.; Sargenti, A.; Procopio, A.; Lombardo, M.; Lagomarsino, S.; Iotti, S.

    2016-01-01

    Elemental analysis of biological samples can give information about the content and distribution of elements essential for human life, or of trace elements whose absence causes abnormal biological function or development. However, biological systems contain an ensemble of cells with heterogeneous chemistry and elemental content; therefore, accurate characterization of samples with high cellular heterogeneity may only be achieved by analyzing single cells. Powerful methods in molecular biology are abundant; among them, X-ray microscopy based on synchrotron light sources has been gaining increasing attention thanks to its extreme sensitivity. However, the reproducibility and repeatability of these measurements are among the major obstacles to achieving statistical significance in single-cell population analysis. In this study, we compared the elemental content of human colon adenocarcinoma cells obtained through three distinct accesses to synchrotron radiation light.

  19. AN ACCURATE ORBITAL INTEGRATOR FOR THE RESTRICTED THREE-BODY PROBLEM AS A SPECIAL CASE OF THE DISCRETE-TIME GENERAL THREE-BODY PROBLEM

    SciTech Connect

    Minesaki, Yukitaka

    2013-08-01

    For the restricted three-body problem, we propose an accurate orbital integration scheme that retains all conserved quantities of the two-body problem with two primaries and approximately preserves the Jacobi integral. The scheme is obtained by taking the limit as mass approaches zero in the discrete-time general three-body problem. For a long time interval, the proposed scheme precisely reproduces various periodic orbits that cannot be accurately computed by other generic integrators.
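    In the standard rotating, nondimensional frame of the circular restricted problem, the Jacobi integral that the scheme approximately preserves takes the familiar form

      C_{J} = x^{2} + y^{2} + \frac{2(1-\mu)}{r_{1}} + \frac{2\mu}{r_{2}} - \left(\dot{x}^{2} + \dot{y}^{2} + \dot{z}^{2}\right),

    where \mu is the mass ratio of the primaries and r_{1}, r_{2} are the distances from the massless body to each primary.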

  20. A System to Simulate and Reproduce Audio-Visual Environments for Spatial Hearing Research

    PubMed Central

    Seeber, Bernhard U.; Kerber, Stefan; Hafter, Ervin R.

    2009-01-01

    The article reports the experience gained from two implementations of the “Simulated Open-Field Environment” (SOFE), a setup that allows sounds to be played at calibrated levels over a wide frequency range from multiple loudspeakers in an anechoic chamber. Playing sounds from loudspeakers in the free-field has the advantage that each participant listens with their own ears, and individual characteristics of the ears are captured in the sound they hear. This makes an easy and accurate comparison between various listeners with and without hearing devices possible. The SOFE uses custom calibration software to assure individual equalization of each loudspeaker. Room simulation software creates the spatio-temporal reflection pattern of sound sources in rooms which is played via the SOFE loudspeakers. The sound playback system is complemented by a video projection facility which can be used to collect or give feedback or to study auditory-visual interaction. The article discusses acoustical and technical requirements for accurate sound playback against the specific needs in hearing research. An introduction to software concepts is given which allow easy, high-level control of the setup and thus fast experimental development, turning the SOFE into a “Swiss army knife” tool for auditory, spatial hearing and audio-visual research. PMID:19909802