Science.gov

Sample records for acceptable analytical reproducibility

  1. Interlaboratory Reproducibility of Selective Reaction Monitoring Assays Using Multiple Upfront Analyte Enrichment Strategies

    PubMed Central

    Prakash, Amol; Rezai, Taha; Krastins, Bryan; Sarracino, David; Athanas, Michael; Russo, Paul; Zhang, Hui; Tian, Yuan; Li, Yan; Kulasingam, Vathany; Drabovich, Andrei; Smith, Christopher R.; Batruch, Ihor; Oran, Paul E.; Fredolini, Claudia; Luchini, Alessandra; Liotta, Lance; Petricoin, Emanuel; Diamandis, Eleftherios P.; Chan, Daniel W.; Nelson, Randall; Lopez, Mary F.

    2013-01-01

    Over the past few years, mass spectrometry has emerged as a technology to complement and potentially replace standard immunoassays in routine clinical core laboratories. Application of mass spectrometry to protein and peptide measurement can provide advantages including high sensitivity, the ability to multiplex analytes, and high specificity at the amino acid sequence level. In our previous study, we demonstrated excellent reproducibility of mass spectrometry-selective reaction monitoring (MS-SRM) assays when applying standardized standard operating procedures (SOPs) to measure synthetic peptides in a complex sample, as lack of reproducibility has been a frequent criticism leveled at the use of mass spectrometers in the clinical laboratory compared to immunoassays. Furthermore, an important caveat of SRM-based assays for proteins is that many low-abundance analytes require some type of enrichment before detection with MS. This adds a level of complexity to the procedure and the potential for irreproducibility increases, especially across different laboratories with different operators. The purpose of this study was to test the interlaboratory reproducibility of SRM assays with various upfront enrichment strategies and different types of clinical samples (representing real-world body fluids commonly encountered in routine clinical laboratories). Three different, previously published enrichment strategies for low-abundance analytes and a no-enrichment strategy for high-abundance analytes were tested across four different laboratories using different liquid chromatography-SRM (LC-SRM) platforms and previously developed SOPs. The results demonstrated that these assays were indeed reproducible with coefficients of variation of less than 30% for the measurement of important clinical proteins across all four laboratories in real world samples. PMID:22639787
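
    A minimal sketch of the kind of acceptance check this abstract describes: computing the interlaboratory coefficient of variation (CV) per analyte and comparing it against the 30% criterion. The analyte names and peak-area ratios below are hypothetical illustrations, not data from the study.

    ```python
    # Illustrative check of the <30% interlaboratory CV criterion described above.
    # The analyte names and peak-area ratio values are hypothetical, not study data.
    from statistics import mean, stdev

    # Peak-area ratios (analyte / internal standard) reported by four laboratories
    # for the same pooled sample.
    measurements = {
        "peptide_A": [0.52, 0.47, 0.55, 0.50],   # one value per laboratory
        "peptide_B": [1.10, 0.95, 1.22, 1.05],
    }

    def interlab_cv(values):
        """Coefficient of variation (%) across laboratories."""
        return 100.0 * stdev(values) / mean(values)

    for analyte, values in measurements.items():
        cv = interlab_cv(values)
        status = "PASS" if cv < 30.0 else "FAIL"
        print(f"{analyte}: CV = {cv:.1f}% ({status} against the 30% criterion)")
    ```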

  2. Accelerate Healthcare Data Analytics: An Agile Practice to Perform Collaborative and Reproducible Analyses.

    PubMed

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Li, Jing; Hu, Gang; Xie, Guotong

    2016-01-01

    Recent advances in cloud computing and machine learning have made it more convenient for researchers to gain insights from massive healthcare data, yet performing analyses on healthcare data in current practice still lacks efficiency. Moreover, collaboration among researchers and the sharing of analysis results remain challenging. In this paper, we describe a practice that makes the analytics process collaborative and analysis results reproducible by exploiting and extending Jupyter Notebook. After applying this practice in our use cases, we can perform analyses and deliver results with less effort and in less time compared to our previous practice.

  3. An analytical nonlinear model for laminate multiferroic composites reproducing the DC magnetic bias dependent magnetoelectric properties.

    PubMed

    Lin, Lizhi; Wan, Yongping; Li, Faxin

    2012-07-01

    In this work, we propose an analytical nonlinear model for laminate multiferroic composites in which the magnetic-field-induced strain in the magnetostrictive phase is described by a standard square law that takes the stress effect into account, whereas the ferroelectric phase retains a linear piezoelectric response. Furthermore, differing from previous models that assume uniform deformation, we take into account stress attenuation and adopt non-uniform deformation along the layer thickness in both the piezoelectric and magnetostrictive phases. Analysis of this model for the L-T and L-L modes of sandwiched Terfenol-D/lead zirconate titanate/Terfenol-D composites can well reproduce the observed dc magnetic field (H(dc)) dependent magnetoelectric coefficients, which all reach their maxima at an H(dc) of about 500 Oe. The model also suggests that stress attenuation along the layer thickness in practical composites should be taken into account. Furthermore, the model indicates that a high volume fraction of the magnetostrictive phase is required to obtain giant magnetoelectric coupling, in agreement with existing models.
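
    For orientation, a hedged sketch of the constitutive relations this abstract refers to: a square-law magnetostrictive strain, with the stress effect entering through a stress-dependent magnetization, coupled to the standard linear piezoelectric equations. This is one common textbook form, not necessarily the exact equations used in the model above; all symbols are illustrative.

    ```latex
    % One common form of the constitutive relations described above (a sketch,
    % not necessarily the exact equations of the paper's model).
    % Magnetostrictive phase: square-law strain with the stress effect entering
    % through M(H_dc, sigma). Piezoelectric phase: linear response.
    \begin{align*}
      \varepsilon_m &= \frac{\sigma_m}{E_m}
                       + \lambda_s\!\left(\frac{M}{M_s}\right)^{2},
          & M &= M(H_{\mathrm{dc}}, \sigma_m), \\
      S_p &= s^{E} T_p + d\,E_3, \\
      D_3 &= d\,T_p + \varepsilon^{T} E_3 .
    \end{align*}
    ```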

  4. 39 CFR 3050.11 - Proposals to change an accepted analytical principle applied in the Postal Service's annual...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 39 Postal Service 1 2013-07-01 2013-07-01 false Proposals to change an accepted analytical... accepted analytical principle applied in the Postal Service's annual periodic reports to the Commission. (a... issue a notice of proceeding to change an accepted analytical principle. In addition, any...

  5. 39 CFR 3050.11 - Proposals to change an accepted analytical principle applied in the Postal Service's annual...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 39 Postal Service 1 2014-07-01 2014-07-01 false Proposals to change an accepted analytical... accepted analytical principle applied in the Postal Service's annual periodic reports to the Commission. (a... issue a notice of proceeding to change an accepted analytical principle. In addition, any...

  6. 39 CFR 3050.11 - Proposals to change an accepted analytical principle applied in the Postal Service's annual...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 39 Postal Service 1 2011-07-01 2011-07-01 false Proposals to change an accepted analytical... accepted analytical principle applied in the Postal Service's annual periodic reports to the Commission. (a... issue a notice of proceeding to change an accepted analytical principle. In addition, any...

  7. 39 CFR 3050.11 - Proposals to change an accepted analytical principle applied in the Postal Service's annual...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 39 Postal Service 1 2012-07-01 2012-07-01 false Proposals to change an accepted analytical... accepted analytical principle applied in the Postal Service's annual periodic reports to the Commission. (a... issue a notice of proceeding to change an accepted analytical principle. In addition, any...

  8. Analytical methodology for determination of helicopter IFR precision approach requirements. [pilot workload and acceptance level

    NASA Technical Reports Server (NTRS)

    Phatak, A. V.

    1980-01-01

    A systematic analytical approach to the determination of helicopter IFR precision approach requirements is formulated. The approach is based upon the hypothesis that pilot acceptance level or opinion rating of a given system is inversely related to the degree of pilot involvement in the control task. A nonlinear simulation of the helicopter approach to landing task incorporating appropriate models for UH-1H aircraft, the environmental disturbances and the human pilot was developed as a tool for evaluating the pilot acceptance hypothesis. The simulated pilot model is generic in nature and includes analytical representation of the human information acquisition, processing, and control strategies. Simulation analyses in the flight director mode indicate that the pilot model used is reasonable. Results of the simulation are used to identify candidate pilot workload metrics and to test the well known performance-work-load relationship. A pilot acceptance analytical methodology is formulated as a basis for further investigation, development and validation.

  9. Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools

    NASA Astrophysics Data System (ADS)

    Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.

    2015-12-01

    Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The 'recordr' library in R and the 'matlab-dataone' library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. And, finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software
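
    As a rough illustration of the provenance-capture idea, a run can be wrapped so that hashes of the script, its inputs, and its outputs are written to a trace file. This is not the recordr or matlab-dataone API; the function and field names below are hypothetical, and real traces would follow the ProvONE/W3C PROV model rather than this ad hoc JSON.

    ```python
    # Generic illustration of automated provenance capture, in the spirit of the
    # approach described above. NOT the recordr or matlab-dataone API; names are
    # hypothetical.
    import hashlib
    import json
    import subprocess
    import sys
    from datetime import datetime, timezone
    from pathlib import Path

    def sha256(path):
        """Content hash used to identify a file version."""
        return hashlib.sha256(Path(path).read_bytes()).hexdigest()

    def record_run(script, inputs, outputs, trace_file="provenance.json"):
        """Run a script and write a simple provenance trace (code, inputs, outputs)."""
        started = datetime.now(timezone.utc).isoformat()
        subprocess.run([sys.executable, script], check=True)
        trace = {
            "script": {"path": script, "sha256": sha256(script)},
            "inputs": [{"path": p, "sha256": sha256(p)} for p in inputs],
            "outputs": [{"path": p, "sha256": sha256(p)} for p in outputs],
            "started": started,
            "finished": datetime.now(timezone.utc).isoformat(),
        }
        Path(trace_file).write_text(json.dumps(trace, indent=2))
        return trace

    # Example: record_run("analysis.py", inputs=["raw.csv"], outputs=["figure1.png"])
    ```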

  10. Validation of analytical methods involved in dissolution assays: acceptance limits and decision methodologies.

    PubMed

    Rozet, E; Ziemons, E; Marini, R D; Boulanger, B; Hubert, Ph

    2012-11-02

    Dissolution tests are key elements to ensure continuing product quality and performance. The ultimate goal of these tests is to assure consistent product quality within a defined set of specification criteria. Validation of an analytical method aimed at assessing the dissolution profile of products, or at verifying pharmacopoeial compliance, should demonstrate that the method is able to correctly declare two dissolution profiles as similar or drug products as compliant with their specifications. It is essential to ensure that these analytical methods are fit for their purpose, and method validation is aimed at providing this guarantee. However, even the ICH Q2 guideline gives no information on how to decide whether the method under validation is valid for its final purpose. Are all the validation criteria needed to ensure that a quality control (QC) analytical method for a dissolution test is valid? What acceptance limits should be set on these criteria? How should one decide on a method's validity? These are the questions this work aims to answer. The focus is on complying with the current implementation of Quality by Design (QbD) principles in the pharmaceutical industry, in order to correctly define the Analytical Target Profile (ATP) of analytical methods involved in dissolution tests. Analytical method validation then becomes the natural demonstration that the developed methods are fit for their intended purpose, rather than the perfunctory checklist exercise still generally performed to complete the filing required to obtain product marketing authorization.

  11. A behavior-analytic account of depression and a case report using acceptance-based procedures

    PubMed Central

    Dougher, Michael J.; Hackbert, Lucianne

    1994-01-01

    Although roughly 6% of the general population is affected by depression at some time during their lifetime, the disorder has been relatively neglected by behavior analysts. The preponderance of research on the etiology and treatment of depression has been conducted by cognitive behavior theorists and biological psychiatrists and psychopharmacologists interested in the biological substrates of depression. These approaches have certainly been useful, but their reliance on cognitive and biological processes and their lack of attention to environment—behavior relations render them unsatisfactory from a behavior-analytic perspective. The purpose of this paper is to provide a behavior-analytic account of depression and to derive from this account several possible treatment interventions. In addition, case material is presented to illustrate an acceptance-based approach with a depressed client. PMID:22478195

  12. On the establishment of equivalence acceptance criterion in analytical similarity assessment.

    PubMed

    Wang, Tongrong; Chow, Shein-Chung

    2017-01-04

    For the assessment of biosimilarity of biosimilar products, the United States (US) Food and Drug Administration (FDA) proposed a stepwise approach for providing the totality-of-the-evidence of similarity between a proposed biosimilar product and a US-licensed (reference) product. The stepwise approach starts with the assessment of critical quality attributes (CQAs) that are relevant to clinical outcomes in the structural and functional characterization of the manufacturing process of the proposed biosimilar product. FDA suggests that these critical quality attributes be identified and classified into three tiers depending on their criticality or risk ranking. To assist sponsors, FDA also suggests statistical approaches for the assessment of analytical similarity for CQAs from different tiers, namely an equivalence test for Tier 1, a quality range approach for Tier 2, and descriptive raw data and graphical comparison for Tier 3. Analytical similarity assessment for CQAs in Tier 1 is performed based on the equivalence acceptance criterion (EAC), which depends upon the estimate of the variability of the reference product. The FDA's recommended approach often underestimates the variability of the reference product because it does not take the worst possible lots into consideration. In this article, we examine the statistical properties of the FDA's recommended approach and propose an alternative approach for the scenario in which multiple samples are drawn from each lot.
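
    A sketch of a Tier 1 equivalence test of the kind described above, assuming the commonly cited margin EAC = 1.5 × σ_R estimated from reference lots and a normal approximation to the 90% confidence interval of the mean difference. The lot values are hypothetical, and this is not the paper's proposed alternative method.

    ```python
    # Sketch of a Tier 1 equivalence test as described above.
    # Assumptions: EAC = 1.5 * sigma_R (the margin commonly cited for FDA's Tier 1
    # approach), sigma_R estimated from reference lots, normal approximation for
    # the 90% CI. All numbers are hypothetical.
    from math import sqrt
    from statistics import NormalDist, mean, stdev

    reference_lots = [100.2, 98.7, 101.5, 99.9, 100.8, 97.6, 102.1, 99.3, 100.5, 98.9]
    test_lots      = [101.0, 99.5, 100.2, 102.3, 98.8, 100.9]

    sigma_R = stdev(reference_lots)          # variability of the reference product
    eac = 1.5 * sigma_R                      # equivalence acceptance criterion

    diff = mean(test_lots) - mean(reference_lots)
    se = sqrt(stdev(test_lots) ** 2 / len(test_lots) +
              stdev(reference_lots) ** 2 / len(reference_lots))
    z = NormalDist().inv_cdf(0.95)           # two one-sided tests at alpha = 0.05
    ci_low, ci_high = diff - z * se, diff + z * se

    similar = -eac < ci_low and ci_high < eac
    print(f"mean difference = {diff:.2f}, 90% CI = ({ci_low:.2f}, {ci_high:.2f})")
    print(f"EAC = +/-{eac:.2f} -> analytical similarity {'passes' if similar else 'fails'}")
    ```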

  13. Using Functional Analytic Therapy to Train Therapists in Acceptance and Commitment Therapy, a Conceptual and Practical Framework

    ERIC Educational Resources Information Center

    Schoendorff, Benjamin; Steinwachs, Joanne

    2012-01-01

    How can therapists be effectively trained in clinical functional contextualism? In this conceptual article we propose a new way of training therapists in Acceptance and Commitment Therapy skills using tools from Functional Analytic Psychotherapy in a training context functionally similar to the therapeutic relationship. FAP has been successfully…

  14. Fast gradient separation by very high pressure liquid chromatography: reproducibility of analytical data and influence of delay between successive runs.

    PubMed

    Stankovich, Joseph J; Gritti, Fabrice; Beaver, Lois Ann; Stevenson, Paul G; Guiochon, Georges

    2013-11-29

    Five methods were used to implement fast gradient separations: constant flow rate, constant column-wall temperature, constant inlet pressure at moderate and high pressures (controlled by a pressure controller), and programmed-flow constant pressure. For programmed-flow constant pressure, the flow rates and gradient compositions are controlled through the method input instead of by the pressure controller, so minor fluctuations in the inlet pressure do not affect the mobile phase flow rate. The reproducibilities of the retention times, the response factors, and the eluted band widths of six successive separations of the same sample (9 components) were measured with different equilibration times between 0 and 15 min. The influence of the length of the equilibration time on these reproducibilities is discussed. The results show that the average column temperature may increase from one separation to the next and that this contributes to fluctuation of the results.

  15. Canine olfaction as an alternative to analytical instruments for disease diagnosis: understanding 'dog personality' to achieve reproducible results.

    PubMed

    Hackner, Klaus; Pleil, Joachim

    2017-01-09

    Recent literature has touted the use of canine olfaction as a diagnostic tool for identifying pre-clinical disease status, especially cancer and infection from biological media samples. Studies have shown a wide range of outcomes, ranging from almost perfect discrimination, all the way to essentially random results. This disparity is not likely to be a detection issue; dogs have been shown to have extremely sensitive noses as proven by their use for tracking, bomb detection and search and rescue. However, in contrast to analytical instruments, dogs are subject to boredom, fatigue, hunger and external distractions. These challenges are of particular importance in a clinical environment where task repetition is prized, but not as entertaining for a dog as chasing odours outdoors. The question addressed here is how to exploit the intrinsic sensitivity and simplicity of having a dog simply sniff out disease, in the face of variability in behavior and response.

  16. An Analytical Framework for Flood Water Conservation Considering Forecast Uncertainty and Acceptable Risk

    NASA Astrophysics Data System (ADS)

    Ding, W.; Zhang, C.

    2015-12-01

    Reservoir water levels are usually not allowed to exceed the flood limited water level (FLWL) during the flood season, a rule that neglects meteorological and real-time forecast information and leads to a great waste of water resources. With developments in weather forecasting, hydrologic modeling, and hydro-climatic teleconnection, streamflow forecast precision has improved considerably, providing technical support for flood water utilization. This paper addresses how much flood water can be conserved for use after the flood season through reservoir operation based on uncertain forecast information, taking into account the residual flood control capacity (the difference between the flood conveyance capacity and the expected inflow over a lead time). A two-stage model for dynamic control of the flood limited water level (DC-FLWL), the maximum allowed water level during the flood season, is established considering forecast uncertainty and acceptable flood risk. It is found that DC-FLWL is applicable when the reservoir inflow ranges from small to medium levels of the historical records, while both forecast uncertainty and acceptable risk in the downstream area affect the feasible space of DC-FLWL. As forecast uncertainty increases (under a given risk level) or as the acceptable risk level decreases (under a given forecast uncertainty level), the minimum required safety margin for flood control increases, and the opportunity for DC-FLWL decreases. The hedging rules derived from the modeling framework illustrate either the dominant role of water conservation or flood control, or the tradeoff between the two objectives, under different levels of forecast uncertainty and acceptable risk. These rules may provide useful guidelines for conserving flood water, especially in areas under heavy water stress.
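
    A simplified illustration of the qualitative relationship this abstract reports, assuming normally distributed forecast errors: the safety margin needed to keep the exceedance probability at or below the acceptable risk grows with forecast uncertainty and shrinks as the acceptable risk is relaxed. This is not the authors' two-stage DC-FLWL model; all parameters are hypothetical.

    ```python
    # Simplified illustration of the relationship between forecast uncertainty,
    # acceptable flood risk, and the required safety margin described above.
    # Not the authors' two-stage model; all parameters are hypothetical.
    from statistics import NormalDist

    def required_safety_margin(forecast_sigma, acceptable_risk):
        """Reserve needed so that P(inflow exceeds the forecast mean by more than
        the reserve) <= acceptable_risk, assuming normally distributed forecast
        errors with standard deviation forecast_sigma."""
        return NormalDist().inv_cdf(1.0 - acceptable_risk) * forecast_sigma

    for sigma in (10.0, 20.0, 40.0):          # forecast error std (10^6 m^3)
        for risk in (0.10, 0.05, 0.01):       # acceptable exceedance probability
            m = required_safety_margin(sigma, risk)
            print(f"sigma={sigma:5.1f}  risk={risk:4.2f}  margin={m:6.1f}")
    ```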

  17. Reproducibility blues.

    PubMed

    Pulverer, Bernd

    2015-11-12

    Research findings advance science only if they are significant, reliable and reproducible. Scientists and journals must publish robust data in a way that renders it optimally reproducible. Reproducibility has to be incentivized and supported by the research infrastructure but without dampening innovation.

  18. Analytical method transfer using equivalence tests with reasonable acceptance criteria and appropriate effort: extension of the ISPE concept.

    PubMed

    Kaminski, L; Schepers, U; Wätzig, H

    2010-12-15

    A method development process is commonly finalized by a method transfer from the developing laboratory to the routine laboratory. Statistical tests are performed to judge whether a transfer succeeded or failed. However, using the classic two-sample t-test can lead to misjudgments and unsatisfactory transfer results due to its test characteristics. The International Society for Pharmaceutical Engineering (ISPE) therefore employed a fixed method-transfer design using equivalence tests in its Guide for Technology Transfer. Although well received by analytical laboratories worldwide, this fixed design can easily produce high beta errors (rejection of successful transfers) or high workload (many analysts employed during transfer) if sigma(AN) (the error due to different analysts) exceeds 0.6%. Hence this work introduces an extended concept that circumvents this disadvantage by providing guidance for selecting a personalized and more appropriate experimental design. First, it demonstrates that the former t-test-related acceptance criteria can be scaled by a factor of 1.15, which allows a broader tolerance without a loss of decision certainty. Furthermore, a decision guide for choosing the proper number of analysts or series at given percentage acceptance limits (%AL) is presented.
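
    A generic Monte Carlo sketch of the beta-error problem this abstract discusses: the chance that a truly successful transfer (true bias of zero) is rejected by an equivalence test, as a function of the number of analysts and the analyst-to-analyst variability sigma(AN). This is not the ISPE design or the extended concept itself, and all parameters (acceptance limit, replicate counts, variabilities) are hypothetical.

    ```python
    # Monte Carlo sketch of how sigma(AN) and the number of analysts drive the
    # beta error (rejection of a truly successful transfer) in an equivalence
    # test. Generic illustration only; all parameters are hypothetical.
    import random
    from math import sqrt
    from statistics import NormalDist, mean, stdev

    def transfer_accepted(n_analysts, n_reps, sigma_an, sigma_rep, accept_limit):
        """Simulate one transfer with zero true bias; apply a 90% CI equivalence test."""
        results = []
        for _ in range(n_analysts):
            analyst_bias = random.gauss(0.0, sigma_an)
            results += [analyst_bias + random.gauss(0.0, sigma_rep) for _ in range(n_reps)]
        d, se = mean(results), stdev(results) / sqrt(len(results))
        z = NormalDist().inv_cdf(0.95)
        return -accept_limit < d - z * se and d + z * se < accept_limit

    def beta_error(n_analysts, trials=5000, **kw):
        rejected = sum(not transfer_accepted(n_analysts, **kw) for _ in range(trials))
        return rejected / trials

    random.seed(1)
    for n in (2, 3, 4, 6):
        b = beta_error(n, n_reps=6, sigma_an=0.8, sigma_rep=0.5, accept_limit=1.5)
        print(f"{n} analysts -> estimated beta error = {b:.2%}")
    ```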

  19. Does Acceptance and Relationship Focused Behavior Therapy Contribute to Bupropion Outcomes? A Randomized Controlled Trial of Functional Analytic Psychotherapy and Acceptance and Commitment Therapy for Smoking Cessation

    ERIC Educational Resources Information Center

    Gifford, Elizabeth V.; Kohlenberg, Barbara S.; Hayes, Steven C.; Pierson, Heather M.; Piasecki, Melissa P.; Antonuccio, David O.; Palm, Kathleen M.

    2011-01-01

    This study evaluated a treatment combining bupropion with a novel acceptance and relationship focused behavioral intervention based on the acceptance and relationship context (ARC) model. Three hundred and three smokers from a community sample were randomly assigned to bupropion, a widely used smoking cessation medication, or bupropion plus…

  20. Acceptance- and mindfulness-based interventions for the treatment of chronic pain: a meta-analytic review.

    PubMed

    Veehof, M M; Trompetter, H R; Bohlmeijer, E T; Schreurs, K M G

    2016-01-01

    The number of acceptance- and mindfulness-based interventions for chronic pain, such as acceptance and commitment therapy (ACT), mindfulness-based stress reduction (MBSR), and mindfulness-based cognitive therapy (MBCT), has increased in recent years. An update of our former systematic review and meta-analysis of studies reporting effects on the mental and physical health of chronic pain patients is therefore warranted. PubMed, EMBASE, PsycInfo and Cochrane were searched for eligible studies. The current meta-analysis included only randomized controlled trials (RCTs). Studies were rated for quality; mean quality did not improve in recent years. Pooled standardized mean differences were calculated using the random-effects model to represent the average intervention effect and to perform subgroup analyses. Outcome measures were pain intensity, depression, anxiety, pain interference, disability and quality of life. Twenty-five RCTs totaling 1285 patients with chronic pain were included, in which acceptance- and mindfulness-based interventions were compared to waitlist, (medical) treatment-as-usual, and education or support control groups. Effect sizes ranged from small (on all outcome measures except anxiety and pain interference) to moderate (on anxiety and pain interference) at post-treatment, and from small (on pain intensity and disability) to large (on pain interference) at follow-up. ACT showed significantly higher effects on depression and anxiety than MBSR and MBCT. Study quality, attrition rate, type of pain and type of control group did not moderate the effects of acceptance- and mindfulness-based interventions. Current acceptance- and mindfulness-based interventions, while not superior to traditional cognitive behavioral treatments, can be good alternatives.
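
    A short sketch of the random-effects pooling step described above, using the DerSimonian-Laird estimator for the between-study variance. The effect sizes and variances are hypothetical, not those of the trials included in the review.

    ```python
    # Sketch of random-effects pooling of standardized mean differences (SMDs)
    # with the DerSimonian-Laird estimator. Effect sizes and variances are
    # hypothetical, not the review's data.
    def pool_random_effects(effects, variances):
        """Return the pooled SMD, its standard error, and tau^2."""
        w_fixed = [1.0 / v for v in variances]
        sum_w = sum(w_fixed)
        fixed_mean = sum(w * y for w, y in zip(w_fixed, effects)) / sum_w
        # Cochran's Q and the DerSimonian-Laird between-study variance tau^2
        q = sum(w * (y - fixed_mean) ** 2 for w, y in zip(w_fixed, effects))
        c = sum_w - sum(w * w for w in w_fixed) / sum_w
        tau2 = max(0.0, (q - (len(effects) - 1)) / c)
        w_rand = [1.0 / (v + tau2) for v in variances]
        pooled = sum(w * y for w, y in zip(w_rand, effects)) / sum(w_rand)
        se = (1.0 / sum(w_rand)) ** 0.5
        return pooled, se, tau2

    effects   = [0.35, 0.20, 0.55, 0.10, 0.42]   # per-study SMDs (hypothetical)
    variances = [0.02, 0.03, 0.05, 0.02, 0.04]   # per-study variances (hypothetical)
    smd, se, tau2 = pool_random_effects(effects, variances)
    print(f"pooled SMD = {smd:.2f} (SE {se:.2f}), tau^2 = {tau2:.3f}")
    ```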

  1. Analytical Validation of a Highly Quantitative, Sensitive, Accurate, and Reproducible Assay (HERmark®) for the Measurement of HER2 Total Protein and HER2 Homodimers in FFPE Breast Cancer Tumor Specimens

    PubMed Central

    Larson, Jeffrey S.; Goodman, Laurie J.; Tan, Yuping; Defazio-Eli, Lisa; Paquet, Agnes C.; Cook, Jennifer W.; Rivera, Amber; Frankson, Kristi; Bose, Jolly; Chen, Lili; Cheung, Judy; Shi, Yining; Irwin, Sarah; Kiss, Linda D. B.; Huang, Weidong; Utter, Shannon; Sherwood, Thomas; Bates, Michael; Weidler, Jodi; Parry, Gordon; Winslow, John; Petropoulos, Christos J.; Whitcomb, Jeannette M.

    2010-01-01

    We report here the results of the analytical validation of assays that measure HER2 total protein (H2T) and HER2 homodimer (H2D) expression in Formalin Fixed Paraffin Embedded (FFPE) breast cancer tumors as well as cell line controls. The assays are based on the VeraTag technology platform and are commercially available through a central CAP-accredited clinical reference laboratory. The accuracy of H2T measurements spans a broad dynamic range (2-3 logs) as evaluated by comparison with cross-validating technologies. The measurement of H2T expression demonstrates a sensitivity that is approximately 7–10 times greater than conventional immunohistochemistry (IHC) (HercepTest). The HERmark assay is a quantitative assay that sensitively and reproducibly measures continuous H2T and H2D protein expression levels and therefore may have the potential to stratify patients more accurately with respect to response to HER2-targeted therapies than current methods which rely on semiquantitative protein measurements (IHC) or on indirect assessments of gene amplification (FISH). PMID:21151530

  2. The Need for Reproducibility

    SciTech Connect

    Robey, Robert W.

    2016-06-27

    The purpose of this presentation is to consider issues of reproducibility; specifically, whether bitwise reproducible computation is possible, whether computational research in DOE improves its publication process, and whether reproducible results can be achieved apart from the peer review process.

  3. Tissue compliance meter is a more reproducible method of measuring radiation-induced fibrosis than late effects of normal tissue-subjective objective management analytical in patients treated with intracavitary brachytherapy accelerated partial breast irradiation: results of a prospective trial.

    PubMed

    Wernicke, A Gabriella; Greenwood, Eleni A; Coplowitz, Shana; Parashar, Bhupesh; Kulidzhanov, Fridon; Christos, Paul J; Fischer, Andrew; Nori, Dattatreyudu; Chao, Kun S Clifford

    2013-01-01

    Identification of radiation-induced fibrosis (RIF) remains a challenge with the Late Effects of Normal Tissue-Subjective Objective Management Analytical (LENT-SOMA) scale. The tissue compliance meter (TCM), a non-invasive applicator, may provide a more reproducible tool for measuring RIF. In this study, we prospectively quantify RIF after intracavitary brachytherapy (IB) accelerated partial breast irradiation (APBI) with TCM and compare it with LENT-SOMA. Thirty-nine women with American Joint Committee on Cancer Stage 0-I breast cancer, treated with lumpectomy and intracavitary brachytherapy delivered by accelerated partial breast irradiation (IBAPBI), were evaluated by two raters in a prospective manner before IBAPBI and every 6 months after IBAPBI for development of RIF, using TCM and LENT-SOMA. The TCM classification scale grades RIF as 0 = none, 1 = mild, 2 = moderate, and 3 = severe, corresponding to a change in TCM (ΔTCM) between the IBAPBI and nonirradiated breasts of ≤2.9, 3.0-5.9, 6.0-8.9, and ≥9.0 mm, respectively. The LENT-SOMA scale employs clinical palpation to grade RIF as 0 = none, 1 = mild, 2 = moderate, and 3 = severe. Intraclass (ICC), Pearson (r), and Cohen's kappa (κ) correlation coefficients were employed to assess the reliability of TCM and LENT-SOMA. Multivariate and univariate linear models explored the relationship between RIF and anatomical parameters [bra cup size], antihormonal therapy, and dosimetric factors [balloon diameter, skin-to-balloon distance (SBD), V150, and V200]. Median time to follow-up from completion of IBAPBI is 3.6 years (range, 0.8-4.9 years). Median age is 69 years (range, 47-82 years). Median breast cup size is 39D (range, 34B-44DDD). Median balloon size is 41.2 cc (range, 37.6-50.0 cc), and median SBD is 1.4 cm (range, 0.2-5.5 cm). Before IBAPBI, TCM measurements demonstrate high interobserver agreement between the two raters in all four quadrants of both breasts, with ICC ≥ 0.997 (95% CI 0.994-1.000). After 36 months, RIF is graded by the TCM scale as 0
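
    The ΔTCM thresholds quoted above map directly onto a grading function; a minimal transcription (with hypothetical example values) is shown below.

    ```python
    # Direct transcription of the TCM grading thresholds quoted above:
    # grade 0 (none) for delta TCM <= 2.9 mm, 1 (mild) for 3.0-5.9 mm,
    # 2 (moderate) for 6.0-8.9 mm, 3 (severe) for >= 9.0 mm.
    # The example values are hypothetical.
    def tcm_grade(delta_tcm_mm):
        """Map a change in tissue compliance (mm) to the RIF grade 0-3."""
        if delta_tcm_mm <= 2.9:
            return 0   # none
        if delta_tcm_mm < 6.0:
            return 1   # mild
        if delta_tcm_mm < 9.0:
            return 2   # moderate
        return 3       # severe

    for delta in (1.2, 4.5, 7.8, 10.3):
        print(f"delta TCM = {delta:4.1f} mm -> grade {tcm_grade(delta)}")
    ```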

  4. A reproducible method for determination of nitrocellulose in soil.

    PubMed

    Macmillan, Denise K; Majerus, Chelsea R; Laubscher, Randy D; Shannon, John P

    2008-01-15

    A reproducible analytical method for determination of nitrocellulose in soil is described. The new method provides the precision and accuracy needed for quantitation of nitrocellulose in soils to enable worker safety on contaminated sites. The method utilizes water and ethanol washes to remove co-contaminants, acetone extraction of nitrocellulose, and base hydrolysis of the extract to reduce nitrate groups. The hydrolysate is then neutralized and analyzed by ion chromatography for determination of free nitrate and nitrite. A variety of bases for hydrolysis and acids for neutralization were evaluated, with 5 N sodium hydroxide and carbon dioxide giving the most complete hydrolysis and interference-free neutralization, respectively. The concentration of nitrocellulose in the soil is calculated from the concentrations of nitrate and nitrite and the weight percentage of nitrogen content in nitrocellulose. The laboratory detection limit for the analysis is 10 mg/kg. The method acceptance range for recovery of nitrocellulose from control samples is 78-105%.
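
    A sketch of the back-calculation described above: hydrolysate nitrate and nitrite are converted to total nitrogen, which is divided by the weight fraction of nitrogen in nitrocellulose. The nitrogen fraction of nitrocellulose depends on its degree of nitration, so the 0.13 used below is an assumed placeholder (not the paper's value), and sample-preparation dilution and extraction factors are omitted.

    ```python
    # Sketch of the nitrocellulose back-calculation described above. The nitrogen
    # weight fraction of nitrocellulose (0.13) is an assumed placeholder, and
    # dilution/extraction factors from sample preparation are omitted for brevity.
    N, O = 14.007, 15.999                    # atomic masses (g/mol)
    N_FRACTION_NITRATE = N / (N + 3 * O)     # ~0.226 of the NO3- mass is nitrogen
    N_FRACTION_NITRITE = N / (N + 2 * O)     # ~0.304 of the NO2- mass is nitrogen

    def nitrocellulose_mg_per_kg(nitrate_mg_kg, nitrite_mg_kg, nc_nitrogen_fraction=0.13):
        """Estimate soil nitrocellulose from hydrolysate nitrate/nitrite results."""
        total_nitrogen = (nitrate_mg_kg * N_FRACTION_NITRATE +
                          nitrite_mg_kg * N_FRACTION_NITRITE)
        return total_nitrogen / nc_nitrogen_fraction

    print(f"{nitrocellulose_mg_per_kg(20.0, 5.0):.1f} mg/kg nitrocellulose (illustrative)")
    ```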

  5. Thou Shalt Be Reproducible! A Technology Perspective

    PubMed Central

    Mair, Patrick

    2016-01-01

    This article elaborates on reproducibility in psychology from a technological viewpoint. Modern open source computational environments that foster reproducibility throughout the whole research life cycle, and to which emerging psychology researchers should be sensitized, are shown and explained. First, data archiving platforms that make datasets publicly available are presented. Second, R is advocated as the data-analytic lingua franca in psychology for achieving reproducible statistical analysis. Third, dynamic report generation environments for writing reproducible manuscripts that integrate text, data analysis, and statistical outputs such as figures and tables in a single document are described. Supplementary materials are provided in order to get the reader started with these technologies. PMID:27471486

  6. Reproducibility in Chemical Research.

    PubMed

    Bergman, Robert G; Danheiser, Rick L

    2016-10-04

    "… To what extent is reproducibility a significant issue in chemical research? How can problems involving irreproducibility be minimized? … Researchers should be aware of the dangers of unconscious investigator bias, all papers should provide adequate experimental detail, and Reviewers have a responsibility to carefully examine papers for adequacy of experimental detail and support for the conclusions …" Read more in the Editorial by Robert G. Bergman and Rick L. Danheiser.

  7. Phase II Fort Ord Landfill Demonstration Task 8 - Refinement of In-line Instrumental Analytical Tools to Evaluate their Operational Utility and Regulatory Acceptance

    SciTech Connect

    Daley, P F

    2006-04-03

    The overall objective of this project is the continued development, installation, and testing of continuous water sampling and analysis technologies for application to on-site monitoring of groundwater treatment systems and remediation sites. In a previous project, an on-line analytical system (OLAS) for multistream water sampling was installed at the Fort Ord Operable Unit 2 Groundwater Treatment System, with the objective of developing a simplified analytical method for detection of Compounds of Concern at that plant, and continuous sampling of up to twelve locations in the treatment system, from raw influent waters to treated effluent. Earlier implementations of the water sampling and processing system (Analytical Sampling and Analysis Platform, ASAP; A+RT, Milpitas, CA) depended on off-line integrators that produced paper plots of chromatograms, and sent summary tables to a host computer for archiving. We developed a basic LabVIEW (National Instruments, Inc., Austin, TX) based gas chromatography control and data acquisition system that was the foundation for further development and integration with the ASAP system. Advantages of this integration include electronic archiving of all raw chromatographic data, and a flexible programming environment to support development of improved ASAP operation and automated reporting. The initial goals of integrating the preexisting LabVIEW chromatography control system with the ASAP, and demonstration of a simplified, site-specific analytical method, were successfully achieved. However, although the principal objective of this system was assembly of an analytical system that would allow plant operators an up-to-the-minute view of the plant's performance, several obstacles remained. Data reduction with the base LabVIEW system was limited to peak detection and simple tabular output, patterned after commercial chromatography integrators, with compound retention times and peak areas. Preparation of calibration curves, method detection

  8. Reproducible research in palaeomagnetism

    NASA Astrophysics Data System (ADS)

    Lurcock, Pontus; Florindo, Fabio

    2015-04-01

    The reproducibility of research findings is attracting increasing attention across all scientific disciplines. In palaeomagnetism as elsewhere, computer-based analysis techniques are becoming more commonplace, complex, and diverse. Analyses can often be difficult to reproduce from scratch, both for the original researchers and for others seeking to build on the work. We present a palaeomagnetic plotting and analysis program designed to make reproducibility easier. Part of the problem is the divide between interactive and scripted (batch) analysis programs. An interactive desktop program with a graphical interface is a powerful tool for exploring data and iteratively refining analyses, but usually cannot operate without human interaction. This makes it impossible to re-run an analysis automatically, or to integrate it into a larger automated scientific workflow - for example, a script to generate figures and tables for a paper. In some cases the parameters of the analysis process itself are not saved explicitly, making it hard to repeat or improve the analysis even with human interaction. Conversely, non-interactive batch tools can be controlled by pre-written scripts and configuration files, allowing an analysis to be 'replayed' automatically from the raw data. However, this advantage comes at the expense of exploratory capability: iteratively improving an analysis entails a time-consuming cycle of editing scripts, running them, and viewing the output. Batch tools also tend to require more computer expertise from their users. PuffinPlot is a palaeomagnetic plotting and analysis program which aims to bridge this gap. First released in 2012, it offers both an interactive, user-friendly desktop interface and a batch scripting interface, both making use of the same core library of palaeomagnetic functions. We present new improvements to the program that help to integrate the interactive and batch approaches, allowing an analysis to be interactively explored and refined

  9. Patent Law's Reproducibility Paradox.

    PubMed

    Sherkow, Jacob S

    2017-01-01

    Clinical research faces a reproducibility crisis. Many recent clinical and preclinical studies appear to be irreproducible--their results cannot be verified by outside researchers. This is problematic for not only scientific reasons but also legal ones: patents grounded in irreproducible research appear to fail their constitutional bargain of property rights in exchange for working disclosures of inventions. The culprit is likely patent law’s doctrine of enablement. Although the doctrine requires patents to enable others to make and use their claimed inventions, current difficulties in applying the doctrine hamper or even actively dissuade reproducible data in patents. This Article assesses the difficulties in reconciling these basic goals of scientific research and patent law. More concretely, it provides several examples of irreproducibility in patents on blockbuster drugs--Prempro, Xigris, Plavix, and Avastin--and discusses some of the social costs of the misalignment between good clinical practice and patent doctrine. Ultimately, this analysis illuminates several current debates concerning innovation policy. It strongly suggests that a proper conception of enablement should take into account after-arising evidence. It also sheds light on the true purpose--and limits--of patent disclosure. And lastly, it untangles the doctrines of enablement and utility.

  10. Opening Reproducible Research

    NASA Astrophysics Data System (ADS)

    Nüst, Daniel; Konkol, Markus; Pebesma, Edzer; Kray, Christian; Klötgen, Stephanie; Schutzeichel, Marc; Lorenz, Jörg; Przibytzin, Holger; Kussmann, Dirk

    2016-04-01

    Open access is not only a form of publishing such that research papers become available to the large public free of charge; it also refers to a trend in science whereby the act of doing research becomes more open and transparent. When science transforms to open access we not only mean access to papers, research data being collected, or data being generated, but also access to the data used and the procedures carried out in the research paper. Increasingly, scientific results are generated by numerical manipulation of data that were already collected, and may involve simulation experiments that are completely carried out computationally. Reproducibility of research findings, the ability to repeat experimental procedures and confirm previously found results, is at the heart of the scientific method (Pebesma, Nüst and Bivand, 2012). As opposed to the collection of experimental data in labs or nature, computational experiments lend themselves very well to reproduction. Some of the reasons why scientists do not publish data and computational procedures that allow reproduction will be hard to change, e.g. privacy concerns in the data, fear of embarrassment or of losing a competitive advantage. Other reasons, however, involve technical aspects, and include the lack of standard procedures to publish such information and the lack of benefits after publishing them. We aim to resolve these two technical aspects. We propose a system that supports the evolution of scientific publications from static papers into dynamic, executable research documents. The DFG-funded experimental project Opening Reproducible Research (ORR) addresses the main aspects of open access by improving the exchange of, facilitating productive access to, and simplifying the reuse of research results that are published over the Internet. Central to the project is a new form for creating and providing research results, the executable research compendium (ERC), which not only enables third parties to

  11. Reproducibility in a multiprocessor system

    SciTech Connect

    Bellofatto, Ralph A; Chen, Dong; Coteus, Paul W; Eisley, Noel A; Gara, Alan; Gooding, Thomas M; Haring, Rudolf A; Heidelberger, Philip; Kopcsay, Gerard V; Liebsch, Thomas A; Ohmacht, Martin; Reed, Don D; Senger, Robert M; Steinmacher-Burow, Burkhard; Sugawara, Yutaka

    2013-11-26

    Fixing a problem is usually greatly aided if the problem is reproducible. To ensure reproducibility of a multiprocessor system, the following aspects are proposed: a deterministic system start state, a single system clock, phase alignment of clocks in the system, system-wide synchronization events, reproducible execution of system components, deterministic chip interfaces, zero-impact communication with the system, precise stop of the system, and a scan of the system state.

  12. Approaches to acceptable risk

    SciTech Connect

    Whipple, C.

    1997-04-30

    Several alternative approaches to address the question "How safe is safe enough?" are reviewed and an attempt is made to apply the reasoning behind these approaches to the issue of acceptability of radiation exposures received in space. The approaches to the issue of the acceptability of technological risk described here are primarily analytical, and are drawn from examples in the management of environmental health risks. These include risk-based approaches, in which specific quantitative risk targets determine the acceptability of an activity, and cost-benefit and decision analysis, which generally focus on the estimation and evaluation of risks, benefits and costs, in a framework that balances these factors against each other. These analytical methods tend by their quantitative nature to emphasize the magnitude of risks, costs and alternatives, and to downplay other factors, especially those that are not easily expressed in quantitative terms, that affect acceptance or rejection of risk. Such other factors include the issues of risk perceptions and how and by whom risk decisions are made.

  13. Contextual sensitivity in scientific reproducibility.

    PubMed

    Van Bavel, Jay J; Mende-Siedlecki, Peter; Brady, William J; Reinero, Diego A

    2016-06-07

    In recent years, scientists have paid increasing attention to reproducibility. For example, the Reproducibility Project, a large-scale replication attempt of 100 studies published in top psychology journals found that only 39% could be unambiguously reproduced. There is a growing consensus among scientists that the lack of reproducibility in psychology and other fields stems from various methodological factors, including low statistical power, researcher's degrees of freedom, and an emphasis on publishing surprising positive results. However, there is a contentious debate about the extent to which failures to reproduce certain results might also reflect contextual differences (often termed "hidden moderators") between the original research and the replication attempt. Although psychologists have found extensive evidence that contextual factors alter behavior, some have argued that context is unlikely to influence the results of direct replications precisely because these studies use the same methods as those used in the original research. To help resolve this debate, we recoded the 100 original studies from the Reproducibility Project on the extent to which the research topic of each study was contextually sensitive. Results suggested that the contextual sensitivity of the research topic was associated with replication success, even after statistically adjusting for several methodological characteristics (e.g., statistical power, effect size). The association between contextual sensitivity and replication success did not differ across psychological subdisciplines. These results suggest that researchers, replicators, and consumers should be mindful of contextual factors that might influence a psychological process. We offer several guidelines for dealing with contextual sensitivity in reproducibility.

  14. Tissue Doppler imaging reproducibility during exercise.

    PubMed

    Bougault, V; Nottin, S; Doucende, G; Obert, P

    2008-05-01

    Tissue Doppler imaging (TDI) is an echocardiographic technique used during exercise to improve the accuracy of cardiovascular diagnosis. The validity of TDI requires its reproducibility, which has never been evaluated during moderate- to maximal-intensity exercise. The present study was specifically designed to assess the reproducibility of transmitral Doppler and pulsed TDI in 19 healthy men who underwent two identical semi-supine maximal exercise tests on a cycle ergometer. Systolic (S') and diastolic (E') tissue velocities at the septal and lateral walls as well as early transmitral velocities (E) were assessed during exercise up to maximal effort. The data were compared between the two tests at 40%, 60%, 80% and 100% of maximal aerobic power. Despite upper body movements and hyperventilation, good quality echocardiographic images were obtained in each case. Regardless of exercise intensity, no differences were noticed between the two tests for any measurement. The coefficients of variation for Doppler variables ranged from 3% to 9% over the transition from rest to maximal exercise. The random measurement error was, on average, 5.8 cm/s for E' and 4.4 cm/s for S'. Overall, the reproducibility of TDI was acceptable. Tissue Doppler imaging can be used to accurately evaluate LV diastolic and/or systolic function over this range of exercise intensities.

  15. Open Science and Research Reproducibility

    PubMed Central

    Munafò, Marcus

    2016-01-01

    Many scientists, journals and funders are concerned about the low reproducibility of many scientific findings. One approach that may serve to improve the reliability and robustness of research is open science. Here I argue that the process of pre-registering study protocols, sharing study materials and data, and posting preprints of manuscripts may serve to improve quality control procedures at every stage of the research pipeline, and in turn improve the reproducibility of published work. PMID:27350794

  16. Contextual sensitivity in scientific reproducibility

    PubMed Central

    Van Bavel, Jay J.; Mende-Siedlecki, Peter; Brady, William J.; Reinero, Diego A.

    2016-01-01

    In recent years, scientists have paid increasing attention to reproducibility. For example, the Reproducibility Project, a large-scale replication attempt of 100 studies published in top psychology journals found that only 39% could be unambiguously reproduced. There is a growing consensus among scientists that the lack of reproducibility in psychology and other fields stems from various methodological factors, including low statistical power, researcher’s degrees of freedom, and an emphasis on publishing surprising positive results. However, there is a contentious debate about the extent to which failures to reproduce certain results might also reflect contextual differences (often termed “hidden moderators”) between the original research and the replication attempt. Although psychologists have found extensive evidence that contextual factors alter behavior, some have argued that context is unlikely to influence the results of direct replications precisely because these studies use the same methods as those used in the original research. To help resolve this debate, we recoded the 100 original studies from the Reproducibility Project on the extent to which the research topic of each study was contextually sensitive. Results suggested that the contextual sensitivity of the research topic was associated with replication success, even after statistically adjusting for several methodological characteristics (e.g., statistical power, effect size). The association between contextual sensitivity and replication success did not differ across psychological subdisciplines. These results suggest that researchers, replicators, and consumers should be mindful of contextual factors that might influence a psychological process. We offer several guidelines for dealing with contextual sensitivity in reproducibility. PMID:27217556

  17. Towards Reproducibility in Computational Hydrology

    NASA Astrophysics Data System (ADS)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei

    2016-04-01

    The ability to reproduce published scientific findings is a foundational principle of scientific research. Independent observation helps to verify the legitimacy of individual findings; build upon sound observations so that we can evolve hypotheses (and models) of how catchments function; and move them from specific circumstances to more general theory. The rise of computational research has brought increased focus on the issue of reproducibility across the broader scientific literature. This is because publications based on computational research typically do not contain sufficient information to enable the results to be reproduced, and therefore verified. Given the rise of computational analysis in hydrology over the past 30 years, to what extent is reproducibility, or a lack thereof, a problem in hydrology? Whilst much hydrological code is accessible, the actual code and workflow that produced, and therefore documents the provenance of, published scientific findings is rarely available. We argue that in order to advance and make more robust the process of hypothesis testing and knowledge creation within the computational hydrological community, we need to build on existing open data initiatives and adopt common standards and infrastructures to: first, make code re-useable and easy to find through consistent use of metadata; second, publish well documented workflows that combine re-useable code together with data to enable published scientific findings to be reproduced; finally, use unique persistent identifiers (e.g. DOIs) to reference re-useable and reproducible code, thereby clearly showing the provenance of published scientific findings. Whilst extra effort is required to make work reproducible, there are benefits to both the individual and the broader community in doing so, which will improve the credibility of the science in the face of the need for societies to adapt to changing hydrological environments.

  18. Rotary head type reproducing apparatus

    DOEpatents

    Takayama, Nobutoshi; Edakubo, Hiroo; Kozuki, Susumu; Takei, Masahiro; Nagasawa, Kenichi

    1986-01-01

    In an apparatus of the kind arranged to reproduce, with a plurality of rotary heads, an information signal from a record bearing medium having many recording tracks which are parallel to each other with the information signal recorded therein and with a plurality of different pilot signals of different frequencies also recorded one by one, one in each of the recording tracks, a plurality of different reference signals of different frequencies are simultaneously generated. A tracking error is detected by using the different reference signals together with the pilot signals which are included in signals reproduced from the plurality of rotary heads.

  19. Reproducible Bioinformatics Research for Biologists

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This book chapter describes the current Big Data problem in Bioinformatics and the resulting issues with performing reproducible computational research. The core of the chapter provides guidelines and summaries of current tools/techniques that a noncomputational researcher would need to learn to pe...

  20. Reproducible research in computational science.

    PubMed

    Peng, Roger D

    2011-12-02

    Computational science has led to exciting new developments, but the nature of the work has exposed limitations in our ability to evaluate published findings. Reproducibility has the potential to serve as a minimum standard for judging scientific claims when full independent replication of a study is not possible.

  1. Acceptance speech.

    PubMed

    Yusuf, C K

    1994-01-01

    I am proud and honored to accept this award on behalf of the Government of Bangladesh, and the millions of Bangladeshi children saved by oral rehydration solution. The Government of Bangladesh is grateful for this recognition of its commitment to international health and population research and cost-effective health care for all. The Government of Bangladesh has already made remarkable strides forward in the health and population sector, and this was recognized in UNICEF's 1993 "State of the World's Children". The national contraceptive prevalence rate, at 40%, is higher than that of many developed countries. It is appropriate that Bangladesh, where ORS was discovered, has the largest ORS production capacity in the world. It was remarkable that after the devastating cyclone in 1991, the country was able to produce enough ORS to meet the needs and remain self-sufficient. Similarly, Bangladesh has one of the most effective, flexible and efficient control of diarrheal disease and epidemic response program in the world. Through the country, doctors have been trained in diarrheal disease management, and stores of ORS are maintained ready for any outbreak. Despite grim predictions after the 1991 cyclone and the 1993 floods, relatively few people died from diarrheal disease. This is indicative of the strength of the national program. I want to take this opportunity to acknowledge the contribution of ICDDR, B and the important role it plays in supporting the Government's efforts in the health and population sector. The partnership between the Government of Bangladesh and ICDDR, B has already borne great fruit, and I hope and believe that it will continue to do so for many years in the future. Thank you.

  2. Reproducibility and Validity of a Handheld Spirometer

    PubMed Central

    Barr, R Graham; Stemple, Kimberly J.; Mesia-Vela, Sonia; Basner, Robert C.; Derk, Susan; Henneberger, Paul; Milton, Donald K; Taveras, Brenda

    2013-01-01

    Background: Handheld spirometers have several advantages over desktop spirometers, but worries persist regarding their reproducibility and validity. We undertook an independent examination of an ultrasonic flow-sensing handheld spirometer. Methods: Laboratory methods included reproducibility and validity testing using a waveform generator with standard American Thoracic Society (ATS) waveforms, in-line testing, calibration adaptor testing, and compression of the mouthpiece. Clinical testing involved repeated testing of 24 spirometry-naive volunteers and comparison to a volume-sensing dry rolling seal spirometer. Results: The EasyOne Diagnostic spirometer exceeded standard thresholds of acceptability for ATS waveforms. In-line testing yielded valid results, with relative differences (mean ± SD) between the EasyOne and the reference spirometer of 0.03±0.23 L for the forced vital capacity (FVC) and −0.06±0.09 L for the forced expiratory volume in one second (FEV1). The calibration adaptor showed no appreciable problems, but extreme compression of the mouthpiece reduced measures. In clinical testing, coefficients of variation and limits of agreement were, respectively: 3.3% and 0.24 L for the FVC; 2.6% and 0.18 L for the FEV1; and 1.9% and 0.05 for the FEV1/FVC ratio. The EasyOne yielded lower values than the reference spirometry (FVC: −0.12 L; FEV1: −0.17 L; FEV1/FVC ratio: −0.02). Limits of agreement were within criteria for FVC but not for the FEV1, possibly due to a training effect. Conclusion: The EasyOne spirometer yielded generally reproducible results that were generally valid compared to laboratory-based spirometry. The use of this handheld spirometer in clinical, occupational and research settings seems justified. PMID:18364054
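
    For reference, a minimal sketch of the agreement statistics reported above (repeatability coefficient of variation and Bland-Altman limits of agreement); the FEV1 values below are hypothetical, not the study's measurements.

    ```python
    # Illustration of the agreement statistics reported above. Values are hypothetical.
    from statistics import mean, stdev

    # Repeated handheld FEV1 readings (L) for one subject: within-device repeatability.
    repeats = [3.12, 3.05, 3.18]
    cv = 100.0 * stdev(repeats) / mean(repeats)

    # Paired handheld vs reference FEV1 readings (L) across subjects: validity.
    handheld  = [3.10, 2.85, 3.42, 2.60, 3.95, 3.20]
    reference = [3.25, 2.95, 3.55, 2.75, 4.05, 3.40]
    diffs = [h - r for h, r in zip(handheld, reference)]
    bias, loa = mean(diffs), 1.96 * stdev(diffs)

    print(f"repeatability CV = {cv:.1f}%")
    print(f"bias = {bias:.2f} L, "
          f"95% limits of agreement = ({bias - loa:.2f}, {bias + loa:.2f}) L")
    ```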

  3. Datathons and Software to Promote Reproducible Research

    PubMed Central

    2016-01-01

    Background: Datathons facilitate collaboration between clinicians, statisticians, and data scientists in order to answer important clinical questions. Previous datathons have resulted in numerous publications of interest to the critical care community and serve as a viable model for interdisciplinary collaboration. Objective: We report on an open-source software package called Chatto that was created by members of our group in the context of the second international Critical Care Datathon, held in September 2015. Methods: Datathon participants formed teams to discuss potential research questions and the methods required to address them. They were provided with the Chatto suite of tools to facilitate their teamwork. Each multidisciplinary team spent the next 2 days with clinicians working alongside data scientists to write code, extract and analyze data, and reformulate their queries in real time as needed. All projects were then presented on the last day of the datathon to a panel of judges that consisted of clinicians and scientists. Results: Use of Chatto was particularly effective in the datathon setting, enabling teams to reduce the time spent configuring their research environments to just a few minutes, a process that would normally take hours to days. Chatto continued to serve as a useful research tool after the conclusion of the datathon. Conclusions: This suite of tools fulfills two purposes: (1) facilitation of interdisciplinary teamwork through archiving and version control of datasets, analytical code, and team discussions, and (2) advancement of research reproducibility by functioning postpublication as an online environment in which independent investigators can rerun or modify analyses with relative ease. With the introduction of Chatto, we hope to solve a variety of challenges presented by collaborative data mining projects while improving research reproducibility. PMID:27558834

  4. REPRODUCIBLE AND SHAREABLE QUANTIFICATIONS OF PATHOGENICITY

    PubMed Central

    Manrai, Arjun K; Wang, Brice L; Patel, Chirag J; Kohane, Isaac S

    2016-01-01

    There are now hundreds of thousands of pathogenicity assertions that relate genetic variation to disease, but most of this clinically utilized variation has no accepted quantitative disease risk estimate. Recent disease-specific studies have used control sequence data to reclassify large amounts of prior pathogenic variation, but there is a critical need to scale up both the pace and feasibility of such pathogenicity reassessments across human disease. In this manuscript we develop a shareable computational framework to quantify pathogenicity assertions. We release a reproducible “digital notebook” that integrates executable code, text annotations, and mathematical expressions in a freely accessible statistical environment. We extend previous disease-specific pathogenicity assessments to over 6,000 diseases and 160,000 assertions in the ClinVar database. Investigators can use this platform to prioritize variants for reassessment and tailor genetic model parameters (such as prevalence and heterogeneity) to expose the uncertainty underlying pathogenicity-based risk assessments. Finally, we release a website that links users to pathogenic variation for a queried disease, supporting literature, and implied disease risk calculations subject to user-defined and disease-specific genetic risk models in order to facilitate variant reassessments. PMID:26776189

  5. High throughput reproducible cantilever functionalization

    SciTech Connect

    Evans, Barbara R; Lee, Ida

    2014-01-21

    A method for functionalizing cantilevers is provided that includes providing a holder having a plurality of channels each having a width for accepting a cantilever probe and a plurality of probes. A plurality of cantilever probes are fastened to the plurality of channels of the holder by the spring clips. The wells of a well plate are filled with a functionalization solution, wherein adjacent wells in the well plate are separated by a dimension that is substantially equal to a dimension separating adjacent channels of the plurality of channels. Each cantilever probe that is fastened within the plurality of channels of the holder is applied to the functionalization solution that is contained in the wells of the well plate.

  6. High throughput reproducible cantilever functionalization

    SciTech Connect

    Evans, Barbara R; Lee, Ida

    2014-11-25

    A method for functionalizing cantilevers is provided that includes providing a holder having a plurality of channels each having a width for accepting a cantilever probe and a plurality of probes. A plurality of cantilever probes are fastened to the plurality of channels of the holder by the spring clips. The wells of a well plate are filled with a functionalization solution, wherein adjacent wells in the well plate are separated by a dimension that is substantially equal to a dimension separating adjacent channels of the plurality of channels. Each cantilever probe that is fastened within the plurality of channels of the holder is applied to the functionalization solution that is contained in the wells of the well plate.

  7. Reproducibility of Mammography Units, Film Processing and Quality Imaging

    NASA Astrophysics Data System (ADS)

    Gaona, Enrique

    2003-09-01

    The purpose of this study was to carry out an exploratory survey of the problems of quality control in mammography and processor units as a diagnosis of the current situation of mammography facilities. Measurements of reproducibility, optical density, optical difference and gamma index are included. Breast cancer is the most frequently diagnosed cancer and is the second leading cause of cancer death among women in the Mexican Republic. Mammography is a radiographic examination specially designed for detecting breast pathology. We found that the problems of reproducibility of AEC are smaller than the problems of the processor units, because almost all processors fall outside of the acceptable variation limits and can affect the mammography image quality and the dose to the breast. Only four mammography units met the minimum score established by ACR and FDA for the phantom image.

  8. Reliability and reproducibility of Kienbock's disease staging.

    PubMed

    Goeminne, S; Degreef, I; De Smet, L

    2010-09-01

    We evaluated the interobserver reliability and intraobserver reproducibility of the Lichtman et al. classification for Kienböck's disease by getting four observers with different experience to look at 70 sets of wrist radiographs at different points in time. These observers staged each set of radiographs. Paired comparisons of the observations identified an agreement in 63% of cases and a mean weighted kappa coefficient of 0.64 confirming interobserver reliability. The stage of the involved lunate was reproduced in 78% of the observations with a mean weighted kappa coefficient of 0.81 showing intraobserver reproducibility. This classification for Kienböck's disease has good reliability and reproducibility.
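
    The weighted kappa statistics reported above can be computed for any pair of observers in a few lines; the sketch below uses hypothetical Lichtman stage assignments and assumes linear weights, which may differ from the weighting scheme actually used in the study.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical Lichtman stages (coded 1-4) assigned to the same wrists by two observers
observer_1 = [1, 2, 2, 3, 3, 4, 2, 3, 1, 4]
observer_2 = [1, 2, 3, 3, 3, 4, 2, 2, 1, 4]

# Linearly weighted kappa penalises disagreements in proportion to how far apart the stages are
kappa = cohen_kappa_score(observer_1, observer_2, weights="linear")
print(f"weighted kappa = {kappa:.2f}")
```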

  9. Reproducible analyses of microbial food for advanced life support systems

    NASA Technical Reports Server (NTRS)

    Petersen, Gene R.

    1988-01-01

    The use of yeasts in controlled ecological life support systems (CELSS) for microbial food regeneration in space required the accurate and reproducible analysis of intracellular carbohydrate and protein levels. The reproducible analysis of glycogen was a key element in estimating overall content of edibles in candidate yeast strains. Typical analytical methods for estimating glycogen in Saccharomyces were not found to be entirely aplicable to other candidate strains. Rigorous cell lysis coupled with acid/base fractionation followed by specific enzymatic glycogen analyses were required to obtain accurate results in two strains of Candida. A profile of edible fractions of these strains was then determined. The suitability of yeasts as food sources in CELSS food production processes is discussed.

  10. Offer/Acceptance Ratio.

    ERIC Educational Resources Information Center

    Collins, Mimi

    1997-01-01

    Explores how human resource professionals, with above average offer/acceptance ratios, streamline their recruitment efforts. Profiles company strategies with internships, internal promotion, cooperative education programs, and how to get candidates to accept offers. Also discusses how to use the offer/acceptance ratio as a measure of program…

  11. Explorations in statistics: statistical facets of reproducibility.

    PubMed

    Curran-Everett, Douglas

    2016-06-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eleventh installment of Explorations in Statistics explores statistical facets of reproducibility. If we obtain an experimental result that is scientifically meaningful and statistically unusual, we would like to know that our result reflects a general biological phenomenon that another researcher could reproduce if (s)he repeated our experiment. But more often than not, we may learn this researcher cannot replicate our result. The National Institutes of Health and the Federation of American Societies for Experimental Biology have created training modules and outlined strategies to help improve the reproducibility of research. These particular approaches are necessary, but they are not sufficient. The principles of hypothesis testing and estimation are inherent to the notion of reproducibility in science. If we want to improve the reproducibility of our research, then we need to rethink how we apply fundamental concepts of statistics to our science.

  12. Analytic materials

    NASA Astrophysics Data System (ADS)

    Milton, Graeme W.

    2016-11-01

    The theory of inhomogeneous analytic materials is developed. These are materials where the coefficients entering the equations involve analytic functions. Three types of analytic materials are identified. The first two types involve an integer p. If p takes its maximum value, then we have a complete analytic material. Otherwise, it is incomplete analytic material of rank p. For two-dimensional materials, further progress can be made in the identification of analytic materials by using the well-known fact that a 90° rotation applied to a divergence-free field in a simply connected domain yields a curl-free field, and this can then be expressed as the gradient of a potential. Other exact results for the fields in inhomogeneous media are reviewed. Also reviewed is the subject of metamaterials, as these materials provide a way of realizing desirable coefficients in the equations.

  13. Analytic materials.

    PubMed

    Milton, Graeme W

    2016-11-01

    The theory of inhomogeneous analytic materials is developed. These are materials where the coefficients entering the equations involve analytic functions. Three types of analytic materials are identified. The first two types involve an integer p. If p takes its maximum value, then we have a complete analytic material. Otherwise, it is incomplete analytic material of rank p. For two-dimensional materials, further progress can be made in the identification of analytic materials by using the well-known fact that a 90° rotation applied to a divergence-free field in a simply connected domain yields a curl-free field, and this can then be expressed as the gradient of a potential. Other exact results for the fields in inhomogeneous media are reviewed. Also reviewed is the subject of metamaterials, as these materials provide a way of realizing desirable coefficients in the equations.

  14. The Economics of Reproducibility in Preclinical Research.

    PubMed

    Freedman, Leonard P; Cockburn, Iain M; Simcoe, Timothy S

    2015-06-01

    Low reproducibility rates within life science research undermine cumulative knowledge production and contribute to both delays and costs of therapeutic drug development. An analysis of past studies indicates that the cumulative (total) prevalence of irreproducible preclinical research exceeds 50%, resulting in approximately US$28 billion per year spent on preclinical research that is not reproducible, in the United States alone. We outline a framework for solutions and a plan for long-term improvements in reproducibility rates that will help to accelerate the discovery of life-saving therapies and cures.

  15. LIMS user acceptance testing.

    PubMed

    Klein, Corbett S

    2003-01-01

    Laboratory Information Management Systems (LIMS) play a key role in the pharmaceutical industry. Thorough and accurate validation of such systems is critical and is a regulatory requirement. LIMS user acceptance testing is one aspect of this testing and enables the user to make a decision to accept or reject implementation of the system. This paper discusses key elements in facilitating the development and execution of a LIMS User Acceptance Test Plan (UATP).

  16. Reproducible research in vadose zone sciences

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A significant portion of present-day soil and Earth science research is computational, involving complex data analysis pipelines, advanced mathematical and statistical models, and sophisticated computer codes. Opportunities for scientific progress are greatly diminished if reproducing and building o...

  17. On Maximum FODO Acceptance

    SciTech Connect

    Batygin, Yuri Konstantinovich

    2014-12-24

    This note illustrates the maximum acceptance of a FODO quadrupole focusing channel. Acceptance is the largest Floquet ellipse of a matched beam, A = $\frac{a^2}{\beta_{max}}$, where a is the aperture of the channel and $\beta_{max}$ is the largest value of the beta-function in the channel. If the aperture of the channel is restricted by a circle of radius a, the full x-x′ acceptance is available for particles oscillating in the median plane, y = 0. Particles outside the median plane occupy a smaller phase-space area. In the x-y plane, the cross-section of the accepted beam has the shape of an ellipse with truncated boundaries.
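
    A purely numerical illustration of the acceptance formula above; the aperture and beta-function values below are made up.

```python
# Acceptance of a FODO channel, A = a^2 / beta_max (hypothetical values)
aperture_m = 0.02        # channel aperture radius, 2 cm
beta_max_m = 4.0         # largest beta-function in the channel, 4 m

acceptance = aperture_m ** 2 / beta_max_m       # in m*rad
print(f"A = {acceptance * 1e6:.0f} mm*mrad")    # -> 100 mm*mrad
```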

  18. Reproducing Kernels in Harmonic Spaces and Their Numerical Implementation

    NASA Astrophysics Data System (ADS)

    Nesvadba, Otakar

    2010-05-01

    In harmonic analysis, such as the modelling of the Earth's gravity field, the importance of Hilbert's space of harmonic functions with the reproducing kernel is often discussed. Moreover, in the case of an unbounded domain given by the exterior of the sphere or an ellipsoid, the reproducing kernel K(x,y) can be expressed analytically by means of closed formulas or by infinite series. Nevertheless, the straightforward numerical implementation of these formulas leads to dozens of problems, mostly connected with floating-point arithmetic and number representation. The contribution discusses numerical instabilities in K(x,y) and gradK(x,y) that can be overcome by employing elementary functions, in particular expm1 and log1p. The suggested evaluation scheme for reproducing kernels offers uniform formulas within the whole solution domain as well as superior speed and near-perfect accuracy (10⁻¹⁶ for IEC 60559 double-precision numbers) when compared with the straightforward formulas. The formulas can be easily implemented on the majority of computer platforms, especially when the C standard library ISO/IEC 9899:1999 is available.
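
    The numerical point being made, that expm1 and log1p avoid the cancellation that ruins the naive expressions near zero, can be seen in a few lines of Python (the reproducing kernel itself is not shown here):

```python
import math

x = 1e-12

# Naive forms lose most significant digits through catastrophic cancellation near zero
naive_exp = math.exp(x) - 1.0      # ~1.000089e-12, relative error ~9e-5
naive_log = math.log(1.0 + x)      # inaccurate for the same reason

# Dedicated library functions keep near machine precision
stable_exp = math.expm1(x)         # exp(x) - 1 computed directly
stable_log = math.log1p(x)         # log(1 + x) computed directly

print(naive_exp, stable_exp)
print(naive_log, stable_log)
```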

  19. The transfer of analytical procedures.

    PubMed

    Ermer, J; Limberger, M; Lis, K; Wätzig, H

    2013-11-01

    Analytical method transfers are certainly among the most discussed topics in the GMP regulated sector. However, they are surprisingly little regulated in detail. General information is provided by USP, WHO, and ISPE in particular. Most recently, the EU emphasized the importance of analytical transfer by including it in their draft of the revised GMP Guideline. In this article, an overview and comparison of these guidelines is provided. The key to success for method transfers is excellent communication between the sending and receiving units. In order to facilitate this communication, procedures, flow charts and checklists for responsibilities, success factors, transfer categories, the transfer plan and report, strategies in case of failed transfers, and tables with acceptance limits are provided here, together with a comprehensive glossary. Potential pitfalls are described such that they can be avoided. In order to assure an efficient and sustainable transfer of analytical procedures, a practically relevant and scientifically sound evaluation with corresponding acceptance criteria is crucial. Various strategies and statistical tools such as significance tests, absolute acceptance criteria, and equivalence tests are thoroughly described and compared in detail, giving examples. Significance tests should be avoided. The success criterion is not statistical significance, but rather analytical relevance. Depending on a risk assessment of the analytical procedure in question, statistical equivalence tests are recommended, because they include both a practically relevant acceptance limit and a direct control of the statistical risks. However, for lower-risk procedures, a simple comparison of the transfer performance parameters to absolute limits is also regarded as sufficient.
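
    As a sketch of the equivalence-testing strategy recommended above, the following runs a two one-sided t-test (TOST) on hypothetical assay results from a sending and a receiving unit, assuming an acceptance limit of ±2% of label claim; neither the data nor the limit come from the cited article.

```python
import numpy as np
from scipy import stats

# Hypothetical assay results (% of label claim) from the sending and receiving units
sending   = np.array([99.8, 100.2, 99.5, 100.1, 99.9, 100.3])
receiving = np.array([100.4, 100.9, 100.2, 100.6, 100.8, 100.5])
theta = 2.0  # assumed acceptance limit for the difference of means, in % of label claim

diff = receiving.mean() - sending.mean()
n1, n2 = len(sending), len(receiving)
sp = np.sqrt(((n1 - 1) * sending.var(ddof=1) + (n2 - 1) * receiving.var(ddof=1)) / (n1 + n2 - 2))
se = sp * np.sqrt(1 / n1 + 1 / n2)
df = n1 + n2 - 2

# TOST: both one-sided nulls (diff <= -theta and diff >= +theta) must be rejected
p_lower = 1 - stats.t.cdf((diff + theta) / se, df)   # H0: diff <= -theta
p_upper = stats.t.cdf((diff - theta) / se, df)       # H0: diff >= +theta
p_tost = max(p_lower, p_upper)

print(f"mean difference = {diff:.2f}%, TOST p-value = {p_tost:.4f}")
```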

  20. Assessing the reproducibility of discriminant function analyses

    PubMed Central

    Andrew, Rose L.; Albert, Arianne Y.K.; Renaut, Sebastien; Rennison, Diana J.; Bock, Dan G.

    2015-01-01

    Data are the foundation of empirical research, yet all too often the datasets underlying published papers are unavailable, incorrect, or poorly curated. This is a serious issue, because future researchers are then unable to validate published results or reuse data to explore new ideas and hypotheses. Even if data files are securely stored and accessible, they must also be accompanied by accurate labels and identifiers. To assess how often problems with metadata or data curation affect the reproducibility of published results, we attempted to reproduce Discriminant Function Analyses (DFAs) from the field of organismal biology. DFA is a commonly used statistical analysis that has changed little since its inception almost eight decades ago, and therefore provides an opportunity to test reproducibility among datasets of varying ages. Out of 100 papers we initially surveyed, fourteen were excluded because they did not present the common types of quantitative result from their DFA or gave insufficient details of their DFA. Of the remaining 86 datasets, there were 15 cases for which we were unable to confidently relate the dataset we received to the one used in the published analysis. The reasons ranged from incomprehensible or absent variable labels, the DFA being performed on an unspecified subset of the data, or the dataset we received being incomplete. We focused on reproducing three common summary statistics from DFAs: the percent variance explained, the percentage correctly assigned and the largest discriminant function coefficient. The reproducibility of the first two was fairly high (20 of 26, and 44 of 60 datasets, respectively), whereas our success rate with the discriminant function coefficients was lower (15 of 26 datasets). When considering all three summary statistics, we were able to completely reproduce 46 (65%) of 71 datasets. While our results show that a majority of studies are reproducible, they highlight the fact that many studies still are not the
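
    The three summary statistics targeted in that reassessment can be read off a standard discriminant analysis; the sketch below uses scikit-learn's LinearDiscriminantAnalysis on the Iris data purely as a stand-in for the organismal datasets surveyed.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)
lda = LinearDiscriminantAnalysis(n_components=2).fit(X, y)

percent_variance = 100 * lda.explained_variance_ratio_[0]   # variance explained by DF1
percent_correct = 100 * lda.score(X, y)                     # percentage correctly assigned
largest_coef = np.abs(lda.scalings_[:, 0]).max()            # largest DF1 coefficient

print(f"DF1 explains {percent_variance:.1f}% of variance, "
      f"{percent_correct:.1f}% correctly assigned, "
      f"largest coefficient = {largest_coef:.2f}")
```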

  1. Relevance relations for the concept of reproducibility

    PubMed Central

    Atmanspacher, H.; Bezzola Lambert, L.; Folkers, G.; Schubiger, P. A.

    2014-01-01

    The concept of reproducibility is widely considered a cornerstone of scientific methodology. However, recent problems with the reproducibility of empirical results in large-scale systems and in biomedical research have cast doubts on its universal and rigid applicability beyond the so-called basic sciences. Reproducibility is a particularly difficult issue in interdisciplinary work where the results to be reproduced typically refer to different levels of description of the system considered. In such cases, it is mandatory to distinguish between more and less relevant features, attributes or observables of the system, depending on the level at which they are described. For this reason, we propose a scheme for a general ‘relation of relevance’ between the level of complexity at which a system is considered and the granularity of its description. This relation implies relevance criteria for particular selected aspects of a system and its description, which can be operationally implemented by an interlevel relation called ‘contextual emergence’. It yields a formally sound and empirically applicable procedure to translate between descriptive levels and thus construct level-specific criteria for reproducibility in an overall consistent fashion. Relevance relations merged with contextual emergence challenge the old idea of one fundamental ontology from which everything else derives. At the same time, our proposal is specific enough to resist the backlash into a relativist patchwork of unconnected model fragments. PMID:24554574

  2. Analytical testing

    NASA Technical Reports Server (NTRS)

    Flannelly, W. G.; Fabunmi, J. A.; Nagy, E. J.

    1981-01-01

    Analytical methods for combining flight acceleration and strain data with shake test mobility data to predict the effects of structural changes on flight vibrations and strains are presented. This integration of structural dynamic analysis with flight performance is referred to as analytical testing. The objective of this methodology is to analytically estimate the results of flight testing contemplated structural changes with minimum flying and change trials. The category of changes to the aircraft includes mass, stiffness, absorbers, isolators, and active suppressors. Examples of applying the analytical testing methodology using flight test and shake test data measured on an AH-1G helicopter are included. The techniques and procedures for vibration testing and modal analysis are also described.

  3. Reproducibility of graph metrics in FMRI networks.

    PubMed

    Telesford, Qawi K; Morgan, Ashley R; Hayasaka, Satoru; Simpson, Sean L; Barret, William; Kraft, Robert A; Mozolic, Jennifer L; Laurienti, Paul J

    2010-01-01

    The reliability of graph metrics calculated in network analysis is essential to the interpretation of complex network organization. These graph metrics are used to deduce the small-world properties in networks. In this study, we investigated the test-retest reliability of graph metrics from functional magnetic resonance imaging data collected for two runs in 45 healthy older adults. Graph metrics were calculated on data for both runs and compared using intraclass correlation coefficient (ICC) statistics and Bland-Altman (BA) plots. ICC scores describe the level of absolute agreement between two measurements and provide a measure of reproducibility. For mean graph metrics, ICC scores were high for clustering coefficient (ICC = 0.86), global efficiency (ICC = 0.83), path length (ICC = 0.79), and local efficiency (ICC = 0.75); the ICC score for degree was found to be low (ICC = 0.29). ICC scores were also used to generate reproducibility maps in brain space to test voxel-wise reproducibility for unsmoothed and smoothed data. Reproducibility was uniform across the brain for global efficiency and path length, but was only high in network hubs for clustering coefficient, local efficiency, and degree. BA plots were used to test the measurement repeatability of all graph metrics. All graph metrics fell within the limits for repeatability. Together, these results suggest that with exception of degree, mean graph metrics are reproducible and suitable for clinical studies. Further exploration is warranted to better understand reproducibility across the brain on a voxel-wise basis.
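
    For reference, the graph metrics whose test-retest reliability is assessed above can be computed with NetworkX; the sketch below uses a synthetic small-world graph rather than an fMRI-derived network.

```python
import networkx as nx

# Stand-in small-world network; in the study, nodes are voxels/regions and
# edges come from thresholded fMRI correlation matrices
G = nx.connected_watts_strogatz_graph(n=100, k=6, p=0.1, seed=0)

metrics = {
    "clustering coefficient": nx.average_clustering(G),
    "global efficiency": nx.global_efficiency(G),
    "local efficiency": nx.local_efficiency(G),
    "characteristic path length": nx.average_shortest_path_length(G),
    "mean degree": sum(d for _, d in G.degree()) / G.number_of_nodes(),
}
for name, value in metrics.items():
    print(f"{name}: {value:.3f}")
```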

  4. Newbery Medal Acceptance.

    ERIC Educational Resources Information Center

    Freedman, Russell

    1988-01-01

    Presents the Newbery Medal acceptance speech of Russell Freedman, writer of children's nonfiction. Discusses the place of nonfiction in the world of children's literature, the evolution of children's biographies, and the author's work on "Lincoln." (ARH)

  5. Making Early Modern Medicine: Reproducing Swedish Bitters.

    PubMed

    Ahnfelt, Nils-Otto; Fors, Hjalmar

    2016-05-01

    Historians of science and medicine have rarely applied themselves to reproducing the experiments and practices of medicine and pharmacy. This paper delineates our efforts to reproduce "Swedish Bitters," an early modern composite medicine in wide European use from the 1730s to the present. In its original formulation, it was made from seven medicinal simples: aloe, rhubarb, saffron, myrrh, gentian, zedoary and agarikon. These were mixed in alcohol together with some theriac, a composite medicine of classical origin. The paper delineates the compositional history of Swedish Bitters and the medical rationale underlying its composition. It also describes how we go about to reproduce the medicine in a laboratory using early modern pharmaceutical methods, and analyse it using contemporary methods of pharmaceutical chemistry. Our aim is twofold: first, to show how reproducing medicines may provide a path towards a deeper understanding of the role of sensual and practical knowledge in the wider context of early modern medical culture; and second, how it may yield interesting results from the point of view of contemporary pharmaceutical science.

  6. Natural Disasters: Earth Science Readings. Reproducibles.

    ERIC Educational Resources Information Center

    Lobb, Nancy

    Natural Disasters is a reproducible teacher book that explains what scientists believe to be the causes of a variety of natural disasters and suggests steps that teachers and students can take to be better prepared in the event of a natural disaster. It contains both student and teacher sections. Teacher sections include vocabulary, an answer key,…

  7. Europe Today: An Atlas of Reproducible Pages.

    ERIC Educational Resources Information Center

    World Eagle, Inc., Wellesley, MA.

    Illustrative black and white maps, tables, and graphs designed for clear reproducibility depict Europe's size, population, resources, commodities, trade, cities, schooling, jobs, energy, industry, demographic statistics, food, and agriculture. Also included are 33 United States Department of State individual country maps. This volume is intended…

  8. Reproducing the Wechsler Intelligence Scale for Children-Fifth Edition: Factor Model Results

    ERIC Educational Resources Information Center

    Beaujean, A. Alexander

    2016-01-01

    One of the ways to increase the reproducibility of research is for authors to provide a sufficient description of the data analytic procedures so that others can replicate the results. The publishers of the Wechsler Intelligence Scale for Children-Fifth Edition (WISC-V) do not follow these guidelines when reporting their confirmatory factor…

  9. Analytical Microscopy

    SciTech Connect

    Not Available

    2006-06-01

    In the Analytical Microscopy group, within the National Center for Photovoltaic's Measurements and Characterization Division, we combine two complementary areas of analytical microscopy--electron microscopy and proximal-probe techniques--and use a variety of state-of-the-art imaging and analytical tools. We also design and build custom instrumentation and develop novel techniques that provide unique capabilities for studying materials and devices. In our work, we collaborate with you to solve materials- and device-related R&D problems. This sheet summarizes the uses and features of four major tools: transmission electron microscopy, scanning electron microscopy, the dual-beam focused-ion-beam workstation, and scanning probe microscopy.

  10. ITK: enabling reproducible research and open science

    PubMed Central

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46. PMID:24600387

  11. Chimie douce preparation of reproducible silver coatings for SERS applications

    NASA Astrophysics Data System (ADS)

    Sidorov, Alexander V.; Grigorieva, Anastasia V.; Goldt, Anastasia E.; Eremina, Olga E.; Veselova, Irina A.; Savilov, Sergey V.; Goodilin, Eugene A.

    2016-12-01

    A new soft chemistry preparation method for submicron-thick porous coatings of metallic silver is suggested for possible surface enhanced Raman spectroscopy (SERS) applications. The method is based on facile deposition of diamminesilver (I) aerosols, forming a nanostructured layer instantly by fast decomposition and self-reduction of [Ag(NH3)2]+ aqueous solutions onto surfaces of inorganic substrates under mild conditions of 280-300°C in air. A strong difference in overall microstructures and related SERS signals of model analytes is found for substrates with different deposition times and in comparison with a standard magnetron deposition technique. It is demonstrated that the suggested method is advantageous for the formation of robust SERS substrates with a stable and reproducible SERS enhancement.

  12. Accepting space radiation risks.

    PubMed

    Schimmerling, Walter

    2010-08-01

    The human exploration of space inevitably involves exposure to radiation. Associated with this exposure are multiple risks, i.e., probabilities that certain aspects of an astronaut's health or performance will be degraded. The management of these risks requires that such probabilities be accurately predicted, that the actual exposures be verified, and that comprehensive records be maintained. Implicit in these actions is the fact that, at some point, a decision has been made to accept a certain level of risk. This paper examines ethical and practical considerations involved in arriving at a determination that risks are acceptable, roles that the parties involved may play, and obligations arising out of reliance on the informed consent paradigm seen as the basis for ethical radiation risk acceptance in space.

  13. Reproducibility of 3D chromatin configuration reconstructions

    PubMed Central

    Segal, Mark R.; Xiong, Hao; Capurso, Daniel; Vazquez, Mariel; Arsuaga, Javier

    2014-01-01

    It is widely recognized that the three-dimensional (3D) architecture of eukaryotic chromatin plays an important role in processes such as gene regulation and cancer-driving gene fusions. Observing or inferring this 3D structure at even modest resolutions had been problematic, since genomes are highly condensed and traditional assays are coarse. However, recently devised high-throughput molecular techniques have changed this situation. Notably, the development of a suite of chromatin conformation capture (CCC) assays has enabled elicitation of contacts—spatially close chromosomal loci—which have provided insights into chromatin architecture. Most analysis of CCC data has focused on the contact level, with less effort directed toward obtaining 3D reconstructions and evaluating the accuracy and reproducibility thereof. While questions of accuracy must be addressed experimentally, questions of reproducibility can be addressed statistically—the purpose of this paper. We use a constrained optimization technique to reconstruct chromatin configurations for a number of closely related yeast datasets and assess reproducibility using four metrics that measure the distance between 3D configurations. The first of these, Procrustes fitting, measures configuration closeness after applying reflection, rotation, translation, and scaling-based alignment of the structures. The others base comparisons on the within-configuration inter-point distance matrix. Inferential results for these metrics rely on suitable permutation approaches. Results indicate that distance matrix-based approaches are preferable to Procrustes analysis, not because of the metrics per se but rather on account of the ability to customize permutation schemes to handle within-chromosome contiguity. It has recently been emphasized that the use of constrained optimization approaches to 3D architecture reconstruction are prone to being trapped in local minima. Our methods of reproducibility assessment provide a
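
    The Procrustes comparison described above is available directly in SciPy; a minimal sketch, using two randomly generated configurations standing in for replicate 3D reconstructions:

```python
import numpy as np
from scipy.spatial import procrustes

rng = np.random.default_rng(0)
config_a = rng.normal(size=(50, 3))        # reconstruction A: 3D coordinates of 50 loci

# Reconstruction B: A rotated, rescaled and perturbed, mimicking a replicate reconstruction
rotation, _ = np.linalg.qr(rng.normal(size=(3, 3)))
config_b = 2.0 * config_a @ rotation + 0.05 * rng.normal(size=(50, 3))

# Procrustes aligns the configurations (translation, rotation/reflection, scaling) and
# returns a disparity: the sum of squared differences after optimal alignment
mtx1, mtx2, disparity = procrustes(config_a, config_b)
print(f"Procrustes disparity = {disparity:.4f}")
```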

  14. Data Identifiers and Citations Enable Reproducible Science

    NASA Astrophysics Data System (ADS)

    Tilmes, C.

    2011-12-01

    Modern science often involves data processing with tremendous volumes of data. Keeping track of that data has been a growing challenge for data centers. Researchers who access and use that data don't always reference and cite their data sources adequately for consumers of their research to follow their methodology or reproduce their analyses or experiments. Recent research has led to recommendations for good identifiers and citations that can help address this problem. This paper will describe some of the best practices in data identifiers, reference and citation. Using a simplified example scenario based on a long term remote sensing satellite mission, it will explore issues in identifying dynamic data sets and the importance of good data citations for reproducibility. It will describe the difference between granule and collection level identifiers, using UUIDs and DOIs to illustrate some recommendations for developing identifiers and assigning them during data processing. As data processors create data products, the provenance of the input products and precise steps that led to their creation are recorded and published for users of the data to see. As researchers access the data from an archive, they can use the provenance to help understand the genesis of the data, which could have effects on their usage of the data. By citing the data when publishing their research, others can retrieve the precise data used in their research and reproduce the analyses and experiments to confirm the results. Describing the experiment to a sufficient extent to reproduce the research enforces a formal approach that lends credibility to the results, and ultimately, to the policies of decision makers depending on that research.
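
    A minimal sketch of the granule-versus-collection identifier idea: each granule gets its own UUID while the collection carries a stable DOI. The DOI string and citation wording below are invented for illustration.

```python
import uuid

# The collection carries a stable DOI (hypothetical value); each granule gets a unique UUID
collection_doi = "10.5067/EXAMPLE/COLLECTION.V1"
granule_id = uuid.uuid4()

citation = (f"Example Mission Science Team (2011): Level-2 product, granule {granule_id}, "
            f"from collection doi:{collection_doi}, accessed 2011-12-01.")
print(citation)
```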

  15. A Framework for Reproducible Latent Fingerprint Enhancements.

    PubMed

    Carasso, Alfred S

    2014-01-01

    Photoshop processing of latent fingerprints is the preferred methodology among law enforcement forensic experts, but that approach is not fully reproducible and may lead to questionable enhancements. Alternative, independent, fully reproducible enhancements, using IDL Histogram Equalization and IDL Adaptive Histogram Equalization, can produce better-defined ridge structures, along with considerable background information. Applying a systematic slow-motion smoothing procedure to such IDL enhancements, based on the rapid FFT solution of a Lévy stable fractional diffusion equation, can attenuate background detail while preserving ridge information. The resulting smoothed latent print enhancements are comparable to, but distinct from, forensic Photoshop images suitable for input into automated fingerprint identification systems (AFIS). In addition, this progressive smoothing procedure can be reexamined by displaying the suite of progressively smoother IDL images. That suite can be stored, providing an audit trail that allows monitoring for possible loss of useful information in transit to the user-selected optimal image. Such independent and fully reproducible enhancements provide a valuable frame of reference that may be helpful in informing, complementing, and possibly validating the forensic Photoshop methodology.
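
    A freely reproducible counterpart to the IDL enhancements mentioned above can be written with scikit-image's global and adaptive histogram equalization; this is an illustration of the technique, not the authors' pipeline, and it uses a bundled test image instead of a latent print.

```python
from skimage import data, exposure

# Stand-in grayscale image; a latent print scan would be loaded and processed the same way
image = data.camera()

equalized = exposure.equalize_hist(image)                       # global histogram equalization
adaptive = exposure.equalize_adapthist(image, clip_limit=0.03)  # CLAHE-style adaptive equalization

# Both results are float images rescaled to [0, 1]
print(equalized.min(), equalized.max(), adaptive.min(), adaptive.max())
```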

  16. Tools and techniques for computational reproducibility.

    PubMed

    Piccolo, Stephen R; Frampton, Michael B

    2016-07-11

    When reporting research findings, scientists document the steps they followed so that others can verify and build upon the research. When those steps have been described in sufficient detail that others can retrace the steps and obtain similar results, the research is said to be reproducible. Computers play a vital role in many research disciplines and present both opportunities and challenges for reproducibility. Computers can be programmed to execute analysis tasks, and those programs can be repeated and shared with others. The deterministic nature of most computer programs means that the same analysis tasks, applied to the same data, will often produce the same outputs. However, in practice, computational findings often cannot be reproduced because of complexities in how software is packaged, installed, and executed, and because of limitations associated with how scientists document analysis steps. Many tools and techniques are available to help overcome these challenges; here we describe seven such strategies. With a broad scientific audience in mind, we describe the strengths and limitations of each approach, as well as the circumstances under which each might be applied. No single strategy is sufficient for every scenario; thus we emphasize that it is often useful to combine approaches.

  17. Why was Relativity Accepted?

    NASA Astrophysics Data System (ADS)

    Brush, S. G.

    Historians of science have published many studies of the reception of Einstein's special and general theories of relativity. Based on a review of these studies, and my own research on the role of the light-bending prediction in the reception of general relativity, I discuss the role of three kinds of reasons for accepting relativity: (1) empirical predictions and explanations; (2) social-psychological factors; and (3) aesthetic-mathematical factors. According to the historical studies, acceptance was a three-stage process. First, a few leading scientists adopted the special theory for aesthetic-mathematical reasons. In the second stage, their enthusiastic advocacy persuaded other scientists to work on the theory and apply it to problems currently of interest in atomic physics. The special theory was accepted by many German physicists by 1910 and had begun to attract some interest in other countries. In the third stage, the confirmation of Einstein's light-bending prediction attracted much public attention and forced all physicists to take the general theory of relativity seriously. In addition to light-bending, the explanation of the advance of Mercury's perihelion was considered strong evidence by theoretical physicists. The American astronomers who conducted successful tests of general relativity became defenders of the theory. There is little evidence that relativity was 'socially constructed' but its initial acceptance was facilitated by the prestige and resources of its advocates.

  18. UGV acceptance testing

    NASA Astrophysics Data System (ADS)

    Kramer, Jeffrey A.; Murphy, Robin R.

    2006-05-01

    With over 100 models of unmanned vehicles now available for military and civilian safety, security or rescue applications, it is important for agencies to establish acceptance testing. However, there appear to be no general guidelines for what constitutes a reasonable acceptance test. This paper describes i) a preliminary method for acceptance testing by a customer of the mechanical and electrical components of an unmanned ground vehicle system, ii) how it has been applied to a man-packable micro-robot, and iii) the value of testing both to ensure that the customer has a workable system and to improve design. The test method automated the operation of the robot to repeatedly exercise all aspects and combinations of components on the robot for 6 hours. The acceptance testing process uncovered many failures consistent with those shown to occur in the field, showing that testing by the user does predict failures. The process also demonstrated that testing by the manufacturer can provide important design data that can be used to identify, diagnose, and prevent long-term problems. Also, the structured testing environment showed that sensor systems can be used to predict errors and changes in performance, as well as uncovering unmodeled behavior in subsystems.

  19. Queer nuclear families? Reproducing and transgressing heteronormativity.

    PubMed

    Folgerø, Tor

    2008-01-01

    During the past decade the public debate on gay and lesbian adoptive rights has been extensive in the Norwegian media. The debate illustrates how women and men planning to raise children in homosexual family constellations challenge prevailing cultural norms and existing concepts of kinship and family. The article discusses how lesbian mothers and gay fathers understand and redefine their own family practices. An essential point in this article is the fundamental ambiguity in these families' accounts of themselves-how they simultaneously transgress and reproduce heteronormative assumptions about childhood, fatherhood, motherhood, family and kinship.

  20. Towards reproducible, scalable lateral molecular electronic devices

    SciTech Connect

    Durkan, Colm; Zhang, Qian

    2014-08-25

    An approach to reproducibly fabricate molecular electronic devices is presented. Lateral nanometer-scale gaps with high yield are formed in Au/Pd nanowires by a combination of electromigration and Joule-heating-induced thermomechanical stress. The resulting nanogap devices are used to measure the electrical properties of small numbers of two different molecular species with different end-groups, namely 1,4-butane dithiol and 1,5-diamino-2-methylpentane. Fluctuations in the current reveal that in the case of the dithiol molecule devices, individual molecules conduct intermittently, with the fluctuations becoming more pronounced at larger biases.

  1. Open and reproducible global land use classification

    NASA Astrophysics Data System (ADS)

    Nüst, Daniel; Václavík, Tomáš; Pross, Benjamin

    2015-04-01

    Researchers led by the Helmholtz Centre for Environmental research (UFZ) developed a new world map of land use systems based on over 30 diverse indicators (http://geoportal.glues.geo.tu-dresden.de/stories/landsystemarchetypes.html) of land use intensity, climate and environmental and socioeconomic factors. They identified twelve land system archetypes (LSA) using a data-driven classification algorithm (self-organizing maps) to assess global impacts of land use on the environment, and found unexpected similarities across global regions. We present how the algorithm behind this analysis can be published as an executable web process using 52°North WPS4R (https://wiki.52north.org/bin/view/Geostatistics/WPS4R) within the GLUES project (http://modul-a.nachhaltiges-landmanagement.de/en/scientific-coordination-glues/). WPS4R is an open source collaboration platform for researchers, analysts and software developers to publish R scripts (http://www.r-project.org/) as a geo-enabled OGC Web Processing Service (WPS) process. The interoperable interface to call the geoprocess allows both reproducibility of the analysis and integration of user data without knowledge about web services or classification algorithms. The open platform allows everybody to replicate the analysis in their own environments. The LSA WPS process has several input parameters, which can be changed via a simple web interface. The input parameters are used to configure both the WPS environment and the LSA algorithm itself. The encapsulation as a web process allows integration of non-public datasets, while at the same time the publication requires a well-defined documentation of the analysis. We demonstrate this platform specifically to domain scientists and show how reproducibility and open source publication of analyses can be enhanced. We also discuss future extensions of the reproducible land use classification, such as the possibility for users to enter their own areas of interest to the system and

  2. Acceptability of human risk.

    PubMed

    Kasperson, R E

    1983-10-01

    This paper has three objectives: to explore the nature of the problem implicit in the term "risk acceptability," to examine the possible contributions of scientific information to risk standard-setting, and to argue that societal response is best guided by considerations of process rather than formal methods of analysis. Most technological risks are not accepted but are imposed. There is also little reason to expect consensus among individuals on their tolerance of risk. Moreover, debates about risk levels are often at base debates over the adequacy of the institutions which manage the risks. Scientific information can contribute three broad types of analyses to risk-setting deliberations: contextual analysis, equity assessment, and public preference analysis. More effective risk-setting decisions will involve attention to the process used, particularly in regard to the requirements of procedural justice and democratic responsibility.

  3. Acceptability of human risk.

    PubMed Central

    Kasperson, R E

    1983-01-01

    This paper has three objectives: to explore the nature of the problem implicit in the term "risk acceptability," to examine the possible contributions of scientific information to risk standard-setting, and to argue that societal response is best guided by considerations of process rather than formal methods of analysis. Most technological risks are not accepted but are imposed. There is also little reason to expect consensus among individuals on their tolerance of risk. Moreover, debates about risk levels are often at base debates over the adequacy of the institutions which manage the risks. Scientific information can contribute three broad types of analyses to risk-setting deliberations: contextual analysis, equity assessment, and public preference analysis. More effective risk-setting decisions will involve attention to the process used, particularly in regard to the requirements of procedural justice and democratic responsibility. PMID:6418541

  4. Acceptance Test Plan.

    DTIC Science & Technology

    2014-09-26

    Acceptance Test Plan for Special Reliability Tests for Broadband Microwave Amplifier Panel. David C. Kraus, Reliability Engineer, Westinghouse Defense and Electronics Center, Development and Operations Division, Baltimore, MD. Monitoring organization: Naval Research Laboratory.

  5. Reproducibility of airway wall thickness measurements

    NASA Astrophysics Data System (ADS)

    Schmidt, Michael; Kuhnigk, Jan-Martin; Krass, Stefan; Owsijewitsch, Michael; de Hoop, Bartjan; Peitgen, Heinz-Otto

    2010-03-01

    Airway remodeling and accompanying changes in wall thickness are known to be a major symptom of chronic obstructive pulmonary disease (COPD), associated with reduced lung function in diseased individuals. Further investigation of this disease, as well as monitoring of disease progression and treatment effect, demands accurate and reproducible assessment of airway wall thickness in CT datasets. With wall thicknesses in the sub-millimeter range, this task remains challenging even with today's high resolution CT datasets. To provide accurate measurements, taking partial volume effects into account is mandatory. The Full-Width-at-Half-Maximum (FWHM) method has been shown to be inappropriate for small airways [1,2], and several improved algorithms for objective quantification of airway wall thickness have been proposed [1-8]. In this paper, we describe an algorithm based on a closed form solution proposed by Weinheimer et al. [7]. We locally estimate the lung density parameter required for the closed form solution to account for possible variations of parenchyma density between different lung regions, inspiration states and contrast agent concentrations. The general accuracy of the algorithm is evaluated using basic tubular software and hardware phantoms. Furthermore, we present results on the reproducibility of the algorithm with respect to clinical CT scans, varying reconstruction kernels, and repeated acquisitions, which is crucial for longitudinal observations.

  6. Age and Acceptance of Euthanasia.

    ERIC Educational Resources Information Center

    Ward, Russell A.

    1980-01-01

    Study explores relationship between age (and sex and race) and acceptance of euthanasia. Women and non-Whites were less accepting because of religiosity. Among older people less acceptance was attributable to their lesser education and greater religiosity. Results suggest that quality of life in old age affects acceptability of euthanasia. (Author)

  7. PSYCHOLOGY. Estimating the reproducibility of psychological science.

    PubMed

    2015-08-28

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams.

  8. Reproducibility and reusability of scientific software

    NASA Astrophysics Data System (ADS)

    Shamir, Lior

    2017-01-01

    Information science and technology has become an integral part of astronomy research, and due to the consistent growth in the size and impact of astronomical databases, that trend is bound to continue. While software is a vital part of information systems and data analysis processes, in many cases the importance of the software and the standards of reporting on the use of source code have not yet been elevated in the scientific communication process to the same level as other parts of the research. The purpose of the discussion is to examine the role of software in the scientific communication process in the light of transparency, reproducibility, and reusability of the research, as well as discussing software in astronomy in comparison to other disciplines.

  9. Is Isolated Nocturnal Hypertension A Reproducible Phenotype?

    PubMed Central

    Goldsmith, Jeff; Muntner, Paul; Diaz, Keith M.; Reynolds, Kristi; Schwartz, Joseph E.; Shimbo, Daichi

    2016-01-01

    BACKGROUND Isolated nocturnal hypertension (INH), defined as nocturnal without daytime hypertension on ambulatory blood pressure (BP) monitoring (ABPM), has been observed to be associated with an increased risk of cardiovascular disease (CVD) events and mortality. The aim of this study was to determine the short-term reproducibility of INH. METHODS The Improving the Detection of Hypertension Study enrolled a community-based sample of adults (N = 282) in upper Manhattan without CVD, renal failure, or treated hypertension. Each participant completed two 24-hour ABPM recordings (ABPM1: first recording and ABPM2: second recording) with a mean ± SD time interval of 33±17 days between recordings. Daytime hypertension was defined as mean awake systolic/diastolic BP ≥ 135/85mm Hg; nocturnal hypertension as mean sleep systolic/diastolic BP ≥ 120/70mm Hg; INH as nocturnal without daytime hypertension; isolated daytime hypertension (IDH) as daytime without nocturnal hypertension; day and night hypertension (DNH) as daytime and nocturnal hypertension, and any ambulatory hypertension as having daytime and/or nocturnal hypertension. RESULTS On ABPM1, 26 (9.2%), 21 (7.4%), and 50 (17.7%) participants had INH, IDH, and DNH, respectively. On ABPM2, 24 (8.5%), 19 (6.7%), and 54 (19.1%) had INH, IDH, and DNH, respectively. The kappa statistics were 0.21 (95% confidence interval (CI) 0.04–0.38), 0.25 (95% CI 0.06–0.44), and 0.65 (95% CI 0.53–0.77) for INH, IDH, and DNH respectively; and 0.72 (95% CI 0.63–0.81) for having any ambulatory hypertension. CONCLUSIONS Our results suggest that INH and IDH are poorly reproducible phenotypes, and that ABPM should be primarily used to identify individuals with daytime hypertension and/or nocturnal hypertension. PMID:25904648

  10. Is Grannum grading of the placenta reproducible?

    NASA Astrophysics Data System (ADS)

    Moran, Mary; Ryan, John; Brennan, Patrick C.; Higgins, Mary; McAuliffe, Fionnuala M.

    2009-02-01

    Current ultrasound assessment of placental calcification relies on Grannum grading. The aim of this study was to assess if this method is reproducible by measuring inter- and intra-observer variation in grading placental images, under strictly controlled viewing conditions. Thirty placental images were acquired and digitally saved. Five experienced sonographers independently graded the images on two separate occasions. In order to eliminate any technological factors which could affect data reliability and consistency all observers reviewed images at the same time. To optimise viewing conditions ambient lighting was maintained between 25-40 lux, with monitors calibrated to the GSDF standard to ensure consistent brightness and contrast. Kappa (κ) analysis of the grades assigned was used to measure inter- and intra-observer reliability. Intra-observer agreement had a moderate mean κ-value of 0.55, with individual comparisons ranging from 0.30 to 0.86. Two images saved from the same patient, during the same scan, were each graded as I, II and III by the same observer. A mean κ-value of 0.30 (range from 0.13 to 0.55) indicated fair inter-observer agreement over the two occasions and only one image was graded consistently the same by all five observers. The study findings confirmed the lack of reproducibility associated with Grannum grading of the placenta despite optimal viewing conditions and highlight the need for new methods of assessing placental health in order to improve neonatal outcomes. Alternative methods for quantifying placental calcification such as a software based technique and 3D ultrasound assessment need to be explored.

  11. High acceptance recoil polarimeter

    SciTech Connect

    The HARP Collaboration

    1992-12-05

    In order to detect neutrons and protons in the 50 to 600 MeV energy range and measure their polarization, an efficient, low-noise, self-calibrating device is being designed. This detector, known as the High Acceptance Recoil Polarimeter (HARP), is based on the recoil principle of proton detection from np → n′p′ or pp → p′p′ scattering, which intrinsically yields polarization information on the incoming particle. HARP will be commissioned to carry out experiments in 1994.

  12. Baby-Crying Acceptance

    NASA Astrophysics Data System (ADS)

    Martins, Tiago; de Magalhães, Sérgio Tenreiro

    A baby's crying is its most important means of communication. The cry-monitoring devices that have been developed do not ensure the complete safety of the child. These technological resources need to be complemented by means of communicating the results to caregivers, which would involve digital processing of the information available in the crying. The survey carried out made it possible to understand the level of adoption, in mainland Portugal, of a technology able to perform such digital processing. The Technology Acceptance Model (TAM) was used as the theoretical framework. The statistical analysis showed a good probability of acceptance of such a system.

  13. Comparative study between univariate spectrophotometry and multivariate calibration as analytical tools for simultaneous quantitation of Moexipril and Hydrochlorothiazide.

    PubMed

    Tawakkol, Shereen M; Farouk, M; Elaziz, Omar Abd; Hemdan, A; Shehata, Mostafa A

    2014-12-10

    Three simple, accurate, reproducible, and selective methods have been developed and subsequently validated for the simultaneous determination of Moexipril (MOX) and Hydrochlorothiazide (HCTZ) in pharmaceutical dosage form. The first method is the new extended ratio subtraction method (EXRSM) coupled to ratio subtraction method (RSM) for determination of both drugs in commercial dosage form. The second and third methods are multivariate calibration which include Principal Component Regression (PCR) and Partial Least Squares (PLSs). A detailed validation of the methods was performed following the ICH guidelines and the standard curves were found to be linear in the range of 10-60 and 2-30 for MOX and HCTZ in EXRSM method, respectively, with well accepted mean correlation coefficient for each analyte. The intra-day and inter-day precision and accuracy results were well within the acceptable limits.
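
    A sketch of the PLS branch of such a multivariate calibration, using scikit-learn on simulated two-component spectra; the band positions, noise level and spectra are invented, and a real calibration would be trained on measured absorbance spectra of MOX/HCTZ mixtures.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
wavelengths = np.linspace(220, 320, 101)

# Simulated pure-component spectra (Gaussian bands) and mixtures with known concentrations
pure_mox = np.exp(-((wavelengths - 255) / 12) ** 2)
pure_hctz = np.exp(-((wavelengths - 272) / 10) ** 2)
conc = rng.uniform([10, 2], [60, 30], size=(25, 2))      # calibration ranges quoted in the abstract
spectra = conc @ np.vstack([pure_mox, pure_hctz]) * 0.01
spectra += 0.002 * rng.normal(size=spectra.shape)         # instrumental noise

# Fit a two-component PLS model and check how well it recovers the concentrations
pls = PLSRegression(n_components=2).fit(spectra, conc)
predicted = pls.predict(spectra)
print("mean absolute error per analyte:", np.abs(predicted - conc).mean(axis=0))
```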

  14. Comparative study between univariate spectrophotometry and multivariate calibration as analytical tools for simultaneous quantitation of Moexipril and Hydrochlorothiazide

    NASA Astrophysics Data System (ADS)

    Tawakkol, Shereen M.; Farouk, M.; Elaziz, Omar Abd; Hemdan, A.; Shehata, Mostafa A.

    2014-12-01

    Three simple, accurate, reproducible, and selective methods have been developed and subsequently validated for the simultaneous determination of Moexipril (MOX) and Hydrochlorothiazide (HCTZ) in pharmaceutical dosage form. The first method is the new extended ratio subtraction method (EXRSM) coupled to ratio subtraction method (RSM) for determination of both drugs in commercial dosage form. The second and third methods are multivariate calibration which include Principal Component Regression (PCR) and Partial Least Squares (PLSs). A detailed validation of the methods was performed following the ICH guidelines and the standard curves were found to be linear in the range of 10-60 and 2-30 for MOX and HCTZ in EXRSM method, respectively, with well accepted mean correlation coefficient for each analyte. The intra-day and inter-day precision and accuracy results were well within the acceptable limits.

  15. Assessment of Reproducibility of Laser Electrospray Mass Spectrometry using Electrospray Deposition of Analyte

    NASA Astrophysics Data System (ADS)

    Sistani, Habiballah; Karki, Santosh; Archer, Jieutonne J.; Shi, Fengjian; Levis, Robert J.

    2017-03-01

    A nonresonant, femtosecond (fs) laser is employed to desorb samples of Victoria blue deposited on stainless steel or indium tin oxide (ITO) slides using either electrospray deposition (ESD) or dried droplet deposition. The use of ESD resulted in uniform films of Victoria blue whereas the dried droplet method resulted in the formation of a ring pattern of the dye. Laser electrospray mass spectrometry (LEMS) measurements of the ESD-prepared films on either substrate were similar and revealed lower average relative standard deviations for measurements within-film (20.9%) and between-films (8.7%) in comparison to dried droplet (75.5% and 40.2%, respectively). The mass spectral response for ESD samples on both substrates was linear (R2 > 0.99), enabling quantitative measurements over the selected range of 7.0 × 10-11 to 2.8 × 10-9 mol, as opposed to the dried droplet samples where quantitation was not possible (R2 = 0.56). The limit of detection was measured to be 210 fmol.
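
    The figures of merit quoted above (relative standard deviation, calibration linearity and limit of detection) follow standard definitions, and the sketch below shows one conventional way of computing them from replicate responses and a calibration series. The numbers are placeholders and the 3.3·σ/slope detection-limit estimate is the generic ICH-style formula, not necessarily the procedure used by the authors.

        import numpy as np

        # Hypothetical replicate ion signals from one ESD-prepared film.
        replicates = np.array([1.02e5, 0.97e5, 1.05e5, 0.99e5, 1.01e5])
        rsd = 100 * replicates.std(ddof=1) / replicates.mean()       # relative standard deviation, %

        # Hypothetical calibration series: amount of analyte (mol) vs. integrated signal.
        amount = np.array([7e-11, 2e-10, 7e-10, 1.4e-9, 2.8e-9])
        signal = np.array([3.1e3, 8.8e3, 3.0e4, 6.2e4, 1.21e5])
        slope, intercept = np.polyfit(amount, signal, 1)
        fitted = slope * amount + intercept
        ss_res = np.sum((signal - fitted) ** 2)
        ss_tot = np.sum((signal - signal.mean()) ** 2)
        r_squared = 1 - ss_res / ss_tot

        # ICH-style detection limit from the residual standard deviation of the fit.
        sigma = np.sqrt(ss_res / (len(signal) - 2))
        lod = 3.3 * sigma / slope

        print(f"RSD = {rsd:.1f}%, R^2 = {r_squared:.4f}, LOD ~ {lod:.1e} mol")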

  16. Analytic process and dreaming about analysis.

    PubMed

    Sirois, François

    2016-12-01

    Dreams about the analytic session feature a manifest content in which the analytic setting is subject to distortion while the analyst appears undisguised. Such dreams are a consistent yet infrequent occurrence in most analyses. Their specificity consists in never reproducing the material conditions of the analysis as such. This paper puts forward the following hypothesis: dreams about the session relate to some aspects of the analyst's activity. In this sense, such dreams are indicative of the transference neurosis, prefiguring transference resistances to the analytic elaboration of key conflicts. The parts taken by the patient and by the analyst are discussed in terms of their ability to signal a deepening of the analysis.

  17. Reproducibility of transcutaneous oximetry and laser Doppler flowmetry in facial skin and gingival tissue.

    PubMed

    Svalestad, J; Hellem, S; Vaagbø, G; Irgens, A; Thorsen, E

    2010-01-01

    Laser Doppler flowmetry (LDF) and transcutaneous oximetry (TcPO(2)) are non-invasive techniques, widely used in the clinical setting, for assessing microvascular blood flow and tissue oxygen tension, e.g. recording vascular changes after radiotherapy and hyperbaric oxygen therapy. With standardized procedures and improved reproducibility, these methods might also be applicable in longitudinal studies. The aim of this study was to evaluate the reproducibility of facial skin and gingival LDF and facial skin TcPO(2). The subjects comprised ten healthy volunteers, 5 men, aged 31-68 years. Gingival perfusion was recorded with the LDF probe fixed to a custom made, tooth-supported acrylic splint. Skin perfusion was recorded on the cheek. TcPO(2) was recorded on the forehead and cheek and in the second intercostal space. The reproducibility of LDF measurements taken after vasodilation by heat provocation was greater than for basal flow in both facial skin and mandibular gingiva. Pronounced intraday variations were observed. Interweek reproducibility assessed by intraclass correlation coefficient ranged from 0.74 to 0.96 for LDF and from 0.44 to 0.75 for TcPO(2). The results confirm acceptable reproducibility of LDF and TcPO(2) in longitudinal studies in a vascular laboratory where subjects serve as their own controls. The use of thermoprobes is recommended. Repeat measurements should be taken at the same time of day.

  18. Reproducibility of neuroimaging analyses across operating systems.

    PubMed

    Glatard, Tristan; Lewis, Lindsay B; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed.
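
    The Dice coefficients reported for the subcortical classifications are a simple overlap measure between two label maps. A minimal sketch of that comparison for a single structure is shown below; the boolean masks are synthetic rather than real FSL output, so it only illustrates the metric itself.

        import numpy as np

        def dice(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
            """Dice coefficient 2|A ∩ B| / (|A| + |B|) between two boolean masks."""
            a, b = mask_a.astype(bool), mask_b.astype(bool)
            denom = a.sum() + b.sum()
            return 1.0 if denom == 0 else 2.0 * np.logical_and(a, b).sum() / denom

        # Toy "segmentations" of the same structure produced on two operating systems.
        rng = np.random.default_rng(1)
        seg_os1 = rng.random((64, 64, 64)) > 0.7
        seg_os2 = seg_os1.copy()
        flip = rng.random(seg_os2.shape) > 0.98        # perturb roughly 2% of voxels
        seg_os2[flip] = ~seg_os2[flip]

        print(f"Dice = {dice(seg_os1, seg_os2):.3f}")  # 1.0 would mean identical classifications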

  19. Accuracy and reproducibility of cholesterol assay in the western Cape.

    PubMed

    Berger, G M; Christopher, K; Juritz, J M; Liesegang, F

    1988-11-19

    The accuracy and precision of cholesterol assay in the western Cape region is reported. The survey was carried out over 15 weeks utilising three human EDTA plasma pools with normal, borderline high and high cholesterol levels respectively. All 11 laboratories in the region providing a service to academic, provincial or military hospitals or to the private medical sector were included in the study. Ten of the 11 laboratories utilised automated enzymatic methods of cholesterol assay whereas 1 used a manual procedure based on the Liebermann-Burchard reaction. Methods were standardised by means of a variety of commercial calibrator material in all except 1 laboratory which used reference sera from the Centers for Disease Control, Atlanta. The performance of the 4 best laboratories met the standard of precision recommended for cholesterol assay, viz. total coefficient of variation of less than or equal to 2.5%. However, only 2 of the 11 laboratories achieved the optimum objective of an overall bias of less than 2.0% together with precision of less than or equal to 2.5%. Rational use of cholesterol assay for diagnosis and management will therefore require standardisation of cholesterol assay on a common reference material and greater attention to analytical factors influencing the reproducibility of results. Intrinsic biological variation also contributes uncertainty to the interpretation of a single value. Thus important clinical decisions must be based on two or more assays carried out using appropriate methodology.
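
    The precision and bias targets quoted above (total coefficient of variation of 2.5% or less, bias below 2%) can be checked with elementary statistics on repeated assays of a pool with an assigned reference value. The sketch below uses invented results purely to show the arithmetic.

        import numpy as np

        # Hypothetical repeated cholesterol results (mmol/L) for one plasma pool over the survey period.
        results = np.array([5.21, 5.30, 5.18, 5.27, 5.24, 5.33, 5.19, 5.26])
        assigned_value = 5.20                               # hypothetical reference value for the pool

        cv = 100 * results.std(ddof=1) / results.mean()     # total coefficient of variation, %
        bias = 100 * (results.mean() - assigned_value) / assigned_value

        print(f"CV = {cv:.2f}% (target <= 2.5%: {cv <= 2.5}), "
              f"bias = {bias:+.2f}% (target < 2%: {abs(bias) < 2.0})")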

  20. Sonic boom acceptability studies

    NASA Technical Reports Server (NTRS)

    Shepherd, Kevin P.; Sullivan, Brenda M.; Leatherwood, Jack D.; Mccurdy, David A.

    1992-01-01

    The determination of the magnitude of sonic boom exposure which would be acceptable to the general population requires, as a starting point, a method to assess and compare individual sonic booms. There is no consensus within the scientific and regulatory communities regarding an appropriate sonic boom assessment metric. Loudness, being a fundamental and well-understood attribute of human hearing was chosen as a means of comparing sonic booms of differing shapes and amplitudes. The figure illustrates the basic steps which yield a calculated value of loudness. Based upon the aircraft configuration and its operating conditions, the sonic boom pressure signature which reaches the ground is calculated. This pressure-time history is transformed to the frequency domain and converted into a one-third octave band spectrum. The essence of the loudness method is to account for the frequency response and integration characteristics of the auditory system. The result of the calculation procedure is a numerical description (perceived level, dB) which represents the loudness of the sonic boom waveform.
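
    As a rough, hedged illustration of the first signal-processing step described above, the sketch below turns a pressure-time history into a one-third-octave band spectrum by binning an FFT power spectrum into preferred 1/3-octave bands. The N-wave input is synthetic, and the auditory weighting and perceived-level calculation that complete the actual loudness method are deliberately omitted.

        import numpy as np

        fs = 8000.0                                       # sample rate, Hz
        t = np.arange(0, 0.4, 1 / fs)                     # even number of samples assumed below
        # Idealised sawtooth "boom" signature: ~100 Pa peak over 0.2 s (synthetic stand-in).
        p = np.where(t < 0.2, 100.0 * (1 - 2 * t / 0.2), 0.0)

        n = len(p)
        spectrum = np.fft.rfft(p)
        freqs = np.fft.rfftfreq(n, d=1 / fs)
        power = np.abs(spectrum) ** 2 / n ** 2            # Parseval: total equals the mean-square pressure
        power[1:-1] *= 2.0                                # fold in the negative frequencies

        p_ref = 20e-6                                     # 20 µPa reference pressure
        centres = 1000.0 * 2.0 ** (np.arange(-18, 5) / 3.0)   # preferred 1/3-octave centre frequencies
        for fc in centres:
            lo, hi = fc / 2 ** (1 / 6), fc * 2 ** (1 / 6)
            band = (freqs >= lo) & (freqs < hi)
            if band.any():
                spl = 10 * np.log10(power[band].sum() / p_ref ** 2 + 1e-30)
                print(f"{fc:8.1f} Hz   {spl:6.1f} dB")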

  1. Semiautomated, Reproducible Batch Processing of Soy

    NASA Technical Reports Server (NTRS)

    Thoerne, Mary; Byford, Ivan W.; Chastain, Jack W.; Swango, Beverly E.

    2005-01-01

    A computer-controlled apparatus processes batches of soybeans into one or more of a variety of food products, under conditions that can be chosen by the user and reproduced from batch to batch. Examples of products include soy milk, tofu, okara (an insoluble protein and fiber byproduct of soy milk), and whey. Most processing steps take place without intervention by the user. This apparatus was developed for use in research on processing of soy. It is also a prototype of other soy-processing apparatuses for research, industrial, and home use. Prior soy-processing equipment includes household devices that automatically produce soy milk but do not automatically produce tofu. The designs of prior soy-processing equipment require users to manually transfer intermediate solid soy products and to press them manually and, hence, under conditions that are not consistent from batch to batch. Prior designs do not afford choices of processing conditions: Users cannot use previously developed soy-processing equipment to investigate the effects of variations of techniques used to produce soy milk (e.g., cold grinding, hot grinding, and pre-cook blanching) and of such process parameters as cooking times and temperatures, grinding times, soaking times and temperatures, rinsing conditions, and sizes of particles generated by grinding. In contrast, the present apparatus is amenable to such investigations. The apparatus (see figure) includes a processing tank and a jacketed holding or coagulation tank. The processing tank can be capped by either of two different heads and can contain either of two different insertable mesh baskets. The first head includes a grinding blade and heating elements. The second head includes an automated press piston. One mesh basket, designated the okara basket, has oblong holes with a size equivalent to about 40 mesh [40 openings per inch (16 openings per centimeter)]. The second mesh basket, designated the tofu basket, has holes of 70 mesh [70 openings

  2. Research Reproducibility in Geosciences: Current Landscape, Practices and Perspectives

    NASA Astrophysics Data System (ADS)

    Yan, An

    2016-04-01

    Reproducibility of research can gauge the validity of its findings. Yet we currently lack an understanding of how much of a problem research reproducibility is in the geosciences. We developed an online survey of faculty and graduate students in geosciences, and received 136 responses from research institutions and universities in the Americas, Asia, Europe and other parts of the world. This survey examined (1) the current state of research reproducibility in geosciences, by asking about researchers' experiences with unsuccessful replication work and what obstacles led to their replication failures; (2) the current reproducibility practices in the community, by asking what efforts researchers made to try to reproduce others' work and make their own work reproducible, and what the underlying factors that contribute to irreproducibility are; (3) perspectives on reproducibility, by collecting researchers' thoughts and opinions on this issue. The survey results indicated that nearly 80% of respondents who had ever tried to reproduce a published study had failed at least once. Only one third of the respondents received helpful feedback when they contacted the authors of a published study for data, code, or other information. The primary factors that lead to unsuccessful replication attempts are insufficient detail in the instructions in published literature, and inaccessibility of the data, code and tools needed in the study. Our findings suggest a remarkable lack of research reproducibility in the geosciences. Changing the incentive mechanism in academia, as well as developing policies and tools that facilitate open data and code sharing, are promising ways for the geosciences community to alleviate this reproducibility problem.

  3. Grazing function g and collimation angular acceptance

    SciTech Connect

    Peggs, S.G.; Previtali, V.

    2009-11-02

    The grazing function g is introduced - a synchrobetatron optical quantity that is analogous (and closely connected) to the Twiss and dispersion functions β, α, η, and η′. It parametrizes the rate of change of total angle with respect to synchrotron amplitude for grazing particles, which just touch the surface of an aperture when their synchrotron and betatron oscillations are simultaneously (in time) at their extreme displacements. The grazing function can be important at collimators with limited acceptance angles. For example, it is important in both modes of crystal collimation operation - in channeling and in volume reflection. The grazing function is independent of the collimator type - crystal or amorphous - but can depend strongly on its azimuthal location. The rigorous synchrobetatron condition g = 0 is solved, by invoking the close connection between the grazing function and the slope of the normalized dispersion. Propagation of the grazing function is described, through drifts, dipoles, and quadrupoles. Analytic expressions are developed for g in perfectly matched periodic FODO cells, and in the presence of β or η error waves. These analytic approximations are shown to be, in general, in good agreement with realistic numerical examples. The grazing function is shown to scale linearly with FODO cell bend angle, but to be independent of FODO cell length. The ideal value is g = 0 at the collimator, but finite nonzero values are acceptable. Practically achievable grazing functions are described and evaluated, for both amorphous and crystal primary collimators, at RHIC, the SPS (UA9), the Tevatron (T-980), and the LHC.

  4. Understanding Business Analytics

    DTIC Science & Technology

    2015-01-05

    Business Analytics, Decision Analytics, Business Intelligence, Advanced Analytics, Data Science... to a certain degree, to label is to limit - if only... Business Analytics. [Figure 1: Google trending of daily searches for various analytic disciplines, 2004-2014.]

  5. Reproducibility of self-reported melanoma risk factors in a large cohort study of Norwegian women.

    PubMed

    Veierød, Marit B; Parr, Christine L; Lund, Eiliv; Hjartåker, Anette

    2008-02-01

    We studied the test-retest reproducibility of melanoma risk factors, including the use of sunscreen and sun protection factor (SPF), in a self-administered exposure follow-up questionnaire from the Norwegian Women and Cancer Study, a large national population-based cohort. In 2002, a random sample of 2000 women (46-75 years) received the questionnaire twice, about 3 months apart (response 75%). Kappa (κ) was 0.77 for freckling when sunbathing [95% confidence interval (CI), 0.74-0.81]. Weighted kappa, κw, for sunbathing vacations to southern latitudes and solarium use in the last 5 years was 0.71 (95% CI, 0.68-0.74) and 0.70 (95% CI, 0.67-0.73), respectively. Reproducibility was also good for sunscreen use (yes/no) on specific occasions (0.64 ≤ κ ≤ 0.74) and the corresponding SPF. Spearman's correlation coefficient (rs) for SPF on sunbathing vacations to southern latitudes was 0.73 (95% CI, 0.69-0.77) for today and 0.71 (95% CI, 0.66-0.76) for 10 years ago. For the eight most common sunscreen brands, reproducibility was lower for use (yes/no) (0.31 ≤ κ ≤ 0.60) than for SPF (0.38 ≤ rs ≤ 0.87). The frequency of sunburn and of sunbathing vacations in Norway or outside southern latitudes had fair reproducibility (κw was 0.49 and 0.47, respectively). Other studies have also found it challenging to measure sunburn. This study was larger than previous studies, permitting subgroup analyses. In conclusion, the overall reproducibility of the questionnaire was acceptable and not affected by age, education or skin color. In particular, our study has added new knowledge about the reproducibility of sunscreen use and SPF.

  6. Month-to-month and year-to-year reproducibility of high frequency QRS ECG signals

    NASA Technical Reports Server (NTRS)

    Batdorf, Niles J.; Feiveson, Alan H.; Schlegel, Todd T.

    2004-01-01

    High frequency electrocardiography analyzing the entire QRS complex in the frequency range of 150 to 250 Hz may prove useful in the detection of coronary artery disease, yet the long-term stability of these waveforms has not been fully characterized. Therefore, we prospectively investigated the reproducibility of the root mean squared voltage, kurtosis, and the presence versus absence of reduced amplitude zones in signal averaged 12-lead high frequency QRS recordings acquired in the supine position one month apart in 16 subjects and one year apart in 27 subjects. Reproducibility of root mean squared voltage and kurtosis was excellent over these time intervals in the limb leads, and acceptable in the precordial leads using both the V-lead and CR-lead derivations. The relative error of root mean squared voltage was 12% month-to-month and 16% year-to-year in the serial recordings when averaged over all 12 leads. Reduced amplitude zones were also reproducible up to a rate of 87% and 81%, respectively, for the month-to-month and year-to-year recordings. We conclude that 12-lead high frequency QRS electrocardiograms are sufficiently reproducible for clinical use.
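
    The central quantity tracked here, the root mean squared voltage of the 150-250 Hz QRS content, can be illustrated with a simple band-pass-and-RMS computation. The sketch below uses a synthetic beat and a generic Butterworth filter; the published analysis used signal-averaged 12-lead recordings and its own processing chain, so treat this only as a conceptual outline.

        import numpy as np
        from scipy.signal import butter, filtfilt

        fs = 1000.0                                       # sample rate, Hz
        t = np.arange(0, 0.6, 1 / fs)
        rng = np.random.default_rng(2)
        # Synthetic signal-averaged beat (mV): low-frequency QRS-like bump plus faint high-frequency content.
        beat = 1.2 * np.exp(-((t - 0.3) / 0.02) ** 2) + 0.003 * rng.standard_normal(t.size)

        # Band-pass 150-250 Hz (4th-order Butterworth, applied forward and backward for zero phase).
        b, a = butter(4, [150.0, 250.0], btype="bandpass", fs=fs)
        hf_qrs = filtfilt(b, a, beat)

        rms_uv = np.sqrt(np.mean(hf_qrs ** 2)) * 1e3      # convert mV to µV

        # Relative error between two hypothetical sessions, as a month-to-month reproducibility check.
        rms_1, rms_2 = rms_uv, rms_uv * 1.10
        rel_error = 100 * abs(rms_1 - rms_2) / np.mean([rms_1, rms_2])
        print(f"HF QRS RMS ~ {rms_uv:.1f} µV, session-to-session relative error = {rel_error:.0f}%")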

  7. Month-to-Month and Year-to-Year Reproducibility of High Frequency QRS ECG signals

    NASA Technical Reports Server (NTRS)

    Batdorf, Niles; Feiveson, Alan H.; Schlegel, Todd T.

    2006-01-01

    High frequency (HF) electrocardiography analyzing the entire QRS complex in the frequency range of 150 to 250 Hz may prove useful in the detection of coronary artery disease, yet the long-term stability of these waveforms has not been fully characterized. We therefore prospectively investigated the reproducibility of the root mean squared (RMS) voltage, kurtosis, and the presence versus absence of reduced amplitude zones (RAzs) in signal averaged 12-lead HF QRS recordings acquired in the supine position one month apart in 16 subjects and one year apart in 27 subjects. Reproducibility of RMS voltage and kurtosis was excellent over these time intervals in the limb leads, and acceptable in the precordial leads using both the V-lead and CR-lead derivations. The relative error of RMS voltage was 12% month-to-month and 16% year-to-year in the serial recordings when averaged over all 12 leads. RAzs were also reproducible at a rate of up to 87% and 81%, respectively, for the month-to-month and year-to-year recordings. We conclude that 12-lead HF QRS electrocardiograms are sufficiently reproducible for clinical use.

  8. The effect of sample holder material on ion mobility spectrometry reproducibility

    NASA Technical Reports Server (NTRS)

    Jadamec, J. Richard; Su, Chih-Wu; Rigdon, Stephen; Norwood, Lavan

    1995-01-01

    When a positive detection of a narcotic occurs during the search of a vessel, a decision has to be made whether further intensive search is warranted. This decision is based in part on the results of a second sample collected from the same area. Therefore, the reproducibility of both sampling and instrumental analysis is critical in terms of justifying an in depth search. As reported at the 2nd Annual IMS Conference in Quebec City, the U.S. Coast Guard has determined that when paper is utilized as the sample desorption medium for the Barringer IONSCAN, the analytical results using standard reference samples are reproducible. A study was conducted utilizing papers of varying pore sizes and comparing their performance as a desorption material relative to the standard Barringer 50 micron Teflon. Nominal pore sizes ranged from 30 microns down to 2 microns. Results indicate that there is some peak instability in the first two to three windows during the analysis. The severity of the instability was observed to increase as the pore size of the paper is decreased. However, the observed peak instability does not create a situation that results in a decreased reliability or reproducibility in the analytical result.

  9. Acceptance of tinnitus: validation of the tinnitus acceptance questionnaire.

    PubMed

    Weise, Cornelia; Kleinstäuber, Maria; Hesser, Hugo; Westin, Vendela Zetterqvist; Andersson, Gerhard

    2013-01-01

    The concept of acceptance has recently received growing attention within tinnitus research due to the fact that tinnitus acceptance is one of the major targets of psychotherapeutic treatments. Accordingly, acceptance-based treatments will most likely be increasingly offered to tinnitus patients and assessments of acceptance-related behaviours will thus be needed. The current study investigated the factorial structure of the Tinnitus Acceptance Questionnaire (TAQ) and the role of tinnitus acceptance as mediating link between sound perception (i.e. subjective loudness of tinnitus) and tinnitus distress. In total, 424 patients with chronic tinnitus completed the TAQ and validated measures of tinnitus distress, anxiety, and depression online. Confirmatory factor analysis provided support to a good fit of the data to the hypothesised bifactor model (root-mean-square-error of approximation = .065; Comparative Fit Index = .974; Tucker-Lewis Index = .958; standardised root mean square residual = .032). In addition, mediation analysis, using a non-parametric joint coefficient approach, revealed that tinnitus-specific acceptance partially mediated the relation between subjective tinnitus loudness and tinnitus distress (path ab = 5.96; 95% CI: 4.49, 7.69). In a multiple mediator model, tinnitus acceptance had a significantly stronger indirect effect than anxiety. The results confirm the factorial structure of the TAQ and suggest the importance of a general acceptance factor that contributes important unique variance beyond that of the first-order factors activity engagement and tinnitus suppression. Tinnitus acceptance as measured with the TAQ is proposed to be a key construct in tinnitus research and should be further implemented into treatment concepts to reduce tinnitus distress.

  10. Reproducibility of quantitative planar thallium-201 scintigraphy: quantitative criteria for reversibility of myocardial perfusion defects

    SciTech Connect

    Sigal, S.L.; Soufer, R.; Fetterman, R.C.; Mattera, J.A.; Wackers, F.J. )

    1991-05-01

    Fifty-two paired stress/delayed planar ²⁰¹Tl studies (27 exercise studies, 25 dipyridamole studies) were processed twice by seven technologists to assess inter- and intraobserver variability. The reproducibility was inversely related to the size of ²⁰¹Tl perfusion abnormalities. Intraobserver variability was not different between exercise and dipyridamole studies for lesions of similar size. Based upon intraobserver variability, objective quantitative criteria for reversibility of perfusion abnormalities were defined. These objective criteria were tested prospectively in a separate group of 35 ²⁰¹Tl studies and compared with the subjective interpretation of quantitative circumferential profiles. Overall, exact agreement existed in 78% of images (kappa statistic κ = 0.66). We conclude that quantification of planar ²⁰¹Tl scans is highly reproducible, with acceptable inter- and intraobserver variability. Objective criteria for lesion reversibility correlated well with analysis by experienced observers.

  11. Virtual Reference Environments: a simple way to make research reproducible

    PubMed Central

    Hurley, Daniel G.; Budden, David M.

    2015-01-01

    ‘Reproducible research’ has received increasing attention over the past few years as bioinformatics and computational biology methodologies become more complex. Although reproducible research is progressing in several valuable ways, we suggest that recent increases in internet bandwidth and disk space, along with the availability of open-source and free-software licences for tools, enable another simple step to make research reproducible. In this article, we urge the creation of minimal virtual reference environments implementing all the tools necessary to reproduce a result, as a standard part of publication. We address potential problems with this approach, and show an example environment from our own work. PMID:25433467

  12. Virtual Reference Environments: a simple way to make research reproducible.

    PubMed

    Hurley, Daniel G; Budden, David M; Crampin, Edmund J

    2015-09-01

    'Reproducible research' has received increasing attention over the past few years as bioinformatics and computational biology methodologies become more complex. Although reproducible research is progressing in several valuable ways, we suggest that recent increases in internet bandwidth and disk space, along with the availability of open-source and free-software licences for tools, enable another simple step to make research reproducible. In this article, we urge the creation of minimal virtual reference environments implementing all the tools necessary to reproduce a result, as a standard part of publication. We address potential problems with this approach, and show an example environment from our own work.

  13. Cone penetrometer acceptance test report

    SciTech Connect

    Boechler, G.N.

    1996-09-19

    This Acceptance Test Report (ATR) documents the results of acceptance test procedure WHC-SD-WM-ATR-151. Included in this report is a summary of the tests, the results and issues, the signature and sign-off ATP pages, and a summarized table of the specification vs. ATP section that satisfied the specification.

  14. Analytical aspects of hydrogen exchange mass spectrometry

    PubMed Central

    Engen, John R.; Wales, Thomas E.

    2016-01-01

    The analytical aspects of measuring hydrogen exchange by mass spectrometry are reviewed. The nature of analytical selectivity in hydrogen exchange is described followed by review of the analytical tools required to accomplish fragmentation, separation, and the mass spectrometry measurements under restrictive exchange quench conditions. In contrast to analytical quantitation that relies on measurements of peak intensity or area, quantitation in hydrogen exchange mass spectrometry depends on measuring a mass change with respect to an undeuterated or deuterated control, resulting in a value between zero and the maximum amount of deuterium that could be incorporated. Reliable quantitation is a function of experimental fidelity and to achieve high measurement reproducibility, a large number of experimental variables must be controlled during sample preparation and analysis. The method also reports on important qualitative aspects of the sample, including conformational heterogeneity and population dynamics. PMID:26048552
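
    The quantitation principle described here, a mass shift bounded by zero and the maximum deuterium that could be incorporated, reduces to a difference calculation between centroid masses. The sketch below computes absolute and relative uptake for one peptide from hypothetical values, with a fully deuterated control used for normalisation (one common, but not universal, practice).

        def deuterium_uptake(m_t: float, m_undeuterated: float, m_fully_deuterated: float):
            """Absolute uptake (Da) and relative uptake (0-1) for one peptide at one labelling time."""
            absolute = m_t - m_undeuterated                      # observed mass change vs. the undeuterated control
            max_uptake = m_fully_deuterated - m_undeuterated     # upper bound set by the deuterated control
            return absolute, absolute / max_uptake

        # Hypothetical centroid masses (Da): undeuterated control, 10-min labelling point, fully deuterated control.
        abs_d, rel_d = deuterium_uptake(m_t=1536.1, m_undeuterated=1533.8, m_fully_deuterated=1540.6)
        print(f"uptake = {abs_d:.2f} Da ({100 * rel_d:.0f}% of maximum)")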

  15. Reproducibility and variability of the cost functions reconstructed from experimental recordings in multifinger prehension.

    PubMed

    Niu, Xun; Latash, Mark L; Zatsiorsky, Vladimir M

    2012-01-01

    The study examines whether the cost functions reconstructed from experimental recordings are reproducible over time. Participants repeated the trials on three days. By following Analytical Inverse Optimization procedures, the cost functions of finger forces were reconstructed for each day. The cost functions were found to be reproducible over time: application of a cost function C(i) to the data of Day j (i≠j) resulted in smaller deviations from the experimental observations than using other commonly used cost functions. Other findings are: (a) the 2nd order coefficients of the cost function showed negative linear relations with finger force magnitudes; (b) the finger forces were distributed on a 2-dimensional plane in the 4-dimensional finger force space for all subjects and all testing sessions; (c) the data agreed well with the principle of superposition, i.e. the action of object prehension can be decoupled into the control of rotational equilibrium and slipping prevention.

  16. Rapid and reproducible determination of active gibberellins in citrus tissues by UPLC/ESI-MS/MS.

    PubMed

    Manzi, Matías; Gómez-Cadenas, Aurelio; Arbona, Vicent

    2015-09-01

    Phytohormone determination is crucial for explaining the physiological mechanisms at work during growth and development. Therefore, rapid and precise methods are needed to achieve reproducible determination of phytohormones. Among many others, gibberellins (GAs) constitute a family of complex analytes, as most of them share similar structures and chemical properties although only a few hold biological activity (namely GA1, GA3, GA4 and GA7). A method has been developed to extract GAs from plant tissues by mechanical disruption using ultrapure water as the solvent; in this way, ion suppression was reduced and sensitivity increased. Using this methodology, the four active GAs were separated and quantified by UPLC coupled to MS/MS using the isotope-labeled internal standards [²H₂]-GA1 and [²H₂]-GA4. In summary, the new method provides a fast and reproducible protocol to determine bioactive GAs at low concentrations, using minimal amounts of sample and reducing the use of organic solvents.
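
    Quantification against the stable-isotope-labelled internal standards mentioned above typically comes down to a peak-area ratio scaled by the amount of standard spiked into the sample. The sketch below shows that arithmetic with invented numbers and an assumed response factor of 1, which is a simplification of a full isotope-dilution calibration.

        def gibberellin_content(area_analyte: float, area_internal_std: float,
                                ng_internal_std_added: float, sample_mass_g: float,
                                response_factor: float = 1.0) -> float:
            """Analyte content (ng per g of tissue) from an internal-standard area ratio."""
            ng_analyte = (area_analyte / area_internal_std) * ng_internal_std_added / response_factor
            return ng_analyte / sample_mass_g

        # Hypothetical MRM peak areas for GA4 and its labelled standard in 0.1 g of citrus tissue
        # spiked with 2 ng of the standard.
        print(f"GA4 ~ {gibberellin_content(15800, 42100, 2.0, 0.1):.1f} ng/g")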

  17. [Validation and regulatory acceptance of alternative methods for toxicity evaluation].

    PubMed

    Ohno, Yasuo

    2004-01-01

    For regulatory acceptance of alternative methods (AMs) to animal toxicity tests, their reproducibility and relevance should be determined by intra- and inter-laboratory validation. Appropriate procedures for the validation and regulatory acceptance of AMs were recommended by the OECD in 1996. According to those principles, several in vitro methods, such as skin corrosivity tests and phototoxicity tests, were evaluated and accepted by ECVAM (European Center for the Validation of Alternative Methods), ICCVAM (the Interagency Coordinating Committee on the Validation of Alternative Methods), and the OECD. Because of the difficulties of conducting inter-laboratory validation and the relatively short period remaining until the EU's ban on animal experiments for the safety evaluation of cosmetics, ECVAM and ICCVAM have recently started cooperating in the validation and evaluation of AMs. It is also necessary to establish JaCVAM (Japanese Center for the Validation of Alternative Methods) to contribute to this effort and to evaluate new toxicity tests originating in Japan.

  18. Analytics for Education

    ERIC Educational Resources Information Center

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  19. Let's Talk... Analytics

    ERIC Educational Resources Information Center

    Oblinger, Diana G.

    2012-01-01

    Talk about analytics seems to be everywhere. Everyone is talking about analytics. Yet even with all the talk, many in higher education have questions about--and objections to--using analytics in colleges and universities. In this article, the author explores the use of analytics in, and all around, higher education. (Contains 1 note.)

  20. An Open Science and Reproducible Research Primer for Landscape Ecologists

    EPA Science Inventory

    In recent years many funding agencies, some publishers, and even the United States government have enacted policies that encourage open science and strive for reproducibility; however, the knowledge and skills to implement open science and enable reproducible research are not yet...

  1. 10 CFR 1016.35 - Authority to reproduce Restricted Data.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    10 CFR 1016.35 (2013 edition), Energy: Department of Energy (General Provisions), Safeguarding of Restricted Data, Control of... Restricted Data may be reproduced to the minimum extent necessary consistent with efficient operation...

  2. 10 CFR 1016.35 - Authority to reproduce Restricted Data.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    10 CFR 1016.35 (2011 edition), Energy: Department of Energy (General Provisions), Safeguarding of Restricted Data, Control of... Restricted Data may be reproduced to the minimum extent necessary consistent with efficient operation...

  3. Extending the Technology Acceptance Model: Policy Acceptance Model (PAM)

    NASA Astrophysics Data System (ADS)

    Pierce, Tamra

    There has been extensive research on how new ideas and technologies are accepted in society. This has resulted in the creation of many models that are used to discover and assess the contributing factors. The Technology Acceptance Model (TAM) is one that is a widely accepted model. This model examines people's acceptance of new technologies based on variables that directly correlate to how the end user views the product. This paper introduces the Policy Acceptance Model (PAM), an expansion of TAM, which is designed for the analysis and evaluation of acceptance of new policy implementation. PAM includes the traditional constructs of TAM and adds the variables of age, ethnicity, and family. The model is demonstrated using a survey of people's attitude toward the upcoming healthcare reform in the United States (US) from 72 survey respondents. The aim is that the theory behind this model can be used as a framework that will be applicable to studies looking at the introduction of any new or modified policies.

  4. Scientific Reproducibility in Biomedical Research: Provenance Metadata Ontology for Semantic Annotation of Study Description.

    PubMed

    Sahoo, Satya S; Valdez, Joshua; Rueschman, Michael

    2016-01-01

    Scientific reproducibility is key to scientific progress as it allows the research community to build on validated results, protect patients from potentially harmful trial drugs derived from incorrect results, and reduce wastage of valuable resources. The National Institutes of Health (NIH) recently published a systematic guideline titled "Rigor and Reproducibility " for supporting reproducible research studies, which has also been accepted by several scientific journals. These journals will require published articles to conform to these new guidelines. Provenance metadata describes the history or origin of data and it has been long used in computer science to capture metadata information for ensuring data quality and supporting scientific reproducibility. In this paper, we describe the development of Provenance for Clinical and healthcare Research (ProvCaRe) framework together with a provenance ontology to support scientific reproducibility by formally modeling a core set of data elements representing details of research study. We extend the PROV Ontology (PROV-O), which has been recommended as the provenance representation model by World Wide Web Consortium (W3C), to represent both: (a) data provenance, and (b) process provenance. We use 124 study variables from 6 clinical research studies from the National Sleep Research Resource (NSRR) to evaluate the coverage of the provenance ontology. NSRR is the largest repository of NIH-funded sleep datasets with 50,000 studies from 36,000 participants. The provenance ontology reuses ontology concepts from existing biomedical ontologies, for example the Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT), to model the provenance information of research studies. The ProvCaRe framework is being developed as part of the Big Data to Knowledge (BD2K) data provenance project.
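
    To make the provenance-modelling idea concrete, the sketch below builds a few PROV-O style triples for a hypothetical study variable with rdflib. The namespace URI is the W3C PROV namespace, but the entity names and the derivation chain are invented and do not reproduce the actual ProvCaRe ontology.

        from rdflib import Graph, Literal, Namespace, RDF, RDFS

        PROV = Namespace("http://www.w3.org/ns/prov#")
        EX = Namespace("http://example.org/provcare-sketch#")      # hypothetical namespace for this example

        g = Graph()
        g.bind("prov", PROV)
        g.bind("ex", EX)

        # Data provenance: a derived study variable and the recording it came from.
        g.add((EX.ahi_variable, RDF.type, PROV.Entity))
        g.add((EX.ahi_variable, RDFS.label, Literal("apnea-hypopnea index (study variable)")))
        g.add((EX.psg_recording, RDF.type, PROV.Entity))
        g.add((EX.ahi_variable, PROV.wasDerivedFrom, EX.psg_recording))

        # Process provenance: the scoring activity that generated the variable.
        g.add((EX.sleep_scoring, RDF.type, PROV.Activity))
        g.add((EX.ahi_variable, PROV.wasGeneratedBy, EX.sleep_scoring))
        g.add((EX.sleep_scoring, PROV.used, EX.psg_recording))

        print(g.serialize(format="turtle"))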

  5. Assessments of endothelial function and arterial stiffness are reproducible in patients with COPD

    PubMed Central

    Rodriguez-Miguelez, Paula; Seigler, Nichole; Bass, Leon; Dillard, Thomas A; Harris, Ryan A

    2015-01-01

    Background Elevated cardiovascular disease risk is observed in patients with COPD. Non-invasive assessments of endothelial dysfunction and arterial stiffness have recently emerged to provide mechanistic insight into cardiovascular disease risk in COPD; however, the reproducibility of endothelial function and arterial stiffness has yet to be investigated in this patient population. Objectives This study sought to examine the within-day and between-day reproducibility of endothelial function and arterial stiffness in patients with COPD. Methods Baseline diameter, peak diameter, flow-mediated dilation, augmentation index, augmentation index at 75 beats per minute, and pulse wave velocity were assessed three times in 17 patients with COPD (six males, eleven females, age range 47–75 years old; forced expiratory volume in 1 second =51.5% predicted). Session A and B were separated by 3 hours (within-day), whereas session C was conducted at least 7 days following session B (between-day). Reproducibility was assessed by: 1) paired t-tests, 2) coefficients of variation, 3) coefficients of variation prime, 4) intra-class correlation coefficient, 5) Pearson’s correlations (r), and 6) Bland–Altman plots. Five acceptable assessments were required to confirm reproducibility. Results Six out of six within-day criteria were met for endothelial function and arterial stiffness outcomes. Six out of six between-day criteria were met for baseline and peak diameter, augmentation index and pulse wave velocity, whereas five out of six criteria were met for flow-mediated dilation. Conclusion The present study provides evidence for within-day and between-day reproducibility of endothelial function and arterial stiffness in patients with COPD. PMID:26396509
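
    Of the six reproducibility criteria listed above, the Bland-Altman comparison is the one least often shown explicitly in summaries. The sketch below computes the bias and 95% limits of agreement for paired within-day measurements, using invented flow-mediated dilation values rather than the study data.

        import numpy as np

        # Hypothetical flow-mediated dilation (%) from session A and session B on the same day.
        session_a = np.array([4.1, 5.3, 3.8, 6.0, 4.9, 5.5, 3.2, 4.6, 5.1, 4.4])
        session_b = np.array([4.4, 5.0, 3.9, 5.7, 5.2, 5.3, 3.5, 4.3, 5.4, 4.1])

        diffs = session_a - session_b
        bias = diffs.mean()                                # systematic difference between sessions
        sd_diff = diffs.std(ddof=1)
        loa_low, loa_high = bias - 1.96 * sd_diff, bias + 1.96 * sd_diff

        # Mean absolute within-pair difference expressed as a percentage of the pair mean.
        within_pair_pct = 100 * np.mean(np.abs(diffs) / ((session_a + session_b) / 2))

        print(f"bias = {bias:+.2f}%, 95% limits of agreement = [{loa_low:.2f}, {loa_high:.2f}]%")
        print(f"mean within-pair variation ~ {within_pair_pct:.1f}%")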

  6. Scientific Reproducibility in Biomedical Research: Provenance Metadata Ontology for Semantic Annotation of Study Description

    PubMed Central

    Sahoo, Satya S.; Valdez, Joshua; Rueschman, Michael

    2016-01-01

    Scientific reproducibility is key to scientific progress as it allows the research community to build on validated results, protect patients from potentially harmful trial drugs derived from incorrect results, and reduce wastage of valuable resources. The National Institutes of Health (NIH) recently published a systematic guideline titled “Rigor and Reproducibility “ for supporting reproducible research studies, which has also been accepted by several scientific journals. These journals will require published articles to conform to these new guidelines. Provenance metadata describes the history or origin of data and it has been long used in computer science to capture metadata information for ensuring data quality and supporting scientific reproducibility. In this paper, we describe the development of Provenance for Clinical and healthcare Research (ProvCaRe) framework together with a provenance ontology to support scientific reproducibility by formally modeling a core set of data elements representing details of research study. We extend the PROV Ontology (PROV-O), which has been recommended as the provenance representation model by World Wide Web Consortium (W3C), to represent both: (a) data provenance, and (b) process provenance. We use 124 study variables from 6 clinical research studies from the National Sleep Research Resource (NSRR) to evaluate the coverage of the provenance ontology. NSRR is the largest repository of NIH-funded sleep datasets with 50,000 studies from 36,000 participants. The provenance ontology reuses ontology concepts from existing biomedical ontologies, for example the Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT), to model the provenance information of research studies. The ProvCaRe framework is being developed as part of the Big Data to Knowledge (BD2K) data provenance project. PMID:28269904

  7. An open investigation of the reproducibility of cancer biology research.

    PubMed

    Errington, Timothy M; Iorns, Elizabeth; Gunn, William; Tan, Fraser Elisabeth; Lomax, Joelle; Nosek, Brian A

    2014-12-10

    It is widely believed that research that builds upon previously published findings has reproduced the original work. However, it is rare for researchers to perform or publish direct replications of existing results. The Reproducibility Project: Cancer Biology is an open investigation of reproducibility in preclinical cancer biology research. We have identified 50 high impact cancer biology articles published in the period 2010-2012, and plan to replicate a subset of experimental results from each article. A Registered Report detailing the proposed experimental designs and protocols for each subset of experiments will be peer reviewed and published prior to data collection. The results of these experiments will then be published in a Replication Study. The resulting open methodology and dataset will provide evidence about the reproducibility of high-impact results, and an opportunity to identify predictors of reproducibility.

  8. An open investigation of the reproducibility of cancer biology research

    PubMed Central

    Errington, Timothy M; Iorns, Elizabeth; Gunn, William; Tan, Fraser Elisabeth; Lomax, Joelle; Nosek, Brian A

    2014-01-01

    It is widely believed that research that builds upon previously published findings has reproduced the original work. However, it is rare for researchers to perform or publish direct replications of existing results. The Reproducibility Project: Cancer Biology is an open investigation of reproducibility in preclinical cancer biology research. We have identified 50 high impact cancer biology articles published in the period 2010-2012, and plan to replicate a subset of experimental results from each article. A Registered Report detailing the proposed experimental designs and protocols for each subset of experiments will be peer reviewed and published prior to data collection. The results of these experiments will then be published in a Replication Study. The resulting open methodology and dataset will provide evidence about the reproducibility of high-impact results, and an opportunity to identify predictors of reproducibility. DOI: http://dx.doi.org/10.7554/eLife.04333.001 PMID:25490932

  9. Spin-coating process evolution and reproducibility for power-law fluids.

    PubMed

    Jardim, P L G; Michels, A F; Horowitz, F

    2014-03-20

    A distinct development of an exact analytical solution for power-law fluids during the spin-coating process is presented for temporal and spatial thickness evolution, after steady state conditions are attained. This solution leads to the definition of a characteristic time, related to the memory of the initial thickness profile. Previously obtained experimental data, for several rotation speeds and carboxymetilcellulose concentrations in water, are quantitatively analyzed through the evaluation of their characteristic times and compared with theoretical predictions, thus allowing better understanding of thickness profile evolution and of process reproducibility.

  10. Market Acceptance of Smart Growth

    EPA Pesticide Factsheets

    This report finds that smart growth developments enjoy market acceptance because of stability in prices over time. Housing resales in smart growth developments often have greater appreciation than their conventional suburban counterparts.

  11. L-286 Acceptance Test Record

    SciTech Connect

    HARMON, B.C.

    2000-01-14

    This document provides a detailed account of how the acceptance testing was conducted for Project L-286, "200E Area Sanitary Water Plant Effluent Stream Reduction". The testing of the L-286 instrumentation system was conducted under the direct supervision

  12. Analytical strategy for detecting doping agents in hair.

    PubMed

    Thieme, D; Grosse, J; Sachs, H; Mueller, R K

    2000-01-10

    ... because the possibility of many biotransformation reactions from (into) other precursors (metabolites) has to be taken into account. Precursor substances of anabolic steroids (especially esters as application forms) are very promising analytical targets of hair analysis, because they can only be detected after an exogenous intake. The quantitative evaluation of active parent compounds like testosterone (which is actively involved in physiological processes of hair growth) in hair is still controversial. Clinical applications under reproducible conditions can be useful, but the biovariability of these parameters will probably prevent the definition of acceptable cut-off levels as a criterion of abuse.

  13. Reproducibility of the World Health Organization 2008 criteria for myelodysplastic syndromes.

    PubMed

    Senent, Leonor; Arenillas, Leonor; Luño, Elisa; Ruiz, Juan C; Sanz, Guillermo; Florensa, Lourdes

    2013-04-01

    The reproducibility of the World Health Organization 2008 classification for myelodysplastic syndromes is uncertain and its assessment was the major aim of this study. The different peripheral blood and bone marrow variables required for an adequate morphological classification were blindly evaluated by four cytomorphologists in samples from 50 patients with myelodysplastic syndromes. The degree of agreement among observers was calculated using intraclass correlation coefficient and the generalized kappa statistic for multiple raters. The degree of agreement for the percentages of blasts in bone marrow and peripheral blood, ring sideroblasts in bone marrow, and erythroid, granulocytic and megakaryocytic dysplastic cells was strong (P<0.001 in all instances). After stratifying the percentages according to the categories required for the assignment of World Health Organization subtypes, the degree of agreement was not statistically significant for cases with 5-9% blasts in bone marrow (P=0.07), 0.1-1% blasts in peripheral blood (P=0.47), or percentage of erythroid dysplastic cells (P=0.49). Finally, the interobserver concordance for World Health Organization-defined subtypes showed a moderate overall agreement (P<0.001), the reproducibility being lower for cases with refractory anemia with excess of blasts type 1 (P=0.05) and refractory anemia with ring sideroblasts (P=0.09). In conclusion, the reproducibility of the World Health Organization 2008 classification for myelodysplastic syndromes is acceptable but the defining criteria for blast cells and features of erythroid dysplasia need to be refined.

  14. Reproducibility of the World Health Organization 2008 criteria for myelodysplastic syndromes

    PubMed Central

    Senent, Leonor; Arenillas, Leonor; Luño, Elisa; Ruiz, Juan C.; Sanz, Guillermo; Florensa, Lourdes

    2013-01-01

    The reproducibility of the World Health Organization 2008 classification for myelodysplastic syndromes is uncertain and its assessment was the major aim of this study. The different peripheral blood and bone marrow variables required for an adequate morphological classification were blindly evaluated by four cytomorphologists in samples from 50 patients with myelodysplastic syndromes. The degree of agreement among observers was calculated using intraclass correlation coefficient and the generalized kappa statistic for multiple raters. The degree of agreement for the percentages of blasts in bone marrow and peripheral blood, ring sideroblasts in bone marrow, and erythroid, granulocytic and megakaryocytic dysplastic cells was strong (P<0.001 in all instances). After stratifying the percentages according to the categories required for the assignment of World Health Organization subtypes, the degree of agreement was not statistically significant for cases with 5-9% blasts in bone marrow (P=0.07), 0.1-1% blasts in peripheral blood (P=0.47), or percentage of erythroid dysplastic cells (P=0.49). Finally, the interobserver concordance for World Health Organization-defined subtypes showed a moderate overall agreement (P<0.001), the reproducibility being lower for cases with refractory anemia with excess of blasts type 1 (P=0.05) and refractory anemia with ring sideroblasts (P=0.09). In conclusion, the reproducibility of the World Health Organization 2008 classification for myelodysplastic syndromes is acceptable but the defining criteria for blast cells and features of erythroid dysplasia need to be refined. PMID:23065505

  15. Reproducibility and validity of a semi-quantitative FFQ for trace elements.

    PubMed

    Lee, Yujin; Park, Kyong

    2016-09-01

    The aim of this study was to test the reproducibility and validity of a self-administered FFQ for the Trace Element Study of Korean Adults in the Yeungnam area (SELEN). Study subjects were recruited from the SELEN cohort selected from rural and urban areas in Yeungnam, Korea. A semi-quantitative FFQ with 146 items was developed considering the dietary characteristics of cohorts in the study area. In a validation study, seventeen men and forty-eight women aged 38-62 years completed 3-d dietary records (DR) and two FFQ over a 3-month period. The validity was examined with the FFQ and DR, and the reproducibility was estimated using partial correlation coefficients, the Bland-Altman method and cross-classification. There were no significant differences between the mean intakes of selected nutrients as estimated from FFQ1, FFQ2 and DR. The median correlation coefficients for all nutrients were 0.47 and 0.56 in the reproducibility and validity tests, respectively. Bland-Altman's index and cross-classification showed acceptable agreement between FFQ1 and FFQ2 and between FFQ2 and DR. Ultimately, 78% of the subjects were classified into the same or adjacent quartiles for most nutrients. In addition, the weighted κ value indicated fair agreement between the two methods. In conclusion, this newly developed FFQ was a suitable dietary assessment method for the SELEN cohort study.

  16. Incurred sample reproducibility and stability assessment in a cell-based drug concentration assay.

    PubMed

    White, Joleen T; Crossman, Mary; Subramanyam, Meena

    2015-01-01

    Incurred sample reproducibility is one aspect of in-study validation, with white papers outlining expectations for chromatographic assays and immunoassays. This manuscript outlines an approach for performing incurred sample reproducibility for a bioequivalence study using a cell-based assay, with the complication of time elapsed between the original and repeat assays. The incurred sample reproducibility passed the pre-established acceptance criterion of 45% for at least 2/3 of the samples: 174/216 samples (80.6%). Data trends between the two crossover arms were qualitatively similar. The passing incurred sample reproducibility and stability results further support the validity of the original study conclusion that the two manufacturing processes were bioequivalent. This illustrates one approach to extrapolating industry and regulatory recommendations for situations outside current guidance.
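
    The pass/fail logic described above is a percent-difference rule applied sample by sample. A minimal sketch of that incurred-sample-reproducibility check, using placeholder concentrations and the 45% / two-thirds criterion quoted in the abstract, is given below.

        import numpy as np

        # Hypothetical original vs. repeat drug concentrations for the reassayed incurred samples.
        original = np.array([12.1, 48.0, 5.3, 102.0, 33.5, 7.9, 64.2, 21.0])
        repeat = np.array([13.0, 41.5, 5.1, 118.0, 30.2, 11.8, 60.9, 22.4])

        # Percent difference relative to the mean of the two results, as in common ISR practice.
        pct_diff = 100 * (repeat - original) / ((repeat + original) / 2)
        within_criterion = np.abs(pct_diff) <= 45.0        # criterion stated for this cell-based assay

        passed = within_criterion.mean() >= 2 / 3
        print(np.round(pct_diff, 1))
        print(f"{within_criterion.sum()}/{len(pct_diff)} samples within ±45% -> ISR {'passes' if passed else 'fails'}")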

  17. Analytical laboratory quality audits

    SciTech Connect

    Kelley, William D.

    2001-06-11

    Analytical Laboratory Quality Audits are designed to improve laboratory performance. The success of the audit, as for many activities, is based on adequate preparation, precise performance, well documented and insightful reporting, and productive follow-up. Adequate preparation starts with definition of the purpose, scope, and authority for the audit and the primary standards against which the laboratory quality program will be tested. The scope and technical processes involved lead to determining the needed audit team resources. Contact is made with the auditee and a formal audit plan is developed, approved and sent to the auditee laboratory management. Review of the auditee's quality manual, key procedures and historical information during preparation leads to better checklist development and more efficient and effective use of the limited time for data gathering during the audit itself. The audit begins with the opening meeting that sets the stage for the interactions between the audit team and the laboratory staff. Arrangements are worked out for the necessary interviews and examination of processes and records. The information developed during the audit is recorded on the checklists. Laboratory management is kept informed of issues during the audit so there are no surprises at the closing meeting. The audit report documents whether the management control systems are effective. In addition to findings of nonconformance, positive reinforcement of exemplary practices provides balance and fairness. Audit closure begins with receipt and evaluation of proposed corrective actions from the nonconformances identified in the audit report. After corrective actions are accepted, their implementation is verified. Upon closure of the corrective actions, the audit is officially closed.

  18. Analytic technique: a reconsideration of the concept.

    PubMed

    Grossman, Lee

    2014-06-01

    Lipton's 1977 paper on "The Advantages of Freud's Technique …" is taken as a starting point to reconsider the concept of analytic technique itself. How an analyst works may be construed in terms of rules of the analyst's behavior, of principles underlying the analyst's behavior, or of the analyst's attitude that shapes how he or she acts on technical principles. The author argues that the analyst's attitude while acting on technical principles is an integral part of analytic praxis, and that it is a function of the analyst's character. As such, it is not generalizable as a "technique," yet it is often the case that an analyst will rationalize his or her character traits and think of them as a reproducible "technique." This has important consequences for teaching and supervising. The author suggests that the very idea of a reproducible analytic technique may inhibit the analyst's development of his or her own analytic voice. Other aspects of theorizing may also represent a conceptual confusion between what is personal and characterological and what is generalizable.

  19. Multimedia Analysis plus Visual Analytics = Multimedia Analytics

    SciTech Connect

    Chinchor, Nancy; Thomas, James J.; Wong, Pak C.; Christel, Michael; Ribarsky, Martin W.

    2010-10-01

    Multimedia analysis has focused on images, video, and to some extent audio and has made progress in single channels excluding text. Visual analytics has focused on the user interaction with data during the analytic process plus the fundamental mathematics and has continued to treat text as did its precursor, information visualization. The general problem we address in this tutorial is the combining of multimedia analysis and visual analytics to deal with multimedia information gathered from different sources, with different goals or objectives, and containing all media types and combinations in common usage.

  20. Photographic copy of reproduced photograph dated 1942. Exterior view, west ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Photographic copy of reproduced photograph dated 1942. Exterior view, west elevation. Building camouflaged during World War II. - Grand Central Air Terminal, 1310 Air Way, Glendale, Los Angeles County, CA

  1. Reproducibility of computational workflows is automated using continuous analysis.

    PubMed

    Beaulieu-Jones, Brett K; Greene, Casey S

    2017-03-13

    Replication, validation and extension of experiments are crucial for scientific progress. Computational experiments are scriptable and should be easy to reproduce. However, computational analyses are designed and run in a specific computing environment, which may be difficult or impossible to match using written instructions. We report the development of continuous analysis, a workflow that enables reproducible computational analyses. Continuous analysis combines Docker, a container technology akin to virtual machines, with continuous integration, a software development technique, to automatically rerun a computational analysis whenever updates or improvements are made to source code or data. This enables researchers to reproduce results without contacting the study authors. Continuous analysis allows reviewers, editors or readers to verify reproducibility without manually downloading and rerunning code and can provide an audit trail for analyses of data that cannot be shared.
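
    As a rough illustration of the continuous-analysis idea (not the authors' actual tooling), the sketch below shows a step a continuous-integration job might run on every commit: rebuild the analysis container from a versioned Dockerfile and rerun the pipeline inside it, so results can be archived as CI artifacts. The image tag, results path, and run_analysis.py entry point are hypothetical.

```python
"""Minimal continuous-analysis step, assuming Docker is installed.

A CI service could invoke this script whenever source code or data change;
the image tag, mount paths, and the analysis entry point are invented for
the example and are not from the cited study.
"""
import os
import subprocess
import sys

IMAGE = "my-lab/analysis:latest"   # hypothetical image tag

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)    # any failing step fails the CI job

def main():
    # Rebuild the software environment from the versioned Dockerfile.
    run(["docker", "build", "-t", IMAGE, "."])
    # Rerun the full analysis inside the container; outputs land in ./results
    # on the host, where the CI system can archive them as an audit trail.
    results = os.path.join(os.getcwd(), "results")
    run(["docker", "run", "--rm", "-v", f"{results}:/analysis/results", IMAGE,
         "python", "run_analysis.py"])
    return 0

if __name__ == "__main__":
    sys.exit(main())
```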

  2. Spatial Analytic Interfaces: Spatial User Interfaces for In-Situ Visual Analytics.

    PubMed

    Ens, Barrett; Irani, Pourang

    2016-03-18

    As wearable devices gain acceptance, we ask "What do user interfaces look like in a post-smartphone world?" and "Can these future interfaces support sophisticated interactions in a mobile context?" In stark contrast to the micro-interactions of current wearable interfaces lies visual analytics. A hallmark of such platforms is the ability to simultaneously view multiple linked visualizations of diverse datasets. We draw from visual analytic concepts to address the growing need of individuals to manage information on personal devices. We propose Spatial Analytic Interfaces to leverage the benefits of spatial interaction to enable everyday visual analytic tasks to be performed in-situ, at the most beneficial place and time. We explore the possibilities for such interfaces using head-worn display technology, to integrate multiple information views into the user's physical environment. We discuss current developments and propose research goals for the successful development of SUI for in-situ visual analytics.

  3. Comment on "Estimating the reproducibility of psychological science".

    PubMed

    Gilbert, Daniel T; King, Gary; Pettigrew, Stephen; Wilson, Timothy D

    2016-03-04

    A paper from the Open Science Collaboration (Research Articles, 28 August 2015, aac4716) attempting to replicate 100 published studies suggests that the reproducibility of psychological science is surprisingly low. We show that this article contains three statistical errors and provides no support for such a conclusion. Indeed, the data are consistent with the opposite conclusion, namely, that the reproducibility of psychological science is quite high.

  4. Reproducibility with repeat CT in radiomics study for rectal cancer

    PubMed Central

    Hu, Panpan; Wang, Jiazhou; Zhong, Haoyu; Zhou, Zhen; Shen, Lijun; Hu, Weigang; Zhang, Zhen

    2016-01-01

    Purpose To evaluate the reproducibility of radiomics features by repeating computed tomographic (CT) scans in rectal cancer. To choose stable radiomics features for rectal cancer. Results Volume normalized features are much more reproducible than unnormalized features. The average value of all slices is the most reproducible feature type in rectal cancer. Different filters have little effect on the reproducibility of radiomics features. For the average type features, 496 out of 775 features showed high reproducibility (ICC ≥ 0.8), 225 out of 775 features showed medium reproducibility (0.8 > ICC ≥ 0.5) and 54 out of 775 features showed low reproducibility (ICC < 0.5). Methods 40 patients with stage II rectal cancer were enrolled in this study, each of whom underwent two CT scans within an average of 8.7 days. 775 radiomics features were defined in this study. For each feature, five different values (value from the largest slice, maximum value, minimum value, average value of all slices and value from the superposed intermediate matrix) were extracted. Meanwhile, a LOG filter with different parameters was applied to these images to find a stable filter value. Concordance correlation coefficients (CCC) and intraclass correlation coefficients (ICC) of the two CT scans were calculated to assess reproducibility, based on original features and volume normalized features. Conclusions Features are recommended to be normalized to volume in radiomics analysis. The average type radiomics features are the most stable features in rectal cancer. Further analysis of these features of rectal cancer may be warranted for treatment monitoring and prognosis prediction. PMID:27669756
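
    For readers unfamiliar with the agreement statistics used here, the sketch below computes Lin's concordance correlation coefficient (CCC) for a single radiomic feature measured in two repeat scans; the feature values are synthetic, and the snippet illustrates only the metric, not the study's pipeline. An ICC can be computed analogously and features then binned with the ICC ≥ 0.8 and ICC ≥ 0.5 cut-offs quoted above.

```python
# Sketch (not the authors' code): Lin's concordance correlation coefficient
# (CCC) for one radiomic feature measured in two repeat CT scans of the same
# patients. Values near 1 indicate high test-retest reproducibility.
import numpy as np

def concordance_cc(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()              # population variances (ddof=0)
    cov = ((x - mx) * (y - my)).mean()
    return 2 * cov / (vx + vy + (mx - my) ** 2)

# Toy example: one feature extracted from scan 1 and scan 2 of 8 patients.
scan1 = [10.2, 11.5, 9.8, 12.1, 10.9, 11.0, 9.5, 12.4]
scan2 = [10.4, 11.3, 9.9, 12.0, 11.2, 10.8, 9.7, 12.6]
print(f"CCC = {concordance_cc(scan1, scan2):.3f}")
```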

  5. Analytical Challenges in Biotechnology.

    ERIC Educational Resources Information Center

    Glajch, Joseph L.

    1986-01-01

    Highlights five major analytical areas (electrophoresis, immunoassay, chromatographic separations, protein and DNA sequencing, and molecular structures determination) and discusses how analytical chemistry could further improve these techniques and thereby have a major impact on biotechnology. (JN)

  6. Transparency, usability, and reproducibility: Guiding principles for improving comparative databases using primates as examples.

    PubMed

    Borries, Carola; Sandel, Aaron A; Koenig, Andreas; Fernandez-Duque, Eduardo; Kamilar, Jason M; Amoroso, Caroline R; Barton, Robert A; Bray, Joel; Di Fiore, Anthony; Gilby, Ian C; Gordon, Adam D; Mundry, Roger; Port, Markus; Powell, Lauren E; Pusey, Anne E; Spriggs, Amanda; Nunn, Charles L

    2016-09-01

    Recent decades have seen rapid development of new analytical methods to investigate patterns of interspecific variation. Yet these cutting-edge statistical analyses often rely on data of questionable origin, varying accuracy, and weak comparability, which seem to have reduced the reproducibility of studies. It is time to improve the transparency of comparative data while also making these improved data more widely available. We, the authors, met to discuss how transparency, usability, and reproducibility of comparative data can best be achieved. We propose four guiding principles: 1) data identification with explicit operational definitions and complete descriptions of methods; 2) inclusion of metadata that capture key characteristics of the data, such as sample size, geographic coordinates, and nutrient availability (for example, captive versus wild animals); 3) documentation of the original reference for each datum; and 4) facilitation of effective interactions with the data via user friendly and transparent interfaces. We urge reviewers, editors, publishers, database developers and users, funding agencies, researchers publishing their primary data, and those performing comparative analyses to embrace these standards to increase the transparency, usability, and reproducibility of comparative studies.

  7. On The Reproducibility of Seasonal Land-surface Climate

    SciTech Connect

    Phillips, T J

    2004-10-22

    The sensitivity of the continental seasonal climate to initial conditions is estimated from an ensemble of decadal simulations of an atmospheric general circulation model with the same specifications of radiative forcings and monthly ocean boundary conditions, but with different initial states of atmosphere and land. As measures of the "reproducibility" of continental climate for different initial conditions, spatio-temporal correlations are computed across paired realizations of eleven model land-surface variables in which the seasonal cycle is either included or excluded--the former case being pertinent to climate simulation, and the latter to seasonal anomaly prediction. It is found that the land-surface variables which include the seasonal cycle are impacted only marginally by changes in initial conditions; moreover, their seasonal climatologies exhibit high spatial reproducibility. In contrast, the reproducibility of a seasonal land-surface anomaly is generally low, although it is substantially higher in the Tropics; its spatial reproducibility also markedly fluctuates in tandem with warm and cold phases of the El Niño/Southern Oscillation. However, the overall degree of reproducibility depends strongly on the particular land-surface anomaly considered. It is also shown that the predictability of a land-surface anomaly implied by its reproducibility statistics is consistent with what is inferred from more conventional predictability metrics. Implications of these results for climate model intercomparison projects and for operational forecasts of seasonal continental climate also are elaborated.

  8. From requirements to acceptance tests

    NASA Technical Reports Server (NTRS)

    Baize, Lionel; Pasquier, Helene

    1993-01-01

    From user requirements definition to the accepted software system, software project management wants to be sure that the system will meet the requirements. For the development of a telecommunications satellite Control Centre, C.N.E.S. has used new rules to make the use of a tracing matrix easier. From Requirements to Acceptance Tests, each item of a document must have an identifier. A single matrix traces the system and allows the consequences of a change in the requirements to be tracked. A tool has been developed to import documents into a relational database; each record of the database corresponds to an item of a document, and the access key is the item identifier. The tracing matrix is also processed, automatically providing links between the different documents, which enables traced items to be read on the same screen. For example, one can simultaneously read the User Requirements items, the corresponding Software Requirements items, and the Acceptance Tests.
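
    The item-identifier and tracing-matrix idea can be pictured with a small relational sketch (the schema, identifiers, and item texts below are invented for illustration and are not the C.N.E.S. tool): every document item is a keyed record, and a single link table lets a change in a requirement be traced down to its acceptance tests.

```python
# Sketch of the traceability idea described above (schema and IDs invented):
# every document item has a unique identifier, and one trace table links
# items across documents so a requirement change can be followed through
# software requirements down to acceptance tests.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE item  (id TEXT PRIMARY KEY, document TEXT, text TEXT);
CREATE TABLE trace (from_id TEXT REFERENCES item(id),
                    to_id   TEXT REFERENCES item(id));
""")
db.executemany("INSERT INTO item VALUES (?, ?, ?)", [
    ("UR-001", "User Requirements",     "Operator can send a telecommand."),
    ("SR-001", "Software Requirements", "Telecommand service exposes a send API."),
    ("AT-001", "Acceptance Tests",      "Send a telecommand and verify acknowledgement."),
])
db.executemany("INSERT INTO trace VALUES (?, ?)",
               [("UR-001", "SR-001"), ("SR-001", "AT-001")])

# Read traced items side by side, as on the tool's single screen.
rows = db.execute("""
SELECT u.id, s.id, a.id
FROM item u
JOIN trace t1 ON t1.from_id = u.id JOIN item s ON s.id = t1.to_id
JOIN trace t2 ON t2.from_id = s.id JOIN item a ON a.id = t2.to_id
WHERE u.document = 'User Requirements'
""").fetchall()
for ur, sr, at in rows:
    print(ur, "->", sr, "->", at)
```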

  9. Defining acceptable conditions in wilderness

    NASA Astrophysics Data System (ADS)

    Roggenbuck, J. W.; Williams, D. R.; Watson, A. E.

    1993-03-01

    The limits of acceptable change (LAC) planning framework recognizes that forest managers must decide what indicators of wilderness conditions best represent resource naturalness and high-quality visitor experiences and how much change from the pristine is acceptable for each indicator. Visitor opinions on the aspects of the wilderness that have great impact on their experience can provide valuable input to selection of indicators. Cohutta, Georgia; Caney Creek, Arkansas; Upland Island, Texas; and Rattlesnake, Montana, wilderness visitors have high shared agreement that littering and damage to trees in campsites, noise, and seeing wildlife are very important influences on wilderness experiences. Camping within sight or sound of other people influences experience quality more than do encounters on the trails. Visitors’ standards of acceptable conditions within wilderness vary considerably, suggesting a potential need to manage different zones within wilderness for different clientele groups and experiences. Standards across wildernesses, however, are remarkably similar.

  10. Paper-based microfluidic approach for surface-enhanced raman spectroscopy and highly reproducible detection of proteins beyond picomolar concentration.

    PubMed

    Saha, Arindam; Jana, Nikhil R

    2015-01-14

    Although the microfluidic approach is widely used in various point-of-care diagnostics, its implementation in surface-enhanced Raman spectroscopy (SERS)-based detection is challenging. This is because the SERS signal depends on plasmonic nanoparticle aggregation-induced generation of stable electromagnetic hot spots, a condition that is difficult to achieve in currently available microfluidic platforms. Here we show that SERS can be adapted to a simple paper-based microfluidic system in which both the plasmonic nanomaterials and the analyte are used in the mobile phase. This approach allows analyte-induced, controlled particle aggregation and electromagnetic hot spot generation inside the microfluidic channel, and the resultant SERS signal is highly reproducible and sensitive. The approach has been used for reproducible detection of proteins at pico- to femtomolar concentrations. It is simple, rapid, and cost-effective, requires a low sample volume, and can be extended to SERS-based detection of other biomolecules.

  11. Analyticity without Differentiability

    ERIC Educational Resources Information Center

    Kirillova, Evgenia; Spindler, Karlheinz

    2008-01-01

    In this article we derive all salient properties of analytic functions, including the analytic version of the inverse function theorem, using only the most elementary convergence properties of series. Not even the notion of differentiability is required to do so. Instead, analytical arguments are replaced by combinatorial arguments exhibiting…

  12. Reproducibility of Resting State Connectivity in Patients with Stable Multiple Sclerosis

    PubMed Central

    Pinter, Daniela; Beckmann, Christian; Koini, Marisa; Pirker, Eva; Filippini, Nicola; Pichler, Alexander; Fuchs, Siegrid; Fazekas, Franz; Enzinger, Christian

    2016-01-01

    Given increasing efforts to use resting-state fMRI (rfMRI) as a biomarker of disease progression in multiple sclerosis (MS) we here explored the reproducibility of longitudinal rfMRI over three months in patients with clinically and radiologically stable MS. To pursue this aim, two approaches were applied in nine rfMRI networks: First, the intraclass correlation coefficient (ICC 3,1) was assessed for the mean functional connectivity maps across the entire network and a region of interest (ROI). Second, the ratio of overlap between Z-thresholded connectivity maps for each network was assessed. We quantified between-session functional reproducibility of rfMRI for 20 patients with stable MS and 14 healthy controls (HC). Nine rfMRI networks (RSNs) were examined at baseline and after 3 months of follow-up: three visual RSNs, the default-mode network, sensorimotor-, auditory-, executive control, and the left and right fronto-parietal RSN. ROI analyses were constrained to thresholded overlap masks for each individual (Z>0) at baseline and follow-up. In both stable MS and HC mean functional connectivity across the entire network did not reach acceptable ICCs for several networks (ICC<0.40) but we found a high reproducibility of ROI ICCs and of the ratio of overlap. ROI ICCs of all nine networks were between 0.98 and 0.99 for HC and ranged from 0.88 to 0.99 in patients with MS, respectively. The ratio of overlap for all networks was similar for both groups, ranging from 0.60 to 0.75. Our findings attest to a high reproducibility of rfMRI networks not only in HC but also in patients with stable MS when applying ROI analysis. This supports the utility of rfMRI to monitor functional changes related to disease progression or therapeutic interventions in MS. PMID:27007237
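
    The ICC(3,1) used here is the two-way mixed-effects, single-measurement, consistency form; the sketch below computes it from an n-subjects by k-sessions matrix of mean connectivity values. The data are synthetic and the snippet is only meant to make the formula concrete, not to reproduce the study's pipeline.

```python
# Sketch (synthetic data): ICC(3,1) computed from an ANOVA decomposition of
# an n-subjects x k-sessions matrix, i.e. (MS_subjects - MS_error) /
# (MS_subjects + (k-1) * MS_error).
import numpy as np

def icc_3_1(data):
    data = np.asarray(data, float)          # shape (n subjects, k sessions)
    n, k = data.shape
    grand = data.mean()
    ss_total = ((data - grand) ** 2).sum()
    ss_rows = k * ((data.mean(axis=1) - grand) ** 2).sum()   # between subjects
    ss_cols = n * ((data.mean(axis=0) - grand) ** 2).sum()   # between sessions
    ms_rows = ss_rows / (n - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

rng = np.random.default_rng(0)
subject_effect = rng.normal(0.5, 0.15, size=20)              # stable per subject
sessions = np.column_stack([subject_effect + rng.normal(0, 0.02, 20)
                            for _ in range(2)])               # baseline, 3 months
print(f"ICC(3,1) = {icc_3_1(sessions):.2f}")
```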

  13. Reproducibility and accuracy of optic nerve sheath diameter assessment using ultrasound compared to magnetic resonance imaging

    PubMed Central

    2013-01-01

    Background Quantification of the optic nerve sheath diameter (ONSD) by transbulbar sonography is a promising non-invasive technique for the detection of altered intracranial pressure. In order to establish this method as a follow-up tool in diseases with intracranial hyper- or hypotension, scan-rescan reproducibility and accuracy need to be systematically investigated. Methods The right ONSD of 15 healthy volunteers (mean age 24.5 ± 0.8 years) was measured by both transbulbar sonography (9 – 3 MHz) and 3 Tesla MRI (half-Fourier acquisition single-shot turbo spin-echo sequences, HASTE) 3 and 5 mm behind the papilla. All volunteers underwent repeated ultrasound and MRI examinations in order to assess scan-rescan reproducibility and accuracy. Moreover, inter- and intra-observer variabilities were calculated for both techniques. Results Scan-rescan reproducibility was robust for ONSD quantification by sonography and MRI at both depths (r > 0.75, p ≤ 0.001, mean differences < 2%). Comparing ultrasound- and MRI-derived ONSD values, we found acceptable agreement between both methods for measurements at a depth of 3 mm (r = 0.72, p = 0.002, mean difference < 5%). Further analyses revealed good inter- and intra-observer reliability for sonographic measurements 3 mm behind the papilla and for MRI at 3 and 5 mm (r > 0.82, p < 0.001, mean differences < 5%). Conclusions Sonographic ONSD quantification 3 mm behind the papilla can be performed with good reproducibility, measurement accuracy and observer agreement. Thus, our findings emphasize the feasibility of this technique as a non-invasive bedside tool for longitudinal ONSD measurements. PMID:24289136

  14. Assessing Cognitive Performance in Badminton Players: A Reproducibility and Validity Study

    PubMed Central

    van de Water, Tanja; Faber, Irene; Elferink-Gemser, Marije

    2017-01-01

    Abstract Fast reaction and good inhibitory control are associated with elite sports performance. To evaluate the reproducibility and validity of a newly developed Badminton Reaction Inhibition Test (BRIT), fifteen elite (25 ± 4 years) and nine non-elite (24 ± 4 years) Dutch male badminton players participated in the study. The BRIT measured four components: domain-general reaction time, badminton-specific reaction time, domain-general inhibitory control and badminton-specific inhibitory control. Five participants were retested within three weeks on the badminton-specific components. Reproducibility was acceptable for badminton-specific reaction time (ICC = 0.626, CV = 6%) and for badminton-specific inhibitory control (ICC = 0.317, CV = 13%). Good construct validity was shown for badminton-specific reaction time discriminating between elite and non-elite players (F = 6.650, p < 0.05). Elite players did not outscore non-elite players on domain-general reaction time nor on either component of inhibitory control (p > 0.05). Concurrent validity for domain-general reaction time was good, as it was associated with a national ranking for elite (ρ = 0.70, p < 0.01) and non-elite (ρ = 0.70, p < 0.05) players. No relationship was found between the national ranking and badminton-specific reaction time, nor either component of inhibitory control (p > 0.05). In conclusion, the reproducibility and validity of the inhibitory control assessment were not confirmed; however, the BRIT appears to be a reproducible and valid measure of reaction time in badminton players. Reaction time measured with the BRIT may provide input for training programs aiming to improve badminton players’ performance. PMID:28210347

  15. ICRPG WORKING GROUP ON ANALYTICAL CHEMISTRY ROUND ROBIN NO. 22 -- EUDIOMETRIC ANALYSIS OF POWDERED ALUMINUM,

    DTIC Science & Technology

    Analytical Chemistry voted to conduct a round robin to estimate the interlaboratory reproducibility. The round robin was designed to facilitate statistical analysis of the data. Three samples representing different purity levels as

  16. Further Conceptualization of Treatment Acceptability

    ERIC Educational Resources Information Center

    Carter, Stacy L.

    2008-01-01

    A review and extension of previous conceptualizations of treatment acceptability is provided in light of progress within the area of behavior treatment development and implementation. Factors including legislation, advances in research, and service delivery models are examined as to their relationship with a comprehensive conceptualization of…

  17. Acceptance and Commitment Therapy: Introduction

    ERIC Educational Resources Information Center

    Twohig, Michael P.

    2012-01-01

    This is the introductory article to a special series in Cognitive and Behavioral Practice on Acceptance and Commitment Therapy (ACT). Instead of each article herein reviewing the basics of ACT, this article contains that review. This article provides a description of where ACT fits within the larger category of cognitive behavior therapy (CBT):…

  18. Nitrogen trailer acceptance test report

    SciTech Connect

    Kostelnik, A.J.

    1996-02-12

    This Acceptance Test Report documents compliance with the requirements of specification WHC-S-0249. The equipment was tested according to WHC-SD-WM-ATP-108 Rev.0. The equipment being tested is a portable contained nitrogen supply. The test was conducted at Norco's facility.

  19. Imaginary Companions and Peer Acceptance

    ERIC Educational Resources Information Center

    Gleason, Tracy R.

    2004-01-01

    Early research on imaginary companions suggests that children who create them do so to compensate for poor social relationships. Consequently, the peer acceptance of children with imaginary companions was compared to that of their peers. Sociometrics were conducted on 88 preschool-aged children; 11 had invisible companions, 16 had personified…

  20. Euthanasia Acceptance: An Attitudinal Inquiry.

    ERIC Educational Resources Information Center

    Klopfer, Fredrick J.; Price, William F.

    The study presented was conducted to examine potential relationships between attitudes regarding the dying process, including acceptance of euthanasia, and other attitudinal or demographic attributes. The data of the survey was comprised of responses given by 331 respondents to a door-to-door interview. Results are discussed in terms of preferred…

  1. Helping Our Children Accept Themselves.

    ERIC Educational Resources Information Center

    Gamble, Mae

    1984-01-01

    Parents of a child with muscular dystrophy recount their reactions to learning of the diagnosis, their gradual acceptance, and their son's resistance, which was gradually lessened when he was provided with more information and treated more normally as a member of the family. (CL)

  2. Repeatability and reproducibility in proteomic identifications by liquid chromatography-tandem mass spectrometry.

    PubMed

    Tabb, David L; Vega-Montoto, Lorenzo; Rudnick, Paul A; Variyath, Asokan Mulayath; Ham, Amy-Joan L; Bunk, David M; Kilpatrick, Lisa E; Billheimer, Dean D; Blackman, Ronald K; Cardasis, Helene L; Carr, Steven A; Clauser, Karl R; Jaffe, Jacob D; Kowalski, Kevin A; Neubert, Thomas A; Regnier, Fred E; Schilling, Birgit; Tegeler, Tony J; Wang, Mu; Wang, Pei; Whiteaker, Jeffrey R; Zimmerman, Lisa J; Fisher, Susan J; Gibson, Bradford W; Kinsinger, Christopher R; Mesri, Mehdi; Rodriguez, Henry; Stein, Stephen E; Tempst, Paul; Paulovich, Amanda G; Liebler, Daniel C; Spiegelman, Cliff

    2010-02-05

    The complexity of proteomic instrumentation for LC-MS/MS introduces many possible sources of variability. Data-dependent sampling of peptides constitutes a stochastic element at the heart of discovery proteomics. Although this variation impacts the identification of peptides, proteomic identifications are far from completely random. In this study, we analyzed interlaboratory data sets from the NCI Clinical Proteomic Technology Assessment for Cancer to examine repeatability and reproducibility in peptide and protein identifications. Included data spanned 144 LC-MS/MS experiments on four Thermo LTQ and four Orbitrap instruments. Samples included yeast lysate, the NCI-20 defined dynamic range protein mix, and the Sigma UPS 1 defined equimolar protein mix. Some of our findings reinforced conventional wisdom, such as repeatability and reproducibility being higher for proteins than for peptides. Most lessons from the data, however, were more subtle. Orbitraps proved capable of higher repeatability and reproducibility, but aberrant performance occasionally erased these gains. Even the simplest protein digestions yielded more peptide ions than LC-MS/MS could identify during a single experiment. We observed that peptide lists from pairs of technical replicates overlapped by 35-60%, giving a range for peptide-level repeatability in these experiments. Sample complexity did not appear to affect peptide identification repeatability, even as numbers of identified spectra changed by an order of magnitude. Statistical analysis of protein spectral counts revealed greater stability across technical replicates for Orbitraps, making them superior to LTQ instruments for biomarker candidate discovery. The most repeatable peptides were those corresponding to conventional tryptic cleavage sites, those that produced intense MS signals, and those that resulted from proteins generating many distinct peptides. Reproducibility among different instruments of the same type lagged behind
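
    The 35-60% peptide-level repeatability quoted above is an overlap between identification lists from paired technical replicates; the sketch below illustrates one such overlap calculation. The exact denominator used in the study is not stated here, so the intersection is reported relative to the average list size, which is an assumption.

```python
# Sketch: peptide-level repeatability as list overlap between two technical
# replicates. The denominator (average list size) is an assumption; swap in
# the union or the smaller list if a different convention is preferred.
def peptide_overlap(rep1_ids, rep2_ids):
    a, b = set(rep1_ids), set(rep2_ids)
    return len(a & b) / ((len(a) + len(b)) / 2)

# Toy example with peptide sequences standing in for confident identifications.
rep1 = {"LVNEVTEFAK", "AEFVEVTK", "YLYEIAR", "QTALVELVK", "LVTDLTK"}
rep2 = {"LVNEVTEFAK", "AEFVEVTK", "YLYEIAR", "HPYFYAPELLFFAK"}
print(f"overlap = {peptide_overlap(rep1, rep2):.0%}")
```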

  3. Assessing the Validity and Reproducibility of an Iron Dietary Intake Questionnaire Conducted in a Group of Young Polish Women

    PubMed Central

    Głąbska, Dominika; Guzek, Dominika; Ślązak, Joanna; Włodarek, Dariusz

    2017-01-01

    The aim of the study was to analyse a designed brief iron dietary intake questionnaire based on a food frequency assessment (IRONIC-FFQ—IRON Intake Calculation-Food Frequency Questionnaire), including the assessment of validity and reproducibility in a group of 75 Polish women aged 20–30 years. Participants conducted 3-day dietary records and filled in the IRONIC-FFQ twice (FFQ1—directly after the dietary record and FFQ2—6 weeks later). The analysis included an assessment of validity (comparison with the results of the 3-day dietary record) and of reproducibility (comparison of the results obtained twice—FFQ1 and FFQ2). In the analysis of validity, the share of individuals correctly classified into tertiles was over 50% (weighted κ of 0.36), while analysis of correlation revealed correlation coefficients of almost 0.5. In the assessment of reproducibility, almost 80% of individuals were correctly classified and less than 3% were misclassified (weighted κ of 0.73), while a correlation coefficient higher than 0.85 was obtained. Both in the assessment of validity and of reproducibility, a Bland–Altman index of 6.7% was recorded (93.3% of compared pairs of results were in the acceptable range, attributed to differences within the ±2SD limits). Validation of the IRONIC-FFQ revealed a satisfactory level of validity and positively validated reproducibility. PMID:28264423
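
    The Bland-Altman index reported above is the percentage of paired differences that fall outside the mean difference ± 2 SD limits (6.7% outside, i.e. 93.3% within). A minimal sketch on synthetic iron-intake data:

```python
# Sketch: Bland-Altman index for agreement between the FFQ and the 3-day
# dietary record, i.e. the percentage of paired differences falling outside
# the mean difference +/- 2 SD limits. Intakes below are synthetic (mg/day).
import numpy as np

def bland_altman_index(method_a, method_b):
    d = np.asarray(method_a, float) - np.asarray(method_b, float)
    lower = d.mean() - 2 * d.std(ddof=1)
    upper = d.mean() + 2 * d.std(ddof=1)
    outside = (d < lower) | (d > upper)
    return outside.mean() * 100.0

rng = np.random.default_rng(1)
record = rng.normal(10.0, 2.5, size=75)            # 3-day record estimate
ffq = record + rng.normal(0.3, 1.2, size=75)       # FFQ estimate with bias and noise
print(f"Bland-Altman index = {bland_altman_index(ffq, record):.1f}%")
```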

  4. Investigations of Some Liquid Matrixes for Analyte Quantification by MALDI

    NASA Astrophysics Data System (ADS)

    Moon, Jeong Hee; Park, Kyung Man; Ahn, Sung Hee; Lee, Seong Hoon; Kim, Myung Soo

    2015-06-01

    Sample inhomogeneity is one of the obstacles preventing the generation of reproducible mass spectra by MALDI and their use for analyte quantification. As a potential solution to this problem, we investigated MALDI with some liquid matrixes prepared by nonstoichiometric mixing of acids and bases. Out of 27 combinations of acids and bases, liquid matrixes could be produced from seven. When the overall spectral features were considered, two liquid matrixes using α-cyano-4-hydroxycinnamic acid as the acid and 3-aminoquinoline and N,N-diethylaniline as bases were the best choices. In our previous study of MALDI with solid matrixes, we found that three requirements had to be met for the generation of reproducible spectra and for analyte quantification: (1) controlling the temperature by fixing the total ion count, (2) plotting the analyte-to-matrix ion ratio versus the analyte concentration as the calibration curve, and (3) keeping the matrix suppression below a critical value. We found that the same requirements had to be met in MALDI with liquid matrixes as well. In particular, although the liquid matrixes tested here were homogeneous, they failed to display spot-to-spot spectral reproducibility unless the first requirement above was met. We also found that analyte-derived ions could not be produced efficiently by MALDI with the above liquid matrixes unless the analyte was sufficiently basic. In this sense, MALDI processes with solid and liquid matrixes should be regarded as complementary techniques rather than as competing ones.
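
    Requirement (2) above amounts to regressing the analyte-to-matrix ion ratio, rather than the raw analyte intensity, against analyte concentration. The sketch below shows this with invented intensities; it is an illustration of the calibration principle, not data from the study.

```python
# Sketch of calibration requirement (2): the analyte-to-matrix ion ratio is
# used as the response and fit against analyte concentration. Intensities
# below are synthetic placeholders.
import numpy as np

conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])           # analyte conc. (arbitrary units)
analyte_ion = np.array([120., 260., 490., 1150., 2400.])
matrix_ion = np.array([9800., 9900., 9750., 10100., 9850.])

response = analyte_ion / matrix_ion                    # analyte-to-matrix ion ratio
slope, intercept = np.polyfit(conc, response, 1)       # linear calibration curve

unknown_ratio = 0.075                                  # measured ratio for an unknown
estimated_conc = (unknown_ratio - intercept) / slope
print(f"slope={slope:.4f}, intercept={intercept:.4f}, "
      f"estimated concentration = {estimated_conc:.2f}")
```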

  5. Reproducibility of Research Algorithms in GOES-R Operational Software

    NASA Astrophysics Data System (ADS)

    Kennelly, E.; Botos, C.; Snell, H. E.; Steinfelt, E.; Khanna, R.; Zaccheo, T.

    2012-12-01

    The research to operations transition for satellite observations is an area of active interest as identified by The National Research Council Committee on NASA-NOAA Transition from Research to Operations. Their report recommends improved transitional processes for bridging technology from research to operations. Assuring the accuracy of operational algorithm results as compared to research baselines, called reproducibility in this paper, is a critical step in the GOES-R transition process. This paper defines reproducibility methods and measurements for verifying that operationally implemented algorithms conform to research baselines, demonstrated with examples from GOES-R software development. The approach defines reproducibility for implemented algorithms that produce continuous data in terms of a traditional goodness-of-fit measure (i.e., correlation coefficient), while the reproducibility for discrete categorical data is measured using a classification matrix. These reproducibility metrics have been incorporated in a set of Test Tools developed for GOES-R and the software processes have been developed to include these metrics to validate both the scientific and numerical implementation of the GOES-R algorithms. In this work, we outline the test and validation processes and summarize the current results for GOES-R Level 2+ algorithms.
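
    The two reproducibility measures described, a goodness-of-fit statistic for continuous products and a classification matrix for categorical ones, can be illustrated as follows; the product values and class labels are synthetic and the snippet is not the GOES-R test tooling.

```python
# Sketch: reproducibility checks between a research baseline and an
# operational implementation (synthetic outputs): a correlation coefficient
# for continuous products and a confusion-style classification matrix for
# discrete categorical products.
import numpy as np

def continuous_reproducibility(research, operational):
    return np.corrcoef(research, operational)[0, 1]

def classification_matrix(research_labels, operational_labels, n_classes):
    m = np.zeros((n_classes, n_classes), dtype=int)
    for r, o in zip(research_labels, operational_labels):
        m[r, o] += 1
    return m

rng = np.random.default_rng(2)
research = rng.normal(280.0, 10.0, size=1000)                  # e.g. brightness temps
operational = research + rng.normal(0.0, 0.05, size=1000)      # small numerical diffs
print(f"r = {continuous_reproducibility(research, operational):.4f}")

research_cls = rng.integers(0, 3, size=1000)                   # e.g. cloud mask classes
operational_cls = research_cls.copy()
flip = rng.random(1000) < 0.01                                  # 1% disagreements
operational_cls[flip] = (operational_cls[flip] + 1) % 3
print(classification_matrix(research_cls, operational_cls, 3))
```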

  6. Using prediction markets to estimate the reproducibility of scientific research

    PubMed Central

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A.; Johannesson, Magnus

    2015-01-01

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants’ individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a “statistically significant” finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications. PMID:26553988
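
    The claim that a low-prior hypothesis still needs a well-powered replication after one "statistically significant" result follows from Bayes' rule applied to the significance filter. The sketch below works the numbers with the 9% median prior quoted above and assumed values for power (0.8) and the significance level (0.05):

```python
# Worked illustration (power and alpha are assumed, the prior is the median
# quoted in the abstract): probability that a hypothesis is true given a
# statistically significant result, before and after a powered replication.
def prob_true_given_significant(prior, power=0.8, alpha=0.05):
    true_pos = power * prior
    false_pos = alpha * (1 - prior)
    return true_pos / (true_pos + false_pos)

p1 = prob_true_given_significant(prior=0.09)   # after the original study
p2 = prob_true_given_significant(prior=p1)     # after a confirming replication
print(f"after original study: {p1:.2f}, after replication: {p2:.2f}")
```

    Under these assumptions a single significant result lifts the probability of the hypothesis being true to roughly 0.6, and a confirming well-powered replication to above 0.9, which matches the qualitative argument of the abstract.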

  7. Reproducibility of the Structural Connectome Reconstruction across Diffusion Methods.

    PubMed

    Prčkovska, Vesna; Rodrigues, Paulo; Puigdellivol Sanchez, Ana; Ramos, Marc; Andorra, Magi; Martinez-Heras, Eloy; Falcon, Carles; Prats-Galino, Albert; Villoslada, Pablo

    2016-01-01

    Analysis of structural connectomes can lead to powerful insights about the brain's organization and damage. However, the accuracy and reproducibility of structural connectome construction with different acquisition and reconstruction techniques are not well defined. In this work, we evaluated the reproducibility of structural connectome techniques by performing test-retest (same day) and longitudinal studies (after 1 month) as well as analyzing graph-based measures on the data acquired from 22 healthy volunteers (6 subjects were used for the longitudinal study). We compared connectivity matrices and tract reconstructions obtained with the most typical acquisition schemes used in clinical application: diffusion tensor imaging (DTI), high angular resolution diffusion imaging (HARDI), and diffusion spectrum imaging (DSI). We observed that all techniques showed high reproducibility in the test-retest analysis (correlation >.9). However, HARDI was the only technique with low variability (2%) in the longitudinal assessment (1-month interval). The intraclass correlation coefficient analysis showed the highest reproducibility for the DTI connectome, albeit with sparser connections than HARDI and DSI. Qualitative (neuroanatomical) assessment of selected tracts confirmed the quantitative results showing that HARDI managed to detect most of the analyzed fiber groups and fanning fibers. In conclusion, we found that HARDI acquisition showed the most balanced trade-off between high reproducibility of the connectome, higher rate of path detection and of fanning fibers, and intermediate acquisition times (10-15 minutes), although at the cost of a higher appearance of aberrant fibers.

  8. Validation and reproducibility of an Australian caffeine food frequency questionnaire.

    PubMed

    Watson, E J; Kohler, M; Banks, S; Coates, A M

    2017-01-05

    The aim of this study was to measure validity and reproducibility of a caffeine food frequency questionnaire (C-FFQ) developed for the Australian population. The C-FFQ was designed to assess average daily caffeine consumption using four categories of food and beverages: energy drinks; soft drinks/soda; coffee and tea; and chocolate (food and drink). Participants completed a seven-day food diary immediately followed by the C-FFQ on two consecutive days. The questionnaire was first piloted in 20 adults, and then a validity/reproducibility study was conducted (n = 90 adults). The C-FFQ showed moderate correlations (r = .60), fair agreement (mean difference 63 mg) and reasonable quintile rankings indicating fair to moderate agreement with the seven-day food diary. To test reproducibility, the C-FFQ was compared to itself and showed strong correlations (r = .90), good quintile rankings and strong kappa values (κ = 0.65), indicating strong reproducibility. The C-FFQ shows adequate validity and reproducibility and will aid researchers in Australia to quantify caffeine consumption.

  9. Using prediction markets to estimate the reproducibility of scientific research.

    PubMed

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A; Johannesson, Magnus

    2015-12-15

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants' individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a "statistically significant" finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications.

  10. Reproducibility of radiomics for deciphering tumor phenotype with imaging

    PubMed Central

    Zhao, Binsheng; Tan, Yongqiang; Tsai, Wei-Yann; Qi, Jing; Xie, Chuanmiao; Lu, Lin; Schwartz, Lawrence H.

    2016-01-01

    Radiomics (radiogenomics) characterizes tumor phenotypes based on quantitative image features derived from routine radiologic imaging to improve cancer diagnosis, prognosis, prediction and response to therapy. Although radiomic features must be reproducible to qualify as biomarkers for clinical care, little is known about how routine imaging acquisition techniques/parameters affect reproducibility. To begin to fill this knowledge gap, we assessed the reproducibility of a comprehensive, commonly-used set of radiomic features using a unique, same-day repeat computed tomography data set from lung cancer patients. Each scan was reconstructed at 6 imaging settings, varying slice thicknesses (1.25 mm, 2.5 mm and 5 mm) and reconstruction algorithms (sharp, smooth). Reproducibility was assessed using the repeat scans reconstructed at identical imaging setting (6 settings in total). In separate analyses, we explored differences in radiomic features due to different imaging parameters by assessing the agreement of these radiomic features extracted from the repeat scans reconstructed at the same slice thickness but different algorithms (3 settings in total). Our data suggest that radiomic features are reproducible over a wide range of imaging settings. However, smooth and sharp reconstruction algorithms should not be used interchangeably. These findings will raise awareness of the importance of properly setting imaging acquisition parameters in radiomics/radiogenomics research. PMID:27009765

  11. Making neurophysiological data analysis reproducible: why and how?

    PubMed

    Delescluse, Matthieu; Franconville, Romain; Joucla, Sébastien; Lieury, Tiffany; Pouzat, Christophe

    2012-01-01

    Reproducible data analysis is an approach aiming at complementing classical printed scientific articles with everything required to independently reproduce the results they present. "Everything" here covers the data, the computer codes and a precise description of how the code was applied to the data. A brief history of this approach is presented first, starting with what economists have been calling replication since the early eighties to end with what is now called reproducible research in computational data analysis oriented fields like statistics and signal processing. Since efficient tools are instrumental for a routine implementation of these approaches, a description of some of the available ones is presented next. A toy example then demonstrates the use of two open source software programs for reproducible data analysis: the "Sweave family" and the org-mode of emacs. The former is bound to R while the latter can be used with R, Matlab, Python and many other "generalist" data processing tools. Both solutions can be used with Unix-like, Windows and Mac families of operating systems. It is argued that neuroscientists could communicate their results much more efficiently by adopting the reproducible research paradigm from their lab books all the way to their articles, theses and books.

  12. Reproducibility of radiomics for deciphering tumor phenotype with imaging.

    PubMed

    Zhao, Binsheng; Tan, Yongqiang; Tsai, Wei-Yann; Qi, Jing; Xie, Chuanmiao; Lu, Lin; Schwartz, Lawrence H

    2016-03-24

    Radiomics (radiogenomics) characterizes tumor phenotypes based on quantitative image features derived from routine radiologic imaging to improve cancer diagnosis, prognosis, prediction and response to therapy. Although radiomic features must be reproducible to qualify as biomarkers for clinical care, little is known about how routine imaging acquisition techniques/parameters affect reproducibility. To begin to fill this knowledge gap, we assessed the reproducibility of a comprehensive, commonly-used set of radiomic features using a unique, same-day repeat computed tomography data set from lung cancer patients. Each scan was reconstructed at 6 imaging settings, varying slice thicknesses (1.25 mm, 2.5 mm and 5 mm) and reconstruction algorithms (sharp, smooth). Reproducibility was assessed using the repeat scans reconstructed at identical imaging setting (6 settings in total). In separate analyses, we explored differences in radiomic features due to different imaging parameters by assessing the agreement of these radiomic features extracted from the repeat scans reconstructed at the same slice thickness but different algorithms (3 settings in total). Our data suggest that radiomic features are reproducible over a wide range of imaging settings. However, smooth and sharp reconstruction algorithms should not be used interchangeably. These findings will raise awareness of the importance of properly setting imaging acquisition parameters in radiomics/radiogenomics research.

  13. Reproducibility of radiomics for deciphering tumor phenotype with imaging

    NASA Astrophysics Data System (ADS)

    Zhao, Binsheng; Tan, Yongqiang; Tsai, Wei-Yann; Qi, Jing; Xie, Chuanmiao; Lu, Lin; Schwartz, Lawrence H.

    2016-03-01

    Radiomics (radiogenomics) characterizes tumor phenotypes based on quantitative image features derived from routine radiologic imaging to improve cancer diagnosis, prognosis, prediction and response to therapy. Although radiomic features must be reproducible to qualify as biomarkers for clinical care, little is known about how routine imaging acquisition techniques/parameters affect reproducibility. To begin to fill this knowledge gap, we assessed the reproducibility of a comprehensive, commonly-used set of radiomic features using a unique, same-day repeat computed tomography data set from lung cancer patients. Each scan was reconstructed at 6 imaging settings, varying slice thicknesses (1.25 mm, 2.5 mm and 5 mm) and reconstruction algorithms (sharp, smooth). Reproducibility was assessed using the repeat scans reconstructed at identical imaging setting (6 settings in total). In separate analyses, we explored differences in radiomic features due to different imaging parameters by assessing the agreement of these radiomic features extracted from the repeat scans reconstructed at the same slice thickness but different algorithms (3 settings in total). Our data suggest that radiomic features are reproducible over a wide range of imaging settings. However, smooth and sharp reconstruction algorithms should not be used interchangeably. These findings will raise awareness of the importance of properly setting imaging acquisition parameters in radiomics/radiogenomics research.

  14. Reproducibility of regional brain metabolic responses to lorazepam

    SciTech Connect

    Wang, G.J.; Volkow, N.D.; Overall, J.

    1996-10-01

    Changes in regional brain glucose metabolism in response to benzodiazepine agonists have been used as indicators of benzodiazepine-GABA receptor function. The purpose of this study was to assess the reproducibility of these responses. Sixteen healthy right-handed men underwent scanning with PET and [18F]fluorodeoxyglucose (FDG) twice: before placebo and before lorazepam (30 µg/kg). The same double FDG procedure was repeated 6-8 wk later on the men to assess test-retest reproducibility. The regional absolute brain metabolic values obtained during the second evaluation were significantly lower than those obtained from the first evaluation regardless of condition (p ≤ 0.001). Lorazepam significantly and consistently decreased whole-brain metabolism, and the magnitude and regional pattern of the changes were comparable for both studies (12.3% ± 6.9% and 13.7% ± 7.4%). Lorazepam effects were the largest in the thalamus (22.2% ± 8.6% and 22.4% ± 6.9%) and occipital cortex (19% ± 8.9% and 21.8% ± 8.9%). Relative metabolic measures were highly reproducible both for the pharmacologic and the replication condition. This study measured the test-retest reproducibility in regional brain metabolic responses, and although the global and regional metabolic values were significantly lower for the repeated evaluation, the response to lorazepam was highly reproducible. 1613 refs., 3 figs., 3 tabs.

  15. Reproducibility of thalamic segmentation based on probabilistic tractography.

    PubMed

    Traynor, Catherine; Heckemann, Rolf A; Hammers, Alexander; O'Muircheartaigh, Jonathan; Crum, William R; Barker, Gareth J; Richardson, Mark P

    2010-08-01

    Reliable identification of thalamic nuclei is required to improve targeting of electrodes used in Deep Brain Stimulation (DBS), and for exploring the role of thalamus in health and disease. A previously described method using probabilistic tractography to segment the thalamus based on connections to cortical target regions was implemented. Both within- and between-subject reproducibility were quantitatively assessed by the overlap of the resulting segmentations; the effect of two different numbers of target regions (6 and 31) on reproducibility of the segmentation results was also investigated. Very high reproducibility was observed when a single dataset was processed multiple times using different starting conditions. Thalamic segmentation was also very reproducible when multiple datasets from the same subject were processed using six cortical target regions. Within-subject reproducibility was reduced when the number of target regions was increased, particularly in medial and posterior regions of the thalamus. A large degree of overlap in segmentation results from different subjects was obtained, particularly in thalamic regions classified as connecting to frontal, parietal, temporal and pre-central cortical target regions.
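
    Overlap between repeated segmentations is commonly quantified with the Dice coefficient; the study's exact overlap measure may differ, so the sketch below (on a synthetic label image with six cortical-target classes) is only an illustration of the general approach.

```python
# Sketch: per-label Dice overlap between two thalamic segmentations obtained
# from repeat runs (synthetic labels; the study's own overlap measure may be
# defined differently).
import numpy as np

def dice(mask_a, mask_b):
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 1.0 if denom == 0 else 2.0 * (a & b).sum() / denom

rng = np.random.default_rng(3)
seg_run1 = rng.integers(0, 6, size=(40, 40))            # 6 cortical target labels
seg_run2 = seg_run1.copy()
noise = rng.random(seg_run1.shape) < 0.05                # 5% of voxels relabelled
seg_run2[noise] = rng.integers(0, 6, size=noise.sum())

for label in range(6):
    d = dice(seg_run1 == label, seg_run2 == label)
    print(f"target region {label}: Dice = {d:.2f}")
```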

  16. Respiratory effort correction strategies to improve the reproducibility of lung expansion measurements

    SciTech Connect

    Du, Kaifang; Reinhardt, Joseph M.; Christensen, Gary E.; Ding, Kai; Bayouth, John E.

    2013-12-15

    Purpose: Four-dimensional computed tomography (4DCT) can be used to make measurements of pulmonary function longitudinally. The sensitivity of such measurements to identify change depends on measurement uncertainty. Previously, intrasubject reproducibility of Jacobian-based measures of lung tissue expansion was studied in two repeat prior-RT 4DCT human acquisitions. Differences in respiratory effort, such as breathing amplitude and frequency, may affect longitudinal function assessment. In this study, the authors present normalization schemes that correct ventilation images for variations in respiratory effort and assess the reproducibility improvement after effort correction. Methods: Repeat 4DCT image data acquired within a short time interval from 24 patients prior to radiation therapy (RT) were used for this analysis. Using a tissue volume preserving deformable image registration algorithm, Jacobian ventilation maps in two scanning sessions were computed and compared on the same coordinate for reproducibility analysis. In addition to computing the ventilation maps from end expiration to end inspiration, the authors investigated the effort normalization strategies using other intermediate inspiration phases upon the principles of equivalent tidal volume (ETV) and equivalent lung volume (ELV). Scatter plots and mean square error of the repeat ventilation maps and the Jacobian ratio map were generated for four conditions: no effort correction, global normalization, ETV, and ELV. In addition, gamma pass rate was calculated from a modified gamma index evaluation between two ventilation maps, using acceptance criteria of 2 mm distance-to-agreement and 5% ventilation difference. Results: The pattern of regional pulmonary ventilation changes as lung volume changes. All effort correction strategies improved reproducibility when changes in respiratory effort were greater than 150 cc (p < 0.005 with regard to the gamma pass rate). Improvement of reproducibility was
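
    A modified gamma evaluation of this kind compares each voxel of one ventilation map against a neighbourhood of the other, combining spatial distance (scaled by the 2 mm distance-to-agreement) and ventilation difference (scaled by the 5% criterion); the pass rate is the fraction of voxels with gamma ≤ 1. The brute-force sketch below assumes isotropic 1 mm voxels and synthetic Jacobian maps, and is a simplification rather than the authors' implementation.

```python
# Simplified sketch of a gamma pass-rate evaluation between two ventilation
# maps (2 mm distance-to-agreement, 5% ventilation difference). Brute-force
# search over a small voxel neighbourhood; voxel spacing is assumed 1 mm.
import numpy as np

def gamma_pass_rate(map1, map2, spacing_mm=1.0, dta_mm=2.0, diff_crit=0.05,
                    search_vox=3):
    offsets = range(-search_vox, search_vox + 1)
    shape = map1.shape
    n_pass, n_total = 0, 0
    for idx in np.ndindex(shape):
        best = np.inf
        for dz in offsets:
            for dy in offsets:
                for dx in offsets:
                    j = (idx[0] + dz, idx[1] + dy, idx[2] + dx)
                    if any(c < 0 or c >= s for c, s in zip(j, shape)):
                        continue            # neighbour falls outside the volume
                    dist = spacing_mm * np.sqrt(dz * dz + dy * dy + dx * dx)
                    diff = map2[j] - map1[idx]
                    gamma = np.sqrt((dist / dta_mm) ** 2 + (diff / diff_crit) ** 2)
                    best = min(best, gamma)
        n_pass += best <= 1.0
        n_total += 1
    return n_pass / n_total

rng = np.random.default_rng(4)
vent1 = rng.uniform(1.0, 1.3, size=(8, 8, 8))              # Jacobian map, scan 1
vent2 = vent1 + rng.normal(0.0, 0.02, size=vent1.shape)    # repeat scan, small noise
print(f"gamma pass rate = {gamma_pass_rate(vent1, vent2):.1%}")
```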

  17. 21 CFR 530.40 - Safe levels and availability of analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 6 2013-04-01 2013-04-01 false Safe levels and availability of analytical methods... Safe levels and availability of analytical methods. (a) In accordance with § 530.22, the following safe... accordance with § 530.22, the following analytical methods have been accepted by FDA:...

  18. 21 CFR 530.40 - Safe levels and availability of analytical methods.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 6 2012-04-01 2012-04-01 false Safe levels and availability of analytical methods... Safe levels and availability of analytical methods. (a) In accordance with § 530.22, the following safe... accordance with § 530.22, the following analytical methods have been accepted by FDA:...

  19. 21 CFR 530.40 - Safe levels and availability of analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 6 2011-04-01 2011-04-01 false Safe levels and availability of analytical methods... Safe levels and availability of analytical methods. (a) In accordance with § 530.22, the following safe... accordance with § 530.22, the following analytical methods have been accepted by FDA:...

  20. 21 CFR 530.40 - Safe levels and availability of analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 6 2014-04-01 2014-04-01 false Safe levels and availability of analytical methods... Safe levels and availability of analytical methods. (a) In accordance with § 530.22, the following safe... accordance with § 530.22, the following analytical methods have been accepted by FDA:...

  1. 21 CFR 530.40 - Safe levels and availability of analytical methods.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 6 2010-04-01 2010-04-01 false Safe levels and availability of analytical methods... Safe levels and availability of analytical methods. (a) In accordance with § 530.22, the following safe... accordance with § 530.22, the following analytical methods have been accepted by FDA:...

  2. Relevant principal factors affecting the reproducibility of insect primary culture.

    PubMed

    Ogata, Norichika; Iwabuchi, Kikuo

    2017-02-22

    The primary culture of insect cells often suffers from problems with poor reproducibility in the quality of the final cell preparations. The cellular composition of the explants (cell number and cell types), surgical methods (surgical duration and surgical isolation), and physiological and genetic differences between donors may be critical factors affecting the reproducibility of culture. However, little is known about where biological variation (interindividual differences between donors) ends and technical variation (variance in replication of culture conditions) begins. In this study, we cultured larval fat bodies from the Japanese rhinoceros beetle, Allomyrina dichotoma, and evaluated, using linear mixed models, the effect of interindividual variation between donors on the reproducibility of the culture. We also performed transcriptome analysis of the hemocyte-like cells mainly seen in the cultures using RNA sequencing and ultrastructural analyses of hemocytes using a transmission electron microscope, revealing that the cultured cells have many characteristics of insect hemocytes.
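
    The question of where biological variation ends and technical variation begins maps naturally onto a variance-component (linear mixed) model with donor as a random effect; the sketch below fits such a model to synthetic culture-quality scores using statsmodels, as one plausible reading of the analysis rather than the authors' code.

```python
# Sketch (synthetic data): a linear mixed model separating donor-to-donor
# (biological) variance from replicate-to-replicate (technical) variance in
# a culture-quality score.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
donors = np.repeat(np.arange(10), 3)                     # 10 donors x 3 cultures each
donor_effect = rng.normal(0.0, 1.0, size=10)[donors]     # biological variation
quality = 5.0 + donor_effect + rng.normal(0.0, 0.5, size=donors.size)
data = pd.DataFrame({"donor": donors, "quality": quality})

fit = smf.mixedlm("quality ~ 1", data, groups=data["donor"]).fit()
donor_var = float(fit.cov_re.iloc[0, 0])                 # between-donor variance
residual_var = fit.scale                                 # technical (residual) variance
print(f"donor variance = {donor_var:.2f}, technical variance = {residual_var:.2f}")
```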

  3. Reproducibility of cephalometric measurements made by three radiology clinics.

    PubMed

    da Silveira, Heraldo Luis Dias; Silveira, Heloisa Emilia Dias

    2006-05-01

    The purpose of this study was to assess reproducibility of cephalometric measurements in cephalograms obtained by three dentomaxillofacial radiology clinics. Forty lateral cephalometric radiographs were selected and sent at different times to three different clinics for cephalometric analyses. Each clinic digitized the radiographs with the same resolution, and landmarks were located with the mouse pointer directly on the digitized radiographic image on the screen. Three cephalograms were obtained from each radiograph, totaling 120 analyses. Data were analyzed with analysis of variance. Of the 32 factors studied, reproducibility of results was satisfactory for only four factors: position of maxilla relative to anterior cranial base, inclination of occlusal plane relative to anterior cranial base, position of lower incisor relative to nasion-pogonion line, and soft-tissue profile of face (P < .05). Differences in cephalometric measurements were present and such differences were significant for most factors analyzed. The different cephalometric measurements obtained by the three dental radiology clinics were not reproducible.

  4. Benchmarking contactless acquisition sensor reproducibility for latent fingerprint trace evidence

    NASA Astrophysics Data System (ADS)

    Hildebrandt, Mario; Dittmann, Jana

    2015-03-01

    Optical, nanometer-range, contactless, non-destructive sensor devices are promising acquisition techniques in crime scene trace forensics, e.g. for digitizing latent fingerprint traces. Before new approaches are introduced in crime investigations, innovations need to be positively tested and quality ensured. In this paper we investigate sensor reproducibility by studying different scans from four sensors: two chromatic white light sensors (CWL600/CWL1mm), one confocal laser scanning microscope, and one NIR/VIS/UV reflection spectrometer. First, we perform intra-sensor reproducibility testing for the CWL600 with a privacy-conformant test set of artificial-sweat printed, computer-generated fingerprints. We use 24 different fingerprint patterns as original samples (printing samples/templates) for printing with artificial sweat (physical trace samples) and their acquisition with the contactless sensors, resulting in 96 sensor images, called scan or acquired samples. The second test set for inter-sensor reproducibility assessment consists of the first three patterns from the first test set, acquired in two consecutive scans using each device. We suggest using a simple feature space set in the spatial and frequency domains known from signal processing and test its suitability for six different classifiers classifying scan data into small differences (reproducible) and large differences (non-reproducible). Furthermore, we suggest comparing the classification results with biometric verification scores (calculated with NBIS, with a threshold of 40) as a biometric reproducibility score. The Bagging classifier is in nearly all cases the most reliable classifier in our experiments and the results are also confirmed with the biometric matching rates.

  5. Language-Agnostic Reproducible Data Analysis Using Literate Programming.

    PubMed

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators to the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir.

  6. Language-Agnostic Reproducible Data Analysis Using Literate Programming

    PubMed Central

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators to the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir. PMID:27711123

  7. pH-Triggered Molecular Alignment for Reproducible SERS Detection via an AuNP/Nanocellulose Platform

    PubMed Central

    Wei, Haoran; Vikesland, Peter J.

    2015-01-01

    The low affinity of neutral and hydrophobic molecules towards noble metal surfaces hinders their detection by surface-enhanced Raman spectroscopy (SERS). Herein, we present a method to enhance gold nanoparticle (AuNP) surface affinity by lowering the suspension pH below the analyte pKa. We developed an AuNP/bacterial cellulose (BC) nanocomposite platform and applied it to two common pollutants, carbamazepine (CBZ) and atrazine (ATZ) with pKa values of 2.3 and 1.7, respectively. Simple mixing of the analytes with AuNP/BC at pH < pKa resulted in consistent electrostatic alignment of the CBZ and ATZ molecules across the nanocomposite and highly reproducible SERS spectra. Limits of detection of 3 nM and 11 nM for CBZ and ATZ, respectively, were attained. Tests with additional analytes (melamine, 2,4-dichloroaniline, 4-chloroaniline, 3-bromoaniline, and 3-nitroaniline) further illustrate that the AuNP/BC platform provides reproducible analyte detection and quantification while avoiding the uncontrolled aggregation and flocculation of AuNPs that often hinder low pH detection. PMID:26658696

  8. pH-Triggered Molecular Alignment for Reproducible SERS Detection via an AuNP/Nanocellulose Platform

    NASA Astrophysics Data System (ADS)

    Wei, Haoran; Vikesland, Peter J.

    2015-12-01

    The low affinity of neutral and hydrophobic molecules towards noble metal surfaces hinders their detection by surface-enhanced Raman spectroscopy (SERS). Herein, we present a method to enhance gold nanoparticle (AuNP) surface affinity by lowering the suspension pH below the analyte pKa. We developed an AuNP/bacterial cellulose (BC) nanocomposite platform and applied it to two common pollutants, carbamazepine (CBZ) and atrazine (ATZ) with pKa values of 2.3 and 1.7, respectively. Simple mixing of the analytes with AuNP/BC at pH < pKa resulted in consistent electrostatic alignment of the CBZ and ATZ molecules across the nanocomposite and highly reproducible SERS spectra. Limits of detection of 3 nM and 11 nM for CBZ and ATZ, respectively, were attained. Tests with additional analytes (melamine, 2,4-dichloroaniline, 4-chloroaniline, 3-bromoaniline, and 3-nitroaniline) further illustrate that the AuNP/BC platform provides reproducible analyte detection and quantification while avoiding the uncontrolled aggregation and flocculation of AuNPs that often hinder low pH detection.

  9. Acceptability of reactors in space

    SciTech Connect

    Buden, D.

    1981-01-01

    Reactors are the key to our future expansion into space. However, there has been some confusion in the public as to whether they are a safe and acceptable technology for use in space. The answer to these questions is explored. The US position is that, when reactors are the preferred technical choice, they can be used safely. In fact, it does not appear that reactors add measurably to the risk associated with the Space Transportation System.

  10. Acceptability of reactors in space

    SciTech Connect

    Buden, D.

    1981-04-01

    Reactors are the key to our future expansion into space. However, there has been some confusion in the public as to whether they are a safe and acceptable technology for use in space. The answer to these questions is explored. The US position is that, when reactors are the preferred technical choice, they can be used safely. In fact, it does not appear that reactors add measurably to the risk associated with the Space Transportation System.

  11. Analytical Chemistry in Russia.

    PubMed

    Zolotov, Yuri

    2016-09-06

    Research in Russian analytical chemistry (AC) is carried out on a significant scale, and the analytical service solves practical tasks of geological survey, environmental protection, medicine, industry, agriculture, etc. The education system trains highly skilled professionals in AC. The development and especially the manufacturing of analytical instruments should be improved; in spite of this, there are several good domestic instruments, and others satisfy some requirements. Russian AC has rather good historical roots.

  12. Analytic cognitive style predicts religious and paranormal belief.

    PubMed

    Pennycook, Gordon; Cheyne, James Allan; Seli, Paul; Koehler, Derek J; Fugelsang, Jonathan A

    2012-06-01

    An analytic cognitive style denotes a propensity to set aside highly salient intuitions when engaging in problem solving. We assess the hypothesis that an analytic cognitive style is associated with a history of questioning, altering, and rejecting (i.e., unbelieving) supernatural claims, both religious and paranormal. In two studies, we examined associations of God beliefs, religious engagement (attendance at religious services, praying, etc.), conventional religious beliefs (heaven, miracles, etc.) and paranormal beliefs (extrasensory perception, levitation, etc.) with performance measures of cognitive ability and analytic cognitive style. An analytic cognitive style negatively predicted both religious and paranormal beliefs when controlling for cognitive ability as well as religious engagement, sex, age, political ideology, and education. Participants more willing to engage in analytic reasoning were less likely to endorse supernatural beliefs. Further, an association between analytic cognitive style and religious engagement was mediated by religious beliefs, suggesting that an analytic cognitive style negatively affects religious engagement via lower acceptance of conventional religious beliefs. Results for types of God belief indicate that the association between an analytic cognitive style and God beliefs is more nuanced than mere acceptance and rejection, but also includes adopting less conventional God beliefs, such as Pantheism or Deism. Our data are consistent with the idea that two people who share the same cognitive ability, education, political ideology, sex, age and level of religious engagement can acquire very different sets of beliefs about the world if they differ in their propensity to think analytically.

  13. Single-analyte to multianalyte fluorescence sensors

    NASA Astrophysics Data System (ADS)

    Lavigne, John J.; Metzger, Axel; Niikura, Kenichi; Cabell, Larry A.; Savoy, Steven M.; Yoo, J. S.; McDevitt, John T.; Neikirk, Dean P.; Shear, Jason B.; Anslyn, Eric V.

    1999-05-01

    The rational design of small molecules for the selective complexation of analytes has reached a level of sophistication such that there exists a high degree of prediction. An effective strategy for transforming these hosts into sensors involves covalently attaching a fluorophore to the receptor which displays some fluorescence modulation when analyte is bound. Competition methods, such as those used with antibodies, are also amenable to these synthetic receptors, yet there are few examples. In our laboratories, the use of common dyes in competition assays with small molecules has proven very effective. For example, an assay for citrate in beverages and an assay for the secondary messenger IP3 in cells have been developed. Another approach we have explored focuses on multi-analyte sensor arrays in an attempt to mimic the mammalian sense of taste. Our system utilizes polymer resin beads with the desired sensors covalently attached. These functionalized microspheres are then immobilized into micromachined wells on a silicon chip, thereby creating our taste buds. Exposure of the resin to analyte causes a change in the transmittance of the bead. This change can be fluorescent or colorimetric. Optical interrogation of the microspheres, by illuminating from one side of the wafer and collecting the signal on the other, results in an image. These data streams are collected using a CCD camera, which creates red, green and blue (RGB) patterns that are distinct and reproducible for their environments. Analysis of these data can identify and quantify the analytes present.

  14. Science Update: Analytical Chemistry.

    ERIC Educational Resources Information Center

    Worthy, Ward

    1980-01-01

    Briefly discusses new instrumentation in the field of analytical chemistry. Advances in liquid chromatography, photoacoustic spectroscopy, the use of lasers, and mass spectrometry are also discussed. (CS)

  15. Unambiguous characterization of analytical markers in complex, seized opiate samples using an enhanced ion mobility trace detector-mass spectrometer.

    PubMed

    Liuni, Peter; Romanov, Vladimir; Binette, Marie-Josée; Zaknoun, Hafid; Tam, Maggie; Pilon, Pierre; Hendrikse, Jan; Wilson, Derek J

    2014-11-04

    Ion mobility spectroscopy (IMS)-based trace-compound detectors (TCDs) are powerful and widely implemented tools for the detection of illicit substances. They combine high sensitivity, reproducibility, rapid analysis time, and resistance to dirt with an acceptable false alarm rate. The analytical specificity of TCD-IMS instruments for a given analyte depends strongly on a detailed knowledge of the ion chemistry involved, as well as the ability to translate this knowledge into field-robust analytical methods. In this work, we introduce an enhanced hybrid TCD-IMS/mass spectrometer (TCD-IMS/MS) that combines the strengths of ion-mobility-based target compound detection with unambiguous identification by tandem MS. Building on earlier efforts along these lines (Kozole et al., Anal. Chem. 2011, 83, 8596-8603), the current instrument is capable of positive and negative-mode analyses with tightly controlled gating between the IMS and MS modules and direct measurement of ion mobility profiles. We demonstrate the unique capabilities of this instrument using four samples of opium seized by the Canada Border Services Agency (CBSA), consisting of a mixture of opioid alkaloids and other naturally occurring compounds typically found in these samples. Although many analytical methods have been developed for analyzing naturally occurring opiates, this is the first detailed ion mobility study on seized opium samples. This work demonstrates all available analytical modes for the new IMS-MS system including "single-gate", "dual-gate", MS/MS, and precursor ion scan methods. Using a combination of these modes, we unambiguously identify all signals in the IMS spectra, including previously uncharacterized minor peaks arising from compounds that are common in raw opium.

  16. Regional cerebral blood flow utilizing the gamma camera and xenon inhalation: reproducibility and clinical applications

    SciTech Connect

    Fox, R.A.; Knuckey, N.W.; Fleay, R.F.; Stokes, B.A.; Van der Schaaf, A.; Surveyor, I.

    1985-11-01

    A modified collimator and standard gamma camera have been used to measure regional cerebral blood flow following inhalation of radioactive xenon. The collimator and a simplified analysis technique enables excellent statistical accuracy to be achieved with acceptable precision in the measurement of grey matter blood flow. The validity of the analysis was supported by computer modelling and patient measurements. Sixty-one patients with subarachnoid hemorrhage, cerebrovascular disease or dementia were retested to determine the reproducibility of our method. The measured coefficient of variation was 6.5%. Of forty-six patients who had a proven subarachnoid hemorrhage, 15 subsequently developed cerebral ischaemia. These showed a CBF of 42 +/- 6 ml X minute-1 X 100 g brain-1 compared with 49 +/- 11 ml X minute-1 X 100 g brain-1 for the remainder. There is evidence that decreasing blood flow and low initial flow correlate with the subsequent onset of cerebral ischemia.

  17. Measurement of Liver Iron Concentration by MRI Is Reproducible

    PubMed Central

    Alústiza, José María; Emparanza, José I.; Castiella, Agustín; Casado, Alfonso; Aldazábal, Pablo; San Vicente, Manuel; Garcia, Nerea; Asensio, Ana Belén; Banales, Jesús; Salvador, Emma; Moyua, Aranzazu; Arozena, Xabier; Zarco, Miguel; Jauregui, Lourdes; Vicente, Ohiana

    2015-01-01

    Purpose. The objectives were (i) construction of a phantom to reproduce the behavior of iron overload in the liver by MRI and (ii) assessment of the variability of a previously validated method to quantify liver iron concentration between different MRI devices using the phantom and patients. Materials and Methods. A phantom was constructed to reproduce the liver/muscle ratios of two patients with intermediate and high iron overload. Nine patients with different levels of iron overload were studied in 4 multivendor devices, and 8 of them were studied twice in the machine where the model was developed. The phantom was analysed in the same equipment and 14 times in the reference machine. Results. FeCl3 solutions containing 0.3, 0.5, 0.6, and 1.2 mg Fe/mL were chosen to generate the phantom. The average intramachine variability for patients was 10% and the intermachine variability was 8%. For the phantom, the intramachine coefficient of variation was always below 0.1 and the average intermachine variability was 10% for moderate and 5% for high iron overload. Conclusion. The phantom reproduces the behavior of patients with moderate or high iron overload. The proposed method of calculating liver iron concentration is reproducible in several different 1.5 T systems. PMID:25874207

  18. ReproPhylo: An Environment for Reproducible Phylogenomics.

    PubMed

    Szitenberg, Amir; John, Max; Blaxter, Mark L; Lunt, David H

    2015-09-01

    The reproducibility of experiments is key to the scientific process, and particularly necessary for accurate reporting of analyses in data-rich fields such as phylogenomics. We present ReproPhylo, a phylogenomic analysis environment developed to ensure experimental reproducibility, to facilitate the handling of large-scale data, and to assist methodological experimentation. Reproducibility, and instantaneous repeatability, is built in to the ReproPhylo system and does not require user intervention or configuration because it stores the experimental workflow as a single, serialized Python object containing explicit provenance and environment information. This 'single file' approach ensures the persistence of provenance across iterations of the analysis, with changes automatically managed by the version control program Git. This file, along with a Git repository, are the primary reproducibility outputs of the program. In addition, ReproPhylo produces an extensive human-readable report and generates a comprehensive experimental archive file, both of which are suitable for submission with publications. The system facilitates thorough experimental exploration of both parameters and data. ReproPhylo is a platform independent CC0 Python module and is easily installed as a Docker image or a WinPython self-sufficient package, with a Jupyter Notebook GUI, or as a slimmer version in a Galaxy distribution.
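
    The "single serialized object with provenance" idea can be illustrated without ReproPhylo's own classes: the sketch below pickles a small workflow object that carries its own step history and environment information, and the resulting file could then be committed to Git. The Workflow class, its fields, and the recorded steps are invented for this sketch and are not ReproPhylo's actual API.

```python
# Illustration of persisting an analysis as one serialized object with provenance.
# The Workflow class and its fields are invented for this sketch (not ReproPhylo's API).
import pickle
import platform
import sys
from datetime import datetime, timezone

class Workflow:
    def __init__(self, name):
        self.name = name
        self.steps = []          # ordered record of analysis steps
        self.provenance = {      # environment captured at creation time
            "created": datetime.now(timezone.utc).isoformat(),
            "python": sys.version,
            "platform": platform.platform(),
        }

    def record(self, description, parameters):
        self.steps.append({"description": description, "parameters": parameters})

wf = Workflow("phylogenomic_analysis")
wf.record("align sequences", {"tool": "mafft", "options": "--auto"})
wf.record("infer tree", {"tool": "raxml", "model": "GTRGAMMA"})

# Persist the whole experiment as one file; that file can be version-controlled.
with open("analysis.pkl", "wb") as fh:
    pickle.dump(wf, fh)

with open("analysis.pkl", "rb") as fh:
    restored = pickle.load(fh)
print(restored.provenance["platform"], "-", len(restored.steps), "recorded steps")
```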

  19. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 2 2013-10-01 2013-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PIPING SYSTEMS AND... joint tightness, mechanical strength and the nature of the fluid handled....

  20. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PIPING SYSTEMS AND... joint tightness, mechanical strength and the nature of the fluid handled....

  1. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PIPING SYSTEMS AND... joint tightness, mechanical strength and the nature of the fluid handled....

  2. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 2 2014-10-01 2014-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PIPING SYSTEMS AND... joint tightness, mechanical strength and the nature of the fluid handled....

  3. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 2 2012-10-01 2012-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PIPING SYSTEMS AND... joint tightness, mechanical strength and the nature of the fluid handled....

  4. Reproducibility of Tactile Assessments for Children with Unilateral Cerebral Palsy

    ERIC Educational Resources Information Center

    Auld, Megan Louise; Ware, Robert S.; Boyd, Roslyn Nancy; Moseley, G. Lorimer; Johnston, Leanne Marie

    2012-01-01

    A systematic review identified tactile assessments used in children with cerebral palsy (CP), but their reproducibility is unknown. Sixteen children with unilateral CP and 31 typically developing children (TDC) were assessed 2-4 weeks apart. Test-retest percent agreements within one point for children with unilateral CP (and TDC) were…

  5. Latin America Today: An Atlas of Reproducible Pages. Revised Edition.

    ERIC Educational Resources Information Center

    World Eagle, Inc., Wellesley, MA.

    This document contains reproducible maps, charts and graphs of Latin America for use by teachers and students. The maps are divided into five categories (1) the land; (2) peoples, countries, cities, and governments; (3) the national economies, product, trade, agriculture, and resources; (4) energy, education, employment, illicit drugs, consumer…

  6. Reproducibility of heart rate turbulence indexes in heart failure patients.

    PubMed

    D'Addio, Gianni; Cesarelli, Mario; Corbi, Graziamaria; Romano, Maria; Furgi, Giuseppe; Ferrara, Nicola; Rengo, Franco

    2010-01-01

    Cardiovascular oscillations following spontaneous ventricular premature complexes (VPC) are characterized by a short-term heart rate fluctuation known as heart rate turbulence (HRT), described by the so-called turbulence onset (TO) and slope (TS). Despite a recent written consensus on the standard of HRT measurement, reproducibility data are lacking. The aim of this paper was a reproducibility study of HRT indexes in heart failure (HF) patients. Eleven HF patients underwent two 24h ECG Holter recordings, spaced 7 ± 5 days. A paired t test was used to assess the clinical stability of patients during the study period and the number of VPCs in the pairs of Holter recordings. Both TO and TS indexes were calculated for each isolated VPC and, due to their skewed distribution, the reproducibility of median and mean TO and TS was studied by the Bland-Altman technique. Results showed that median HRT indexes might be preferred to the commonly suggested mean values and that, although TO showed a lower bias value than TS, TS can be considered much more reproducible than TO when comparing limits of agreement with normal values. These preliminary results suggest the use of median instead of mean HRT index values and a greater reliability of the turbulence slope than of the turbulence onset index.
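
    The Bland-Altman comparison used here can be sketched as follows: for each patient, the HRT index from the two recordings is compared, and the bias and 95% limits of agreement are taken as the mean difference ± 1.96 times the SD of the differences. The numbers below are synthetic, and aggregating per-VPC values into one index per recording by median (or mean) follows the paper's suggestion only in spirit.

```python
# Bland-Altman sketch for test-retest reproducibility of an HRT index.
# Values are synthetic; in practice each per-patient index would be the median
# (or mean) of the per-VPC turbulence measurements from one Holter recording.
import numpy as np

rng = np.random.default_rng(1)
recording_1 = rng.normal(5.0, 2.0, size=11)                # e.g. turbulence slope values
recording_2 = recording_1 + rng.normal(0.0, 0.8, size=11)  # repeat measurement

diff = recording_2 - recording_1
bias = diff.mean()
half_width = 1.96 * diff.std(ddof=1)    # 95% limits of agreement around the bias

print(f"bias = {bias:.2f}")
print(f"limits of agreement = [{bias - half_width:.2f}, {bias + half_width:.2f}]")
```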

  7. Reproducibility of Manual Platelet Estimation Following Automated Low Platelet Counts

    PubMed Central

    Al-Hosni, Zainab S; Al-Khabori, Murtadha; Al-Mamari, Sahimah; Al-Qasabi, Jamal; Davis, Hiedi; Al-Lawati, Hatim; Al-Riyami, Arwa Z

    2016-01-01

    Objectives Manual platelet estimation is one of the methods used when automated platelet estimates are very low. However, the reproducibility of manual platelet estimation has not been adequately studied. We sought to assess the reproducibility of manual platelet estimation following automated low platelet counts and to evaluate the impact of the level of experience of the person counting on the reproducibility of manual platelet estimates. Methods In this cross-sectional study, peripheral blood films of patients with platelet counts less than 100 × 109/L were retrieved and given to four raters to perform manual platelet estimation independently using a predefined method (average of platelet counts in 10 fields using 100× objective multiplied by 20). Data were analyzed using intraclass correlation coefficient (ICC) as a method of reproducibility assessment. Results The ICC across the four raters was 0.840, indicating excellent agreement. The median difference of the two most experienced raters was 0 (range: -64 to 78). The level of platelet estimate by the least-experienced rater predicted the disagreement (p = 0.037). When assessing the difference between pairs of raters, there was no significant difference in the ICC (p = 0.420). Conclusions The agreement between different raters using manual platelet estimation was excellent. Further confirmation is necessary, with a prospective study using a gold standard method of platelet counts. PMID:27974955
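
    The estimation rule quoted in the methods (average platelet count over 10 fields at the 100× objective, multiplied by 20) is simple enough to state in code. The field counts below are invented, and the crude spread summary is not the ICC analysis reported in the paper.

```python
# Manual platelet estimate as described in the methods: average of platelet counts
# in 10 fields (100x objective) multiplied by 20, giving an estimate in 10^9/L.
# Counts are made up; the spread summary is not the paper's ICC analysis.
import numpy as np

def platelet_estimate(field_counts):
    return np.mean(field_counts) * 20

# Four raters counting the same blood film (10 fields each).
raters = {
    "rater_1": [3, 2, 4, 3, 2, 3, 4, 2, 3, 3],
    "rater_2": [2, 3, 3, 4, 2, 3, 3, 2, 4, 3],
    "rater_3": [4, 3, 3, 3, 2, 4, 3, 3, 2, 3],
    "rater_4": [3, 3, 2, 3, 3, 3, 4, 2, 3, 2],
}
estimates = {name: platelet_estimate(counts) for name, counts in raters.items()}
values = np.array(list(estimates.values()))

print(estimates)
print("spread across raters (max - min):", values.max() - values.min(), "x10^9/L")
```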

  8. ReproPhylo: An Environment for Reproducible Phylogenomics

    PubMed Central

    Szitenberg, Amir; John, Max; Blaxter, Mark L.; Lunt, David H.

    2015-01-01

    The reproducibility of experiments is key to the scientific process, and particularly necessary for accurate reporting of analyses in data-rich fields such as phylogenomics. We present ReproPhylo, a phylogenomic analysis environment developed to ensure experimental reproducibility, to facilitate the handling of large-scale data, and to assist methodological experimentation. Reproducibility, and instantaneous repeatability, is built in to the ReproPhylo system and does not require user intervention or configuration because it stores the experimental workflow as a single, serialized Python object containing explicit provenance and environment information. This ‘single file’ approach ensures the persistence of provenance across iterations of the analysis, with changes automatically managed by the version control program Git. This file, along with a Git repository, are the primary reproducibility outputs of the program. In addition, ReproPhylo produces an extensive human-readable report and generates a comprehensive experimental archive file, both of which are suitable for submission with publications. The system facilitates thorough experimental exploration of both parameters and data. ReproPhylo is a platform independent CC0 Python module and is easily installed as a Docker image or a WinPython self-sufficient package, with a Jupyter Notebook GUI, or as a slimmer version in a Galaxy distribution. PMID:26335558

  9. How reproducible is the acoustical characterization of porous media?

    PubMed

    Pompoli, Francesco; Bonfiglio, Paolo; Horoshenkov, Kirill V; Khan, Amir; Jaouen, Luc; Bécot, François-Xavier; Sgard, Franck; Asdrubali, Francesco; D'Alessandro, Francesco; Hübelt, Jörn; Atalla, Noureddine; Amédin, Celse K; Lauriks, Walter; Boeckx, Laurens

    2017-02-01

    There is a considerable number of research publications on the characterization of porous media that is carried out in accordance with ISO 10534-2 (International Standards Organization, Geneva, Switzerland, 2001) and/or ISO 9053 (International Standards Organization, Geneva, Switzerland, 1991). According to the Web of Science(TM) (last accessed 22 September 2016) there were 339 publications in the Journal of the Acoustical Society of America alone which deal with the acoustics of porous media. However, the reproducibility of these characterization procedures is not well understood. This paper deals with the reproducibility of some standard characterization procedures for acoustic porous materials. The paper is an extension of the work published by Horoshenkov, Khan, Bécot, Jaouen, Sgard, Renault, Amirouche, Pompoli, Prodi, Bonfiglio, Pispola, Asdrubali, Hübelt, Atalla, Amédin, Lauriks, and Boeckx [J. Acoust. Soc. Am. 122(1), 345-353 (2007)]. In this paper, independent laboratory measurements were performed on the same material specimens so that the naturally occurring inhomogeneity in materials was controlled. It also presented the reproducibility data for the characteristic impedance, complex wavenumber, and for some related pore structure properties. This work can be helpful to better understand the tolerances of these material characterization procedures so improvements can be developed to reduce experimental errors and improve the reproducibility between laboratories.

  10. Tractography of the optic radiation: a repeatability and reproducibility study.

    PubMed

    Dayan, Michael; Kreutzer, Sylvia; Clark, Chris A

    2015-04-01

    Our main objective was to evaluate the repeatability and reproducibility of optic radiation (OR) reconstruction from diffusion MRI (dMRI) data. 14 adults were scanned twice with the same 60-direction dMRI sequence. Peaks in the diffusion profile were estimated with the single tensor (ST), Q-ball (QSH) and persistent angular structure (PAS) methods. Segmentation of the OR was performed by two experimenters with probabilistic tractography based on a manually drawn region-of-interest (ROI) protocol typically employed for OR segmentation, with both standard and extended sets of ROIs. The repeatability and reproducibility were assessed by calculating the intra-class correlation coefficient (ICC) of intra- and inter-rater experiments, respectively. ICCs were calculated for commonly used dMRI metrics (FA, MD, AD, RD) and anatomical dimensions of the optic radiation (distance from Meyer's loop to the temporal pole, ML-TP), as well as the Dice similarity coefficient (DSC) between the raters' OR segmentation. Bland-Altman plots were also calculated to investigate bias and variability in the reproducibility measurements. The OR was successfully reconstructed in all subjects by both raters. The ICC was found to be in the good to excellent range for both repeatability and reproducibility of the dMRI metrics, DSC and ML-TP distance. The Bland-Altman plots did not show any apparent systematic bias for any quantities. Overall, higher ICC values were found for the multi-fiber methods, QSH and PAS, and for the standard set of ROIs. Considering the good to excellent repeatability and reproducibility of all the quantities investigated, these findings support the use of multi-fiber OR reconstruction with a limited number of manually drawn ROIs in clinical applications utilizing either OR microstructure characterization or OR dimensions, as is the case in neurosurgical planning for temporal lobectomy.
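
    One of the agreement measures used, the Dice similarity coefficient between the two raters' OR segmentations, has the standard definition DSC = 2|A∩B| / (|A| + |B|). The sketch below applies it to synthetic binary masks; the 5% voxel perturbation is only a stand-in for rater differences.

```python
# Dice similarity coefficient between two binary segmentations:
# DSC = 2 * |A intersect B| / (|A| + |B|). Masks here are synthetic.
import numpy as np

def dice(mask_a, mask_b):
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

rng = np.random.default_rng(2)
rater_1 = rng.random((32, 32, 32)) > 0.7
rater_2 = rater_1.copy()
flip = rng.random(rater_2.shape) < 0.05   # perturb 5% of voxels to mimic rater differences
rater_2[flip] = ~rater_2[flip]

print(f"DSC = {dice(rater_1, rater_2):.3f}")
```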

  11. Reproducibility of anthropometric measurements in children: a longitudinal study.

    PubMed

    Leppik, Aire; Jürimäe, Toivo; Jürimäe, Jaak

    2004-03-01

    The purpose of this study was to establish the reproducibility of a series of anthropometric measures performed twice within one week, annually over a three-year period, in boys and girls. The subjects of this investigation were 39 children (21 boys and 18 girls), 9-10 years of age at the beginning of the study. Children were measured three times at one-year intervals. Children were classified as Tanner stage 1-2 during the first measurements, stage 1-3 during the second measurements and stage 1-4 during the third measurements. Body height and weight were measured and BMI calculated. All anthropometric parameters were measured according to the protocol recommended by the International Society for the Advancement of Kinanthropometry (Norton & Olds 1996). Nine skinfolds, 13 girths, eight lengths and eight breadths/lengths were measured. The reproducibility of body height (r = 0.995-0.999), body weight (r = 0.990-0.999) and BMI (r = 0.969-0.999) was very high in boys and girls. The intraclass correlations (ICC), technical errors (TE) and coefficients of variation (CV) were quite different depending on the measurement site of the skinfold thickness. It was surprising that the ICCs were highest and the TEs and CVs were lowest during the second year of measurement. The computed ICCs were high, and TE and CV values were quite similar and relatively low in girth, length and breadth/length measurements. It was concluded that the reproducibility of girths, lengths and breadths/lengths in children is very high and the reproducibility of skinfolds is high. Specifically, the reproducibility is very high immediately before puberty in boys and girls.
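
    For duplicate measurements, the technical error of measurement is conventionally computed as TEM = sqrt(Σd²/2n), with d the difference between the two trials, and the relative TEM (often quoted as a CV in %) as 100·TEM/mean. This is the standard convention, not necessarily the exact formulation used in the study; the skinfold readings below are synthetic.

```python
# Technical error of measurement (TEM) for duplicate measurements and its relative
# value in percent, using the conventional formula TEM = sqrt(sum(d^2) / (2n)).
# The skinfold readings (mm) are synthetic.
import numpy as np

trial_1 = np.array([8.2, 10.5, 7.9, 12.3, 9.1, 11.0])
trial_2 = np.array([8.6, 10.1, 8.3, 12.8, 8.8, 11.4])

d = trial_1 - trial_2
tem = np.sqrt(np.sum(d**2) / (2 * len(d)))
relative_tem = 100 * tem / np.mean(np.concatenate([trial_1, trial_2]))

print(f"TEM = {tem:.2f} mm, relative TEM = {relative_tem:.1f}%")
```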

  12. Evaluation of measurement reproducibility using the standard-sites data, 1994 Fernald field characterization demonstration project

    SciTech Connect

    Rautman, C.A.

    1996-02-01

    The US Department of Energy conducted the 1994 Fernald (Ohio) field characterization demonstration project to evaluate the performance of a group of both industry-standard and proposed alternative technologies in describing the nature and extent of uranium contamination in surficial soils. Detector stability and measurement reproducibility under actual operating conditions encountered in the field are critical to establishing the credibility of the proposed alternative characterization methods. Comparability of measured uranium activities to those reported by conventional, US Environmental Protection Agency (EPA)-certified laboratory methods is also required. The eleven (11) technologies demonstrated included (1) EPA-standard soil sampling and laboratory mass-spectroscopy analyses, and currently-accepted field-screening techniques using (2) sodium-iodide scintillometers, (3) FIDLER low-energy scintillometers, and (4) a field-portable x-ray fluorescence spectrometer. Proposed advanced characterization techniques included (5) alpha-track detectors, (6) a high-energy beta scintillometer, (7) electret ionization chambers, (8) and (9) a high-resolution gamma-ray spectrometer in two different configurations, (10) a field-adapted laser ablation-inductively coupled plasma-atomic emission spectroscopy (ICP-AES) technique, and (11) a long-range alpha detector. Measurement reproducibility and the accuracy of each method were tested by acquiring numerous replicate measurements of total uranium activity at each of two "standard sites" located within the main field demonstration area. Meteorological variables including temperature, relative humidity, and 24-hour rainfall quantities were also recorded in conjunction with the standard-sites measurements.

  13. pH Tester Gauge Repeatability and Reproducibility Study for WO3 Nanostructure Hydrothermal Growth Process

    NASA Astrophysics Data System (ADS)

    Abd Rashid, Amirul; Hayati Saad, Nor; Bien Chia Sheng, Daniel; Yee, Lee Wai

    2014-06-01

    pH value is one of the important variables for the tungsten trioxide (WO3) nanostructure hydrothermal synthesis process. The morphology of the synthesized nanostructure can be properly controlled by measuring and controlling the pH value of the solution used in this facile synthesis route. Therefore, it is crucial to ensure that the gauge used for pH measurement is reliable in order to achieve the expected result. In this study, the gauge repeatability and reproducibility (GR&R) method was used to assess the repeatability and reproducibility of the pH tester. Based on the ANOVA method, the experimental design as well as the results of the experiment were analyzed using Minitab software. It was found that the initial GR&R value for the tester was 17.55%, which is considered acceptable. To further improve the GR&R level, a new pH measuring procedure was introduced. With the new procedure, the GR&R value was reduced to 2.05%, which means the tester is statistically well suited to measuring the pH of the solution prepared for the WO3 hydrothermal synthesis process.
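
    An ANOVA-based gauge R&R calculation of this kind can be sketched with the standard variance-component estimates for a crossed parts × operators design (negative estimates clipped at zero). The pH readings below are synthetic and the sketch does not reproduce Minitab's exact settings or output.

```python
# Crossed gauge R&R sketch (parts x operators, r replicates) using the standard
# ANOVA variance-component estimates; synthetic pH readings, not the study data.
import numpy as np
import pandas as pd
from statsmodels.formula.api import ols
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(3)
parts, operators, reps = 5, 3, 3
op_bias = rng.normal(0, 0.02, size=operators)     # operator main effects
rows = []
for p in range(parts):
    true_ph = 2.0 + 0.3 * p                       # part-to-part variation
    for o in range(operators):
        for _ in range(reps):
            rows.append({"part": p, "operator": o,
                         "ph": true_ph + op_bias[o] + rng.normal(0, 0.03)})
df = pd.DataFrame(rows)

anova = anova_lm(ols("ph ~ C(part) * C(operator)", data=df).fit(), typ=2)
ms = anova["sum_sq"] / anova["df"]                # mean squares per term

ms_e = ms["Residual"]
ms_po = ms["C(part):C(operator)"]
var_repeat = ms_e                                           # equipment variation
var_inter = max((ms_po - ms_e) / reps, 0.0)
var_oper = max((ms["C(operator)"] - ms_po) / (parts * reps), 0.0)
var_part = max((ms["C(part)"] - ms_po) / (operators * reps), 0.0)

grr = var_repeat + var_oper + var_inter
total = grr + var_part
print(f"%GR&R (study variation) = {100 * np.sqrt(grr / total):.1f}%")
```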

  14. Explosive Strength of the Knee Extensors: The Influence of Criterion Trial Detection Methodology on Measurement Reproducibility.

    PubMed

    Dirnberger, Johannes; Wiesinger, Hans-Peter; Wiemer, Nicolas; Kösters, Alexander; Müller, Erich

    2016-04-01

    The present study was conducted to assess the test-retest reproducibility of explosive strength measurements during single-joint isometric knee extension using the IsoMed 2000 dynamometer. Thirty-one physically active male subjects (mean age: 23.7 years) were measured on two occasions separated by 48-72 h. The intraclass correlation coefficient (ICC 2,1) and the coefficient of variation (CV) were calculated for (i) maximum torque (MVC), (ii) the peak rate of torque development (RTDpeak), as well as for (iii) the average rate of torque development (RTD) and the impulse taken at several predefined time intervals (0-30 to 0-300 ms); the explosive strength variables were derived in two conceptually different versions: on the one hand from the MVC trial (version I), on the other hand from the trial showing the RTDpeak (version II). High ICC values (0.80-0.99) and acceptable CV values (1.9-8.7%) were found for MVC as well as for the RTD and the impulse taken at time intervals of ≥100 ms, regardless of whether version I or II was used. In contrast, measurements of the RTDpeak as well as the RTD and the impulse taken during the very early contraction phase (i.e. RTD/impulse0-30ms and RTD/impulse0-50ms) showed clearly weaker reproducibility results (ICC: 0.53-0.84; CV: 7.3-16.4%) and gave rise to considerable doubts as to their clinical usefulness, especially when derived using version I. However, if there is a need to measure explosive strength over earlier time intervals in practice, it is recommended, in view of the stronger reproducibility results, to concentrate on measures derived from version II, which is based on the RTDpeak trial.
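
    The contractile variables themselves are straightforward to compute from a torque-time trace: the average RTD over 0-t is the torque change from contraction onset divided by t, and the impulse is the time integral of torque over the same window. The sketch below uses a synthetic trace and a simple threshold-based onset rule, both of which are assumptions rather than the study's procedure.

```python
# Average rate of torque development (RTD = delta torque / delta time from onset)
# and contractile impulse (time integral of torque) over predefined windows.
# The torque trace and the simple onset threshold are synthetic assumptions.
import numpy as np

fs = 1000                                  # sampling rate, Hz
t = np.arange(0, 0.5, 1 / fs)              # 500 ms of signal
torque = 250 * (1 - np.exp(-t / 0.08))     # synthetic torque rise, N*m

onset_idx = np.argmax(torque > 0.02 * torque.max())   # simple onset rule (assumption)
windows_ms = [30, 50, 100, 150, 200, 300]

for w in windows_ms:
    end_idx = onset_idx + int(w * fs / 1000)
    seg_t = t[onset_idx:end_idx + 1] - t[onset_idx]
    seg_torque = torque[onset_idx:end_idx + 1] - torque[onset_idx]
    rtd = seg_torque[-1] / seg_t[-1]                   # average RTD, N*m/s
    impulse = np.trapz(seg_torque, seg_t)              # impulse, N*m*s
    print(f"0-{w} ms: RTD = {rtd:6.1f} N*m/s, impulse = {impulse:.2f} N*m*s")
```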

  15. Signals: Applying Academic Analytics

    ERIC Educational Resources Information Center

    Arnold, Kimberly E.

    2010-01-01

    Academic analytics helps address the public's desire for institutional accountability with regard to student success, given the widespread concern over the cost of higher education and the difficult economic and budgetary conditions prevailing worldwide. Purdue University's Signals project applies the principles of analytics widely used in…

  16. Extreme Scale Visual Analytics

    SciTech Connect

    Wong, Pak C.; Shen, Han-Wei; Pascucci, Valerio

    2012-05-08

    Extreme-scale visual analytics (VA) is about applying VA to extreme-scale data. The articles in this special issue examine advances related to extreme-scale VA problems, their analytical and computational challenges, and their real-world applications.

  17. Learning Analytics Considered Harmful

    ERIC Educational Resources Information Center

    Dringus, Laurie P.

    2012-01-01

    This essay is written to present a prospective stance on how learning analytics, as a core evaluative approach, must help instructors uncover the important trends and evidence of quality learner data in the online course. A critique is presented of strategic and tactical issues of learning analytics. The approach to the critique is taken through…

  18. Analytical mass spectrometry

    SciTech Connect

    Not Available

    1990-01-01

    This 43rd Annual Summer Symposium on Analytical Chemistry was held July 24--27, 1990 at Oak Ridge, TN and contained sessions on the following topics: Fundamentals of Analytical Mass Spectrometry (MS), MS in the National Laboratories, Lasers and Fourier Transform Methods, Future of MS, New Ionization and LC/MS Methods, and an extra session. (WET)

  19. Analytical mass spectrometry. Abstracts

    SciTech Connect

    Not Available

    1990-12-31

    This 43rd Annual Summer Symposium on Analytical Chemistry was held July 24--27, 1990 at Oak Ridge, TN and contained sessions on the following topics: Fundamentals of Analytical Mass Spectrometry (MS), MS in the National Laboratories, Lasers and Fourier Transform Methods, Future of MS, New Ionization and LC/MS Methods, and an extra session. (WET)

  20. Validating Analytical Methods

    ERIC Educational Resources Information Center

    Ember, Lois R.

    1977-01-01

    The procedures utilized by the Association of Official Analytical Chemists (AOAC) to develop, evaluate, and validate analytical methods for the analysis of chemical pollutants are detailed. Methods validated by AOAC are used by the EPA and FDA in their enforcement programs and are granted preferential treatment by the courts. (BT)

  1. Quo vadis, analytical chemistry?

    PubMed

    Valcárcel, Miguel

    2016-01-01

    This paper presents an open, personal, fresh approach to the future of Analytical Chemistry in the context of the deep changes Science and Technology are anticipated to experience. Its main aim is to challenge young analytical chemists, because the future of our scientific discipline is in their hands. A description of not completely accurate overall conceptions of our discipline, both past and present, to be avoided is followed by a flexible, integral definition of Analytical Chemistry and its cornerstones (viz., aims and objectives, quality trade-offs, the third basic analytical reference, the information hierarchy, social responsibility, independent research, transfer of knowledge and technology, interfaces to other scientific-technical disciplines, and well-oriented education). Obsolete paradigms, as well as more accurate general and specific ones that can be expected to provide the framework for our discipline in the coming years, are described. Finally, the three possible responses of analytical chemists to the proposed changes in our discipline are discussed.

  2. Axelrod model: accepting or discussing

    NASA Astrophysics Data System (ADS)

    Dybiec, Bartlomiej; Mitarai, Namiko; Sneppen, Kim

    2012-10-01

    Agents building social systems are characterized by complex states, and interactions among individuals can align their opinions. The Axelrod model describes how local interactions can result in the emergence of cultural domains. We propose two variants of the Axelrod model where local consensus is reached either by listening and accepting one of the neighbors' opinions, or by two agents discussing their opinions and reaching an agreement with mixed opinions. We show that the local agreement rule affects the character of the transition between the single-culture and multiculture regimes.
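
    A minimal Axelrod-style sketch of the two local rules on a one-dimensional ring is given below. The "accept" rule is the standard Axelrod update (copy one differing feature from the neighbor); for the "discuss" rule, both agents adopting a trait picked at random from their two current traits is only one possible reading of "agreement with mixed opinions" and is an assumption of this sketch, not the paper's exact rule.

```python
# Minimal Axelrod-style sketch on a 1-D ring with F discrete cultural features.
# "accept": the focal agent copies one differing feature from the neighbor
# (standard Axelrod). "discuss": both agents adopt, for one differing feature,
# a trait picked at random from their two current traits (an illustrative
# assumption, not the paper's exact mixing rule).
import numpy as np

def simulate(rule, n_agents=64, F=5, q=10, steps=100_000, seed=4):
    rng = np.random.default_rng(seed)
    culture = rng.integers(0, q, size=(n_agents, F))
    for _ in range(steps):
        i = rng.integers(n_agents)
        j = (i + rng.choice([-1, 1])) % n_agents          # random ring neighbor
        same = culture[i] == culture[j]
        overlap = same.mean()
        if 0 < overlap < 1 and rng.random() < overlap:    # interact with prob. = overlap
            f = rng.choice(np.flatnonzero(~same))         # one differing feature
            if rule == "accept":
                culture[i, f] = culture[j, f]
            else:                                         # "discuss"
                shared = rng.choice([culture[i, f], culture[j, f]])
                culture[i, f] = culture[j, f] = shared
    # number of distinct cultures left (proxy for single- vs multiculture regime)
    return len({tuple(row) for row in culture})

for rule in ("accept", "discuss"):
    print(rule, "-> distinct cultures:", simulate(rule))
```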

  3. Electrochemiluminescence detection in microfluidic cloth-based analytical devices.

    PubMed

    Guan, Wenrong; Liu, Min; Zhang, Chunsun

    2016-01-15

    This work describes the first approach at combining microfluidic cloth-based analytical devices (μCADs) with electrochemiluminescence (ECL) detection. Wax screen-printing is employed to make cloth-based microfluidic chambers which are patterned with carbon screen-printed electrodes (SPEs) to create truly disposable, simple, inexpensive sensors which can be read with a low-cost, portable charge-coupled device (CCD) imaging system. The two most commonly used ECL systems, tris(2,2'-bipyridyl)ruthenium(II)/tri-n-propylamine (Ru(bpy)3(2+)/TPA) and 3-aminophthalhydrazide/hydrogen peroxide (luminol/H2O2), are applied to demonstrate the quantitative ability of the ECL μCADs. In this study, the proposed devices successfully achieved the determination of TPA over a linear range from 2.5 to 2500 μM with a detection limit of 1.265 μM. In addition, the detection of H2O2 can be performed in the linear range of 0.05-2.0 mM, with a detection limit of 0.027 mM. It has been shown that the ECL emission on the wax-patterned cloth device has acceptable sensitivity, stability and reproducibility. Finally, the applicability of cloth-based ECL is demonstrated for the determination of glucose in phosphate buffer solution (PBS) and artificial urine (AU) samples, with detection limits of 0.032 mM and 0.038 mM, respectively. It can be foreseen, therefore, that μCADs with ECL detection could provide a new sensing platform for point-of-care testing, public health, food safety detection and environmental monitoring in remote regions and in developing or developed countries.
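
    A calibration-and-detection-limit computation of this general kind can be sketched as follows, using the common LOD = 3·SD(blank)/slope convention. The intensities below are synthetic and the convention is an assumption for illustration; it is not a statement of the paper's exact procedure.

```python
# Linear ECL calibration and a detection-limit estimate using the common
# LOD = 3 * SD(blank) / slope convention. Intensities are synthetic and the
# convention is assumed, not taken from the paper.
import numpy as np

conc_uM = np.array([2.5, 10, 50, 100, 500, 1000, 2500])    # TPA standards, uM
rng = np.random.default_rng(5)
intensity = 12.0 * conc_uM + rng.normal(0, 150, size=conc_uM.size)  # ECL signal (a.u.)
blank = rng.normal(0, 15, size=10)                          # repeated blank readings

slope, intercept = np.polyfit(conc_uM, intensity, 1)
lod = 3 * blank.std(ddof=1) / slope

print(f"slope = {slope:.1f} a.u./uM, LOD ~= {lod:.2f} uM")
```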

  4. Composting in small laboratory pilots: Performance and reproducibility

    SciTech Connect

    Lashermes, G.; Barriuso, E.; Le Villio-Poitrenaud, M.; Houot, S.

    2012-02-15

    Highlights: We design an innovative small-scale composting device including six 4-l reactors. We investigate the performance and reproducibility of composting on a small scale. Thermophilic conditions are established by self-heating in all replicates. Biochemical transformations, organic matter losses and stabilisation are realistic. The organic matter evolution exhibits good reproducibility for all six replicates. Abstract: Small-scale reactors (<10 l) have been employed in composting research, but few attempts have assessed the performance of composting considering the transformations of organic matter. Moreover, composting at small scales is often performed by imposing a fixed temperature, thus creating artificial conditions, and the reproducibility of composting has rarely been reported. The objectives of this study are to design an innovative small-scale composting device safeguarding self-heating to drive the composting process and to assess the performance and reproducibility of composting in small-scale pilots. The experimental setup included six 4-l reactors used for composting a mixture of sewage sludge and green wastes. The performance of the process was assessed by monitoring the temperature, O2 consumption and CO2 emissions, and characterising the biochemical evolution of organic matter. Good reproducibility was found for the six replicates, with coefficients of variation for all parameters generally lower than 19%. An intense self-heating ensured the existence of a spontaneous thermophilic phase in all reactors. The average loss of total organic matter (TOM) was 46% of the initial content. Compared to the initial mixture, the hot water soluble fraction decreased by 62%, the hemicellulose-like fraction by 68%, the cellulose-like fraction by 50% and the lignin-like fraction by 12% in the final compost.

  5. Reproducibility and Comparability of Computational Models for Astrocyte Calcium Excitability

    PubMed Central

    Manninen, Tiina; Havela, Riikka; Linne, Marja-Leena

    2017-01-01

    The scientific community across all disciplines faces the same challenges of ensuring accessibility, reproducibility, and efficient comparability of scientific results. Computational neuroscience is a rapidly developing field, where reproducibility and comparability of research results have gained increasing interest over the past years. As the number of computational models of brain functions is increasing, we chose to address reproducibility using four previously published computational models of astrocyte excitability as an example. Although not conventionally taken into account when modeling neuronal systems, astrocytes have been shown to take part in a variety of in vitro and in vivo phenomena including synaptic transmission. Two of the selected astrocyte models describe spontaneous calcium excitability, and the other two neurotransmitter-evoked calcium excitability. We specifically addressed how well the original simulation results can be reproduced with a reimplementation of the models. Additionally, we studied how well the selected models can be reused and whether they are comparable in other stimulation conditions and research settings. Unexpectedly, we found that three of the model publications did not give all the necessary information required to reimplement the models. In addition, we were able to reproduce the original results of only one of the models completely based on the information given in the original publications and in the errata. We actually found errors in the equations provided by two of the model publications; after modifying the equations accordingly, the original results were reproduced more accurately. Even though the selected models were developed to describe the same biological event, namely astrocyte calcium excitability, the models behaved quite differently compared to one another. Our findings on a specific set of published astrocyte models stress the importance of proper validation of the models against experimental wet-lab data.

  6. The Dutch motor skills assessment as tool for talent development in table tennis: a reproducibility and validity study.

    PubMed

    Faber, Irene R; Nijhuis-Van Der Sanden, Maria W G; Elferink-Gemser, Marije T; Oosterveld, Frits G J

    2015-01-01

    A motor skills assessment could be helpful in talent development by estimating essential perceptuo-motor skills of young players, which are considered requisite to develop excellent technical and tactical qualities. The Netherlands Table Tennis Association uses a motor skills assessment in their talent development programme consisting of eight items measuring perceptuo-motor skills specific to table tennis under varying conditions. This study aimed to investigate this assessment regarding its reproducibility, internal consistency, underlying dimensions and concurrent validity in 113 young table tennis players (6-10 years). Intraclass correlation coefficients of six test items met the criteria of 0.7 with coefficients of variation between 3% and 8%. Cronbach's alpha valued 0.853 for internal consistency. The principal components analysis distinguished two conceptually meaningful factors: "ball control" and "gross motor function." Concurrent validity analyses demonstrated moderate associations between the motor skills assessment's results and national ranking; boys r = -0.53 (P < 0.001) and girls r = -0.45 (P = 0.015). In conclusion, this evaluation demonstrated six test items with acceptable reproducibility, good internal consistency and good prospects for validity. Two test items need revision to upgrade reproducibility. Since the motor skills assessment seems to be a reproducible, objective part of a talent development programme, more longitudinal studies are required to investigate its predictive validity.
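
    The internal-consistency figure can in principle be reproduced from the item scores with the standard Cronbach's alpha formula, α = k/(k−1) · (1 − Σσ²_item / σ²_total). The scores below are synthetic, not the study data.

```python
# Cronbach's alpha for a k-item assessment:
# alpha = k/(k-1) * (1 - sum(item variances) / variance of the total score).
# The item scores are synthetic, not the study data.
import numpy as np

rng = np.random.default_rng(6)
n_players, k_items = 113, 8
ability = rng.normal(0, 1, size=n_players)
scores = ability[:, None] + rng.normal(0, 0.7, size=(n_players, k_items))  # correlated items

item_var_sum = scores.var(axis=0, ddof=1).sum()
total_var = scores.sum(axis=1).var(ddof=1)
alpha = k_items / (k_items - 1) * (1 - item_var_sum / total_var)
print(f"Cronbach's alpha = {alpha:.3f}")
```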

  7. An open science resource for establishing reliability and reproducibility in functional connectomics

    PubMed Central

    Zuo, Xi-Nian; Anderson, Jeffrey S; Bellec, Pierre; Birn, Rasmus M; Biswal, Bharat B; Blautzik, Janusch; Breitner, John C.S; Buckner, Randy L; Calhoun, Vince D; Castellanos, F. Xavier; Chen, Antao; Chen, Bing; Chen, Jiangtao; Chen, Xu; Colcombe, Stanley J; Courtney, William; Craddock, R Cameron; Di Martino, Adriana; Dong, Hao-Ming; Fu, Xiaolan; Gong, Qiyong; Gorgolewski, Krzysztof J; Han, Ying; He, Ye; He, Yong; Ho, Erica; Holmes, Avram; Hou, Xiao-Hui; Huckins, Jeremy; Jiang, Tianzi; Jiang, Yi; Kelley, William; Kelly, Clare; King, Margaret; LaConte, Stephen M; Lainhart, Janet E; Lei, Xu; Li, Hui-Jie; Li, Kaiming; Li, Kuncheng; Lin, Qixiang; Liu, Dongqiang; Liu, Jia; Liu, Xun; Liu, Yijun; Lu, Guangming; Lu, Jie; Luna, Beatriz; Luo, Jing; Lurie, Daniel; Mao, Ying; Margulies, Daniel S; Mayer, Andrew R; Meindl, Thomas; Meyerand, Mary E; Nan, Weizhi; Nielsen, Jared A; O’Connor, David; Paulsen, David; Prabhakaran, Vivek; Qi, Zhigang; Qiu, Jiang; Shao, Chunhong; Shehzad, Zarrar; Tang, Weijun; Villringer, Arno; Wang, Huiling; Wang, Kai; Wei, Dongtao; Wei, Gao-Xia; Weng, Xu-Chu; Wu, Xuehai; Xu, Ting; Yang, Ning; Yang, Zhi; Zang, Yu-Feng; Zhang, Lei; Zhang, Qinglin; Zhang, Zhe; Zhang, Zhiqiang; Zhao, Ke; Zhen, Zonglei; Zhou, Yuan; Zhu, Xing-Ting; Milham, Michael P

    2014-01-01

    Efforts to identify meaningful functional imaging-based biomarkers are limited by the ability to reliably characterize inter-individual differences in human brain function. Although a growing number of connectomics-based measures are reported to have moderate to high test-retest reliability, the variability in data acquisition, experimental designs, and analytic methods precludes the ability to generalize results. The Consortium for Reliability and Reproducibility (CoRR) is working to address this challenge and establish test-retest reliability as a minimum standard for methods development in functional connectomics. Specifically, CoRR has aggregated 1,629 typical individuals’ resting state fMRI (rfMRI) data (5,093 rfMRI scans) from 18 international sites, and is openly sharing them via the International Data-sharing Neuroimaging Initiative (INDI). To allow researchers to generate various estimates of reliability and reproducibility, a variety of data acquisition procedures and experimental designs are included. Similarly, to enable users to assess the impact of commonly encountered artifacts (for example, motion) on characterizations of inter-individual variation, datasets of varying quality are included. PMID:25977800

  8. Quality assurance management plan (QAPP) special analytical support (SAS)

    SciTech Connect

    LOCKREM, L.L.

    1999-05-20

    It is the policy of Special Analytical Support (SAS) that the analytical aspects of all environmental data generated and processed in the laboratory, subject to the Environmental Protection Agency (EPA), U.S. Department of Energy or other project specific requirements, be of known and acceptable quality. It is the intention of this QAPP to establish and assure that an effective quality controlled management system is maintained in order to meet the quality requirements of the intended use(s) of the data.

  9. An exploration of graph metric reproducibility in complex brain networks

    PubMed Central

    Telesford, Qawi K.; Burdette, Jonathan H.; Laurienti, Paul J.

    2013-01-01

    The application of graph theory to brain networks has become increasingly popular in the neuroimaging community. These investigations and analyses have led to a greater understanding of the brain's complex organization. More importantly, it has become a useful tool for studying the brain under various states and conditions. With the ever expanding popularity of network science in the neuroimaging community, there is increasing interest to validate the measurements and calculations derived from brain networks. Underpinning these studies is the desire to use brain networks in longitudinal studies or as clinical biomarkers to understand changes in the brain. A highly reproducible tool for brain imaging could potentially prove useful as a clinical tool. In this review, we examine recent studies in network reproducibility and their implications for analysis of brain networks. PMID:23717257

  10. Utility, reliability and reproducibility of immunoassay multiplex kits.

    PubMed

    Tighe, Paddy; Negm, Ola; Todd, Ian; Fairclough, Lucy

    2013-05-15

    Multiplex technologies are becoming increasingly important in biomarker studies as they enable patterns of biomolecules to be examined, which provide a more comprehensive depiction of disease than individual biomarkers. They are crucial in deciphering these patterns, but it is essential that they are endorsed for reliability, reproducibility and precision. Here we outline the theoretical basis of a variety of multiplex technologies: Bead-based multiplex immunoassays (i.e. Cytometric Bead Arrays, Luminex™ and Bio-Plex Pro™), microtitre plate-based arrays (i.e. Mesoscale Discovery (MSD) and Quantsys BioSciences QPlex), Slide-based Arrays (i.e. FastQuant™) and reverse phase protein arrays. Their utility, reliability and reproducibility are discussed.

  11. Highly reproducible SERS arrays directly written by inkjet printing

    NASA Astrophysics Data System (ADS)

    Yang, Qiang; Deng, Mengmeng; Li, Huizeng; Li, Mingzhu; Zhang, Cong; Shen, Weizhi; Li, Yanan; Guo, Dan; Song, Yanlin

    2014-12-01

    SERS arrays with uniform gold nanoparticle distribution were fabricated by direct-writing with an inkjet printing method. Quantitative analysis based on Raman detection was achieved with a small standard statistical deviation of less than 4% for the reproducibility and less than 5% for the long-term stability for 12 weeks.

  12. jicbioimage: a tool for automated and reproducible bioimage analysis

    PubMed Central

    Hartley, Matthew

    2016-01-01

    There has been steady improvement in methods for capturing bioimages. However analysing these images still remains a challenge. The Python programming language provides a powerful and flexible environment for scientific computation. It has a wide range of supporting libraries for image processing but lacks native support for common bioimage formats, and requires specific code to be written to ensure that suitable audit trails are generated and analyses are reproducible. Here we describe the development of a Python tool that: (1) allows users to quickly view and explore microscopy data; (2) generate reproducible analyses, encoding a complete history of image transformations from raw data to final result; and (3) scale up analyses from initial exploration to high throughput processing pipelines, with a minimal amount of extra effort. The tool, jicbioimage, is open source and freely available online at http://jicbioimage.readthedocs.io. PMID:27896026

  13. Reproducibility Issues: Avoiding Pitfalls in Animal Inflammation Models.

    PubMed

    Laman, Jon D; Kooistra, Susanne M; Clausen, Björn E

    2017-01-01

    In light of an enhanced awareness of ethical questions and ever increasing costs when working with animals in biomedical research, there is a dedicated and sometimes fierce debate concerning the (lack of) reproducibility of animal models and their relevance for human inflammatory diseases. Despite evident advancements in searching for alternatives, that is, replacing, reducing, and refining animal experiments-the three R's of Russel and Burch (1959)-understanding the complex interactions of the cells of the immune system, the nervous system and the affected tissue/organ during inflammation critically relies on in vivo models. Consequently, scientific advancement and ultimately novel therapeutic interventions depend on improving the reproducibility of animal inflammation models. As a prelude to the remaining hands-on protocols described in this volume, here, we summarize potential pitfalls of preclinical animal research and provide resources and background reading on how to avoid them.

  14. Data Sharing and Reproducible Clinical Genetic Testing: Successes and Challenges

    PubMed Central

    Yang, Shan; Cline, Melissa; Zhang, Can; Paten, Benedict; Lincoln, Stephen E.

    2016-01-01

    Open sharing of clinical genetic data promises to both monitor and eventually improve the reproducibility of variant interpretation among clinical testing laboratories. A significant public data resource has been developed by the NIH ClinVar initiative, which includes submissions from hundreds of laboratories and clinics worldwide. We analyzed a subset of ClinVar data focused on specific clinical areas and we find high reproducibility (>90% concordance) among labs, although challenges for the community are clearly identified in this dataset. We further review results for the commonly tested BRCA1 and BRCA2 genes, which show even higher concordance, although the significant fragmentation of data into different silos presents an ongoing challenge now being addressed by the BRCA Exchange. We encourage all laboratories and clinics to contribute to these important resources. PMID:27896972

  15. jicbioimage: a tool for automated and reproducible bioimage analysis.

    PubMed

    Olsson, Tjelvar S G; Hartley, Matthew

    2016-01-01

    There has been steady improvement in methods for capturing bioimages. However, analysing these images still remains a challenge. The Python programming language provides a powerful and flexible environment for scientific computation. It has a wide range of supporting libraries for image processing but lacks native support for common bioimage formats, and requires specific code to be written to ensure that suitable audit trails are generated and analyses are reproducible. Here we describe the development of a Python tool that: (1) allows users to quickly view and explore microscopy data; (2) generates reproducible analyses, encoding a complete history of image transformations from raw data to final result; and (3) scales up analyses from initial exploration to high-throughput processing pipelines with a minimal amount of extra effort. The tool, jicbioimage, is open source and freely available online at http://jicbioimage.readthedocs.io.
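
    The audit-trail idea above can be illustrated with a short, generic Python sketch; it does not use the jicbioimage API, and the AuditedImage class, its method names and the history format are invented for illustration. Every transformation is applied through a wrapper that records the operation, its parameters and a checksum of the intermediate result, so the full provenance of the final image can be written out alongside it.

        import hashlib
        import json
        import numpy as np

        def checksum(image):
            # Short SHA-256 digest of the raw pixel buffer, identifying each intermediate.
            return hashlib.sha256(image.tobytes()).hexdigest()[:12]

        class AuditedImage:
            """Wraps an array and records every transformation applied to it."""
            def __init__(self, pixels, history=None):
                self.pixels = np.asarray(pixels)
                self.history = history or [("raw", {}, checksum(self.pixels))]

            def apply(self, name, func, **params):
                out = np.asarray(func(self.pixels, **params))
                return AuditedImage(out, self.history + [(name, params, checksum(out))])

            def provenance(self):
                return json.dumps(self.history, default=str, indent=2)

        # Threshold then invert a synthetic image; the chain of operations and a
        # checksum of every intermediate are kept alongside the data.
        img = AuditedImage(np.random.default_rng(0).integers(0, 255, (64, 64), dtype=np.uint8))
        result = (img.apply("threshold", lambda a, t: (a > t).astype(np.uint8), t=128)
                     .apply("invert", lambda a: 1 - a))
        print(result.provenance())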

  16. Properties of galaxies reproduced by a hydrodynamic simulation.

    PubMed

    Vogelsberger, M; Genel, S; Springel, V; Torrey, P; Sijacki, D; Xu, D; Snyder, G; Bird, S; Nelson, D; Hernquist, L

    2014-05-08

    Previous simulations of the growth of cosmic structures have broadly reproduced the 'cosmic web' of galaxies that we see in the Universe, but failed to create a mixed population of elliptical and spiral galaxies, because of numerical inaccuracies and incomplete physical models. Moreover, they were unable to track the small-scale evolution of gas and stars to the present epoch within a representative portion of the Universe. Here we report a simulation that starts 12 million years after the Big Bang, and traces 13 billion years of cosmic evolution with 12 billion resolution elements in a cube of 106.5 megaparsecs a side. It yields a reasonable population of ellipticals and spirals, reproduces the observed distribution of galaxies in clusters and characteristics of hydrogen on large scales, and at the same time matches the 'metal' and hydrogen content of galaxies on small scales.

  17. Implementation of a portable and reproducible parallel pseudorandom number generator

    SciTech Connect

    Pryor, D.V.; Cuccaro, S.A.; Mascagni, M.; Robinson, M.L.

    1994-12-31

    The authors describe in detail the parallel implementation of a family of additive lagged-Fibonacci pseudorandom number generators. The theoretical structure of these generators is exploited to preserve their well-known randomness properties and to provide a parallel system of distinct cycles. The algorithm presented here solves the reproducibility problem for a far larger class of parallel Monte Carlo applications than has been previously possible. In particular, Monte Carlo applications that undergo "splitting" can be coded to be reproducible, independent both of the number of processors and the execution order of the parallel processes. A library of portable C routines (available from the authors) that implements these ideas is also described.
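
    As an illustration of the kind of generator described above, here is a toy additive lagged-Fibonacci stream in Python. The lags (j, k) = (5, 17), the 32-bit modulus and the linear congruential bootstrap of the lag table are illustrative choices, not the authors' parameterization or cycle-assignment scheme: each "process" owns a stream seeded independently, so its draws are repeatable regardless of how many other streams exist or in what order they are advanced.

        class AdditiveLaggedFibonacci:
            """x_n = (x_{n-j} + x_{n-k}) mod 2**m, one independent stream per process."""
            def __init__(self, seed, j=5, k=17, m=32):
                assert 0 < j < k
                self.j, self.k, self.mask = j, k, (1 << m) - 1
                # Fill the lag table deterministically from the stream seed using a
                # simple linear congruential bootstrap (an illustrative choice).
                state, table = seed & self.mask, []
                for _ in range(k):
                    state = (1664525 * state + 1013904223) & self.mask
                    table.append(state)
                table[0] |= 1                      # keep at least one odd entry
                self.table = table

            def next(self):
                new = (self.table[-self.j] + self.table[-self.k]) & self.mask
                self.table = self.table[1:] + [new]
                return new

        # Two "processes" with distinct stream seeds produce repeatable sequences,
        # independent of how many streams exist or in what order they are advanced.
        streams = [AdditiveLaggedFibonacci(seed=s) for s in (1, 2)]
        print([streams[0].next() for _ in range(3)])
        print([streams[1].next() for _ in range(3)])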

  18. Interobserver reproducibility of radiographic evaluation of lumbar spine instability

    PubMed Central

    Segundo, Saulo de Tarso de Sá Pereira; Valesin, Edgar Santiago; Lenza, Mario; Santos, Durval do Carmo Barros; Rosemberg, Laercio Alberto; Ferretti, Mario

    2016-01-01

    Objective: To measure the interobserver reproducibility of the radiographic evaluation of lumbar spine instability. Methods: Measurements of the dynamic radiographs of the lumbar spine in lateral view were performed, evaluating the anterior translation and the angulation among the vertebral bodies. The examinations were evaluated at the institution's workstations, using the Carestream Vue RIS (PACS) system, version 11.0.12.14 (Carestream Health Inc., 2009). Results: Agreement in detecting cases of radiographic instability among the observers varied from 88.1% to 94.4%, and the agreement coefficients AC1 were all above 0.8, indicating excellent agreement. Conclusion: The interobserver analysis performed among orthopedic surgeons with different levels of training in dynamic radiographs of the spine obtained high reproducibility and agreement. However, some factors, such as the manual method of measurement and the presence of vertebral osteophytes, might have generated a few less accurate results in this comparative evaluation of measurements. PMID:27759827
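
    For reference, the AC1 agreement coefficient quoted above can be computed for two raters making a binary instability call as in the sketch below; the formula follows Gwet's chance correction for the two-category case, and the example ratings are invented.

        def gwet_ac1(rater_a, rater_b):
            """Gwet's AC1 for two raters and binary (0/1) ratings."""
            n = len(rater_a)
            pa = sum(a == b for a, b in zip(rater_a, rater_b)) / n       # observed agreement
            pi = (sum(rater_a) + sum(rater_b)) / (2 * n)                 # mean 'positive' rate
            pe = 2 * pi * (1 - pi)                                       # chance agreement
            return (pa - pe) / (1 - pe)

        # Hypothetical instability calls by two observers on 10 radiographs.
        a = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0]
        b = [1, 0, 0, 1, 0, 0, 0, 0, 1, 0]
        print(round(gwet_ac1(a, b), 3))    # about 0.82 for this invented example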

  19. Empowering Multi-Cohort Gene Expression Analysis to Increase Reproducibility

    PubMed Central

    Haynes, Winston A; Vallania, Francesco; Liu, Charles; Bongen, Erika; Tomczak, Aurelie; Andres-Terrè, Marta; Lofgren, Shane; Tam, Andrew; Deisseroth, Cole A; Li, Matthew D; Sweeney, Timothy E

    2016-01-01

    A major contributor to the scientific reproducibility crisis has been that the results from homogeneous, single-center studies do not generalize to heterogeneous, real world populations. Multi-cohort gene expression analysis has helped to increase reproducibility by aggregating data from diverse populations into a single analysis. To make the multi-cohort analysis process more feasible, we have assembled an analysis pipeline which implements rigorously studied meta-analysis best practices. We have compiled and made publicly available the results of our own multi-cohort gene expression analysis of 103 diseases, spanning 615 studies and 36,915 samples, through a novel and interactive web application. As a result, we have made both the process of and the results from multi-cohort gene expression analysis more approachable for non-technical users. PMID:27896970
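
    A minimal sketch of the core aggregation step in a multi-cohort analysis, assuming each cohort has already been reduced to a per-gene standardized effect size and variance; the numbers are invented, and the pipeline's actual meta-analysis machinery (e.g. random-effects weighting and heterogeneity checks) is not reproduced here.

        import math

        def pool_fixed_effect(effects, variances):
            """Inverse-variance weighted summary effect and its standard error."""
            weights = [1.0 / v for v in variances]
            summary = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
            return summary, math.sqrt(1.0 / sum(weights))

        # Hypothetical standardized effect (Hedges' g) of one gene in three cohorts.
        effects = [0.62, 0.48, 0.71]
        variances = [0.04, 0.09, 0.06]
        g, se = pool_fixed_effect(effects, variances)
        print(f"pooled g = {g:.2f} +/- {1.96 * se:.2f} (95% CI half-width)")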

  20. Pressure Stabilizer for Reproducible Picoinjection in Droplet Microfluidic Systems

    PubMed Central

    Rhee, Minsoung; Light, Yooli K.; Yilmaz, Suzan; Adams, Paul D.; Saxena, Deepak

    2014-01-01

    Picoinjection is a promising technique to add reagents into pre-formed emulsion droplets on chip; however, it is sensitive to pressure fluctuation, making stable operation of the picoinjector challenging. We present a chip architecture using a simple pressure stabilizer for consistent and highly reproducible picoinjection in multi-step biochemical assays with droplets. Incorporation of the stabilizer immediately upstream of a picoinjector or a combination of injectors greatly reduces pressure fluctuations enabling reproducible and effective picoinjection in systems where the pressure varies actively during operation. We demonstrate the effectiveness of the pressure stabilizer for an integrated platform for on-demand encapsulation of bacterial cells followed by picoinjection of reagents for lysing the encapsulated cells. The pressure stabilizer was also used for picoinjection of multiple displacement amplification (MDA) reagents to achieve genomic DNA amplification of lysed bacterial cells. PMID:25270338

  1. Pressure stabilizer for reproducible picoinjection in droplet microfluidic systems.

    PubMed

    Rhee, Minsoung; Light, Yooli K; Yilmaz, Suzan; Adams, Paul D; Saxena, Deepak; Meagher, Robert J; Singh, Anup K

    2014-12-07

    Picoinjection is a promising technique to add reagents into pre-formed emulsion droplets on chip; however, it is sensitive to pressure fluctuation, making stable operation of the picoinjector challenging. We present a chip architecture using a simple pressure stabilizer for consistent and highly reproducible picoinjection in multi-step biochemical assays with droplets. Incorporation of the stabilizer immediately upstream of a picoinjector or a combination of injectors greatly reduces pressure fluctuations enabling reproducible and effective picoinjection in systems where the pressure varies actively during operation. We demonstrate the effectiveness of the pressure stabilizer for an integrated platform for on-demand encapsulation of bacterial cells followed by picoinjection of reagents for lysing the encapsulated cells. The pressure stabilizer was also used for picoinjection of multiple displacement amplification (MDA) reagents to achieve genomic DNA amplification of lysed bacterial cells.

  2. MASSIVE DATA, THE DIGITIZATION OF SCIENCE, AND REPRODUCIBILITY OF RESULTS

    ScienceCinema

    None

    2016-07-12

    As the scientific enterprise becomes increasingly computational and data-driven, the nature of the information communicated must change. Without inclusion of the code and data with published computational results, we are engendering a credibility crisis in science. Controversies such as ClimateGate, the microarray-based drug sensitivity clinical trials under investigation at Duke University, and retractions from prominent journals due to unverified code suggest the need for greater transparency in our computational science. In this talk I argue that the scientific method be restored to (1) a focus on error control as central to scientific communication and (2) complete communication of the underlying methodology producing the results, i.e. reproducibility. I outline barriers to these goals based on recent survey work (Stodden 2010), and suggest solutions such as the “Reproducible Research Standard” (Stodden 2009), giving open licensing options designed to create an intellectual property framework for scientists consonant with longstanding scientific norms.

  3. Reproducibility of graph metrics of human brain structural networks.

    PubMed

    Duda, Jeffrey T; Cook, Philip A; Gee, James C

    2014-01-01

    Recent interest in human brain connectivity has led to the application of graph theoretical analysis to human brain structural networks, in particular white matter connectivity inferred from diffusion imaging and fiber tractography. While these methods have been used to study a variety of patient populations, there has been less examination of the reproducibility of these methods. A number of tractography algorithms exist and many of these are known to be sensitive to user-selected parameters. The methods used to derive a connectivity matrix from fiber tractography output may also influence the resulting graph metrics. Here we examine how these algorithm and parameter choices influence the reproducibility of proposed graph metrics on a publicly available test-retest dataset consisting of 21 healthy adults. The dice coefficient is used to examine topological similarity of constant density subgraphs both within and between subjects. Seven graph metrics are examined here: mean clustering coefficient, characteristic path length, largest connected component size, assortativity, global efficiency, local efficiency, and rich club coefficient. The reproducibility of these network summary measures is examined using the intraclass correlation coefficient (ICC). Graph curves are created by treating the graph metrics as functions of a parameter such as graph density. Functional data analysis techniques are used to examine differences in graph measures that result from the choice of fiber tracking algorithm. The graph metrics consistently showed good levels of reproducibility as measured with ICC, with the exception of some instability at low graph density levels. The global and local efficiency measures were the most robust to the choice of fiber tracking algorithm.
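
    The reproducibility statistic used above, the intraclass correlation coefficient, can be computed from a subjects-by-sessions matrix of graph-metric values as sketched below (ICC(2,1) from a two-way ANOVA decomposition; the test-retest values are invented).

        import numpy as np

        def icc_2_1(x):
            """ICC(2,1): x has shape (n_subjects, k_sessions)."""
            x = np.asarray(x, dtype=float)
            n, k = x.shape
            grand = x.mean()
            row_means = x.mean(axis=1)      # per-subject means
            col_means = x.mean(axis=0)      # per-session means
            msr = k * ((row_means - grand) ** 2).sum() / (n - 1)            # subjects
            msc = n * ((col_means - grand) ** 2).sum() / (k - 1)            # sessions
            resid = x - row_means[:, None] - col_means[None, :] + grand
            mse = (resid ** 2).sum() / ((n - 1) * (k - 1))                  # error
            return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

        # Hypothetical clustering coefficients for 5 subjects scanned twice.
        scores = [[0.41, 0.43], [0.38, 0.37], [0.52, 0.50], [0.45, 0.47], [0.33, 0.35]]
        print(round(icc_2_1(scores), 3))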

  4. Composting in small laboratory pilots: performance and reproducibility.

    PubMed

    Lashermes, G; Barriuso, E; Le Villio-Poitrenaud, M; Houot, S

    2012-02-01

    Small-scale reactors (<10 l) have been employed in composting research, but few attempts have assessed the performance of composting considering the transformations of organic matter. Moreover, composting at small scales is often performed by imposing a fixed temperature, thus creating artificial conditions, and the reproducibility of composting has rarely been reported. The objectives of this study are to design an innovative small-scale composting device safeguarding self-heating to drive the composting process and to assess the performance and reproducibility of composting in small-scale pilots. The experimental setup included six 4-l reactors used for composting a mixture of sewage sludge and green wastes. The performance of the process was assessed by monitoring the temperature, O(2) consumption and CO(2) emissions, and characterising the biochemical evolution of organic matter. A good reproducibility was found for the six replicates with coefficients of variation for all parameters generally lower than 19%. An intense self-heating ensured the existence of a spontaneous thermophilic phase in all reactors. The average loss of total organic matter (TOM) was 46% of the initial content. Compared to the initial mixture, the hot water soluble fraction decreased by 62%, the hemicellulose-like fraction by 68%, the cellulose-like fraction by 50% and the lignin-like fractions by 12% in the final compost. The TOM losses, compost stabilisation and evolution of the biochemical fractions were similar to those observed in large reactors or on-site experiments, excluding the lignin degradation, which was less important than in full-scale systems. The reproducibility of the process and the quality of the final compost make it possible to propose the use of this experimental device for research requiring a mass reduction of the initial composted waste mixtures.

  5. Highly reproducible Bragg grating acousto-ultrasonic contact transducers

    NASA Astrophysics Data System (ADS)

    Saxena, Indu Fiesler; Guzman, Narciso; Lieberman, Robert A.

    2014-09-01

    Fiber optic acousto-ultrasonic transducers offer numerous applications as embedded sensors for impact and damage detection in industrial and aerospace applications as well as non-destructive evaluation. Superficial contact transducers with a sheet of fiber optic Bragg gratings have been demonstrated for guided wave ultrasound based measurements. It is reported here that this method of measurement provides highly reproducible guided ultrasound data for the composite component under test, despite the optical fiber transducers not being permanently embedded in it.

  6. How reproducible are the measurements of leaf fluctuating asymmetry?

    PubMed Central

    2015-01-01

    Fluctuating asymmetry (FA) represents small, non-directional deviations from perfect symmetry in morphological characters. FA is generally assumed to increase in response to stress; therefore, FA is frequently used in ecological studies as an index of environmental or genetic stress experienced by an organism. The values of FA are usually small, and therefore the reliable detection of FA requires precise measurements. The reproducibility of fluctuating asymmetry (FA) was explored by comparing the results of measurements of scanned images of 100 leaves of downy birch (Betula pubescens) conducted by 31 volunteer scientists experienced in studying plant FA. The median values of FA varied significantly among the participants, from 0.000 to 0.074, and the coefficients of variation in FA for individual leaves ranged from 25% to 179%. The overall reproducibility of the results among the participants was rather low (0.074). Variation in instruments and methods used by the participants had little effect on the reported FA values, but the reproducibility of the measurements increased by 30% following exclusion of data provided by seven participants who had modified the suggested protocol for leaf measurements. The scientists working with plant FA are advised to pay utmost attention to adequate and detailed description of their data acquisition protocols in their forthcoming publications, because all characteristics of instruments and methods need to be controlled to increase the quality and reproducibility of the data. Whenever possible, the images of all measured objects and the results of primary measurements should be published as electronic appendices to scientific papers. PMID:26157612

  7. How reproducible are the measurements of leaf fluctuating asymmetry?

    PubMed

    Kozlov, Mikhail V

    2015-01-01

    Fluctuating asymmetry (FA) represents small, non-directional deviations from perfect symmetry in morphological characters. FA is generally assumed to increase in response to stress; therefore, FA is frequently used in ecological studies as an index of environmental or genetic stress experienced by an organism. The values of FA are usually small, and therefore the reliable detection of FA requires precise measurements. The reproducibility of fluctuating asymmetry (FA) was explored by comparing the results of measurements of scanned images of 100 leaves of downy birch (Betula pubescens) conducted by 31 volunteer scientists experienced in studying plant FA. The median values of FA varied significantly among the participants, from 0.000 to 0.074, and the coefficients of variation in FA for individual leaves ranged from 25% to 179%. The overall reproducibility of the results among the participants was rather low (0.074). Variation in instruments and methods used by the participants had little effect on the reported FA values, but the reproducibility of the measurements increased by 30% following exclusion of data provided by seven participants who had modified the suggested protocol for leaf measurements. The scientists working with plant FA are advised to pay utmost attention to adequate and detailed description of their data acquisition protocols in their forthcoming publications, because all characteristics of instruments and methods need to be controlled to increase the quality and reproducibility of the data. Whenever possible, the images of all measured objects and the results of primary measurements should be published as electronic appendices to scientific papers.
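
    A small sketch of the quantities involved: a size-corrected FA index per leaf and the between-observer coefficient of variation that the comparison above rests on. The measurements are invented and the published protocol is not reproduced.

        import statistics

        def fa_index(left, right):
            """Size-corrected fluctuating asymmetry: |L - R| / mean(L, R)."""
            return abs(left - right) / ((left + right) / 2.0)

        # Hypothetical width measurements (mm) of one leaf by four observers.
        measurements = [(23.1, 23.4), (23.0, 23.6), (23.2, 23.3), (22.9, 23.5)]
        fa_values = [fa_index(l, r) for l, r in measurements]

        mean_fa = statistics.mean(fa_values)
        cv = statistics.stdev(fa_values) / mean_fa * 100   # between-observer CV, %
        print([round(v, 4) for v in fa_values], round(cv, 1))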

  8. Icy: an open bioimage informatics platform for extended reproducible research.

    PubMed

    de Chaumont, Fabrice; Dallongeville, Stéphane; Chenouard, Nicolas; Hervé, Nicolas; Pop, Sorin; Provoost, Thomas; Meas-Yedid, Vannary; Pankajakshan, Praveen; Lecomte, Timothée; Le Montagner, Yoann; Lagache, Thibault; Dufour, Alexandre; Olivo-Marin, Jean-Christophe

    2012-06-28

    Current research in biology uses evermore complex computational and imaging tools. Here we describe Icy, a collaborative bioimage informatics platform that combines a community website for contributing and sharing tools and material, and software with a high-end visual programming framework for seamless development of sophisticated imaging workflows. Icy extends the reproducible research principles, by encouraging and facilitating the reusability, modularity, standardization and management of algorithms and protocols. Icy is free, open-source and available at http://icy.bioimageanalysis.org/.

  9. Analytic thinking reduces belief in conspiracy theories.

    PubMed

    Swami, Viren; Voracek, Martin; Stieger, Stefan; Tran, Ulrich S; Furnham, Adrian

    2014-12-01

    Belief in conspiracy theories has been associated with a range of negative health, civic, and social outcomes, requiring reliable methods of reducing such belief. Thinking dispositions have been highlighted as one possible factor associated with belief in conspiracy theories, but actual relationships have only been infrequently studied. In Study 1, we examined associations between belief in conspiracy theories and a range of measures of thinking dispositions in a British sample (N=990). Results indicated that a stronger belief in conspiracy theories was significantly associated with lower analytic thinking and open-mindedness and greater intuitive thinking. In Studies 2-4, we examined the causational role played by analytic thinking in relation to conspiracist ideation. In Study 2 (N=112), we showed that a verbal fluency task that elicited analytic thinking reduced belief in conspiracy theories. In Study 3 (N=189), we found that an alternative method of eliciting analytic thinking, which related to cognitive disfluency, was effective at reducing conspiracist ideation in a student sample. In Study 4, we replicated the results of Study 3 among a general population sample (N=140) in relation to generic conspiracist ideation and belief in conspiracy theories about the July 7, 2005, bombings in London. Our results highlight the potential utility of supporting attempts to promote analytic thinking as a means of countering the widespread acceptance of conspiracy theories.

  10. Visual Analytics 101

    SciTech Connect

    Scholtz, Jean; Burtner, Edwin R.; Cook, Kristin A.

    2016-06-13

    This course will introduce the field of Visual Analytics to HCI researchers and practitioners, highlighting the contributions they can make to this field. Topics will include a definition of visual analytics along with examples of current systems, types of tasks and end users, issues in defining user requirements, design of visualizations and interactions, guidelines and heuristics, the current state of user-centered evaluations, and metrics for evaluation. We encourage designers, HCI researchers, and HCI practitioners to attend to learn how their skills can contribute to advancing the state of the art of visual analytics.

  11. Meal Replacement Mass Reduction and Integration Acceptability Study

    NASA Technical Reports Server (NTRS)

    Sirmons, T.; Cooper, M.; Douglas, G.; Barrett, A.; Richardson, M.; Arias, D.; Schneiderman, J.; Slack, K.; Ploutz-Snyder, R.

    2016-01-01

    NASA, in planning for long duration missions, has an imperative to provide a food system with the necessary nutrition, acceptability, and safety to ensure sustainment of crew health and performance. The Orion Multi-Purpose Crew Vehicle (MPCV) and future exploration missions are mass constrained; therefore we are challenged to reduce the mass of the food system by 10% while maintaining safety, nutrition, and acceptability for exploration missions. Food bars have previously been used to supplement meals in the Skylab food system, indicating that regular consumption of bars will be acceptable. However, commercially available products do not meet the requirements for a full meal replacement in the spaceflight food system. The purpose of this task is to develop a variety of nutritionally balanced breakfast replacement bars, which meet spaceflight nutritional, microbiological, sensorial, and shelf-life requirements, while enabling a 10% food mass savings. To date, six nutrient-dense meal replacement bars have been developed, using both traditional methods of compression as well as novel ultrasonic compression technologies developed by Creative Resonance Inc. (Phoenix, AZ). All bars will be prioritized based on acceptability and the four top candidates will be evaluated in the Human Exploration Research Analog (HERA) to assess the frequency with which actual meal replacement options may be implemented. Specifically, overall impact to mood, satiety, dietary discomfort, and satisfaction with food will be analyzed to inform successful implementation strategies. In addition, these bars will be evaluated based on final product sensory acceptability, nutritional stability, qualitative stability of analytical measurements (i.e. water activity and texture), and microbiological compliance over two years of storage at room temperature and potential temperature abuse conditions to predict long-term acceptability. It is expected that this work will enable a successful meal

  12. CRKSPH - A Conservative Reproducing Kernel Smoothed Particle Hydrodynamics Scheme

    NASA Astrophysics Data System (ADS)

    Frontiere, Nicholas; Raskin, Cody D.; Owen, J. Michael

    2017-03-01

    We present a formulation of smoothed particle hydrodynamics (SPH) that utilizes a first-order consistent reproducing kernel, a smoothing function that exactly interpolates linear fields with particle tracers. Previous formulations using reproducing kernel (RK) interpolation have had difficulties maintaining conservation of momentum due to the fact the RK kernels are not, in general, spatially symmetric. Here, we utilize a reformulation of the fluid equations such that mass, linear momentum, and energy are all rigorously conserved without any assumption about kernel symmetries, while additionally maintaining approximate angular momentum conservation. Our approach starts from a rigorously consistent interpolation theory, where we derive the evolution equations to enforce the appropriate conservation properties, at the sacrifice of full consistency in the momentum equation. Additionally, by exploiting the increased accuracy of the RK method's gradient, we formulate a simple limiter for the artificial viscosity that reduces the excess diffusion normally incurred by the ordinary SPH artificial viscosity. Collectively, we call our suite of modifications to the traditional SPH scheme Conservative Reproducing Kernel SPH, or CRKSPH. CRKSPH retains many benefits of traditional SPH methods (such as preserving Galilean invariance and manifest conservation of mass, momentum, and energy) while improving on many of the shortcomings of SPH, particularly the overly aggressive artificial viscosity and zeroth-order inaccuracy. We compare CRKSPH to two different modern SPH formulations (pressure based SPH and compatibly differenced SPH), demonstrating the advantages of our new formulation when modeling fluid mixing, strong shock, and adiabatic phenomena.
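
    The first-order reproducing-kernel correction at the heart of the scheme can be sketched in one dimension as below: the base kernel is rescaled by A(x) + B(x)(x - x_j), with A and B computed from the local kernel moments so that constant and linear fields are interpolated exactly even on irregular particles. A Gaussian base kernel and crude particle volumes are illustrative choices here; the conservative reformulation, the viscosity limiter and the 3D machinery of CRKSPH are omitted.

        import numpy as np

        def gaussian_kernel(r, h):
            return np.exp(-(r / h) ** 2) / (h * np.sqrt(np.pi))

        def rk_interpolate(x_eval, x_j, f_j, V_j, h):
            """First-order consistent reproducing-kernel interpolation in 1D."""
            dx = x_j - x_eval                          # offsets to the particles
            w = gaussian_kernel(dx, h)
            m0 = np.sum(V_j * w)                       # zeroth, first and second moments
            m1 = np.sum(V_j * dx * w)
            m2 = np.sum(V_j * dx ** 2 * w)
            det = m0 * m2 - m1 ** 2
            A, B = m2 / det, m1 / det                  # linear-correction coefficients
            w_corr = (A + B * (x_eval - x_j)) * w      # corrected kernel values
            return np.sum(V_j * f_j * w_corr)

        # Irregularly spaced particles carrying an exactly linear field f = 2 + 3x.
        rng = np.random.default_rng(1)
        x = np.sort(rng.uniform(0.0, 1.0, 200))
        V = np.gradient(x)                             # crude per-particle "volume"
        f = 2.0 + 3.0 * x
        print(rk_interpolate(0.5, x, f, V, h=0.05))    # recovers 3.5 to near machine precision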

  13. Dosimetric algorithm to reproduce isodose curves obtained from a LINAC.

    PubMed

    Estrada Espinosa, Julio Cesar; Martínez Ovalle, Segundo Agustín; Pereira Benavides, Cinthia Kotzian

    2014-01-01

    In this work isodose curves are obtained with a new dosimetric algorithm that uses numerical data from percentage depth dose (PDD) and the maximum absorbed dose profile, calculated by Monte Carlo for an 18 MV LINAC. The software reproduces the absorbed dose percentage in the whole irradiated volume quickly and with a good approximation. To validate the results, an 18 MV LINAC with its full geometry and a water phantom were modeled. On this model, the distinct simulations were processed by the MCNPX code to obtain the PDD and profiles for all depths of the radiation beam. The resulting data were used by the code to produce the dose percentages at any point of the irradiated volume. The absorbed dose for any voxel size was also reproduced at any point of the irradiated volume, even when the voxels are as small as a single pixel. The dosimetric algorithm is able to reproduce the absorbed dose induced by a radiation beam in a water phantom, considering PDD and profiles, whose maximum percent value is in the build-up region. Calculation time for the algorithm is only a few seconds, compared with the days taken when the calculation is carried out by Monte Carlo.

  14. Dosimetric Algorithm to Reproduce Isodose Curves Obtained from a LINAC

    PubMed Central

    Estrada Espinosa, Julio Cesar; Martínez Ovalle, Segundo Agustín; Pereira Benavides, Cinthia Kotzian

    2014-01-01

    In this work isodose curves are obtained with a new dosimetric algorithm that uses numerical data from percentage depth dose (PDD) and the maximum absorbed dose profile, calculated by Monte Carlo for an 18 MV LINAC. The software reproduces the absorbed dose percentage in the whole irradiated volume quickly and with a good approximation. To validate the results, an 18 MV LINAC with its full geometry and a water phantom were modeled. On this model, the distinct simulations were processed by the MCNPX code to obtain the PDD and profiles for all depths of the radiation beam. The resulting data were used by the code to produce the dose percentages at any point of the irradiated volume. The absorbed dose for any voxel size was also reproduced at any point of the irradiated volume, even when the voxels are as small as a single pixel. The dosimetric algorithm is able to reproduce the absorbed dose induced by a radiation beam in a water phantom, considering PDD and profiles, whose maximum percent value is in the build-up region. Calculation time for the algorithm is only a few seconds, compared with the days taken when the calculation is carried out by Monte Carlo. PMID:25045398
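
    A common way to combine the two inputs named above is the factorization: percent of maximum dose at (x, z) ≈ PDD(z) × OAR(x) / 100, interpolated from tabulated beam data. The sketch below uses made-up tables, keeps the off-axis ratio depth-independent for simplicity, and is not necessarily the authors' exact algorithm.

        import numpy as np

        # Illustrative (made-up) beam data, both normalized to 100 at the maximum:
        # percentage depth dose along the central axis, and an off-axis ratio profile
        # taken at a reference depth (assumed depth-independent here for simplicity).
        depths_cm   = np.array([0.0, 1.5, 3.3, 5.0, 10.0, 20.0])
        pdd_percent = np.array([40.0, 90.0, 100.0, 96.0, 78.0, 52.0])
        offsets_cm  = np.array([-8.0, -5.0, 0.0, 5.0, 8.0])
        oar_percent = np.array([3.0, 96.0, 100.0, 96.0, 3.0])

        def dose_percent(x_cm, depth_cm):
            """Percent of maximum dose at off-axis distance x and depth z."""
            pdd = np.interp(depth_cm, depths_cm, pdd_percent)
            oar = np.interp(x_cm, offsets_cm, oar_percent)
            return pdd * oar / 100.0

        # Sample the irradiated volume on a coarse grid; isodose curves are the level
        # sets of this array (obtainable with any contouring routine).
        xs, zs = np.meshgrid(np.linspace(-8, 8, 5), np.linspace(0, 20, 5))
        print(np.round(dose_percent(xs, zs), 1))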

  15. Reproducibility of SELDI Spectra Across Time and Laboratories

    PubMed Central

    Diao, Lixia; Clarke, Charlotte H.; Coombes, Kevin R.; Hamilton, Stanley R.; Roth, Jack; Mao, Li; Czerniak, Bogdan; Baggerly, Keith A.; Morris, Jeffrey S.; Fung, Eric T.; Bast, Robert C.

    2011-01-01

    The reproducibility of mass spectrometry (MS) data collected using surface enhanced laser desorption/ionization-time of flight (SELDI-TOF) has been questioned. This investigation was designed to test the reproducibility of SELDI data collected over time by multiple users and instruments. Five laboratories prepared arrays once every week for six weeks. Spectra were collected on separate instruments in the individual laboratories. Additionally, all of the arrays produced each week were rescanned on a single instrument in one laboratory. Lab-to-lab and array-to-array variability in alignment parameters were larger than the variability attributable to running samples during different weeks. The coefficient of variation (CV) in spectrum intensity ranged from 25% at baseline, to 80% in the matrix noise region, to about 50% during the exponential drop from the maximum matrix noise. Before normalization, the median CV of the peak heights was 72% and reduced to about 20% after normalization. Additionally, for the spectra from a common instrument, the CV ranged from 5% at baseline, to 50% in the matrix noise region, to 20% during the drop from the maximum matrix noise. Normalization reduced the variability in peak heights to about 18%. With proper processing methods, SELDI instruments produce spectra containing large numbers of reproducibly located peaks, with consistent heights. PMID:21552492
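
    The effect of normalization on the peak-height CV can be illustrated with synthetic replicate spectra. The sketch below rescales each spectrum by its total ion current, which is one common normalization choice rather than the study's prescribed method.

        import numpy as np

        rng = np.random.default_rng(7)

        # Synthetic replicate spectra: the same four peak heights in every run, but
        # each run has a different overall intensity scale (array-to-array variation).
        true_peaks = np.array([120.0, 40.0, 75.0, 15.0])
        scales = rng.uniform(0.5, 1.5, size=20)
        spectra = scales[:, None] * true_peaks[None, :] * rng.normal(1.0, 0.05, (20, 4))

        def cv(values):
            return values.std(axis=0) / values.mean(axis=0) * 100

        tic = spectra.sum(axis=1, keepdims=True)        # total ion current per spectrum
        normalized = spectra / tic * tic.mean()

        print("CV before normalization (%):", np.round(cv(spectra), 1))
        print("CV after  normalization (%):", np.round(cv(normalized), 1))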

  16. A neural mechanism for sensing and reproducing a time interval

    PubMed Central

    Jazayeri, Mehrdad; Shadlen, Michael N.

    2015-01-01

    SUMMARY Timing plays a crucial role in sensorimotor function. The neural mechanisms that enable the brain to flexibly measure and reproduce time intervals are however not known. We recorded neural activity in parietal cortex of monkeys in a time reproduction task. Monkeys were trained to measure and immediately afterwards reproduce different sample intervals. While measuring an interval, neural responses had a nonlinear profile that increased with the duration of the sample interval. Activity was reset during the transition from measurement to production, and was followed by a ramping activity whose slope encoded the previously measured sample interval. We found that firing rates at the end of the measurement epoch were correlated with both the slope of the ramp and the monkey’s corresponding production interval on a trial-by-trial basis. Analysis of response dynamics further linked the rate of change of firing rates in the measurement epoch to the slope of the ramp in the production epoch. These observations suggest that, during time reproduction, an interval is measured prospectively in relation to the desired motor plan to reproduce that interval. PMID:26455307

  17. Planar heterojunction perovskite solar cells with superior reproducibility

    PubMed Central

    Jeon, Ye-Jin; Lee, Sehyun; Kang, Rira; Kim, Jueng-Eun; Yeo, Jun-Seok; Lee, Seung-Hoon; Kim, Seok-Soon; Yun, Jin-Mun; Kim, Dong-Yu

    2014-01-01

    Perovskite solar cells (PeSCs) have been considered one of the competitive next generation power sources. To date, light-to-electric conversion efficiencies have rapidly increased to over 10%, and further improvements are expected. However, the poor device reproducibility of PeSCs ascribed to their inhomogeneously covered film morphology has hindered their practical application. Here, we demonstrate high-performance PeSCs with superior reproducibility by introducing small amounts of N-cyclohexyl-2-pyrrolidone (CHP) as a morphology controller into N,N-dimethylformamide (DMF). As a result, highly homogeneous film morphology, similar to that achieved by vacuum-deposition methods, as well as a high PCE of 10% and an extremely small performance deviation within 0.14% were achieved. This study represents a method for realizing efficient and reproducible planar heterojunction (PHJ) PeSCs through morphology control, taking a major step forward in the low-cost and rapid production of PeSCs by solving one of the biggest problems of PHJ perovskite photovoltaic technology through a facile method. PMID:25377945

  18. Planar heterojunction perovskite solar cells with superior reproducibility

    NASA Astrophysics Data System (ADS)

    Jeon, Ye-Jin; Lee, Sehyun; Kang, Rira; Kim, Jueng-Eun; Yeo, Jun-Seok; Lee, Seung-Hoon; Kim, Seok-Soon; Yun, Jin-Mun; Kim, Dong-Yu

    2014-11-01

    Perovskite solar cells (PeSCs) have been considered one of the competitive next generation power sources. To date, light-to-electric conversion efficiencies have rapidly increased to over 10%, and further improvements are expected. However, the poor device reproducibility of PeSCs ascribed to their inhomogeneously covered film morphology has hindered their practical application. Here, we demonstrate high-performance PeSCs with superior reproducibility by introducing small amounts of N-cyclohexyl-2-pyrrolidone (CHP) as a morphology controller into N,N-dimethylformamide (DMF). As a result, highly homogeneous film morphology, similar to that achieved by vacuum-deposition methods, as well as a high PCE of 10% and an extremely small performance deviation within 0.14% were achieved. This study represents a method for realizing efficient and reproducible planar heterojunction (PHJ) PeSCs through morphology control, taking a major step forward in the low-cost and rapid production of PeSCs by solving one of the biggest problems of PHJ perovskite photovoltaic technology through a facile method.

  19. Planar heterojunction perovskite solar cells with superior reproducibility.

    PubMed

    Jeon, Ye-Jin; Lee, Sehyun; Kang, Rira; Kim, Jueng-Eun; Yeo, Jun-Seok; Lee, Seung-Hoon; Kim, Seok-Soon; Yun, Jin-Mun; Kim, Dong-Yu

    2014-11-07

    Perovskite solar cells (PeSCs) have been considered one of the competitive next generation power sources. To date, light-to-electric conversion efficiencies have rapidly increased to over 10%, and further improvements are expected. However, the poor device reproducibility of PeSCs ascribed to their inhomogeneously covered film morphology has hindered their practical application. Here, we demonstrate high-performance PeSCs with superior reproducibility by introducing small amounts of N-cyclohexyl-2-pyrrolidone (CHP) as a morphology controller into N,N-dimethylformamide (DMF). As a result, highly homogeneous film morphology, similar to that achieved by vacuum-deposition methods, as well as a high PCE of 10% and an extremely small performance deviation within 0.14% were achieved. This study represents a method for realizing efficient and reproducible planar heterojunction (PHJ) PeSCs through morphology control, taking a major step forward in the low-cost and rapid production of PeSCs by solving one of the biggest problems of PHJ perovskite photovoltaic technology through a facile method.

  20. Indomethacin reproducibly induces metamorphosis in Cassiopea xamachana scyphistomae.

    PubMed

    Cabrales-Arellano, Patricia; Islas-Flores, Tania; Thomé, Patricia E; Villanueva, Marco A

    2017-01-01

    Cassiopea xamachana jellyfish are an attractive model system to study metamorphosis and/or cnidarian-dinoflagellate symbiosis due to the ease of cultivation of their planula larvae and scyphistomae through their asexual cycle, in which the latter can bud new larvae and continue the cycle without differentiation into ephyrae. Then, a subsequent induction of metamorphosis and full differentiation into ephyrae is believed to occur when the symbionts are acquired by the scyphistomae. Although strobilation induction and differentiation into ephyrae can be accomplished in various ways, a controlled, reproducible metamorphosis induction has not been reported. Such controlled metamorphosis induction is necessary for an ensured synchronicity and reproducibility of biological, biochemical, and molecular analyses. For this purpose, we tested if differentiation could be pharmacologically stimulated as in Aurelia aurita, by the metamorphic inducers thyroxine, KI, NaI, Lugol's iodine, H2O2, indomethacin, or retinol. We found that 50 μM indomethacin reproducibly induced strobilation after six days of exposure, and 10-25 μM after 7 days. Strobilation under optimal conditions reached 80-100% with subsequent ephyrae release after exposure. Thyroxine yielded inconsistent results as it caused strobilation occasionally, while all other chemicals had no effect. Thus, indomethacin can be used as a convenient tool for assessment of biological phenomena through a controlled metamorphic process in C. xamachana scyphistomae.

  1. Endoscopic Evaluation of Adenoids: Reproducibility Analysis of Current Methods

    PubMed Central

    Hermann, Juliana Sato; Sallum, Ana Carolina; Pignatari, Shirley Shizue Nagata

    2013-01-01

    Objectives To investigate intra- and interexaminer reproducibility of usual adenoid hypertrophy assessment methods, according to nasofiberendoscopic examination. Methods Forty children of both sexes, ages ranging between 4 and 14 years, presenting with nasal obstruction and oral breathing suspected to be caused by adenoid hypertrophy, were enrolled in this study. Patients were evaluated by nasofiberendoscopy, and records were referred to and evaluated by two experienced otolaryngologists. Examiners analysed the records according to different evaluation methods, i.e., estimated and measured percentage of choanal occlusion, as well as subjective and objective classificatory systems of adenoid hypertrophy. Results Data disclosed excellent intraexaminer reproducibility for both estimated and measured choanal occlusion. Analysis revealed lower reproducibility rates for estimated than for measured choanal occlusion. Measured choanal occlusion also demonstrated less agreement among evaluations made through the right and left sides of the nasal cavity. Alternatively, intra- and interexaminer reliability analysis revealed higher agreement for the subjective than for the objective classificatory system. In addition, the subjective method demonstrated higher agreement than the objective classificatory system when opposite sides were compared. Conclusion Our results suggest that the measured percentage of choanal occlusion is superior to the estimated one, particularly if employed bilaterally, diminishing the lack of agreement between sides. When adenoid categorization is used instead, the authors recommend the subjective rather than the objective classificatory system of adenoid hypertrophy. PMID:23526477

  2. Indomethacin reproducibly induces metamorphosis in Cassiopea xamachana scyphistomae

    PubMed Central

    Cabrales-Arellano, Patricia; Islas-Flores, Tania; Thomé, Patricia E.

    2017-01-01

    Cassiopea xamachana jellyfish are an attractive model system to study metamorphosis and/or cnidarian–dinoflagellate symbiosis due to the ease of cultivation of their planula larvae and scyphistomae through their asexual cycle, in which the latter can bud new larvae and continue the cycle without differentiation into ephyrae. Then, a subsequent induction of metamorphosis and full differentiation into ephyrae is believed to occur when the symbionts are acquired by the scyphistomae. Although strobilation induction and differentiation into ephyrae can be accomplished in various ways, a controlled, reproducible metamorphosis induction has not been reported. Such controlled metamorphosis induction is necessary for an ensured synchronicity and reproducibility of biological, biochemical, and molecular analyses. For this purpose, we tested if differentiation could be pharmacologically stimulated as in Aurelia aurita, by the metamorphic inducers thyroxine, KI, NaI, Lugol’s iodine, H2O2, indomethacin, or retinol. We found that 50 μM indomethacin reproducibly induced strobilation after six days of exposure, and 10–25 μM after 7 days. Strobilation under optimal conditions reached 80–100% with subsequent ephyrae release after exposure. Thyroxine yielded inconsistent results as it caused strobilation occasionally, while all other chemicals had no effect. Thus, indomethacin can be used as a convenient tool for assessment of biological phenomena through a controlled metamorphic process in C. xamachana scyphistomae. PMID:28265497

  3. Reproducibility of LCA models of crude oil production.

    PubMed

    Vafi, Kourosh; Brandt, Adam R

    2014-11-04

    Scientific models are ideally reproducible, with results that converge despite varying methods. In practice, divergence between models often remains due to varied assumptions, incompleteness, or simply because of avoidable flaws. We examine LCA greenhouse gas (GHG) emissions models to test the reproducibility of their estimates for well-to-refinery inlet gate (WTR) GHG emissions. We use the Oil Production Greenhouse gas Emissions Estimator (OPGEE), an open source engineering-based life cycle assessment (LCA) model, as the reference model for this analysis. We study seven previous studies based on six models. We examine the reproducibility of prior results by successive experiments that align model assumptions and boundaries. The root-mean-square error (RMSE) between results varies between ∼1 and 8 g CO2 eq/MJ LHV when model inputs are not aligned. After model alignment, RMSE generally decreases only slightly. The proprietary nature of some of the models hinders explanations for divergence between the results. Because verification of the results of LCA GHG emissions is often not possible by direct measurement, we recommend the development of open source models for use in energy policy. Such practice will lead to iterative scientific review, improvement of models, and more reliable understanding of emissions.
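
    The divergence metric used above is the ordinary root-mean-square error between two models' field-level estimates; a trivial sketch with invented per-field values follows.

        import math

        def rmse(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

        # Hypothetical WTR emissions (g CO2 eq/MJ LHV) for five oil fields, as
        # estimated by the reference model and by another LCA model.
        reference = [6.2, 9.8, 4.1, 12.5, 7.7]
        other = [5.4, 11.0, 4.9, 10.9, 8.3]
        print(round(rmse(reference, other), 2))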

  4. Establishing a reproducible protocol for measuring index active extension strength.

    PubMed

    Matter-Parrat, V; Hidalgo Diaz, J J; Collon, S; Salazar Botero, S; Prunières, G; Ichihara, S; Facca, S; Liverneaux, P

    2017-02-01

    The goal of this study was to establish a reproducible protocol to measure active extension strength in the index finger. The secondary objectives consisted in correlating the index extension strength, independent of or associated with the other fingers (force of contraction of the extensor indicis proprius), with hand dominance. The population studied consisted of 24 healthy volunteers, including 19 women and 20 right-handed individuals. The independent and dependent index extension strength in each hand was measured three times with a dynamometer by three examiners at Day 0 and again at Day 7. Intra- and inter-examiner reproducibility were, respectively, >0.90 and >0.75 in all cases. The independent extension strength was lower than the dependent one. There was no difference between the independent index extension strength on the dominant and non-dominant sides. The same was true for the dependent strength. Our results show that our protocol is reproducible in measuring independent and dependent index extension strength. Dominance had no influence.

  5. Evaluation of the color reproducibility of all-ceramic restorations fabricated by the digital veneering method

    PubMed Central

    Kim, Jae-Hong; Kim, Ki-Baek; Kim, Woong-Chul; Kim, Hae-Young

    2014-01-01

    PURPOSE The objective of this study was to evaluate the clinical acceptability of all-ceramic crowns fabricated by the digital veneering method vis-à-vis the traditional method. MATERIALS AND METHODS Zirconia specimens manufactured by two different methods, conventional vs digital veneering, with three different thicknesses (0.3 mm, 0.5 mm, 0.7 mm) were prepared for analysis. Color measurement was performed on the prepared specimens using a spectrophotometer. The differences in shade in relation to the build-up method were calculated by quantifying ΔE* (mean color difference), using color difference equations that express the distance between two colors in the three-dimensional space of the measured L*, a*, and b* values. Two-way analysis of variance (ANOVA) combined with a Tukey multiple-range test was used to analyze the data (α=0.05). RESULTS In comparing means and standard deviations of L*, a*, and b* color values, there was no significant difference by manufacturing method or zirconia core thickness according to a two-way ANOVA. The color differences between the two manufacturing methods were in a clinically acceptable range, less than or equal to 3.7, in all the specimens. CONCLUSION Based on the results of this study, careful consideration is necessary when selecting veneering porcelain materials, even if veneering is performed on a small scale. However, because the color reproducibility of the digital veneering system was within the clinically acceptable range when compared with the conventional layering system, successful aesthetic prostheses can be anticipated with this latest technology. PMID:24843390
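
    The statistic behind the comparison above is the CIE76 color difference, the Euclidean distance between two measurements in L*a*b* space, judged against the 3.7 acceptability threshold quoted in the abstract; the shade values below are invented.

        import math

        def delta_e_cie76(lab1, lab2):
            """Euclidean distance between two CIELAB colors."""
            return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

        # Hypothetical mean L*, a*, b* values of a conventionally layered crown and a
        # digitally veneered crown of the same nominal shade.
        conventional = (72.4, 1.8, 18.9)
        digital = (71.1, 2.2, 20.6)
        diff = delta_e_cie76(conventional, digital)
        print(f"deltaE = {diff:.2f} ({'acceptable' if diff <= 3.7 else 'perceptibly different'})")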

  6. General theory of experiment containing reproducible data: The reduction to an ideal experiment

    NASA Astrophysics Data System (ADS)

    Nigmatullin, Raoul R.; Zhang, Wei; Striccoli, Domenico

    2015-10-01

    The authors suggest a general theory for consideration of all experiments associated with measurements of reproducible data in one unified scheme. The suggested algorithm does not contain unjustified suppositions and the final function that is extracted from these measurements can be compared with the hypothesis suggested by the theory adopted for the explanation of the object/phenomenon studied. This true function is free from the influence of the apparatus (instrumental) function and when the "best fit", or the most acceptable hypothesis, is absent, can be presented as a segment of the Fourier series. The discrete set of the decomposition coefficients describes the final function quantitatively and can serve as an intermediate model that coincides with the amplitude-frequency response (AFR) of the object studied. It can be used by theoreticians also for comparison of the suggested theory with experimental observations. Two examples (Raman spectra of distilled water and exchange by packets between two wireless sensor nodes) confirm the basic elements of this general theory. From this general theory the following important conclusions follow: 1. Prony's decomposition should be used for detection of quasi-periodic processes and for quantitative description of reproducible data. 2. The segment of the Fourier series should be used as the fitting function for description of observable data corresponding to an ideal experiment. The transition from the initial Prony's decomposition to the conventional Fourier transform implies also the elimination of the apparatus function that plays an important role in the reproducible data processing. 3. The suggested theory will be helpful for creation of the unified metrological standard (UMS) that should be used in comparison of similar data obtained from the same object studied but in different laboratories with the use of different equipment. 4. Many cases when the conventional theory confirms the experimental
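
    The final step described above, representing the "ideal experiment" as a segment of the Fourier series, reduces to a linear least-squares fit once the repeated measurements have been averaged. A minimal sketch follows; the Prony stage and the removal of the apparatus function are not reproduced, and the signal is synthetic.

        import numpy as np

        def fit_fourier_segment(t, y, n_harmonics, period):
            """Least-squares fit of a0 + sum_k [a_k cos(k w t) + b_k sin(k w t)]."""
            w = 2 * np.pi / period
            columns = [np.ones_like(t)]
            for k in range(1, n_harmonics + 1):
                columns += [np.cos(k * w * t), np.sin(k * w * t)]
            design = np.column_stack(columns)
            coeffs, *_ = np.linalg.lstsq(design, y, rcond=None)
            return coeffs, design @ coeffs

        # Synthetic "reproducible" experiment: ten noisy repetitions of one signal.
        rng = np.random.default_rng(3)
        t = np.linspace(0.0, 1.0, 200)
        truth = 1.0 + 0.8 * np.sin(2 * np.pi * 2 * t) + 0.3 * np.cos(2 * np.pi * 5 * t)
        runs = truth + rng.normal(0.0, 0.2, size=(10, t.size))

        coeffs, fitted = fit_fourier_segment(t, runs.mean(axis=0), n_harmonics=6, period=1.0)
        print(np.round(coeffs, 2))   # the discrete set describing the "ideal" response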

  7. Analytical techniques: A compilation

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A compilation, containing articles on a number of analytical techniques for quality control engineers and laboratory workers, is presented. Data cover techniques for testing electronic, mechanical, and optical systems, nondestructive testing techniques, and gas analysis techniques.

  8. Studying Student Teachers' Acceptance of Role Responsibility.

    ERIC Educational Resources Information Center

    Davis, Michael D.; Davis, Concetta M.

    1980-01-01

    There is variance in the way in which student teachers accept responsibility for the teaching act. This study explains why some variables may affect student teachers' acceptance of role responsibilities. (CM)

  9. [Subjective well-being and self acceptance].

    PubMed

    Makino, Y; Tagami, F

    1998-06-01

    The purpose of the present study was to examine the relationship between subjective well-being and self acceptance, and to design a happiness self-writing program to increase self acceptance and subjective well-being of adolescents. In study 1, we examined the relationship between social interaction and self acceptance. In study 2, we created a happiness self-writing program in cognitive behavioral approach, and examined whether the program promoted self acceptance and subjective well-being. Results indicated that acceptance of self-openness, an aspect of self acceptance, was related to subjective well-being. The happiness self-writing program increased subjective well-being, but it was not found to have increased self acceptance. It was discussed why the program could promote subjective well-being, but not self acceptance.

  10. Canine olfaction as an alternative to analytical instruments for disease diagnosis: understanding 'dog personality' to achieve reproducible results

    EPA Science Inventory

    Recent literature has touted the use of canine olfaction as a diagnostic tool for identifying pre-clinical disease status, especially cancer and infection from biological media samples. Studies have shown a wide range of outcomes, ranging from almost perfect discrimination, all t...

  11. Adaptive learning in complex reproducing kernel Hilbert spaces employing Wirtinger's subgradients.

    PubMed

    Bouboulis, Pantelis; Slavakis, Konstantinos; Theodoridis, Sergios

    2012-03-01

    This paper presents a wide framework for non-linear online supervised learning tasks in the context of complex valued signal processing. The (complex) input data are mapped into a complex reproducing kernel Hilbert space (RKHS), where the learning phase is taking place. Both pure complex kernels and real kernels (via the complexification trick) can be employed. Moreover, any convex, continuous and not necessarily differentiable function can be used to measure the loss between the output of the specific system and the desired response. The only requirement is the subgradient of the adopted loss function to be available in an analytic form. In order to derive analytically the subgradients, the principles of the (recently developed) Wirtinger's calculus in complex RKHS are exploited. Furthermore, both linear and widely linear (in RKHS) estimation filters are considered. To cope with the problem of increasing memory requirements, which is present in almost all online schemes in RKHS, the sparsification scheme, based on projection onto closed balls, has been adopted. We demonstrate the effectiveness of the proposed framework in a non-linear channel identification task, a non-linear channel equalization problem and a quadrature phase shift keying equalization scheme, using both circular and non circular synthetic signal sources.
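
    A compact sketch of online kernel learning on complex-valued data in the spirit of the framework above, using a real Gaussian kernel with complex expansion coefficients, i.e. one simple instance of the complexification route. The Wirtinger-subgradient generality, widely linear estimation and the projection-based sparsification are omitted, and the "channel" below is synthetic.

        import numpy as np

        class ComplexKLMS:
            """Kernel LMS with a real Gaussian kernel and complex coefficients."""
            def __init__(self, step=0.5, sigma=2.0):
                self.step, self.sigma = step, sigma
                self.centers, self.alphas = [], []

            def _kernel(self, u, v):
                # Real-valued Gaussian kernel evaluated on complex scalars.
                return np.exp(-np.abs(u - v) ** 2 / self.sigma ** 2)

            def predict(self, u):
                return sum(a * self._kernel(c, u) for c, a in zip(self.centers, self.alphas))

            def update(self, u, d):
                e = d - self.predict(u)            # complex a priori error
                self.centers.append(u)             # grow the kernel expansion by one term
                self.alphas.append(self.step * e)
                return e

        # Identify a mildly nonlinear complex "channel" online (synthetic data).
        rng = np.random.default_rng(5)
        x = rng.normal(size=500) + 1j * rng.normal(size=500)
        d = x + 0.1 * x ** 2                       # unknown nonlinearity to be learned
        model = ComplexKLMS()
        errors = [abs(model.update(u, dv)) for u, dv in zip(x, d)]
        # The absolute error typically shrinks as the expansion grows.
        print(round(float(np.mean(errors[:50])), 3), round(float(np.mean(errors[-50:])), 3))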

  12. Reproducing neutrino effects on the matter power spectrum through a degenerate Fermi gas approach

    SciTech Connect

    Perico, E.L.D.; Bernardini, A.E. E-mail: alexeb@ufscar.br

    2011-06-01

    Modifications on the predictions about the matter power spectrum based on the hypothesis of a tiny contribution from a degenerate Fermi gas (DFG) test-fluid to some dominant cosmological scenario are investigated. Reporting about the systematic way of accounting for all the cosmological perturbations, through the Boltzmann equation we obtain the analytical results for density fluctuation, δ, and fluid velocity divergence, θ, of the DFG. Small contributions to the matter power spectrum are analytically obtained for the radiation-dominated background, through an ultra-relativistic approximation, and for the matter-dominated and Λ-dominated eras, through a non-relativistic approximation. The results can be numerically reproduced and compared with those of considering non-relativistic and ultra-relativistic neutrinos into the computation of the matter power spectrum. Lessons concerning the formation of large scale structures of a DFG are depicted, and consequent deviations from standard ΛCDM predictions for the matter power spectrum (with and without neutrinos) are quantified.

  13. ENVIRONMENTAL ANALYTICAL CHEMISTRY OF ...

    EPA Pesticide Factsheets

    Within the scope of a number of emerging contaminant issues in environmental analysis, one area that has received a great deal of public interest has been the assessment of the role of pharmaceuticals and personal care products (PPCPs) as stressors and agents of change in ecosystems as well as their role in unplanned human exposure. The relationship between personal actions and the occurrence of PPCPs in the environment is clear-cut and comprehensible to the public. In this overview, we attempt to examine the separations aspect of the analytical approach to the vast array of potential analytes among this class of compounds. We also highlight the relationship between these compounds and endocrine disrupting compounds (EDCs) and between PPCPs and EDCs and the more traditional environmental analytes such as the persistent organic pollutants (POPs). Although the spectrum of chemical behavior extends from hydrophobic to hydrophilic, the current focus has shifted to moderately and highly polar analytes. Thus, emphasis on HPLC and LC/MS has grown and MS/MS has become a detection technique of choice with either electrospray ionization or atmospheric pressure chemical ionization. This contrasts markedly with the bench mark approach of capillary GC, GC/MS and electron ionization in traditional environmental analysis. The expansion of the analyte list has fostered new vigor in the development of environmental analytical chemistry, modernized the range of tools appli

  14. 48 CFR 2911.103 - Market acceptance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    Title 48, Federal Acquisition Regulations System, Volume 7 (2011-10-01), Section 2911.103, Market acceptance, under DESCRIBING AGENCY NEEDS, Selecting and Developing Requirements Documents: The ... offered have either achieved commercial market acceptance or been satisfactorily supplied to an ...

  15. 48 CFR 11.103 - Market acceptance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    Title 48, Federal Acquisition Regulations System, Volume 1 (2011-10-01), Section 11.103, Market acceptance, under DESCRIBING AGENCY NEEDS, Selecting and Developing Requirements Documents: (a) Section ... either (i) achieved commercial market acceptance; or (ii) been satisfactorily supplied to an ...

  16. Older Adults' Acceptance of Information Technology

    ERIC Educational Resources Information Center

    Wang, Lin; Rau, Pei-Luen Patrick; Salvendy, Gavriel

    2011-01-01

    This study investigated variables contributing to older adults' information technology acceptance through a survey, which was used to find factors explaining and predicting older adults' information technology acceptance behaviors. Four factors, including needs satisfaction, perceived usability, support availability, and public acceptance, were…

  17. 46 CFR 28.73 - Accepted organizations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 1 2011-10-01 2011-10-01 false Accepted organizations. 28.73 Section 28.73 Shipping... INDUSTRY VESSELS General Provisions § 28.73 Accepted organizations. An organization desiring to be designated by the Commandant as an accepted organization must request such designation in writing. As...

  18. 46 CFR 28.73 - Accepted organizations.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 1 2014-10-01 2014-10-01 false Accepted organizations. 28.73 Section 28.73 Shipping... INDUSTRY VESSELS General Provisions § 28.73 Accepted organizations. An organization desiring to be designated by the Commandant as an accepted organization must request such designation in writing. As...

  19. 46 CFR 28.73 - Accepted organizations.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 1 2013-10-01 2013-10-01 false Accepted organizations. 28.73 Section 28.73 Shipping... INDUSTRY VESSELS General Provisions § 28.73 Accepted organizations. An organization desiring to be designated by the Commandant as an accepted organization must request such designation in writing. As...

  20. 46 CFR 28.73 - Accepted organizations.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 1 2012-10-01 2012-10-01 false Accepted organizations. 28.73 Section 28.73 Shipping... INDUSTRY VESSELS General Provisions § 28.73 Accepted organizations. An organization desiring to be designated by the Commandant as an accepted organization must request such designation in writing. As...

  1. 46 CFR 28.73 - Accepted organizations.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 1 2010-10-01 2010-10-01 false Accepted organizations. 28.73 Section 28.73 Shipping... INDUSTRY VESSELS General Provisions § 28.73 Accepted organizations. An organization desiring to be designated by the Commandant as an accepted organization must request such designation in writing. As...

  2. 48 CFR 2911.103 - Market acceptance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... offered have either achieved commercial market acceptance or been satisfactorily supplied to an agency... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Market acceptance. 2911... DESCRIBING AGENCY NEEDS Selecting And Developing Requirements Documents 2911.103 Market acceptance....

  3. 21 CFR 820.86 - Acceptance status.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Acceptance status. 820.86 Section 820.86 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES QUALITY SYSTEM REGULATION Acceptance Activities § 820.86 Acceptance status. Each manufacturer...

  4. 48 CFR 11.103 - Market acceptance.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Market acceptance. 11.103... DESCRIBING AGENCY NEEDS Selecting and Developing Requirements Documents 11.103 Market acceptance. (a) Section... either— (i) Achieved commercial market acceptance; or (ii) Been satisfactorily supplied to an...

  5. 48 CFR 11.103 - Market acceptance.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Market acceptance. 11.103... DESCRIBING AGENCY NEEDS Selecting and Developing Requirements Documents 11.103 Market acceptance. (a) 41 U.S...) Achieved commercial market acceptance; or (ii) Been satisfactorily supplied to an agency under current...

  6. 48 CFR 2911.103 - Market acceptance.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 7 2013-10-01 2012-10-01 true Market acceptance. 2911.103... DESCRIBING AGENCY NEEDS Selecting And Developing Requirements Documents 2911.103 Market acceptance. The... offered have either achieved commercial market acceptance or been satisfactorily supplied to an...

  7. 48 CFR 11.103 - Market acceptance.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Market acceptance. 11.103... DESCRIBING AGENCY NEEDS Selecting and Developing Requirements Documents 11.103 Market acceptance. (a) Section... either— (i) Achieved commercial market acceptance; or (ii) Been satisfactorily supplied to an...

  8. 48 CFR 2911.103 - Market acceptance.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 7 2012-10-01 2012-10-01 false Market acceptance. 2911... DESCRIBING AGENCY NEEDS Selecting And Developing Requirements Documents 2911.103 Market acceptance. The... offered have either achieved commercial market acceptance or been satisfactorily supplied to an...

  9. 48 CFR 2911.103 - Market acceptance.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 7 2014-10-01 2014-10-01 false Market acceptance. 2911... DESCRIBING AGENCY NEEDS Selecting And Developing Requirements Documents 2911.103 Market acceptance. The... offered have either achieved commercial market acceptance or been satisfactorily supplied to an...

  10. The relation between remembered parental acceptance in childhood and self-acceptance among young Turkish adults.

    PubMed

    Kuyumcu, Behire; Rohner, Ronald P

    2016-05-11

    This study examined the relation between young adults' age and remembrances of parental acceptance in childhood, and their current self-acceptance. The study was based on a sample of 236 young adults in Turkey (139 women and 97 men). The adult version of the Parental Acceptance-Rejection/Control Questionnaire for mothers and fathers, along with the Self-Acceptance subscale of the Psychological Well-Being Scale and the Personal Information Form, were used as measures. Results showed that both men and women tended to remember having been accepted in childhood by both their mothers and fathers. Women, however, reported more maternal and paternal acceptance in childhood than did men. Similarly, the level of self-acceptance was high among both men and women. However, women's self-acceptance was higher than men's. Correlational analyses showed that self-acceptance was positively related to remembrances of maternal and paternal acceptance among both women and men. Results indicated that age and remembered paternal acceptance significantly predicted women's self-acceptance. Age and remembered maternal acceptance made significant and independent contributions to men's self-acceptance. Men's remembrances of paternal acceptance in childhood did not make a significant contribution to their self-acceptance. Finally, the relation between women's age and self-acceptance was significantly moderated by remembrances of paternal acceptance in childhood.

  11. Formulation of a candidate glass for use as an acceptance test standard material

    SciTech Connect

    Ebert, W.L.; Strachan, D.M.; Wolf, S.F.

    1998-04-01

    In this report, the authors discuss the formulation of a glass that will be used in a laboratory testing program designed to measure the precision of test methods identified in the privatization contracts for the immobilization of Hanford low-activity wastes. Tests will be conducted with that glass to measure the reproducibility of tests and analyses that must be performed by glass producers as a part of the product acceptance procedure. Test results will be used to determine whether the contractually required tests and analyses are adequate for evaluating the acceptability of likely immobilized low-activity waste (ILAW) products. They will also be used to evaluate whether the glass designed for use in these tests can be used as an analytical standard test material for verifying results reported by vendors for tests with ILAW products. The results of those tests and analyses will be presented in a separate report. The purpose of this report is to document the strategy used to formulate the glass to be used in the testing program. The low-activity waste reference glass LRM that will be used in the testing program was formulated to be compositionally similar to ILAW products to be made with wastes from Hanford. Since the ILAW product compositions have not been disclosed by the vendors participating in the Hanford privatization project, the composition of LRM was formulated based on a simulated Hanford waste stream and amounts of added glass-forming chemicals typical for vitrified waste forms. The major components are 54 mass % SiO₂, 20 mass % Na₂O, 10 mass % Al₂O₃, 8 mass % B₂O₃, and 1.5 mass % K₂O. Small amounts of other chemicals not present in Hanford wastes were also included in the glass, since they may be included as chemical additives in ILAW products. This was done so that the use of LRM as a composition standard could be evaluated. Radionuclides were not included in LRM because a nonradioactive material was desired.

  12. Venusian Polar Vortex reproduced by a general circulation model

    NASA Astrophysics Data System (ADS)

    Ando, Hiroki; Sugimoto, Norihiko; Takagi, Masahiro

    2016-10-01

    Unlike the polar vortices observed in the Earth, Mars and Titan atmospheres, the observed Venus polar vortex is warmer than the mid-latitudes at cloud-top levels (~65 km). This warm polar vortex is zonally surrounded by a cold latitude band located at ~60 degree latitude, a unique feature called the 'cold collar' of the Venus atmosphere [e.g. Taylor et al. 1980; Piccioni et al. 2007]. Although these structures have been seen in numerous previous observations, the formation mechanism is still unknown. In addition, an axi-asymmetric feature is always seen in the warm polar vortex. It changes temporally and sometimes shows a hot polar dipole or S-shaped structure, as shown by numerous infrared measurements [e.g. Garate-Lopez et al. 2013; 2015]. However, its vertical structure has not been investigated. To address these problems, we performed a numerical simulation of the Venus atmospheric circulation using a general circulation model named AFES for Venus [Sugimoto et al. 2014] and reproduced these puzzling features. The reproduced atmospheric structures and the axi-asymmetric feature are compared with previous observational results. In addition, a quasi-periodic fluctuation of the zonal-mean zonal wind is seen in the Venus polar vortex reproduced in our model. This might explain some observational results [e.g. Luz et al. 2007] and implies that polar vacillation might also occur in the Venus atmosphere, similar to what occurs in the Earth's polar atmosphere. We will also show some initial results on this point in this presentation.

  13. Reproducibility and Transparency in Ocean-Climate Modeling

    NASA Astrophysics Data System (ADS)

    Hannah, N.; Adcroft, A.; Hallberg, R.; Griffies, S. M.

    2015-12-01

    Reproducibility is a cornerstone of the scientific method. Within geophysical modeling and simulation, achieving reproducibility can be difficult, especially given the complexity of numerical codes, enormous and disparate data sets, and the variety of supercomputing technology. We have made progress on this problem in the context of a large project - the development of new ocean and sea ice models, MOM6 and SIS2. Here we present useful techniques and experience. We use version control not only for code but for the entire experiment working directory, including configuration (run-time parameters, component versions), input data and checksums on experiment output. This allows us to document when the solutions to experiments change, whether due to code updates or changes in input data. To avoid distributing large input datasets we provide the tools for generating these from the sources, rather than provide raw input data. Bugs can be a source of non-determinism and hence irreproducibility, e.g. reading from or branching on uninitialized memory. To expose these we routinely run system tests, using a memory debugger, multiple compilers and different machines. Additional confidence in the code comes from specialised tests, for example automated dimensional analysis and domain transformations. This has entailed adopting a code style where we deliberately restrict what a compiler can do when re-arranging mathematical expressions. In the spirit of open science, all development is in the public domain. This leads to a positive feedback, where increased transparency and reproducibility make using the model easier for external collaborators, who in turn provide valuable contributions. To facilitate users installing and running the model we provide (version controlled) digital notebooks that illustrate and record analysis of output. This has the dual role of providing a gross, platform-independent testing capability and a means to document model output and analysis.
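
    As a concrete illustration of the checksum bookkeeping described above, here is a minimal, hypothetical Python sketch (the file layout, function names and manifest format are assumptions, not taken from the MOM6/SIS2 project) that records SHA-256 checksums of experiment output and reports files whose solutions have changed between runs:

        import hashlib
        import json
        import pathlib

        def sha256_of(path, chunk=1 << 20):
            """Return the SHA-256 hex digest of a file, read in chunks."""
            digest = hashlib.sha256()
            with open(path, "rb") as handle:
                for block in iter(lambda: handle.read(chunk), b""):
                    digest.update(block)
            return digest.hexdigest()

        def build_manifest(output_dir, manifest="checksums.json"):
            """Record a checksum for every file in the experiment output directory."""
            sums = {p.name: sha256_of(p)
                    for p in sorted(pathlib.Path(output_dir).iterdir()) if p.is_file()}
            pathlib.Path(manifest).write_text(json.dumps(sums, indent=2))

        def changed_outputs(output_dir, manifest="checksums.json"):
            """List files whose current contents no longer match the manifest."""
            recorded = json.loads(pathlib.Path(manifest).read_text())
            return [name for name, digest in recorded.items()
                    if sha256_of(pathlib.Path(output_dir) / name) != digest]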

  14. Validity and Reproducibility of a Spanish Dietary History

    PubMed Central

    Guallar-Castillón, Pilar; Sagardui-Villamor, Jon; Balboa-Castillo, Teresa; Sala-Vila, Aleix; Ariza Astolfi, Mª José; Sarrión Pelous, Mª Dolores; León-Muñoz, Luz María; Graciani, Auxiliadora; Laclaustra, Martín; Benito, Cristina; Banegas, José Ramón; Artalejo, Fernando Rodríguez

    2014-01-01

    Objective To assess the validity and reproducibility of food and nutrient intake estimated with the electronic diet history of ENRICA (DH-E), which collects information on numerous aspects of the Spanish diet. Methods The validity of food and nutrient intake was estimated using Pearson correlation coefficients between the DH-E and the mean of seven 24-hour recalls collected every 2 months over the previous year. The reproducibility was estimated using intraclass correlation coefficients between two DH-E made one year apart. Results The correlation coefficients between the DH-E and the mean of seven 24-hour recalls for the main food groups were cereals (r = 0.66), meat (r = 0.66), fish (r = 0.42), vegetables (r = 0.62) and fruits (r = 0.44). The mean correlation coefficient for all 15 food groups considered was 0.53. The correlations for macronutrients were: energy (r = 0.76), proteins (r = 0.58), lipids (r = 0.73), saturated fat (r = 0.73), monounsaturated fat (r = 0.59), polyunsaturated fat (r = 0.57), and carbohydrates (r = 0.66). The mean correlation coefficient for all 41 nutrients studied was 0.55. The intraclass correlation coefficient between the two DH-E was greater than 0.40 for most foods and nutrients. Conclusions The DH-E shows good validity and reproducibility for estimating usual intake of foods and nutrients. PMID:24465878
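
    A minimal Python sketch of the two statistics used in the record above, assuming paired intake estimates stored as NumPy arrays (variable names are illustrative, and the exact ICC variant used in the study may differ):

        import numpy as np

        def pearson_validity(dhe, recall_mean):
            """Validity: Pearson correlation between diet-history estimates
            and the mean of repeated 24-hour recalls for the same subjects."""
            return np.corrcoef(dhe, recall_mean)[0, 1]

        def icc_oneway(admin1, admin2):
            """Reproducibility: one-way random-effects intraclass correlation
            between two administrations of the diet history (k = 2)."""
            data = np.column_stack([admin1, admin2])
            n, k = data.shape
            subject_means = data.mean(axis=1)
            ms_between = k * np.sum((subject_means - data.mean()) ** 2) / (n - 1)
            ms_within = np.sum((data - subject_means[:, None]) ** 2) / (n * (k - 1))
            return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)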

  15. Efficient and reproducible mammalian cell bioprocesses without probes and controllers?

    PubMed

    Tissot, Stéphanie; Oberbek, Agata; Reclari, Martino; Dreyer, Matthieu; Hacker, David L; Baldi, Lucia; Farhat, Mohamed; Wurm, Florian M

    2011-07-01

    Bioprocesses for recombinant protein production with mammalian cells are typically controlled for several physicochemical parameters including the pH and dissolved oxygen concentration (DO) of the culture medium. Here we studied whether these controls are necessary for efficient and reproducible bioprocesses in an orbitally shaken bioreactor (OSR). Mixing, gas transfer, and volumetric power consumption (P(V)) were determined in both a 5-L OSR and a 3-L stirred-tank bioreactor (STR). The two cultivation systems had a similar mixing intensity, but the STR had a lower volumetric mass transfer coefficient of oxygen (k(L)a) and a higher P(V) than the OSR. Recombinant CHO cell lines expressing either tumor necrosis factor receptor as an Fc fusion protein (TNFR:Fc) or an anti-RhesusD monoclonal antibody were cultivated in the two systems. The 5-L OSR was operated in an incubator shaker with 5% CO(2) in the gas environment but without pH and DO control whereas the STR was operated with or without pH and DO control. Higher cell densities and recombinant protein titers were obtained in the OSR as compared to both the controlled and the non-controlled STRs. To test the reproducibility of a bioprocess in a non-controlled OSR, the two CHO cell lines were each cultivated in parallel in six 5-L OSRs. Similar cell densities, cell viabilities, and recombinant protein titers along with similar pH and DO profiles were achieved in each group of replicates. Our study demonstrated that bioprocesses can be performed in OSRs without pH or DO control in a highly reproducible manner, at least at the scale of operation studied here.

  16. Emergence of reproducible spatiotemporal activity during motor learning.

    PubMed

    Peters, Andrew J; Chen, Simon X; Komiyama, Takaki

    2014-06-12

    The motor cortex is capable of reliably driving complex movements yet exhibits considerable plasticity during motor learning. These observations suggest that the fundamental relationship between motor cortex activity and movement may not be fixed but is instead shaped by learning; however, to what extent and how motor learning shapes this relationship are not fully understood. Here we addressed this issue by using in vivo two-photon calcium imaging to monitor the activity of the same population of hundreds of layer 2/3 neurons while mice learned a forelimb lever-press task over two weeks. Excitatory and inhibitory neurons were identified by transgenic labelling. Inhibitory neuron activity was relatively stable and balanced local excitatory neuron activity on a movement-by-movement basis, whereas excitatory neuron activity showed higher dynamism during the initial phase of learning. The dynamics of excitatory neurons during the initial phase involved the expansion of the movement-related population which explored various activity patterns even during similar movements. This was followed by a refinement into a smaller population exhibiting reproducible spatiotemporal sequences of activity. This pattern of activity associated with the learned movement was unique to expert animals and not observed during similar movements made during the naive phase, and the relationship between neuronal activity and individual movements became more consistent with learning. These changes in population activity coincided with a transient increase in dendritic spine turnover in these neurons. Our results indicate that a novel and reproducible activity-movement relationship develops as a result of motor learning, and we speculate that synaptic plasticity within the motor cortex underlies the emergence of reproducible spatiotemporal activity patterns for learned movements. These results underscore the profound influence of learning on the way that the cortex produces movements.

  17. Psychophysiological responses to pain identify reproducible human clusters.

    PubMed

    Farmer, Adam D; Coen, Steven J; Kano, Michiko; Paine, Peter A; Shwahdi, Mustafa; Jafari, Jafar; Kishor, Jessin; Worthen, Sian F; Rossiter, Holly E; Kumari, Veena; Williams, Steven C R; Brammer, Michael; Giampietro, Vincent P; Droney, Joanne; Riley, Julia; Furlong, Paul L; Knowles, Charles H; Lightman, Stafford L; Aziz, Qasim

    2013-11-01

    Pain is a ubiquitous yet highly variable experience. The psychophysiological and genetic factors responsible for this variability remain unresolved. We hypothesised the existence of distinct human pain clusters (PCs) composed of distinct psychophysiological and genetic profiles coupled with differences in the perception and the brain processing of pain. We studied 120 healthy subjects in whom the baseline personality and anxiety traits and the serotonin transporter-linked polymorphic region (5-HTTLPR) genotype were measured. Real-time autonomic nervous system parameters and serum cortisol were measured at baseline and after standardised visceral and somatic pain stimuli. Brain processing reactions to visceral pain were studied in 29 subjects using functional magnetic resonance imaging (fMRI). The reproducibility of the psychophysiological responses to pain was assessed at 1 year. In group analysis, visceral and somatic pain caused an expected increase in sympathetic and cortisol responses and activated the pain matrix according to fMRI studies. However, using cluster analysis, we found 2 reproducible PCs: at baseline, PC1 had higher neuroticism/anxiety scores (P ≤ 0.01); greater sympathetic tone (P<0.05); and higher cortisol levels (P ≤ 0.001). During pain, less stimulus was tolerated (P ≤ 0.01), and there was an increase in parasympathetic tone (P ≤ 0.05). The 5-HTTLPR short allele was over-represented (P ≤ 0.005). PC2 had the converse profile at baseline and during pain. Brain activity differed (P ≤ 0.001); greater activity occurred in the left frontal cortex in PC1, whereas PC2 showed greater activity in the right medial/frontal cortex and right anterior insula. In health, 2 distinct reproducible PCs exist in humans. In the future, PC characterization may help to identify subjects at risk for developing chronic pain and may reduce variability in brain imaging studies.

  18. Brugada phenocopy clinical reproducibility demonstrated by recurrent hypokalemia.

    PubMed

    Genaro, Natalia R; Anselm, Daniel D; Cervino, Nahuel; Estevez, Ariel O; Perona, Carlos; Villamil, Alejandro M; Kervorkian, Ruben; Baranchuk, Adrian

    2014-07-01

    Brugada phenocopies (BrP) are clinical entities that are etiologically distinct from true congenital Brugada syndrome (BrS). BrP are characterized by type 1 or type 2 Brugada electrocardiogram (ECG) patterns in precordial leads V1-V3; however, BrP are elicited by various underlying clinical conditions such as electrolyte disturbances, myocardial ischemia, or poor ECG filters. In this report, we describe the first case of clinically reproducible BrP, which is important to the conceptual evolution of BrP.

  19. INFRARED IMAGING OF CARBON AND CERAMIC COMPOSITES: DATA REPRODUCIBILITY

    SciTech Connect

    Knight, B.; Howard, D. R.; Ringermacher, H. I.; Hudson, L. D.

    2010-02-22

    Infrared NDE techniques have proven to be superior for imaging of flaws in ceramic matrix composites (CMC) and carbon silicon carbide composites (C/SiC). Not only can one obtain accurate depth gauging of flaws such as delaminations and layered porosity in complex-shaped components such as airfoils and other aeronautical components, but also excellent reproducibility of image data is obtainable using the STTOF (Synthetic Thermal Time-of-Flight) methodology. The imaging of large complex shapes is fast and reliable. This methodology as applied to large C/SiC flight components at the NASA Dryden Flight Research Center will be described.

  20. Reproducing continuous radio blackout using glow discharge plasma

    SciTech Connect

    Xie, Kai; Li, Xiaoping; Liu, Donglin; Shao, Mingxu; Zhang, Hanlu

    2013-10-15

    A novel plasma generator is described that offers large-scale, continuous, non-magnetized plasma with a 30-cm-diameter hollow structure, which provides a path for an electromagnetic wave. The plasma is excited by a low-pressure glow discharge, with varying electron densities ranging from 10⁹ to 2.5 × 10¹¹ cm⁻³. An electromagnetic wave propagation experiment reproduced a continuous radio blackout in UHF-, L-, and S-bands. The results are consistent with theoretical expectations. The proposed method is suitable for simulating a plasma sheath, and for researching communications, navigation, electromagnetic mitigations, and antenna compensation in plasma sheaths.

  1. Data quality in predictive toxicology: reproducibility of rodent carcinogenicity experiments.

    PubMed Central

    Gottmann, E; Kramer, S; Pfahringer, B; Helma, C

    2001-01-01

    We compared 121 replicate rodent carcinogenicity assays from the two parts (National Cancer Institute/National Toxicology Program and literature) of the Carcinogenic Potency Database (CPDB) to estimate the reliability of these experiments. We estimated a concordance of 57% between the overall rodent carcinogenicity classifications from both sources. This value did not improve substantially when additional biologic information (species, sex, strain, target organs) was considered. These results indicate that rodent carcinogenicity assays are much less reproducible than previously expected, an effect that should be considered in the development of structure-activity relationship models and the risk assessment process. PMID:11401763

  2. Acceptability of GM foods among Pakistani consumers.

    PubMed

    Ali, Akhter; Rahut, Dil Bahadur; Imtiaz, Muhammad

    2016-04-02

    In Pakistan, the majority of consumers do not have information about genetically modified (GM) foods. In developing countries, and particularly in Pakistan, few studies have focused on consumers' acceptability of GM foods. Using a comprehensive primary dataset collected from 320 consumers in Pakistan in 2013, this study analyzes the determinants of consumers' acceptability of GM foods. The data were analyzed by employing the bivariate probit model and censored least absolute deviation (CLAD) models. The empirical results indicated that urban consumers are more aware of GM foods than rural consumers. Acceptance of GM foods was higher among female consumers than among male consumers. In addition, older consumers were more willing to accept GM foods than younger consumers. The acceptability of GM foods was also higher among wealthier households. Low price is the key factor leading to the acceptability of GM foods. The acceptability of GM foods also reduces the risks among Pakistani consumers.

  3. Consumer Acceptability of Intramuscular Fat

    PubMed Central

    Frank, Damian; Joo, Seon-Tea

    2016-01-01

    Fat in meat greatly improves eating quality, yet many consumers avoid visible fat, mainly because of health concerns. Generations of consumers, especially in the English-speaking world, have been convinced by health authorities that animal fat, particularly saturated or solid fat, should be reduced or avoided to maintain a healthy diet. Decades of negative messages regarding animal fats have resulted in general avoidance of fatty cuts of meat. Paradoxically, low fat or lean meat tends to have poor eating quality and flavor and low consumer acceptability. The failure of low-fat high-carbohydrate diets to curb “globesity” has prompted many experts to re-evaluate the place of fat in human diets, including animal fat. Attitudes towards fat vary dramatically between and within cultures. Previous generations of humans sought out fatty cuts of meat for their superior sensory properties. Many consumers in East and Southeast Asia have traditionally valued more fatty meat cuts. As nutritional messages around dietary fat change, there is evidence that attitudes towards animal fat are changing and many consumers are rediscovering and embracing fattier cuts of meat, including marbled beef. The present work provides a short overview of the unique sensory characteristics of marbled beef and changing consumer preferences for fat in meat in general. PMID:28115880

  4. Robust estimation of fractal measures for characterizing the structural complexity of the human brain: optimization and reproducibility.

    PubMed

    Goñi, Joaquín; Sporns, Olaf; Cheng, Hu; Aznárez-Sanado, Maite; Wang, Yang; Josa, Santiago; Arrondo, Gonzalo; Mathews, Vincent P; Hummer, Tom A; Kronenberger, William G; Avena-Koenigsberger, Andrea; Saykin, Andrew J; Pastor, María A

    2013-12-01

    High-resolution isotropic three-dimensional reconstructions of human brain gray and white matter structures can be characterized to quantify aspects of their shape, volume and topological complexity. In particular, methods based on fractal analysis have been applied in neuroimaging studies to quantify the structural complexity of the brain in both healthy and impaired conditions. The usefulness of such measures for characterizing individual differences in brain structure critically depends on their within-subject reproducibility in order to allow the robust detection of between-subject differences. This study analyzes key analytic parameters of three fractal-based methods that rely on the box-counting algorithm with the aim to maximize within-subject reproducibility of the fractal characterizations of different brain objects, including the pial surface, the cortical ribbon volume, the white matter volume and the gray matter/white matter boundary. Two separate datasets originating from different imaging centers were analyzed, comprising 50 subjects with three and 24 subjects with four successive scanning sessions per subject, respectively. The reproducibility of fractal measures was statistically assessed by computing their intra-class correlations. Results reveal differences between different fractal estimators and allow the identification of several parameters that are critical for high reproducibility. Highest reproducibility with intra-class correlations in the range of 0.9-0.95 is achieved with the correlation dimension. Further analyses of the fractal dimensions of parcellated cortical and subcortical gray matter regions suggest robustly estimated and region-specific patterns of individual variability. These results are valuable for defining appropriate parameter configurations when studying changes in fractal descriptors of human brain structure, for instance in studies of neurological diseases that do not allow repeated measurements or for disease
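
    As a simplified illustration of the box-counting step underlying such fractal estimators (a hypothetical sketch, not the authors' implementation), the fractal dimension of a binary 3-D brain mask can be estimated from the slope of log N(s) against log(1/s):

        import numpy as np

        def box_count_dimension(mask, box_sizes=(2, 4, 8, 16, 32)):
            """Estimate the box-counting (fractal) dimension of a binary 3-D mask."""
            counts = []
            for s in box_sizes:
                # Trim the volume so each axis is divisible by the box size.
                trimmed = mask[:mask.shape[0] // s * s,
                               :mask.shape[1] // s * s,
                               :mask.shape[2] // s * s]
                blocks = trimmed.reshape(trimmed.shape[0] // s, s,
                                         trimmed.shape[1] // s, s,
                                         trimmed.shape[2] // s, s)
                # Count boxes containing at least one foreground voxel.
                counts.append(blocks.any(axis=(1, 3, 5)).sum())
            # N(s) ~ s^(-D), so the slope of log N versus log(1/s) estimates D.
            slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)), np.log(counts), 1)
            return slope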

  5. Robust estimation of fractal measures for characterizing the structural complexity of the human brain: optimization and reproducibility

    PubMed Central

    Goñi, Joaquín; Sporns, Olaf; Cheng, Hu; Aznárez-Sanado, Maite; Wang, Yang; Josa, Santiago; Arrondo, Gonzalo; Mathews, Vincent P; Hummer, Tom A; Kronenberger, William G; Avena-Koenigsberger, Andrea; Saykin, Andrew J.; Pastor, María A.

    2013-01-01

    High-resolution isotropic three-dimensional reconstructions of human brain gray and white matter structures can be characterized to quantify aspects of their shape, volume and topological complexity. In particular, methods based on fractal analysis have been applied in neuroimaging studies to quantify the structural complexity of the brain in both healthy and impaired conditions. The usefulness of such measures for characterizing individual differences in brain structure critically depends on their within-subject reproducibility in order to allow the robust detection of between-subject differences. This study analyzes key analytic parameters of three fractal-based methods that rely on the box-counting algorithm with the aim to maximize within-subject reproducibility of the fractal characterizations of different brain objects, including the pial surface, the cortical ribbon volume, the white matter volume and the gray matter/white matter boundary. Two separate datasets originating from different imaging centers were analyzed, comprising 50 subjects with three and 24 subjects with four successive scanning sessions per subject, respectively. The reproducibility of fractal measures was statistically assessed by computing their intra-class correlations. Results reveal differences between different fractal estimators and allow the identification of several parameters that are critical for high reproducibility. Highest reproducibility with intra-class correlations in the range of 0.9–0.95 is achieved with the correlation dimension. Further analyses of the fractal dimensions of parcellated cortical and subcortical gray matter regions suggest robustly estimated and region-specific patterns of individual variability. These results are valuable for defining appropriate parameter configurations when studying changes in fractal descriptors of human brain structure, for instance in studies of neurological diseases that do not allow repeated measurements or for disease

  6. Percolating silicon nanowire networks with highly reproducible electrical properties.

    PubMed

    Serre, Pauline; Mongillo, Massimo; Periwal, Priyanka; Baron, Thierry; Ternon, Céline

    2015-01-09

    Here, we report the morphological and electrical properties of self-assembled silicon nanowire networks, also called Si nanonets. At the macroscopic scale, the nanonets involve several millions of nanowires. The observed properties therefore result from large-scale statistical averaging, thus minimizing the discrepancies that occur from one nanowire to another. Si nanonets obtained with a standard filtration procedure are highly reproducible in terms of their morphology, with the Si nanowire density precisely controlled during nanonet preparation. In contrast to individual Si nanowires, the electrical properties of Si nanonets are highly consistent, as demonstrated here by the similar electrical properties obtained in hundreds of Si nanonet-based devices. The evolution of the Si nanonet conductance with Si nanowire density demonstrates that Si nanonets behave like standard percolating media despite the presence of numerous nanowire-nanowire intersecting junctions within the nanonets and the native oxide shell surrounding the Si nanowires. Moreover, when silicon oxidation is prevented or controlled, the electrical properties of Si nanonets are stable over many months. As a consequence, Si nanowire-based nanonets constitute a promising flexible material with stable and reproducible electrical properties at the macroscopic scale while being composed of nanoscale components, which confirms the Si nanonet potential for a wide range of applications including flexible electronics, sensing and photovoltaic applications.

  7. Ranking and averaging independent component analysis by reproducibility (RAICAR).

    PubMed

    Yang, Zhi; LaConte, Stephen; Weng, Xuchu; Hu, Xiaoping

    2008-06-01

    Independent component analysis (ICA) is a data-driven approach that has exhibited great utility for functional magnetic resonance imaging (fMRI). Standard ICA implementations, however, do not provide the number and relative importance of the resulting components. In addition, ICA algorithms utilizing gradient-based optimization give decompositions that are dependent on initialization values, which can lead to dramatically different results. In this work, a new method, RAICAR (Ranking and Averaging Independent Component Analysis by Reproducibility), is introduced to address these issues for spatial ICA applied to fMRI. RAICAR utilizes repeated ICA realizations and relies on the reproducibility between them to rank and select components. Different realizations are aligned based on correlations, leading to aligned components. Each component is ranked and thresholded based on between-realization correlations. Furthermore, different realizations of each aligned component are selectively averaged to generate the final estimate of the given component. Reliability and accuracy of this method are demonstrated with both simulated and experimental fMRI data.
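
    In the same spirit (a simplified sketch, not the published RAICAR algorithm), components from two ICA realizations can be greedily paired by the absolute spatial correlation of their maps; the `maps_*` arrays are assumed to hold one spatial map per row:

        import numpy as np

        def match_components(maps_a, maps_b):
            """Greedily pair components of two ICA realizations by the absolute
            Pearson correlation of their spatial maps (rows = components)."""
            za = (maps_a - maps_a.mean(1, keepdims=True)) / maps_a.std(1, keepdims=True)
            zb = (maps_b - maps_b.mean(1, keepdims=True)) / maps_b.std(1, keepdims=True)
            corr = np.abs(za @ zb.T) / maps_a.shape[1]    # |correlation| matrix
            pairs = []
            for _ in range(min(corr.shape)):
                i, j = np.unravel_index(np.argmax(corr), corr.shape)
                pairs.append((i, j, corr[i, j]))
                corr[i, :] = -np.inf                      # exclude matched row
                corr[:, j] = -np.inf                      # and matched column
            return pairs                                  # (index_a, index_b, |r|)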

  8. The flux qubit revisited to enhance coherence and reproducibility

    NASA Astrophysics Data System (ADS)

    Yan, Fei; Gustavsson, Simon; Kamal, Archana; Birenbaum, Jeffrey; Sears, Adam P.; Hover, David; Gudmundsen, Ted J.; Rosenberg, Danna; Samach, Gabriel; Weber, S.; Yoder, Jonilyn L.; Orlando, Terry P.; Clarke, John; Kerman, Andrew J.; Oliver, William D.

    2016-11-01

    The scalable application of quantum information science will stand on reproducible and controllable high-coherence quantum bits (qubits). Here, we revisit the design and fabrication of the superconducting flux qubit, achieving a planar device with broad-frequency tunability, strong anharmonicity, high reproducibility and relaxation times in excess of 40 μs at its flux-insensitive point. Qubit relaxation times T1 across 22 qubits are consistently matched with a single model involving resonator loss, ohmic charge noise and 1/f-flux noise, a noise source previously considered primarily in the context of dephasing. We furthermore demonstrate that qubit dephasing at the flux-insensitive point is dominated by residual thermal-photons in the readout resonator. The resulting photon shot noise is mitigated using a dynamical decoupling protocol, resulting in T2~85 μs, approximately the 2T1 limit. In addition to realizing an improved flux qubit, our results uniquely identify photon shot noise as limiting T2 in contemporary qubits based on transverse qubit-resonator interaction.

  9. Reproducibility of the measurement of sweet taste preferences.

    PubMed

    Asao, Keiko; Luo, Wendy; Herman, William H

    2012-12-01

    Developing interventions to prevent and treat obesity is a medical and public health imperative. Taste is a major determinant of food intake, and reliable methods to measure taste preferences need to be established. This study aimed to establish the short-term reproducibility of sweet taste preference measurements using 5-level sucrose concentrations in healthy adult volunteers. We defined sweet taste preference as the geometric mean of the preferred sucrose concentration determined from two series of two-alternative, forced-choice staircase procedures administered 10 min apart on a single day. We repeated the same procedures at a second visit 3-7 days later. Twenty-six adults (13 men and 13 women, age 33.2±12.2 years) completed the measurements. The median number of pairs presented for each series was three (25th and 75th percentiles: 3, 4). The intraclass correlation coefficient between the measurements was 0.82 (95% confidence interval [CI]: 0.63-0.92) within a few days. This study showed high short-term reproducibility of a simple, 5-level procedure for measuring sweet taste preferences. This method may be useful for assessing sweet taste preferences and the risks resulting from those preferences.

  10. Reproducibility of Neonate Ocular Circulation Measurements Using Laser Speckle Flowgraphy

    PubMed Central

    Matsumoto, Tadashi; Itokawa, Takashi; Shiba, Tomoaki; Katayama, Yuji; Arimura, Tetsushi; Mizukaki, Norio; Yoda, Hitoshi; Hori, Yuichi

    2015-01-01

    Measuring the ocular blood flow in neonates may clarify the relationships between eye diseases and ocular circulation abnormalities. However, no method for noninvasively measuring ocular circulation in neonates has been established. We used laser speckle flowgraphy (LSFG) modified for neonates to measure their ocular circulation and investigated whether this method is reproducible. We studied 16 subjects (adjusted age 34–48 weeks) whose ocular blood flow could be measured three consecutive times during their normal sleep. While the subjects slept in the supine position, three mean blur rate (MBR) values of the optic nerve head (ONH) were obtained: the MBR-A (mean of all values), MBR-V (vessel mean), and MBR-T (tissue mean), and nine blood flow pulse waveform parameters in the ONH were examined. We analyzed the coefficient of variation (COV) and the intraclass correlation coefficient (ICC) for each parameter. The COVs of the MBR values were all ≤10%. The ICCs of the MBR values were all >0.8. Good COVs were observed for the blowout score, blowout time, rising rate, falling rate, and acceleration time index. Although the measurement of ocular circulation in neonates was difficult, our results exhibited reproducibility, suggesting that this method could be used in clinical research. PMID:26557689

  11. Reproducibility in Nerve Morphometry: Comparison between Methods and among Observers

    PubMed Central

    Bilego Neto, Antônio Paulo da Costa; Silveira, Fernando Braga Cassiano; Rodrigues da Silva, Greice Anne; Sanada, Luciana Sayuri; Fazan, Valéria Paula Sassoli

    2013-01-01

    We investigated the reproducibility of a semiautomated method (computerized with manual intervention) for nerve morphometry (counting and measuring myelinated fibers) among three observers with different levels of expertise and experience with the method. Comparisons between automatic (fully computerized) and semiautomated morphometric methods performed by the same computer software on the same nerve images were also made. Sural nerves of normal adult rats were used. Automatic and semiautomated morphometry of the myelinated fibers was performed with the computer software KS-400. Semiautomated morphometry was conducted independently by the three observers on the same images. Automatic morphometry overestimated the myelin sheath area, thus overestimating the myelinated fiber size and underestimating the axon size. The fiber size distributions were overestimated by 0.5 μm. For the semiautomated morphometry, no differences were found between observers for myelinated fiber and axon size distributions. Overestimation of the myelin sheath size of normal fibers by the fully automatic method might have an impact when morphometry is used for diagnostic purposes. We suggest that semiautomated morphometry results can not only be compared between different centers in clinical trials but can also be obtained by more than one investigator in a single experiment, making it a reliable and reproducible method. PMID:23841086

  12. Assessment of Modeling Capability for Reproducing Storm Impacts on TEC

    NASA Astrophysics Data System (ADS)

    Shim, J. S.; Kuznetsova, M. M.; Rastaetter, L.; Bilitza, D.; Codrescu, M.; Coster, A. J.; Emery, B. A.; Foerster, M.; Foster, B.; Fuller-Rowell, T. J.; Huba, J. D.; Goncharenko, L. P.; Mannucci, A. J.; Namgaladze, A. A.; Pi, X.; Prokhorov, B. E.; Ridley, A. J.; Scherliess, L.; Schunk, R. W.; Sojka, J. J.; Zhu, L.

    2014-12-01

    During geomagnetic storms, the energy transfer from the solar wind to the magnetosphere-ionosphere system adversely affects communication and navigation systems. Quantifying storm impacts on TEC (Total Electron Content) and assessing the capability of models to reproduce those impacts are important for specifying and forecasting space weather. In order to quantify storm impacts on TEC, we considered several parameters: TEC changes compared to quiet time (the day before the storm), the TEC difference between 24-hour intervals, and the maximum increase/decrease during the storm. We investigated the spatial and temporal variations of the parameters during the 2006 AGU storm event (14-15 Dec. 2006) using ground-based GPS TEC measurements in eight selected 5-degree longitude sectors. The latitudinal variations were also studied in two of the eight sectors where data coverage was relatively better. We obtained modeled TEC from various ionosphere/thermosphere (IT) models. The parameters from the models were compared with each other and with the observed values. We quantified the performance of the models in reproducing the TEC variations during the storm using skill scores. This study has been supported by the Community Coordinated Modeling Center (CCMC) at the Goddard Space Flight Center. Model outputs and observational data used for the study will be permanently posted at the CCMC website (http://ccmc.gsfc.nasa.gov) for the space science communities to use.
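
    For illustration, one common form of skill score (a hedged sketch; the exact metrics used in the study are not given in the record above) compares model error against a reference prediction such as the quiet-day TEC:

        import numpy as np

        def skill_score(observed, modeled, reference):
            """Skill relative to a reference prediction: 1 is perfect,
            0 is no better than the reference, negative is worse."""
            observed = np.asarray(observed, dtype=float)
            mse_model = np.mean((observed - np.asarray(modeled, dtype=float)) ** 2)
            mse_reference = np.mean((observed - np.asarray(reference, dtype=float)) ** 2)
            return 1.0 - mse_model / mse_reference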

  13. Reproducible quantitative proteotype data matrices for systems biology

    PubMed Central

    Röst, Hannes L.; Malmström, Lars; Aebersold, Ruedi

    2015-01-01

    Historically, many mass spectrometry–based proteomic studies have aimed at compiling an inventory of protein compounds present in a biological sample, with the long-term objective of creating a proteome map of a species. However, to answer fundamental questions about the behavior of biological systems at the protein level, accurate and unbiased quantitative data are required in addition to a list of all protein components. Fueled by advances in mass spectrometry, the proteomics field has thus recently shifted focus toward the reproducible quantification of proteins across a large number of biological samples. This provides the foundation to move away from pure enumeration of identified proteins toward quantitative matrices of many proteins measured across multiple samples. It is argued here that data matrices consisting of highly reproducible, quantitative, and unbiased proteomic measurements across a high number of conditions, referred to here as quantitative proteotype maps, will become the fundamental currency in the field and provide the starting point for downstream biological analysis. Such proteotype data matrices, for example, are generated by the measurement of large patient cohorts, time series, or multiple experimental perturbations. They are expected to have a large effect on systems biology and personalized medicine approaches that investigate the dynamic behavior of biological systems across multiple perturbations, time points, and individuals. PMID:26543201

  14. Reproducible Research Practices and Transparency across the Biomedical Literature

    PubMed Central

    Khoury, Muin J.; Schully, Sheri D.; Ioannidis, John P. A.

    2016-01-01

    There is a growing movement to encourage reproducibility and transparency practices in the scientific community, including public access to raw data and protocols, the conduct of replication studies, systematic integration of evidence in systematic reviews, and the documentation of funding and potential conflicts of interest. In this survey, we assessed the current status of reproducibility and transparency addressing these indicators in a random sample of 441 biomedical journal articles published in 2000–2014. Only one study provided a full protocol and none made all raw data directly available. Replication studies were rare (n = 4), and only 16 studies had their data included in a subsequent systematic review or meta-analysis. The majority of studies did not mention anything about funding or conflicts of interest. The percentage of articles with no statement of conflict decreased substantially between 2000 and 2014 (94.4% in 2000 to 34.6% in 2014); the percentage of articles reporting statements of conflicts (0% in 2000, 15.4% in 2014) or no conflicts (5.6% in 2000, 50.0% in 2014) increased. Articles published in journals in the clinical medicine category versus other fields were almost twice as likely to not include any information on funding and to have private funding. This study provides baseline data to compare future progress in improving these indicators in the scientific literature. PMID:26726926

  15. Reproducibility of Variant Calls in Replicate Next Generation Sequencing Experiments

    PubMed Central

    Qi, Yuan; Liu, Xiuping; Liu, Chang-gong; Wang, Bailing; Hess, Kenneth R.; Symmans, W. Fraser; Shi, Weiwei; Pusztai, Lajos

    2015-01-01

    Nucleotide alterations detected by next generation sequencing are not always true biological changes but could represent sequencing errors. Even highly accurate methods can yield substantial error rates when applied to millions of nucleotides. In this study, we examined the reproducibility of nucleotide variant calls in replicate sequencing experiments of the same genomic DNA. We performed targeted sequencing of all known human protein kinase genes (kinome) (~3.2 Mb) using the SOLiD v4 platform. Seventeen breast cancer samples were sequenced in duplicate (n=14) or triplicate (n=3) to assess concordance of all calls and single nucleotide variant (SNV) calls. The concordance rates over the entire sequenced region were >99.99%, while the concordance rates for SNVs were 54.3-75.5%. There was substantial variation in basic sequencing metrics from experiment to experiment. The type of nucleotide substitution and genomic location of the variant had little impact on concordance but concordance increased with coverage level, variant allele count (VAC), variant allele frequency (VAF), variant allele quality and p-value of SNV-call. The most important determinants of concordance were VAC and VAF. Even using the highest stringency of QC metrics the reproducibility of SNV calls was around 80% suggesting that erroneous variant calling can be as high as 20-40% in a single experiment. The sequence data have been deposited into the European Genome-phenome Archive (EGA) with accession number EGAS00001000826. PMID:26136146
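
    A minimal sketch of a replicate concordance calculation like the one described above, assuming each replicate's SNV calls are represented as a set of (chromosome, position, alternate allele) tuples (illustrative only; the study's exact definition of concordance may differ):

        def snv_concordance(calls_rep1, calls_rep2):
            """Fraction of distinct single nucleotide variant calls that are
            shared by both replicates."""
            union = calls_rep1 | calls_rep2
            return len(calls_rep1 & calls_rep2) / len(union) if union else 1.0

        # Hypothetical calls for two replicate runs of the same sample.
        rep1 = {("chr1", 1000, "T"), ("chr2", 2000, "A"), ("chr3", 3000, "G")}
        rep2 = {("chr1", 1000, "T"), ("chr2", 2000, "A")}
        print(snv_concordance(rep1, rep2))   # prints 0.666...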

  16. [Study of the validity and reproducibility of passive ozone monitors].

    PubMed

    Cortez-Lugo, M; Romieu, I; Palazuelos-Rendón, E; Hernández-Avila, M

    1995-01-01

    The aim of this study was to evaluate the validity and reproducibility of ozone measurements obtained with passive ozone monitors against those registered with a continuous ozone monitor, to determine the applicability of passive monitors in epidemiological research. The study was carried out during November and December 1992. Indoor and outdoor classroom air ozone concentrations were analyzed using 28 passive monitors and a continuous monitor. The correlation between both measurements was highly significant (r = 0.89, p < 0.001), indicating very good validity. Also, the correlation between the measurements obtained with two different passive monitors exposed concurrently was very high (r = 0.97, p < 0.001), indicating good reproducibility of the passive monitor measurements. The relative error between the concentrations measured by the passive monitors and those from the continuous monitor tended to decrease with increasing ozone concentrations. The results suggest that passive monitors should be used to determine cumulative exposure of ozone exceeding 100 ppb, corresponding to an exposure period greater than five days, if used to analyze indoor air.

  17. The flux qubit revisited to enhance coherence and reproducibility

    PubMed Central

    Yan, Fei; Gustavsson, Simon; Kamal, Archana; Birenbaum, Jeffrey; Sears, Adam P; Hover, David; Gudmundsen, Ted J.; Rosenberg, Danna; Samach, Gabriel; Weber, S; Yoder, Jonilyn L.; Orlando, Terry P.; Clarke, John; Kerman, Andrew J.; Oliver, William D.

    2016-01-01

    The scalable application of quantum information science will stand on reproducible and controllable high-coherence quantum bits (qubits). Here, we revisit the design and fabrication of the superconducting flux qubit, achieving a planar device with broad-frequency tunability, strong anharmonicity, high reproducibility and relaxation times in excess of 40 μs at its flux-insensitive point. Qubit relaxation times T1 across 22 qubits are consistently matched with a single model involving resonator loss, ohmic charge noise and 1/f-flux noise, a noise source previously considered primarily in the context of dephasing. We furthermore demonstrate that qubit dephasing at the flux-insensitive point is dominated by residual thermal-photons in the readout resonator. The resulting photon shot noise is mitigated using a dynamical decoupling protocol, resulting in T2≈85 μs, approximately the 2T1 limit. In addition to realizing an improved flux qubit, our results uniquely identify photon shot noise as limiting T2 in contemporary qubits based on transverse qubit–resonator interaction. PMID:27808092

  18. Fluctuation-Driven Neural Dynamics Reproduce Drosophila Locomotor Patterns

    PubMed Central

    Cruchet, Steeve; Gustafson, Kyle; Benton, Richard; Floreano, Dario

    2015-01-01

    The neural mechanisms determining the timing of even simple actions, such as when to walk or rest, are largely mysterious. One intriguing, but untested, hypothesis posits a role for ongoing activity fluctuations in neurons of central action selection circuits that drive animal behavior from moment to moment. To examine how fluctuating activity can contribute to action timing, we paired high-resolution measurements of freely walking Drosophila melanogaster with data-driven neural network modeling and dynamical systems analysis. We generated fluctuation-driven network models whose outputs—locomotor bouts—matched those measured from sensory-deprived Drosophila. From these models, we identified those that could also reproduce a second, unrelated dataset: the complex time-course of odor-evoked walking for genetically diverse Drosophila strains. Dynamical models that best reproduced both Drosophila basal and odor-evoked locomotor patterns exhibited specific characteristics. First, ongoing fluctuations were required. In a stochastic resonance-like manner, these fluctuations allowed neural activity to escape stable equilibria and to exceed a threshold for locomotion. Second, odor-induced shifts of equilibria in these models caused a depression in locomotor frequency following olfactory stimulation. Our models predict that activity fluctuations in action selection circuits cause behavioral output to more closely match sensory drive and may therefore enhance navigation in complex sensory environments. Together these data reveal how simple neural dynamics, when coupled with activity fluctuations, can give rise to complex patterns of animal behavior. PMID:26600381

  19. A Bayesian Perspective on the Reproducibility Project: Psychology.

    PubMed

    Etz, Alexander; Vandekerckhove, Joachim

    2016-01-01

    We revisit the results of the recent Reproducibility Project: Psychology by the Open Science Collaboration. We compute Bayes factors, a quantity that can be used to express comparative evidence both for a hypothesis and for the null hypothesis, for a large subset (N = 72) of the original papers and their corresponding replication attempts. In our computation, we take into account the likely scenario that publication bias had distorted the originally published results. Overall, 75% of studies gave qualitatively similar results in terms of the amount of evidence provided. However, the evidence was often weak (i.e., Bayes factor < 10). The majority of the studies (64%) did not provide strong evidence for either the null or the alternative hypothesis in either the original or the replication, and no replication attempts provided strong evidence in favor of the null. In all cases where the original paper provided strong evidence but the replication did not (15%), the sample size in the replication was smaller than the original. Where the replication provided strong evidence but the original did not (10%), the replication sample size was larger. We conclude that the apparent failure of the Reproducibility Project to replicate many target effects can be adequately explained by overestimation of effect sizes (or overestimation of evidence against the null hypothesis) due to small sample sizes and publication bias in the psychological literature. We further conclude that traditional sample sizes are insufficient and that a more widespread adoption of Bayesian methods is desirable.
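
    The Bayes factors in the record above were computed with a method that also adjusts for publication bias; purely as a simpler illustration (an assumption-laden sketch, not the authors' procedure), a Bayes factor can be roughly approximated from the BIC values of the null and alternative models:

        import math

        def bf01_from_bic(bic_null, bic_alt):
            """Rough Bayes factor in favour of the null model:
            BF01 is approximately exp((BIC_alt - BIC_null) / 2)."""
            return math.exp((bic_alt - bic_null) / 2.0)

        # Hypothetical example: the alternative model fits better by 4 BIC
        # points, giving evidence of roughly 7.4 to 1 against the null.
        print(1.0 / bf01_from_bic(bic_null=104.0, bic_alt=100.0))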

  20. Data reproducibility of pace strategy in a laboratory test run

    PubMed Central

    de França, Elias; Xavier, Ana Paula; Hirota, Vinicius Barroso; Côrrea, Sônia Cavalcanti; Caperuto, Érico Chagas

    2016-01-01

    This data paper contains data from a reproducibility test of running pace strategy in an intermittent running test to exhaustion. Ten participants underwent a crossover study (test and retest) with an intermittent running test. The test was composed of three-minute sets (at 1 km/h above the onset of blood lactate accumulation) performed until volitional exhaustion. To assess changes in pace strategy, in the first test participants chose the rest time interval (RTI) between sets (ranging from 30 to 60 s), and in the second test the RTI could be at most the value chosen in the first test, or shorter if desired. To verify the reproducibility of the test, rating of perceived exertion (RPE), heart rate (HR) and blood plasma lactate concentration ([La]p) were collected at rest, immediately after each set and at the end of the tests. RTI, RPE, HR, [La]p and time to exhaustion were not statistically different (p>0.05) between test and retest and showed good intraclass correlation. PMID:27081672

  1. Towards reproducible MRM based biomarker discovery using dried blood spots.

    PubMed

    Ozcan, Sureyya; Cooper, Jason D; Lago, Santiago G; Kenny, Diarmuid; Rustogi, Nitin; Stocki, Pawel; Bahn, Sabine

    2017-03-27

    There is an increasing interest in the use of dried blood spot (DBS) sampling and multiple reaction monitoring in proteomics. Although several groups have explored the utility of DBS by focusing on protein detection, the reproducibility of the approach and whether it can be used for biomarker discovery in high throughput studies is yet to be determined. We assessed the reproducibility of multiplexed targeted protein measurements in DBS compared to serum. Eighty-two medium to high abundance proteins were monitored in a number of technical and biological replicates. Importantly, as part of the data analysis, several statistical quality control approaches were evaluated to detect inaccurate transitions. After implementing statistical quality control measures, the median CV on the original scale for all detected peptides in DBS was 13.2% and in Serum 8.8%. We also found a strong correlation (r = 0.72) between relative peptide abundance measured in DBS and serum. The combination of minimally invasive sample collection with a highly specific and sensitive mass spectrometry (MS) technique allows for targeted quantification of multiple proteins in a single MS run. This approach has the potential to fundamentally change clinical proteomics and personalized medicine by facilitating large-scale studies.
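
    A minimal sketch of the replicate-based coefficient-of-variation summary reported above, assuming a peptide-by-replicate intensity matrix (illustrative only):

        import numpy as np

        def median_cv_percent(intensities):
            """Median coefficient of variation (%) across peptides, where
            `intensities` has one row per peptide and one column per replicate."""
            x = np.asarray(intensities, dtype=float)
            cv = 100.0 * x.std(axis=1, ddof=1) / x.mean(axis=1)
            return float(np.median(cv))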

  2. A Telescope Inventor's Spyglass Possibly Reproduced in a Brueghel's Painting

    NASA Astrophysics Data System (ADS)

    Molaro, P.; Selvelli, P.

    2011-06-01

    Jan Brueghel the Elder depicted spyglasses belonging to the Archduke Albert VII of Habsburg in at least five paintings in the period between 1608 and 1625. Albert VII was fascinated by art and science, and he obtained spyglasses directly from Lipperhey and Sacharias Janssen at approximately the time when the telescope was first shown at The Hague at the end of 1608. In the Extensive Landscape with View of the Castle of Mariemont, dated 1608-1612, the Archduke is looking at his Mariemont castle through an optical tube; this is the first time a spyglass was depicted in a painting at all. It is quite possible that the painting reproduces one of the first telescopes ever made. Two more of Albert VII's telescopes are prominently reproduced in two Allegories of Sight painted a few years later (1617-1618). They are sophisticated instruments, and their structure, in particular the shape of the eyepiece, suggests that they are composed of two convex lenses in a Keplerian optical configuration, which came into common use only more than two decades later. If this is the case, these paintings are the first available record of a Keplerian telescope.

  3. Reproducibility of UAV-based photogrammetric surface models

    NASA Astrophysics Data System (ADS)

    Anders, Niels; Smith, Mike; Cammeraat, Erik; Keesstra, Saskia

    2016-04-01

    Soil erosion, rapid geomorphological change and vegetation degradation are major threats to the human and natural environment in many regions. Unmanned Aerial Vehicles (UAVs) and Structure-from-Motion (SfM) photogrammetry are invaluable tools for the collection of highly detailed aerial imagery and subsequent low cost production of 3D landscapes for an assessment of landscape change. Despite the widespread use of UAVs for image acquisition in monitoring applications, the reproducibility of UAV data products has not been explored in detail. This paper investigates this reproducibility by comparing the surface models and orthophotos derived from different UAV flights that vary in flight direction and altitude. The study area is located near Lorca, Murcia, SE Spain, which is a semi-arid medium-relief locale. The area is comprised of terraced agricultural fields that have been abandoned for about 40 years and have suffered subsequent damage through piping and gully erosion. In this work we focused upon variation in cell size, vertical and horizontal accuracy, and horizontal positioning of recognizable landscape features. The results suggest that flight altitude has a significant impact on reconstructed point density and related cell size, whilst flight direction affects the spatial distribution of vertical accuracy. The horizontal positioning of landscape features is relatively consistent between the different flights. We conclude that UAV data products are suitable for monitoring campaigns for land cover purposes or geomorphological mapping, but special care is required when used for monitoring changes in elevation.

  4. Towards reproducible MRM based biomarker discovery using dried blood spots

    PubMed Central

    Ozcan, Sureyya; Cooper, Jason D.; Lago, Santiago G.; Kenny, Diarmuid; Rustogi, Nitin; Stocki, Pawel; Bahn, Sabine

    2017-01-01

    There is an increasing interest in the use of dried blood spot (DBS) sampling and multiple reaction monitoring in proteomics. Although several groups have explored the utility of DBS by focusing on protein detection, the reproducibility of the approach and whether it can be used for biomarker discovery in high throughput studies is yet to be determined. We assessed the reproducibility of multiplexed targeted protein measurements in DBS compared to serum. Eighty-two medium to high abundance proteins were monitored in a number of technical and biological replicates. Importantly, as part of the data analysis, several statistical quality control approaches were evaluated to detect inaccurate transitions. After implementing statistical quality control measures, the median CV on the original scale for all detected peptides in DBS was 13.2% and in Serum 8.8%. We also found a strong correlation (r = 0.72) between relative peptide abundance measured in DBS and serum. The combination of minimally invasive sample collection with a highly specific and sensitive mass spectrometry (MS) technique allows for targeted quantification of multiple proteins in a single MS run. This approach has the potential to fundamentally change clinical proteomics and personalized medicine by facilitating large-scale studies. PMID:28345601

  5. Reproducible and inexpensive probe preparation for oligonucleotide arrays.

    PubMed

    Zhang, Y; Price, B D; Tetradis, S; Chakrabarti, S; Maulik, G; Makrigiorgos, G M

    2001-07-01

    We present a new protocol for the preparation of nucleic acids for microarray hybridization. DNA is fragmented quantitatively and reproducibly by using a hydroxyl radical-based reaction, which is initiated by hydrogen peroxide, iron(II)-EDTA and ascorbic acid. Following fragmentation, the nucleic acid fragments are densely biotinylated using a biotinylated psoralen analog plus UVA light and hybridized on microarrays. This non-enzymatic protocol circumvents several practical difficulties associated with DNA preparation for microarrays: the lack of reproducible fragmentation patterns associated with enzymatic methods; the large amount of labeled nucleic acids required by some array designs, which is often combined with a limited amount of starting material; and the high cost associated with currently used biotinylation methods. The method is applicable to any form of nucleic acid, but is particularly useful when applying double-stranded DNA on oligonucleotide arrays. Validation of this protocol is demonstrated by hybridizing PCR products with oligonucleotide-coated microspheres and PCR amplified cDNA with Affymetrix Cancer GeneChip microarrays.

  6. 39 CFR 3050.10 - Analytical principles to be applied in the Postal Service's annual periodic reports to the...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 39 Postal Service 1 2011-07-01 2011-07-01 false Analytical principles to be applied in the Postal... REGULATORY COMMISSION PERSONNEL PERIODIC REPORTING § 3050.10 Analytical principles to be applied in the... Commission, the Postal Service shall use only accepted analytical principles. With respect to its...

  7. 39 CFR 3050.10 - Analytical principles to be applied in the Postal Service's annual periodic reports to the...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 39 Postal Service 1 2012-07-01 2012-07-01 false Analytical principles to be applied in the Postal... REGULATORY COMMISSION PERSONNEL PERIODIC REPORTING § 3050.10 Analytical principles to be applied in the... Commission, the Postal Service shall use only accepted analytical principles. With respect to its...

  8. 39 CFR 3050.10 - Analytical principles to be applied in the Postal Service's annual periodic reports to the...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 39 Postal Service 1 2013-07-01 2013-07-01 false Analytical principles to be applied in the Postal... REGULATORY COMMISSION PERSONNEL PERIODIC REPORTING § 3050.10 Analytical principles to be applied in the... Commission, the Postal Service shall use only accepted analytical principles. With respect to its...

  9. 39 CFR 3050.10 - Analytical principles to be applied in the Postal Service's annual periodic reports to the...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 39 Postal Service 1 2014-07-01 2014-07-01 false Analytical principles to be applied in the Postal... REGULATORY COMMISSION PERSONNEL PERIODIC REPORTING § 3050.10 Analytical principles to be applied in the... Commission, the Postal Service shall use only accepted analytical principles. With respect to its...

  10. 39 CFR 3050.10 - Analytical principles to be applied in the Postal Service's annual periodic reports to the...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 39 Postal Service 1 2010-07-01 2010-07-01 false Analytical principles to be applied in the Postal... REGULATORY COMMISSION PERSONNEL PERIODIC REPORTING § 3050.10 Analytical principles to be applied in the... Commission, the Postal Service shall use only accepted analytical principles. With respect to its...

  11. Advances in analytical chemistry

    NASA Technical Reports Server (NTRS)

    Arendale, W. F.; Congo, Richard T.; Nielsen, Bruce J.

    1991-01-01

    Implementation of computer programs based on multivariate statistical algorithms makes possible obtaining reliable information from long data vectors that contain large amounts of extraneous information, for example, noise and/or analytes that we do not wish to control. Three examples are described. Each of these applications requires the use of techniques characteristic of modern analytical chemistry. The first example, using a quantitative or analytical model, describes the determination of the acid dissociation constant for 2,2'-pyridyl thiophene using archived data. The second example describes an investigation to determine the active biocidal species of iodine in aqueous solutions. The third example is taken from a research program directed toward advanced fiber-optic chemical sensors. The second and third examples require heuristic or empirical models.

  12. Competing on talent analytics.

    PubMed

    Davenport, Thomas H; Harris, Jeanne; Shapiro, Jeremy

    2010-10-01

    Do investments in your employees actually affect workforce performance? Who are your top performers? How can you empower and motivate other employees to excel? Leading-edge companies such as Google, Best Buy, Procter & Gamble, and Sysco use sophisticated data-collection technology and analysis to answer these questions, leveraging a range of analytics to improve the way they attract and retain talent, connect their employee data to business performance, differentiate themselves from competitors, and more. The authors present the six key ways in which companies track, analyze, and use data about their people-ranging from a simple baseline of metrics to monitor the organization's overall health to custom modeling for predicting future head count depending on various "what if" scenarios. They go on to show that companies competing on talent analytics manage data and technology at an enterprise level, support what analytical leaders do, choose realistic targets for analysis, and hire analysts with strong interpersonal skills as well as broad expertise.

  13. Acceptance in Romantic Relationships: The Frequency and Acceptability of Partner Behavior Inventory

    ERIC Educational Resources Information Center

    Doss, Brian D.; Christensen, Andrew

    2006-01-01

    Despite the recent emphasis on acceptance in romantic relationships, no validated measure of relationship acceptance presently exists. To fill this gap, the 20-item Frequency and Acceptability of Partner Behavior Inventory (FAPBI; A. Christensen & N. S. Jacobson, 1997) was created to assess separately the acceptability and frequency of both…

  14. 24 CFR 203.202 - Plan acceptability and acceptance renewal criteria-general.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... HUD acceptance of such change or modification, except that changes mandated by other applicable laws... 24 Housing and Urban Development 2 2010-04-01 2010-04-01 false Plan acceptability and acceptance... Underwriting Procedures Insured Ten-Year Protection Plans (plan) § 203.202 Plan acceptability and...

  15. Quality by design compliant analytical method validation.

    PubMed

    Rozet, E; Ziemons, E; Marini, R D; Boulanger, B; Hubert, Ph

    2012-01-03

    The concept of quality by design (QbD) has recently been adopted for the development of pharmaceutical processes to ensure a predefined product quality. Focus on applying the QbD concept to analytical methods has increased as it is fully integrated within pharmaceutical processes and especially in the process control strategy. In addition, there is the need to switch from the traditional checklist implementation of method validation requirements to a method validation approach that should provide a high level of assurance of method reliability in order to adequately measure the critical quality attributes (CQAs) of the drug product. The intended purpose of analytical methods is directly related to the final decision that will be made with the results generated by these methods under study. The final aim for quantitative impurity assays is to correctly declare a substance or a product as compliant with respect to the corresponding product specifications. For content assays, the aim is similar: making the correct decision about product compliance with respect to their specification limits. It is for these reasons that the fitness of these methods should be defined, as they are key elements of the analytical target profile (ATP). Therefore, validation criteria, corresponding acceptance limits, and method validation decision approaches should be settled in accordance with the final use of these analytical procedures. This work proposes a general methodology to achieve this in order to align method validation within the QbD framework and philosophy. β-Expectation tolerance intervals are implemented to decide about the validity of analytical methods. The proposed methodology is also applied to the validation of analytical procedures dedicated to the quantification of impurities or active product ingredients (API) in drug substances or drug products, and its applicability is illustrated with two case studies.
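
    For a normally distributed validation result, a two-sided beta-expectation tolerance interval coincides with a prediction interval for one future measurement: mean ± t((1+beta)/2, n-1) · s · sqrt(1 + 1/n). The sketch below applies this to hypothetical recovery data and compares it with assumed ±5% acceptance limits; it illustrates the general principle only, not the specific procedure of the cited work.

        import numpy as np
        from scipy import stats

        def beta_expectation_interval(x, beta=0.95):
            """Two-sided beta-expectation tolerance interval for normally distributed data."""
            x = np.asarray(x, dtype=float)
            n, m, s = x.size, x.mean(), x.std(ddof=1)
            k = stats.t.ppf((1 + beta) / 2, n - 1) * np.sqrt(1 + 1 / n)
            return m - k * s, m + k * s

        # Hypothetical relative recoveries (%) from one concentration level of a validation series.
        recovery = np.array([99.1, 100.4, 98.7, 101.2, 99.8, 100.9, 99.5, 100.1])
        lo, hi = beta_expectation_interval(recovery)
        print(f"95% beta-expectation interval: [{lo:.1f}%, {hi:.1f}%]")
        print("accepted within the assumed 95-105% limits:", 95.0 <= lo and hi <= 105.0)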

  16. Reproducing Phenomenology of Peroxidation Kinetics via Model Optimization

    NASA Astrophysics Data System (ADS)

    Ruslanov, Anatole D.; Bashylau, Anton V.

    2010-06-01

    We studied mathematical modeling of lipid peroxidation using a biochemical model system of iron (II)-ascorbate-dependent lipid peroxidation of rat hepatocyte mitochondrial fractions. We found that antioxidants extracted from plants demonstrate a high intensity of peroxidation inhibition. We simplified the system of differential equations that describes the kinetics of the mathematical model to a first order equation, which can be solved analytically. Moreover, we endeavor to algorithmically and heuristically recreate the processes and construct an environment that closely resembles the corresponding natural system. Our results demonstrate that it is possible to theoretically predict both the kinetics of oxidation and the intensity of inhibition without resorting to analytical and biochemical research, which is important for cost-effective discovery and development of medical agents with antioxidant action from the medicinal plants.
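
    As a generic illustration of the reduction described above (not the authors' specific peroxidation model), the sketch below compares the analytical solution of a first-order rate equation, dC/dt = -kC, with a numerical integration; the rate constant and initial value are made up.

        import numpy as np
        from scipy.integrate import solve_ivp

        k = 0.35    # hypothetical first-order rate constant (1/min)
        c0 = 1.0    # initial concentration (arbitrary units)
        t = np.linspace(0, 20, 81)

        analytic = c0 * np.exp(-k * t)                                   # closed-form solution
        numeric = solve_ivp(lambda t, c: -k * c, (0, 20), [c0], t_eval=t).y[0]

        print(f"max |analytic - numeric| = {np.abs(analytic - numeric).max():.2e}")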

  17. [Acceptance and mindfulness-based cognitive-behavioral therapies].

    PubMed

    Ngô, Thanh-Lan

    2013-01-01

    achieve specific goals. They focus on the present moment rather than on historical causes. However, they also present significant differences: control vs acceptance of thoughts, focus on cognition vs behavior, focus on the relationship between the individual and his thoughts vs cognitive content, goal of modifying dysfunctional beliefs vs metacognitive processes, use of experiential vs didactic methods, focus on symptoms vs quality of life, strategies used before vs after the unfolding of full emotional response. The main interventions based on mindfulness meditation and acceptance are: Acceptance and Commitment Therapy, Functional Analytic Therapy, the expanded model of Behavioral Activation, Metacognitive Therapy, Mindfulness based Cognitive Therapy, Dialectic Behavior Therapy, Integrative Behavioral Couples Therapy and Compassionate Mind Training. These are described in this article. They offer concepts and techniques which might enhance therapeutic efficacy. They teach a new way to deploy attention and to enter into a relationship with current experience (for example, defusion) in order to diminish cognitive reactivity, a maintenance factor for psychopathology, and to enhance psychological flexibility. The focus on cognitive process, metacognition as well as cognitive content might yield additional benefits in therapy. It is possible to combine traditional CBT with third wave approaches by using psychoeducation and cognitive restructuring in the beginning phases of therapy in order to establish thought bias and to then encourage acceptance of internal experiences as well as exposure to feared stimuli rather than to continue to use cognitive restructuring techniques. Traditional CBT and third wave approaches seem to impact different processes: the former enhance the capacity to observe and describe experiences and the latter diminish experiential avoidance and increase conscious action as well as acceptance. The identification of personal values helps to motivate the

  18. Monitoring the analytic surface.

    PubMed

    Spence, D P; Mayes, L C; Dahl, H

    1994-01-01

    How do we listen during an analytic hour? Systematic analysis of the speech patterns of one patient (Mrs. C.) strongly suggests that the clustering of shared pronouns (e.g., you/me) represents an important aspect of the analytic surface, preconsciously sensed by the analyst and used by him to determine when to intervene. Sensitivity to these patterns increases over the course of treatment, and in a final block of 10 hours shows a striking degree of contingent responsivity: specific utterances by the patient are consistently echoed by the analyst's interventions.

  19. Frontiers in analytical chemistry

    SciTech Connect

    Amato, I.

    1988-12-15

    Doing more with less was the modus operandi of R. Buckminster Fuller, the late science genius and inventor of such things as the geodesic dome. In late September, chemists described their own version of this maxim--learning more chemistry from less material and in less time--in a symposium titled Frontiers in Analytical Chemistry at the 196th National Meeting of the American Chemical Society in Los Angeles. Symposium organizer Allen J. Bard of the University of Texas at Austin assembled six speakers, himself among them, to survey widely different areas of analytical chemistry.

  20. Analytical study of electronic structure in armchair graphene nanoribbons

    NASA Astrophysics Data System (ADS)

    Zheng, Huaixiu; Wang, Z. F.; Luo, Tao; Shi, Q. W.; Chen, Jie

    2007-04-01

    We present the analytical solution of the wave function and energy dispersion of armchair graphene nanoribbons (GNRs) based on the tight-binding approximation. By imposing the hard-wall boundary condition, we find that the wave vector in the confined direction is discretized. This discrete wave vector serves as the index of the different subbands. Our analytical solutions for the wave function and associated energy dispersion reproduce the results of numerical tight-binding calculations and of solutions based on the k·p approximation. In addition, we find that all armchair GNRs with edge deformation have energy gaps, which agrees with recently reported first-principles calculations.

  1. Reproducibility of an aerobic endurance test for nonexpert swimmers

    PubMed Central

    Veronese da Costa, Adalberto; Costa, Manoel da Cunha; Carlos, Daniel Medeiros; Guerra, Luis Marcos de Medeiros; Silva, Antônio José; Barbosa, Tiago Manoel Cabral dos Santos

    2012-01-01

    Background: This study aimed to verify the reproducibility of an aerobic test to determine nonexpert swimmers' endurance. Methods: The sample consisted of 24 male swimmers (age: 22.79 ± 3.90 years; weight: 74.72 ± 11.44 kg; height: 172.58 ± 4.99 cm; and fat percentage: 15.19% ± 3.21%), who swim for 1 hour three times a week. A new instrument was used in this study (a Progressive Swim Test): the swimmer wore an underwater MP3 player and increased their swimming speed on hearing a beep after every 25 meters. Each swimmer's heart rate was recorded before the test (BHR) and again after the test (AHR). The rate of perceived exertion (RPE) and the number of laps performed (NLP) were also recorded. The sample size was estimated using G*Power software (v 3.0.10; Franz Faul, Kiel University, Kiel, Germany). The descriptive values were expressed as mean and standard deviation. After confirming the normality of the data using both the Shapiro–Wilk and Levene tests, a paired t-test was performed to compare the data. Pearson's linear correlation (r) and the intraclass correlation coefficient (ICC) were used to determine relative reproducibility. The standard error of measurement (SEM) and the coefficient of variation (CV) were used to determine absolute reproducibility. The limits of agreement and the bias of the absolute and relative values between days were determined by Bland–Altman plots. All values had a significance level of P < 0.05. Results: There were significant differences in AHR (P = 0.03) and NLP (P = 0.01) between the 2 days of testing. The obtained values were r > 0.50 and ICC > 0.66. The SEM had a variation of ±2% and the CV was <10%. Most cases were within the upper and lower limits of the Bland–Altman plots, suggesting agreement between the results. The applicability of NLP showed greater robustness (r and ICC > 0.90; SEM < 1%; CV < 3%), indicating that the other variables can be used to predict incremental changes in the physiological condition
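
    The reliability statistics reported above can all be obtained from a simple test-retest table; the sketch below computes a paired t-test, Pearson r, a consistency ICC(3,1), SEM, CV, and Bland-Altman limits of agreement on invented lap counts for ten swimmers (not the study's data).

        import numpy as np
        from scipy import stats

        # Hypothetical number of laps for 10 swimmers on test day 1 and day 2.
        day1 = np.array([38, 42, 35, 40, 44, 39, 37, 41, 36, 43], dtype=float)
        day2 = np.array([39, 41, 36, 40, 45, 38, 37, 42, 35, 44], dtype=float)

        t_stat, p = stats.ttest_rel(day1, day2)      # systematic difference between days?
        r, _ = stats.pearsonr(day1, day2)            # relative reproducibility

        # Consistency ICC(3,1) from the two-way ANOVA decomposition (n subjects x k = 2 trials).
        x = np.column_stack([day1, day2])
        n, k = x.shape
        grand = x.mean()
        ss_subj = k * ((x.mean(axis=1) - grand) ** 2).sum()
        ss_trial = n * ((x.mean(axis=0) - grand) ** 2).sum()
        ss_err = ((x - grand) ** 2).sum() - ss_subj - ss_trial
        ms_subj = ss_subj / (n - 1)
        ms_err = ss_err / ((n - 1) * (k - 1))
        icc = (ms_subj - ms_err) / (ms_subj + (k - 1) * ms_err)

        sem = np.sqrt(ms_err)                        # standard error of measurement
        cv = sem / grand * 100                       # within-subject CV (%)

        diff = day2 - day1                           # Bland-Altman bias and limits of agreement
        bias, loa = diff.mean(), 1.96 * diff.std(ddof=1)

        print(f"p={p:.2f}  r={r:.2f}  ICC={icc:.2f}  SEM={sem:.2f} laps  CV={cv:.1f}%  "
              f"LoA={bias:.2f} +/- {loa:.2f} laps")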

  2. Paleomagnetic analysis of curved thrust belts reproduced by physical models

    NASA Astrophysics Data System (ADS)

    Costa, Elisabetta; Speranza, Fabio

    2003-12-01

    This paper presents a new methodology for studying the evolution of curved mountain belts by means of paleomagnetic analyses performed on analogue models. Eleven models were designed to reproduce various tectonic settings in thin-skinned tectonics. Our models analyze in particular those features reported in the literature as possible causes for peculiar rotational patterns in the outermost as well as in the more internal fronts. In all the models the sedimentary cover was reproduced by frictional low-cohesion materials (sand and glass micro-beads), which detached either on frictional or on viscous layers. The latter were reproduced in the models by silicone. The sand forming the models had previously been mixed with magnetite-dominated powder. Before deformation, the models were magnetized by means of two permanent magnets generating within each model a quasi-linear magnetic field with intensity varying between 20 and 100 mT. After deformation, the models were cut into closely spaced vertical sections and sampled by means of 1×1-cm Plexiglas cylinders at several locations along curved fronts. Care was taken to collect paleomagnetic samples only within virtually undeformed thrust sheets, avoiding zones affected by pervasive shear. Afterwards, the natural remanent magnetization of these samples was measured, and alternating field demagnetization was used to isolate the principal components. The isolated characteristic components of magnetization were used to estimate the vertical-axis rotations occurring during model deformation. We find that indenters pushing into deforming belts from behind form non-rotational curved outer fronts. The more internal fronts show oroclinal-type rotations of a smaller magnitude than that expected for a perfect orocline. Lateral symmetrical obstacles in the foreland colliding with forward propagating belts produce non-rotational outer curved fronts as well, whereas in between and inside the obstacles a perfect orocline forms

  3. Building Consensus on Community Standards for Reproducible Science

    NASA Astrophysics Data System (ADS)

    Lehnert, K. A.; Nielsen, R. L.

    2015-12-01

    For geochemists, the traditional model by which standard methods for generating, presenting, and using data have been established relied on input from the community, the results of seminal studies, and a variety of authoritative bodies, and it required a great deal of time. The rate of technological and related policy change has accelerated to the point that this historical model does not satisfy the needs of the community, publishers, or funders. The development of a new mechanism for building consensus raises a number of questions: Which aspects of our data are the focus of reproducibility standards? Who sets the standards? How do we subdivide the development of the consensus? We propose an open, transparent, and inclusive approach to the development of data and reproducibility standards that is organized around specific sub-disciplines and driven by the community of practitioners in those sub-disciplines. It should involve editors, program managers, and representatives of domain data facilities as well as professional societies, but avoid making any single group the final authority. A successful example of this model is the Editors Roundtable, a cross section of editors, funders, and data facility managers that discussed and agreed on leading practices for the reporting of geochemical data in publications, including accessibility and format of the data, data quality information, and metadata and identifiers for samples (Goldstein et al., 2014). We argue that the development of data and reproducibility standards needs to rely heavily on representatives from the community of practitioners to set priorities and provide perspective. Groups of editors, practicing scientists, and other stakeholders would be assigned the task of reviewing existing practices and recommending changes as deemed necessary. They would weigh the costs and benefits of changing the standards for that community, propose appropriate tools to facilitate those changes, work through the professional societies

  4. In vivo reproducibility of robotic probe placement for an integrated US-CT image-guided radiation therapy system

    NASA Astrophysics Data System (ADS)

    Lediju Bell, Muyinatu A.; Sen, H. Tutkun; Iordachita, Iulian; Kazanzides, Peter; Wong, John

    2014-03-01

    Radiation therapy is used to treat cancer by delivering high-dose radiation to a pre-defined target volume. Ultrasound (US) has the potential to provide real-time, image-guidance of radiation therapy to identify when a target moves outside of the treatment volume (e.g. due to breathing), but the associated probe-induced tissue deformation causes local anatomical deviations from the treatment plan. If the US probe is placed to achieve similar tissue deformations in the CT images required for treatment planning, its presence causes streak artifacts that will interfere with treatment planning calculations. To overcome these challenges, we propose robot-assisted placement of a real ultrasound probe, followed by probe removal and replacement with a geometrically-identical, CT-compatible model probe. This work is the first to investigate in vivo deformation reproducibility with the proposed approach. A dog's prostate, liver, and pancreas were each implanted with three 2.38-mm spherical metallic markers, and the US probe was placed to visualize the implanted markers in each organ. The real and model probes were automatically removed and returned to the same position (i.e. position control), and CT images were acquired with each probe placement. The model probe was also removed and returned with the same normal force measured with the real US probe (i.e. force control). Marker positions in CT images were analyzed to determine reproducibility, and a corollary reproducibility study was performed on ex vivo tissue. In vivo results indicate that tissue deformations with the real probe were repeatable under position control for the prostate, liver, and pancreas, with median 3D reproducibility of 0.3 mm, 0.3 mm, and 1.6 mm, respectively, compared to 0.6 mm for the ex vivo tissue. For the prostate, the mean 3D tissue displacement errors between the real and model probes were 0.2 mm under position control and 0.6 mm under force control, which are both within acceptable

  5. Analytical Services Management System

    SciTech Connect

    Church, Shane; Nigbor, Mike; Hillman, Daniel

    2005-03-30

    Analytical Services Management System (ASMS) provides sample management services. Sample management includes sample planning for analytical requests, sample tracking for shipping and receiving by the laboratory, receipt of the analytical data deliverable, processing the deliverable and payment of the laboratory conducting the analyses. ASMS is a web based application that provides the ability to manage these activities at multiple locations for different customers. ASMS provides for the assignment of single to multiple samples for standard chemical and radiochemical analyses. ASMS is a flexible system which allows the users to request analyses by line item code. Line item codes are selected based on the Basic Ordering Agreement (BOA) format for contracting with participating laboratories. ASMS also allows contracting with non-BOA laboratories using a similar line item code contracting format for their services. ASMS allows sample and analysis tracking from sample planning and collection in the field through sample shipment, laboratory sample receipt, laboratory analysis and submittal of the requested analyses, electronic data transfer, and payment of the laboratories for the completed analyses. The software when in operation contains business sensitive material that is used as a principal portion of the Kaiser Analytical Management Services business model. The software version provided is the most recent version, however the copy of the application does not contain business sensitive data from the associated Oracle tables such as contract information or price per line item code.

  6. Analytics: Changing the Conversation

    ERIC Educational Resources Information Center

    Oblinger, Diana G.

    2013-01-01

    In this third and concluding discussion on analytics, the author notes that we live in an information culture. We are accustomed to having information instantly available and accessible, along with feedback and recommendations. We want to know what people think and like (or dislike). We want to know how we compare with "others like me."…

  7. Analytical Chemistry Laboratory

    NASA Technical Reports Server (NTRS)

    Anderson, Mark

    2013-01-01

    The Analytical Chemistry and Material Development Group maintains a capability in chemical analysis, materials R&D, failure analysis, and contamination control. The uniquely qualified staff and facility support the needs of flight projects, science instrument development, and various technical tasks, as well as Cal Tech.

  8. Social Learning Analytics

    ERIC Educational Resources Information Center

    Buckingham Shum, Simon; Ferguson, Rebecca

    2012-01-01

    We propose that the design and implementation of effective "Social Learning Analytics (SLA)" present significant challenges and opportunities for both research and enterprise, in three important respects. The first is that the learning landscape is extraordinarily turbulent at present, in no small part due to technological drivers.…

  9. Effects of cold sphere walls in PET phantom measurements on the volume reproducing threshold.

    PubMed

    Hofheinz, F; Dittrich, S; Pötzsch, C; Hoff, J van den

    2010-02-21

    We studied quantitatively the effects of the discontinuity introduced in an otherwise homogeneous background by the cold walls of the standard spherical glass inserts commonly used in phantom measurements for calibration of threshold-based approaches to volumetric evaluation of PET investigations. We concentrated especially on the question of threshold-based volume determination. We computed analytically the convolution of an isotropic Gaussian point-spread function with the insert geometry (hot sphere + cold wall + warm background) and derived the theoretical background dependence of the volume reproducing threshold. This analysis shows a clear wall-related reduction of the optimal threshold with increasing background. The predictions of our theoretical analysis were verified in phantom measurements at background fractions between 0 and 0.29. Defining the background-corrected relative threshold [formula: see text] (T(abs): absolute volume reproducing threshold, A: measured activity at centre, B: background), we find that for a wall-less sphere T is independent of the background level. In the presence of cold walls, T drops (for not too small spheres, where recovery at the centre approaches 100%) from about 43% at B/A = 0 to about 25% at B/A = 0.5. Applying these thresholds to wall-less spheres leads to sizeable overestimates of the true volumes (43% at B/A = 0.5 for a sphere of 6 ml volume). We conclude that phantom measurements with standard sphere inserts for calibration of optimal thresholding algorithms introduce a systematic bias if performed at finite background levels. The observed background dependence is an artefact of the measurement procedure and does not reflect the conditions present in actual patient investigations.
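
    Reading the definitions above, the background-corrected relative threshold is presumably T = (T_abs - B) / (A - B); a small numerical check with hypothetical values, chosen so that T comes out near the ~43% quoted for a large wall-less sphere:

        def relative_threshold(t_abs, a, b):
            """Background-corrected relative threshold, assumed to be T = (T_abs - B) / (A - B)."""
            return (t_abs - b) / (a - b)

        # Hypothetical sphere: activity at the centre A = 100 (arbitrary units),
        # background B = 29 (i.e. B/A = 0.29), absolute volume-reproducing threshold T_abs = 60.
        print(relative_threshold(60.0, 100.0, 29.0))   # ~0.44, i.e. T of about 44%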

  10. ImageJS: Personalized, participated, pervasive, and reproducible image bioinformatics in the web browser

    PubMed Central

    Almeida, Jonas S.; Iriabho, Egiebade E.; Gorrepati, Vijaya L.; Wilkinson, Sean R.; Grüneberg, Alexander; Robbins, David E.; Hackney, James R.

    2012-01-01

    Background: Image bioinformatics infrastructure typically relies on a combination of server-side high-performance computing and client desktop applications tailored for graphic rendering. On the server side, matrix manipulation environments are often used as the back-end where deployment of specialized analytical workflows takes place. However, neither the server-side nor the client-side desktop solution, by itself or combined, is conducive to the emergence of open, collaborative, computational ecosystems for image analysis that are both self-sustained and user driven. Materials and Methods: ImageJS was developed as a browser-based webApp, untethered from a server-side backend, by making use of recent advances in the modern web browser such as a very efficient compiler, high-end graphical rendering capabilities, and I/O tailored for code migration. Results: Multiple versioned code hosting services were used to develop distinct ImageJS modules to illustrate its amenability to collaborative deployment without compromise of reproducibility or provenance. The illustrative examples include modules for image segmentation, feature extraction, and filtering. The deployment of image analysis by code migration is in sharp contrast with the more conventional, heavier, and less safe reliance on data transfer. Accordingly, code and data are loaded into the browser by exactly the same script tag loading mechanism, which offers a number of interesting applications that would be hard to attain with more conventional platforms, such as NIH's popular ImageJ application. Conclusions: The modern web browser was found to be advantageous for image bioinformatics in both the research and clinical environments. This conclusion reflects advantages in deployment scalability and analysis reproducibility, as well as the critical ability to deliver advanced computational statistical procedures to machines where access to sensitive data is controlled, that is, without local “download and

  11. Reproducible, rugged, and inexpensive photocathode x-ray diode

    SciTech Connect

    Idzorek, G. C.; Tierney, T. E.; Lockard, T. E.; Moy, K. J.; Keister, J. W.

    2008-10-15

    The photoemissive cathode type of x-ray diode (XRD) is popular for measuring the time- and spectrally resolved output of pulsed power experiments. Vitreous carbon XRDs currently used on the Sandia National Laboratories Z-machine were designed in the early 1980s and use materials and processes no longer available. Additionally, cathodes used in the high x-ray flux and dirty vacuum environment of a machine such as Z suffer from response changes requiring recalibration. In searching for a suitable replacement cathode, we discovered that very-high-purity vitreous-carbon planchets are commercially available for use as biological substrates in scanning electron microscope (SEM) work. After simplifying the photocathode mounting to use commercially available components, we constructed a set of 20 XRDs using SEM planchets that were then calibrated at the National Synchrotron Light Source at Brookhaven National Laboratory. We present comparisons of the reproducibility and absolute calibrations between the current vitreous-carbon XRDs and our new design.

  12. New model for datasets citation and extraction reproducibility in VAMDC

    NASA Astrophysics Data System (ADS)

    Zwölf, Carlo Maria; Moreau, Nicolas; Dubernet, Marie-Lise

    2016-09-01

    In this paper we present a new paradigm for the identification of datasets extracted from the Virtual Atomic and Molecular Data Centre (VAMDC) e-science infrastructure. Such identification includes information on the origin and version of the datasets, references associated to individual data in the datasets, as well as timestamps linked to the extraction procedure. This paradigm is described through the modifications of the language used to exchange data within the VAMDC and through the services that will implement those modifications. This new paradigm should enforce traceability of datasets, favor reproducibility of datasets extraction, and facilitate the systematic citation of the authors having originally measured and/or calculated the extracted atomic and molecular data.

  13. Reproducible surface-enhanced Raman spectroscopy of small molecular anions

    NASA Astrophysics Data System (ADS)

    Owens, F. J.

    2011-03-01

    A gold-coated silicon substrate having an array of pyramidal-shaped holes is shown to provide reproducible surface-enhanced Raman spectra (SERS) for a number of inorganic ions such as ? , ? , ? , and ? deposited on the substrate as 10^-3 to 10^-4 molar aqueous solutions of their salts. Of particular interest is the observation of a SERS effect in ? , the anion of ammonium nitrate, a commonly used terrorist explosive, suggesting the potential for sensitive detection of this material. An unusual increase in the frequency of the ? bending mode is observed in the SERS spectra of KNO2. Density Functional Theory calculations of the frequencies of the normal modes of vibration of ? bonded to gold predict an upward shift of the frequencies compared with the calculated results for a free ? , suggesting a possible explanation for the shifts.

  14. GigaDB: promoting data dissemination and reproducibility

    PubMed Central

    Sneddon, Tam P.; Si Zhe, Xiao; Edmunds, Scott C.; Li, Peter; Goodman, Laurie; Hunter, Christopher I.

    2014-01-01

    Often papers are published where the underlying data supporting the research are not made available because of the limitations of making such large data sets publicly and permanently accessible. Even if the raw data are deposited in public archives, the essential analysis intermediaries, scripts or software are frequently not made available, meaning the science is not reproducible. The GigaScience journal is attempting to address this issue with the associated data storage and dissemination portal, the GigaScience database (GigaDB). Here we present the current version of GigaDB and reveal plans for the next generation of improvements. However, most importantly, we are soliciting responses from you, the users, to ensure that future developments are focused on the data storage and dissemination issues that still need resolving. Database URL: http://www.gigadb.org PMID:24622612

  15. On the reproducibility of SSNTD track counting efficiency

    NASA Astrophysics Data System (ADS)

    Guedes O, S.; Hadler N, J. C.; Iunes, P. J.; Paulo, S. R.; Tello S, C. A.

    1998-12-01

    In this work, the influence of track density and chemical etching on the reproducibility of the track counting efficiency, ɛ0, in solid state nuclear track detectors (SSNTDs) is studied. This was performed by analysing CR-39 sheets that were attached to a thin film of natural uranium. Maintaining the chemical etching parameters constant and varying the exposure time, ɛ0 is observed to be constant for track densities varying between values approximately equal to the track background and those corresponding to the track overlapping limit, where track counting becomes difficult (~10^5 cm^-2 under our conditions). Conversely, keeping the exposure time constant and varying the etching temperature, a variation in ɛ0 is found if a usual track counting criterion is employed. However, such a variation vanishes statistically when a more rigorous criterion is adopted.

  16. geoknife: Reproducible web-processing of large gridded datasets

    USGS Publications Warehouse

    Read, Jordan S.; Walker, Jordan I.; Appling, Alison P.; Blodgett, David L.; Read, Emily K.; Winslow, Luke A.

    2016-01-01

    Geoprocessing of large gridded data according to overlap with irregular landscape features is common to many large-scale ecological analyses. The geoknife R package was created to facilitate reproducible analyses of gridded datasets found on the U.S. Geological Survey Geo Data Portal web application or elsewhere, using a web-enabled workflow that eliminates the need to download and store large datasets that are reliably hosted on the Internet. The package provides access to several data subset and summarization algorithms that are available on remote web processing servers. Outputs from geoknife include spatial and temporal data subsets, spatially-averaged time series values filtered by user-specified areas of interest, and categorical coverage fractions for various land-use types.

  17. Robust tissue classification for reproducible wound assessment in telemedicine environments

    NASA Astrophysics Data System (ADS)

    Wannous, Hazem; Treuillet, Sylvie; Lucas, Yves

    2010-04-01

    In telemedicine environments, a standardized and reproducible assessment of wounds, using a simple hand-held digital camera, is an essential requirement. However, to ensure robust tissue classification, particular attention must be paid to the complete design of the color processing chain. We introduce the key steps, including color correction, merging of expert labelings, and segmentation-driven classification based on support vector machines. The tool thus developed ensures stability under lighting, viewpoint, and camera changes, achieving accurate and robust classification of skin tissues. Clinical tests demonstrate that such an advanced tool, which forms part of a complete 3-D and color wound assessment system, significantly improves the monitoring of the healing process. It achieves an overlap score of 79.3% against 69.1% for a single expert, after mapping onto the medical reference developed from the image labeling by a college of experts.
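
    The classification step named above can be sketched with a generic support vector machine on per-region color features; the features and tissue labels below are synthetic placeholders, not the calibrated color descriptors or clinical labels of the cited system.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(1)
        # Hypothetical tissue labels (0 = granulation, 1 = slough, 2 = necrosis) and
        # four color/texture descriptors per segmented region, loosely separated by class.
        y = rng.integers(0, 3, size=300)
        X = rng.normal(loc=y[:, None], scale=1.0, size=(300, 4))

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
        clf.fit(X_tr, y_tr)
        print(f"hold-out accuracy: {clf.score(X_te, y_te):.2f}")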

  18. Reproducibility in density functional theory calculations of solids.

    PubMed

    Lejaeghere, Kurt; Bihlmayer, Gustav; Björkman, Torbjörn; Blaha, Peter; Blügel, Stefan; Blum, Volker; Caliste, Damien; Castelli, Ivano E; Clark, Stewart J; Dal Corso, Andrea; de Gironcoli, Stefano; Deutsch, Thierry; Dewhurst, John Kay; Di Marco, Igor; Draxl, Claudia; Dułak, Marcin; Eriksson, Olle; Flores-Livas, José A; Garrity, Kevin F; Genovese, Luigi; Giannozzi, Paolo; Giantomassi, Matteo; Goedecker, Stefan; Gonze, Xavier; Grånäs, Oscar; Gross, E K U; Gulans, Andris; Gygi, François; Hamann, D R; Hasnip, Phil J; Holzwarth, N A W; Iuşan, Diana; Jochym, Dominik B; Jollet, François; Jones, Daniel; Kresse, Georg; Koepernik, Klaus; Küçükbenli, Emine; Kvashnin, Yaroslav O; Locht, Inka L M; Lubeck, Sven; Marsman, Martijn; Marzari, Nicola; Nitzsche, Ulrike; Nordström, Lars; Ozaki, Taisuke; Paulatto, Lorenzo; Pickard, Chris J; Poelmans, Ward; Probert, Matt I J; Refson, Keith; Richter, Manuel; Rignanese, Gian-Marco; Saha, Santanu; Scheffler, Matthias; Schlipf, Martin; Schwarz, Karlheinz; Sharma, Sangeeta; Tavazza, Francesca; Thunström, Patrik; Tkatchenko, Alexandre; Torrent, Marc; Vanderbilt, David; van Setten, Michiel J; Van Speybroeck, Veronique; Wills, John M; Yates, Jonathan R; Zhang, Guo-Xu; Cottenier, Stefaan

    2016-03-25

    The widespread popularity of density functional theory has given rise to an extensive range of dedicated codes for predicting molecular and crystalline properties. However, each code implements the formalism in a different way, raising questions about the reproducibility of such predictions. We report the results of a community-wide effort that compared 15 solid-state codes, using 40 different potentials or basis set types, to assess the quality of the Perdew-Burke-Ernzerhof equations of state for 71 elemental crystals. We conclude that predictions from recent codes and pseudopotentials agree very well, with pairwise differences that are comparable to those between different high-precision experiments. Older methods, however, have less precise agreement. Our benchmark provides a framework for users and developers to document the precision of new applications and methodological improvements.

  19. High Interlaboratory Reproducibility and Accuracy of Next-Generation-Sequencing-Based Bacterial Genotyping in a Ring Trial.

    PubMed

    Mellmann, Alexander; Andersen, Paal Skytt; Bletz, Stefan; Friedrich, Alexander W; Kohl, Thomas A; Lilje, Berit; Niemann, Stefan; Prior, Karola; Rossen, John W; Harmsen, Dag

    2017-03-01

    Today, next-generation whole-genome sequencing (WGS) is increasingly used to determine the genetic relationships of bacteria on a nearly whole-genome level for infection control purposes and molecular surveillance. Here, we conducted a multicenter ring trial comprising five laboratories to determine the reproducibility and accuracy of WGS-based typing. The participating laboratories sequenced 20 blind-coded Staphylococcus aureus DNA samples using 250-bp paired-end chemistry for library preparation in a single sequencing run on an Illumina MiSeq sequencer. The run acceptance criteria were sequencing outputs >5.6 Gb and Q30 read quality scores of >75%. Subsequently, spa typing, multilocus sequence typing (MLST), ribosomal MLST, and core genome MLST (cgMLST) were performed by the participants. Moreover, discrepancies in cgMLST target sequences in comparisons with the included and also published sequence of the quality control strain ATCC 25923 were resolved using Sanger sequencing. All five laboratories fulfilled the run acceptance criteria in a single sequencing run without any repetition. Of the 400 total possible typing results, 394 of the reported spa types, sequence types (STs), ribosomal STs (rSTs), and cgMLST cluster types were correct and identical among all laboratories; only six typing results were missing. An analysis of cgMLST allelic profiles corroborated this high reproducibility; only 3 of 183,927 (0.0016%) cgMLST allele calls were wrong. Sanger sequencing confirmed all 12 discrepancies of the ring trial results in comparison with the published sequence of ATCC 25923. In summary, this ring trial demonstrated the high reproducibility and accuracy of current next-generation sequencing-based bacterial typing for molecular surveillance when done with nearly completely locked-down methods.

  20. High Interlaboratory Reproducibility and Accuracy of Next-Generation-Sequencing-Based Bacterial Genotyping in a Ring Trial

    PubMed Central

    Andersen, Paal Skytt; Bletz, Stefan; Friedrich, Alexander W.; Kohl, Thomas A.; Lilje, Berit; Niemann, Stefan; Prior, Karola; Rossen, John W.; Harmsen, Dag

    2017-01-01

    Today, next-generation whole-genome sequencing (WGS) is increasingly used to determine the genetic relationships of bacteria on a nearly whole-genome level for infection control purposes and molecular surveillance. Here, we conducted a multicenter ring trial comprising five laboratories to determine the reproducibility and accuracy of WGS-based typing. The participating laboratories sequenced 20 blind-coded Staphylococcus aureus DNA samples using 250-bp paired-end chemistry for library preparation in a single sequencing run on an Illumina MiSeq sequencer. The run acceptance criteria were sequencing outputs >5.6 Gb and Q30 read quality scores of >75%. Subsequently, spa typing, multilocus sequence typing (MLST), ribosomal MLST, and core genome MLST (cgMLST) were performed by the participants. Moreover, discrepancies in cgMLST target sequences in comparisons with the included and also published sequence of the quality control strain ATCC 25923 were resolved using Sanger sequencing. All five laboratories fulfilled the run acceptance criteria in a single sequencing run without any repetition. Of the 400 total possible typing results, 394 of the reported spa types, sequence types (STs), ribosomal STs (rSTs), and cgMLST cluster types were correct and identical among all laboratories; only six typing results were missing. An analysis of cgMLST allelic profiles corroborated this high reproducibility; only 3 of 183,927 (0.0016%) cgMLST allele calls were wrong. Sanger sequencing confirmed all 12 discrepancies of the ring trial results in comparison with the published sequence of ATCC 25923. In summary, this ring trial demonstrated the high reproducibility and accuracy of current next-generation sequencing-based bacterial typing for molecular surveillance when done with nearly completely locked-down methods. PMID:28053217

  1. Evaluation of Statistical Downscaling Skill at Reproducing Extreme Events

    NASA Astrophysics Data System (ADS)

    McGinnis, S. A.; Tye, M. R.; Nychka, D. W.; Mearns, L. O.

    2015-12-01

    Climate model outputs usually have much coarser spatial resolution than is needed by impacts models. Although higher resolution can be achieved using regional climate models for dynamical downscaling, further downscaling is often required. The final resolution gap is often closed with a combination of spatial interpolation and bias correction, which constitutes a form of statistical downscaling. We use this technique to downscale regional climate model data and evaluate its skill in reproducing extreme events. We downscale output from the North American Regional Climate Change Assessment Program (NARCCAP) dataset from its native 50-km spatial resolution to the 4-km resolution of University of Idaho's METDATA gridded surface meteorological dataset, which derives from the PRISM and NLDAS-2 observational datasets. We operate on the major variables used in impacts analysis at a daily timescale: daily minimum and maximum temperature, precipitation, humidity, pressure, solar radiation, and winds. To interpolate the data, we use the patch recovery method from the Earth System Modeling Framework (ESMF) regridding package. We then bias correct the data using Kernel Density Distribution Mapping (KDDM), which has been shown to exhibit superior overall performance across multiple metrics. Finally, we evaluate the skill of this technique in reproducing extreme events by comparing raw and downscaled output with meteorological station data in different bioclimatic regions according to the skill scores defined by Perkins et al. in 2013 for evaluation of AR4 climate models. We also investigate techniques for improving bias correction of values in the tails of the distributions. These techniques include binned kernel density estimation, logspline kernel density estimation, and transfer functions constructed by fitting the tails with a generalized Pareto distribution.
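
    As a simplified stand-in for the kernel-density-based bias correction named above, the sketch below implements plain empirical quantile mapping between a modelled and an observed training sample and applies it to new model output; the data are synthetic, and KDDM itself additionally smooths the distributions with kernel density estimates.

        import numpy as np

        def quantile_map(model_train, obs_train, model_new):
            """Empirical quantile mapping: transfer model values onto the observed distribution."""
            q = np.linspace(0, 1, 101)
            model_q = np.quantile(model_train, q)
            obs_q = np.quantile(obs_train, q)
            # Locate each new model value within the training model quantiles,
            # then read off the observed value at the same quantile.
            return np.interp(model_new, model_q, obs_q)

        rng = np.random.default_rng(2)
        obs = rng.gamma(shape=2.0, scale=3.0, size=5000)            # "observed" daily precipitation
        model = rng.gamma(shape=2.0, scale=4.0, size=5000) + 1.0    # biased model output, same period
        future = rng.gamma(shape=2.0, scale=4.0, size=1000) + 1.0   # new model output to correct

        corrected = quantile_map(model, obs, future)
        print(f"obs mean {obs.mean():.1f}, raw model mean {future.mean():.1f}, "
              f"corrected mean {corrected.mean():.1f}")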

  2. A workflow for reproducing mean benthic gas fluxes

    NASA Astrophysics Data System (ADS)

    Fulweiler, Robinson W.; Emery, Hollie E.; Maguire, Timothy J.

    2016-08-01

    Long-term data sets provide unique opportunities to examine temporal variability of key ecosystem processes. The need for such data sets is becoming increasingly important as we try to quantify the impact of human activities across various scales and in some cases, as we try to determine the success of management interventions. Unfortunately, long-term benthic flux data sets for coastal ecosystems are rare and curating them is a challenge. If we wish to make our data available to others now and into the future, however, then we need to provide mechanisms that allow others to understand our methods, access the data, reproduce the results, and see updates as they become available. Here we use techniques, learned through the EarthCube Ontosoft Geoscience Paper of the Future project, to develop best practices to allow us to share a long-term data set of directly measured net sediment N2 fluxes and sediment oxygen demand at two sites in Narragansett Bay, Rhode Island (USA). This technical report describes the process we used, the challenges we faced, and the steps we will take in the future to ensure transparency and reproducibility. By developing these data and software sharing tools we hope to help disseminate well-curated data with provenance as well as products from these data, so that the community can better assess how this temperate estuary has changed over time. We also hope to provide a data sharing model for others to follow so that long-term estuarine data are more easily shared and not lost over time.

  3. Reproducibility of pre-syncopal responses to repeated orthostatic challenge

    NASA Astrophysics Data System (ADS)

    Goswami, Nandu; Grasser, Erik; Roessler, Andreas; Hinghofer-Szalkay, Helmut

    Aims: To study individual patterns of hemodynamic adjustments in subjects reaching orthostatically induced presyncope and to observe whether these are reproducible across three runs. Procedures and methods: 10 healthy young males were subjected to extreme cardiovascular stress three times: Graded orthostatic stress (GOS), consisting of head-up tilt combined with lower body negative pressure, was used to achieve a pre-syncopal end-point. All test runs were separated by two-week intervals. Orthostatic effects on cardiac and vascular function were continuously monitored and standing times noted. Results: Across the group, heart rate (HR) increased by 112 percent, while mean arterial blood pressure dropped by 15 percent, pulse pressure by 36 percent, and stroke volume index by 51 percent on average from supine control to presyncope. Repetitions of the orthostatic protocols did not influence standing times of test persons from the 1st to the 3rd trial (15 ± 6 to 17 ± 7 min). Some individuals responded with an increase in HR only, while others responded with a combined, albeit brief, increase in HR and total peripheral resistance, and this individual-specific pattern was observed across the three runs of combined GOS. Conclusion: Strategies for maintaining blood pressure in response to orthostatically induced central hypovolemia differ between subjects. However, the same individual-specific hemodynamic mechanism is employed each time to maintain the blood pressure when reconfronted by this stress. Individual patterns of hemodynamic adjustments to orthostatic stress are highly reproducible when these subjects reach pre-syncope three times.

  4. Relative Validity and Reproducibility of a Quantitative Food Frequency Questionnaire for Adolescents with Type 1 Diabetes: Validity of a Food Frequency Questionnaire

    PubMed Central

    Marques, Rosana de Moraes Borges; de Oliveira, Amanda Cristine; Teles, Sheylle Almeida da Silva; Stringuini, Maria Luiza Ferreira; Fornés, Nélida Shimid

    2014-01-01

    Background: Food frequency questionnaires are used to assess dietary intake in epidemiological studies. Objective: The aim of the study was to assess the relative validity and reproducibility of a quantitative food frequency questionnaire (QFFQ) for adolescents with type 1 diabetes. Methods: Validity was evaluated by comparing the data generated by QFFQs to those of 24-hour recalls (24 hrs). QFFQs were applied twice per patient to assess reproducibility. Statistical analysis included performing t-tests, obtaining Pearson correlation coefficients when necessary, correcting agreement for chance using the weighted kappa method, calculating intraclass correlation coefficients, and generating Bland-Altman plots (P < 0.05). Results: The total energy and nutrient intake as estimated by the QFFQs were significantly higher than those from 24 hrs. Pearson correlation coefficients for energy-adjusted, deattenuated data ranged from 0.32 (protein) to 0.75 (lipid, unsaturated fat and calcium). Weighted kappa values ranged from 0.15 (vitamin C) to 0.45 (calcium). Bland-Altman plots indicated acceptable validity. As for reproducibility, intraclass correlation coefficients ranged from 0.24 (calcium) to 0.65 (lipid), and the Bland-Altman plots showed good agreement between the two questionnaires. Conclusion: The QFFQ showed an acceptable ability to correctly classify adolescents with type 1 diabetes according to their levels of dietary intake, with good reproducibility. PMID:25250051
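
    The "deattenuated" correlations mentioned above correct an observed correlation for random measurement error. One common form is Spearman's classical correction for attenuation, r_true = r_obs / sqrt(r_xx * r_yy); dietary validation studies often use a variant that corrects only for within-person variation in the reference method, so the numbers below are purely illustrative.

        import math

        r_obs = 0.55        # observed QFFQ vs. 24-hour-recall correlation (hypothetical)
        rel_qffq = 0.70     # test-retest reliability of the QFFQ (hypothetical)
        rel_recall = 0.80   # reliability of the averaged 24-hour recalls (hypothetical)

        r_deattenuated = r_obs / math.sqrt(rel_qffq * rel_recall)
        print(round(r_deattenuated, 2))   # about 0.73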

  5. 7 CFR 1207.323 - Acceptance.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE POTATO RESEARCH AND PROMOTION PLAN Potato Research and Promotion Plan National Potato Promotion Board § 1207.323 Acceptance. Each...

  6. 7 CFR 1207.323 - Acceptance.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE POTATO RESEARCH AND PROMOTION PLAN Potato Research and Promotion Plan National Potato Promotion Board § 1207.323 Acceptance. Each...

  7. 7 CFR 1207.323 - Acceptance.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE POTATO RESEARCH AND PROMOTION PLAN Potato Research and Promotion Plan National Potato Promotion Board § 1207.323 Acceptance. Each...

  8. 7 CFR 1207.323 - Acceptance.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE POTATO RESEARCH AND PROMOTION PLAN Potato Research and Promotion Plan National Potato Promotion Board § 1207.323 Acceptance. Each...

  9. 7 CFR 1207.323 - Acceptance.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE POTATO RESEARCH AND PROMOTION PLAN Potato Research and Promotion Plan National Potato Promotion Board § 1207.323 Acceptance. Each...

  10. 7 CFR 932.32 - Acceptance.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE OLIVES GROWN IN CALIFORNIA Order Regulating Handling Olive Administrative Committee § 932.32 Acceptance. Any person selected by the...

  11. 7 CFR 932.32 - Acceptance.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE OLIVES GROWN IN CALIFORNIA Order Regulating Handling Olive Administrative Committee § 932.32 Acceptance. Any person selected by the...

  12. 7 CFR 932.32 - Acceptance.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE OLIVES GROWN IN CALIFORNIA Order Regulating Handling Olive Administrative Committee § 932.32 Acceptance. Any person selected by the...

  13. 7 CFR 932.32 - Acceptance.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... AND ORDERS; FRUITS, VEGETABLES, NUTS), DEPARTMENT OF AGRICULTURE OLIVES GROWN IN CALIFORNIA Order Regulating Handling Olive Administrative Committee § 932.32 Acceptance. Any person selected by the...

  14. 7 CFR 932.32 - Acceptance.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... AND ORDERS; FRUITS, VEGETABLES, NUTS), DEPARTMENT OF AGRICULTURE OLIVES GROWN IN CALIFORNIA Order Regulating Handling Olive Administrative Committee § 932.32 Acceptance. Any person selected by the...

  15. Acceptance Criteria for Aerospace Structural Adhesives.

    DTIC Science & Technology

    ADHESIVES, *AIRFRAMES, PRIMERS, STRUCTURAL ENGINEERING, CHEMICAL COMPOSITION, MECHANICAL PROPERTIES, INDUSTRIAL PRODUCTION, DATA ACQUISITION, PARTICLE SIZE, ACCEPTANCE TESTS, ELASTOMERS, BONDING, QUALITY CONTROL.

  16. 2013 SYR Accepted Poster Abstracts.

    PubMed

    2013-01-01

    SYR 2013 Accepted Poster abstracts: 1. Benefits of Yoga as a Wellness Practice in a Veterans Affairs (VA) Health Care Setting: If You Build It, Will They Come? 2. Yoga-based Psychotherapy Group With Urban Youth Exposed to Trauma. 3. Embodied Health: The Effects of a Mind-Body Course for Medical Students. 4. Interoceptive Awareness and Vegetable Intake After a Yoga and Stress Management Intervention. 5. Yoga Reduces Performance Anxiety in Adolescent Musicians. 6. Designing and Implementing a Therapeutic Yoga Program for Older Women With Knee Osteoarthritis. 7. Yoga and Life Skills Eating Disorder Prevention Among 5th Grade Females: A Controlled Trial. 8. A Randomized, Controlled Trial Comparing the Impact of Yoga and Physical Education on the Emotional and Behavioral Functioning of Middle School Children. 9. Feasibility of a Multisite, Community based Randomized Study of Yoga and Wellness Education for Women With Breast Cancer Undergoing Chemotherapy. 10. A Delphi Study for the Development of Protocol Guidelines for Yoga Interventions in Mental Health. 11. Impact Investigation of Breathwalk Daily Practice: Canada-India Collaborative Study. 12. Yoga Improves Distress, Fatigue, and Insomnia in Older Veteran Cancer Survivors: Results of a Pilot Study. 13. Assessment of Kundalini Mantra and Meditation as an Adjunctive Treatment With Mental Health Consumers. 14. Kundalini Yoga Therapy Versus Cognitive Behavior Therapy for Generalized Anxiety Disorder and Co-Occurring Mood Disorder. 15. Baseline Differences in Women Versus Men Initiating Yoga Programs to Aid Smoking Cessation: Quitting in Balance Versus QuitStrong. 16. Pranayam Practice: Impact on Focus and Everyday Life of Work and Relationships. 17. Participation in a Tailored Yoga Program is Associated With Improved Physical Health in Persons With Arthritis. 18. Effects of Yoga on Blood Pressure: Systematic Review and Meta-analysis. 19. A Quasi-experimental Trial of a Yoga based Intervention to Reduce Stress and

  17. Tiered analytics for purity assessment of macrocyclic peptides in drug discovery: Analytical consideration and method development.

    PubMed

    Qian Cutrone, Jingfang Jenny; Huang, Xiaohua Stella; Kozlowski, Edward S; Bao, Ye; Wang, Yingzi; Poronsky, Christopher S; Drexler, Dieter M; Tymiak, Adrienne A

    2017-05-10

    Synthetic macrocyclic peptides with natural and unnatural amino acids have gained considerable attention from a number of pharmaceutical/biopharmaceutical companies in recent years as a promising approach to drug discovery, particularly for targets involving protein-protein or protein-peptide interactions. Analytical scientists charged with characterizing these leads face multiple challenges: the molecules are complex, with the potential for multiple isomers and variable charge states, and there are no established standards for acceptable analytical characterization of materials used in drug discovery. In addition, due to the lack of intermediate purification during solid-phase peptide synthesis, the final products usually contain a complex profile of impurities. In this paper, practical analytical strategies and methodologies were developed to address these challenges, including a tiered approach to assessing the purity of macrocyclic peptides at different stages of drug discovery. Our results also showed that successful progression and characterization of a new drug discovery modality benefited from active analytical engagement, focusing on fit-for-purpose analyses and leveraging a broad palette of analytical technologies and resources.

  18. Reproducibility and Variability of the Cost Functions Reconstructed from Experimental Recordings in Multi-Finger Prehension

    PubMed Central

    Niu, Xun; Latash, Mark L.; Zatsiorsky, Vladimir M.

    2012-01-01

    The main goal of the study is to examine whether the cost (objective) functions reconstructed from experimental recordings in multi-finger prehension tasks are reproducible over time, i.e., whether the functions reflect stable preferences of the subjects and can be considered personal characteristics of motor coordination. Young, healthy participants grasped an instrumented handle with varied values of external torque, load and target grasping force and repeated the trials on three days: Day 1, Day 2, and Day 7. By following Analytical Inverse Optimization (ANIO) computation procedures, the cost functions for individual subjects were reconstructed from the experimental recordings (individual finger forces) for each day. The cost functions represented second-order polynomials of finger forces with non-zero linear terms. To check whether the obtained cost functions were reproducible over time a cross-validation was performed: a cost function obtained on Day i was applied to experimental data observed on Day j (i≠j). In spite of the observed day-to-day variability of the performance and the cost functions, the ANIO reconstructed cost functions were found to be reproducible over time: application of a cost function Ci to the data of Day j (i≠j) resulted in smaller deviations from the experimental observations than using other commonly used cost functions. Other findings are: (a) The 2nd order coefficients Ki of the cost function showed negative linear relations with finger force magnitudes. This fact may be interpreted as encouraging involvement of stronger fingers in tasks requiring higher total force magnitude production. (b) The finger forces were distributed on a 2-dimensional plane in the 4-dimensional finger force space, which has been confirmed for all subjects and all testing sessions. (c) The discovered principal components in the principal component analysis of the finger forces agreed well with the principle of superposition, i.e. the complex action of
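
    The cross-validation idea described above can be made concrete with a minimal, hedged sketch under simplifying assumptions: a quadratic cost C(f) = Σ(k_i·f_i² + w_i·f_i) is minimized subject only to a total-force constraint Σf_i = F (the torque constraint of the real prehension task is omitted), and coefficients attributed to Day i are scored against finger forces observed on Day j. All coefficients and force values below are hypothetical, not reconstructed study data.

      import numpy as np

      def optimal_forces(k, w, F):
          """Minimize sum(k_i*f_i**2 + w_i*f_i) subject to sum(f_i) = F.
          Lagrange conditions give 2*k_i*f_i + w_i = lam for every finger i."""
          k, w = np.asarray(k, float), np.asarray(w, float)
          lam = (F + np.sum(w / (2 * k))) / np.sum(1 / (2 * k))
          return (lam - w) / (2 * k)

      def cross_day_deviation(k_day_i, w_day_i, forces_day_j):
          """Apply a cost function from Day i to forces observed on Day j:
          RMS deviation between observed and cost-optimal force sharing."""
          sq_errs = []
          for f_obs in forces_day_j:
              f_opt = optimal_forces(k_day_i, w_day_i, np.sum(f_obs))
              sq_errs.append(np.mean((f_obs - f_opt) ** 2))
          return np.sqrt(np.mean(sq_errs))

      # Hypothetical coefficients (index, middle, ring, little) and Day j trials (forces in N).
      k_i, w_i = [0.8, 1.0, 1.4, 2.0], [0.1, 0.1, 0.2, 0.3]
      day_j_forces = np.array([[6.1, 4.9, 3.2, 2.0], [7.0, 5.5, 3.6, 2.3]])
      print(f"cross-day RMS deviation: {cross_day_deviation(k_i, w_i, day_j_forces):.2f} N")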

  19. In acceptance we trust? Conceptualising acceptance as a viable approach to NGO security management.

    PubMed

    Fast, Larissa A; Freeman, C Faith; O'Neill, Michael; Rowley, Elizabeth

    2013-04-01

    This paper documents current understanding of acceptance as a security management approach and explores issues and challenges non-governmental organisations (NGOs) confront when implementing an acceptance approach to security management. It argues that the failure of organisations to systematise and clearly articulate acceptance as a distinct security management approach and a lack of organisational policies and procedures concerning acceptance hinder its efficacy as a security management approach. The paper identifies key and cross-cutting components of acceptance that are critical to its effective implementation in order to advance a comprehensive and systematic concept of acceptance. The key components of acceptance illustrate how organisational and staff functions affect positively or negatively an organisation's acceptance, and include: an organisation's principles and mission, communications, negotiation, programming, relationships and networks, stakeholder and context analysis, staffing, and image. The paper contends that acceptance is linked not only to good programming, but also to overall organisational management and structures.

  20. Quantum dot (QD)-modified carbon tape electrodes for reproducible electrochemiluminescence (ECL) emission on a paper-based platform.

    PubMed

    Shi, Chuan-Guo; Shan, Xia; Pan, Zhong-Qin; Xu, Jing-Juan; Lu, Chang; Bao, Ning; Gu, Hai-Ying

    2012-03-20

    Stable and sensitive electrochemiluminescence (ECL) detection relies on successful immobilization of quantum dots (QDs) on working electrodes. Herein, we report a new technique that applies double-sided carbon adhesive tape as the working electrode to improve the stability and reproducibility of QD-based ECL emission. CdS QD-modified electrodes were prepared by dropping and drying a CdS QD suspension on the carbon adhesive tape supported by indium tin oxide (ITO) glass. The ECL detection was performed with the prepared electrode on a paper-based platform. We tested our system using H2O2 at various concentrations and demonstrated that consistent ECL emission could be obtained. We attribute the stable and reproducible ECL emission to the robust attachment of the CdS QDs on the carbon adhesive tape. The proposed method could be used to quantify the concentration of dopamine from 1 μM to 10 mM based on the quenching effect of dopamine on the ECL emission of the CdS QD system using H2O2 as the coreactant. Our approach addresses the problem of integrating stable QD-based ECL detection with portable paper-based analytical devices. Similar designs offer great potential for low-cost electrochemical and ECL analytical instruments.

  1. Reproducibility of Dynamic MR Imaging Pelvic Measurements: A Multi-institutional Study

    PubMed Central

    Lockhart, Mark E.; Fielding, Julia R.; Richter, Holly E.; Brubaker, Linda; Salomon, Caryl G.; Ye, Wen; Hakim, Christiane M.; Wai, Clifford Y.; Stolpen, Alan H.; Weber, Anne M.

    2008-01-01

    Purpose: To assess the reproducibility of bone and soft-tissue pelvimetry measurements obtained from dynamic magnetic resonance (MR) imaging studies in primiparous women across multiple centers. Materials and Methods: All subjects prospectively gave consent for participation in this institutional review board–approved, HIPAA-compliant study. At six clinical sites, standardized dynamic pelvic 1.5-T multiplanar T2-weighted MR imaging was performed in three groups of primiparous women at 6–12 months after birth: group 1, vaginal delivery with anal sphincter tear (n = 93); group 2, vaginal delivery without anal sphincter tear (n = 79); and group 3, cesarean delivery without labor (n = 26). After standardized central training, blinded readers at separate clinical sites and a blinded expert central reader measured nine bone and 10 soft-tissue pelvimetry parameters. Subsequently, three readers underwent additional standardized training and reread 20 MR imaging studies. Measurement variability was assessed by using intraclass correlation for agreement between the clinical site and central readers. Acceptable agreement was defined as an intraclass correlation coefficient (ICC) of at least 0.7. Results: There was acceptable agreement (ICC range, 0.71–0.93) for eight of 19 MR imaging parameters at initial readings of 198 subjects. The remaining parameters had an ICC range of 0.13–0.66. Additional training reduced measurement variability: twelve of 19 parameters had acceptable agreement (ICC range, 0.70–0.92). Agreement was greater for bone measurements (ICC ≥ 0.70 for five of nine variables at the initial readings and eight of nine at the rereadings) than for soft-tissue measurements (ICC ≥ 0.70 for three of 10 and four of 10, respectively). Conclusion: Despite standardized central training, there is high variability of pelvic MR imaging measurements among readers, particularly for soft-tissue structures. Although slightly improved with additional
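
    The acceptance threshold used above (ICC of at least 0.7 for agreement between site and central readers) can be illustrated with a hedged sketch of a two-way, absolute-agreement intraclass correlation, ICC(2,1), following the standard Shrout-Fleiss formulation; whether this exact ICC form was used in the study is an assumption, and the reader-by-subject measurements below are invented.

      import numpy as np

      def icc_2_1(ratings):
          """ICC(2,1): two-way random effects, absolute agreement, single measurement.
          ratings: array of shape (n_subjects, k_raters)."""
          Y = np.asarray(ratings, float)
          n, k = Y.shape
          grand = Y.mean()
          ss_rows = k * np.sum((Y.mean(axis=1) - grand) ** 2)   # between subjects
          ss_cols = n * np.sum((Y.mean(axis=0) - grand) ** 2)   # between raters
          ss_err = np.sum((Y - grand) ** 2) - ss_rows - ss_cols
          msr = ss_rows / (n - 1)
          msc = ss_cols / (k - 1)
          mse = ss_err / ((n - 1) * (k - 1))
          return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

      # Invented pelvimetry-style measurements (mm): rows = subjects, columns = two readers.
      ratings = [[52.1, 53.0], [47.5, 46.8], [60.2, 61.5], [55.0, 54.1], [49.9, 50.5]]
      print(f"ICC(2,1) = {icc_2_1(ratings):.2f}  (>= 0.70 counted as acceptable agreement)")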

  2. PET Imaging of D2/3 agonist binding in healthy human subjects with the radiotracer [11C]-N-propyl-nor-apomorphine (NPA): preliminary evaluation and reproducibility studies

    PubMed Central

    Narendran, Rajesh; Frankle, W. Gordon; Mason, N. Scott; Laymon, Charles M.; Lopresti, Brian J; Price, Julie C.; Kendro, Steve; Vora, Shivangi; Litschge, Maralee; Mountz, James M.; Mathis, Chester A.

    2009-01-01

    Objective: (-)-N-[11C]-Propyl-norapomorphine (NPA) is a full dopamine D2/3 receptor agonist radiotracer suitable for imaging D2/3 receptors configured in a state of high affinity for agonists using positron emission tomography (PET). The aim of the present study was to define the optimal analytic method to derive accurate and reliable D2/3 receptor parameters with [11C]NPA. Methods: Six healthy subjects (4 females/2 males) underwent two [11C]NPA scans on the same day. D2/3 receptor binding parameters were estimated using kinetic analysis (with 1- and 2-tissue compartment models) as well as the simplified reference tissue method (SRTM) in the three functional subdivisions of the striatum (associative striatum, AST; limbic striatum, LST; and sensorimotor striatum, SMST). The test-retest variability and intraclass correlation coefficient were assessed for distribution volume (VT), binding potential relative to plasma concentration (BPP), and binding potential relative to nondisplaceable uptake (BPND). Results: A two-tissue compartment kinetic model adequately described the time-activity data of the functional subdivisions of the striatum as well as the cerebellum. The reproducibility of VT was excellent (≤ 10%) in all regions for this approach. The reproducibility of both BPP (≤ 12%) and BPND (≤ 10%) was also excellent. The intraclass correlation coefficients of BPP and BPND were acceptable as well (> 0.75) in the three functional subdivisions of the striatum. Although SRTM led to an underestimation of BPND values relative to those estimated by kinetic analysis by 8 to 13%, the values derived using both methods were reasonably well correlated (r2 = 0.89, n = 84). Both methods were similarly effective at detecting differences in [11C]NPA BPND between subjects. Conclusion: The results of this study indicate that [11C]NPA can be used to measure D2/3 receptors configured in a state of high affinity for agonists with high reliability and reproducibility in the functional subdivisions
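
    The test-retest metric reported above reduces to a simple formula: the absolute difference between same-day scan pairs divided by their mean. The sketch below shows this common PET reproducibility calculation; the BPND values are invented and the exact variability formula used by the authors is an assumption.

      import numpy as np

      def test_retest_variability(test, retest):
          """Absolute test-retest variability (%) averaged over subjects:
          |test - retest| / mean(test, retest) * 100."""
          t = np.asarray(test, float)
          r = np.asarray(retest, float)
          return np.mean(np.abs(t - r) / ((t + r) / 2)) * 100

      # Invented BPND values for six subjects, scan 1 vs. scan 2 (not study data).
      bpnd_test   = [2.10, 1.85, 2.40, 1.95, 2.25, 2.05]
      bpnd_retest = [2.02, 1.95, 2.30, 2.05, 2.15, 2.12]
      print(f"test-retest variability = {test_retest_variability(bpnd_test, bpnd_retest):.1f}%")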

  3. On the accuracy and reproducibility of fiber optic (FO) and infrared (IR) temperature measurements of solid materials in microwave applications

    NASA Astrophysics Data System (ADS)

    Durka, Tomasz; Stefanidis, Georgios D.; Van Gerven, Tom; Stankiewicz, Andrzej

    2010-04-01

    The accuracy and reproducibility of temperature measurements in solid materials under microwave heating are investigated in this work using two of the most celebrated temperature measurement techniques, namely fiber optic probes (FO) and infrared (IR) sensors. Two solid materials with a wide range of applications in heterogeneous catalysis and different microwave absorbing capabilities are examined: CeO2-ZrO2 and Al2O3 particles. We investigate a number of effects, ranging from purely technical issues, such as the use of a glass probe guide, through process operation parameters, such as the kind and the volume of the heated sample, to measurement-related issues, such as the exact location of the probe in the sample. Within this framework, the FO and IR methods are benchmarked. It was found that when using bare FO probes, not only is their lifetime reduced but the reproducibility of the results is also compromised. Using a glass probe guide greatly assists in precise location of the probe in the sample, resulting in more reproducible temperature measurements. The FO reproducibility, though, decreases with increasing temperature. Besides, contrary to conventional heating, the sample temperature decreases with decreasing sample mass (and volume) at constant irradiation power level, confirming the volumetric nature of microwave heating. Furthermore, a strongly non-uniform temperature field develops in the reactor despite the use of a monomode cavity and small sample amounts. These temperature variations, which depend on sample volume and probe position, can only be detected by FO. In contrast, IR, which actually measures the temperature at the exterior of the reactor wall, remains nearly insensitive to them and consistently underestimates the real temperature in the reactor. The modeler and the experimentalist should therefore be circumspect in accepting the IR output as a representative reactor temperature.

  4. Distributed data networks: a blueprint for Big Data sharing and healthcare analytics.

    PubMed

    Popovic, Jennifer R

    2017-01-01

    This paper defines the attributes of distributed data networks and outlines the data and analytic infrastructure needed to build and maintain a successful network. We use examples from one successful implementation of a large-scale, multisite, healthcare-related distributed data network, the U.S. Food and Drug Administration-sponsored Sentinel Initiative. Analytic infrastructure-development concepts are discussed from the perspective of promoting six pillars of analytic infrastructure: consistency, reusability, flexibility, scalability, transparency, and reproducibility. This paper also introduces one use case for machine learning algorithm development to fully utilize and advance the portfolio of population health analytics, particularly those using multisite administrative data sources.

  5. Finite element algorithm reproducing hip squeak measured in experiment

    NASA Astrophysics Data System (ADS)

    Kang, Jaeyoung

    2017-04-01

    In this study, the frequency spectrum of squeak noise in a hip joint system is measured in experiment. The numerical reproduction of the hip squeak signal involves the formulation of the finite element geometry, analytical contact kinematics such as Hertz theory and Coulomb's law, and mode discretization. For a general approach, the contact kinematics are analytically modeled so that the contact location, the contact area, the rotation direction, the pressure distribution, the friction law, and so on can be easily adjusted. Furthermore, the friction stress vectors act on the 3-dimensional spherical contact surfaces, where they can be divided into the steady-sliding direction and its transverse slip direction. Numerical calculations for various contact parameters are conducted to investigate the possibility of hip squeak occurrence, and the nonlinear oscillations after the onset of squeak are also solved. In the transient analysis, the periodic limit cycle of hip squeaking is shown to be a stick-slip type oscillation. The numerical frequency spectrum is then qualitatively compared with the hip squeak signal measured in experiment. The stick-slip oscillation during hip squeaking and its contact behavior over the contact area within one period are also discussed.
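
    The stick-slip limit cycle mentioned above can be illustrated with a drastically simplified, hedged stand-in for the finite element hip model: a single mass on a moving belt with Coulomb friction, the textbook friction-induced oscillator. All parameter values below are invented, and the model is only a sketch of the stick-slip mechanism, not of the paper's contact formulation.

      import numpy as np

      def stick_slip(m=1.0, k=100.0, mu_s=0.5, mu_k=0.3, N=50.0, v_belt=0.1,
                     dt=1e-4, steps=100_000):
          """Mass-on-moving-belt oscillator with Coulomb friction (static mu_s > kinetic mu_k):
          the classic stick-slip system. Returns the displacement history; the periodic
          limit cycle that develops is the hallmark of friction-induced (squeak-type) vibration."""
          x, v = 0.0, 0.0
          tol = 1e-2                                   # velocity band treated as "sticking"
          history = np.empty(steps)
          for i in range(steps):
              rel_v = v - v_belt
              spring = -k * x
              if abs(rel_v) < tol and abs(spring) <= mu_s * N:
                  v = v_belt                           # stick phase: mass rides with the belt
              else:
                  a = (spring - mu_k * N * np.sign(rel_v)) / m  # slip phase: kinetic friction
                  v += a * dt
              x += v * dt
              history[i] = x
          return history

      x = stick_slip()
      print(f"limit-cycle displacement range ~ {np.ptp(x[len(x)//2:]):.2f} m (peak to peak)")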

  6. Requirements for Predictive Analytics

    SciTech Connect

    Troy Hiltbrand

    2012-03-01

    It is important to have a clear understanding of how traditional Business Intelligence (BI) and analytics differ and how they fit together in optimizing organizational decision making. With traditional BI, activities are focused primarily on providing context to enhance a known set of information through aggregation, data cleansing and delivery mechanisms. As these organizations mature their BI ecosystems, they achieve a clearer picture of the key performance indicators signaling the relative health of their operations. Organizations that embark on activities surrounding predictive analytics and data mining go beyond simply presenting the data in a manner that allows decision makers to have a complete context around the information. These organizations generate models based on known information and then apply other organizational data against these models to reveal unknown information.

  7. Multifunctional nanoparticles: analytical prospects.

    PubMed

    de Dios, Alejandro Simón; Díaz-García, Marta Elena

    2010-05-07

    Multifunctional nanoparticles are among the most exciting nanomaterials with promising applications in analytical chemistry. These applications include (bio)sensing, (bio)assays, catalysis and separations. Although most of these applications are based on the magnetic, optical and electrochemical properties of multifunctional nanoparticles, other aspects such as the synergistic effect of the functional groups and the amplification effect associated with the nanoscale dimension have also been observed. Considering not only the nature of the raw material but also the shape, there is a huge variety of nanoparticles. In this review only magnetic, quantum dots, gold nanoparticles, carbon and inorganic nanotubes as well as silica, titania and gadolinium oxide nanoparticles are addressed. This review presents a narrative summary on the use of multifunctional nanoparticles for analytical applications, along with a discussion on some critical challenges existing in the field and possible solutions that have been or are being developed to overcome these challenges.

  8. Avatars in Analytical Gaming

    SciTech Connect

    Cowell, Andrew J.; Cowell, Amanda K.

    2009-08-29

    This paper discusses the design and use of anthropomorphic computer characters as nonplayer characters (NPCs) within analytical games. These new environments allow avatars to play a central role in supporting training and education goals instead of playing the supporting-cast role. This new ‘science’ of gaming, driven by high-powered but inexpensive computers, dedicated graphics processors and realistic game engines, enables game developers to create learning and training opportunities on par with expensive real-world training scenarios. However, care and attention need to be placed on how avatars are represented and thus perceived. A taxonomy of non-verbal behavior is presented and its application to analytical gaming discussed.

  9. Nuclear analytical chemistry

    SciTech Connect

    Brune, D.; Forkman, B.; Persson, B.

    1984-01-01

    This book covers the general theories and techniques of nuclear chemical analysis, directed at applications in analytical chemistry, nuclear medicine, radiophysics, agriculture, environmental sciences, geological exploration, industrial process control, etc. The main principles of nuclear physics and nuclear detection on which the analysis is based are briefly outlined. An attempt is made to emphasise the fundamentals of activation analysis, detection and activation methods, as well as their applications. The book provides guidance in analytical chemistry, agriculture, environmental and biomedical sciences, etc. The contents include: the nuclear periodic system; nuclear decay; nuclear reactions; nuclear radiation sources; interaction of radiation with matter; principles of radiation detectors; nuclear electronics; statistical methods and spectral analysis; methods of radiation detection; neutron activation analysis; charged particle activation analysis; photon activation analysis; sample preparation and chemical separation; nuclear chemical analysis in biological and medical research; the use of nuclear chemical analysis in the field of criminology; nuclear chemical analysis in environmental sciences, geology and mineral exploration; and radiation protection.

  10. Ultrasound in analytical chemistry.

    PubMed

    Priego Capote, F; Luque de Castro, M D

    2007-01-01

    Ultrasound is a type of energy which can help analytical chemists in almost all their laboratory tasks, from cleaning to detection. A generic view of the different steps which can be assisted by ultrasound is given here. These steps include preliminary operations usually not considered in most analytical methods (e.g. cleaning, degassing, and atomization), sample preparation being the main area of application. In sample preparation ultrasound is used to assist solid-sample treatment (e.g. digestion, leaching, slurry formation) and liquid-sample preparation (e.g. liquid-liquid extraction, emulsification, homogenization) or to promote heterogeneous sample treatment (e.g. filtration, aggregation, dissolution of solids, crystallization, precipitation, defoaming, degassing). Detection techniques based on use of ultrasonic radiation, the principles on which they are based, responses, and the quantities measured are also discussed.

  11. Analytic Modeling of Insurgencies

    DTIC Science & Technology

    2014-08-01

    ...influenced by interests and utilities. 4.1 Carrots and Sticks. An analytic model that captures the aforementioned utilitarian aspect is presented in ... A dynamic utility-based model is developed in [26], in which the state variables are the fractions of contrarians (supporters of the ...). Cited works include "Unanticipated Political Revolution," Public Choice, vol. 61, pp. 41-74, 1989, and [26] M. P. Atkinson, M. Kress and R. Szechtman, "Carrots, Sticks and Fog ...

  12. Industrial Analytics Corporation

    SciTech Connect

    Industrial Analytics Corporation

    2004-01-30

    The lost foam casting process is sensitive to the properties of the EPS patterns used for the casting operation. In this project Industrial Analytics Corporation (IAC) has developed a new low voltage x-ray instrument for x-ray radiography of very low mass EPS patterns. IAC has also developed a transmitted visible light method for characterizing the properties of EPS patterns. The systems developed are also applicable to other low density materials including graphite foams.

  13. Competing on analytics.

    PubMed

    Davenport, Thomas H

    2006-01-01

    We all know the power of the killer app. It's not just a support tool; it's a strategic weapon. Companies questing for killer apps generally focus all their firepower on the one area that promises to create the greatest competitive advantage. But a new breed of organization has upped the stakes: Amazon, Harrah's, Capital One, and the Boston Red Sox have all dominated their fields by deploying industrial-strength analytics across a wide variety of activities. At a time when firms in many industries offer similar products and use comparable technologies, business processes are among the few remaining points of differentiation--and analytics competitors wring every last drop of value from those processes. Employees hired for their expertise with numbers or trained to recognize their importance are armed with the best evidence and the best quantitative tools. As a result, they make the best decisions. In companies that compete on analytics, senior executives make it clear--from the top down--that analytics is central to strategy. Such organizations launch multiple initiatives involving complex data and statistical analysis, and quantitative activity is managed at the enterprise (not departmental) level. In this article, professor Thomas H. Davenport lays out the characteristics and practices of these statistical masters and describes some of the very substantial changes other companies must undergo in order to compete on quantitative turf. As one would expect, the transformation requires a significant investment in technology, the accumulation of massive stores of data, and the formulation of company-wide strategies for managing the data. But, at least as important, it also requires executives' vocal, unswerving commitment and willingness to change the way employees think, work, and are treated.

  14. Meal Replacement Mass Reduction and Integration Acceptability Study

    NASA Technical Reports Server (NTRS)

    Sirmons, T.; Barrett, A.; Richardson, M.; Arias, D.; Schneiderman, J.; Slack, K.; Williams, T.; Douglas, G.

    2017-01-01

    NASA, in planning for long-duration missions, has an imperative to provide a food system with the necessary nutrition, acceptability, and safety to ensure sustainment of crew health and performance. The Orion Multi-Purpose Crew Vehicle (MPCV) and future exploration missions are mass-constrained; therefore, the team is challenged to reduce the mass of the food system by 10% while maintaining product safety, nutrition, and acceptability. Commercially available products do not meet the nutritional requirements for a full meal replacement in the spaceflight food system, and it is currently unknown whether daily meal replacements will impact crew food intake and psychosocial health over time. The purpose of this study was to develop a variety of nutritionally balanced breakfast replacement bars that meet spaceflight nutritional, microbiological, sensorial, and shelf-life requirements, while enabling a 10% savings in food mass. To date, six nutrient-dense meal replacement bars (approximately 700 calories per bar) have been developed, using traditional methods of compression as well as novel ultrasonic compression technologies developed by Creative Resonance Inc. (Phoenix, AZ). The four highest-rated bars were evaluated in the Human Exploration Research Analog (HERA) to assess the frequency with which actual meal replacement options may be implemented, specifically the overall impact of the bars on mood, satiety, digestive discomfort, and satisfaction with food. These factors are currently being analyzed to inform implementation strategies under which crew maintain adequate food intake. In addition, these bars are currently undergoing shelf-life testing to determine long-term sensory acceptability, nutritional stability, qualitative stability of analytical measurements (i.e. water activity and texture), and microbiological compliance over two years of storage at room temperature and potential temperature abuse conditions to predict long-term acceptability. It is expected that

  15. Heavy Metal, Religiosity, and Suicide Acceptability.

    ERIC Educational Resources Information Center

    Stack, Steven

    1998-01-01

    Reports on data taken from the General Social Survey that found a link between "heavy metal" rock fanship and suicide acceptability. Finds that relationship becomes nonsignificant once level of religiosity is controlled. Heavy metal fans are low in religiosity, which contributes to greater suicide acceptability. (Author/JDM)

  16. Hanford Site liquid waste acceptance criteria

    SciTech Connect

    LUECK, K.J.

    1999-09-11

    This document provides the waste acceptance criteria for liquid waste managed by Waste Management Federal Services of Hanford, Inc. (WMH). These waste acceptance criteria address the various requirements to operate a facility in compliance with applicable environmental, safety, and operational requirements. This document also addresses the sitewide miscellaneous streams program.

  17. 48 CFR 411.103 - Market acceptance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Market acceptance. 411.103... ACQUISITION PLANNING DESCRIBING AGENCY NEEDS Selecting and Developing Requirements Documents 411.103 Market... accordance with FAR 11.103(a), the market acceptability of their items to be offered. (b) The...

  18. 48 CFR 3011.103 - Market acceptance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 7 2011-10-01 2011-10-01 false Market acceptance. 3011.103 Section 3011.103 Federal Acquisition Regulations System DEPARTMENT OF HOMELAND SECURITY, HOMELAND... Developing Requirements Documents 3011.103 Market acceptance. (a) Contracting officers may act on behalf...

  19. 48 CFR 411.103 - Market acceptance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 4 2011-10-01 2011-10-01 false Market acceptance. 411.103... ACQUISITION PLANNING DESCRIBING AGENCY NEEDS Selecting and Developing Requirements Documents 411.103 Market... accordance with FAR 11.103(a), the market acceptability of their items to be offered. (b) The...

  20. 48 CFR 3011.103 - Market acceptance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Market acceptance. 3011.103 Section 3011.103 Federal Acquisition Regulations System DEPARTMENT OF HOMELAND SECURITY, HOMELAND... Developing Requirements Documents 3011.103 Market acceptance. (a) Contracting officers may act on behalf...

  1. Nevada Test Site Waste Acceptance Criteria (NTSWAC)

    SciTech Connect

    NNSA /NSO Waste Management Project

    2008-06-01

    This document establishes the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office, Nevada Test Site Waste Acceptance Criteria (NTSWAC). The NTSWAC provides the requirements, terms, and conditions under which the Nevada Test Site will accept low-level radioactive waste (LLW) and LLW mixed waste (MW) for disposal.

  2. Consumer acceptance of ginseng food products.

    PubMed

    Chung, Hee Sook; Lee, Young-Chul; Rhee, Young Kyung; Lee, Soo-Yeun

    2011-01-01

    Ginseng has been utilized less in food products than in dietary supplements in the United States. Sensory acceptance of ginseng food products by U.S. consumers has not been reported. The objectives of this study were to: (1) determine the sensory acceptance of commercial ginseng food products and (2) assess the influence of the addition of sweeteners to ginseng tea and of ginseng extract to chocolate on consumer acceptance. A total of 126 consumers participated in 3 sessions for (1) 7 commercial red ginseng food products, (2) 10 ginseng teas varying in levels of sugar or honey, and (3) 10 ginseng milk or dark chocolates varying in levels of ginseng extract. Ginseng candy with vitamin C and ginseng crunchy white chocolate were the most highly accepted, while the sliced ginseng root product was the least accepted among the seven commercial products. Sensory acceptance increased in proportion to the content of sugar and honey in ginseng tea, whereas acceptance decreased with increasing content of ginseng extract in milk and dark chocolates. The findings demonstrate that ginseng food product types with which consumers are already familiar, such as candy and chocolate, have potential for success in the U.S. market. Chocolate could be suggested as a food matrix into which ginseng can be incorporated, as it can carry more bioactive compounds than ginseng tea at a similar acceptance level. Future research may include a descriptive analysis of ginseng-based products to identify the key drivers of liking and disliking for successful new product development.

  3. Genres Across Cultures: Types of Acceptability Variation

    ERIC Educational Resources Information Center

    Shaw, Philip; Gillaerts, Paul; Jacobs, Everett; Palermo, Ofelia; Shinohara, Midori; Verckens, J. Piet

    2004-01-01

    One can ask four questions about genre validity across cultures. Does a certain form or configuration occur in the culture in question? Is it acceptable? If acceptable, is it in practice preferred? Is it recommended by prescriptive authorities? This paper reports the results of an attempt to answer these questions empirically by testing the…

  4. 48 CFR 11.103 - Market acceptance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Market acceptance. 11.103... DESCRIBING AGENCY NEEDS Selecting and Developing Requirements Documents 11.103 Market acceptance. (a) Section... may, under appropriate circumstances, require offerors to demonstrate that the items offered— (1)...

  5. 48 CFR 2811.103 - Market acceptance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Market acceptance. 2811.103... Planning DESCRIBING AGENCY NEEDS Selecting and Developing Requirements Documents 2811.103 Market acceptance... offerors to demonstrate that the items offered meet the criteria set forth in FAR 11.103(a)....

  6. 5 CFR 1655.11 - Loan acceptance.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 3 2013-01-01 2013-01-01 false Loan acceptance. 1655.11 Section 1655.11 Administrative Personnel FEDERAL RETIREMENT THRIFT INVESTMENT BOARD LOAN PROGRAM § 1655.11 Loan acceptance. The TSP record keeper will reject a loan application if: (a) The participant is not qualified to apply...

  7. 5 CFR 1655.11 - Loan acceptance.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 3 2011-01-01 2011-01-01 false Loan acceptance. 1655.11 Section 1655.11 Administrative Personnel FEDERAL RETIREMENT THRIFT INVESTMENT BOARD LOAN PROGRAM § 1655.11 Loan acceptance. The TSP record keeper will reject a loan application if: (a) The participant is not qualified to apply...

  8. 5 CFR 1655.11 - Loan acceptance.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 5 Administrative Personnel 3 2012-01-01 2012-01-01 false Loan acceptance. 1655.11 Section 1655.11 Administrative Personnel FEDERAL RETIREMENT THRIFT INVESTMENT BOARD LOAN PROGRAM § 1655.11 Loan acceptance. The TSP record keeper will reject a loan application if: (a) The participant is not qualified to apply...

  9. 48 CFR 3011.103 - Market acceptance.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 7 2014-10-01 2014-10-01 false Market acceptance. 3011.103 Section 3011.103 Federal Acquisition Regulations System DEPARTMENT OF HOMELAND SECURITY, HOMELAND... Developing Requirements Documents 3011.103 Market acceptance. (a) Contracting officers may act on behalf...

  10. 48 CFR 411.103 - Market acceptance.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 4 2012-10-01 2012-10-01 false Market acceptance. 411.103... ACQUISITION PLANNING DESCRIBING AGENCY NEEDS Selecting and Developing Requirements Documents 411.103 Market... accordance with FAR 11.103(a), the market acceptability of their items to be offered. (b) The...

  11. 48 CFR 411.103 - Market acceptance.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 4 2014-10-01 2014-10-01 false Market acceptance. 411.103... ACQUISITION PLANNING DESCRIBING AGENCY NEEDS Selecting and Developing Requirements Documents 411.103 Market... accordance with FAR 11.103(a), the market acceptability of their items to be offered. (b) The...

  12. 48 CFR 3011.103 - Market acceptance.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 7 2013-10-01 2012-10-01 true Market acceptance. 3011.103 Section 3011.103 Federal Acquisition Regulations System DEPARTMENT OF HOMELAND SECURITY, HOMELAND SECURITY... Requirements Documents 3011.103 Market acceptance. (a) Contracting officers may act on behalf of the head...

  13. 48 CFR 3011.103 - Market acceptance.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 7 2012-10-01 2012-10-01 false Market acceptance. 3011.103 Section 3011.103 Federal Acquisition Regulations System DEPARTMENT OF HOMELAND SECURITY, HOMELAND... Developing Requirements Documents 3011.103 Market acceptance. (a) Contracting officers may act on behalf...

  14. 48 CFR 411.103 - Market acceptance.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 4 2013-10-01 2013-10-01 false Market acceptance. 411.103... ACQUISITION PLANNING DESCRIBING AGENCY NEEDS Selecting and Developing Requirements Documents 411.103 Market... accordance with FAR 11.103(a), the market acceptability of their items to be offered. (b) The...

  15. Fabricating Cotton Analytical Devices.

    PubMed

    Lin, Shang-Chi; Hsu, Min-Yen; Kuan, Chen-Meng; Tseng, Fan-Gang; Cheng, Chao-Min

    2016-08-30

    A robust, low-cost analytical device should be user-friendly, rapid, and affordable. Such devices should also be able to operate with scarce samples and provide information for follow-up treatment. Here, we demonstrate the development of a cotton-based urinalysis (i.e., nitrite, total protein, and urobilinogen assays) analytical device that employs a lateral flow-based format, and is inexpensive, easily fabricated, rapid, and can be used to conduct multiple tests without cross-contamination worries. Cotton is composed of cellulose fibers with natural absorptive properties that can be leveraged for flow-based analysis. The simple but elegant fabrication process of our cotton-based analytical device is described in this study. The arrangement of the cotton structure and test pad takes advantage of the hydrophobicity and absorptive strength of each material. Because of these physical characteristics, colorimetric results can persistently adhere to the test pad. This device enables physicians to receive clinical information in a timely manner and shows great potential as a tool for early intervention.

  16. Understanding diversity: the importance of social acceptance.

    PubMed

    Chen, Jacqueline M; Hamilton, David L

    2015-04-01

    Two studies investigated how people define and perceive diversity in the historically majority-group dominated contexts of business and academia. We hypothesized that individuals construe diversity as both the numeric representation of racial minorities and the social acceptance of racial minorities within a group. In Study 1, undergraduates' (especially minorities') perceptions of campus diversity were predicted by perceived social acceptance on a college campus, above and beyond perceived minority representation. Study 2 showed that increases in a company's representation and social acceptance independently led to increases in perceived diversity of the company among Whites. Among non-Whites, representation and social acceptance only increased perceived diversity of the company when both qualities were high. Together these findings demonstrate the importance of both representation and social acceptance to the achievement of diversity in groups and that perceiver race influences the relative importance of these two components of diversity.

  17. Heavy metal, religiosity, and suicide acceptability.

    PubMed

    Stack, S

    1998-01-01

    There has been little work at the national level on the subject of musical subcultures and suicide acceptability. The present work explores the link between "heavy metal" rock fanship and suicide acceptability. Metal fanship is thought to elevate suicide acceptability through such means as exposure to a culture of personal and societal chaos marked by hopelessness, and through its associations with demographic risk factors such as gender, socioeconomic status, and education. Data are taken from the General Social Survey. A link between heavy metal fanship and suicide acceptability is found. However, this relationship becomes nonsignificant once level of religiosity is controlled. Metal fans are low in religiosity, which contributes, in turn, to greater suicide acceptability.

  18. Monte Carlo determination of Phoswich Array acceptance

    SciTech Connect

    Costales, J.B.; E859 Collaboration

    1992-07-01

    The purpose of this memo is to describe the means by which the acceptance of the E859 Phoswich Array is determined. By acceptance, two things are meant: first, the geometrical acceptance (the angular size of the modules); second, the detection acceptance (the probability that a particle of a given 4-momentum initially in the detector line-of-sight is detected as such). In particular, this memo will concentrate on those particles for which the energy of the particle can be sufficiently measured; that is to say, protons, deuterons and tritons. In principle, the phoswich array can measure the low end of the pion energy spectrum, but with a poor resolution. The detection acceptance of pions and baryon clusters heavier than tritons will be neglected in this memo.
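
    As a hedged illustration of the geometrical part of such an acceptance calculation (not the E859 code itself), the sketch below estimates the fraction of isotropically emitted particles whose directions fall inside one module's angular window; the angular limits are invented.

      import numpy as np

      def geometric_acceptance(theta_min, theta_max, phi_min, phi_max, n=1_000_000, seed=0):
          """Monte Carlo estimate of geometric acceptance: fraction of isotropically
          generated directions falling inside a polar/azimuthal window (radians)."""
          rng = np.random.default_rng(seed)
          cos_theta = rng.uniform(-1.0, 1.0, n)      # isotropic emission is flat in cos(theta)
          phi = rng.uniform(0.0, 2 * np.pi, n)
          theta = np.arccos(cos_theta)
          hit = ((theta >= theta_min) & (theta <= theta_max) &
                 (phi >= phi_min) & (phi <= phi_max))
          return hit.mean()

      # Invented module window: 5-15 degrees in polar angle, a 10-degree slice in azimuth.
      acc = geometric_acceptance(np.radians(5), np.radians(15), 0.0, np.radians(10))
      print(f"geometric acceptance ~ {acc:.1e} of the full solid angle")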

  19. Highly reproducible SERS detection in sequential injection analysis: real time preparation and application of photo-reduced silver substrate in a moving flow-cell.

    PubMed

    El-Zahry, Marwa R; Genner, Andreas; Refaat, Ibrahim H; Mohamed, Horria A; Lendl, Bernhard

    2013-11-15

    This paper reports an improved way of performing highly reproducible surface-enhanced Raman scattering (SERS) measurements of different analytes using an automated flow system. The method uses a confocal Raman microscope to prepare SERS-active silver spots on the window of a flow cell by photo-reduction of silver nitrate in the presence of citrate. Placement of the flow cell on the automated x and y stages of the Raman microscope allows a fresh spot to be prepared for every new measurement. This procedure thus efficiently avoids any carry-over effects that might result from adsorption of the analyte on the SERS-active material and enables highly reproducible SERS measurements. For reproducible liquid handling, both the sequential injection analysis system and the Raman microscope were operated by the flexible LabVIEW-based software ATLAS developed in our group. Quantitative aspects were investigated using Cu(PAR)2 as a model analyte. Concentrations down to 5×10⁻⁶ M provided clear SERS spectra, and a linear concentration dependence of the SERS intensities at 1333 cm⁻¹ was obtained from 5×10⁻⁵ M to 1×10⁻³ M with a correlation coefficient r = 0.999. The coefficient of variation of the method (Vx0) was found to be 5.6% and the calculated limit of detection 1.7×10⁻⁵ M. The results demonstrate the potential of SERS spectroscopy to be used as a molecule-specific detector in aqueous flow systems.
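
    A hedged sketch of the quantitative workup reported above (a linear calibration of SERS intensity against concentration, its correlation coefficient, and a detection limit estimated from the calibration residuals): the intensity values are invented, and the 3.3·s/slope convention used here is a common approximation rather than necessarily the exact calculation in the paper.

      import numpy as np

      def linear_calibration(conc, signal):
          """Least-squares calibration line, correlation coefficient, and a detection
          limit estimated as 3.3 * (residual standard deviation) / slope."""
          c = np.asarray(conc, float)
          y = np.asarray(signal, float)
          slope, intercept = np.polyfit(c, y, 1)
          r = np.corrcoef(c, y)[0, 1]
          s_resid = np.sqrt(np.sum((y - (slope * c + intercept)) ** 2) / (len(c) - 2))
          return slope, intercept, r, 3.3 * s_resid / slope

      # Invented SERS intensities at 1333 cm-1 for a Cu(PAR)2-like calibration series.
      conc = [5e-5, 1e-4, 2.5e-4, 5e-4, 1e-3]        # mol/L
      signal = [120, 230, 610, 1190, 2380]           # arbitrary counts
      slope, intercept, r, lod = linear_calibration(conc, signal)
      print(f"r = {r:.3f}, estimated LOD ~ {lod:.1e} M")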

  20. Diet rapidly and reproducibly alters the human gut microbiome

    PubMed Central

    David, Lawrence A.; Maurice, Corinne F.; Carmody, Rachel N.; Gootenberg, David B.; Button, Julie E.; Wolfe, Benjamin E.; Ling, Alisha V.; Devlin, A. Sloan; Varma, Yug; Fischbach, Michael A.; Biddinger, Sudha B.; Dutton, Rachel J.; Turnbaugh, Peter J.

    2013-01-01

    Long-term diet influences the structure and activity of the trillions of microorganisms residing in the human gut [1–5], but it remains unclear how rapidly and reproducibly the human gut microbiome responds to short-term macronutrient change. Here, we show that the short-term consumption of diets composed entirely of animal or plant products alters microbial community structure and overwhelms inter-individual differences in microbial gene expression. The animal-based diet increased the abundance of bile-tolerant microorganisms (Alistipes, Bilophila, and Bacteroides) and decreased the levels of Firmicutes that metabolize dietary plant polysaccharides (Roseburia, Eubacterium rectale, and Ruminococcus bromii). Microbial activity mirrored differences between herbivorous and carnivorous mammals [2], reflecting trade-offs between carbohydrate and protein fermentation. Foodborne microbes from both diets transiently colonized the gut, including bacteria, fungi, and even viruses. Finally, increases in the abundance and activity of Bilophila wadsworthia on the animal-based diet support a link between dietary fat, bile acids, and the outgrowth of microorganisms capable of triggering inflammatory bowel disease [6]. In concert, these results demonstrate that the gut microbiome can rapidly respond to altered diet, potentially facilitating the diversity of human dietary lifestyles. PMID:24336217

  1. Resting Functional Connectivity of Language Networks: Characterization and Reproducibility

    PubMed Central

    Tomasi, Dardo; Volkow, Nora D.

    2011-01-01

    The neural basis of language comprehension and production has been associated with superior temporal (Wernicke’s) and inferior frontal (Broca’s) cortical areas, respectively. However, recent resting state functional connectivity (RSFC) and lesion studies implicate a more extended network in language processing. Using a large RSFC dataset from 970 healthy subjects and seed regions in Broca’s and Wernicke’s areas, we recapitulate this extended network, which includes adjoining prefrontal, temporal and parietal regions but also the bilateral caudate and left putamen/globus pallidus and subthalamic nucleus. We also show that the language network has a predominance of short-range functional connectivity (except the posterior Wernicke’s area, which exhibited predominant long-range connectivity), consistent with reliance on local processing. The long-range connectivity was predominantly left-lateralized (except the anterior Wernicke’s area, which exhibited rightward lateralization). The language network also exhibited anticorrelated activity with auditory (only for Wernicke’s area) and visual cortices, which suggests integrated sequential activity with regions involved in listening to or reading words. Assessment of the intra-subject reproducibility of this network and its characterization in individuals with language dysfunction are needed to determine its potential as a biomarker for language disorders. PMID:22212597

  2. Virtual Raters for Reproducible and Objective Assessments in Radiology

    NASA Astrophysics Data System (ADS)

    Kleesiek, Jens; Petersen, Jens; Döring, Markus; Maier-Hein, Klaus; Köthe, Ullrich; Wick, Wolfgang; Hamprecht, Fred A.; Bendszus, Martin; Biller, Armin

    2016-04-01

    Volumetric measurements in radiologic images are important for monitoring tumor growth and treatment response. To make these more reproducible and objective, we introduce the concept of virtual raters (VRs). A virtual rater is obtained by combining knowledge from machine-learning algorithms trained with past annotations of multiple human raters with the instantaneous rating of one human expert; the single expert is thus virtually guided by several experts. To evaluate the approach we perform experiments with multi-channel magnetic resonance imaging (MRI) data sets. In addition to gross tumor volume (GTV), we also investigate subcategories such as edema, contrast-enhancing and non-enhancing tumor. The first data set consists of N = 71 longitudinal follow-up scans of 15 patients suffering from glioblastoma (GB). The second data set comprises N = 30 scans of low- and high-grade gliomas. For comparison we computed the Pearson correlation, intraclass correlation coefficient (ICC) and Dice score. Virtual raters always led to an improvement in inter- and intra-rater agreement. Comparing the 2D Response Assessment in Neuro-Oncology (RANO) measurements to the volumetric measurements of the virtual raters resulted in a deviating rating in one-third of the cases. Hence, we believe that our approach will have an impact on the evaluation of clinical studies as well as on routine imaging diagnostics.

  3. Reproducing Natural Spider Silks' Copolymer Behavior in Synthetic Silk Mimics

    SciTech Connect

    An, Bo; Jenkins, Janelle E; Sampath, Sujatha; Holland, Gregory P; Hinman, Mike; Yarger, Jeffery L; Lewis, Randolph

    2012-10-30

    Dragline silk from orb-weaving spiders is a copolymer of two large proteins, major ampullate spidroin 1 (MaSp1) and 2 (MaSp2). The ratio of these proteins is known to have a large variation across different species of orb-weaving spiders. NMR results from gland material of two different species of spiders, N. clavipes and A. aurantia, indicate that MaSp1 proteins are more easily formed into β-sheet nanostructures, while MaSp2 proteins form random coil and helical structures. To test whether this behavior of natural silk proteins could be reproduced by recombinantly produced spider silk mimic protein, recombinant MaSp1/MaSp2 mixed fibers as well as chimeric silk fibers from MaSp1 and MaSp2 sequences in a single protein were produced based on the variable ratio and conserved motifs of MaSp1 and MaSp2 in native silk fiber. Mechanical properties, solid-state NMR, and XRD results of the tested synthetic fibers indicate the differing roles of MaSp1 and MaSp2 in the fiber and verify the importance of postspin stretching treatment in helping the fiber to form the proper spatial structure.

  4. Reproducibility of Differential Proteomic Technologies in CPTAC Fractionated Xenografts

    PubMed Central

    2015-01-01

    The NCI Clinical Proteomic Tumor Analysis Consortium (CPTAC) employed a pair of reference xenograft proteomes for initial platform validation and ongoing quality control of its data collection for The Cancer Genome Atlas (TCGA) tumors. These two xenografts, representing basal and luminal-B human breast cancer, were fractionated and analyzed on six mass spectrometers in a total of 46 replicates divided between iTRAQ and label-free technologies, spanning a total of 1095 LC–MS/MS experiments. These data represent a unique opportunity to evaluate the stability of proteomic differentiation by mass spectrometry over many months of time for individual instruments or across instruments running dissimilar workflows. We evaluated iTRAQ reporter ions, label-free spectral counts, and label-free extracted ion chromatograms as strategies for data interpretation (source code is available from http://homepages.uc.edu/~wang2x7/Research.htm). From these assessments, we found that differential genes from a single replicate were confirmed by other replicates on the same instrument from 61 to 93% of the time. When comparing across different instruments and quantitative technologies, using multiple replicates, differential genes were reproduced by other data sets from 67 to 99% of the time. Projecting gene differences to biological pathways and networks increased the degree of similarity. These overlaps send an encouraging message about the maturity of technologies for proteomic differentiation. PMID:26653538

  5. Reproducibility and reliability of fetal cardiac time intervals using magnetocardiography.

    PubMed

    van Leeuwen, P; Lange, S; Klein, A; Geue, D; Zhang, Y; Krause, H J; Grönemeyer, D

    2004-04-01

    We investigated several factors which may affect the accuracy of fetal cardiac time intervals (CTI) determined in magnetocardiographic (MCG) recordings: observer differences, the number of available recording sites and the type of sensor used in acquisition. In 253 fetal MCG recordings, acquired using different biomagnetometer devices between the 15th and 42nd weeks of gestation, P-wave, QRS complex and T-wave onsets and ends were identified in signal-averaged data sets independently by different observers. Using a defined procedure for setting signal events, interobserver reliability was high. Increasing the number of registration sites led to more accurate identification of the events. The differences in wave morphology between magnetometer and gradiometer configurations led to deviations in timing, whereas the differences between low- and high-temperature devices seemed to be primarily due to noise. Signal-to-noise ratio played an important overall role in the accurate determination of CTI, and changes in signal amplitude associated with fetal maturation may largely explain the effects of gestational age on reproducibility. As fetal CTI may be of value in the identification of pathologies such as intrauterine growth retardation or fetal cardiac hypertrophy, their reliable estimation will be enhanced by strategies which take these factors into account.
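
    The role of signal-to-noise ratio highlighted above follows from the signal-averaging step: averaging N time-locked beats suppresses uncorrelated noise by roughly the square root of N. The sketch below demonstrates this with a purely synthetic QRS-like pulse; none of the waveform or noise parameters come from the study.

      import numpy as np

      def snr_gain_from_averaging(n_beats=100, noise_sd=5.0, seed=0):
          """Average n_beats noisy copies of a template beat and compare the SNR
          before and after; uncorrelated noise should drop by ~sqrt(n_beats)."""
          rng = np.random.default_rng(seed)
          t = np.linspace(-0.05, 0.05, 200)
          template = np.exp(-(t / 0.01) ** 2)        # toy QRS-like pulse, unit amplitude
          beats = template + rng.normal(0.0, noise_sd, (n_beats, t.size))
          averaged = beats.mean(axis=0)
          snr_single = template.max() / noise_sd
          snr_averaged = template.max() / (averaged - template).std()
          return snr_single, snr_averaged

      s1, s_avg = snr_gain_from_averaging()
      print(f"single-beat SNR ~ {s1:.2f}, averaged SNR ~ {s_avg:.2f} (expected gain ~ 10)")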

  6. Direct, quantitative clinical assessment of hand function: usefulness and reproducibility.

    PubMed

    Goodson, Alexander; McGregor, Alison H; Douglas, Jane; Taylor, Peter

    2007-05-01

    Methods of assessing functional impairment in arthritic hands include pain assessments and disability scoring scales, which are subjective, variable over time, and fail to take account of the patients' need to adapt to deformities. The aim of this study was to evaluate measures of functional strength and joint motion in the assessment of the rheumatoid (RA) and osteoarthritic (OA) hand. Ten control subjects, ten RA and ten OA patients were recruited for the study. All underwent pain and disability scoring and functional assessment of the hand using measures of pinch/grip strength and range of joint motion (ROM). Functional assessments, including ROM analyses at interphalangeal (IP), metacarpophalangeal (MCP) and wrist joints along with pinch/grip strength, clearly discriminated between patient groups (RA vs. OA MCP ROM, P<0.0001), whereas pain and disability scales did not. In the RA group there were demonstrable relationships between ROM measurements and both disability (R2=0.31) and disease duration (R2=0.37). Intra-patient measures of strength were robust whereas inter-patient comparisons showed variability. In conclusion, pinch/grip strength and ROM are clinically reproducible assessments that may more accurately reflect functional impairment associated with arthritis.

  7. Numerically reproduced internal wave spectra in the deep ocean

    NASA Astrophysics Data System (ADS)

    Sugiyama, Yoshifumi; Niwa, Yoshihiro; Hibiya, Toshiyuki

    2009-04-01

    A vertically two-dimensional internal wave field is forced equally at the near-inertial frequency and the semidiurnal tidal frequency both at the lowest vertical wavenumber. These correspond to wind forcing and internal tide forcing, the main energy sources for the internal wave field. After 5 years of spin-up, a quasi-stationary internal wave field with characteristics of the Garrett-Munk-like spectrum is successfully reproduced. Furthermore, we carry out additional experiments by changing the strength of the semidiurnal tidal forcing relative to the near-inertial forcing. It is demonstrated that the Garrett-Munk-like spectrum is created and maintained only when energy is supplied both from the near-inertial forcing and the semidiurnal tidal forcing. So long as both energy sources are available, nonlinear interactions among internal waves occur such that the resulting internal wave spectrum becomes close to the Garrett-Munk-like spectrum irrespective of the ratio of the near-inertial forcing to the semidiurnal tidal forcing.

  8. Reproducing Natural Spider Silks’ Copolymer Behavior in Synthetic Silk Mimics

    PubMed Central

    An, Bo; Jenkins, Janelle E.; Sampath, Sujatha; Holland, Gregory P.; Hinman, Mike; Yarger, Jeffery L.; Lewis, Randolph

    2012-01-01

    Dragline silk from orb-weaving spiders is a copolymer of two large proteins, major ampullate spidroin 1 (MaSp1) and 2 (MaSp2). The ratio of these proteins is known to have a large variation across different species of orb-weaving spiders. NMR results from gland material of two different species of spiders, N. clavipes and A. aurantia, indicate that MaSp1 proteins are more easily formed into β-sheet nanostructures, while MaSp2 proteins form random coil and helical structures. To test whether this behavior of natural silk proteins could be reproduced by recombinantly produced spider silk mimic protein, recombinant MaSp1/MaSp2 mixed fibers as well as chimeric silk fibers from MaSp1 and MaSp2 sequences in a single protein were produced based on the variable ratio and conserved motifs of MaSp1 and MaSp2 in native silk fiber. Mechanical properties, solid-state NMR, and XRD results of tested synthetic fibers indicate the differing roles of MaSp1 and MaSp2 in the fiber and verify the importance of postspin stretching treatment in helping the fiber to form the proper spatial structure. PMID:23110450

  9. Repeatability and reproducibility of aquatic testing with zinc dithiophosphate

    SciTech Connect

    Hooter, D.L.; Hoke, D.I.; Kraska, R.C.; Wojewodka, R.A.

    1994-12-31

    This testing program was designed to characterize the repeatability and reproducibility of aquatic screening studies with a water insoluble chemical substance. Zinc dithiophosphate was selected for its limited water solubility and moderate aquatic toxicity. Acute tests were conducted using fathead minnows and Daphnia magna, according to guidelines developed to minimize random sources of non-repeatability. The organisms were exposed to zinc dithiophosphate in static tests using an oil-water dispersion method for the fathead minnows, and a water-accommodated-fraction method for the Daphnia magna. Testing was conducted in moderately hard water with pre-determined nominal concentrations of 0.1, 1.0, 10.0, 100.0, and 1000.0 ppm or ppm WAF. Twenty-four studies were contracted among three separate commercial contract laboratories. The program results demonstrate the diverse range of intralaboratory and interlaboratory variability based on the organism type, and emphasize the need for further study and caution in the design and implementation of aquatic testing for insoluble materials.

  10. Can a coupled meteorology–chemistry model reproduce the ...

    EPA Pesticide Factsheets

    The ability of a coupled meteorology–chemistry model, i.e., Weather Research and Forecast and Community Multiscale Air Quality (WRF-CMAQ), to reproduce the historical trend in aerosol optical depth (AOD) and clear-sky shortwave radiation (SWR) over the Northern Hemisphere has been evaluated through a comparison of 21-year simulated results with observation-derived records from 1990 to 2010. Six satellite-retrieved AOD products including AVHRR, TOMS, SeaWiFS, MISR, MODIS-Terra and MODIS-Aqua as well as long-term historical records from 11 AERONET sites were used for the comparison of AOD trends. Clear-sky SWR products derived by CERES at both the top of atmosphere (TOA) and surface as well as surface SWR data derived from seven SURFRAD sites were used for the comparison of trends in SWR. The model successfully captured increasing AOD trends along with the corresponding increased TOA SWR (upwelling) and decreased surface SWR (downwelling) in both eastern China and the northern Pacific. The model also captured declining AOD trends along with the corresponding decreased TOA SWR (upwelling) and increased surface SWR (downwelling) in the eastern US, Europe and the northern Atlantic for the period of 2000–2010. However, the model underestimated the AOD over regions with substantial natural dust aerosol contributions, such as the Sahara Desert, Arabian Desert, central Atlantic and northern Indian Ocean. Estimates of the aerosol direct radiative effect (DRE) at TOA a

  11. Highly reproducible thermocontrolled electrospun fiber based organic photovoltaic devices.

    PubMed

    Kim, Taehoon; Yang, Seung Jae; Sung, Sae Jin; Kim, Yern Seung; Chang, Mi Se; Jung, Haesol; Park, Chong Rae

    2015-03-04

    In this work, we examined the reasons underlying humidity-induced morphological changes of electrospun fibers and suggest a method of controlling fiber morphology under high humidity conditions. We fabricated OPV devices composed of electrospun fibers and found that device performance depends significantly on the fiber morphology. The solvent evaporation rate was measured at various relative humidities to investigate the effect of relative humidity during the electrospinning process. The beaded morphology of the electrospun fibers originated from the slow solvent evaporation rate under high humidity conditions. To increase the evaporation rate under these conditions, warm air was applied to the electrospinning system. Bead formation on the electrospun fibers was completely avoided, and the power conversion efficiencies of OPV devices fabricated under high humidity conditions were restored. These results highlight the simplicity and effectiveness of the proposed method for improving the reproducibility of electrospun nanofibers, and the performance of devices built from them, regardless of the relative humidity.

  12. Virtual Raters for Reproducible and Objective Assessments in Radiology

    PubMed Central

    Kleesiek, Jens; Petersen, Jens; Döring, Markus; Maier-Hein, Klaus; Köthe, Ullrich; Wick, Wolfgang; Hamprecht, Fred A.; Bendszus, Martin; Biller, Armin

    2016-01-01

    Volumetric measurements in radiologic images are important for monitoring tumor growth and treatment response. To make these more reproducible and objective we introduce the concept of virtual raters (VRs). A virtual rater is obtained by combining knowledge of machine-learning algorithms trained with past annotations of multiple human raters with the instantaneous rating of one human expert. Thus, it is virtually guided by several experts. To evaluate the approach we perform experiments with multi-channel magnetic resonance imaging (MRI) data sets. In addition to gross tumor volume (GTV), we also investigate subcategories such as edema, contrast-enhancing and non-enhancing tumor. The first data set consists of N = 71 longitudinal follow-up scans of 15 patients suffering from glioblastoma (GB). The second data set comprises N = 30 scans of low- and high-grade gliomas. For comparison we computed Pearson Correlation, Intra-class Correlation Coefficient (ICC) and Dice score. Virtual raters always led to an improvement in inter- and intra-rater agreement. Comparing the 2D Response Assessment in Neuro-Oncology (RANO) measurements to the volumetric measurements of the virtual raters yields a deviating rating in one-third of the cases. Hence, we believe that our approach will have an impact on the evaluation of clinical studies as well as on routine imaging diagnostics. PMID:27118379
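
    Of the agreement measures mentioned above, the Dice score is the simplest to illustrate; a minimal sketch on synthetic binary masks (not the study's data or code):

    ```python
    import numpy as np

    def dice(a: np.ndarray, b: np.ndarray) -> float:
        """Dice coefficient between two boolean masks: 2|A∩B| / (|A| + |B|)."""
        a, b = a.astype(bool), b.astype(bool)
        denom = a.sum() + b.sum()
        return 1.0 if denom == 0 else 2.0 * np.logical_and(a, b).sum() / denom

    # Two hypothetical tumor masks that differ slightly at the boundary.
    rater = np.zeros((64, 64), dtype=bool)
    rater[20:40, 20:40] = True
    vr = np.zeros((64, 64), dtype=bool)
    vr[22:40, 20:42] = True
    print(f"Dice = {dice(rater, vr):.3f}")
    ```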

  13. The inter-observer reproducibility of Shafer's sign.

    PubMed

    Qureshi, F; Goble, R

    2009-03-01

    Pigment cells in the anterior vitreous (Shafer's sign) are known to be associated with retinal breaks. We sought to identify the reproducibility of Shafer's sign between different grades of ophthalmic staff. In all, 47 patients were examined for Shafer's sign by a consultant vitreo-retinal surgeon, a senior house officer (SHO) and an optician. Cohen's kappa for consultant vs SHO assessment of Shafer's sign was 0.55, while for consultant vs optician assessment kappa was 0.28. Retinal tears were present in 63.8% of our series. Comparing consultant assessment of Shafer's sign with fundoscopy findings, we found specificity to be 93.5% while sensitivity was 93.8%. Kappa for consultant assessment of Shafer's sign vs break presence was 0.86. Consultant and SHO assessment of Shafer's sign is of moderate agreement while optician assessment is fair. These results suggest a relationship between training and the assessment of Shafer's sign. We feel this study suggests caution in undue reliance on Shafer's sign, particularly for inexperienced members of staff.
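
    Cohen's kappa, the agreement statistic quoted above, corrects the raw proportion of agreement for the agreement expected by chance; a minimal sketch with invented presence/absence calls (not the study data):

    ```python
    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa for two raters assigning categorical labels."""
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
        return (observed - expected) / (1 - expected)

    # Hypothetical Shafer's sign calls ("+" present, "-" absent) by two observers.
    consultant = ["+", "+", "-", "-", "+", "-", "+", "-", "-", "+"]
    sho        = ["+", "-", "-", "-", "+", "-", "+", "+", "-", "+"]
    print(f"kappa = {cohens_kappa(consultant, sho):.2f}")  # 0.60 for this toy data
    ```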

  14. A silicon retina that reproduces signals in the optic nerve

    NASA Astrophysics Data System (ADS)

    Zaghloul, Kareem A.; Boahen, Kwabena

    2006-12-01

    Prosthetic devices may someday be used to treat lesions of the central nervous system. Similar to neural circuits, these prosthetic devices should adapt their properties over time, independent of external control. Here we describe an artificial retina, constructed in silicon using single-transistor synaptic primitives, with two forms of locally controlled adaptation: luminance adaptation and contrast gain control. Both forms of adaptation rely on local modulation of synaptic strength, thus meeting the criteria of internal control. Our device is the first to reproduce the responses of the four major ganglion cell types that drive visual cortex, producing 3600 spiking outputs in total. We demonstrate how the responses of our device's ganglion cells compare to those measured from the mammalian retina. Replicating the retina's synaptic organization in our chip made it possible to perform these computations using a hundred times less energy than a microprocessor—and to match the mammalian retina in size and weight. With this level of efficiency and autonomy, it is now possible to develop fully implantable intraocular prostheses.

  15. Reproducibility of measurements of trace gas concentrations in expired air.

    PubMed

    Strocchi, A; Ellis, C; Levitt, M D

    1991-07-01

    Measurement of the pulmonary excretion of trace gases has been used as a simple means of assessing metabolic reactions. End alveolar trace gas concentration, rather than excretory rate, is usually measured. However, the reproducibility of this measurement has received little attention. In 17 healthy subjects, duplicate collections of alveolar air were obtained within 1 minute of each other using a commercially available alveolar air sampler. The concentrations of hydrogen, methane, carbon monoxide, and carbon dioxide were measured. When the subject received no instruction on how to expire into the device, a difference of 28% +/- 19% (1SD) was found between duplicate determinations of hydrogen. Instructing the subjects to avoid hyperventilation or to inspire maximally and exhale immediately resulted in only minor reduction in variability. However, a maximal inspiration held for 15 seconds before exhalation reduced the difference to a mean of 9.6% +/- 8.0%, less than half that observed with the other expiratory techniques. Percentage difference of methane measurements with the four different expiratory techniques yielded results comparable to those obtained for hydrogen. In contrast, percentage differences for carbon monoxide measurements were similar for all expiratory techniques. When normalized to a PCO2 of 5%, the variability of hydrogen measurements with the breath-holding technique was reduced to 6.8% +/- 4.7%, a value significantly lower than that obtained with the other expiratory methods. This study suggests that attention to the expiratory technique could improve the accuracy of tests using breath hydrogen measurements.
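
    One plausible form of the PCO2 normalization mentioned in the last sentence (an assumption on my part; the abstract does not give the formula) is to scale each reading by the ratio of the 5% reference PCO2 to the CO2 measured in the same sample:

    ```python
    def normalize_to_pco2(gas_ppm: float, measured_pco2_pct: float,
                          reference_pco2_pct: float = 5.0) -> float:
        """Scale a breath trace-gas reading to a reference alveolar PCO2.

        Assumes dead-space dilution affects the trace gas and CO2
        proportionally, so multiplying by (reference / measured PCO2)
        corrects for it. Illustrative only.
        """
        return gas_ppm * reference_pco2_pct / measured_pco2_pct

    # Hypothetical duplicate H2 readings (ppm) with their measured PCO2 (%).
    print(normalize_to_pco2(32.0, 4.2))  # ~38.1 ppm
    print(normalize_to_pco2(36.0, 4.8))  # 37.5 ppm
    ```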

  16. Reproducibility of Differential Proteomic Technologies in CPTAC Fractionated Xenografts

    SciTech Connect

    Tabb, David L.; Wang, Xia; Carr, Steven A.; Clauser, Karl R.; Mertins, Philipp; Chambers, Matthew C.; Holman, Jerry D.; Wang, Jing; Zhang, Bing; Zimmerman, Lisa J.; Chen, Xian; Gunawardena, Harsha P.; Davies, Sherri R.; Ellis, Matthew J. C.; Li, Shunqiang; Townsend, R. Reid; Boja, Emily S.; Ketchum, Karen A.; Kinsinger, Christopher R.; Mesri, Mehdi; Rodriguez, Henry; Liu, Tao; Kim, Sangtae; McDermott, Jason E.; Payne, Samuel H.; Petyuk, Vladislav A.; Rodland, Karin D.; Smith, Richard D.; Yang, Feng; Chan, Daniel W.; Zhang, Bai; Zhang, Hui; Zhang, Zhen; Zhou, Jian-Ying; Liebler, Daniel C.

    2016-03-04

    The NCI Clinical Proteomic Tumor Analysis Consortium (CPTAC) employed a pair of reference xenograft proteomes for initial platform validation and ongoing quality control of its data collection for The Cancer Genome Atlas (TCGA) tumors. These two xenografts, representing basal and luminal-B human breast cancer, were fractionated and analyzed on six mass spectrometers in a total of 46 replicates divided between iTRAQ and label-free technologies, spanning a total of 1095 LC-MS/MS experiments. These data represent a unique opportunity to evaluate the stability of proteomic differentiation by mass spectrometry over many months of time for individual instruments or across instruments running dissimilar workflows. We evaluated iTRAQ reporter ions, label-free spectral counts, and label-free extracted ion chromatograms as strategies for data interpretation. From these assessments we found that differential genes from a single replicate were confirmed by other replicates on the same instrument from 61-93% of the time. When comparing across different instruments and quantitative technologies, differential genes were reproduced by other data sets from 67-99% of the time. Projecting gene differences to biological pathways and networks increased the similarities. These overlaps send an encouraging message about the maturity of technologies for proteomic differentiation.

  17. Histopathologic reproducibility of thyroid disease in an epidemiologic study

    SciTech Connect

    Ron, E.; Griffel, B.; Liban, E.; Modan, B.

    1986-03-01

    An investigation of the long-term effects of childhood scalp irradiation demonstrated a significantly increased risk of thyroid tumors in the irradiated population. Because of the complexity of thyroid cancer diagnosis, a histopathologic slide review of 59 of the 68 patients (irradiated and nonirradiated) with thyroid disease was undertaken. The review revealed 90% agreement (kappa = +0.85, P less than 0.01) between the original and review diagnosis. Four of 27 cases previously diagnosed as malignant were reclassified as benign, yielding a cancer misdiagnosis rate of 14.8%. All four of the misdiagnosed cancers were of follicular or mixed papillary-follicular type. As a result of the histologic review, the ratio of malignant to benign tumors decreased from 2.55 to 1.75. Since disagreement in diagnosis was similar in the irradiated and nonirradiated groups, the relative risk of radiation-associated neoplasms did not change substantially. The histopathologic review shows that although there were some problems in diagnostic reproducibility, they were not statistically significant and did not alter our previous conclusions regarding radiation exposure. However, a 15% reduction in the number of malignancies might affect epidemiologic studies with an external comparison as well as geographic or temporal comparisons.

  18. Developmental pesticide exposure reproduces features of attention deficit hyperactivity disorder

    PubMed Central

    Richardson, Jason R.; Taylor, Michele M.; Shalat, Stuart L.; Guillot, Thomas S.; Caudle, W. Michael; Hossain, Muhammad M.; Mathews, Tiffany A.; Jones, Sara R.; Cory-Slechta, Deborah A.; Miller, Gary W.

    2015-01-01

    Attention-deficit hyperactivity disorder (ADHD) is estimated to affect 8–12% of school-age children worldwide. ADHD is a complex disorder with significant genetic contributions. However, no single gene has been linked to a significant percentage of cases, suggesting that environmental factors may contribute to ADHD. Here, we used behavioral, molecular, and neurochemical techniques to characterize the effects of developmental exposure to the pyrethroid pesticide deltamethrin. We also used epidemiologic methods to determine whether there is an association between pyrethroid exposure and diagnosis of ADHD. Mice exposed to the pyrethroid pesticide deltamethrin during development exhibit several features reminiscent of ADHD, including elevated dopamine transporter (DAT) levels, hyperactivity, working memory and attention deficits, and impulsive-like behavior. Increased DAT and D1 dopamine receptor levels appear to be responsible for the behavioral deficits. Epidemiologic data reveal that children aged 6–15 with detectable levels of pyrethroid metabolites in their urine were more than twice as likely to be diagnosed with ADHD. Our epidemiologic finding, combined with the recapitulation of ADHD behavior in pesticide-treated mice, provides a mechanistic basis to suggest that developmental pyrethroid exposure is a risk factor for ADHD.—Richardson, J. R., Taylor, M. M., Shalat, S. L., Guillot III, T. S., Caudle, W. M., Hossain, M. M., Mathews, T. A., Jones, S. R., Cory-Slechta, D. A., Miller, G. W. Developmental pesticide exposure reproduces features of attention deficit hyperactivity disorder. PMID:25630971

  19. Evaluating the reproducibility of quantifying modified nucleosides from ribonucleic acids by LC–UV–MS

    PubMed Central

    Russell, Susan P.; Limbach, Patrick A.

    2013-01-01

    Post-transcriptional chemical covalent modification of adenosine, guanosine, uridine and cytidine occurs frequently in all types of ribonucleic acids (RNAs). In ribosomal RNA (rRNA) and transfer RNA (tRNA) these modifications make important contributions to RNA structure and stability and to the accuracy and efficiency of protein translation. The functional dynamics, synergistic nature and regulatory roles of these posttranscriptional nucleoside modifications within the cell are not well characterized. These modifications are present at very low levels and isolation of individual nucleosides for analysis requires a complex multi-step approach. The focus of this study is to characterize the reproducibility of a liquid chromatography method used to isolate and quantitatively characterize modified nucleosides in tRNA and rRNA when nucleoside detection is performed using ultraviolet and mass spectrometric detection (UV and MS, respectively). Despite the analytical challenges of sample isolation and dynamic range, quantitative profiling of modified nucleosides obtained from bacterial tRNAs and rRNAs is feasible at relative standard deviations of 5% or less. PMID:23500350
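
    The relative standard deviation (RSD) used as the figure of merit above is simply the standard deviation expressed as a percentage of the mean; a minimal sketch with invented replicate peak areas:

    ```python
    import statistics

    def relative_std_dev(values) -> float:
        """Relative standard deviation (%) of replicate measurements."""
        return 100.0 * statistics.stdev(values) / statistics.mean(values)

    # Hypothetical replicate UV peak areas for one modified nucleoside.
    areas = [1.02e5, 1.05e5, 0.99e5, 1.01e5]
    print(f"RSD = {relative_std_dev(areas):.1f}%")
    ```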

  20. Discrete restricted four-body problem: Existence of proof of equilibria and reproducibility of periodic orbits

    SciTech Connect

    Minesaki, Yukitaka

    2015-01-01

    We propose the discrete-time restricted four-body problem (d-R4BP), which approximates the orbits of the restricted four-body problem (R4BP). The d-R4BP is given as a special case of the discrete-time chain regularization of the general N-body problem published in Minesaki. Moreover, we analytically prove that the d-R4BP yields the correct orbits corresponding to the elliptic relative equilibrium solutions of the R4BP when the three primaries form an equilateral triangle at any time. Such orbits include the orbit of a relative equilibrium solution already discovered by Baltagiannis and Papadakis. Until the proof in this work, there has been no discrete analog that preserves the orbits of elliptic relative equilibrium solutions in the R4BP. For a long time interval, the d-R4BP can precisely compute some stable periodic orbits in the Sun–Jupiter–Trojan asteroid–spacecraft system that cannot necessarily be reproduced by other generic integrators.

  1. TRIC: an automated alignment strategy for reproducible protein quantification in targeted proteomics.

    PubMed

    Röst, Hannes L; Liu, Yansheng; D'Agostino, Giuseppe; Zanella, Matteo; Navarro, Pedro; Rosenberger, George; Collins, Ben C; Gillet, Ludovic; Testa, Giuseppe; Malmström, Lars; Aebersold, Ruedi

    2016-09-01

    Next-generation mass spectrometric (MS) techniques such as SWATH-MS have substantially increased the throughput and reproducibility of proteomic analysis, but ensuring consistent quantification of thousands of peptide analytes across multiple liquid chromatography-tandem MS (LC-MS/MS) runs remains a challenging and laborious manual process. To produce highly consistent and quantitatively accurate proteomics data matrices in an automated fashion, we developed TRIC (http://proteomics.ethz.ch/tric/), a software tool that utilizes fragment-ion data to perform cross-run alignment, consistent peak-picking and quantification for high-throughput targeted proteomics. TRIC reduced the identification error compared to a state-of-the-art SWATH-MS analysis without alignment by more than threefold at constant recall while correcting for highly nonlinear chromatographic effects. On a pulsed-SILAC experiment performed on human induced pluripotent stem cells, TRIC was able to automatically align and quantify thousands of light and heavy isotopic peak groups. Thus, TRIC fills a gap in the pipeline for automated analysis of massively parallel targeted proteomics data sets.

  2. An effective meshfree reproducing kernel method for buckling analysis of cylindrical shells with and without cutouts

    NASA Astrophysics Data System (ADS)

    Sadamoto, S.; Ozdemir, M.; Tanaka, S.; Taniguchi, K.; Yu, T. T.; Bui, T. Q.

    2017-02-01

    The paper is concerned with eigen buckling analysis of curvilinear shells with and without cutouts by an effective meshfree method. In particular, shallow shell, cylinder and perforated cylinder buckling problems are considered. A Galerkin meshfree reproducing kernel (RK) approach is then developed. The present meshfree curvilinear shell model is based on Reissner-Mindlin plate formulation, which allows the transverse shear deformation of the curved shells. There are five degrees of freedom per node (i.e., three displacements and two rotations). In this setting, the meshfree interpolation functions are derived from the RK. A singular kernel is introduced to impose the essential boundary conditions because of the RK shape functions, which do not automatically possess the Kronecker delta property. The stiffness matrix is derived using the stabilized conforming nodal integration technique. A convected coordinate system is introduced into the formulation to deal with the curvilinear surface. More importantly, the RKs taken here are used not only for the interpolation of the curved geometry, but also for the approximation of field variables. Several numerical examples with shallow shells and full cylinder models are considered, and the critical buckling loads and their buckling mode shapes are calculated by the meshfree eigenvalue analysis and examined. To show the accuracy and performance of the developed meshfree method, the computed critical buckling loads and mode shapes are compared with reference solutions based on boundary domain element, finite element and analytical methods.
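
    As a rough illustration of the reproducing kernel (RK) approximation that underlies such meshfree formulations, the sketch below builds 1D RK shape functions with a linear basis and a cubic B-spline kernel and checks that they reproduce constant and linear fields exactly; it is a simplified textbook construction, not the authors' shell implementation:

    ```python
    import numpy as np

    def cubic_spline_kernel(r: np.ndarray) -> np.ndarray:
        """Cubic B-spline kernel as a function of normalized distance r = |x| / a."""
        w = np.zeros_like(r)
        m1 = r <= 0.5
        m2 = (r > 0.5) & (r <= 1.0)
        w[m1] = 2/3 - 4*r[m1]**2 + 4*r[m1]**3
        w[m2] = 4/3 - 4*r[m2] + 4*r[m2]**2 - (4/3)*r[m2]**3
        return w

    def rk_shape_functions(x: float, nodes: np.ndarray, a: float) -> np.ndarray:
        """1D RK shape functions psi_I(x) with linear basis H(d) = [1, d]^T."""
        d = x - nodes                                  # distances to all nodes
        w = cubic_spline_kernel(np.abs(d) / a)         # kernel weights, support a
        H = np.vstack([np.ones_like(d), d])            # basis evaluated at x - x_I
        M = (H * w) @ H.T                              # moment matrix
        b = np.linalg.solve(M, np.array([1.0, 0.0]))   # M b = H(0)
        return (b @ H) * w                             # psi_I(x) = H^T b * w_I

    nodes = np.linspace(0.0, 1.0, 11)
    psi = rk_shape_functions(0.37, nodes, a=0.25)
    print(psi.sum())    # ~1.0  (constants reproduced)
    print(psi @ nodes)  # ~0.37 (linear fields reproduced)
    ```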

  3. Discrete Restricted Four-Body Problem: Existence of Proof of Equilibria and Reproducibility of Periodic Orbits

    NASA Astrophysics Data System (ADS)

    Minesaki, Yukitaka

    2015-01-01

    We propose the discrete-time restricted four-body problem (d-R4BP), which approximates the orbits of the restricted four-body problem (R4BP). The d-R4BP is given as a special case of the discrete-time chain regularization of the general N-body problem published in Minesaki. Moreover, we analytically prove that the d-R4BP yields the correct orbits corresponding to the elliptic relative equilibrium solutions of the R4BP when the three primaries form an equilateral triangle at any time. Such orbits include the orbit of a relative equilibrium solution already discovered by Baltagiannis and Papadakis. Until the proof in this work, there has been no discrete analog that preserves the orbits of elliptic relative equilibrium solutions in the R4BP. For a long time interval, the d-R4BP can precisely compute some stable periodic orbits in the Sun-Jupiter-Trojan asteroid-spacecraft system that cannot necessarily be reproduced by other generic integrators.

  4. Reproducible Computing: a new Technology for Statistics Education and Educational Research

    NASA Astrophysics Data System (ADS)

    Wessa, Patrick

    2009-05-01

    This paper explains how the R Framework (http://www.wessa.net) and a newly developed Compendium Platform (http://www.freestatistics.org) allow us to create, use, and maintain documents that contain empirical research results which can be recomputed and reused in derived work. It is illustrated that this technological innovation can be used to create educational applications that can be shown to support effective learning of statistics and associated analytical skills. It is explained how a Compendium can be created by anyone, without the need to understand the technicalities of scientific word processing (LaTeX) or statistical computing (R code). The proposed Reproducible Computing system allows educational researchers to objectively measure key aspects of the actual learning process based on individual and constructivist activities such as peer review, collaboration in research, computational experimentation, etc. The system was implemented and tested in three statistics courses in which Compendia were used to create an interactive e-learning environment that simulated the real-world process of empirical scientific research.

  5. ANALYTIC MODELING OF THE MORETON WAVE KINEMATICS

    SciTech Connect

    Temmer, M.; Veronig, A. M.

    2009-09-10

    The issue of whether Moreton waves are flare-ignited or coronal mass ejection (CME)-driven, or a combination of both, is still a matter of debate. We develop an analytical model describing the evolution of a large-amplitude coronal wave emitted by the expansion of a circular source surface in order to mimic the evolution of a Moreton wave. The model results are confronted with observations of a strong Moreton wave observed in association with the X3.8/3B flare/CME event from 2005 January 17. Using different input parameters for the expansion of the source region, either derived from the real CME observations (assuming that the upward moving CME drives the wave) or from synthetically generated scenarios (expanding flare region, lateral expansion of the CME flanks), we calculate the kinematics of the associated Moreton wave signature. We then determine the model input parameters that best fit the observed Moreton wave kinematics. Using the measured kinematics of the upward moving CME as the model input, we are not able to reproduce the observed Moreton wave kinematics. The observations of the Moreton wave can be reproduced only by applying a strong and impulsive acceleration for the source region expansion acting in a piston mechanism scenario. Based on these results, we propose that the expansion of the flaring region or the lateral expansion of the CME flanks is more likely the driver of the Moreton wave than the upward moving CME front.

  6. A Positive View of Peer Acceptance in Aggressive Youth: Risk for Future Peer Acceptance.

    ERIC Educational Resources Information Center

    Hughes, Jan N.; Cavell, Timothy A.; Prasad-Gaur, Archna

    2001-01-01

    Uses longitudinal data to determine whether a positive view of perceived peer acceptance is a risk factor for continued aggression and social rejection for aggressive children. Results indicate that perceived peer acceptance did not predict aggression. However, children who reported higher levels of perceived peer acceptance received lower actual…

  7. Various versions of analytic QCD and skeleton-motivated evaluation of observables

    SciTech Connect

    Cvetic, Gorazd; Valenzuela, Cristian

    2006-12-01

    We present skeleton-motivated evaluation of QCD observables. The approach can be applied in analytic versions of QCD in certain classes of renormalization schemes. We present two versions of analytic QCD which can be regarded as low-energy modifications of the "minimal" analytic QCD and which reproduce the measured value of the semihadronic τ decay ratio r_τ. Further, we describe an approach of calculating the higher-order analytic couplings A_k (k = 2, 3, ...) on the basis of logarithmic derivatives of the analytic coupling A_1(Q²). This approach can be applied in any version of analytic QCD. We adjust the free parameters of the aforementioned two analytic models in such a way that the skeleton-motivated evaluation reproduces the correct known values of r_τ and of the Bjorken polarized sum rule (BjPSR) d_b(Q²) at a given point (e.g., at Q² = 2 GeV²). We then evaluate the low-energy behavior of the Adler function d_v(Q²) and the BjPSR d_b(Q²) in the aforementioned evaluation approach, in the three analytic versions of QCD. We compare with the results obtained in the minimal analytic QCD and with the evaluation approach of Milton et al. and Shirkov.

  8. Consumer Acceptance of Dry Dog Food Variations

    PubMed Central

    Donfrancesco, Brizio Di; Koppel, Kadri; Swaney-Stueve, Marianne; Chambers, Edgar

    2014-01-01

    Simple Summary: The objectives of this study were to compare the acceptance of different dry dog food products by consumers, determine consumer clusters for acceptance, and identify the characteristics of dog food that drive consumer acceptance. Pet owners evaluated dry dog food samples available in the US market. The results indicated that appearance of the sample, especially the color, influenced pet owners' overall liking more than the aroma of the product. Abstract: The objectives of this study were to compare the acceptance of different dry dog food products by consumers, determine consumer clusters for acceptance, and identify the characteristics of dog food that drive consumer acceptance. Eight dry dog food samples available in the US market were evaluated by pet owners. In this study, consumers evaluated overall liking, aroma, and appearance liking of the products. Consumers were also asked to predict their purchase intent, their dog's liking, and cost of the samples. The results indicated that appearance of the sample, especially the color, influenced pet owners' overall liking more than the aroma of the product. Overall liking clusters were not related to income, age, gender, or education, indicating that general consumer demographics do not appear to play a main role in individual consumer acceptance of dog food products. PMID:26480043

  9. MERRA Analytic Services

    NASA Astrophysics Data System (ADS)

    Schnase, J. L.; Duffy, D. Q.; McInerney, M. A.; Tamkin, G. S.; Thompson, J. H.; Gill, R.; Grieg, C. M.

    2012-12-01

    MERRA Analytic Services (MERRA/AS) is a cyberinfrastructure resource for developing and evaluating a new generation of climate data analysis capabilities. MERRA/AS supports OBS4MIP activities by reducing the time spent in the preparation of Modern Era Retrospective-Analysis for Research and Applications (MERRA) data used in data-model intercomparison. It also provides a testbed for experimental development of high-performance analytics. MERRA/AS is a cloud-based service built around the Virtual Climate Data Server (vCDS) technology that is currently used by the NASA Center for Climate Simulation (NCCS) to deliver Intergovernmental Panel on Climate Change (IPCC) data to the Earth System Grid Federation (ESGF). Crucial to its effectiveness, MERRA/AS's servers will use a workflow-generated realizable object capability to perform analyses over the MERRA data using the MapReduce approach to parallel storage-based computation. The results produced by these operations will be stored by the vCDS, which will also be able to host code sets for those who wish to explore the use of MapReduce for more advanced analytics. While the work described here will focus on the MERRA collection, these technologies can be used to publish other reanalysis, observational, and ancillary OBS4MIP data to ESGF and, importantly, offer an architectural approach to climate data services that can be generalized to applications and customers beyond the traditional climate research community. In this presentation, we describe our approach, experiences, lessons learned, and plans for the future. (Figure: (A) MERRA/AS software stack; (B) example MERRA/AS interfaces.)

  10. Quantifying reproducibility in computational biology: the case of the tuberculosis drugome.

    PubMed

    Garijo, Daniel; Kinnings, Sarah; Xie, Li; Xie, Lei; Zhang, Yinliang; Bourne, Philip E; Gil, Yolanda

    2013-01-01

    How easy is it to reproduce the results found in a typical computational biology paper? Either through experience or intuition the reader will already know that the answer is "with difficulty" or "not at all". In this paper we attempt to quantify this difficulty by reproducing a previously published paper for different classes of users (ranging from users with little expertise to domain experts) and suggest ways in which the situation might be improved. Quantification is achieved by estimating the time required to reproduce each of the steps in the method described in the original paper and to make them part of an explicit workflow that reproduces the original results. Reproducing the method took several months of effort, and required using new versions and new software that posed challenges to reconstructing and validating the results. The quantification leads to "reproducibility maps" that reveal that novice researchers would only be able to reproduce a few of the steps in the method, and that only expert researchers with advanced knowledge of the domain would be able to reproduce the method in its entirety. The workflow itself is published as an online resource together with supporting software and data. The paper concludes with a brief discussion of the complexities of requiring reproducibility in terms of cost versus benefit, and a set of desiderata with our observations and guidelines for improving reproducibility. This has implications not only in reproducing the work of others from published papers, but also in reproducing work from one's own laboratory.

  11. Color accuracy and reproducibility in whole slide imaging scanners

    PubMed Central

    Shrestha, Prarthana; Hulsken, Bas

    2014-01-01

    We propose a workflow for color reproduction in whole slide imaging (WSI) scanners, such that the colors in the scanned images match the actual slide color and the inter-scanner variation is minimal. We describe a new method of preparation and verification of the color phantom slide, consisting of a standard IT8-target transmissive film, which is used in color calibrating and profiling the WSI scanner. We explore several International Color Consortium (ICC) compliant techniques in color calibration/profiling and rendering intents for translating the scanner specific colors to the standard display (sRGB) color space. Based on the quality of the color reproduction in histopathology slides, we propose the matrix-based calibration/profiling and absolute colorimetric rendering approach. The main advantage of the proposed workflow is that it is compliant with the ICC standard, applicable to color management systems in different platforms, and involves no external color measurement devices. We quantify color difference using the CIE-DeltaE2000 metric, where DeltaE values below 1 are considered imperceptible. Our evaluation on 14 phantom slides, manufactured according to the proposed method, shows an average inter-slide color difference below 1 DeltaE. The proposed workflow is implemented and evaluated in 35 WSI scanners developed at Philips, called the Ultra Fast Scanners (UFS). The color accuracy, measured as DeltaE between the scanner reproduced colors and the reference colorimetric values of the phantom patches, is improved on average to 3.5 DeltaE in calibrated scanners from 10 DeltaE in uncalibrated scanners. The average inter-scanner color difference is found to be 1.2 DeltaE. The improvement in color performance upon using the proposed method is apparent with the visual color quality of the tissue scans. PMID:26158041
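
    As a rough illustration of the kind of color-difference computation involved, the sketch below uses the simpler CIE76 Delta E (a plain Euclidean distance in CIELAB) rather than the CIEDE2000 formula actually used in the paper, and the patch values are invented:

    ```python
    import math

    def delta_e_cie76(lab1, lab2) -> float:
        """Euclidean distance in CIELAB space (CIE76 Delta E).

        The paper uses the more elaborate CIEDE2000 formula; CIE76 is shown
        here only because it is compact enough for a sketch.
        """
        return math.dist(lab1, lab2)

    # Hypothetical reference patch color vs. scanner-reproduced color (L*, a*, b*).
    reference = (52.0, 41.5, 28.0)
    scanned   = (52.6, 40.8, 27.1)
    print(f"Delta E (CIE76) = {delta_e_cie76(reference, scanned):.2f}")
    ```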

  12. Development of a Consistent and Reproducible Porcine Scald Burn Model

    PubMed Central

    Kempf, Margit; Kimble, Roy; Cuttle, Leila

    2016-01-01

    There are very few porcine burn models that replicate scald injuries similar to those encountered by children. We have developed a robust porcine burn model capable of creating reproducible scald burns for a wide range of burn conditions. The study was conducted with juvenile Large White pigs, creating replicates of burn combinations; 50°C for 1, 2, 5 and 10 minutes and 60°C, 70°C, 80°C and 90°C for 5 seconds. Visual wound examination, biopsies and Laser Doppler Imaging were performed at 1, 24 hours and at 3 and 7 days post-burn. A consistent water temperature was maintained within the scald device for long durations (49.8 ± 0.1°C when set at 50°C). The macroscopic and histologic appearance was consistent between replicates of burn conditions. For 50°C water, 10 minute duration burns showed significantly deeper tissue injury than all shorter durations at 24 hours post-burn (p ≤ 0.0001), with damage seen to increase until day 3 post-burn. For 5 second duration burns, by day 7 post-burn the 80°C and 90°C scalds had damage detected significantly deeper in the tissue than the 70°C scalds (p ≤ 0.001). A reliable and safe model of porcine scald burn injury has been successfully developed. The novel apparatus with continually refreshed water improves consistency of scald creation for long exposure times. This model allows the pathophysiology of scald burn wound creation and progression to be examined. PMID:27612153

  13. Can atmospheric reanalysis datasets be used to reproduce flood characteristics?

    NASA Astrophysics Data System (ADS)

    Andreadis, K.; Schumann, G.; Stampoulis, D.

    2014-12-01

    Floods are one of the costliest natural disasters and the ability to understand their characteristics and their interactions with population, land cover and climate changes is of paramount importance. In order to accurately reproduce flood characteristics such as water inundation and heights both in the river channels and floodplains, hydrodynamic models are required. Most of these models operate at very high resolutions and are computationally very expensive, making their application over large areas very difficult. However, a need exists for such models to be applied at regional to global scales so that the effects of climate change with regards to flood risk can be examined. We use the LISFLOOD-FP hydrodynamic model to simulate a 40-year history of flood characteristics at the continental scale, particularly over Australia. LISFLOOD-FP is a 2-D hydrodynamic model that solves the approximate Saint-Venant equations at large scales (on the order of 1 km) using a sub-grid representation of the river channel. This implementation is part of an effort towards a global 1-km flood modeling framework that will allow the reconstruction of a long-term flood climatology. The components of this framework include a hydrologic model (the widely-used Variable Infiltration Capacity model) and a meteorological dataset that forces it. In order to extend the simulated flood climatology to 50-100 years in a consistent manner, reanalysis datasets have to be used. The objective of this study is the evaluation of multiple atmospheric reanalysis datasets (ERA, NCEP, MERRA, JRA) as inputs to the VIC/LISFLOOD-FP model. Comparisons of the simulated flood characteristics are made with both satellite observations of inundation and a benchmark simulation of LISFLOOD-FP being forced by observed flows. Finally, the implications of the availability of a global flood modeling framework for producing flood hazard maps and disseminating disaster information are discussed.

  14. A reproducible method to determine the meteoroid mass index

    NASA Astrophysics Data System (ADS)

    Pokorný, P.; Brown, P. G.

    2016-08-01

    Context. The determination of meteoroid mass indices is central to flux measurements and evolutionary studies of meteoroid populations. However, different authors use different approaches to fit observed data, making results difficult to reproduce and the resulting uncertainties difficult to justify. The real, physical uncertainties are usually an order of magnitude higher than the reported values. Aims: We aim to develop a fully automated method that will measure meteoroid mass indices and associated uncertainty. We validate our method on large radar and optical datasets and compare results to obtain a best estimate of the true meteoroid mass index. Methods: Using MultiNest, a Bayesian inference tool that calculates the evidence and explores the parameter space, we search for the best fit of cumulative number vs. mass distributions in a four-dimensional space of variables (a,b,X1,X2). We explore biases in meteor echo distributions using optical meteor data as a calibration dataset to establish the systematic offset in measured mass index values. Results: Our best estimate for the average de-biased mass index for the sporadic meteoroid complex, as measured by radar appropriate to the mass range 10^-3 > m > 10^-5 g, was s = -2.10 ± 0.08. Optical data in the 10^-1 > m > 10^-3 g range, with the shower meteors removed, produced s = -2.08 ± 0.08. We find the mass index used by Grün et al. (1985) is substantially larger than we measure in the 10^-4 < m < 10^-1 g range. Our own code with a simple manual and a sample dataset can be found here: http://ftp://aquarid.physics.uwo.ca/pub/peter/MassIndexCode/
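
    A much simpler (and, as the paper stresses, biased) way to estimate a mass index than the MultiNest fit described above is a straight line fit to the cumulative distribution in log-log space; the sketch below uses synthetic masses and the common convention N(>=m) ∝ m^(1-s) with s quoted as a positive exponent, which differs in sign from the values reported in the abstract:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic meteoroid masses (g) drawn from a power law with true s = 2.1,
    # i.e. differential distribution dN/dm ∝ m**(-s) above m_min.
    s_true, m_min, n = 2.1, 1e-5, 20000
    masses = m_min * (1.0 - rng.random(n)) ** (-1.0 / (s_true - 1.0))

    # Cumulative counts N(>= m) on a logarithmic mass grid.
    grid = np.logspace(-5, -3, 30)
    cum = np.array([(masses >= m).sum() for m in grid])

    # Straight-line fit in log-log space; with N(>= m) ∝ m**(1 - s), s = 1 - slope.
    slope, _ = np.polyfit(np.log10(grid), np.log10(cum), 1)
    print(f"estimated s = {1.0 - slope:.2f}")  # close to 2.1
    ```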

  15. Scan-rescan reproducibility of CT densitometric measures of emphysema

    NASA Astrophysics Data System (ADS)

    Chong, D.; van Rikxoort, E. M.; Kim, H. J.; Goldin, J. G.; Brown, M. S.

    2011-03-01

    This study investigated the reproducibility of HRCT densitometric measures of emphysema in patients scanned twice one week apart. Twenty-four emphysema patients from a multicenter study were scanned at full inspiration (TLC) and expiration (RV), then again a week later for four scans total. Scans for each patient used the same scanner and protocol, except for tube current in three patients. Lung segmentation with gross airway removal was performed on the scans. Volume, weight, mean lung density (MLD), relative area under -950 HU (RA-950), and 15th percentile (PD-15) were calculated for TLC, and volume and an air-trapping mask (RA-air) between -950 and -850 HU for RV. For each measure, absolute differences were computed for each scan pair, and linear regression was performed against volume difference in a subgroup with volume difference <500 mL. Two TLC scan pairs were excluded due to segmentation failure. The mean lung volumes were 5802 +/- 1420 mL for TLC and 3878 +/- 1077 mL for RV. The mean absolute differences were 169 mL for TLC volume, 316 mL for RV volume, 14.5 g for weight, 5.0 HU for MLD, 0.66 p.p. for RA-950, 2.4 HU for PD-15, and 3.1 p.p. for RA-air. The <500 mL subgroup had 20 scan pairs for TLC and RV. The R2 values were 0.8 for weight, 0.60 for MLD, 0.29 for RA-950, 0.31 for PD-15, and 0.64 for RA-air. Our results indicate that considerable variability exists in densitometric measures over one week that cannot be attributed to breath-hold or physiology. This has implications for clinical trials relying on these measures to assess emphysema treatment efficacy.
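
    The two densitometric indices named above can be read directly off the Hounsfield-unit histogram of the segmented lung; a minimal sketch on synthetic HU values (not study data):

    ```python
    import numpy as np

    def emphysema_indices(lung_hu: np.ndarray, threshold: float = -950.0,
                          percentile: float = 15.0):
        """Relative area below a HU threshold (RA-950, in %) and the HU value
        below which the given percentage of voxels lies (PD-15)."""
        ra = 100.0 * np.mean(lung_hu < threshold)
        pd = np.percentile(lung_hu, percentile)
        return ra, pd

    # Synthetic lung HU values roughly centered on an emphysematous MLD.
    rng = np.random.default_rng(1)
    hu = rng.normal(loc=-870.0, scale=60.0, size=500_000)
    ra950, pd15 = emphysema_indices(hu)
    print(f"RA-950 = {ra950:.1f}%   PD-15 = {pd15:.0f} HU")
    ```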

  16. Crowdsourcing reproducible seizure forecasting in human and canine epilepsy

    PubMed Central

    Wagenaar, Joost; Abbot, Drew; Adkins, Phillip; Bosshard, Simone C.; Chen, Min; Tieng, Quang M.; He, Jialune; Muñoz-Almaraz, F. J.; Botella-Rocamora, Paloma; Pardo, Juan; Zamora-Martinez, Francisco; Hills, Michael; Wu, Wei; Korshunova, Iryna; Cukierski, Will; Vite, Charles; Patterson, Edward E.; Litt, Brian; Worrell, Gregory A.

    2016-01-01

    See Mormann and Andrzejak (doi:10.1093/brain/aww091) for a scientific commentary on this article.   Accurate forecasting of epileptic seizures has the potential to transform clinical epilepsy care. However, progress toward reliable seizure forecasting has been hampered by lack of open access to long duration recordings with an adequate number of seizures for investigators to rigorously compare algorithms and results. A seizure forecasting competition was conducted on kaggle.com using open access chronic ambulatory intracranial electroencephalography from five canines with naturally occurring epilepsy and two humans undergoing prolonged wide bandwidth intracranial electroencephalographic monitoring. Data were provided to participants as 10-min interictal and preictal clips, with approximately half of the 60 GB data bundle labelled (interictal/preictal) for algorithm training and half unlabelled for evaluation. The contestants developed custom algorithms and uploaded their classifications (interictal/preictal) for the unknown testing data, and a randomly selected 40% of data segments were scored and results broadcasted on a public leader board. The contest ran from August to November 2014, and 654 participants submitted 17 856 classifications of the unlabelled test data. The top performing entry scored 0.84 area under the classification curve. Following the contest, additional held-out unlabelled data clips were provided to the top 10 participants and they submitted classifications for the new unseen data. The resulting area under the classification curves were well above chance forecasting, but did show a mean 6.54 ± 2.45% (min, max: 0.30, 20.2) decline in performance. The kaggle.com model using open access data and algorithms generated reproducible research that advanced seizure forecasting. The overall performance from multiple contestants on unseen data was better than a random predictor, and demonstrates the feasibility of seizure forecasting in canine and
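
    The "area under the classification curve" used to score entries is, in the usual setup for such contests, the ROC AUC; a minimal sketch with invented clip labels and classifier scores (requires scikit-learn):

    ```python
    # Labels: 1 = preictal clip, 0 = interictal clip (both hypothetical).
    from sklearn.metrics import roc_auc_score

    labels = [0, 0, 1, 0, 1, 1, 0, 1, 0, 0]
    scores = [0.10, 0.40, 0.80, 0.65, 0.60, 0.90, 0.20, 0.70, 0.50, 0.10]
    print(f"AUC = {roc_auc_score(labels, scores):.2f}")
    ```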

  17. Reproducible LTE uplink performance analysis using precomputed interference signals

    NASA Astrophysics Data System (ADS)

    Pauli, Volker; Nisar, Muhammad Danish; Seidel, Eiko

    2011-12-01

    The consideration of realistic uplink inter-cell interference is essential for the overall performance testing of future cellular systems, and in particular for the evaluation of the radio resource management (RRM) algorithms. Most beyond-3G communication systems employ orthogonal multiple access in uplink (SC-FDMA in LTE and OFDMA in WiMAX), and additionally rely on frequency-selective RRM (scheduling) algorithms. This makes the task of accurate modeling of uplink interference both crucial and non-trivial. Traditional methods for its modeling (e.g., via additive white Gaussian noise interference sources) are therefore proving to be ineffective to realistically model the uplink interference in the next generation cellular systems. In this article, we propose the use of realistic precomputed interference patterns for LTE uplink performance analysis and testing. The interference patterns are generated via an LTE system-level simulator for a given set of scenario parameters, such as cell configuration, user configurations, and traffic models. The generated interference patterns (some of which are made publicly available) can be employed to benchmark the performance of any LTE uplink system in both lab simulations and field trials for practical deployments. It is worth mentioning that the proposed approach can also be extended to other cellular communication systems employing OFDMA-like multiple access with frequency-selective RRM techniques. The proposed approach offers twofold advantages. First, it allows for repeatability and reproducibility of the performance analysis. This is of crucial significance not only for researchers and developers to analyze the behavior and performance of their systems, but also for the network operators to compare the performance of competing system vendors. Second, the proposed testing mechanism evades the need for deployment of multiple cells (with multiple active users in each) to achieve realistic field trials, thereby resulting in

  18. Reproducing American Sign Language sentences: cognitive scaffolding in working memory

    PubMed Central

    Supalla, Ted; Hauser, Peter C.; Bavelier, Daphne

    2014-01-01

    The American Sign Language Sentence Reproduction Test (ASL-SRT) requires the precise reproduction of a series of ASL sentences increasing in complexity and length. Error analyses of such tasks provide insight into working memory and scaffolding processes. Data were collected from three groups expected to differ in fluency: deaf children, deaf adults and hearing adults, all users of ASL. Quantitative (correct/incorrect recall) and qualitative error analyses were performed. Percent correct on the reproduction task supports its sensitivity to fluency as test performance clearly differed across the three groups studied. A linguistic analysis of errors further documented differing strategies and biases across groups. Subjects' recall projected the affordances and constraints of deep linguistic representations to differing degrees, with subjects resorting to alternate processing strategies when they failed to recall the sentence correctly. A qualitative error analysis allows us to capture generalizations about the relationship between error pattern and the cognitive scaffolding, which governs the sentence reproduction process. Highly fluent signers and less-fluent signers share common chokepoints on particular words in sentences. However, they diverge in heuristic strategy. Fluent signers, when they make an error, tend to preserve semantic details while altering morpho-syntactic domains. They produce syntactically correct sentences with equivalent meaning to the to-be-reproduced one, but these are not verbatim reproductions of the original sentence. In contrast, less-fluent signers tend to use a more linear strategy, preserving lexical status and word ordering while omitting local inflections, and occasionally resorting to visuo-motoric imitation. Thus, whereas fluent signers readily use top-down scaffolding in their working memory, less fluent signers fail to do so. Implications for current models of working memory across spoken and signed modalities are considered. PMID

  19. [Our concept of defecography. Methods and reproducibility of results].

    PubMed

    Sutorý, M; Brhelová, H; Michek, J; Kubacák, J; Vasícková, J; Stursa, V; Sehnalová, H

    1999-06-01

    Defecography is used in the Czech Republic only exceptionally. Since 1988 the authors have performed 402 defecographic examinations. They give a detailed account of the experience gathered to date and of their own modification of the examination. As contrast material they currently use Micropaque suspension thickened with wheat bran, administered by means of a modified dough press. The X-rays are taken on a modified ordinary stool made from soft timber. To shield uncovered areas of the visual field they use individually placed copper plates 2 mm thick. For better evaluation of the X-rays, an X-ray contrast net is placed behind the patient during the examination. Pictures are taken at rest, during contraction, during a modified Valsalva manoeuvre, and during all stages of defecation. The authors describe the most interesting pathological findings encountered so far: internal prolapse, levator hernia, rectocele, sphincter defect, and various forms of prolapse and dyskinesis of the pelvic floor. In the authors' opinion, the basic quantifiable parameters are the magnitudes of the anorectal angles. They used the assessment method described by Mahieu, as well as the mediorectal angle, which in their opinion reflects the patient's somatotype and levator function. Rather than the absolute values of the angles, they emphasize the difference between the two angles and its change during contraction and defecation; enlargement of the difference during contraction and its diminution to values close to zero is considered normal, whereas the converse is taken as evidence of pelvic floor dyssynergy. Independent assessments of the angles and of the magnitude of pelvic floor lift by three observers were subjected to statistical analysis, demonstrating complete reproducibility of the anorectal angle results as defined by the authors. The results of assessment can be used to investigate relations with

  20. Approaches to acceptable risk: a critical guide

    SciTech Connect

    Fischhoff, B.; Lichtenstein, S.; Slovic, P.; Keeney, R.; Derby, S.

    1980-12-01

    Acceptable-risk decisions are an essential step in the management of technological hazards. In many situations, they constitute the weak (or missing) link in the management process. The absence of an adequate decision-making methodology often produces indecision, inconsistency, and dissatisfaction. The result is neither good for hazard management nor good for society. This report offers a critical analysis of the viability of various approaches as guides to acceptable-risk decisions. This report seeks to define acceptable-risk decisions and to examine some frequently proposed, but inappropriate, solutions. 255 refs., 22 figs., 25 tabs.

  1. Hanford Site Solid Waste Acceptance Criteria

    SciTech Connect

    Not Available

    1993-11-17

    This manual defines the Hanford Site radioactive, hazardous, and sanitary solid waste acceptance criteria. Criteria in the manual represent a guide for meeting state and federal regulations; DOE Orders; Hanford Site requirements; and other rules, regulations, guidelines, and standards as they apply to acceptance of radioactive and hazardous solid waste at the Hanford Site. It is not the intent of this manual to be all inclusive of the regulations; rather, it is intended that the manual provide the waste generator with only the requirements that waste must meet in order to be accepted at Hanford Site TSD facilities.

  2. Chinese Nurses' Acceptance of PDA: A Cross-Sectional Survey Using a Technology Acceptance Model.

    PubMed

    Wang, Yanling; Xiao, Qian; Sun, Liu; Wu, Ying

    2016-01-01

    This study explores Chinese nurses' acceptance of PDAs, using a questionnaire based on the framework of the Technology Acceptance Model (TAM). 357 nurses were involved in the study. The results reveal that nurses' acceptance scores averaged 3.18~3.36 across the four dimensions. Younger age, higher professional title, longer previous usage time, and more experience with PDAs were associated with greater acceptance. Hospital administrators may therefore adjust their strategies to enhance nurses' acceptance of PDAs and promote their wide application.

  3. Quality Indicators for Learning Analytics

    ERIC Educational Resources Information Center

    Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus

    2014-01-01

    This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a means to capture evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…

  4. Learning Analytics: Readiness and Rewards

    ERIC Educational Resources Information Center

    Friesen, Norm

    2013-01-01

    This position paper introduces the relatively new field of learning analytics, first by considering the relevant meanings of both "learning" and "analytics," and then by looking at two main levels at which learning analytics can be or has been implemented in educational organizations. Although integrated turnkey systems or…

  5. Acceptance and control of aircraft interior noise and vibration

    NASA Technical Reports Server (NTRS)

    Stephens, D. G.; Leatherwood, J. D.

    1980-01-01

    Ride quality criteria for noise, vibration, and their combination in the helicopter cabin environment are discussed. Results are presented of laboratory and field studies of passenger responses to interior noise and vibration during the performance of a listening task and during reverie, as well as to the interaction of noise with multi-frequency and multi-axis vibration. A study of means for reducing helicopter interior noise is then presented, based on analytical, experimental, and flight studies of the near-field noise source characteristics of the aircraft, the transmission of noise through aircraft structures, and the attenuation of noise by various noise control treatments; this work has resulted in a reduction of 3 dB in helicopter cabin noise. Finally, a model under development to evaluate passenger acceptance of a helicopter noise and vibration environment is described; it incorporates the observed effects of noise and vibration on comfort and is expected to provide insights for more effective noise and vibration control.

  6. The analytic renormalization group

    NASA Astrophysics Data System (ADS)

    Ferrari, Frank

    2016-08-01

    Finite temperature Euclidean two-point functions in quantum mechanics or quantum field theory are characterized by a discrete set of Fourier coefficients Gk, k ∈ Z, associated with the Matsubara frequencies νk = 2πk/β. We show that analyticity implies that the coefficients Gk must satisfy an infinite number of model-independent linear equations that we write down explicitly. In particular, we construct "Analytic Renormalization Group" linear maps Aμ which, for any choice of cut-off μ, allow one to express the low energy Fourier coefficients for |νk| < μ (with the possible exception of the zero mode G0), together with the real-time correlators and spectral functions, in terms of the high energy Fourier coefficients for |νk| ≥ μ. Using a simple numerical algorithm, we show that the exact universal linear constraints on Gk can be used to systematically improve any random approximate data set obtained, for example, from Monte-Carlo simulations. Our results are illustrated with several explicit examples.
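
    For orientation, the Matsubara decomposition referred to above takes the standard textbook form (a general relation, not a result specific to this paper):

        G(\tau) = \frac{1}{\beta} \sum_{k \in \mathbb{Z}} G_k \, e^{-i \nu_k \tau},
        \qquad
        G_k = \int_0^{\beta} d\tau \, G(\tau) \, e^{i \nu_k \tau},
        \qquad
        \nu_k = \frac{2\pi k}{\beta},

    so the linear maps Aμ described in the abstract act on the coefficients G_k of this series.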

  7. Characterization of Analytical Reference Glass-1 (ARG-1)

    SciTech Connect

    Smith, G.L.

    1993-12-01

    High-level radioactive waste may be immobilized in borosilicate glass at the West Valley Demonstration Project, West Valley, New York, the Defense Waste Processing Facility (DWPF), Aiken, South Carolina, and the Hanford Waste Vitrification Project (HWVP), Richland, Washington. The vitrified waste form will be stored in stainless steel canisters before its eventual transfer to a geologic repository for long-term disposal. Waste Acceptance Product Specifications (WAPS) (DOE 1993), Section 1.1.2, requires that the waste form producers report the measured chemical composition of the vitrified waste in their production records before disposal. Chemical analysis of glass waste forms is receiving increased attention due to qualification requirements of vitrified waste forms. The Pacific Northwest Laboratory (PNL) has been supporting the glass producers' analytical laboratories by a continuing program of multilaboratory analytical testing using interlaboratory "round robin" methods. At the PNL Materials Characterization Center Analytical Round Robin 4 workshop "Analysis of Nuclear Waste Glass and Related Materials," January 16-17, 1990, Pleasanton, California, the meeting attendees decided that simulated nuclear waste analytical reference glasses were needed for use as analytical standards. Use of common standard analytical reference materials would allow the glass producers' analytical laboratories to calibrate procedures and instrumentation, to control laboratory performance and conduct self-appraisals, and to help qualify their various waste forms.

  8. Assessment of a climate model to reproduce rainfall variability and extremes over Southern Africa

    NASA Astrophysics Data System (ADS)

    Williams, C. J. R.; Kniveton, D. R.; Layberry, R.

    2010-01-01

    It is increasingly accepted that any possible climate change will not only have an influence on mean climate but may also significantly alter climatic variability. A change in the distribution and magnitude of extreme rainfall events (associated with changing variability), such as droughts or flooding, may have a far greater impact on human and natural systems than a changing mean. This issue is of particular importance for environmentally vulnerable regions such as southern Africa. The sub-continent is considered especially vulnerable to and ill-equipped (in terms of adaptation) for extreme events, due to a number of factors including extensive poverty, famine, disease and political instability. Rainfall variability and the identification of rainfall extremes is a function of scale, so high spatial and temporal resolution data are preferred to identify extreme events and accurately predict future variability. The majority of previous climate model verification studies have compared model output with observational data at monthly timescales. In this research, the assessment of ability of a state of the art climate model to simulate climate at daily timescales is carried out using satellite-derived rainfall data from the Microwave Infrared Rainfall Algorithm (MIRA). This dataset covers the period from 1993 to 2002 and the whole of southern Africa at a spatial resolution of 0.1° longitude/latitude. This paper concentrates primarily on the ability of the model to simulate the spatial and temporal patterns of present-day rainfall variability over southern Africa and is not intended to discuss possible future changes in climate as these have been documented elsewhere. Simulations of current climate from the UK Meteorological Office Hadley Centre's climate model, in both regional and global mode, are firstly compared to the MIRA dataset at daily timescales. Secondly, the ability of the model to reproduce daily rainfall extremes is assessed, again by a comparison with

  9. Safe and Reproducible Preparation of Functional Dendritic Cells for Immunotherapy in Glioblastoma Patients

    PubMed Central

    Lisini, Daniela; Pogliani, Simona; Dossena, Marta; Bersano, Anna; Pellegatta, Serena; Parati, Eugenio; Finocchiaro, Gaetano; Frigerio, Simona

    2015-01-01

    Cell therapy based on dendritic cells (DCs) pulsed with tumor lysate is a promising approach in addition to conventional therapy for the treatment of patients with glioblastoma (GB). The success of this approach strongly depends on the ability to generate high-quality, functionally mature DCs (mDCs), with a high level of standardization and in compliance with Good Manufacturing Practices. In the cell factory of the Carlo Besta Foundation, two phase I clinical trials on immunotherapy with tumor lysate-loaded DCs as treatment for GB are ongoing. From 2010 to 2014, 54 patients were enrolled in the studies and 54 batches of DCs were prepared. We retrospectively analyzed the results of the quality control tests carried out on each produced batch, evaluating yield of mDCs and their quality in terms of microbiological safety and immunological efficacy. The number of mDCs obtained allowed the treatment of all the enrolled patients. All 54 batches were sterile, conformed to acceptable endotoxin levels, and were free of Mycoplasma species and adventitious viruses. During culture, cells maintained a high percentage of viability (87%–98%), and all batches showed high viability after thawing (mean ± SD: 94.6% ± 2.9%). Phenotype evaluation of mDCs showed an evident upregulation of markers typical of DC maturation; mixed lymphocyte reaction tests for the functional evaluation of DCs demonstrated that all batches were able to induce lymphocyte responses. These results demonstrated that our protocol for DC preparation is highly reproducible and permits generation of large numbers of safe and functional DCs for in vivo use in immunotherapy approaches. Significance Cell therapy based on antigen-pulsed dendritic cells (DCs) is a promising approach for the treatment of glioblastoma patients. The success of this approach strongly depends on the ability to generate high-quality, functional DCs with a high level of standardization, ensuring reproducibility, efficacy, and safety of the

  10. What Are Acceptable Limits of Radiation?

    NASA Video Gallery

    Brad Gersey, lead research scientist at the Center for Radiation Engineering and Science for Space Exploration, or CRESSE, at Prairie View A&M University, describes the legal and acceptable limits ...

  11. Behavioral genetics: scientific and social acceptance.

    PubMed

    Lorenz, David R

    2003-01-01

    Human behavioral genetics can be broadly defined as the attempt to characterize and define the genetic or hereditary basis for human behavior. Examination of the history of these scientific enterprises reveals episodes of controversy, and an apparent distinction between scientific and social acceptance of the genetic nature of such complex behaviors. This essay will review the history and methodology of behavioral genetics research, including a more detailed look at case histories involving behavioral genetic research for aggressive behavior and alcoholism. It includes a discussion of the scientific versus social qualities of the acceptance of behavioral genetics research, as well as the development of a general model for scientific acceptance involving the researchers, the scientific literature, the scientific peer group, the mainstream media, and the public at large. From this model follows a discussion of the means and complications by which behavioral genetics research may be accepted by society, and an analysis of how future studies might be conducted.

  12. 7 CFR 1205.326 - Acceptance.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE COTTON RESEARCH AND PROMOTION Cotton Research and Promotion Order Cotton Board § 1205.326 Acceptance. Any person selected by the Secretary as...

  13. 7 CFR 1205.326 - Acceptance.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE COTTON RESEARCH AND PROMOTION Cotton Research and Promotion Order Cotton Board § 1205.326 Acceptance. Any person selected by the Secretary as...

  14. 7 CFR 1205.326 - Acceptance.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE COTTON RESEARCH AND PROMOTION Cotton Research and Promotion Order Cotton Board § 1205.326 Acceptance. Any person selected by the Secretary as...

  15. Integrated Model for E-Learning Acceptance

    NASA Astrophysics Data System (ADS)

    Ramadiani; Rodziah, A.; Hasan, S. M.; Rusli, A.; Noraini, C.

    2016-01-01

    E-learning will not work if the system is not used in accordance with user needs, and the user interface is very important for encouraging use of the application. Many theories have discussed user interface usability evaluation and technology acceptance separately; relating interface usability evaluation to user acceptance could enhance the e-learning process. Therefore, an evaluation model for e-learning interface acceptance is considered important to investigate. The aim of this study is to propose an integrated e-learning user interface acceptance evaluation model. This model combines several theories of e-learning interface measurement, such as user learning style, usability evaluation, and user benefit. These constructs were formulated into questionnaires that were distributed to 125 English Language School (ELS) students. The statistical analysis used Structural Equation Modeling with LISREL v8.80 and MANOVA.

  16. 49 CFR 193.2303 - Construction acceptance.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...: FEDERAL SAFETY STANDARDS Construction § 193.2303 Construction acceptance. No person may place in service any component until it passes all applicable inspections and tests prescribed by this subpart and...

  17. 49 CFR 193.2303 - Construction acceptance.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...: FEDERAL SAFETY STANDARDS Construction § 193.2303 Construction acceptance. No person may place in service any component until it passes all applicable inspections and tests prescribed by this subpart and...

  18. 49 CFR 193.2303 - Construction acceptance.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...: FEDERAL SAFETY STANDARDS Construction § 193.2303 Construction acceptance. No person may place in service any component until it passes all applicable inspections and tests prescribed by this subpart and...

  19. 49 CFR 193.2303 - Construction acceptance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...: FEDERAL SAFETY STANDARDS Construction § 193.2303 Construction acceptance. No person may place in service any component until it passes all applicable inspections and tests prescribed by this subpart and...

  20. Gas characterization system software acceptance test report

    SciTech Connect

    Vo, C.V.

    1996-03-28

    This document details the results of software acceptance testing of gas characterization systems. The gas characterization systems will be used to monitor the vapor spaces of waste tanks known to contain measurable concentrations of flammable gases.

  1. Nevada Test Site Waste Acceptance Criteria

    SciTech Connect

    U.S. Department of Energy, Nevada Operations Office, Waste Acceptance Criteria

    1999-05-01

    This document provides the requirements, terms, and conditions under which the Nevada Test Site will accept low-level radioactive and mixed waste for disposal; and transuranic and transuranic mixed waste for interim storage at the Nevada Test Site.

  2. Research Reproducibility in Longitudinal Multi-Center Studies Using Data from Electronic Health Records

    PubMed Central

    Zozus, Meredith N.; Richesson, Rachel L.; Walden, Anita; Tenenbaum, Jessie D.; Hammond, W.E.

    2016-01-01

    A fundamental premise of scientific research is that it should be reproducible. However, the specific requirements for reproducibility of research using electronic health record (EHR) data have not been sufficiently articulated. There is no guidance for researchers about how to assess a given project and identify provisions for reproducibility. We analyze three different clinical research initiatives that use EHR data in order to define a set of requirements to reproduce the research using the original or other datasets. We identify specific project features that drive these requirements. The resulting framework will support the much-needed discussion of strategies to ensure the reproducibility of research that uses data from EHRs. PMID:27570682

  3. Research Reproducibility in Longitudinal Multi-Center Studies Using Data from Electronic Health Records.

    PubMed

    Zozus, Meredith N; Richesson, Rachel L; Walden, Anita; Tenenbaum, Jessie D; Hammond, W E

    2016-01-01

    A fundamental premise of scientific research is that it should be reproducible. However, the specific requirements for reproducibility of research using electronic health record (EHR) data have not been sufficiently articulated. There is no guidance for researchers about how to assess a given project and identify provisions for reproducibility. We analyze three different clinical research initiatives that use EHR data in order to define a set of requirements to reproduce the research using the original or other datasets. We identify specific project features that drive these requirements. The resulting framework will support the much-needed discussion of strategies to ensure the reproducibility of research that uses data from EHRs.

  4. Assessing the Accuracy and Precision of Inorganic Geochemical Data Produced through Flux Fusion and Acid Digestions: Multiple (60+) Comprehensive Analyses of BHVO-2 and the Development of Improved "Accepted" Values

    NASA Astrophysics Data System (ADS)

    Ireland, T. J.; Scudder, R.; Dunlea, A. G.; Anderson, C. H.; Murray, R. W.

    2014-12-01

    The use of geological standard reference materials (SRMs) to assess both the accuracy and the reproducibility of geochemical data is a vital consideration in determining the major and trace element abundances of geologic, oceanographic, and environmental samples. Calibration curves commonly are generated that are predicated on accurate analyses of these SRMs. As a means to verify the robustness of these calibration curves, a SRM can also be run as an unknown item (i.e., not included as a data point in the calibration). The experimentally derived composition of the SRM can thus be compared to the certified (or otherwise accepted) value. This comparison gives a direct measure of the accuracy of the method used. Similarly, if the same SRM is analyzed as an unknown over multiple analytical sessions, the external reproducibility of the method can be evaluated. Two common bulk digestion methods used in geochemical analysis are flux fusion and acid digestion. The flux fusion technique is excellent at ensuring complete digestion of a variety of sample types, is quick, and does not involve much use of hazardous acids. However, this technique is hampered by a high amount of total dissolved solids and may be accompanied by an increased analytical blank for certain trace elements. On the other hand, acid digestion (using a cocktail of concentrated nitric, hydrochloric and hydrofluoric acids) provides an exceptionally clean digestion with very low analytical blanks. However, this technique results in a loss of Si from the system and may compromise results for a few other elements (e.g., Ge). Our lab uses flux fusion for the determination of major elements and a few key trace elements by ICP-ES, while acid digestion is used for Ti and trace element analyses by ICP-MS. Here we present major and trace element data for BHVO-2, a frequently used SRM derived from a Hawaiian basalt, gathered over a period of over two years (30+ analyses by each technique). We show that both digestion
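
    As a minimal sketch of how accuracy and external reproducibility are typically quantified when an SRM is analyzed as an unknown (the function and the example numbers below are illustrative placeholders, not data from this abstract):

        import statistics

        def srm_quality_metrics(measurements, accepted_value):
            """Summarize accuracy and external reproducibility for an SRM run as an unknown.

            measurements   -- replicate results for one element across analytical sessions
            accepted_value -- certified or otherwise accepted concentration for that element
            """
            mean = statistics.mean(measurements)
            accuracy_pct = 100.0 * (mean - accepted_value) / accepted_value   # bias vs. accepted value
            rsd_pct = 100.0 * statistics.stdev(measurements) / mean           # external reproducibility (%RSD)
            return mean, accuracy_pct, rsd_pct

        # Placeholder example: a major-element oxide (wt%) in a basaltic SRM measured over several sessions
        mean, bias, rsd = srm_quality_metrics([2.71, 2.74, 2.69, 2.73, 2.72], accepted_value=2.73)
        print(f"mean = {mean:.3f} wt%, bias = {bias:+.1f}%, external RSD = {rsd:.1f}%")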

  5. Analytical-scale microwave-assisted extraction.

    PubMed

    Eskilsson, C S; Björklund, E

    2000-12-01

    Microwave-assisted extraction (MAE) is a process of using microwave energy to heat solvents in contact with a sample in order to partition analytes from the sample matrix into the solvent. The ability to rapidly heat the sample solvent mixture is inherent to MAE and the main advantage of this technique. By using closed vessels the extraction can be performed at elevated temperatures accelerating the mass transfer of target compounds from the sample matrix. A typical extraction procedure takes 15-30 min and uses small solvent volumes in the range of 10-30 ml. These volumes are about 10 times smaller than volumes used by conventional extraction techniques. In addition, sample throughput is increased as several samples can be extracted simultaneously. In most cases recoveries of analytes and reproducibility are improved compared to conventional techniques, as shown in several applications. This review gives a brief theoretical background of microwave heating and the basic principles of using microwave energy for extraction. It also attempts to summarize all studies performed on closed-vessel MAE until now. The influences of parameters such as solvent choice, solvent volume, temperature, time and matrix characteristics (including water content) are discussed.

  6. Analytical Protein Microarrays: Advancements Towards Clinical Applications

    PubMed Central

    Sauer, Ursula

    2017-01-01

    Protein microarrays represent a powerful technology with the potential to serve as tools for the detection of a broad range of analytes in numerous applications such as diagnostics, drug development, food safety, and environmental monitoring. Key features of analytical protein microarrays include high throughput and relatively low costs due to minimal reagent consumption, multiplexing, fast kinetics and hence fast measurements, and the possibility of functional integration. So far, mainly fundamental studies in molecular and cell biology have been conducted using protein microarrays, while the potential for clinical, notably point-of-care, applications is not yet fully utilized. The question arises as to what features have to be implemented and what improvements have to be made in order to fully exploit the technology. In the past we have identified various obstacles that have to be overcome in order to promote protein microarray technology in the diagnostic field. Issues that need significant improvement to make the technology more attractive for the diagnostic market include insufficient sensitivity and reproducibility, inadequate analysis time, lack of high-quality antibodies and validated reagents, lack of automation and portable instruments, and the cost of instruments necessary for chip production and read-out. The scope of the paper at hand is to review approaches to solving these problems. PMID:28146048

  7. ANALYTICAL STAR FORMATION RATE FROM GRAVOTURBULENT FRAGMENTATION

    SciTech Connect

    Hennebelle, Patrick; Chabrier, Gilles

    2011-12-20

    We present an analytical determination of the star formation rate (SFR) in molecular clouds, based on a time-dependent extension of our analytical theory of the stellar initial mass function. The theory yields SFRs in good agreement with observations, suggesting that turbulence is the dominant, initial process responsible for star formation. In contrast to previous SFR theories, the present one does not invoke an ad hoc density threshold for star formation; instead, the SFR continuously increases with gas density, naturally yielding two different characteristic regimes, thus two different slopes in the SFR versus gas density relationship, in agreement with observational determinations. Besides the complete SFR derivation, we also provide a simplified expression, which reproduces the complete calculations reasonably well and can easily be used for quick determinations of SFRs in cloud environments. A key property at the heart of both our complete and simplified theory is that the SFR involves a density-dependent dynamical time, characteristic of each collapsing (prestellar) overdense region in the cloud, instead of one single mean or critical freefall timescale. Unfortunately, the SFR also depends on some ill-determined parameters, such as the core-to-star mass conversion efficiency and the crossing timescale. Although we provide estimates for these parameters, their uncertainty hampers a precise quantitative determination of the SFR, within less than a factor of a few.
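
    The density-dependent dynamical time invoked above is essentially the local free-fall time of each overdense region; for a region of mean density \rho the standard expression is

        t_{\rm ff}(\rho) = \sqrt{\frac{3\pi}{32\, G\, \rho}},

    so denser (prestellar) regions collapse on shorter timescales and contribute to the SFR faster than a single cloud-averaged free-fall time would suggest.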

  8. Applying analytical ultracentrifugation to nanocrystal suspensions.

    PubMed

    Jamison, Jennifer A; Krueger, Karl M; Mayo, J T; Yavuz, Cafer T; Redden, Jacina J; Colvin, Vicki L

    2009-09-02

    While applied frequently in physical biochemistry to the study of protein complexes, the quantitative use of analytical ultracentrifugation (AUC) for nanocrystal analysis is relatively rare. Its application in nanoscience is potentially very powerful as it provides a measure of nanocrystal density, size and structure directly in the solution phase. Towards that end, this paper examines the best practices for applying data collection and analysis methods for AUC, geared towards the study of biomolecules, to the unique problems of nanoparticle analysis. Using uniform nanocrystals of cadmium selenide, we compared several schemes for analyzing raw sedimentation data. Comparable values of the mean sedimentation coefficients (s-values) were found using several popular analytical approaches; however, the distribution in sample s-values is best captured using the van Holde-Weischet algorithm. Measured s-values could be reproducibly collected if sample temperature and concentration were controlled; under these circumstances, the variability for average sedimentation values was typically 5%. The full shape of the distribution in s-values, however, is not easily subjected to quantitative interpretation. Moreover, the selection of the appropriate sedimentation speed is crucial for AUC of nanocrystals as the density of inorganic nanocrystals is much larger than that of solvents. Quantitative analysis of sedimentation properties will allow for better agreement between experimental and theoretical models of nanocrystal solution behavior, as well as providing deeper insight into the hydrodynamic size and solution properties of nanomaterials.
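
    The s-values discussed above are defined by the standard Svedberg relations (textbook expressions, not specific to this study):

        s = \frac{u}{\omega^2 r} = \frac{m\,(1 - \bar{v}\,\rho_s)}{f},

    where u is the radial sedimentation velocity, \omega the rotor angular speed, r the radial position, m the particle mass, \bar{v} the particle's partial specific volume, \rho_s the solvent density, and f the frictional coefficient; the large buoyancy factor (1 - \bar{v}\rho_s) for dense inorganic cores is why the choice of sedimentation speed is so critical.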

  9. Analytical Protein Microarrays: Advancements Towards Clinical Applications.

    PubMed

    Sauer, Ursula

    2017-01-29

    Protein microarrays represent a powerful technology with the potential to serve as tools for the detection of a broad range of analytes in numerous applications such as diagnostics, drug development, food safety, and environmental monitoring. Key features of analytical protein microarrays include high throughput and relatively low costs due to minimal reagent consumption, multiplexing, fast kinetics and hence fast measurements, and the possibility of functional integration. So far, mainly fundamental studies in molecular and cell biology have been conducted using protein microarrays, while the potential for clinical, notably point-of-care, applications is not yet fully utilized. The question arises as to what features have to be implemented and what improvements have to be made in order to fully exploit the technology. In the past we have identified various obstacles that have to be overcome in order to promote protein microarray technology in the diagnostic field. Issues that need significant improvement to make the technology more attractive for the diagnostic market include insufficient sensitivity and reproducibility, inadequate analysis time, lack of high-quality antibodies and validated reagents, lack of automation and portable instruments, and the cost of instruments necessary for chip production and read-out. The scope of the paper at hand is to review approaches to solving these problems.

  10. Acceptance Test Plan for ANSYS Software

    SciTech Connect

    CREA, B.A.

    2000-10-25

    This plan governs the acceptance testing of the ANSYS software (Full Mechanical Release 5.5) for use on Project Hanford Management Contract (PHMC) computer systems (either UNIX or Microsoft Windows/NT). There are two phases to the acceptance testing covered by this test plan: program execution in accordance with the guidance provided in installation manuals; and ensuring that the results of the execution are consistent with the expected physical behavior of the system being modeled.

  11. Analytical variability in sport hematology: its importance in an antidoping setting.

    PubMed

    Banfi, Giuseppe; Lombardi, Giovanni; Colombini, Alessandra; Lippi, Giuseppe

    2011-05-01

    Hematologic parameters are commonly utilized in sports medicine and antidoping testing. However, there are no universally accepted methodologies for comparing the performance of automated blood analyzer systems. To address this problem, we selected and examined 19 studies from a review of literature published from 2000 to 2010. Meaningful discrepancies were found between measurements obtained with different analytical systems. Because harmonization and clear standardization of methods are lacking, the analytical variability often largely exceeds intra- and inter-individual biological differences, producing equivocal test results unreliable for clinical and antidoping testing. Analytical variability is a central issue in applying the Bayesian approach, and the use of different analytical technologies precludes inter-method comparisons for establishing the robustness of blood variables and their clinical significance. Therefore, future multicenter studies are needed to compare analytical methodologies and blood analyzer systems, and to establish worldwide-accepted standards and quality control protocols.
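
    A minimal sketch of the reference change value calculation commonly used to decide whether a change in a blood variable exceeds combined analytical and within-subject biological variation; the CV figures below are placeholders, not values from the article.

        import math

        def reference_change_value(cv_analytical, cv_within_subject, z=1.96):
            """Reference change value (%) for a two-sided 95% significance level.

            cv_analytical     -- analytical coefficient of variation (%)
            cv_within_subject -- within-subject biological coefficient of variation (%)
            """
            return math.sqrt(2.0) * z * math.sqrt(cv_analytical**2 + cv_within_subject**2)

        # Placeholder CVs: the larger the analytical CV, the larger the change needed
        # before a result can be flagged as atypical.
        for cv_a in (1.0, 3.0, 6.0):
            print(f"CV_A = {cv_a:.1f}% -> RCV = {reference_change_value(cv_a, 6.0):.1f}%")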

  12. Comparative study between univariate spectrophotometry and multivariate calibration as analytical tools for quantitation of Benazepril alone and in combination with Amlodipine.

    PubMed

    Farouk, M; Elaziz, Omar Abd; Tawakkol, Shereen M; Hemdan, A; Shehata, Mostafa A

    2014-04-05

    Four simple, accurate, reproducible, and selective methods have been developed and subsequently validated for the determination of Benazepril (BENZ) alone and in combination with Amlodipine (AML) in a pharmaceutical dosage form. The first method is pH-induced difference spectrophotometry, in which BENZ can be measured in the presence of AML because it shows maximum absorption at 237 nm and 241 nm in 0.1 N HCl and 0.1 N NaOH, respectively, whereas AML shows no wavelength shift in either solvent. The second method is the new Extended Ratio Subtraction Method (EXRSM) coupled to the Ratio Subtraction Method (RSM) for determination of both drugs in a commercial dosage form. The third and fourth methods are multivariate calibration methods: Principal Component Regression (PCR) and Partial Least Squares (PLS). A detailed validation of the methods was performed following the ICH guidelines; the standard curves were linear in the range of 2-30 μg/mL for BENZ in the difference and extended ratio subtraction spectrophotometric methods and 5-30 μg/mL for AML in the EXRSM method, with acceptable mean correlation coefficients for each analyte. The intra-day and inter-day precision and accuracy results were well within acceptable limits.
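
    As an illustration of the multivariate-calibration step (PLS) mentioned above, the sketch below fits a PLS model to simulated two-component spectra; scikit-learn, the simulated data, and the parameter choices are assumptions for illustration, not the authors' procedure.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)

        # Simulated training set: each row is an absorbance spectrum of a two-analyte mixture.
        n_samples, n_wavelengths = 25, 120
        concentrations = rng.uniform([2.0, 5.0], [30.0, 30.0], size=(n_samples, 2))  # hypothetical BENZ, AML levels (ug/mL)
        pure_spectra = rng.random((2, n_wavelengths))                                # hypothetical pure-component spectra
        spectra = concentrations @ pure_spectra + rng.normal(0.0, 0.01, size=(n_samples, n_wavelengths))

        # Fit the PLS calibration and predict the composition of a new "unknown" mixture.
        pls = PLSRegression(n_components=4)
        pls.fit(spectra, concentrations)
        unknown = np.array([[12.0, 18.0]]) @ pure_spectra
        print(pls.predict(unknown))   # should recover roughly [12, 18]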

  13. Evaluation of mammographic density patterns: reproducibility and concordance among scales

    PubMed Central

    2010-01-01

    this percentage was lower for the quantitative scales (21.89% for BI-RADS and 21.86% for Boyd). Conclusions Visual scales of mammographic density show a high reproducibility when appropriate training is provided. Their ability to distinguish between high and low risk render them useful for routine use by breast cancer screening programs. Quantitative-based scales are more specific than pattern-based scales in classifying populations in the high-risk group. PMID:20836850

  14. Assay Reproducibility in Clinical Studies of Plasma miRNA

    PubMed Central

    Rice, Jonathan; Roberts, Henry; Burton, James; Pan, Jianmin; States, Vanessa; Rai, Shesh N.; Galandiuk, Susan

    2015-01-01

    There are increasing reports of plasma miRNAs as biomarkers of human disease but few standards in methodologic reporting, leading to inconsistent data. We systematically reviewed plasma miRNA studies published between July 2013 and June 2014 to assess methodology. Six parameters were investigated: time to plasma extraction, methods of RNA extraction, type of miRNA, quantification, cycle threshold (Ct) setting, and methods of statistical analysis. We compared these data with a proposed standard methodologic technique. Beginning with initial screening for 380 miRNAs using microfluidic array technology and validation in an additional cohort of patients, we compared 11 miRNAs that exhibited differential expression between 16 patients with benign colorectal neoplasms (advanced adenomas) and 16 patients without any neoplasm (controls). Plasma was isolated immediately, 12, 24, 48, or 72 h following phlebotomy. miRNA was extracted using two different techniques (Trizol LS with pre-amplification or modified miRNeasy). We performed Taqman-based RT-PCR assays for the 11 miRNAs with subsequent analyses using a variable Ct setting or a fixed Ct set at 0.01, 0.03, 0.05, or 0.5. Assays were performed in duplicate by two different operators. RNU6 was the internal reference. Systematic review yielded 74 manuscripts meeting inclusion criteria. One manuscript (1.4%) documented all 6 methodological parameters, while < 5% of studies listed Ct setting. In our proposed standard technique, plasma extraction ≤12 h provided consistent ΔCt. miRNeasy extraction yielded higher miRNA concentrations and fewer non-expressed miRNAs compared to Trizol LS (1/704 miRNAs [0.14%] vs. 109/704 miRNAs [15%] not expressed, respectively). A fixed Ct bar setting of 0.03 yielded the most reproducible data, provided that <10% of miRNAs were non-expressed. There was no significant intra-operator variability. There was significant inter-operator variation using Trizol LS extraction, while this was negligible using
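
    For context, relative quantification from the Ct values discussed above is conventionally done with a ΔCt (2^-ΔCt) transform against the internal reference (RNU6 here); the Ct values in this sketch are made up for illustration.

        def relative_expression(ct_target, ct_reference):
            """Relative expression via the 2**(-dCt) transform against an internal reference."""
            delta_ct = ct_target - ct_reference
            return 2.0 ** (-delta_ct)

        # Made-up Ct values: a duplicate pair for one plasma miRNA, normalized to RNU6.
        for ct in (31.2, 31.5):
            print(f"Ct = {ct:.1f} -> relative expression vs RNU6 = {relative_expression(ct, 27.0):.4f}")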

  15. Specific conductance; theoretical considerations and application to analytical quality control

    USGS Publications Warehouse

    Miller, Ronald L.; Bradford, Wesley L.; Peters, Norman E.

    1988-01-01

    This report considers several theoretical aspects and practical applications of specific conductance to the study of natural waters. A review of accepted measurements of conductivity of secondary standard 0.01 N KCl solution suggests that a widely used algorithm for predicting the temperature variation in conductivity is in error. A new algorithm is derived and compared with accepted measurements. Instrumental temperature compensation circuits based on 0.01 N KCl or NaCl are likely to give erroneous results in unusual or special waters, such as seawater, acid mine waters, and acid rain. An approach for predicting the specific conductance of a water sample from the analytically determined major ion composition is described and critically evaluated. The model predicts the specific conductance to within ±8 percent (one standard deviation) in waters with specific conductances of 0 to 600 µS/cm. Application of this approach to analytical quality control is discussed.
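
    A minimal sketch of the general idea of predicting specific conductance from major-ion composition, summing equivalent concentrations weighted by limiting equivalent conductances; the coefficients are approximate textbook values at 25 °C, and the model described in the report includes additional refinements.

        # Approximate limiting equivalent conductances (uS/cm per meq/L); textbook values, for illustration only.
        LAMBDA = {"Ca": 59.5, "Mg": 53.1, "Na": 50.1, "K": 73.5,
                  "HCO3": 44.5, "SO4": 80.0, "Cl": 76.3}

        # Equivalent weights (mg per meq) used to convert mg/L to meq/L.
        EQ_WEIGHT = {"Ca": 20.04, "Mg": 12.15, "Na": 22.99, "K": 39.10,
                     "HCO3": 61.02, "SO4": 48.03, "Cl": 35.45}

        def predicted_conductance(ions_mg_per_l):
            """Crude predicted specific conductance (uS/cm at 25 C) from major-ion concentrations in mg/L."""
            return sum(LAMBDA[ion] * conc / EQ_WEIGHT[ion] for ion, conc in ions_mg_per_l.items())

        sample = {"Ca": 40.0, "Mg": 10.0, "Na": 15.0, "K": 2.0, "HCO3": 150.0, "SO4": 30.0, "Cl": 12.0}
        print(f"predicted ~{predicted_conductance(sample):.0f} uS/cm")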

  16. Analytic pion form factor

    NASA Astrophysics Data System (ADS)

    Lomon, Earle L.; Pacetti, Simone

    2016-09-01

    The pion electromagnetic form factor and two-pion production in electron-positron collisions are simultaneously fitted by a vector dominance model evolving to perturbative QCD at large momentum transfer. This model was previously successful in simultaneously fitting the nucleon electromagnetic form factors (spacelike region) and the electromagnetic production of nucleon-antinucleon pairs (timelike region). For this pion case dispersion relations are used to produce the analytic connection of the spacelike and timelike regions. The fit to all the data is good, especially for the newer sets of timelike data. The description of high-q2 data, in the timelike region, requires one more meson with ρ quantum numbers than listed in the 2014 Particle Data Group review.
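
    The dispersive connection between the spacelike and timelike regions mentioned above is of the standard once-subtracted form, with the charge normalization F_\pi(0) = 1 (a generic relation, not the paper's specific parametrization):

        F_\pi(q^2) = 1 + \frac{q^2}{\pi} \int_{4 m_\pi^2}^{\infty} \frac{\mathrm{Im}\, F_\pi(s)}{s\,(s - q^2 - i\epsilon)}\, ds .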

  17. VERDE Analytic Modules

    SciTech Connect

    2008-01-15

    The VERDE Analytic Modules permit the user to ingest openly available data feeds about phenomenology (storm tracks, wind, precipitation, earthquakes, wildfires, and similar natural and manmade power grid disruptions) and to forecast power outages, restoration times, customers affected, and key facilities that will lose power. Damage areas are predicted using historic damage criteria for the affected area. The modules use a cellular-automata approach to estimate the distribution circuits assigned to geo-located substations. Population estimates for the service areas are located in 1-km grid cells and converted to customer counts through demographic estimation of households and commercial firms within the population cells. Restoration times are estimated by agent-based simulation of restoration crews working according to utility-published prioritization, calibrated against historic performance.
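
    As a rough illustration of the population-to-customer conversion described above; the household size, commercial rate, and cell populations below are hypothetical, not values used by the modules.

        # Hypothetical conversion of gridded population estimates (1-km cells) to customer counts.
        AVG_HOUSEHOLD_SIZE = 2.5      # assumed persons per residential customer (hypothetical)
        FIRMS_PER_1000_PEOPLE = 25.0  # assumed commercial customers per 1000 residents (hypothetical)

        def customers_in_cell(population):
            residential = population / AVG_HOUSEHOLD_SIZE
            commercial = population * FIRMS_PER_1000_PEOPLE / 1000.0
            return round(residential + commercial)

        # Sum customer counts over the grid cells inside a predicted damage area.
        damage_area_cells = [1200, 850, 40, 3100]   # hypothetical populations of affected 1-km cells
        print(sum(customers_in_cell(p) for p in damage_area_cells), "customers estimated affected")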

  18. Normality in Analytical Psychology

    PubMed Central

    Myers, Steve

    2013-01-01

    Although C.G. Jung’s interest in normality wavered throughout his career, it was one of the areas he identified in later life as worthy of further research. He began his career using a definition of normality which would have been the target of Foucault’s criticism, had Foucault chosen to review Jung’s work. However, Jung then evolved his thinking to a standpoint that was more aligned to Foucault’s own. Thereafter, the post Jungian concept of normality has remained relatively undeveloped by comparison with psychoanalysis and mainstream psychology. Jung’s disjecta membra on the subject suggest that, in contemporary analytical psychology, too much focus is placed on the process of individuation to the neglect of applications that consider collective processes. Also, there is potential for useful research and development into the nature of conflict between individuals and societies, and how normal people typically develop in relation to the spectrum between individuation and collectivity. PMID:25379262

  19. A novel approach for quantitation of nonderivatized sialic acid in protein therapeutics using hydrophilic interaction chromatographic separation and nano quantity analyte detection.

    PubMed

    Chemmalil, Letha; Suravajjala, Sreekanth; See, Kate; Jordan, Eric; Furtado, Marsha; Sun, Chong; Hosselet, Stephen

    2015-01-01

    This paper describes a novel approach for the quantitation of nonderivatized sialic acid in glycoproteins, with separation by hydrophilic interaction chromatography and detection by a Nano Quantity Analyte Detector (NQAD). NQAD detection is based on measuring the change in size of a dry aerosol and converting the particle count rate into a chromatographic output signal. The NQAD detector is suitable for the detection of sialic acid, which lacks a sufficiently active chromophore or fluorophore. The water condensation particle counting technology allows the analyte particles to be enlarged using water vapor to provide the highest sensitivity. Derivatization-free analysis of glycoproteins using the HPLC/NQAD method with a PolyGLYCOPLEX™ amide column correlates well with an HPLC method with precolumn derivatization using 1,2-diamino-4,5-methylenedioxybenzene (DMB) as well as with Dionex-based high-pH anion-exchange chromatography (ion chromatography) with pulsed amperometric detection (HPAEC-PAD). With the elimination of the derivatization step, the HPLC/NQAD method is more efficient than the HPLC/DMB method. The HPLC/NQAD method is also more reproducible than the HPAEC-PAD method, as the latter suffers high variability because of electrode fouling during analysis. Overall, the HPLC/NQAD method offers a broad linear dynamic range as well as excellent precision, accuracy, repeatability, reliability, and ease of use, with acceptable comparability to the commonly used HPAEC-PAD and HPLC/DMB methods.

  20. [Analytical epidemiology of urolithiasis].

    PubMed

    Kodama, H; Ohno, Y

    1989-06-01

    In this paper, urolithiasis is reviewed from the standpoint of analytical epidemiology, which examines the statistical association between a given disease and a hypothesized factor with the aim of inferring causality. Factors incriminated epidemiologically in stone formation include age, sex, occupation, social class (level of affluence), season of the year and climate, dietary and fluid intake, and genetic predisposition. Since some of these factors are interlinked, they are broadly classified into five categories and reviewed here from an epidemiological standpoint. Genetic predisposition is essentially supported by the more frequent episodes of stone formation in the family members of stone formers compared with non-stone formers. Nevertheless, some environmental factors (most likely dietary habits) shared by family members are believed to be relatively more important than genetic predisposition. A hot, sunny climate may influence stone formation by inducing dehydration, with increased perspiration and increased solute concentration in a decreased urine volume, coupled with inadequate liquid intake, and possibly through greater exposure to ultraviolet radiation, which eventually results in increased vitamin D production, conceivably correlated with seasonal variation in urinary calcium and oxalate excretion. Urinary tract infections are importantly involved in the formation of magnesium ammonium phosphate stones in particular. The association with regional water hardness is still controversial. Excessive intake of coffee, tea, and alcoholic beverages seemingly increases the risk of renal calculi, though this has not been consistently confirmed. Many dietary elements have been suggested by numerous clinical and experimental investigations, but only a few are substantiated by analytical epidemiological investigations. An increased ingestion of animal protein and sugar and a decreased ingestion of dietary fiber and green-yellow vegetables are linked with the higher