Science.gov

Sample records for acceptable analytical reproducibility

  1. Accelerate Healthcare Data Analytics: An Agile Practice to Perform Collaborative and Reproducible Analyses.

    PubMed

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Li, Jing; Hu, Gang; Xie, Guotong

    2016-01-01

    Recent advances in cloud computing and machine learning have made it more convenient for researchers to gain insights from massive healthcare data, yet performing analyses on healthcare data in current practice still lacks efficiency. Moreover, collaborating among different researchers and sharing analysis results remain challenging. In this paper, we developed a practice to make the analytics process collaborative and analysis results reproducible by exploiting and extending Jupyter Notebook. After applying this practice in our use cases, we can perform analyses and deliver results with less effort and in less time than with our previous practice. PMID:27577444
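
    The record above describes extending Jupyter Notebook so that collaborators can re-run an analysis end to end. The paper's own tooling is not shown here; as a generic, hedged sketch of the same idea, the Python snippet below uses the standard `jupyter nbconvert --execute` command to re-execute a (hypothetical) notebook non-interactively and save a timestamped copy that collaborators can compare against.

      import subprocess
      from datetime import datetime
      from pathlib import Path

      def run_notebook(notebook: str, out_dir: str = "runs") -> Path:
          """Re-execute a notebook top to bottom and save a dated copy of the result."""
          Path(out_dir).mkdir(exist_ok=True)
          stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
          out_name = f"{Path(notebook).stem}-{stamp}.ipynb"
          # --execute re-runs every cell, so the saved copy reflects a clean,
          # reproducible end-to-end run rather than whatever state the author left.
          subprocess.run(
              ["jupyter", "nbconvert", "--to", "notebook", "--execute", notebook,
               "--output-dir", out_dir, "--output", out_name],
              check=True,
          )
          return Path(out_dir) / out_name

      if __name__ == "__main__":
          run_notebook("analysis.ipynb")  # hypothetical notebook name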

  3. Analytical strategies for improving the robustness and reproducibility of bioluminescent microbial bioreporters.

    PubMed

    Roda, Aldo; Roda, Barbara; Cevenini, Luca; Michelini, Elisa; Mezzanotte, Laura; Reschiglian, Pierluigi; Hakkila, Kaisa; Virta, Marko

    2011-07-01

    Whole-cell bioluminescent (BL) bioreporter technology is a useful analytical tool for developing biosensors for environmental toxicology and preclinical studies. However, when applied to real samples, several methodological problems prevent it from being widely used. Here, we propose a methodological approach for improving its analytical performance with complex matrices. We developed bioluminescent Escherichia coli and Saccharomyces cerevisiae bioreporters for copper ion detection. In the same cell, we introduced two firefly luciferases that require the same luciferin substrate but emit at different wavelengths. The expression of one was copper ion specific. The other, constitutively expressed, was used as a cell viability internal control. Engineered BL cells were characterized using the noninvasive gravitational field-flow fractionation (GrFFF) technique. A homogeneous cell population was isolated. Cells were then immobilized in a polymeric matrix, improving cell responsiveness. The bioassay was performed in 384-well black polystyrene microtiter plates directly on the sample. After 2 h of incubation at 37 °C and the addition of the luciferin, we measured the emitted light. These dual-color bioreporters showed more robustness and a wider dynamic range than bioassays based on the same strains with a single reporter gene and that use a separate cell strain as the BL control. The internal correction allowed the copper content to be evaluated accurately even in simulated toxic samples, where reduced cell viability was observed. Homogeneous cells isolated by GrFFF showed improvement in method reproducibility, particularly for yeast cells. The applicability of these bioreporters to real samples was demonstrated in tap water and wastewater treatment plant effluent samples spiked with copper and other metal ions. PMID:21603915
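
    The key analytical idea in this record is the internal viability correction: the copper-specific emission is read against the constitutively expressed control emission from the same cells. The sketch below is a minimal Python illustration of that ratio, not the authors' calibration procedure; the relative light unit (RLU) values and blank levels are hypothetical.

      def corrected_response(specific_rlu, control_rlu, blank_specific=0.0, blank_control=0.0):
          """Viability-corrected signal: analyte-specific emission normalized to the
          constitutive (viability) emission, after blank subtraction."""
          specific = specific_rlu - blank_specific
          control = control_rlu - blank_control
          if control <= 0:
              raise ValueError("control channel at or below blank: cells not viable")
          return specific / control

      # Hypothetical wells: in the toxic sample the viability (control) channel drops,
      # but the corrected ratio stays comparable, so the copper estimate is not biased.
      print(corrected_response(12500, 5000))  # clean sample -> 2.50
      print(corrected_response(6400, 2500))   # toxic sample -> 2.56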

  4. Analytical methodology for determination of helicopter IFR precision approach requirements. [pilot workload and acceptance level]

    NASA Technical Reports Server (NTRS)

    Phatak, A. V.

    1980-01-01

    A systematic analytical approach to the determination of helicopter IFR precision approach requirements is formulated. The approach is based upon the hypothesis that pilot acceptance level or opinion rating of a given system is inversely related to the degree of pilot involvement in the control task. A nonlinear simulation of the helicopter approach-to-landing task, incorporating appropriate models for the UH-1H aircraft, the environmental disturbances and the human pilot, was developed as a tool for evaluating the pilot acceptance hypothesis. The simulated pilot model is generic in nature and includes analytical representation of the human information acquisition, processing, and control strategies. Simulation analyses in the flight director mode indicate that the pilot model used is reasonable. Results of the simulation are used to identify candidate pilot workload metrics and to test the well-known performance-workload relationship. A pilot acceptance analytical methodology is formulated as a basis for further investigation, development and validation.

  5. The between-day reproducibility of fasting, satiety-related analytes in 8- to 11-year-old boys.

    PubMed

    Allsop, Susan; Rumbold, Penny L S; Green, Benjamin P

    2016-10-01

    The aim of the present study was to establish the between-day reproducibility of fasting plasma GLP-1(7-36), glucagon, leptin, insulin and glucose in lean and overweight/obese 8- to 11-year-old boys. A within-group study design was utilised wherein the boys attended two study days, separated by 1 week, on which a fasting fingertip capillary blood sample was obtained. Deming regression, mean difference, Bland-Altman limits of agreement (LOA) and typical imprecision as a percentage coefficient of variation (CV%) were used to assess reproducibility between days. On a group level, Deming regression detected no evidence of systematic or proportional bias between days for any of the satiety-related analytes; however, only glucose and plasma GLP-1(7-36) displayed low typical and random imprecision. When analysed according to body composition, good reproducibility was maintained for glucose in the overweight/obese boys and for plasma GLP-1(7-36) in those with lean body mass. The present findings demonstrate that the measurement of glucose and plasma GLP-1(7-36) by fingertip capillary sampling is, on a group level, reproducible between days in 8- to 11-year-old boys. Comparison of blood glucose obtained by fingertip capillary sampling can be made between lean and overweight/obese 8- to 11-year-old boys. Presently, the comparison of fasting plasma GLP-1(7-36) according to body weight is inappropriate because of the high between-day imprecision observed in lean boys. The use of fingertip capillary sampling in the measurement of satiety-related analytes has the potential to provide a better understanding of mechanisms that affect appetite and feeding behaviour in children. PMID:27265877
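
    The reproducibility statistics named above (mean difference, Bland-Altman limits of agreement, CV%) are straightforward to compute. The sketch below is a generic Python/NumPy illustration with made-up paired day-1/day-2 glucose values; it is not the study's data and omits the Deming regression step.

      import numpy as np

      def between_day_reproducibility(day1, day2):
          """Mean difference, Bland-Altman 95% limits of agreement and typical CV%
          for paired measurements taken on two separate days."""
          d1, d2 = np.asarray(day1, float), np.asarray(day2, float)
          diff = d2 - d1
          bias = diff.mean()
          loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))
          # Within-subject SD from duplicate measurements: sqrt(mean(diff^2) / 2).
          sd_within = np.sqrt(np.mean(diff ** 2) / 2)
          cv_pct = 100 * sd_within / np.mean((d1 + d2) / 2)
          return bias, loa, cv_pct

      # Hypothetical fasting glucose (mmol/L) for six boys on two visits one week apart.
      print(between_day_reproducibility([4.8, 5.1, 4.6, 5.3, 4.9, 5.0],
                                        [4.9, 5.0, 4.7, 5.5, 4.8, 5.1]))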

  7. A reproducible analytical system based on the multi-component analysis of triterpene acids in Ganoderma lucidum.

    PubMed

    Da, Juan; Cheng, Chun-Ru; Yao, Shuai; Long, Hua-Li; Wang, Yan-Hong; Khan, Ikhlas A; Li, Yi-Feng; Wang, Qiu-Rong; Cai, Lu-Ying; Jiang, Bao-Hong; Liu, Xuan; Wu, Wan-Ying; Guo, De-An

    2015-06-01

    Ultra-performance liquid chromatography (UPLC) and the Single Standard for Determination of Multi-Components (SSDMC) approach are becoming increasingly important for the quality control of medicinal herbs; here this approach was developed for Ganoderma lucidum. Special attention was necessary for the appropriate selection of markers, for the reproducibility of the relative retention times (RRT), and for the accuracy of the conversion factors (F). Finally, ten components were determined, with ganoderic acid A serving as the single standard. Stable system parameters were established, and with those issues successfully resolved, this analytical method can be used more broadly.
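
    In the SSDMC approach summarized above, each component is quantified from its own peak area, the single standard's response, and a relative conversion factor. The Python sketch below shows that arithmetic; the component names, peak areas and F values are hypothetical, not values from the paper.

      def ssdmc_concentrations(areas, std_area, std_conc, factors):
          """Concentration of each component from its peak area, the single standard's
          area and concentration, and a relative conversion factor F (component
          response relative to the standard)."""
          response = std_area / std_conc  # detector response per unit concentration
          return {name: area / (response * factors[name]) for name, area in areas.items()}

      # Hypothetical chromatogram with ganoderic acid A as the single standard.
      areas = {"component_B": 84000.0, "component_C": 42000.0}
      factors = {"component_B": 1.12, "component_C": 0.95}  # assumed F values
      print(ssdmc_concentrations(areas, std_area=100000.0, std_conc=50.0, factors=factors))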

  8. Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools

    NASA Astrophysics Data System (ADS)

    Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.

    2015-12-01

    Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The `recordr` library in R and the `matlab-dataone` library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. And, finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software
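
    The recordr and matlab-dataone libraries themselves are not reproduced here. As a language-neutral, hedged illustration of the kind of trace such tools capture, the Python sketch below hashes a script and its inputs and outputs into a small PROV-style record (the file names are hypothetical) that could be archived alongside a published result.

      import hashlib
      import json
      import sys
      from datetime import datetime, timezone
      from pathlib import Path

      def sha256(path):
          return hashlib.sha256(Path(path).read_bytes()).hexdigest()

      def provenance_trace(script, inputs, outputs):
          """A minimal PROV-like record: which script ran, when, on what, producing what."""
          return {
              "activity": {"script": script, "sha256": sha256(script),
                           "ended_at": datetime.now(timezone.utc).isoformat(),
                           "python": sys.version.split()[0]},
              "used": [{"file": f, "sha256": sha256(f)} for f in inputs],
              "generated": [{"file": f, "sha256": sha256(f)} for f in outputs],
          }

      if __name__ == "__main__":
          # Hypothetical artifact names; in practice these are the real analysis files.
          trace = provenance_trace("analysis.py", ["raw_data.csv"], ["figure1.png"])
          Path("provenance.json").write_text(json.dumps(trace, indent=2))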

  9. Validation of analytical methods involved in dissolution assays: acceptance limits and decision methodologies.

    PubMed

    Rozet, E; Ziemons, E; Marini, R D; Boulanger, B; Hubert, Ph

    2012-11-01

    Dissolution tests are key elements to ensure continuing product quality and performance. The ultimate goal of these tests is to assure consistent product quality within a defined set of specification criteria. Validation of an analytical method aimed at assessing the dissolution profile of products or at verifying pharmacopoeial compliance should demonstrate that this analytical method is able to correctly declare two dissolution profiles as similar or drug products as compliant with respect to their specifications. It is essential to ensure that these analytical methods are fit for their purpose, and method validation is aimed at providing this guarantee. However, even the ICH Q2 guideline gives no information explaining how to decide whether the method under validation is valid for its final purpose or not. Are all the validation criteria needed to ensure that a Quality Control (QC) analytical method for a dissolution test is valid? What acceptance limits should be set on these criteria? How should the method's validity be decided? These are the questions that this work aims to answer. Emphasis is placed on complying with the current implementation of Quality by Design (QbD) principles in the pharmaceutical industry, in order to correctly define the Analytical Target Profile (ATP) of analytical methods involved in dissolution tests. Analytical method validation then becomes the natural demonstration that the developed methods are fit for their intended purpose, rather than the perfunctory checklist exercise still generally performed to complete the filing required to obtain product marketing authorization. PMID:23084050
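
    One simple way to turn validation results into the fitness-for-purpose decision discussed above is to compare an estimated total error against a predefined acceptance limit. The Python sketch below is a simplified, hedged illustration of that decision logic, not the beta-expectation tolerance-interval procedure favoured by the authors; the bias, precision and limit values are hypothetical.

      def method_is_valid(bias_pct, cv_pct, acceptance_limit_pct, k=2.0):
          """Accept the method if estimated total error (|bias| + k * CV) stays within
          the acceptance limit chosen for the dissolution QC application."""
          total_error = abs(bias_pct) + k * cv_pct
          return total_error <= acceptance_limit_pct

      # Hypothetical validation results at one concentration level, +/-10% limit.
      print(method_is_valid(bias_pct=1.8, cv_pct=2.5, acceptance_limit_pct=10.0))  # True
      print(method_is_valid(bias_pct=4.0, cv_pct=4.5, acceptance_limit_pct=10.0))  # False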

  10. 39 CFR 3050.11 - Proposals to change an accepted analytical principle applied in the Postal Service's annual...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 39 Postal Service 1 2013-07-01 2013-07-01 false Proposals to change an accepted analytical... Postal Service POSTAL REGULATORY COMMISSION PERSONNEL PERIODIC REPORTING § 3050.11 Proposals to change an... issue a notice of proceeding to change an accepted analytical principle. In addition, any...

  11. 39 CFR 3050.11 - Proposals to change an accepted analytical principle applied in the Postal Service's annual...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 39 Postal Service 1 2012-07-01 2012-07-01 false Proposals to change an accepted analytical... Postal Service POSTAL REGULATORY COMMISSION PERSONNEL PERIODIC REPORTING § 3050.11 Proposals to change an... issue a notice of proceeding to change an accepted analytical principle. In addition, any...

  12. 39 CFR 3050.11 - Proposals to change an accepted analytical principle applied in the Postal Service's annual...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 39 Postal Service 1 2011-07-01 2011-07-01 false Proposals to change an accepted analytical... Postal Service POSTAL REGULATORY COMMISSION PERSONNEL PERIODIC REPORTING § 3050.11 Proposals to change an... issue a notice of proceeding to change an accepted analytical principle. In addition, any...

  13. A behavior-analytic account of depression and a case report using acceptance-based procedures

    PubMed Central

    Dougher, Michael J.; Hackbert, Lucianne

    1994-01-01

    Although roughly 6% of the general population is affected by depression at some time during their lifetime, the disorder has been relatively neglected by behavior analysts. The preponderance of research on the etiology and treatment of depression has been conducted by cognitive behavior theorists and biological psychiatrists and psychopharmacologists interested in the biological substrates of depression. These approaches have certainly been useful, but their reliance on cognitive and biological processes and their lack of attention to environment—behavior relations render them unsatisfactory from a behavior-analytic perspective. The purpose of this paper is to provide a behavior-analytic account of depression and to derive from this account several possible treatment interventions. In addition, case material is presented to illustrate an acceptance-based approach with a depressed client. PMID:22478195

  14. Using Functional Analytic Therapy to Train Therapists in Acceptance and Commitment Therapy, a Conceptual and Practical Framework

    ERIC Educational Resources Information Center

    Schoendorff, Benjamin; Steinwachs, Joanne

    2012-01-01

    How can therapists be effectively trained in clinical functional contextualism? In this conceptual article we propose a new way of training therapists in Acceptance and Commitment Therapy skills using tools from Functional Analytic Psychotherapy in a training context functionally similar to the therapeutic relationship. FAP has been successfully…

  15. Fast gradient separation by very high pressure liquid chromatography: reproducibility of analytical data and influence of delay between successive runs.

    PubMed

    Stankovich, Joseph J; Gritti, Fabrice; Beaver, Lois Ann; Stevenson, Paul G; Guiochon, Georges

    2013-11-29

    Five methods were used to implement fast gradient separations: constant flow rate, constant column-wall temperature, constant inlet pressure at moderate and at high pressures (controlled by a pressure controller), and programmed-flow constant pressure. For programmed-flow constant pressure, the flow rates and gradient compositions are controlled through the method input rather than by the pressure controller, so minor fluctuations in the inlet pressure do not affect the mobile phase flow rate. The reproducibilities of the retention times, the response factors, and the eluted band widths of six successive separations of the same sample (9 components) were measured with different equilibration times between 0 and 15 min. The influence of the length of the equilibration time on these reproducibilities is discussed. The results show that the average column temperature may increase from one separation to the next and that this contributes to fluctuation of the results.
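
    The reproducibility figures discussed above reduce to relative standard deviations of retention time, response factor and band width across replicate runs, computed separately for each equilibration delay. The Python sketch below shows that calculation; the retention times and delays are invented for illustration.

      from statistics import mean, stdev

      def rsd_percent(values):
          """Relative standard deviation (%) across replicate runs."""
          return 100 * stdev(values) / mean(values)

      # Hypothetical retention times (min) of one peak over six successive runs,
      # grouped by the equilibration delay inserted between runs.
      retention = {
          "0 min delay": [2.41, 2.44, 2.47, 2.49, 2.51, 2.52],
          "15 min delay": [2.42, 2.42, 2.43, 2.42, 2.43, 2.42],
      }
      for delay, times in retention.items():
          print(delay, f"RSD = {rsd_percent(times):.2f}%")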

  16. An analytical framework for flood water conservation considering forecast uncertainty and acceptable risk

    NASA Astrophysics Data System (ADS)

    Ding, Wei; Zhang, Chi; Peng, Yong; Zeng, Ruijie; Zhou, Huicheng; Cai, Ximing

    2015-06-01

    This paper addresses how much flood water can be conserved for use after the flood season through the operation of a reservoir, taking into account the residual flood control capacity (the difference between the flood conveyance capacity and the expected inflow over a lead time). A two-stage model for dynamic control of the flood-limited water level (the maximum allowed water level during the flood season, DC-FLWL) is established considering forecast uncertainty and acceptable flood risk. It is found that DC-FLWL is applicable when the reservoir inflow ranges from small to medium levels of the historical records, while both forecast uncertainty and the acceptable risk in the downstream affect the feasible space of DC-FLWL. As forecast uncertainty increases (under a given risk level) or as the acceptable risk level decreases (under a given forecast uncertainty level), the minimum required safety margin for flood control increases, and the chance to apply DC-FLWL decreases. The derived hedging rules from the modeling framework illustrate either the dominance of water conservation or of flood control, or the trade-off between the two objectives, under different levels of forecast uncertainty and acceptable risk. These rules may provide useful guidelines for conserving flood water, especially in areas with heavy water stress. The analysis is illustrated via a case study with a real-world reservoir in northeastern China.
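
    The central trade-off above, that the safety margin required for flood control grows with forecast uncertainty and shrinks as more downstream risk is accepted, can be sketched with a simple quantile argument. The Python example below assumes normally distributed forecast errors and uses invented numbers; it illustrates the trade-off only and is not the paper's two-stage model.

      from statistics import NormalDist

      def conservable_volume(residual_capacity, forecast_inflow, forecast_sd, acceptable_risk):
          """Volume that can be held back after reserving enough capacity that
          P(inflow exceeds the reserved capacity) <= acceptable_risk."""
          # Design inflow at the (1 - acceptable_risk) quantile of the forecast distribution.
          design_inflow = NormalDist(forecast_inflow, forecast_sd).inv_cdf(1 - acceptable_risk)
          margin = max(design_inflow, 0.0)
          return max(residual_capacity - margin, 0.0)

      # Hypothetical volumes (10^6 m3): noisier forecasts or stricter risk shrink the yield.
      print(conservable_volume(500, forecast_inflow=300, forecast_sd=50, acceptable_risk=0.05))
      print(conservable_volume(500, forecast_inflow=300, forecast_sd=120, acceptable_risk=0.01))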

  17. An Analytical Framework for Flood Water Conservation Considering Forecast Uncertainty and Acceptable Risk

    NASA Astrophysics Data System (ADS)

    Ding, W.; Zhang, C.

    2015-12-01

    Reservoir water levels are usually not allowed to exceed the flood-limited water level (FLWL) during the flood season, which neglects meteorological and real-time forecast information and leads to a great waste of water resources. With the development of weather forecasting, hydrologic modeling, and hydro-climatic teleconnection, streamflow forecast precision has improved considerably, providing the technical support for flood water utilization. This paper addresses how much flood water can be conserved for use after the flood season through the operation of a reservoir based on uncertain forecast information, taking into account the residual flood control capacity (the difference between the flood conveyance capacity and the expected inflow over a lead time). A two-stage model for dynamic control of the flood-limited water level (the maximum allowed water level during the flood season, DC-FLWL) is established considering forecast uncertainty and acceptable flood risk. It is found that DC-FLWL is applicable when the reservoir inflow ranges from small to medium levels of the historical records, while both forecast uncertainty and the acceptable risk in the downstream affect the feasible space of DC-FLWL. As forecast uncertainty increases (under a given risk level) or as the acceptable risk level decreases (under a given forecast uncertainty level), the minimum required safety margin for flood control increases, and the chance to apply DC-FLWL decreases. The derived hedging rules from the modeling framework illustrate either the dominance of water conservation or of flood control, or the trade-off between the two objectives, under different levels of forecast uncertainty and acceptable risk. These rules may provide useful guidelines for conserving flood water, especially in areas with heavy water stress.

  18. Reproducibility blues.

    PubMed

    Pulverer, Bernd

    2015-11-12

    Research findings advance science only if they are significant, reliable and reproducible. Scientists and journals must publish robust data in a way that renders it optimally reproducible. Reproducibility has to be incentivized and supported by the research infrastructure but without dampening innovation. PMID:26538323

  20. Towards more complete specifications for acceptable analytical performance - a plea for error grid analysis.

    PubMed

    Krouwer, Jan S; Cembrowski, George S

    2011-07-01

    We examine limitations of common analytical performance specifications for quantitative assays. Specifications can be either clinical or regulatory. Problems with current specifications include specifying limits for only 95% of the results, having only one set of limits that demarcates no harm from minor harm, using incomplete models of total error, not accounting for the potential of user error, and not supplying sufficient protocol requirements. Error grids are recommended to address these problems because they account for 100% of the data and stratify errors into different severity categories. Total error estimation from a method comparison can be used to estimate the inner region of an error grid, but the outer region needs to be addressed using risk management techniques. The risk management steps, foreign to many in laboratory medicine, are outlined.
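
    As a concrete, hedged illustration of the error-grid idea above (every result, not just 95% of them, is assigned to a clinical-severity zone), the Python sketch below classifies measured-versus-reference pairs using purely hypothetical zone boundaries; real grids such as the glucose error grid define these regions clinically rather than by fixed percentages.

      def error_grid_zone(reference, measured):
          """Assign a severity zone from the relative error. Boundaries are illustrative
          only: <=15% no harm, <=40% minor harm, otherwise serious harm."""
          rel_err = abs(measured - reference) / reference
          if rel_err <= 0.15:
              return "A (no harm)"
          if rel_err <= 0.40:
              return "B (minor harm)"
          return "C (serious harm)"

      # Hypothetical (reference, measured) pairs; every result falls in some zone.
      for ref, meas in [(5.0, 5.3), (5.0, 6.4), (5.0, 9.0)]:
          print(ref, meas, "->", error_grid_zone(ref, meas))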

  1. Reproducible Science

    PubMed Central

    Casadevall, Arturo; Fang, Ferric C.

    2010-01-01

    The reproducibility of an experimental result is a fundamental assumption in science. Yet, results that are merely confirmatory of previous findings are given low priority and can be difficult to publish. Furthermore, the complex and chaotic nature of biological systems imposes limitations on the replicability of scientific experiments. This essay explores the importance and limits of reproducibility in scientific manuscripts. PMID:20876290

  2. Acceptance and Commitment Therapy for Individuals with Disabilities: A Behavior Analytic Strategy for Addressing Private Events in Challenging Behavior.

    PubMed

    Hoffmann, Audrey N; Contreras, Bethany P; Clay, Casey J; Twohig, Michael P

    2016-03-01

    Applied behavior analysts work with many populations including individuals with developmental and intellectual disabilities. Although behavior analysts have a variety of empirically supported treatments to implement when working with individuals with disabilities, sometimes, other variables may adversely impact treatment effectiveness. The degree to which problematic thoughts and feelings (private events) influence behavior may be a variable that contributes to treatment efficacy. Traditional behavior analytic services are not always equipped to successfully address the private events influencing client behavior. In such cases, it may be beneficial for behavior analysts to consider additional philosophically aligned treatments for private events. One such treatment, acceptance and commitment therapy, may be a useful tool for behavior analysts to incorporate into their toolbox in order to help clients. The purpose of this paper is to introduce behavior analysts to a potential solution to the problem of effectively addressing private events in behavior analytic services. We then propose a model for thinking about private events in relation to clients with disabilities and present a guide for taking steps to address private events in the clinical setting. We conclude this paper with a call for research and present a possible research agenda for behavior analysts. PMID:27606236

  4. Does Acceptance and Relationship Focused Behavior Therapy Contribute to Bupropion Outcomes? A Randomized Controlled Trial of Functional Analytic Psychotherapy and Acceptance and Commitment Therapy for Smoking Cessation

    ERIC Educational Resources Information Center

    Gifford, Elizabeth V.; Kohlenberg, Barbara S.; Hayes, Steven C.; Pierson, Heather M.; Piasecki, Melissa P.; Antonuccio, David O.; Palm, Kathleen M.

    2011-01-01

    This study evaluated a treatment combining bupropion with a novel acceptance and relationship focused behavioral intervention based on the acceptance and relationship context (ARC) model. Three hundred and three smokers from a community sample were randomly assigned to bupropion, a widely used smoking cessation medication, or bupropion plus…

  5. Elusive reproducibility.

    PubMed

    Gori, Gio Batta

    2014-08-01

    Reproducibility remains a mirage for many biomedical studies because inherent experimental uncertainties generate idiosyncratic outcomes. The authentication and error rates of primary empirical data are often elusive, while multifactorial confounders beset experimental setups. Substantive methodological remedies are difficult to conceive, signifying that many biomedical studies yield more or less plausible results, depending on the attending uncertainties. Real life applications of those results remain problematic, with important exceptions for counterfactual field validations of strong experimental signals, notably for some vaccines and drugs, and for certain safety and occupational measures. It is argued that industrial, commercial and public policies and regulations could not ethically rely on unreliable biomedical results; rather, they should be rationally grounded on transparent cost-benefit tradeoffs. PMID:24882687

  7. Acceptance- and mindfulness-based interventions for the treatment of chronic pain: a meta-analytic review.

    PubMed

    Veehof, M M; Trompetter, H R; Bohlmeijer, E T; Schreurs, K M G

    2016-01-01

    The number of acceptance- and mindfulness-based interventions for chronic pain, such as acceptance and commitment therapy (ACT), mindfulness-based stress reduction (MBSR), and mindfulness-based cognitive therapy (MBCT), has increased in recent years. An update of our former systematic review and meta-analysis of studies reporting effects on the mental and physical health of chronic pain patients is therefore warranted. PubMed, EMBASE, PsycInfo and Cochrane were searched for eligible studies. The current meta-analysis included only randomized controlled trials (RCTs). Studies were rated for quality; mean quality did not improve in recent years. Pooled standardized mean differences were calculated using a random-effects model to represent the average intervention effect and to perform subgroup analyses. Outcome measures were pain intensity, depression, anxiety, pain interference, disability and quality of life. Included were twenty-five RCTs totaling 1285 patients with chronic pain, in which we compared acceptance- and mindfulness-based interventions to waitlist, (medical) treatment-as-usual, and education or support control groups. Effect sizes ranged from small (on all outcome measures except anxiety and pain interference) to moderate (on anxiety and pain interference) at post-treatment, and from small (on pain intensity and disability) to large (on pain interference) at follow-up. ACT showed significantly higher effects on depression and anxiety than MBSR and MBCT. Study quality, attrition rate, type of pain and type of control group did not moderate the effects of acceptance- and mindfulness-based interventions. Current acceptance- and mindfulness-based interventions, while not superior to traditional cognitive behavioral treatments, can be good alternatives. PMID:26818413
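
    For readers who want the pooling step above spelled out, the Python sketch below computes a DerSimonian-Laird random-effects pooled standardized mean difference, the quantity reported by this meta-analysis. The per-study effect sizes and variances are invented, not the review's data.

      def pooled_smd_random_effects(effects, variances):
          """DerSimonian-Laird random-effects pooled effect size."""
          w_fixed = [1 / v for v in variances]
          mu_fixed = sum(w * e for w, e in zip(w_fixed, effects)) / sum(w_fixed)
          q = sum(w * (e - mu_fixed) ** 2 for w, e in zip(w_fixed, effects))
          c = sum(w_fixed) - sum(w ** 2 for w in w_fixed) / sum(w_fixed)
          tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # between-study variance
          w_re = [1 / (v + tau2) for v in variances]
          return sum(w * e for w, e in zip(w_re, effects)) / sum(w_re)

      # Hypothetical standardized mean differences for pain interference from five trials.
      effects = [0.35, 0.52, 0.18, 0.61, 0.27]
      variances = [0.040, 0.055, 0.030, 0.070, 0.045]
      print(round(pooled_smd_random_effects(effects, variances), 3))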

  9. Thou Shalt Be Reproducible! A Technology Perspective

    PubMed Central

    Mair, Patrick

    2016-01-01

    This article elaborates on reproducibility in psychology from a technological viewpoint. Modern open source computational environments that foster reproducibility throughout the whole research life cycle, and to which emerging psychology researchers should be sensitized, are shown and explained. First, data archiving platforms that make datasets publicly available are presented. Second, R is advocated as the data-analytic lingua franca in psychology for achieving reproducible statistical analysis. Third, dynamic report generation environments for writing reproducible manuscripts that integrate text, data analysis, and statistical outputs such as figures and tables in a single document are described. Supplementary materials are provided in order to get the reader started with these technologies. PMID:27471486

  11. Analytic performance studies and clinical reproducibility of a real-time PCR assay for the detection of epidermal growth factor receptor gene mutations in formalin-fixed paraffin-embedded tissue specimens of non-small cell lung cancer

    PubMed Central

    2013-01-01

    Background Epidermal growth factor receptor (EGFR) gene mutations identify patients with non-small cell lung cancer (NSCLC) who have a high likelihood of benefiting from treatment with anti-EGFR tyrosine kinase inhibitors. Sanger sequencing is widely used for mutation detection but can be technically challenging, resulting in longer turn-around-time, with limited sensitivity for low levels of mutations. This manuscript details the technical performance verification studies and external clinical reproducibility studies of the cobas EGFR Mutation Test, a rapid multiplex real-time PCR assay designed to detect 41 mutations in exons 18, 19, 20 and 21. Methods The assay’s limit of detection was determined using 25 formalin-fixed paraffin-embedded tissue (FFPET)-derived and plasmid DNA blends. Assay performance for a panel of 201 specimens was compared against Sanger sequencing with resolution of discordant specimens by quantitative massively parallel pyrosequencing (MPP). Internal and external reproducibility was assessed using specimens tested in duplicate by different operators, using different reagent lots, instruments and at different sites. The effects on the performance of the cobas EGFR test of endogenous substances and nine therapeutic drugs were evaluated in ten FFPET specimens. Other tests included an evaluation of the effects of necrosis, micro-organisms and homologous DNA sequences on assay performance, and the inclusivity of the assay for less frequent mutations. Results A >95% hit rate was obtained in blends with >5% mutant alleles, as determined by MPP analysis, at a total DNA input of 150 ng. The overall percent agreement between Sanger sequencing and the cobas test was 96.7% (negative percent agreement 97.5%; positive percent agreement 95.8%). Assay repeatability was 98% when tested with two operators, instruments, and reagent lots. In the external reproducibility study, the agreement was > 99% across all sites, all operators and all reagent lots
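
    The agreement statistics quoted above are simple ratios from a two-by-two comparison against the reference method. The Python sketch below computes positive, negative and overall percent agreement; the counts are hypothetical and are not the study's 201-specimen table.

      def percent_agreement(tp, fp, fn, tn):
          """Agreement of a test against a reference method from a 2x2 table:
          tp = both positive, tn = both negative, fp/fn = discordant calls."""
          return {
              "positive_pct": 100 * tp / (tp + fn),  # agreement on reference-positive cases
              "negative_pct": 100 * tn / (tn + fp),  # agreement on reference-negative cases
              "overall_pct": 100 * (tp + tn) / (tp + fp + fn + tn),
          }

      # Hypothetical counts for a mutation-detection comparison.
      print(percent_agreement(tp=45, fp=2, fn=3, tn=150))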

  12. Phase II Fort Ord Landfill Demonstration Task 8 - Refinement of In-line Instrumental Analytical Tools to Evaluate their Operational Utility and Regulatory Acceptance

    SciTech Connect

    Daley, P F

    2006-04-03

    The overall objective of this project is the continued development, installation, and testing of continuous water sampling and analysis technologies for application to on-site monitoring of groundwater treatment systems and remediation sites. In a previous project, an on-line analytical system (OLAS) for multistream water sampling was installed at the Fort Ord Operable Unit 2 Groundwater Treatment System, with the objective of developing a simplified analytical method for detection of Compounds of Concern at that plant, and continuous sampling of up to twelve locations in the treatment system, from raw influent waters to treated effluent. Earlier implementations of the water sampling and processing system (Analytical Sampling and Analysis Platform, ASAP; A+RT, Milpitas, CA) depended on off-line integrators that produced paper plots of chromatograms, and sent summary tables to a host computer for archiving. We developed a basic LabVIEW (National Instruments, Inc., Austin, TX) based gas chromatography control and data acquisition system that was the foundation for further development and integration with the ASAP system. Advantages of this integration include electronic archiving of all raw chromatographic data, and a flexible programming environment to support development of improved ASAP operation and automated reporting. The initial goals of integrating the preexisting LabVIEW chromatography control system with the ASAP, and demonstration of a simplified, site-specific analytical method were successfully achieved. However, although the principal objective of this system was assembly of an analytical system that would allow plant operators an up-to-the-minute view of the plant's performance, several obstacles remained. Data reduction with the base LabVIEW system was limited to peak detection and simple tabular output, patterned after commercial chromatography integrators, with compound retention times and peak areas. Preparation of calibration curves, method detection

  13. Reproducible research in palaeomagnetism

    NASA Astrophysics Data System (ADS)

    Lurcock, Pontus; Florindo, Fabio

    2015-04-01

    The reproducibility of research findings is attracting increasing attention across all scientific disciplines. In palaeomagnetism as elsewhere, computer-based analysis techniques are becoming more commonplace, complex, and diverse. Analyses can often be difficult to reproduce from scratch, both for the original researchers and for others seeking to build on the work. We present a palaeomagnetic plotting and analysis program designed to make reproducibility easier. Part of the problem is the divide between interactive and scripted (batch) analysis programs. An interactive desktop program with a graphical interface is a powerful tool for exploring data and iteratively refining analyses, but usually cannot operate without human interaction. This makes it impossible to re-run an analysis automatically, or to integrate it into a larger automated scientific workflow - for example, a script to generate figures and tables for a paper. In some cases the parameters of the analysis process itself are not saved explicitly, making it hard to repeat or improve the analysis even with human interaction. Conversely, non-interactive batch tools can be controlled by pre-written scripts and configuration files, allowing an analysis to be 'replayed' automatically from the raw data. However, this advantage comes at the expense of exploratory capability: iteratively improving an analysis entails a time-consuming cycle of editing scripts, running them, and viewing the output. Batch tools also tend to require more computer expertise from their users. PuffinPlot is a palaeomagnetic plotting and analysis program which aims to bridge this gap. First released in 2012, it offers both an interactive, user-friendly desktop interface and a batch scripting interface, both making use of the same core library of palaeomagnetic functions. We present new improvements to the program that help to integrate the interactive and batch approaches, allowing an analysis to be interactively explored and refined

  14. Magnetogastrography (MGG) Reproducibility Assessments

    NASA Astrophysics Data System (ADS)

    de la Roca-Chiapas, J. M.; Córdova, T.; Hernández, E.; Solorio, S.; Solís Ortiz, S.; Sosa, M.

    2006-09-01

    Seven healthy subjects underwent a magnetic pulse of 32 mT for 17 ms, seven times in 90 minutes. The procedure was repeated one and two weeks later. Assessments of the gastric emptying were carried out for each one of the measurements and a statistical analysis of ANOVA was performed for every group of data. The gastric emptying time was 19.22 ± 5 min. Reproducibility estimation was above 85%. Therefore, magnetogastrography seems to be an excellent technique to be implemented in routine clinical trials.

  15. Opening Reproducible Research

    NASA Astrophysics Data System (ADS)

    Nüst, Daniel; Konkol, Markus; Pebesma, Edzer; Kray, Christian; Klötgen, Stephanie; Schutzeichel, Marc; Lorenz, Jörg; Przibytzin, Holger; Kussmann, Dirk

    2016-04-01

    Open access is not only a form of publishing such that research papers become available to the general public free of charge; it also refers to a trend in science toward the act of doing research becoming more open and transparent. When science transforms to open access we mean not only access to papers, research data being collected, or data being generated, but also access to the data used and the procedures carried out in the research paper. Increasingly, scientific results are generated by numerical manipulation of data that were already collected, and may involve simulation experiments that are completely carried out computationally. Reproducibility of research findings, the ability to repeat experimental procedures and confirm previously found results, is at the heart of the scientific method (Pebesma, Nüst and Bivand, 2012). As opposed to the collection of experimental data in labs or nature, computational experiments lend themselves very well to reproduction. Some of the reasons why scientists do not publish data and computational procedures that allow reproduction will be hard to change, e.g. privacy concerns in the data, fear of embarrassment or of losing a competitive advantage. Other reasons, however, involve technical aspects, and include the lack of standard procedures to publish such information and the lack of benefits after publishing them. We aim to resolve these two technical aspects. We propose a system that supports the evolution of scientific publications from static papers into dynamic, executable research documents. The DFG-funded experimental project Opening Reproducible Research (ORR) aims for the main aspects of open access, by improving the exchange of, by facilitating productive access to, and by simplifying reuse of research results that are published over the Internet. Central to the project is a new form for creating and providing research results, the executable research compendium (ERC), which not only enables third parties to

  16. Reproducing in cities.

    PubMed

    Mace, Ruth

    2008-02-01

    Reproducing in cities has always been costly, leading to lower fertility (that is, lower birth rates) in urban than in rural areas. Historically, although cities provided job opportunities, initially residents incurred the penalty of higher infant mortality, but as mortality rates fell at the end of the 19th century, European birth rates began to plummet. Fertility decline in Africa only started recently and has been dramatic in some cities. Here it is argued that both historical and evolutionary demographers are interpreting fertility declines across the globe in terms of the relative costs of child rearing, which increase to allow children to outcompete their peers. Now largely free from the fear of early death, postindustrial societies may create an environment that generates runaway parental investment, which will continue to drive fertility ever lower.

  17. Reproducible Experiment Platform

    NASA Astrophysics Data System (ADS)

    Likhomanenko, Tatiana; Rogozhnikov, Alex; Baranov, Alexander; Khairullin, Egor; Ustyuzhanin, Andrey

    2015-12-01

    Data analysis in fundamental sciences nowadays is an essential process that pushes the frontiers of our knowledge and leads to new discoveries. At the same time, the complexity of those analyses is increasing rapidly because of (a) the enormous volumes of the datasets being analyzed, (b) the variety of techniques and algorithms one has to check within a single analysis, and (c) the distributed nature of research teams, which requires special communication media for knowledge and information exchange between individual researchers. There is a lot of resemblance between the techniques and problems arising in the areas of industrial information retrieval and particle physics. To address those problems we propose the Reproducible Experiment Platform (REP), a software infrastructure to support a collaborative ecosystem for computational science. It is a Python-based solution for research teams that allows running computational experiments on shared datasets, obtaining repeatable results, and making consistent comparisons of the obtained results. We present some key features of REP based on case studies which include trigger optimization and physics analysis studies at the LHCb experiment.

  18. Reproducibility in a multiprocessor system

    DOEpatents

    Bellofatto, Ralph A; Chen, Dong; Coteus, Paul W; Eisley, Noel A; Gara, Alan; Gooding, Thomas M; Haring, Rudolf A; Heidelberger, Philip; Kopcsay, Gerard V; Liebsch, Thomas A; Ohmacht, Martin; Reed, Don D; Senger, Robert M; Steinmacher-Burow, Burkhard; Sugawara, Yutaka

    2013-11-26

    Fixing a problem is usually greatly aided if the problem is reproducible. To ensure reproducibility of a multiprocessor system, the following aspects are proposed: a deterministic system start state, a single system clock, phase alignment of clocks in the system, system-wide synchronization events, reproducible execution of system components, deterministic chip interfaces, zero-impact communication with the system, precise stop of the system and a scan of the system state.

  19. Approaches to acceptable risk

    SciTech Connect

    Whipple, C.

    1997-04-30

    Several alternative approaches to address the question "How safe is safe enough?" are reviewed and an attempt is made to apply the reasoning behind these approaches to the issue of acceptability of radiation exposures received in space. The approaches to the issue of the acceptability of technological risk described here are primarily analytical, and are drawn from examples in the management of environmental health risks. These include risk-based approaches, in which specific quantitative risk targets determine the acceptability of an activity, and cost-benefit and decision analysis, which generally focus on the estimation and evaluation of risks, benefits and costs, in a framework that balances these factors against each other. These analytical methods tend by their quantitative nature to emphasize the magnitude of risks, costs and alternatives, and to downplay other factors, especially those that are not easily expressed in quantitative terms, that affect acceptance or rejection of risk. Such other factors include the issues of risk perceptions and how and by whom risk decisions are made.

  20. Open Science and Research Reproducibility

    PubMed Central

    Munafò, Marcus

    2016-01-01

    Many scientists, journals and funders are concerned about the low reproducibility of many scientific findings. One approach that may serve to improve the reliability and robustness of research is open science. Here I argue that the process of pre-registering study protocols, sharing study materials and data, and posting preprints of manuscripts may serve to improve quality control procedures at every stage of the research pipeline, and in turn improve the reproducibility of published work. PMID:27350794

  1. Waste minimization in analytical methods

    SciTech Connect

    Green, D.W.; Smith, L.L.; Crain, J.S.; Boparai, A.S.; Kiely, J.T.; Yaeger, J.S.; Schilling, J.B.

    1995-05-01

    The US Department of Energy (DOE) will require a large number of waste characterizations over a multi-year period to accomplish the Department's goals in environmental restoration and waste management. Estimates vary, but two million analyses annually are expected. The waste generated by the analytical procedures used for characterizations is a significant source of new DOE waste. Success in reducing the volume of secondary waste and the costs of handling this waste would significantly decrease the overall cost of this DOE program. Selection of appropriate analytical methods depends on the intended use of the resultant data. It is not always necessary to use a high-powered analytical method, typically at higher cost, to obtain data needed to make decisions about waste management. Indeed, for samples taken from some heterogeneous systems, the meaning of high accuracy becomes clouded if the data generated are intended to measure a property of this system. Among the factors to be considered in selecting the analytical method are the lower limit of detection, accuracy, turnaround time, cost, reproducibility (precision), interferences, and simplicity. Occasionally, there must be tradeoffs among these factors to achieve the multiple goals of a characterization program. The purpose of the work described here is to add waste minimization to the list of characteristics to be considered. In this paper the authors present results of modifying analytical methods for waste characterization to reduce both the cost of analysis and the volume of secondary wastes. Although tradeoffs may be required to minimize waste while still generating data of acceptable quality for the decision-making process, they have data demonstrating that wastes can be reduced in some cases without sacrificing accuracy or precision.

  2. Towards Reproducibility in Computational Hydrology

    NASA Astrophysics Data System (ADS)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei

    2016-04-01

    The ability to reproduce published scientific findings is a foundational principle of scientific research. Independent observation helps to verify the legitimacy of individual findings; build upon sound observations so that we can evolve hypotheses (and models) of how catchments function; and move them from specific circumstances to more general theory. The rise of computational research has brought increased focus on the issue of reproducibility across the broader scientific literature. This is because publications based on computational research typically do not contain sufficient information to enable the results to be reproduced, and therefore verified. Given the rise of computational analysis in hydrology over the past 30 years, to what extent is reproducibility, or a lack thereof, a problem in hydrology? Whilst much hydrological code is accessible, the actual code and workflow that produced, and therefore document the provenance of, published scientific findings are rarely available. We argue that in order to advance and make more robust the process of hypothesis testing and knowledge creation within the computational hydrological community, we need to build on existing open data initiatives and adopt common standards and infrastructures to: first, make code re-useable and easy to find through consistent use of metadata; second, publish well documented workflows that combine re-useable code together with data to enable published scientific findings to be reproduced; finally, use unique persistent identifiers (e.g. DOIs) to reference re-useable and reproducible code, thereby clearly showing the provenance of published scientific findings. Whilst extra effort is required to make work reproducible, there are benefits to both the individual and the broader community in doing so, which will improve the credibility of the science in the face of the need for societies to adapt to changing hydrological environments.

  3. Latent fingermark pore area reproducibility.

    PubMed

    Gupta, A; Buckley, K; Sutton, R

    2008-08-01

    The study of the reproducibility of friction ridge pore detail in fingermarks is a measure of their usefulness in personal identification. Pore area in latent prints developed using cyanoacrylate and ninhydrin were examined and measured by photomicrography using appropriate software tools. The data were analysed statistically and the results showed that pore area is not reproducible in developed latent prints, using either of the development techniques. The results add further support to the lack of reliability of pore area in personal identification. PMID:18617339

  4. Rotary head type reproducing apparatus

    DOEpatents

    Takayama, Nobutoshi; Edakubo, Hiroo; Kozuki, Susumu; Takei, Masahiro; Nagasawa, Kenichi

    1986-01-01

    In an apparatus of the kind arranged to reproduce, with a plurality of rotary heads, an information signal from a record bearing medium having many recording tracks which are parallel to each other with the information signal recorded therein and with a plurality of different pilot signals of different frequencies also recorded one by one, one in each of the recording tracks, a plurality of different reference signals of different frequencies are simultaneously generated. A tracking error is detected by using the different reference signals together with the pilot signals which are included in signals reproduced from the plurality of rotary heads.

  5. Reproducible Bioinformatics Research for Biologists

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This book chapter describes the current Big Data problem in Bioinformatics and the resulting issues with performing reproducible computational research. The core of the chapter provides guidelines and summaries of current tools/techniques that a noncomputational researcher would need to learn to pe...

  6. Reproducible Research in Computational Science

    PubMed Central

    Peng, Roger D.

    2012-01-01

    Computational science has led to exciting new developments, but the nature of the work has exposed limitations in our ability to evaluate published findings. Reproducibility has the potential to serve as a minimum standard for judging scientific claims when full independent replication of a study is not possible. PMID:22144613

  7. Reproducible research in computational science.

    PubMed

    Peng, Roger D

    2011-12-01

    Computational science has led to exciting new developments, but the nature of the work has exposed limitations in our ability to evaluate published findings. Reproducibility has the potential to serve as a minimum standard for judging scientific claims when full independent replication of a study is not possible.

  8. Performance reproducibility index for classification

    PubMed Central

    Yousefi, Mohammadmahdi R.; Dougherty, Edward R.

    2012-01-01

    Motivation: A common practice in biomarker discovery is to decide whether a large laboratory experiment should be carried out based on the results of a preliminary study on a small set of specimens. Consideration of the efficacy of this approach motivates the introduction of a probabilistic measure, for whether a classifier showing promising results in a small-sample preliminary study will perform similarly on a large independent sample. Given the error estimate from the preliminary study, if the probability of reproducible error is low, then there is really no purpose in substantially allocating more resources to a large follow-on study. Indeed, if the probability of the preliminary study providing likely reproducible results is small, then why even perform the preliminary study? Results: This article introduces a reproducibility index for classification, measuring the probability that a sufficiently small error estimate on a small sample will motivate a large follow-on study. We provide a simulation study based on synthetic distribution models that possess known intrinsic classification difficulties and emulate real-world scenarios. We also set up similar simulations on four real datasets to show the consistency of results. The reproducibility indices for different distributional models, real datasets and classification schemes are empirically calculated. The effects of reporting and multiple-rule biases on the reproducibility index are also analyzed. Availability: We have implemented in C code the synthetic data distribution model, classification rules, feature selection routine and error estimation methods. The source code is available at http://gsp.tamu.edu/Publications/supplementary/yousefi12a/. Supplementary simulation results are also included. Contact: edward@ece.tamu.edu Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:22954625
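    The index described above can be illustrated with a small Monte Carlo sketch (not the authors' code, which is in C and available at the URL given): under an assumed two-Gaussian model, an LDA classifier is trained on a small sample, and the index is estimated as the fraction of "promising" preliminary results (small-sample error below a threshold) that stay below that threshold on a large independent sample. The distribution, classifier, sample sizes and threshold are all illustrative assumptions.

      # Hypothetical sketch of a reproducibility index for classification:
      # the probability that a small-sample error estimate below a threshold
      # is followed by a similarly low error on a large independent sample.
      # Distribution model, classifier and thresholds are illustrative only.
      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(0)

      def sample(n, delta=1.0, d=5):
          """Draw n points per class from two Gaussians separated by delta."""
          x0 = rng.normal(0.0, 1.0, size=(n, d))
          x1 = rng.normal(delta / np.sqrt(d), 1.0, size=(n, d))
          return np.vstack([x0, x1]), np.r_[np.zeros(n), np.ones(n)]

      def reproducibility_index(n_small=20, n_large=2000, eps=0.2, trials=500):
          hits = passes = 0
          for _ in range(trials):
              Xs, ys = sample(n_small)
              clf = LinearDiscriminantAnalysis().fit(Xs, ys)
              err_small = np.mean(clf.predict(Xs) != ys)   # resubstitution error
              if err_small <= eps:                          # "promising" preliminary study
                  passes += 1
                  Xl, yl = sample(n_large)                  # large follow-on sample
                  hits += np.mean(clf.predict(Xl) != yl) <= eps
          return hits / passes if passes else float("nan")

      print("estimated reproducibility index:", reproducibility_index())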

  9. Reproducibility of NIF hohlraum measurements

    NASA Astrophysics Data System (ADS)

    Moody, J. D.; Ralph, J. E.; Turnbull, D. P.; Casey, D. T.; Albert, F.; Bachmann, B. L.; Doeppner, T.; Divol, L.; Grim, G. P.; Hoover, M.; Landen, O. L.; MacGowan, B. J.; Michel, P. A.; Moore, A. S.; Pino, J. E.; Schneider, M. B.; Tipton, R. E.; Smalyuk, V. A.; Strozzi, D. J.; Widmann, K.; Hohenberger, M.

    2015-11-01

    The strategy of experimentally ``tuning'' the implosion in a NIF hohlraum ignition target towards increasing hot-spot pressure, areal density of compressed fuel, and neutron yield relies on a level of experimental reproducibility. We examine the reproducibility of experimental measurements for a collection of 15 identical NIF hohlraum experiments. The measurements include incident laser power, backscattered optical power, x-ray measurements, hot-electron fraction and energy, and target characteristics. We use exact statistics to set 1-sigma confidence levels on the variations in each of the measurements. Of particular interest is the backscatter and laser-induced hot-spot locations on the hohlraum wall. Hohlraum implosion designs typically include variability specifications [S. W. Haan et al., Phys. Plasmas 18, 051001 (2011)]. We describe our findings and compare with the specifications. This work was performed under the auspices of the U.S. Department of Energy by University of California, Lawrence Livermore National Laboratory under Contract W-7405-Eng-48.
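    As a rough illustration of setting "1-sigma" confidence levels on shot-to-shot variation, the sketch below computes an exact chi-square confidence interval for the standard deviation of one synthetic measurement across 15 shots. The measurement values, the normality assumption and the 68% coverage choice are assumptions for illustration, not the authors' analysis.

      # Illustrative only: exact chi-square confidence interval for the
      # shot-to-shot standard deviation of a measured quantity across
      # N nominally identical experiments (synthetic data).
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      backscatter = rng.normal(loc=12.0, scale=0.8, size=15)   # synthetic % backscatter, 15 shots

      n = backscatter.size
      s = backscatter.std(ddof=1)          # sample standard deviation
      conf = 0.683                          # "1-sigma" two-sided coverage
      lo = s * np.sqrt((n - 1) / stats.chi2.ppf((1 + conf) / 2, n - 1))
      hi = s * np.sqrt((n - 1) / stats.chi2.ppf((1 - conf) / 2, n - 1))
      print(f"std = {s:.2f}, ~68% CI = [{lo:.2f}, {hi:.2f}]")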

  10. Cloning to reproduce desired genotypes.

    PubMed

    Westhusin, M E; Long, C R; Shin, T; Hill, J R; Looney, C R; Pryor, J H; Piedrahita, J A

    2001-01-01

    Cloned sheep, cattle, goats, pigs and mice have now been produced using somatic cells for nuclear transplantation. Animal cloning is still very inefficient with on average less than 10% of the cloned embryos transferred resulting in a live offspring. However successful cloning of a variety of different species and by a number of different laboratory groups has generated tremendous interest in reproducing desired genotypes. Some of these specific genotypes represent animal cell lines that have been genetically modified. In other cases there is a significant demand for cloning animals characterized by their inherent genetic value, for example prize livestock, household pets and rare or endangered species. A number of different variables may influence the ability to reproduce a specific genotype by cloning. These include species, source of recipient ova, cell type of nuclei donor, treatment of donor cells prior to nuclear transfer, and the techniques employed for nuclear transfer. At present, there is no solid evidence that suggests cloning will be limited to only a few specific animals, and in fact, most data collected to date suggests cloning will be applicable to a wide variety of different animals. The ability to reproduce any desired genotype by cloning will ultimately depend on the amount of time and resources invested in research.

  11. Datathons and Software to Promote Reproducible Research

    PubMed Central

    2016-01-01

    Background Datathons facilitate collaboration between clinicians, statisticians, and data scientists in order to answer important clinical questions. Previous datathons have resulted in numerous publications of interest to the critical care community and serve as a viable model for interdisciplinary collaboration. Objective We report on an open-source software called Chatto that was created by members of our group, in the context of the second international Critical Care Datathon, held in September 2015. Methods Datathon participants formed teams to discuss potential research questions and the methods required to address them. They were provided with the Chatto suite of tools to facilitate their teamwork. Each multidisciplinary team spent the next 2 days with clinicians working alongside data scientists to write code, extract and analyze data, and reformulate their queries in real time as needed. All projects were then presented on the last day of the datathon to a panel of judges that consisted of clinicians and scientists. Results Use of Chatto was particularly effective in the datathon setting, enabling teams to reduce the time spent configuring their research environments to just a few minutes—a process that would normally take hours to days. Chatto continued to serve as a useful research tool after the conclusion of the datathon. Conclusions This suite of tools fulfills two purposes: (1) facilitation of interdisciplinary teamwork through archiving and version control of datasets, analytical code, and team discussions, and (2) advancement of research reproducibility by functioning postpublication as an online environment in which independent investigators can rerun or modify analyses with relative ease. With the introduction of Chatto, we hope to solve a variety of challenges presented by collaborative data mining projects while improving research reproducibility. PMID:27558834

  12. 46 CFR 56.97-38 - Initial service leak test (reproduces 137.7).

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Initial service leak test (reproduces 137.7). 56.97-38... PIPING SYSTEMS AND APPURTENANCES Pressure Tests § 56.97-38 Initial service leak test (reproduces 137.7). (a) An initial service leak test and inspection is acceptable when other types of test are...

  13. 46 CFR 56.97-38 - Initial service leak test (reproduces 137.7).

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 2 2012-10-01 2012-10-01 false Initial service leak test (reproduces 137.7). 56.97-38... PIPING SYSTEMS AND APPURTENANCES Pressure Tests § 56.97-38 Initial service leak test (reproduces 137.7). (a) An initial service leak test and inspection is acceptable when other types of test are...

  14. 46 CFR 56.97-38 - Initial service leak test (reproduces 137.7).

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 2 2013-10-01 2013-10-01 false Initial service leak test (reproduces 137.7). 56.97-38... PIPING SYSTEMS AND APPURTENANCES Pressure Tests § 56.97-38 Initial service leak test (reproduces 137.7). (a) An initial service leak test and inspection is acceptable when other types of test are...

  15. 46 CFR 56.97-38 - Initial service leak test (reproduces 137.7).

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 2 2014-10-01 2014-10-01 false Initial service leak test (reproduces 137.7). 56.97-38... PIPING SYSTEMS AND APPURTENANCES Pressure Tests § 56.97-38 Initial service leak test (reproduces 137.7). (a) An initial service leak test and inspection is acceptable when other types of test are...

  16. 46 CFR 56.97-38 - Initial service leak test (reproduces 137.7).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Initial service leak test (reproduces 137.7). 56.97-38... PIPING SYSTEMS AND APPURTENANCES Pressure Tests § 56.97-38 Initial service leak test (reproduces 137.7). (a) An initial service leak test and inspection is acceptable when other types of test are...

  17. Reproducibility of sterilized rubber impressions.

    PubMed

    Abdelaziz, Khalid M; Hassan, Ahmed M; Hodges, J S

    2004-01-01

    Impressions, dentures and other dental appliances may be contaminated with oral micro-flora or other organisms of varying pathogenicity from patient's saliva and blood. Several approaches have been tried to control the transmission of infectious organisms via dental impressions and because disinfection is less effective and has several drawbacks for impression characterization, several sterilization methods have been suggested. This study evaluated the reproducibility of rubber impressions after sterilization by different methods. Dimensional accuracy and wettability of two rubber impression materials (vinyl polysiloxane and polyether) were evaluated after sterilization by each of three well-known methods (immersion in 2% glutaraldehyde for 10 h, autoclaving and microwave radiation). Non-sterilized impressions served as control. The effect of the tray material on impression accuracy and the effect of topical surfactant on the wettability were also evaluated. One-way ANOVA with Dunnett's method was used for statistical analysis. All sterilizing methods reduced the reproducibility of rubber impressions, although not always significantly. Microwave sterilization had a small effect on both accuracy and wettability. The greater effects of the other methods could usually be overcome by using ceramic trays and by spraying impression surfaces with surfactant before pouring the gypsum mix. There was one exception: glutaraldehyde still degraded dimensional accuracy even with ceramic trays and surfactant. We conclude that a) sterilization of rubber impressions made on acrylic trays was usually associated with a degree of dimensional change; b) microwave energy seems to be a suitable technique for sterilizing rubber impressions; c) topical surfactant application helped restore wettability of sterilized impressions. PMID:15798825

  18. Evaluation of guidewire path reproducibility.

    PubMed

    Schafer, Sebastian; Hoffmann, Kenneth R; Noël, Peter B; Ionita, Ciprian N; Dmochowski, Jacek

    2008-05-01

    The number of minimally invasive vascular interventions is increasing. In these interventions, a variety of devices are directed to and placed at the site of intervention. The device used in almost all of these interventions is the guidewire, acting as a monorail for all devices which are delivered to the intervention site. However, even with the guidewire in place, clinicians still experience difficulties during the interventions. As a first step toward understanding these difficulties and facilitating guidewire and device guidance, we have investigated the reproducibility of the final paths of the guidewire in vessel phantom models with respect to different factors: user, materials and geometry. Three vessel phantoms (vessel diameters approximately 4 mm) were constructed from silicone tubing, with tortuosity similar to the internal carotid artery, and encased in Sylgard elastomer. Several trained users repeatedly passed two guidewires of different flexibility through the phantoms under pulsatile flow conditions. After the guidewire had been placed, rotational c-arm image sequences were acquired (9 in. II mode, 0.185 mm pixel size), and the phantom and guidewire were reconstructed (512³ voxels, 0.288 mm voxel size). The reconstructed volumes were aligned. The centerlines of the guidewire and the phantom vessel were then determined using region-growing techniques. Guidewire paths appear similar across users but not across materials. The average root mean square difference of the repeated placement was 0.17 +/- 0.02 mm (plastic-coated guidewire), 0.73 +/- 0.55 mm (steel guidewire) and 1.15 +/- 0.65 mm (steel versus plastic-coated). For a given guidewire, these results indicate that the guidewire path is relatively reproducible in shape and position.
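    The kind of comparison reported above can be sketched in a few lines: after the reconstructed volumes are aligned, the repeated-placement difference is summarised as the root mean square of distances between two centerlines. The snippet below is a minimal stand-in, assuming each centerline is an (N, 3) array of points in mm and using nearest-neighbour matching, which is an illustrative choice rather than the authors' exact procedure.

      # Minimal sketch: RMS difference between two aligned guidewire centerlines.
      import numpy as np
      from scipy.spatial import cKDTree

      def rms_centerline_difference(path_a, path_b):
          """RMS of nearest-neighbour distances from path_a to path_b (mm)."""
          d, _ = cKDTree(path_b).query(path_a)   # distance from each point of A to closest point of B
          return np.sqrt(np.mean(d ** 2))

      # synthetic example: two noisy copies of the same 3D curve
      t = np.linspace(0, 2 * np.pi, 200)
      base = np.c_[np.cos(t), np.sin(t), 0.1 * t]
      a = base + np.random.default_rng(0).normal(scale=0.05, size=base.shape)
      b = base + np.random.default_rng(1).normal(scale=0.05, size=base.shape)
      print(f"RMS difference: {rms_centerline_difference(a, b):.3f} mm")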

  19. REPRODUCIBLE AND SHAREABLE QUANTIFICATIONS OF PATHOGENICITY

    PubMed Central

    Manrai, Arjun K; Wang, Brice L; Patel, Chirag J; Kohane, Isaac S

    2016-01-01

    There are now hundreds of thousands of pathogenicity assertions that relate genetic variation to disease, but most of this clinically utilized variation has no accepted quantitative disease risk estimate. Recent disease-specific studies have used control sequence data to reclassify large amounts of prior pathogenic variation, but there is a critical need to scale up both the pace and feasibility of such pathogenicity reassessments across human disease. In this manuscript we develop a shareable computational framework to quantify pathogenicity assertions. We release a reproducible “digital notebook” that integrates executable code, text annotations, and mathematical expressions in a freely accessible statistical environment. We extend previous disease-specific pathogenicity assessments to over 6,000 diseases and 160,000 assertions in the ClinVar database. Investigators can use this platform to prioritize variants for reassessment and tailor genetic model parameters (such as prevalence and heterogeneity) to expose the uncertainty underlying pathogenicity-based risk assessments. Finally, we release a website that links users to pathogenic variation for a queried disease, supporting literature, and implied disease risk calculations subject to user-defined and disease-specific genetic risk models in order to facilitate variant reassessments. PMID:26776189

  20. High throughput reproducible cantilever functionalization

    SciTech Connect

    Evans, Barbara R; Lee, Ida

    2014-01-21

    A method for functionalizing cantilevers is provided that includes providing a holder having a plurality of channels each having a width for accepting a cantilever probe and a plurality of probes. A plurality of cantilever probes are fastened to the plurality of channels of the holder by the spring clips. The wells of a well plate are filled with a functionalization solution, wherein adjacent wells in the well plate are separated by a dimension that is substantially equal to a dimension separating adjacent channels of the plurality of channels. Each cantilever probe that is fastened within the plurality of channels of the holder is applied to the functionalization solution that is contained in the wells of the well plate.

  1. High throughput reproducible cantilever functionalization

    SciTech Connect

    Evans, Barbara R; Lee, Ida

    2014-11-25

    A method for functionalizing cantilevers is provided that includes providing a holder having a plurality of channels each having a width for accepting a cantilever probe and a plurality of probes. A plurality of cantilever probes are fastened to the plurality of channels of the holder by the spring clips. The wells of a well plate are filled with a functionalization solution, wherein adjacent wells in the well plate are separated by a dimension that is substantially equal to a dimension separating adjacent channels of the plurality of channels. Each cantilever probe that is fastened within the plurality of channels of the holder is applied to the functionalization solution that is contained in the wells of the well plate.

  2. Ruggedness and reproducibility of the MBEC biofilm disinfectant efficacy test.

    PubMed

    Parker, A E; Walker, D K; Goeres, D M; Allan, N; Olson, M E; Omar, A

    2014-07-01

    The MBEC™ Physiology & Genetics Assay recently became the first approved ASTM standardized biofilm disinfectant efficacy test method. This report summarizes the results of the standardization process using Pseudomonas aeruginosa biofilms. Initial ruggedness testing of the MBEC method suggests that the assay is rugged (i.e., insensitive) to small changes to the protocol with respect to 4 factors: incubation time of the bacteria (when varied from 16 to 18h), treatment temperature (20-24°C), sonication duration (25-35min), and sonication power (130-480W). In order to assess the repeatability of MBEC results across multiple tests in the same laboratory and the reproducibility across multiple labs, an 8-lab study was conducted in which 8 concentrations of each of 3 disinfectants (a non-chlorine oxidizer, a phenolic, and a quaternary ammonium compound) were applied to biofilms using the MBEC method. The repeatability and reproducibility of the untreated control biofilms were acceptable, as indicated by small repeatability and reproducibility standard deviations (SD) (0.33 and 0.67 log10(CFU/mm(2)), respectively). The repeatability SDs of the biofilm log reductions after application of the 24 concentration and disinfectant combinations ranged from 0.22 to 1.61, and the reproducibility SDs ranged from 0.27 to 1.70. In addition, for each of the 3 disinfectant types considered, the assay was statistically significantly responsive to the increasing treatment concentrations.
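    The repeatability and reproducibility standard deviations quoted above follow the usual inter-laboratory decomposition (within-lab versus within-plus-between-lab variation). The sketch below computes ISO 5725-style estimates from a one-way ANOVA on synthetic control-biofilm densities; the lab count, replicate count and variance components are assumptions, not the study's data.

      # Illustrative repeatability vs. reproducibility SDs (ISO 5725 style)
      # from synthetic control-biofilm densities, log10(CFU/mm^2).
      import numpy as np

      rng = np.random.default_rng(2)
      n_labs, n_reps = 8, 3
      lab_bias = rng.normal(0.0, 0.6, size=n_labs)           # between-lab effect
      data = 6.5 + lab_bias[:, None] + rng.normal(0.0, 0.3, size=(n_labs, n_reps))

      lab_means = data.mean(axis=1)
      grand_mean = data.mean()
      ms_within = ((data - lab_means[:, None]) ** 2).sum() / (n_labs * (n_reps - 1))
      ms_between = n_reps * ((lab_means - grand_mean) ** 2).sum() / (n_labs - 1)

      s_r = np.sqrt(ms_within)                                # repeatability SD
      s_L2 = max((ms_between - ms_within) / n_reps, 0.0)      # between-lab variance component
      s_R = np.sqrt(s_r ** 2 + s_L2)                          # reproducibility SD
      print(f"repeatability SD = {s_r:.2f}, reproducibility SD = {s_R:.2f}")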

  3. Reproducibility of topographic measures of the glaucomatous optic nerve head

    PubMed Central

    Geyer, O; Michaeli-Cohen, A; Silver, D; Versano, D; Neudorfer, M; Dzhanov, R; Lazar, M

    1998-01-01

    AIMS/BACKGROUND—Laser scanning tomography provides an assessment of three dimensional optic disc topography. For the clinical purpose of follow up of glaucoma patients, the repeatability of the various measured variables is essential. In the present study, the reproducibility of morphometric variables calculated by the topographic scanning system, TopSS (Laser Diagnostic Technology, San Diego, CA) was investigated.
    METHODS—Two independent measurements (30 minutes apart) each consisting of three complete images of the optic disc were performed on 16 eyes of 16 glaucoma patients using a TopSS. The instrument calculates 14 morphometric variables for the characterisation of the optic disc.
    RESULTS—From the two tailed paired tests, all variables were seen to have good reproducibility. However, correlation and regression analyses showed that only the three variables, volume below, half depth area, and average cup depth, are acceptably reproducible.
    CONCLUSION—The TopSS provides three variables which describe the physiological shape of the optic disc that have high reproducibility. These three variables might be useful for following the progression of optic disc changes in glaucoma patients.

 Keywords: optic nerve head; scanning laser; glaucoma; tomography PMID:9536873

  4. Reproducibility of Mammography Units, Film Processing and Quality Imaging

    NASA Astrophysics Data System (ADS)

    Gaona, Enrique

    2003-09-01

    The purpose of this study was to carry out an exploratory survey of the problems of quality control in mammography and processor units as a diagnosis of the current situation of mammography facilities. Measurements of reproducibility, optical density, optical difference and gamma index are included. Breast cancer is the most frequently diagnosed cancer and is the second leading cause of cancer death among women in the Mexican Republic. Mammography is a radiographic examination specially designed for detecting breast pathology. We found that the problems of reproducibility of the AEC are smaller than the problems of the processor units, because almost all processors fall outside the acceptable variation limits and can therefore affect mammography image quality and the dose to the breast. Only four mammography units met the minimum score established by the ACR and FDA for the phantom image.

  5. Meteorite Atmospheric Entry Reproduced in Plasmatron

    NASA Astrophysics Data System (ADS)

    Pittarello, L.; McKibbin, S.; Goderis, S.; Soens, B.; Bariselli, F.; Barros Dias, B. R.; Zavalan, F. L.; Magin, T.; Claeys, Ph.

    2016-08-01

    The Plasmatron facility allows experimental conditions that reproduce the atmospheric entry of meteorites. Tests on basalt, as a meteorite analogue, have been performed. Preliminary results have highlighted melting and evaporation effects.

  6. Reproducible analyses of microbial food for advanced life support systems

    NASA Technical Reports Server (NTRS)

    Petersen, Gene R.

    1988-01-01

    The use of yeasts in controlled ecological life support systems (CELSS) for microbial food regeneration in space required the accurate and reproducible analysis of intracellular carbohydrate and protein levels. The reproducible analysis of glycogen was a key element in estimating overall content of edibles in candidate yeast strains. Typical analytical methods for estimating glycogen in Saccharomyces were not found to be entirely applicable to other candidate strains. Rigorous cell lysis coupled with acid/base fractionation followed by specific enzymatic glycogen analyses were required to obtain accurate results in two strains of Candida. A profile of edible fractions of these strains was then determined. The suitability of yeasts as food sources in CELSS food production processes is discussed.

  7. Explorations in statistics: statistical facets of reproducibility.

    PubMed

    Curran-Everett, Douglas

    2016-06-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eleventh installment of Explorations in Statistics explores statistical facets of reproducibility. If we obtain an experimental result that is scientifically meaningful and statistically unusual, we would like to know that our result reflects a general biological phenomenon that another researcher could reproduce if (s)he repeated our experiment. But more often than not, we may learn this researcher cannot replicate our result. The National Institutes of Health and the Federation of American Societies for Experimental Biology have created training modules and outlined strategies to help improve the reproducibility of research. These particular approaches are necessary, but they are not sufficient. The principles of hypothesis testing and estimation are inherent to the notion of reproducibility in science. If we want to improve the reproducibility of our research, then we need to rethink how we apply fundamental concepts of statistics to our science.

  8. Offer/Acceptance Ratio.

    ERIC Educational Resources Information Center

    Collins, Mimi

    1997-01-01

    Explores how human resource professionals, with above average offer/acceptance ratios, streamline their recruitment efforts. Profiles company strategies with internships, internal promotion, cooperative education programs, and how to get candidates to accept offers. Also discusses how to use the offer/acceptance ratio as a measure of program…

  9. Reproducible research in vadose zone sciences

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A significant portion of present-day soil and Earth science research is computational, involving complex data analysis pipelines, advanced mathematical and statistical models, and sophisticated computer codes. Opportunities for scientific progress are greatly diminished if reproducing and building o...

  10. Reproducibility and uncertainty of wastewater turbidity measurements.

    PubMed

    Joannis, C; Ruban, G; Gromaire, M-C; Chebbo, G; Bertrand-Krajewski, J-L; Joannis, C; Ruban, G

    2008-01-01

    Turbidity monitoring is a valuable tool for operating sewer systems, but it is often considered a somewhat tricky parameter for assessing water quality, because measured values depend on the model of sensor, and even on the operator. This paper details the main components of the uncertainty in turbidity measurements with a special focus on reproducibility, and provides guidelines for improving the reproducibility of measurements in wastewater relying on proper calibration procedures. Calibration appears to be the main source of uncertainties, and proper procedures must account for uncertainties in standard solutions as well as non-linearity of the calibration curve. With such procedures, the uncertainty and reproducibility of field measurements can be kept lower than 5% or 25 FAU. On the other hand, reproducibility has no meaning if different measuring principles (attenuation vs. nephelometry) or very different wavelengths are used.
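    To make the calibration point concrete, the sketch below fits a slightly non-linear calibration curve through turbidity standards and propagates the fit covariance to a field reading. The standard values, the assumed 5 FAU uncertainty and the quadratic model are illustrative assumptions, not values taken from the paper.

      # Hedged sketch: non-linear turbidity calibration with uncertainty propagation.
      import numpy as np
      from scipy.optimize import curve_fit

      standards = np.array([0.0, 50.0, 100.0, 200.0, 400.0])   # FAU of formazin standards (assumed)
      readings = np.array([1.0, 52.0, 106.0, 208.0, 430.0])    # raw sensor output (assumed)
      sigma = np.full_like(standards, 5.0)                      # assumed standard uncertainty

      def cal(x, a, b, c):
          return a + b * x + c * x ** 2                         # mild quadratic non-linearity

      popt, pcov = curve_fit(cal, readings, standards, sigma=sigma, absolute_sigma=True)

      def turbidity(raw):
          """Calibrated turbidity and its 1-sigma uncertainty from the fit covariance."""
          J = np.array([1.0, raw, raw ** 2])                    # gradient w.r.t. (a, b, c)
          return cal(raw, *popt), float(np.sqrt(J @ pcov @ J))

      value, u = turbidity(150.0)
      print(f"turbidity = {value:.1f} +/- {u:.1f} FAU")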

  11. Analytical applications of aptamers

    NASA Astrophysics Data System (ADS)

    Tombelli, S.; Minunni, M.; Mascini, M.

    2007-05-01

    Aptamers are single-stranded DNA or RNA ligands which can be selected for different targets starting from a library of molecules containing randomly created sequences. Aptamers have been selected to bind very different targets, from proteins to small organic dyes. Aptamers are proposed with ever increasing frequency as alternatives to antibodies as biorecognition elements in analytical devices, in order to satisfy the demand for quick, cheap, simple and highly reproducible analytical devices, especially for protein detection in the medical field or for the detection of smaller molecules in environmental and food analysis. In our recent experience, DNA and RNA aptamers, specific for three different proteins (Tat, IgE and thrombin), have been exploited as bio-recognition elements to develop specific biosensors (aptasensors). These recognition elements have been coupled to piezoelectric quartz crystals and surface plasmon resonance (SPR) devices as transducers, where the aptamers have been immobilized on the gold surface of the crystal electrodes or on SPR chips, respectively.

  12. Regulating Ultrasound Cavitation in order to Induce Reproducible Sonoporation

    NASA Astrophysics Data System (ADS)

    Mestas, J.-L.; Alberti, L.; El Maalouf, J.; Béra, J.-C.; Gilles, B.

    2010-03-01

    Sonoporation is thought to be linked to cavitation, which generally appears to be a non-reproducible and unstationary phenomenon. In order to obtain an acceptable trade-off between cell mortality and transfection, a regulated cavitation generator based on an acoustical cavitation measurement was developed and tested. The medium to be sonicated is placed in a sample tray. This tray is immersed in degassed water and positioned above the face of a flat ultrasonic transducer (frequency: 445 kHz; intensity range: 0.08-1.09 W/cm2). This technical configuration was assumed to be conducive to standing-wave generation through reflection at the air/medium interface in the well, thus enhancing the cavitation phenomenon. Laterally to the transducer, a homemade hydrophone was oriented to receive the acoustical signal from the bubbles. From this spectral signal, recorded at intervals of 5 ms, a cavitation index is calculated as the mean of the cavitation spectrum integration on a logarithmic scale, and the excitation power is automatically corrected. The device generates stable and reproducible cavitation levels for a wide range of cavitation setpoints, from stable cavitation conditions up to fully developed inertial cavitation. For the ultrasound intensity range used, the time delay of the response is lower than 200 ms. The cavitation regulation device was evaluated in terms of chemical bubble-collapse effects: hydroxyl radical production was measured on terephthalic acid solutions. In open loop, the results show great variability whatever the excitation power; in closed loop, by contrast, reproducibility is high. The device was also used to study the sonodynamic effect, where the regulation provides more reproducible results independent of the cell medium and experimental conditions (temperature, pressure). Other applications of this regulated cavitation device concern internalization of different particles (Quantum Dot), molecules (SiRNA) or plasmids (GFP, DsRed) into different
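    The regulation loop described above (an index computed from the hydrophone spectrum every few milliseconds, compared with a setpoint, and the excitation power corrected) can be sketched as a simple feedback controller. Everything below is a toy stand-in: the "spectrum" model, the proportional gain and the setpoint are arbitrary illustrative choices, not the authors' implementation.

      # Conceptual sketch of a regulated cavitation loop: the index is the
      # mean of the log-scale spectrum; power is corrected toward a setpoint.
      import numpy as np

      rng = np.random.default_rng(3)

      def cavitation_index(spectrum):
          """Mean of the spectrum on a logarithmic (dB-like) scale."""
          return float(np.mean(10.0 * np.log10(spectrum + 1e-12)))

      def toy_bubble_spectrum(power):
          """Fake hydrophone spectrum whose level grows with excitation power."""
          return power * (1.0 + 0.3 * rng.random(256)) + 0.05 * rng.random(256)

      setpoint, power, gain = 0.0, 0.2, 0.02         # dB target, W/cm^2, loop gain (all assumed)
      for step in range(200):                         # one correction every ~5 ms in the paper
          idx = cavitation_index(toy_bubble_spectrum(power))
          power += gain * (setpoint - idx)            # proportional correction
          power = float(np.clip(power, 0.08, 1.09))   # stay inside the stated intensity range
      print(f"settled power ~ {power:.2f} W/cm^2, index ~ {idx:.2f} dB")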

  13. Analytical testing

    NASA Technical Reports Server (NTRS)

    Flannelly, W. G.; Fabunmi, J. A.; Nagy, E. J.

    1981-01-01

    Analytical methods for combining flight acceleration and strain data with shake test mobility data to predict the effects of structural changes on flight vibrations and strains are presented. This integration of structural dynamic analysis with flight performance is referred to as analytical testing. The objective of this methodology is to analytically estimate the results of flight testing contemplated structural changes with minimum flying and change trials. The category of changes to the aircraft includes mass, stiffness, absorbers, isolators, and active suppressors. Examples of applying the analytical testing methodology using flight test and shake test data measured on an AH-1G helicopter are included. The techniques and procedures for vibration testing and modal analysis are also described.

  14. Assessing the reproducibility of discriminant function analyses.

    PubMed

    Andrew, Rose L; Albert, Arianne Y K; Renaut, Sebastien; Rennison, Diana J; Bock, Dan G; Vines, Tim

    2015-01-01

    Data are the foundation of empirical research, yet all too often the datasets underlying published papers are unavailable, incorrect, or poorly curated. This is a serious issue, because future researchers are then unable to validate published results or reuse data to explore new ideas and hypotheses. Even if data files are securely stored and accessible, they must also be accompanied by accurate labels and identifiers. To assess how often problems with metadata or data curation affect the reproducibility of published results, we attempted to reproduce Discriminant Function Analyses (DFAs) from the field of organismal biology. DFA is a commonly used statistical analysis that has changed little since its inception almost eight decades ago, and therefore provides an opportunity to test reproducibility among datasets of varying ages. Out of 100 papers we initially surveyed, fourteen were excluded because they did not present the common types of quantitative result from their DFA or gave insufficient details of their DFA. Of the remaining 86 datasets, there were 15 cases for which we were unable to confidently relate the dataset we received to the one used in the published analysis. The reasons ranged from incomprehensible or absent variable labels, the DFA being performed on an unspecified subset of the data, or the dataset we received being incomplete. We focused on reproducing three common summary statistics from DFAs: the percent variance explained, the percentage correctly assigned and the largest discriminant function coefficient. The reproducibility of the first two was fairly high (20 of 26, and 44 of 60 datasets, respectively), whereas our success rate with the discriminant function coefficients was lower (15 of 26 datasets). When considering all three summary statistics, we were able to completely reproduce 46 (65%) of 71 datasets. While our results show that a majority of studies are reproducible, they highlight the fact that many studies still are not the
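    For readers unfamiliar with the three summary statistics targeted here, the sketch below recomputes them with a standard linear discriminant analysis on the well-known iris data (not any of the surveyed datasets); the resubstitution assignment rate and raw coefficient scaling are illustrative choices that individual papers may define differently.

      # Recomputing typical DFA summary statistics on example data.
      import numpy as np
      from sklearn.datasets import load_iris
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      X, y = load_iris(return_X_y=True)
      lda = LinearDiscriminantAnalysis(n_components=2).fit(X, y)

      pct_variance = 100 * lda.explained_variance_ratio_    # per discriminant function
      pct_correct = 100 * np.mean(lda.predict(X) == y)       # percentage correctly assigned
      largest_coef = np.max(np.abs(lda.scalings_))           # largest raw DF coefficient

      print("percent variance explained:", np.round(pct_variance, 1))
      print(f"percentage correctly assigned: {pct_correct:.1f}%")
      print(f"largest discriminant function coefficient: {largest_coef:.2f}")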

  15. Assessing the reproducibility of discriminant function analyses

    PubMed Central

    Andrew, Rose L.; Albert, Arianne Y.K.; Renaut, Sebastien; Rennison, Diana J.; Bock, Dan G.

    2015-01-01

    Data are the foundation of empirical research, yet all too often the datasets underlying published papers are unavailable, incorrect, or poorly curated. This is a serious issue, because future researchers are then unable to validate published results or reuse data to explore new ideas and hypotheses. Even if data files are securely stored and accessible, they must also be accompanied by accurate labels and identifiers. To assess how often problems with metadata or data curation affect the reproducibility of published results, we attempted to reproduce Discriminant Function Analyses (DFAs) from the field of organismal biology. DFA is a commonly used statistical analysis that has changed little since its inception almost eight decades ago, and therefore provides an opportunity to test reproducibility among datasets of varying ages. Out of 100 papers we initially surveyed, fourteen were excluded because they did not present the common types of quantitative result from their DFA or gave insufficient details of their DFA. Of the remaining 86 datasets, there were 15 cases for which we were unable to confidently relate the dataset we received to the one used in the published analysis. The reasons ranged from incomprehensible or absent variable labels, the DFA being performed on an unspecified subset of the data, or the dataset we received being incomplete. We focused on reproducing three common summary statistics from DFAs: the percent variance explained, the percentage correctly assigned and the largest discriminant function coefficient. The reproducibility of the first two was fairly high (20 of 26, and 44 of 60 datasets, respectively), whereas our success rate with the discriminant function coefficients was lower (15 of 26 datasets). When considering all three summary statistics, we were able to completely reproduce 46 (65%) of 71 datasets. While our results show that a majority of studies are reproducible, they highlight the fact that many studies still are not the

  16. Relevance relations for the concept of reproducibility

    PubMed Central

    Atmanspacher, H.; Bezzola Lambert, L.; Folkers, G.; Schubiger, P. A.

    2014-01-01

    The concept of reproducibility is widely considered a cornerstone of scientific methodology. However, recent problems with the reproducibility of empirical results in large-scale systems and in biomedical research have cast doubts on its universal and rigid applicability beyond the so-called basic sciences. Reproducibility is a particularly difficult issue in interdisciplinary work where the results to be reproduced typically refer to different levels of description of the system considered. In such cases, it is mandatory to distinguish between more and less relevant features, attributes or observables of the system, depending on the level at which they are described. For this reason, we propose a scheme for a general ‘relation of relevance’ between the level of complexity at which a system is considered and the granularity of its description. This relation implies relevance criteria for particular selected aspects of a system and its description, which can be operationally implemented by an interlevel relation called ‘contextual emergence’. It yields a formally sound and empirically applicable procedure to translate between descriptive levels and thus construct level-specific criteria for reproducibility in an overall consistent fashion. Relevance relations merged with contextual emergence challenge the old idea of one fundamental ontology from which everything else derives. At the same time, our proposal is specific enough to resist the backlash into a relativist patchwork of unconnected model fragments. PMID:24554574

  17. Reproducibility responsibilities in the HPC arena

    SciTech Connect

    Fahey, Mark R; McLay, Robert

    2014-01-01

    Expecting bit-for-bit reproducibility in the HPC arena is not feasible because of the ever changing hardware and software. No user's application is an island; it lives in an HPC ecosystem that changes over time. Old hardware stops working and even old software won't run on new hardware. Further, software libraries change over time either by changing the internals or even interfaces. So bit-for-bit reproducibility should not be expected. Rather, a reasonable expectation is that results are reproducible within error bounds, or that the answers are close (which is its own debate). To expect a researcher to reproduce their own results or the results of others within some error bounds, there must be enough information to recreate all the details of the experiment. This requires complete documentation of all phases of the researcher's workflow; from code to versioning to programming and runtime environments to publishing of data. This argument is the core statement of the Yale 2009 Declaration on Reproducible Research [1]. Although the HPC ecosystem is often outside the researchers' control, the application code could be built almost identically and there is a chance for very similar results with only round-off error differences. To achieve complete documentation at every step, the researcher, the computing center, and the funding agencies all have a role. In this thesis, the role of the researcher is expanded upon as compared to the Yale report and the role of the computing centers is described.

  18. Reproducible measurements of MPI performance characteristics.

    SciTech Connect

    Gropp, W.; Lusk, E.

    1999-06-25

    In this paper we describe the difficulties inherent in making accurate, reproducible measurements of message-passing performance. We describe some of the mistakes often made in attempting such measurements and the consequences of such mistakes. We describe mpptest, a suite of performance measurement programs developed at Argonne National Laboratory, that attempts to avoid such mistakes and obtain reproducible measures of MPI performance that can be useful to both MPI implementers and MPI application writers. We include a number of illustrative examples of its use.
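    The sketch below is not mpptest itself, but it shows the basic ping-pong pattern such suites measure, averaging many repetitions per message size to damp the run-to-run variability the paper warns about. The script name in the run command is hypothetical.

      # Minimal mpi4py ping-pong timing sketch (illustrative, not mpptest).
      # Run with e.g.:  mpiexec -n 2 python pingpong.py   (script name assumed)
      import numpy as np
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank = comm.Get_rank()
      reps = 1000

      for nbytes in (8, 1024, 65536):
          buf = np.zeros(nbytes, dtype="u1")
          comm.Barrier()
          t0 = MPI.Wtime()
          for _ in range(reps):
              if rank == 0:
                  comm.Send(buf, dest=1, tag=0)
                  comm.Recv(buf, source=1, tag=0)
              elif rank == 1:
                  comm.Recv(buf, source=0, tag=0)
                  comm.Send(buf, dest=0, tag=0)
          dt = MPI.Wtime() - t0
          if rank == 0:
              print(f"{nbytes:6d} B: {1e6 * dt / (2 * reps):8.2f} us one-way (average)")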

  19. The Economics of Reproducibility in Preclinical Research

    PubMed Central

    Freedman, Leonard P.; Cockburn, Iain M.; Simcoe, Timothy S.

    2015-01-01

    Low reproducibility rates within life science research undermine cumulative knowledge production and contribute to both delays and costs of therapeutic drug development. An analysis of past studies indicates that the cumulative (total) prevalence of irreproducible preclinical research exceeds 50%, resulting in approximately US$28,000,000,000 (US$28B)/year spent on preclinical research that is not reproducible—in the United States alone. We outline a framework for solutions and a plan for long-term improvements in reproducibility rates that will help to accelerate the discovery of life-saving therapies and cures. PMID:26057340

  20. Analytical Microscopy

    SciTech Connect

    Not Available

    2006-06-01

    In the Analytical Microscopy group, within the National Center for Photovoltaic's Measurements and Characterization Division, we combine two complementary areas of analytical microscopy--electron microscopy and proximal-probe techniques--and use a variety of state-of-the-art imaging and analytical tools. We also design and build custom instrumentation and develop novel techniques that provide unique capabilities for studying materials and devices. In our work, we collaborate with you to solve materials- and device-related R&D problems. This sheet summarizes the uses and features of four major tools: transmission electron microscopy, scanning electron microscopy, the dual-beam focused-ion-beam workstation, and scanning probe microscopy.

  1. Analytical Technology

    SciTech Connect

    Goheen, Steven C.

    2001-07-01

    Characterizing environmental samples has been exhaustively addressed in the literature for most analytes of environmental concern. One of the weak areas of environmental analytical chemistry is that of radionuclides and samples contaminated with radionuclides. The analysis of samples containing high levels of radionuclides can be far more complex than that of non-radioactive samples. This chapter addresses the analysis of samples with a wide range of radioactivity. The other areas of characterization examined in this chapter are the hazardous components of mixed waste, and special analytes often associated with radioactive materials. Characterizing mixed waste is often similar to characterizing waste components in non-radioactive materials. The largest differences are in associated safety precautions to minimize exposure to dangerous levels of radioactivity. One must attempt to keep radiological dose as low as reasonably achievable (ALARA). This chapter outlines recommended procedures to safely and accurately characterize regulated components of radioactive samples.

  2. Acceptability of BCG vaccination.

    PubMed

    Mande, R

    1977-01-01

    The acceptability of BCG vaccination varies a great deal according to the country and to the period when the vaccine is given. The incidence of complications has not always a direct influence on this acceptability, which depends, for a very large part, on the risk of tuberculosis in a given country at a given time.

  3. ATLAS ACCEPTANCE TEST

    SciTech Connect

    Cochrane, J. C. , Jr.; Parker, J. V.; Hinckley, W. B.; Hosack, K. W.; Mills, D.; Parsons, W. M.; Scudder, D. W.; Stokes, J. L.; Tabaka, L. J.; Thompson, M. C.; Wysocki, Frederick Joseph; Campbell, T. N.; Lancaster, D. L.; Tom, C. Y.

    2001-01-01

    The acceptance test program for Atlas, a 23 MJ pulsed power facility for use in the Los Alamos High Energy Density Hydrodynamics program, has been completed. Completion of this program officially releases Atlas from the construction phase and readies it for experiments. Details of the acceptance test program results and of machine capabilities for experiments will be presented.

  4. Reproducibility, Controllability, and Optimization of Lenr Experiments

    NASA Astrophysics Data System (ADS)

    Nagel, David J.

    2006-02-01

    Low-energy nuclear reaction (LENR) measurements are significantly and increasingly reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments.

  5. Natural Disasters: Earth Science Readings. Reproducibles.

    ERIC Educational Resources Information Center

    Lobb, Nancy

    Natural Disasters is a reproducible teacher book that explains what scientists believe to be the causes of a variety of natural disasters and suggests steps that teachers and students can take to be better prepared in the event of a natural disaster. It contains both student and teacher sections. Teacher sections include vocabulary, an answer key,…

  6. Making Early Modern Medicine: Reproducing Swedish Bitters.

    PubMed

    Ahnfelt, Nils-Otto; Fors, Hjalmar

    2016-05-01

    Historians of science and medicine have rarely applied themselves to reproducing the experiments and practices of medicine and pharmacy. This paper delineates our efforts to reproduce "Swedish Bitters," an early modern composite medicine in wide European use from the 1730s to the present. In its original formulation, it was made from seven medicinal simples: aloe, rhubarb, saffron, myrrh, gentian, zedoary and agarikon. These were mixed in alcohol together with some theriac, a composite medicine of classical origin. The paper delineates the compositional history of Swedish Bitters and the medical rationale underlying its composition. It also describes how we go about reproducing the medicine in a laboratory using early modern pharmaceutical methods and analysing it using contemporary methods of pharmaceutical chemistry. Our aim is twofold: first, to show how reproducing medicines may provide a path towards a deeper understanding of the role of sensual and practical knowledge in the wider context of early modern medical culture; and second, how it may yield interesting results from the point of view of contemporary pharmaceutical science.

  7. Challenges for Visual Analytics

    SciTech Connect

    Thomas, James J.; Kielman, Joseph

    2009-09-23

    Visual analytics has seen unprecedented growth in its first five years of mainstream existence. Great progress has been made in a short time, yet great challenges must be met in the next decade to provide new technologies that will be widely accepted by societies throughout the world. This paper sets the stage for some of those challenges in an effort to provide the stimulus for the research, both basic and applied, to address and exceed the envisioned potential for visual analytics technologies. We start with a brief summary of the initial challenges, followed by a discussion of the initial driving domains and applications, as well as additional applications and domains that have been a part of recent rapid expansion of visual analytics usage. We look at the common characteristics of several tools illustrating emerging visual analytics technologies, and conclude with the top ten challenges for the field of study. We encourage feedback and collaborative participation by members of the research community, the wide array of user communities, and private industry.

  8. Reproducing the Wechsler Intelligence Scale for Children-Fifth Edition: Factor Model Results

    ERIC Educational Resources Information Center

    Beaujean, A. Alexander

    2016-01-01

    One of the ways to increase the reproducibility of research is for authors to provide a sufficient description of the data analytic procedures so that others can replicate the results. The publishers of the Wechsler Intelligence Scale for Children-Fifth Edition (WISC-V) do not follow these guidelines when reporting their confirmatory factor…

  9. ITK: enabling reproducible research and open science

    PubMed Central

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46. PMID:24600387

  10. ITK: enabling reproducible research and open science.

    PubMed

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46.

  11. A Physical Activity Questionnaire: Reproducibility and Validity

    PubMed Central

    Barbosa, Nicolas; Sanchez, Carlos E.; Vera, Jose A.; Perez, Wilson; Thalabard, Jean-Christophe; Rieu, Michel

    2007-01-01

    This study evaluates the reproducibility and validity of the Quantification de L’Activite Physique en Altitude chez les Enfants (QAPACE) supervised self-administered questionnaire for estimating the mean daily energy expenditure (DEE) of Bogotá’s schoolchildren. Comprehension was assessed on 324 students, whereas reproducibility was studied on a different random sample of 162 who were exposed to the questionnaire twice. Reproducibility was assessed using both the Bland-Altman plot and the intra-class correlation coefficient (ICC). Validity was studied in a randomly selected sample of 18 girls and 18 boys who completed the test-retest study. The DEE derived from the questionnaire was compared with the laboratory measurements of peak oxygen uptake (Peak VO2) from ergo-spirometry and the Leger Test. The reproducibility ICC was 0.96 (95% C.I. 0.95-0.97); by age categories 8-10, 0.94 (0.89-0.97); 11-13, 0.98 (0.96-0.99); 14-16, 0.95 (0.91-0.98). The ICC between the mean DEE estimated by the questionnaire and the direct and indirect Peak VO2 was 0.76 (0.66) (p<0.01); by age categories 8-10, 11-13, and 14-16 it was 0.89 (0.87), 0.76 (0.78) and 0.88 (0.80), respectively. The QAPACE questionnaire is reproducible and valid for estimating physical activity (PA) and showed a high correlation with Peak VO2 uptake. Key points: The presence of a supervisor and the limited size of the group, with the possibility of answering their questions, could explain the high reproducibility of this questionnaire. No study in the literature had directly addressed the issue of estimating a yearly average PA including school and vacation periods. A two-step procedure, in the population of schoolchildren of Bogotá, gives confidence in the use of the QAPACE questionnaire in a large epidemiological survey in related populations. PMID:24149485
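    For reference, the two statistics used above can be computed directly. The sketch below derives a one-way random-effects intra-class correlation, ICC(1,1), and Bland-Altman limits of agreement for two administrations of a questionnaire; the DEE values are synthetic, and ICC(1,1) is only one of several ICC forms the study could have used.

      # Test-retest sketch: one-way ICC and Bland-Altman limits on synthetic DEE data.
      import numpy as np

      rng = np.random.default_rng(4)
      true_dee = rng.normal(2000, 400, size=60)            # kcal/day, synthetic subjects
      test = true_dee + rng.normal(0, 80, size=60)
      retest = true_dee + rng.normal(0, 80, size=60)
      scores = np.c_[test, retest]                         # n subjects x k=2 administrations

      n, k = scores.shape
      subject_means = scores.mean(axis=1)
      grand = scores.mean()
      ms_between = k * ((subject_means - grand) ** 2).sum() / (n - 1)
      ms_within = ((scores - subject_means[:, None]) ** 2).sum() / (n * (k - 1))
      icc_1_1 = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

      diff = test - retest                                 # Bland-Altman components
      loa = diff.mean() + 1.96 * diff.std(ddof=1) * np.array([-1, 1])
      print(f"ICC(1,1) = {icc_1_1:.2f}; limits of agreement = [{loa[0]:.0f}, {loa[1]:.0f}] kcal/day")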

  12. Analytical sedimentology

    SciTech Connect

    Lewis, D.W. . Dept. of Geology); McConchie, D.M. . Centre for Coastal Management)

    1994-01-01

    Both a self-instruction manual and a "cookbook" guide to field and laboratory analytical procedures, this book provides an essential reference for non-specialists. With a minimum of mathematics and virtually no theory, it introduces practitioners to easy, inexpensive options for sample collection and preparation, data acquisition, analytic protocols, result interpretation and verification techniques. This step-by-step guide considers the advantages and limitations of different procedures, discusses safety and troubleshooting, and explains support skills like mapping, photography and report writing. It also offers managers, off-site engineers and others using sediments data a quick course in commissioning studies and making the most of the reports. This manual will answer the growing needs of practitioners in the field, either alone or accompanied by Practical Sedimentology, which surveys the science of sedimentology and provides a basic overview of the principles behind the applications.

  13. Reproducibility of electroretinograms recorded with DTL electrodes.

    PubMed

    Hébert, M; Lachapelle, P; Dumont, M

    The purpose of this study was to examine whether the use of the DTL fiber electrode yields stable and reproducible electroretinographic recordings. To do so, the luminance response function, derived from dark-adapted electroretinograms, was obtained from both eyes of 10 normal subjects at two recording sessions spaced by 7-14 days. The data thus generated were used to calculate the Naka-Rushton Vmax and k parameters, and the values obtained at the two recording sessions were compared. Our results showed that there was no significant difference in the values of Vmax and k calculated from the data generated at the two recording sessions. The above clearly demonstrates that the use of the DTL fiber electrode does not jeopardize, in any way, the stability and reproducibility of ERG responses.
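    The Vmax and k parameters mentioned above come from fitting the Naka-Rushton luminance-response function, commonly written V(I) = Vmax * I / (I + k). The sketch below fits that form to synthetic dark-adapted b-wave amplitudes; the intensities, amplitudes and units are assumptions for illustration.

      # Fitting the Naka-Rushton function to synthetic luminance-response data.
      import numpy as np
      from scipy.optimize import curve_fit

      def naka_rushton(I, Vmax, k):
          return Vmax * I / (I + k)

      intensity = np.logspace(-4, 1, 10)                   # flash intensity, cd.s/m^2 (assumed)
      rng = np.random.default_rng(5)
      amplitude = naka_rushton(intensity, 450.0, 0.01) + rng.normal(0, 10, intensity.size)

      popt, pcov = curve_fit(naka_rushton, intensity, amplitude, p0=[400.0, 0.05])
      Vmax, k = popt
      print(f"Vmax = {Vmax:.0f} uV, k = {k:.3g} cd.s/m^2")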

  14. Acceptance procedures: Microfilm printer

    NASA Technical Reports Server (NTRS)

    Lockwood, H. E.

    1973-01-01

    Acceptance tests were made for a special order automatic additive color microfilm printer. Tests include film capacity, film transport, resolution, illumination uniformity, exposure range checks, and color cuing considerations.

  15. Reproducing Actual Morphology of Planetary Lava Flows

    NASA Astrophysics Data System (ADS)

    Miyamoto, H.; Sasaki, S.

    1996-03-01

    Assuming that lava flows behave as non-isothermal laminar Bingham fluids, we developed a numerical code of lava flows. We take the self gravity effects and cooling mechanisms into account. The calculation method is a kind of cellular automata using a reduced random space method, which can eliminate the mesh shape dependence. We can calculate large scale lava flows precisely without numerical instability and reproduce morphology of actual lava flows.

  16. Reproducibility of liquid oxygen impact test results

    NASA Technical Reports Server (NTRS)

    Gayle, J. B.

    1975-01-01

    Results for 12,000 impacts on a wide range of materials were studied to determine the reproducibility of the liquid oxygen impact test method. Standard deviations representing the overall variability of results were in close agreement with the expected values for a binomial process. This indicates that the major source of variability is the go/no-go nature of the test method and that variations due to sampling and testing operations were not significant.

  17. Data Identifiers and Citations Enable Reproducible Science

    NASA Astrophysics Data System (ADS)

    Tilmes, C.

    2011-12-01

    Modern science often involves data processing with tremendous volumes of data. Keeping track of those data has been a growing challenge for data centers. Researchers who access and use the data do not always reference and cite their data sources adequately enough for consumers of their research to follow their methodology or reproduce their analyses or experiments. Recent research has led to recommendations for good identifiers and citations that can help address this problem. This paper describes some of the best practices in data identification, reference and citation. Using a simplified example scenario based on a long-term remote sensing satellite mission, it explores issues in identifying dynamic data sets and the importance of good data citations for reproducibility. It describes the difference between granule-level and collection-level identifiers, using UUIDs and DOIs to illustrate some recommendations for developing identifiers and assigning them during data processing. As data processors create data products, the provenance of the input products and the precise steps that led to their creation are recorded and published for users of the data to see. As researchers access the data from an archive, they can use the provenance to help understand the genesis of the data, which could affect their usage of the data. By citing the data when publishing their research, others can retrieve the precise data used and reproduce the analyses and experiments to confirm the results. Describing the experiment in sufficient detail to reproduce the research enforces a formal approach that lends credibility to the results and, ultimately, to the policies of decision makers who depend on that research.
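
    A minimal sketch (Python) of the granule-versus-collection distinction drawn above: each generated granule is assigned its own UUID while the collection carries a single DOI; the metadata fields and the DOI string are hypothetical placeholders.

      import uuid
      import json
      from datetime import datetime, timezone

      # hypothetical collection-level identifier (a real DOI would be minted
      # through a registration agency; this string is only a placeholder)
      COLLECTION_DOI = "10.5067/EXAMPLE/SENSOR_L2.001"

      def make_granule_record(input_ids, processing_step):
          """Assign a granule-level UUID and record minimal provenance."""
          return {
              "granule_id": str(uuid.uuid4()),      # unique per data file
              "collection_doi": COLLECTION_DOI,     # shared by the whole data set
              "inputs": list(input_ids),            # provenance: source granules
              "processing_step": processing_step,
              "created": datetime.now(timezone.utc).isoformat(),
          }

      record = make_granule_record(["granule-0001", "granule-0002"], "L1B-to-L2 retrieval")
      print(json.dumps(record, indent=2))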

  18. A Framework for Reproducible Latent Fingerprint Enhancements

    PubMed Central

    Carasso, Alfred S.

    2014-01-01

    Photoshop processing of latent fingerprints is the preferred methodology among law enforcement forensic experts, but that approach is not fully reproducible and may lead to questionable enhancements. Alternative, independent, fully reproducible enhancements, using IDL Histogram Equalization and IDL Adaptive Histogram Equalization, can produce better-defined ridge structures, along with considerable background information. Applying a systematic slow-motion smoothing procedure to such IDL enhancements, based on the rapid FFT solution of a Lévy stable fractional diffusion equation, can attenuate background detail while preserving ridge information. The resulting smoothed latent print enhancements are comparable to, but distinct from, forensic Photoshop images suitable for input into automated fingerprint identification systems (AFIS). In addition, this progressive smoothing procedure can be reexamined by displaying the suite of progressively smoother IDL images. That suite can be stored, providing an audit trail that allows monitoring for possible loss of useful information in transit to the user-selected optimal image. Such independent and fully reproducible enhancements provide a valuable frame of reference that may be helpful in informing, complementing, and possibly validating the forensic Photoshop methodology. PMID:26601028
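
    A minimal sketch (Python/NumPy) of the two reproducible ingredients described: global histogram equalization of a grayscale print image, followed by FFT-based smoothing with a Lévy-stable (fractional diffusion) kernel exp(-t|omega|^alpha); this is an illustrative re-expression under stated assumptions, not the author's IDL code, and the alpha and t values are placeholders.

      import numpy as np

      def histogram_equalize(img):
          """Global histogram equalization of an 8-bit grayscale image."""
          hist, _ = np.histogram(img.ravel(), bins=256, range=(0, 256))
          cdf = hist.cumsum().astype(float)
          cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())
          lut = np.round(255 * cdf).astype(np.uint8)
          return lut[img]

      def levy_stable_smooth(img, alpha=1.0, t=2.0):
          """One fractional-diffusion smoothing step in Fourier space:
          each frequency is damped by exp(-t * |omega|**alpha), 0 < alpha <= 2."""
          rows, cols = img.shape
          wy = np.fft.fftfreq(rows)[:, None]
          wx = np.fft.fftfreq(cols)[None, :]
          omega = np.sqrt(wx ** 2 + wy ** 2)
          kernel = np.exp(-t * omega ** alpha)
          smoothed = np.fft.ifft2(np.fft.fft2(img.astype(float)) * kernel).real
          return np.clip(smoothed, 0, 255).astype(np.uint8)

      # hypothetical latent print patch; progressively larger t values give the
      # "slow motion" suite of smoother images that can be stored as an audit trail
      img = (np.random.rand(128, 128) * 255).astype(np.uint8)
      equalized = histogram_equalize(img)
      suite = [levy_stable_smooth(equalized, alpha=1.0, t=t) for t in (0.5, 1.0, 2.0, 4.0)]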

  20. Tools and techniques for computational reproducibility.

    PubMed

    Piccolo, Stephen R; Frampton, Michael B

    2016-01-01

    When reporting research findings, scientists document the steps they followed so that others can verify and build upon the research. When those steps have been described in sufficient detail that others can retrace the steps and obtain similar results, the research is said to be reproducible. Computers play a vital role in many research disciplines and present both opportunities and challenges for reproducibility. Computers can be programmed to execute analysis tasks, and those programs can be repeated and shared with others. The deterministic nature of most computer programs means that the same analysis tasks, applied to the same data, will often produce the same outputs. However, in practice, computational findings often cannot be reproduced because of complexities in how software is packaged, installed, and executed, and because of limitations associated with how scientists document analysis steps. Many tools and techniques are available to help overcome these challenges; here we describe seven such strategies. With a broad scientific audience in mind, we describe the strengths and limitations of each approach, as well as the circumstances under which each might be applied. No single strategy is sufficient for every scenario; thus we emphasize that it is often useful to combine approaches. PMID:27401684
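
    The seven strategies themselves are not enumerated in this record; as one small, hedged illustration of the documentation problem it describes, the Python sketch below snapshots the installed package versions and the hash of an input file alongside an analysis so that others can later check what was actually run (the file names are hypothetical).

      import hashlib
      import json
      import platform
      import sys
      from importlib import metadata

      def sha256_of(path):
          """Content hash of an input data file, for provenance records."""
          h = hashlib.sha256()
          with open(path, "rb") as f:
              for chunk in iter(lambda: f.read(1 << 20), b""):
                  h.update(chunk)
          return h.hexdigest()

      def environment_snapshot():
          """Record interpreter, OS and installed package versions."""
          packages = sorted(
              f"{d.metadata['Name']}=={d.version}" for d in metadata.distributions()
          )
          return {"python": sys.version, "platform": platform.platform(), "packages": packages}

      # hypothetical usage: write the snapshot next to the analysis outputs
      snapshot = environment_snapshot()
      snapshot["input_sha256"] = sha256_of("input_data.csv")   # hypothetical input file
      with open("analysis_provenance.json", "w") as f:
          json.dump(snapshot, f, indent=2)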

  3. Proficiency analytical testing program

    SciTech Connect

    Groff, J.H.; Schlecht, P.C.

    1994-03-01

    The Proficiency Analytical Testing (PAT) Program is a collaborative effort of the American Industrial Hygiene Association (AIHA) and researchers at the Centers for Disease Control and Prevention (CDC), National Institute for Occupational Safety and Health (NIOSH). The PAT Program provides quality control reference samples to over 1400 occupational health and environmental laboratories in over 15 countries. Although one objective of the PAT Program is to evaluate the analytical ability of participating laboratories, the primary objective is to assist these laboratories in improving their laboratory performance. Each calendar quarter (designated a round), samples are mailed to participating laboratories and the data are analyzed to evaluate laboratory performance on a series of analyses. Each mailing and subsequent data analysis is completed in time for participants to obtain repeat samples and to correct analytical problems before the next calendar quarter starts. The PAT Program currently includes four sets of samples. A mixture of 3 of the 4 possible metals and 3 of the 15 possible organic solvents is rotated for each round. Laboratories are evaluated for each analysis by comparing their reported results against an acceptable performance limit for each PAT Program sample the laboratory analyzes. Reference laboratories are preselected to provide the performance limits for each sample. These reference laboratories must meet the following criteria: (1) the laboratory was rated proficient in the last PAT evaluation of all the contaminants in the Program; and (2) the laboratory, if located in the United States, is AIHA accredited. Data are acceptable if they fall within the performance limits. Laboratories are rated based upon performance in the PAT Program over the last year (i.e., four calendar quarters), as well as on individual contaminant performance and overall performance. 1 ref., 3 tabs.
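
    A minimal sketch (Python) of the kind of evaluation described, checking a laboratory's reported result against performance limits derived from the reference laboratories; the assumption that limits are the reference mean plus or minus three standard deviations is illustrative only and is not stated in this record.

      import numpy as np

      def performance_limits(reference_results, n_sd=3.0):
          """Acceptable range derived from preselected reference laboratories."""
          ref = np.asarray(reference_results, dtype=float)
          mean, sd = ref.mean(), ref.std(ddof=1)
          return mean - n_sd * sd, mean + n_sd * sd

      def is_acceptable(reported, limits):
          low, high = limits
          return low <= reported <= high

      # hypothetical lead-on-filter results (ug/sample) from reference laboratories
      reference = [49.8, 51.2, 50.5, 48.9, 50.1, 51.0, 49.4]
      limits = performance_limits(reference)
      print(limits, is_acceptable(52.3, limits))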

  4. Proficiency analytical testing program

    SciTech Connect

    Schlecht, P.C.; Groff, J.H.

    1994-06-01

    The Proficiency Analytical Testing (PAT) Program is a collaborative effort of the American Industrial Hygiene Association (AIHA) and researchers at the Centers for Disease Control and Prevention, National Institute for Occupational Safety and Health (NIOSH). The PAT Program provides quality control reference samples to over 1400 occupational health and environmental laboratories in over 15 countries. Although one objective of the PAT Program is to evaluate the analytical ability of participating laboratories, the primary objective is to assist these laboratories in improving their laboratory performance. Each calendar quarter (designated a round), samples are mailed to participating laboratories and the data are analyzed to evaluate laboratory performance on a series of analyses. Each mailing and subsequent data analysis is completed in time for participants to obtain repeat samples and to correct analytical problems before the next calendar quarter starts. The PAT Program currently includes four sets of samples. A mixture of 3 of the 4 possible metals and 3 of the 15 possible organic solvents is rotated for each round. Laboratories are evaluated for each analysis by comparing their reported results against an acceptable performance limit for each PAT Program sample the laboratory analyzes. Reference laboratories are preselected to provide the performance limits for each sample. These reference laboratories must meet the following criteria: (1) the laboratory was rated proficient in the last PAT evaluation of all the contaminants in the Program; and (2) the laboratory, if located in the United States, is AIHA accredited. Data are acceptable if they fall within the performance limits. Laboratories are rated based upon performance in the PAT Program over the last year (i.e., four calendar quarters), as well as on individual contaminant performance and overall performance. 1 ref., 3 tabs.

  5. Towards reproducible, scalable lateral molecular electronic devices

    SciTech Connect

    Durkan, Colm; Zhang, Qian

    2014-08-25

    An approach to reproducibly fabricate molecular electronic devices is presented. Lateral nanometer-scale gaps with high yield are formed in Au/Pd nanowires by a combination of electromigration and Joule-heating-induced thermomechanical stress. The resulting nanogap devices are used to measure the electrical properties of small numbers of two different molecular species with different end-groups, namely 1,4-butane dithiol and 1,5-diamino-2-methylpentane. Fluctuations in the current reveal that in the case of the dithiol molecule devices, individual molecules conduct intermittently, with the fluctuations becoming more pronounced at larger biases.

  6. Nonlinear sequential laminates reproducing hollow sphere assemblages

    NASA Astrophysics Data System (ADS)

    Idiart, Martín I.

    2007-07-01

    A special class of nonlinear porous materials with isotropic 'sequentially laminated' microstructures is found to reproduce exactly the hydrostatic behavior of 'hollow sphere assemblages'. It is then argued that this result supports the conjecture that Gurson's approximate criterion for plastic porous materials, and its viscoplastic extension of Leblond et al. (1994), may actually yield rigorous upper bounds for the hydrostatic flow stress of porous materials containing an isotropic, but otherwise arbitrary, distribution of porosity. To cite this article: M.I. Idiart, C. R. Mecanique 335 (2007).
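
    For context, Gurson's approximate yield criterion for a plastic solid containing a volume fraction f of voids, quoted here from the standard literature rather than from the paper itself, where \sigma_0 is the matrix flow stress, \sigma_{eq} the macroscopic von Mises equivalent stress and \sigma_m the mean (hydrostatic) stress:

      \left(\frac{\sigma_{eq}}{\sigma_0}\right)^2 + 2f\cosh\left(\frac{3\sigma_m}{2\sigma_0}\right) - 1 - f^2 = 0

    For purely hydrostatic loading (\sigma_{eq} = 0) this gives a flow stress

      \frac{|\sigma_m|}{\sigma_0} = \frac{2}{3}\,\mathrm{arccosh}\left(\frac{1+f^2}{2f}\right) = \frac{2}{3}\ln\frac{1}{f},

    which coincides with the exact hydrostatic strength of a hollow sphere assemblage of porosity f; hence the conjecture discussed above.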

  7. Open and reproducible global land use classification

    NASA Astrophysics Data System (ADS)

    Nüst, Daniel; Václavík, Tomáš; Pross, Benjamin

    2015-04-01

    Researchers led by the Helmholtz Centre for Environmental research (UFZ) developed a new world map of land use systems based on over 30 diverse indicators (http://geoportal.glues.geo.tu-dresden.de/stories/landsystemarchetypes.html) of land use intensity, climate and environmental and socioeconomic factors. They identified twelve land system archetypes (LSA) using a data-driven classification algorithm (self-organizing maps) to assess global impacts of land use on the environment, and found unexpected similarities across global regions. We present how the algorithm behind this analysis can be published as an executable web process using 52°North WPS4R (https://wiki.52north.org/bin/view/Geostatistics/WPS4R) within the GLUES project (http://modul-a.nachhaltiges-landmanagement.de/en/scientific-coordination-glues/). WPS4R is an open source collaboration platform for researchers, analysts and software developers to publish R scripts (http://www.r-project.org/) as a geo-enabled OGC Web Processing Service (WPS) process. The interoperable interface to call the geoprocess allows both reproducibility of the analysis and integration of user data without knowledge about web services or classification algorithms. The open platform allows everybody to replicate the analysis in their own environments. The LSA WPS process has several input parameters, which can be changed via a simple web interface. The input parameters are used to configure both the WPS environment and the LSA algorithm itself. The encapsulation as a web process allows integration of non-public datasets, while at the same time the publication requires a well-defined documentation of the analysis. We demonstrate this platform specifically to domain scientists and show how reproducibility and open source publication of analyses can be enhanced. We also discuss future extensions of the reproducible land use classification, such as the possibility for users to enter their own areas of interest to the system and

  8. Queer nuclear families? Reproducing and transgressing heteronormativity.

    PubMed

    Folgerø, Tor

    2008-01-01

    During the past decade the public debate on gay and lesbian adoptive rights has been extensive in the Norwegian media. The debate illustrates how women and men planning to raise children in homosexual family constellations challenge prevailing cultural norms and existing concepts of kinship and family. The article discusses how lesbian mothers and gay fathers understand and redefine their own family practices. An essential point in this article is the fundamental ambiguity in these families' accounts of themselves: how they simultaneously transgress and reproduce heteronormative assumptions about childhood, fatherhood, motherhood, family and kinship.

  9. Analytical toxicology.

    PubMed

    Flanagan, R J; Widdop, B; Ramsey, J D; Loveland, M

    1988-09-01

    1. Major advances in analytical toxicology followed the introduction of spectroscopic and chromatographic techniques in the 1940s and early 1950s and thin layer chromatography remains important together with some spectrophotometric and other tests. However, gas- and high performance-liquid chromatography together with a variety of immunoassay techniques are now widely used. 2. The scope and complexity of forensic and clinical toxicology continues to increase, although the compounds for which emergency analyses are needed to guide therapy are few. Exclusion of the presence of hypnotic drugs can be important in suspected 'brain death' cases. 3. Screening for drugs of abuse has assumed greater importance not only for the management of the habituated patient, but also in 'pre-employment' and 'employment' screening. The detection of illicit drug administration in sport is also an area of increasing importance. 4. In industrial toxicology, the range of compounds for which blood or urine measurements (so called 'biological monitoring') can indicate the degree of exposure is increasing. The monitoring of environmental contaminants (lead, chlorinated pesticides) in biological samples has also proved valuable. 5. In the near future a consensus as to the units of measurement to be used is urgently required and more emphasis will be placed on interpretation, especially as regards possible behavioural effects of drugs or other poisons. Despite many advances in analytical techniques there remains a need for reliable, simple tests to detect poisons for use in smaller hospital and other laboratories.

  11. Response to Comment on "Estimating the reproducibility of psychological science".

    PubMed

    Anderson, Christopher J; Bahník, Štěpán; Barnett-Cowan, Michael; Bosco, Frank A; Chandler, Jesse; Chartier, Christopher R; Cheung, Felix; Christopherson, Cody D; Cordes, Andreas; Cremata, Edward J; Della Penna, Nicolas; Estel, Vivien; Fedor, Anna; Fitneva, Stanka A; Frank, Michael C; Grange, James A; Hartshorne, Joshua K; Hasselman, Fred; Henninger, Felix; van der Hulst, Marije; Jonas, Kai J; Lai, Calvin K; Levitan, Carmel A; Miller, Jeremy K; Moore, Katherine S; Meixner, Johannes M; Munafò, Marcus R; Neijenhuijs, Koen I; Nilsonne, Gustav; Nosek, Brian A; Plessow, Franziska; Prenoveau, Jason M; Ricker, Ashley A; Schmidt, Kathleen; Spies, Jeffrey R; Stieger, Stefan; Strohminger, Nina; Sullivan, Gavin B; van Aert, Robbie C M; van Assen, Marcel A L M; Vanpaemel, Wolf; Vianello, Michelangelo; Voracek, Martin; Zuni, Kellylynn

    2016-03-01

    Gilbert et al. conclude that evidence from the Open Science Collaboration's Reproducibility Project: Psychology indicates high reproducibility, given the study methodology. Their very optimistic assessment is limited by statistical misconceptions and by causal inferences from selectively interpreted, correlational data. Using the Reproducibility Project: Psychology data, both optimistic and pessimistic conclusions about reproducibility are possible, and neither are yet warranted. PMID:26941312

  12. Reproducibility Data on SUMMiT

    SciTech Connect

    Irwin, Lloyd; Jakubczak, Jay; Limary, Siv; McBrayer, John; Montague, Stephen; Smith, James; Sniegowski, Jeffry; Stewart, Harold; de Boer, Maarten

    1999-07-16

    SUMMiT (Sandia Ultra-planar Multi-level MEMS Technology) at the Sandia National Laboratories' MDL (Microelectronics Development Laboratory) is a standardized MEMS (Microelectromechanical Systems) technology that allows designers to fabricate concept prototypes. This technology provides four polysilicon layers plus three sacrificial oxide layers (with the third oxide layer being planarized) to enable fabrication of complex mechanical systems-on-a-chip. Quantified reproducibility of the SUMMiT process is important for process engineers as well as designers. Summary statistics for critical MEMS technology parameters such as film thickness, line width, and sheet resistance will be reported for the SUMMiT process. Additionally, data from Van der Pauw test structures will be presented. Data on film thickness, film uniformity and critical dimensions of etched line widths are collected from both process and monitor wafers during manufacturing using film thickness metrology tools and SEM tools. A standardized diagnostic module is included in each SUMMiT run to obtain post-processing parametric data to monitor run-to-run reproducibility, such as Van der Pauw structures for measuring sheet resistance. This characterization of the SUMMiT process enables design for manufacturability in the SUMMiT technology.
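
    As an aside on the Van der Pauw structures mentioned, a minimal sketch (Python, SciPy) of how sheet resistance is commonly extracted from two four-terminal resistance readings by solving the Van der Pauw equation; the resistance values are hypothetical and this reflects textbook practice rather than the SUMMiT diagnostic procedure itself.

      import numpy as np
      from scipy.optimize import brentq

      def sheet_resistance(r_a, r_b):
          """Solve exp(-pi*R_A/R_s) + exp(-pi*R_B/R_s) = 1 for R_s (ohms/square)."""
          f = lambda r_s: (np.exp(-np.pi * r_a / r_s)
                           + np.exp(-np.pi * r_b / r_s) - 1.0)
          lower = np.pi * min(r_a, r_b) / np.log(2) / 10.0   # generous bracket
          upper = np.pi * max(r_a, r_b) / np.log(2) * 10.0
          return brentq(f, lower, upper)

      # hypothetical four-terminal readings from a Van der Pauw test structure
      r_a, r_b = 4.2, 4.5   # ohms
      print(f"R_sheet = {sheet_resistance(r_a, r_b):.2f} ohms/square")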

  13. Reproducibility of airway wall thickness measurements

    NASA Astrophysics Data System (ADS)

    Schmidt, Michael; Kuhnigk, Jan-Martin; Krass, Stefan; Owsijewitsch, Michael; de Hoop, Bartjan; Peitgen, Heinz-Otto

    2010-03-01

    Airway remodeling and accompanying changes in wall thickness are known to be a major symptom of chronic obstructive pulmonary disease (COPD), associated with reduced lung function in diseased individuals. Further investigation of this disease, as well as monitoring of disease progression and treatment effect, demands accurate and reproducible assessment of airway wall thickness in CT datasets. With wall thicknesses in the sub-millimeter range, this task remains challenging even with today's high-resolution CT datasets. To provide accurate measurements, taking partial volume effects into account is mandatory. The Full-Width-at-Half-Maximum (FWHM) method has been shown to be inappropriate for small airways, and several improved algorithms for objective quantification of airway wall thickness have been proposed. In this paper, we describe an algorithm based on a closed-form solution proposed by Weinheimer et al. We locally estimate the lung density parameter required for the closed-form solution to account for possible variations of parenchyma density between different lung regions, inspiration states and contrast agent concentrations. The general accuracy of the algorithm is evaluated using basic tubular software and hardware phantoms. Furthermore, we present results on the reproducibility of the algorithm with respect to clinical CT scans, varying reconstruction kernels, and repeated acquisitions, which is crucial for longitudinal observations.
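
    For orientation only, a minimal sketch (Python) of the baseline FWHM measurement on a single 1-D attenuation profile cast across an airway wall; the improved, locally calibrated closed-form method of the paper is not reproduced here, and the sample profile and spacing are hypothetical.

      import numpy as np

      def fwhm_width(profile, spacing_mm):
          """Coarse wall width at half the peak height above the local minimum
          (sample-resolution version, without sub-voxel interpolation)."""
          profile = np.asarray(profile, dtype=float)
          baseline = profile.min()
          half = baseline + 0.5 * (profile.max() - baseline)
          above = np.where(profile >= half)[0]
          if above.size < 2:
              return 0.0
          return (above[-1] - above[0]) * spacing_mm

      # hypothetical profile (HU) sampled every 0.25 mm across lumen, wall, parenchyma
      profile = [-950, -900, -600, -150, 40, 60, 30, -300, -750, -870]
      print(f"FWHM wall thickness ~ {fwhm_width(profile, 0.25):.2f} mm")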

  14. Smaller hospitals accept advertising.

    PubMed

    Mackesy, R

    1988-07-01

    Administrators at small- and medium-sized hospitals gradually have accepted the role of marketing in their organizations, albeit at a much slower rate than larger institutions. This update of a 1983 survey tracks the increasing competitiveness, complexity and specialization of providing health care and of advertising a small hospital's services. PMID:10288550

  15. Students Accepted on Probation.

    ERIC Educational Resources Information Center

    Lorberbaum, Caroline S.

    This report is a justification of the Dalton Junior College admissions policy designed to help students who had had academic and/or social difficulties at other schools. These students were accepted on probation, their problems carefully analyzed, and much effort devoted to those with low academic potential. They received extensive academic and…

  16. Why was Relativity Accepted?

    NASA Astrophysics Data System (ADS)

    Brush, S. G.

    Historians of science have published many studies of the reception of Einstein's special and general theories of relativity. Based on a review of these studies, and my own research on the role of the light-bending prediction in the reception of general relativity, I discuss the role of three kinds of reasons for accepting relativity: (1) empirical predictions and explanations; (2) social-psychological factors; and (3) aesthetic-mathematical factors. According to the historical studies, acceptance was a three-stage process. First, a few leading scientists adopted the special theory for aesthetic-mathematical reasons. In the second stage, their enthusiastic advocacy persuaded other scientists to work on the theory and apply it to problems currently of interest in atomic physics. The special theory was accepted by many German physicists by 1910 and had begun to attract some interest in other countries. In the third stage, the confirmation of Einstein's light-bending prediction attracted much public attention and forced all physicists to take the general theory of relativity seriously. In addition to light-bending, the explanation of the advance of Mercury's perihelion was considered strong evidence by theoretical physicists. The American astronomers who conducted successful tests of general relativity became defenders of the theory. There is little evidence that relativity was 'socially constructed' but its initial acceptance was facilitated by the prestige and resources of its advocates.

  17. PSYCHOLOGY. Estimating the reproducibility of psychological science.

    PubMed

    2015-08-28

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams. PMID:26315443

  19. Is Grannum grading of the placenta reproducible?

    NASA Astrophysics Data System (ADS)

    Moran, Mary; Ryan, John; Brennan, Patrick C.; Higgins, Mary; McAuliffe, Fionnuala M.

    2009-02-01

    Current ultrasound assessment of placental calcification relies on Grannum grading. The aim of this study was to assess if this method is reproducible by measuring inter- and intra-observer variation in grading placental images, under strictly controlled viewing conditions. Thirty placental images were acquired and digitally saved. Five experienced sonographers independently graded the images on two separate occasions. In order to eliminate any technological factors which could affect data reliability and consistency all observers reviewed images at the same time. To optimise viewing conditions ambient lighting was maintained between 25-40 lux, with monitors calibrated to the GSDF standard to ensure consistent brightness and contrast. Kappa (κ) analysis of the grades assigned was used to measure inter- and intra-observer reliability. Intra-observer agreement had a moderate mean κ-value of 0.55, with individual comparisons ranging from 0.30 to 0.86. Two images saved from the same patient, during the same scan, were each graded as I, II and III by the same observer. A mean κ-value of 0.30 (range from 0.13 to 0.55) indicated fair inter-observer agreement over the two occasions and only one image was graded consistently the same by all five observers. The study findings confirmed the lack of reproducibility associated with Grannum grading of the placenta despite optimal viewing conditions and highlight the need for new methods of assessing placental health in order to improve neonatal outcomes. Alternative methods for quantifying placental calcification such as a software based technique and 3D ultrasound assessment need to be explored.
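
    A minimal sketch (Python, scikit-learn) of the agreement statistic used above, Cohen's kappa between two observers' Grannum grades; the grade vectors are hypothetical, and the study's five-observer and intra-observer analyses are not reproduced.

      from sklearn.metrics import cohen_kappa_score

      # hypothetical Grannum grades (0-3) assigned by two sonographers to ten images
      observer_1 = [0, 1, 1, 2, 3, 2, 1, 0, 2, 3]
      observer_2 = [0, 1, 2, 2, 3, 1, 1, 0, 2, 2]

      kappa = cohen_kappa_score(observer_1, observer_2)
      print(f"Cohen's kappa = {kappa:.2f}")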

  20. Comparative study between univariate spectrophotometry and multivariate calibration as analytical tools for simultaneous quantitation of Moexipril and Hydrochlorothiazide

    NASA Astrophysics Data System (ADS)

    Tawakkol, Shereen M.; Farouk, M.; Elaziz, Omar Abd; Hemdan, A.; Shehata, Mostafa A.

    2014-12-01

    Three simple, accurate, reproducible, and selective methods have been developed and subsequently validated for the simultaneous determination of Moexipril (MOX) and Hydrochlorothiazide (HCTZ) in pharmaceutical dosage form. The first method is the new extended ratio subtraction method (EXRSM) coupled to the ratio subtraction method (RSM) for determination of both drugs in commercial dosage form. The second and third methods are multivariate calibration methods, namely Principal Component Regression (PCR) and Partial Least Squares (PLS). A detailed validation of the methods was performed following the ICH guidelines, and the standard curves were found to be linear in the ranges of 10-60 and 2-30 for MOX and HCTZ, respectively, in the EXRSM method, with well-accepted mean correlation coefficients for each analyte. The intra-day and inter-day precision and accuracy results were well within the acceptable limits.
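
    A minimal sketch (Python, scikit-learn) of the two chemometric models named, PCR and PLS, fitted to spectra-versus-concentration data; the simulated spectra and component counts are placeholders, and the preprocessing and validation steps of the paper are not reproduced.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression
      from sklearn.pipeline import make_pipeline

      # hypothetical training set: absorbance spectra (rows) and the known
      # concentrations of the two analytes (MOX, HCTZ) in each mixture
      rng = np.random.default_rng(0)
      concentrations = rng.uniform([10, 2], [60, 30], size=(25, 2))
      pure_spectra = rng.random((2, 200))          # stand-in component spectra
      spectra = concentrations @ pure_spectra + 0.01 * rng.standard_normal((25, 200))

      pcr = make_pipeline(PCA(n_components=4), LinearRegression())
      pls = PLSRegression(n_components=4)
      pcr.fit(spectra, concentrations)
      pls.fit(spectra, concentrations)

      unknown = np.array([[30.0, 15.0]]) @ pure_spectra   # simulated "unknown" mixture
      print("PCR prediction:", pcr.predict(unknown))
      print("PLS prediction:", pls.predict(unknown))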

  2. Acceptability of human risk.

    PubMed Central

    Kasperson, R E

    1983-01-01

    This paper has three objectives: to explore the nature of the problem implicit in the term "risk acceptability," to examine the possible contributions of scientific information to risk standard-setting, and to argue that societal response is best guided by considerations of process rather than formal methods of analysis. Most technological risks are not accepted but are imposed. There is also little reason to expect consensus among individuals on their tolerance of risk. Moreover, debates about risk levels are often at base debates over the adequacy of the institutions which manage the risks. Scientific information can contribute three broad types of analyses to risk-setting deliberations: contextual analysis, equity assessment, and public preference analysis. More effective risk-setting decisions will involve attention to the process used, particularly in regard to the requirements of procedural justice and democratic responsibility. PMID:6418541

  3. Highly Sensitive and Reproducible SERS Performance from Uniform Film Assembled by Magnetic Noble Metal Composite Microspheres.

    PubMed

    Niu, Chunyu; Zou, Bingfang; Wang, Yongqiang; Cheng, Lin; Zheng, Haihong; Zhou, Shaomin

    2016-01-26

    To realize highly sensitive and reproducible SERS performance, a new route was put forward to construct uniform SERS film by using magnetic composite microspheres. In the experiment, monodisperse Fe3O4@SiO2@Ag microspheres with hierarchical surface were developed and used as building block of SERS substrate, which not only realized fast capturing analyte through dispersion and collection under external magnet but also could be built into uniform film through magnetically induced self-assembly. By using R6G as probe molecule, the as-obtained uniform film exhibited great improvement on SERS performance in both sensitivity and reproducibility when compared with nonuniform film, demonstrating the perfect integration of high sensitivity of hierarchal noble metal microspheres and high reproducibility of ordered microspheres array. Furthermore, the as-obtained product was used to detect pesticide thiram and also exhibited excellent SERS performance for trace detection.

  4. Age and Acceptance of Euthanasia.

    ERIC Educational Resources Information Center

    Ward, Russell A.

    1980-01-01

    Study explores relationship between age (and sex and race) and acceptance of euthanasia. Women and non-Whites were less accepting because of religiosity. Among older people less acceptance was attributable to their lesser education and greater religiosity. Results suggest that quality of life in old age affects acceptability of euthanasia. (Author)

  5. The rapid reproducers paradox: population control and individual procreative rights.

    PubMed

    Wissenburg, M

    1998-01-01

    This article argues that population policies need to be evaluated from macro and micro perspectives and to consider individual rights. Ecological arguments that are stringent conditions of liberal democracy are assessed against a moral standard. The moral standard is applied to a series of reasons for limiting procreative rights in the cause of sustainability. The focus is directly on legally enforced antinatalist measures and not on indirect policies with incentives and disincentives. The explicit assumption is that population policy violates the fairness to individuals for societal gain and that population policies are incompatible with stringent conditions of liberal democracy. The author identifies the individual-societal tradeoff as the "rapid reproducers paradox." The perfect sustainable population level is either not possible or is a repugnant alternative. 12 ecological arguments are presented, and none are found compatible with notions of a liberal democracy. Three alternative antinatalist options are the acceptance of less rigid and still coercive policies, amendments to the conception of liberal democracy, or loss of hope and choice of noncoercive solutions to sustainability, none of which is found viable. If voluntary abstinence and distributive solutions fail, then frugal demand options and technological supply options both will be necessary.

  6. Laboratory 20-km cycle time trial reproducibility.

    PubMed

    Zavorsky, G S; Murias, J M; Gow, J; Kim, D J; Poulin-Harnois, C; Kubow, S; Lands, L C

    2007-09-01

    This study evaluated the reproducibility of laboratory-based 20-km time trials in well trained versus recreational cyclists. Eighteen cyclists (age = 34 +/- 8 yrs; body mass index = 23.1 +/- 2.2 kg/m²; VO2max = 4.19 +/- 0.65 L/min) completed three 20-km time trials over a month on a Velotron cycle ergometer. Average power output (PO) (W), speed, and heart rate (HR) were significantly lower in the first time trial compared to the second and third time trials. The coefficients of variation (CV) between the second and third trials of the top eight performers for average PO, time to completion, and speed were 1.2%, 0.6%, and 0.5%, respectively, compared to 4.8%, 2.0%, and 2.3% for the bottom ten. In addition, the average HR, VO2, and percentage of VO2max were similar between trials. This study demonstrated that (1) a familiarization session improves the reliability of the measurements (i.e., average PO, time to completion and speed), and (2) the CV was much smaller for the best performers.
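
    A minimal sketch (Python) of the between-trial coefficient of variation reported above, computed per cyclist from two trials and then averaged; the power outputs are hypothetical and the exact CV formulation used in the study is not specified in this record.

      import numpy as np

      def mean_between_trial_cv(trial_a, trial_b):
          """Per-subject CV (%) between two trials, averaged across subjects."""
          pairs = np.column_stack([trial_a, trial_b]).astype(float)
          cv = pairs.std(axis=1, ddof=1) / pairs.mean(axis=1) * 100.0
          return cv.mean()

      # hypothetical average power outputs (W) in trials 2 and 3
      trial2 = [305, 298, 321, 276, 290]
      trial3 = [301, 302, 318, 280, 286]
      print(f"Mean between-trial CV = {mean_between_trial_cv(trial2, trial3):.1f}%")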

  7. The reproducible radio outbursts of SS Cygni

    NASA Astrophysics Data System (ADS)

    Russell, T. D.; Miller-Jones, J. C. A.; Sivakoff, G. R.; Altamirano, D.; O'Brien, T. J.; Page, K. L.; Templeton, M. R.; Körding, E. G.; Knigge, C.; Rupen, M. P.; Fender, R. P.; Heinz, S.; Maitra, D.; Markoff, S.; Migliari, S.; Remillard, R. A.; Russell, D. M.; Sarazin, C. L.; Waagen, E. O.

    2016-08-01

    We present the results of our intensive radio observing campaign of the dwarf nova SS Cyg during its 2010 April outburst. We argue that the observed radio emission was produced by synchrotron emission from a transient radio jet. Comparing the radio light curves from previous and subsequent outbursts of this system (including high-resolution observations from outbursts in 2011 and 2012) shows that the typical long and short outbursts of this system exhibit reproducible radio outbursts that do not vary significantly between outbursts, which is consistent with the similarity of the observed optical, ultraviolet and X-ray light curves. Contemporaneous optical and X-ray observations show that the radio emission appears to have been triggered at the same time as the initial X-ray flare, which occurs as disc material first reaches the boundary layer. This raises the possibility that the boundary region may be involved in jet production in accreting white dwarf systems. Our high spatial resolution monitoring shows that the compact jet remained active throughout the outburst with no radio quenching.

  8. Reproducibility of neuroimaging analyses across operating systems

    PubMed Central

    Glatard, Tristan; Lewis, Lindsay B.; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C.

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed. PMID:25964757
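
    A minimal sketch (Python) of the Dice coefficient used above to compare subcortical classifications obtained on different operating systems; the binary masks are hypothetical stand-ins for real segmentation volumes.

      import numpy as np

      def dice_coefficient(mask_a, mask_b):
          """Dice overlap between two binary segmentation masks."""
          a = np.asarray(mask_a, dtype=bool)
          b = np.asarray(mask_b, dtype=bool)
          denom = a.sum() + b.sum()
          return 1.0 if denom == 0 else 2.0 * np.logical_and(a, b).sum() / denom

      # hypothetical labels for one structure, from runs on two operating systems
      seg_os1 = np.zeros((4, 4, 4), dtype=bool); seg_os1[1:3, 1:3, 1:3] = True
      seg_os2 = np.zeros((4, 4, 4), dtype=bool); seg_os2[1:3, 1:3, 1:4] = True
      print(f"Dice = {dice_coefficient(seg_os1, seg_os2):.2f}")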

  9. Reproducibility of transcutaneous oximetry and laser Doppler flowmetry in facial skin and gingival tissue.

    PubMed

    Svalestad, J; Hellem, S; Vaagbø, G; Irgens, A; Thorsen, E

    2010-01-01

    Laser Doppler flowmetry (LDF) and transcutaneous oximetry (TcPO(2)) are non-invasive techniques, widely used in the clinical setting, for assessing microvascular blood flow and tissue oxygen tension, e.g. recording vascular changes after radiotherapy and hyperbaric oxygen therapy. With standardized procedures and improved reproducibility, these methods might also be applicable in longitudinal studies. The aim of this study was to evaluate the reproducibility of facial skin and gingival LDF and facial skin TcPO(2). The subjects comprised ten healthy volunteers, 5 men, aged 31-68 years. Gingival perfusion was recorded with the LDF probe fixed to a custom made, tooth-supported acrylic splint. Skin perfusion was recorded on the cheek. TcPO(2) was recorded on the forehead and cheek and in the second intercostal space. The reproducibility of LDF measurements taken after vasodilation by heat provocation was greater than for basal flow in both facial skin and mandibular gingiva. Pronounced intraday variations were observed. Interweek reproducibility assessed by intraclass correlation coefficient ranged from 0.74 to 0.96 for LDF and from 0.44 to 0.75 for TcPO(2). The results confirm acceptable reproducibility of LDF and TcPO(2) in longitudinal studies in a vascular laboratory where subjects serve as their own controls. The use of thermoprobes is recommended. Repeat measurements should be taken at the same time of day.

  10. 21 CFR 530.22 - Safe levels and analytical methods for food-producing animals.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... analytical method; or (3) Establish a safe level based on other appropriate scientific, technical, or regulatory criteria. (b) FDA may require the development of an acceptable analytical method for the... such an acceptable analytical method, the agency will publish notice of that requirement in the...

  11. 21 CFR 530.22 - Safe levels and analytical methods for food-producing animals.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... analytical method; or (3) Establish a safe level based on other appropriate scientific, technical, or regulatory criteria. (b) FDA may require the development of an acceptable analytical method for the... such an acceptable analytical method, the agency will publish notice of that requirement in the...

  12. 21 CFR 530.22 - Safe levels and analytical methods for food-producing animals.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... analytical method; or (3) Establish a safe level based on other appropriate scientific, technical, or regulatory criteria. (b) FDA may require the development of an acceptable analytical method for the... such an acceptable analytical method, the agency will publish notice of that requirement in the...

  13. Baby-Crying Acceptance

    NASA Astrophysics Data System (ADS)

    Martins, Tiago; de Magalhães, Sérgio Tenreiro

    The baby's crying is his most important means of communication. Crying monitoring performed by existing devices does not ensure the complete safety of the child. It is necessary to couple these technological resources with means of communicating the results to those responsible for the child, which would involve digital processing of the information available from the crying. The survey carried out enabled us to understand the level of adoption, in the continental territory of Portugal, of a technology able to perform such digital processing. The TAM was used as the theoretical framework. The statistical analysis showed that there is a good probability of acceptance of such a system.

  14. High acceptance recoil polarimeter

    SciTech Connect

    The HARP Collaboration

    1992-12-05

    In order to detect neutrons and protons in the 50 to 600 MeV energy range and measure their polarization, an efficient, low-noise, self-calibrating device is being designed. This detector, known as the High Acceptance Recoil Polarimeter (HARP), is based on the recoil principle of proton detection from np → n'p' or pp → p'p' scattering (detected particles are underlined), which intrinsically yields polarization information on the incoming particle. HARP will be commissioned to carry out experiments in 1994.

  15. Experimental challenges to reproduce seismic fault motion

    NASA Astrophysics Data System (ADS)

    Shimamoto, T.

    2011-12-01

    This presentation briefly reviews scientific and technical development in the studies of intermediate to high-velocity frictional properties of faults and summarizes remaining technical challenges to reproduce nucleation to growth processes of large earthquakes in laboratory. Nearly 10 high-velocity or low to high-velocity friction apparatuses have been built in the last several years in the world and it has become possible now to produce sub-plate velocity to seismic slip rate in a single machine. Despite spreading of high-velocity friction studies, reproducing seismic fault motion at high P and T conditions to cover the entire seismogenic zone is still a big challenge. Previous studies focused on (1) frictional melting, (2) thermal pressurization, and (3) high-velocity gouge behavior without frictional melting. Frictional melting process was solved as a Stefan problem with very good agreement with experimental results. Thermal pressurization has been solved theoretically based on measured transport properties and has been included successfully in the modeling of earthquake generation. High-velocity gouge experiments in the last several years have revealed that a wide variety of gouges exhibit dramatic weakening at high velocities (e.g., Di Toro et al., 2011, Nature). Most gouge experiments were done under dry conditions partly to separate gouge friction from the involvement of thermal pressurization. However, recent studies demonstrated that dehydration or degassing due to mineral decomposition can occur during seismic fault motion. Those results not only provided a new view of looking at natural fault zones in search of geological evidence of seismic fault motion, but also indicated that thermal pressurization and gouge weakening can occur simultaneously even in initially dry gouge. Thus experiments with controlled pore pressure are needed. I have struggled to make a pressure vessel for wet high-velocity experiments in the last several years. A technical

  16. Reproducibility and utility of dune luminescence chronologies

    NASA Astrophysics Data System (ADS)

    Leighton, Carly L.; Thomas, David S. G.; Bailey, Richard M.

    2014-02-01

    Optically stimulated luminescence (OSL) dating of dune deposits has increasingly been used as a tool to investigate the response of aeolian systems to environmental change. Amalgamation of individual dune accumulation chronologies has been employed in order to distinguish regional from local geomorphic responses to change. However, advances in dating have produced chronologies of increasing complexity. In particular, questions regarding the interpretation of dune ages have been raised, including over the most appropriate method to evaluate the significance of suites of OSL ages when local 'noisy' and discontinuous records are combined. In this paper, these issues are reviewed and the reproducibility of dune chronologies is assessed. OSL ages from two cores sampled from the same dune in the northeast Rub' al Khali, United Arab Emirates, are presented and compared, alongside an analysis of previously published dune ages dated to within the last 30 ka. Distinct periods of aeolian activity and preservation are identified, which can be tied to regional climatic and environmental changes. This case study is used to address fundamental questions that are persistently asked of dune dating studies, including the appropriate spatial scale over which to infer environmental and climatic change based on dune chronologies, whether chronological hiatuses can be interpreted, how to most appropriately combine and display datasets, and the relationship between geomorphic and palaeoclimatic signals. Chronological profiles reflect localised responses to environmental variability and climatic forcing, and amalgamation of datasets, with consideration of sampling resolution, is required; otherwise local factors are always likely to dominate. Using net accumulation rates to display ages may provide an informative approach of analysing and presenting dune OSL chronologies less susceptible to biases resulting from insufficient sampling resolution.

  17. Research Reproducibility in Geosciences: Current Landscape, Practices and Perspectives

    NASA Astrophysics Data System (ADS)

    Yan, An

    2016-04-01

    Reproducibility of research can gauge the validity of its findings, yet we currently lack an understanding of how much of a problem research reproducibility is in the geosciences. We developed an online survey of faculty and graduate students in the geosciences and received 136 responses from research institutions and universities in the Americas, Asia, Europe and other parts of the world. This survey examined (1) the current state of research reproducibility in the geosciences, by asking about researchers' experiences with unsuccessful replication work and the obstacles that led to their replication failures; (2) current reproducibility practices in the community, by asking what efforts researchers made to try to reproduce others' work and to make their own work reproducible, and what underlying factors contribute to irreproducibility; and (3) perspectives on reproducibility, by collecting researchers' thoughts and opinions on this issue. The survey results indicated that nearly 80% of respondents who had ever tried to reproduce a published study had failed at least once. Only one third of the respondents received helpful feedback when they contacted the authors of a published study for data, code, or other information. The primary factors that lead to unsuccessful replication attempts are insufficiently detailed instructions in the published literature and the inaccessibility of the data, code and tools needed in the study. Our findings suggest a remarkable lack of research reproducibility in the geosciences. Changing the incentive mechanism in academia, as well as developing policies and tools that facilitate open data and code sharing, are promising ways for the geosciences community to alleviate this reproducibility problem.

  18. Semiautomated, Reproducible Batch Processing of Soy

    NASA Technical Reports Server (NTRS)

    Thoerne, Mary; Byford, Ivan W.; Chastain, Jack W.; Swango, Beverly E.

    2005-01-01

    A computer-controlled apparatus processes batches of soybeans into one or more of a variety of food products, under conditions that can be chosen by the user and reproduced from batch to batch. Examples of products include soy milk, tofu, okara (an insoluble protein and fiber byproduct of soy milk), and whey. Most processing steps take place without intervention by the user. This apparatus was developed for use in research on processing of soy. It is also a prototype of other soy-processing apparatuses for research, industrial, and home use. Prior soy-processing equipment includes household devices that automatically produce soy milk but do not automatically produce tofu. The designs of prior soy-processing equipment require users to manually transfer intermediate solid soy products and to press them manually and, hence, under conditions that are not consistent from batch to batch. Prior designs do not afford choices of processing conditions: Users cannot use previously developed soy-processing equipment to investigate the effects of variations of techniques used to produce soy milk (e.g., cold grinding, hot grinding, and pre-cook blanching) and of such process parameters as cooking times and temperatures, grinding times, soaking times and temperatures, rinsing conditions, and sizes of particles generated by grinding. In contrast, the present apparatus is amenable to such investigations. The apparatus (see figure) includes a processing tank and a jacketed holding or coagulation tank. The processing tank can be capped by either of two different heads and can contain either of two different insertable mesh baskets. The first head includes a grinding blade and heating elements. The second head includes an automated press piston. One mesh basket, designated the okara basket, has oblong holes with a size equivalent to about 40 mesh [40 openings per inch (about 16 openings per centimeter)]. The second mesh basket, designated the tofu basket, has holes of 70 mesh [70 openings

  19. A reproducible surface-enhanced Raman spectroscopy approach. Online SERS measurements in a segmented microfluidic system.

    PubMed

    Strehle, Katrin R; Cialla, Dana; Rösch, Petra; Henkel, Thomas; Köhler, Michael; Popp, Jürgen

    2007-02-15

    The application of a liquid/liquid microsegmented flow for serial high-throughput microanalytical systems shows promising prospects for applications in clinical chemistry, pharmaceutical research, process diagnostics, and analytical chemistry. Microscopy and microspectral analytics offer powerful approaches for the analytical readout of droplet-based assays. Within the generated segments, individuality and integrity are retained during the complete diagnostic process, making the approach favored for analysis of individual microscaled objects such as cells and microorganisms embedded in droplets. Here we report on the online application of surface-enhanced micro-Raman spectroscopy for the detection and quantitation of analytes in a liquid/liquid segmented microfluidic system. Data acquisition was performed in microsegments down to a volume of 180 nl. With this approach, we overcome the well-known problem of adhesion of colloid/analyte conjugates to the optical windows of detection cuvettes, which causes the so-called "memory effect". The combination of the segmented microfluidic system with the highly sensitive SERS technique results in reproducible quantification of analytes.

  20. The effect of sample holder material on ion mobility spectrometry reproducibility

    NASA Technical Reports Server (NTRS)

    Jadamec, J. Richard; Su, Chih-Wu; Rigdon, Stephen; Norwood, Lavan

    1995-01-01

    When a positive detection of a narcotic occurs during the search of a vessel, a decision has to be made whether a further intensive search is warranted. This decision is based in part on the results of a second sample collected from the same area. Therefore, the reproducibility of both sampling and instrumental analysis is critical in terms of justifying an in-depth search. As reported at the 2nd Annual IMS Conference in Quebec City, the U.S. Coast Guard has determined that when paper is utilized as the sample desorption medium for the Barringer IONSCAN, the analytical results using standard reference samples are reproducible. A study was conducted utilizing papers of varying pore sizes and comparing their performance as a desorption material relative to the standard Barringer 50 micron Teflon. Nominal pore sizes ranged from 30 microns down to 2 microns. Results indicate that there is some peak instability in the first two to three windows during the analysis. The severity of the instability was observed to increase as the pore size of the paper decreased. However, the observed peak instability does not create a situation that results in decreased reliability or reproducibility of the analytical result.

  1. Month-to-Month and Year-to-Year Reproducibility of High Frequency QRS ECG signals

    NASA Technical Reports Server (NTRS)

    Batdorf, Niles; Feiveson, Alan H.; Schlegel, Todd T.

    2006-01-01

    High frequency (HF) electrocardiography analyzing the entire QRS complex in the frequency range of 150 to 250 Hz may prove useful in the detection of coronary artery disease, yet the long-term stability of these waveforms has not been fully characterized. We therefore prospectively investigated the reproducibility of the root mean squared (RMS) voltage, kurtosis, and the presence versus absence of reduced amplitude zones (RAZs) in signal averaged 12-lead HF QRS recordings acquired in the supine position one month apart in 16 subjects and one year apart in 27 subjects. Reproducibility of RMS voltage and kurtosis was excellent over these time intervals in the limb leads, and acceptable in the precordial leads using both the V-lead and CR-lead derivations. The relative error of RMS voltage was 12% month-to-month and 16% year-to-year in the serial recordings when averaged over all 12 leads. RAZs were also reproducible at rates of up to 87% and 81%, respectively, for the month-to-month and year-to-year recordings. We conclude that 12-lead HF QRS electrocardiograms are sufficiently reproducible for clinical use.

  2. Month-to-month and year-to-year reproducibility of high frequency QRS ECG signals

    NASA Technical Reports Server (NTRS)

    Batdorf, Niles J.; Feiveson, Alan H.; Schlegel, Todd T.

    2004-01-01

    High frequency electrocardiography analyzing the entire QRS complex in the frequency range of 150 to 250 Hz may prove useful in the detection of coronary artery disease, yet the long-term stability of these waveforms has not been fully characterized. Therefore, we prospectively investigated the reproducibility of the root mean squared voltage, kurtosis, and the presence versus absence of reduced amplitude zones in signal averaged 12-lead high frequency QRS recordings acquired in the supine position one month apart in 16 subjects and one year apart in 27 subjects. Reproducibility of root mean squared voltage and kurtosis was excellent over these time intervals in the limb leads, and acceptable in the precordial leads using both the V-lead and CR-lead derivations. The relative error of root mean squared voltage was 12% month-to-month and 16% year-to-year in the serial recordings when averaged over all 12 leads. Reduced amplitude zones were also reproducible up to a rate of 87% and 81%, respectively, for the month-to-month and year-to-year recordings. We conclude that 12-lead high frequency QRS electrocardiograms are sufficiently reproducible for clinical use.
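
    The band-limited RMS voltage and kurtosis reported in the two records above can be illustrated with a short sketch. This is a minimal illustration, assuming a signal-averaged QRS segment and a 1 kHz sampling rate; the filter order and the synthetic waveform are assumptions, not parameters from the studies.

```python
# Minimal sketch: RMS voltage and kurtosis of the high-frequency (150-250 Hz)
# component of a signal-averaged QRS complex. Sampling rate, filter order, and
# the synthetic test signal are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt
from scipy.stats import kurtosis

def hf_qrs_metrics(qrs, fs=1000.0, band=(150.0, 250.0), order=4):
    """Return (RMS voltage, kurtosis) of the band-passed QRS segment."""
    nyq = fs / 2.0
    b, a = butter(order, [band[0] / nyq, band[1] / nyq], btype="band")
    hf = filtfilt(b, a, qrs)                 # zero-phase band-pass filter
    rms = np.sqrt(np.mean(hf ** 2))          # RMS voltage of the HF component
    return rms, kurtosis(hf, fisher=False)   # Pearson kurtosis (normal = 3)

# Synthetic QRS-like waveform, 200 ms at 1 kHz
t = np.arange(0, 0.2, 0.001)
qrs = np.exp(-((t - 0.1) ** 2) / (2 * 0.01 ** 2)) + 1e-3 * np.random.randn(t.size)
print(hf_qrs_metrics(qrs))
```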

  3. Repeatability and Reproducibility of Noninvasive Keratograph 5M Measurements in Patients with Dry Eye Disease

    PubMed Central

    Tian, Lei; Qu, Jing-Hao; Zhang, Xiao-Yu; Sun, Xu-Guang

    2016-01-01

    Purpose. To determine the intraexaminer repeatability and interexaminer reproducibility of tear meniscus height (TMH) and noninvasive Keratograph tear breakup time (NIKBUT) measurements obtained with the Keratograph 5M (K5M) in a sample of healthy and dry eye populations. Methods. Forty-two patients with dry eye disease (DED group) and 42 healthy subjects (healthy group) were recruited in this prospective study. In all subjects, each eye received 3 consecutive measurements using the K5M for the TMH and NIKBUTs (NIKBUT-first and NIKBUT-average). Then a different examiner repeated the measurements. The repeatability and reproducibility of measurements were assessed by the coefficient of variation (CV) and intraclass correlation coefficient (ICC). Results. The repeatability and reproducibility of TMH and NIKBUTs were good in both the DED and healthy groups (CV% ≤ 26.1% and ICC ≥ 0.75 for all measurements). Patients with DED showed better intraexaminer repeatability for NIKBUTs, but worse for TMH, than healthy subjects. Average TMH, NIKBUT-first, and NIKBUT-average were significantly lower in the DED group than in the healthy group (all P values < 0.05). Conclusions. Measurements of TMH and NIKBUTs obtained with the K5M may provide a simple, noninvasive screening test for dry eye with acceptable repeatability and reproducibility. The NIKBUTs were more reliable, but TMH was less reliable, in patients with DED. PMID:27190639

  4. Repeatability and Reproducibility of Noninvasive Keratograph 5M Measurements in Patients with Dry Eye Disease.

    PubMed

    Tian, Lei; Qu, Jing-Hao; Zhang, Xiao-Yu; Sun, Xu-Guang

    2016-01-01

    Purpose. To determine the intraexaminer repeatability and interexaminer reproducibility of tear meniscus height (TMH) and noninvasive Keratograph tear breakup time (NIKBUT) measurements obtained with the Keratograph 5M (K5M) in a sample of healthy and dry eye populations. Methods. Forty-two patients with dry eye disease (DED group) and 42 healthy subjects (healthy group) were recruited in this prospective study. In all subjects, each eye received 3 consecutive measurements using the K5M for the TMH and NIKBUTs (NIKBUT-first and NIKBUT-average). Then a different examiner repeated the measurements. The repeatability and reproducibility of measurements were assessed by the coefficient of variation (CV) and intraclass correlation coefficient (ICC). Results. The repeatability and reproducibility of TMH and NIKBUTs were good in both the DED and healthy groups (CV% ≤ 26.1% and ICC ≥ 0.75 for all measurements). Patients with DED showed better intraexaminer repeatability for NIKBUTs, but worse for TMH, than healthy subjects. Average TMH, NIKBUT-first, and NIKBUT-average were significantly lower in the DED group than in the healthy group (all P values < 0.05). Conclusions. Measurements of TMH and NIKBUTs obtained with the K5M may provide a simple, noninvasive screening test for dry eye with acceptable repeatability and reproducibility. The NIKBUTs were more reliable, but TMH was less reliable, in patients with DED. PMID:27190639

  5. Sonic boom acceptability studies

    NASA Astrophysics Data System (ADS)

    Shepherd, Kevin P.; Sullivan, Brenda M.; Leatherwood, Jack D.; McCurdy, David A.

    1992-04-01

    The determination of the magnitude of sonic boom exposure which would be acceptable to the general population requires, as a starting point, a method to assess and compare individual sonic booms. There is no consensus within the scientific and regulatory communities regarding an appropriate sonic boom assessment metric. Loudness, being a fundamental and well-understood attribute of human hearing, was chosen as a means of comparing sonic booms of differing shapes and amplitudes. The figure illustrates the basic steps which yield a calculated value of loudness. Based upon the aircraft configuration and its operating conditions, the sonic boom pressure signature which reaches the ground is calculated. This pressure-time history is transformed to the frequency domain and converted into a one-third octave band spectrum. The essence of the loudness method is to account for the frequency response and integration characteristics of the auditory system. The result of the calculation procedure is a numerical description (perceived level, dB) which represents the loudness of the sonic boom waveform.

  6. Sonic boom acceptability studies

    NASA Technical Reports Server (NTRS)

    Shepherd, Kevin P.; Sullivan, Brenda M.; Leatherwood, Jack D.; Mccurdy, David A.

    1992-01-01

    The determination of the magnitude of sonic boom exposure which would be acceptable to the general population requires, as a starting point, a method to assess and compare individual sonic booms. There is no consensus within the scientific and regulatory communities regarding an appropriate sonic boom assessment metric. Loudness, being a fundamental and well-understood attribute of human hearing, was chosen as a means of comparing sonic booms of differing shapes and amplitudes. The figure illustrates the basic steps which yield a calculated value of loudness. Based upon the aircraft configuration and its operating conditions, the sonic boom pressure signature which reaches the ground is calculated. This pressure-time history is transformed to the frequency domain and converted into a one-third octave band spectrum. The essence of the loudness method is to account for the frequency response and integration characteristics of the auditory system. The result of the calculation procedure is a numerical description (perceived level, dB) which represents the loudness of the sonic boom waveform.
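
    The calculation chain described in the two records above (pressure signature, frequency-domain transform, one-third octave band spectrum, auditory weighting, a single perceived level in dB) can be sketched schematically. The sketch below uses a flat placeholder weighting in place of the actual perceived-level (loudness) weighting, which is considerably more involved; all function names, the band series, the normalization, and the test signature are illustrative assumptions.

```python
# Schematic sketch of the chain described above: pressure-time history -> spectrum
# -> one-third octave bands -> weighting -> a single level in dB. The flat
# placeholder weighting stands in for the actual perceived-level (loudness)
# weighting; band series and normalization are illustrative.
import numpy as np

def third_octave_levels(p, fs, p_ref=20e-6):
    """Band levels (dB re p_ref) of a pressure signature p sampled at fs (Hz)."""
    freqs = np.fft.rfftfreq(p.size, d=1.0 / fs)
    power = np.abs(np.fft.rfft(p)) ** 2 / p.size ** 2        # simple periodogram
    centers = 1000.0 * 2.0 ** (np.arange(-17, 14) / 3.0)     # ~20 Hz to ~20 kHz
    levels = []
    for fc in centers:
        lo, hi = fc / 2 ** (1 / 6), fc * 2 ** (1 / 6)        # 1/3-octave band edges
        band_power = power[(freqs >= lo) & (freqs < hi)].sum()
        levels.append(10 * np.log10(max(band_power, 1e-30) / p_ref ** 2))
    return centers, np.array(levels)

def perceived_level_placeholder(p, fs):
    """Collapse the band levels with a flat (placeholder) weighting: energy sum."""
    _, levels = third_octave_levels(p, fs)
    return 10 * np.log10(np.sum(10.0 ** (levels / 10)))

# Illustrative N-wave-like boom signature: 300 ms sampled at 8 kHz
fs = 8000.0
t = np.arange(0, 0.3, 1 / fs)
boom = np.where((t > 0.05) & (t < 0.25), 50.0 * (1 - (t - 0.05) / 0.1), 0.0)
print(perceived_level_placeholder(boom, fs))
```

    In a real implementation, the placeholder energy sum would be replaced by a weighting that reflects the frequency response and integration characteristics of the auditory system, which is the step the abstract identifies as the essence of the loudness method.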

  7. Analytical Aspects of Hydrogen Exchange Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Engen, John R.; Wales, Thomas E.

    2015-07-01

    This article reviews the analytical aspects of measuring hydrogen exchange by mass spectrometry (HX MS). We describe the nature of analytical selectivity in hydrogen exchange, then review the analytical tools required to accomplish fragmentation, separation, and the mass spectrometry measurements under restrictive exchange quench conditions. In contrast to analytical quantitation that relies on measurements of peak intensity or area, quantitation in HX MS depends on measuring a mass change with respect to an undeuterated or deuterated control, resulting in a value between zero and the maximum amount of deuterium that can be incorporated. Reliable quantitation is a function of experimental fidelity and to achieve high measurement reproducibility, a large number of experimental variables must be controlled during sample preparation and analysis. The method also reports on important qualitative aspects of the sample, including conformational heterogeneity and population dynamics.
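
    The quantitation scheme described above, a mass change measured against an undeuterated control and bounded by the maximum incorporable deuterium, reduces to a small calculation. The function names and example centroid masses below are assumptions for illustration, not values from the article.

```python
# Minimal sketch of HX MS quantitation: deuterium uptake is a mass shift relative
# to an undeuterated control, optionally normalized by a maximally deuterated
# control to give a fractional value between 0 and 1. Example masses are invented.
def deuterium_uptake(m_obs, m_undeut):
    """Absolute deuterium uptake (Da) relative to the undeuterated control."""
    return m_obs - m_undeut

def fractional_uptake(m_obs, m_undeut, m_maxd):
    """Uptake as a fraction of the maximum observable deuteration (0..1)."""
    return (m_obs - m_undeut) / (m_maxd - m_undeut)

# Example: peptide centroid masses in Da
print(fractional_uptake(m_obs=1205.8, m_undeut=1203.6, m_maxd=1209.1))  # ~0.40
```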

  8. Reproducibility and quantification of illicit drugs using matrix-assisted ionization (MAI) mass spectrometry.

    PubMed

    Chakrabarty, Shubhashis; DeLeeuw, Jessica L; Woodall, Daniel W; Jooss, Kevin; Narayan, Srinivas B; Trimpin, Sarah

    2015-08-18

    Matrix-assisted ionization (MAI) mass spectrometry (MS) is a simple and sensitive method for analysis of low- and high-mass compounds, requiring only that the analyte in a suitable matrix be exposed to the inlet aperture of an atmospheric pressure ionization mass spectrometer. Here, we evaluate the reproducibility of MAI and its potential for quantification using six drug standards. Factors influencing reproducibility include the matrix compound used, temperature, and the method of sample introduction. The relative standard deviation (RSD) using MAI for a mixture of morphine, codeine, oxymorphone, oxycodone, clozapine, and buspirone and their deuterated internal standards using the matrix 3-nitrobenzonitrile is less than 10% with either a Waters SYNAPT G2 or a Thermo LTQ Velos mass spectrometer. The RSD values obtained using MAI are comparable to those using ESI or MALDI on these instruments. The day-to-day reproducibility of MAI determined for five consecutive days with internal standards was better than 20% using manual sample introduction. The reproducibility improved to better than 5% using a mechanically assisted sample introduction method. Hydrocodone, present in a sample of undiluted infant urine, was quantified with MAI using the standard addition method. PMID:26186653
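
    The standard addition quantification mentioned at the end of the abstract can be sketched as follows. The spiked concentrations and ion-ratio responses are invented for illustration, and a simple linear response is assumed.

```python
# Hedged sketch of standard-addition quantification: known amounts of analyte are
# spiked into aliquots of the sample, the response is regressed on the added
# concentration, and the native concentration is the magnitude of the x-intercept.
import numpy as np

def standard_addition(added_conc, response):
    """Estimate the native concentration from a standard-addition series."""
    slope, intercept = np.polyfit(added_conc, response, 1)  # linear fit
    return intercept / slope                                # |x-intercept|

added = np.array([0.0, 50.0, 100.0, 200.0])   # spiked concentration (ng/mL), illustrative
resp = np.array([0.21, 0.42, 0.63, 1.05])     # measured ion ratios, illustrative
print(standard_addition(added, resp))         # ~50 ng/mL native concentration
```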

  9. Reproducibility and variability of the cost functions reconstructed from experimental recordings in multifinger prehension.

    PubMed

    Niu, Xun; Latash, Mark L; Zatsiorsky, Vladimir M

    2012-01-01

    The study examines whether the cost functions reconstructed from experimental recordings are reproducible over time. Participants repeated the trials on three days. By following Analytical Inverse Optimization procedures, the cost functions of finger forces were reconstructed for each day. The cost functions were found to be reproducible over time: application of a cost function C(i) to the data of Day j (i≠j) resulted in smaller deviations from the experimental observations than using other commonly used cost functions. Other findings are: (a) the 2nd order coefficients of the cost function showed negative linear relations with finger force magnitudes; (b) the finger forces were distributed on a 2-dimensional plane in the 4-dimensional finger force space for all subjects and all testing sessions; (c) the data agreed well with the principle of superposition, i.e. the action of object prehension can be decoupled into the control of rotational equilibrium and slipping prevention. PMID:22364441
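
    One of the findings above, that the finger forces lie on a 2-dimensional plane in the 4-dimensional finger force space, can be checked with a principal-component-style calculation. The synthetic data and the 95% variance threshold below are illustrative assumptions, not the authors' procedure.

```python
# Hedged sketch: how planar is a cloud of 4-D finger-force vectors? Measured as the
# fraction of total variance captured by the first two principal components.
import numpy as np

def planar_variance_fraction(forces, n_dims=2):
    """Fraction of total variance in the first n_dims principal components."""
    centered = forces - forces.mean(axis=0)
    _, s, _ = np.linalg.svd(centered, full_matrices=False)
    var = s ** 2
    return var[:n_dims].sum() / var.sum()

# Synthetic example: 4-D forces generated from 2 latent factors plus small noise
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))
mixing = rng.normal(size=(2, 4))
forces = latent @ mixing + 0.05 * rng.normal(size=(200, 4))
print(planar_variance_fraction(forces) > 0.95)   # True: data are nearly planar
```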

  10. Analytics for Education

    ERIC Educational Resources Information Center

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  11. Let's Talk... Analytics

    ERIC Educational Resources Information Center

    Oblinger, Diana G.

    2012-01-01

    Talk about analytics seems to be everywhere. Everyone is talking about analytics. Yet even with all the talk, many in higher education have questions about--and objections to--using analytics in colleges and universities. In this article, the author explores the use of analytics in, and all around, higher education. (Contains 1 note.)

  12. An Open Science and Reproducible Research Primer for Landscape Ecologists

    EPA Science Inventory

    In recent years many funding agencies, some publishers, and even the United States government have enacted policies that encourage open science and strive for reproducibility; however, the knowledge and skills to implement open science and enable reproducible research are not yet...

  13. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 2 2012-10-01 2012-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3... APPURTENANCES Selection and Limitations of Piping Joints § 56.30-3 Piping joints (reproduces 110). The type of piping joint used shall be suitable for the design conditions and shall be selected with consideration...

  14. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3... APPURTENANCES Selection and Limitations of Piping Joints § 56.30-3 Piping joints (reproduces 110). The type of piping joint used shall be suitable for the design conditions and shall be selected with consideration...

  15. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 2 2014-10-01 2014-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3... APPURTENANCES Selection and Limitations of Piping Joints § 56.30-3 Piping joints (reproduces 110). The type of piping joint used shall be suitable for the design conditions and shall be selected with consideration...

  16. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 2 2013-10-01 2013-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3... APPURTENANCES Selection and Limitations of Piping Joints § 56.30-3 Piping joints (reproduces 110). The type of piping joint used shall be suitable for the design conditions and shall be selected with consideration...

  17. 10 CFR 95.43 - Authority to reproduce.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Authority to reproduce. 95.43 Section 95.43 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) FACILITY SECURITY CLEARANCE AND SAFEGUARDING OF NATIONAL SECURITY INFORMATION AND RESTRICTED DATA Control of Information § 95.43 Authority to reproduce. (a) Each...

  18. An open investigation of the reproducibility of cancer biology research.

    PubMed

    Errington, Timothy M; Iorns, Elizabeth; Gunn, William; Tan, Fraser Elisabeth; Lomax, Joelle; Nosek, Brian A

    2014-12-10

    It is widely believed that research that builds upon previously published findings has reproduced the original work. However, it is rare for researchers to perform or publish direct replications of existing results. The Reproducibility Project: Cancer Biology is an open investigation of reproducibility in preclinical cancer biology research. We have identified 50 high impact cancer biology articles published in the period 2010-2012, and plan to replicate a subset of experimental results from each article. A Registered Report detailing the proposed experimental designs and protocols for each subset of experiments will be peer reviewed and published prior to data collection. The results of these experiments will then be published in a Replication Study. The resulting open methodology and dataset will provide evidence about the reproducibility of high-impact results, and an opportunity to identify predictors of reproducibility.

  19. Assessments of endothelial function and arterial stiffness are reproducible in patients with COPD

    PubMed Central

    Rodriguez-Miguelez, Paula; Seigler, Nichole; Bass, Leon; Dillard, Thomas A; Harris, Ryan A

    2015-01-01

    Background Elevated cardiovascular disease risk is observed in patients with COPD. Non-invasive assessments of endothelial dysfunction and arterial stiffness have recently emerged to provide mechanistic insight into cardiovascular disease risk in COPD; however, the reproducibility of endothelial function and arterial stiffness has yet to be investigated in this patient population. Objectives This study sought to examine the within-day and between-day reproducibility of endothelial function and arterial stiffness in patients with COPD. Methods Baseline diameter, peak diameter, flow-mediated dilation, augmentation index, augmentation index at 75 beats per minute, and pulse wave velocity were assessed three times in 17 patients with COPD (six males, eleven females, age range 47–75 years old; forced expiratory volume in 1 second =51.5% predicted). Session A and B were separated by 3 hours (within-day), whereas session C was conducted at least 7 days following session B (between-day). Reproducibility was assessed by: 1) paired t-tests, 2) coefficients of variation, 3) coefficients of variation prime, 4) intra-class correlation coefficient, 5) Pearson’s correlations (r), and 6) Bland–Altman plots. Five acceptable assessments were required to confirm reproducibility. Results Six out of six within-day criteria were met for endothelial function and arterial stiffness outcomes. Six out of six between-day criteria were met for baseline and peak diameter, augmentation index and pulse wave velocity, whereas five out of six criteria were met for flow-mediated dilation. Conclusion The present study provides evidence for within-day and between-day reproducibility of endothelial function and arterial stiffness in patients with COPD. PMID:26396509

  20. The MARS for squat, countermovement, and standing long jump performance analyses: are measures reproducible?

    PubMed

    Hébert-Losier, Kim; Beaven, C Martyn

    2014-07-01

    Jump tests are often used to assess the effect of interventions because their outcomes are reported to be valid indicators of functional performance. In this study, we examined the reproducibility of performance parameters from 3 common jump tests obtained using the commercially available Kistler Measurement, Analysis and Reporting Software (MARS). On 2 separate days, 32 men performed 3 squat jumps (SJs), 3 countermovement jumps (CMJs), and 3 standing long jumps (LJs) on a Kistler force-plate. On both days, the performance measures from the best jump of each series were extracted using the MARS. Changes in the mean scores, intraclass correlation coefficients (ICCs), and coefficients of variation (CVs) were computed to quantify the between-day reproducibility of each parameter. Moreover, the reproducibility quantifiers specific to the 3 separate jumps were compared using nonparametric tests. Overall, acceptable between-day reproducibility (mean ± SD, ICC and CV) of SJ (0.88 ± 0.06 and 7.1 ± 3.8%), CMJ (0.84 ± 0.17 and 5.9 ± 4.1%), and LJ (0.80 ± 0.13 and 8.1 ± 4.1%) measures was found using the MARS, except for parameters directly relating to the rate of force development (i.e., time to maximal force) and change in momentum during countermovement (i.e., negative force impulse), where reproducibility was lower. A greater proportion of the performance measures from the standing LJs had low ICC and/or high CV values, most likely owing to the complex nature of the LJ test. Practitioners and researchers can use most of the jump test parameters from the MARS with confidence to quantify changes in the functional ability of individuals over time, except for those relating to the rate of force development or change in momentum during the countermovement phases of jumps.

  1. Is Analytic Information Processing a Feature of Expertise in Medicine?

    ERIC Educational Resources Information Center

    McLaughlin, Kevin; Rikers, Remy M.; Schmidt, Henk G.

    2008-01-01

    Diagnosing begins by generating an initial diagnostic hypothesis by automatic information processing. Information processing may stop here if the hypothesis is accepted, or analytical processing may be used to refine the hypothesis. This description portrays analytic processing as an optional extra in information processing, leading us to…

  2. Cone penetrometer acceptance test report

    SciTech Connect

    Boechler, G.N.

    1996-09-19

    This Acceptance Test Report (ATR) documents the results of acceptance test procedure WHC-SD-WM-ATR-151. Included in this report is a summary of the tests, the results and issues, the signature and sign-off ATP pages, and a summarized table of the specification vs. ATP section that satisfied the specification.

  3. Dissolution test acceptance sampling plans.

    PubMed

    Tsong, Y; Hammerstrom, T; Lin, K; Ong, T E

    1995-07-01

    The U.S. Pharmacopeia (USP) general monograph provides a standard for dissolution compliance with the requirements as stated in the individual USP monograph for a tablet or capsule dosage form. The acceptance rules recommended by USP have important roles in the quality control process. The USP rules and their modifications are often used as an industrial lot release sampling plan, where a lot is accepted when the tablets or capsules sampled are accepted as proof of compliance with the requirement. In this paper, the operating characteristics of the USP acceptance rules are reviewed and compared to a selected modification. The operating characteristic curves show that the USP acceptance rules are sensitive to the true mean dissolution and do not reject a lot or batch that has a large percentage of tablets that dissolve to less than the dissolution specification.
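
    The operating characteristic behavior discussed above can be explored with a small Monte Carlo sketch. The three-stage rule coded below follows a commonly cited USP-style S1/S2/S3 scheme, but the stage thresholds, the normal model for unit-to-unit variation, and all numbers are illustrative assumptions; the binding criteria are those in the monograph itself.

```python
# Monte Carlo sketch of an operating-characteristic (OC) curve for a staged
# dissolution acceptance rule. Stage criteria, Q, sigma, and sample sizes are
# illustrative assumptions loosely modeled on a USP-style three-stage scheme.
import numpy as np

def accept_lot(units, Q):
    """Apply a simplified staged rule to 24 simulated dissolution results (%)."""
    if np.all(units[:6] >= Q + 5):                                   # stage 1
        return True
    s2 = units[:12]
    if s2.mean() >= Q and np.all(s2 >= Q - 15):                      # stage 2
        return True
    s3 = units[:24]                                                  # stage 3
    return s3.mean() >= Q and np.sum(s3 < Q - 15) <= 2 and np.all(s3 >= Q - 25)

def oc_curve(true_means, sigma=3.0, Q=80.0, n_sim=5000, seed=1):
    """Probability of acceptance as a function of the true mean dissolution."""
    rng = np.random.default_rng(seed)
    probs = []
    for mu in true_means:
        lots = rng.normal(mu, sigma, size=(n_sim, 24))
        probs.append(np.mean([accept_lot(lot, Q) for lot in lots]))
    return np.array(probs)

print(oc_curve(true_means=[75, 80, 85, 90]))  # acceptance probability rises with the mean
```

    Plotting the returned probabilities against the true mean gives the OC curve; repeating the exercise with a larger unit-to-unit sigma illustrates the sensitivity issue the abstract raises.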

  4. Analytical laboratory quality audits

    SciTech Connect

    Kelley, William D.

    2001-06-11

    Analytical Laboratory Quality Audits are designed to improve laboratory performance. The success of the audit, as for many activities, is based on adequate preparation, precise performance, well documented and insightful reporting, and productive follow-up. Adequate preparation starts with definition of the purpose, scope, and authority for the audit and the primary standards against which the laboratory quality program will be tested. The scope and technical processes involved lead to determining the needed audit team resources. Contact is made with the auditee and a formal audit plan is developed, approved and sent to the auditee laboratory management. Review of the auditee's quality manual, key procedures and historical information during preparation leads to better checklist development and more efficient and effective use of the limited time for data gathering during the audit itself. The audit begins with the opening meeting that sets the stage for the interactions between the audit team and the laboratory staff. Arrangements are worked out for the necessary interviews and examination of processes and records. The information developed during the audit is recorded on the checklists. Laboratory management is kept informed of issues during the audit so there are no surprises at the closing meeting. The audit report documents whether the management control systems are effective. In addition to findings of nonconformance, positive reinforcement of exemplary practices provides balance and fairness. Audit closure begins with receipt and evaluation of proposed corrective actions from the nonconformances identified in the audit report. After corrective actions are accepted, their implementation is verified. Upon closure of the corrective actions, the audit is officially closed.

  5. Reproducible and controllable induction voltage adder for scaled beam experiments

    NASA Astrophysics Data System (ADS)

    Sakai, Yasuo; Nakajima, Mitsuo; Horioka, Kazuhiko

    2016-08-01

    A reproducible and controllable induction adder was developed using solid-state switching devices and Finemet cores for scaled beam compression experiments. A gate controlled MOSFET circuit was developed for the controllable voltage driver. The MOSFET circuit drove the induction adder at low magnetization levels of the cores which enabled us to form reproducible modulation voltages with jitter less than 0.3 ns. Preliminary beam compression experiments indicated that the induction adder can improve the reproducibility of modulation voltages and advance the beam physics experiments.

  6. Reproducible and controllable induction voltage adder for scaled beam experiments.

    PubMed

    Sakai, Yasuo; Nakajima, Mitsuo; Horioka, Kazuhiko

    2016-08-01

    A reproducible and controllable induction adder was developed using solid-state switching devices and Finemet cores for scaled beam compression experiments. A gate controlled MOSFET circuit was developed for the controllable voltage driver. The MOSFET circuit drove the induction adder at low magnetization levels of the cores which enabled us to form reproducible modulation voltages with jitter less than 0.3 ns. Preliminary beam compression experiments indicated that the induction adder can improve the reproducibility of modulation voltages and advance the beam physics experiments. PMID:27587112

  7. Multimedia Analysis plus Visual Analytics = Multimedia Analytics

    SciTech Connect

    Chinchor, Nancy; Thomas, James J.; Wong, Pak C.; Christel, Michael; Ribarsky, Martin W.

    2010-10-01

    Multimedia analysis has focused on images, video, and to some extent audio and has made progress in single channels excluding text. Visual analytics has focused on the user interaction with data during the analytic process plus the fundamental mathematics and has continued to treat text as did its precursor, information visualization. The general problem we address in this tutorial is the combining of multimedia analysis and visual analytics to deal with multimedia information gathered from different sources, with different goals or objectives, and containing all media types and combinations in common usage.

  8. Analytical Challenges in Biotechnology.

    ERIC Educational Resources Information Center

    Glajch, Joseph L.

    1986-01-01

    Highlights five major analytical areas (electrophoresis, immunoassay, chromatographic separations, protein and DNA sequencing, and molecular structures determination) and discusses how analytical chemistry could further improve these techniques and thereby have a major impact on biotechnology. (JN)

  9. Reproducibility of the World Health Organization 2008 criteria for myelodysplastic syndromes

    PubMed Central

    Senent, Leonor; Arenillas, Leonor; Luño, Elisa; Ruiz, Juan C.; Sanz, Guillermo; Florensa, Lourdes

    2013-01-01

    The reproducibility of the World Health Organization 2008 classification for myelodysplastic syndromes is uncertain and its assessment was the major aim of this study. The different peripheral blood and bone marrow variables required for an adequate morphological classification were blindly evaluated by four cytomorphologists in samples from 50 patients with myelodysplastic syndromes. The degree of agreement among observers was calculated using intraclass correlation coefficient and the generalized kappa statistic for multiple raters. The degree of agreement for the percentages of blasts in bone marrow and peripheral blood, ring sideroblasts in bone marrow, and erythroid, granulocytic and megakaryocytic dysplastic cells was strong (P<0.001 in all instances). After stratifying the percentages according to the categories required for the assignment of World Health Organization subtypes, the degree of agreement was not statistically significant for cases with 5-9% blasts in bone marrow (P=0.07), 0.1-1% blasts in peripheral blood (P=0.47), or percentage of erythroid dysplastic cells (P=0.49). Finally, the interobserver concordance for World Health Organization-defined subtypes showed a moderate overall agreement (P<0.001), the reproducibility being lower for cases with refractory anemia with excess of blasts type 1 (P=0.05) and refractory anemia with ring sideroblasts (P=0.09). In conclusion, the reproducibility of the World Health Organization 2008 classification for myelodysplastic syndromes is acceptable but the defining criteria for blast cells and features of erythroid dysplasia need to be refined. PMID:23065505
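
    The multi-rater agreement statistic named above (a generalized kappa for multiple raters) is available in standard statistical libraries. The sketch below computes a Fleiss-type kappa on a small synthetic case-by-rater matrix of subtype codes; the data are invented and do not correspond to the study.

```python
# Hedged sketch: Fleiss-type (generalized) kappa for several raters assigning
# categorical subtypes to the same cases, using statsmodels. Ratings are synthetic.
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

ratings = np.array([      # 6 cases x 4 raters; values are illustrative subtype codes
    [0, 0, 0, 0],
    [1, 1, 1, 2],
    [2, 2, 2, 2],
    [0, 0, 1, 0],
    [3, 3, 3, 3],
    [1, 2, 1, 1],
])
table, _ = aggregate_raters(ratings)          # per-case counts of each category
print(fleiss_kappa(table, method="fleiss"))   # 1.0 = perfect agreement, 0 = chance
```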

  10. Reproducibility and validity of a semi-quantitative FFQ for trace elements.

    PubMed

    Lee, Yujin; Park, Kyong

    2016-09-01

    The aim of this study was to test the reproducibility and validity of a self-administered FFQ for the Trace Element Study of Korean Adults in the Yeungnam area (SELEN). Study subjects were recruited from the SELEN cohort selected from rural and urban areas in Yeungnam, Korea. A semi-quantitative FFQ with 146 items was developed considering the dietary characteristics of cohorts in the study area. In a validation study, seventeen men and forty-eight women aged 38-62 years completed 3-d dietary records (DR) and two FFQ over a 3-month period. Validity was examined by comparing the FFQ with the DR, and reproducibility by comparing the two FFQ administrations, using partial correlation coefficients, the Bland-Altman method and cross-classification. There were no significant differences between the mean intakes of selected nutrients as estimated from FFQ1, FFQ2 and DR. The median correlation coefficients for all nutrients were 0·47 and 0·56 in the reproducibility and validity tests, respectively. Bland-Altman's index and cross-classification showed acceptable agreement between FFQ1 and FFQ2 and between FFQ2 and DR. Ultimately, 78% of the subjects were classified into the same or adjacent quartiles for most nutrients. In addition, the weighted κ value indicated fair agreement between the two methods. In conclusion, this newly developed FFQ was a suitable dietary assessment method for the SELEN cohort study. PMID:27378401

  11. Reproducibility of a web-based FFQ for 13- to 15-year-old Danish adolescents.

    PubMed

    Bjerregaard, Anne A; Tetens, Inge; Olsen, Sjurdur F; Halldorsson, Thorhallur I

    2016-01-01

    FFQ are widely used in large-scale studies to assess dietary intake. To aid interpretation of diet-disease associations, assessment of validity must be performed. Reproducibility is one aspect of validity, focusing on the stability of repeated assessment with the same method, which may also reveal problems in instrument design or participant instructions. The aim of the present study was to evaluate the reproducibility of a web-based FFQ targeting Danish adolescents within the Danish National Birth Cohort (DNBC). Data for the present study were obtained from a prospective design nested within the DNBC. Adolescents aged 13 to 15 years (n 48, 60 % girls) completed the FFQ twice 4 weeks apart. The proportion of adolescents consistently classified into the same tertile according to amount of food intake ranged from 45 % (fish) to 77 % (vegetables), whereas classification into opposite tertiles ranged from 0 % (fruit, oils and dressing) to 15 % (beverages). Overall, no significant differences were observed in intake of food groups or nutrients between the two completions of the FFQ. The mean crude Spearman correlation for all food groups was 0·56 and the mean intra-class correlation for all food groups was 0·61. In conclusion, the reproducibility of the FFQ for Danish adolescents was acceptable. The study revealed that adolescents aged 13-15 years seemed capable of recalling overall dietary habits consistently but had some difficulty estimating the frequency of consumption of regularly consumed food items. PMID:26855775

  12. Extending the Technology Acceptance Model: Policy Acceptance Model (PAM)

    NASA Astrophysics Data System (ADS)

    Pierce, Tamra

    There has been extensive research on how new ideas and technologies are accepted in society. This has resulted in the creation of many models that are used to discover and assess the contributing factors. The Technology Acceptance Model (TAM) is one widely accepted example. This model examines people's acceptance of new technologies based on variables that directly correlate to how the end user views the product. This paper introduces the Policy Acceptance Model (PAM), an expansion of TAM, which is designed for the analysis and evaluation of acceptance of new policy implementation. PAM includes the traditional constructs of TAM and adds the variables of age, ethnicity, and family. The model is demonstrated using survey responses from 72 participants on their attitudes toward the upcoming healthcare reform in the United States (US). The aim is for the theory behind this model to serve as a framework applicable to studies of the introduction of any new or modified policy.

  13. Reproducibility of ERG responses obtained with the DTL electrode.

    PubMed

    Hébert, M; Vaegan; Lachapelle, P

    1999-03-01

    Previous investigators have suggested that the DTL fibre electrode might not be suitable for the recording of replicable electroretinograms. We present experimental evidence that when used adequately, this electrode does permit the recording of highly reproducible retinal potentials.

  14. 15. REPRODUCED FROM 'GRIST WIND MILLS AT EAST HAMPTON,' PICTURESQUE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. REPRODUCED FROM 'GRIST WIND MILLS AT EAST HAMPTON,' PICTURESQUE AMERICA NEW YORK, 1872. THE HOOD WINDMILL IS IN THE FOREGROUND AND THE PANTIGO WINDMILL IS IN THE BACKGROUND - Pantigo Windmill, James Lane, East Hampton, Suffolk County, NY

  15. 8. Historic American Buildings Survey Reproduced from the collections of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. Historic American Buildings Survey Reproduced from the collections of the Library of Congress, Accession No. 45041 Geographical File ('Nantucket, Mass.') Division of Prints and Photographs c. 1880 - Jethro Coffin House, Sunset Hill, Nantucket, Nantucket County, MA

  16. 223. FREQUENTLY REPRODUCED VIEW OF GWMP SHOWING VARIABLE WIDTH MEDIANS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    223. FREQUENTLY REPRODUCED VIEW OF GWMP SHOWING VARIABLE WIDTH MEDIANS WITH INDEPENDENT ALIGNMENTS FROM KEY BRIDGE LOOKING NORTHWEST, 1953. - George Washington Memorial Parkway, Along Potomac River from McLean to Mount Vernon, VA, Mount Vernon, Fairfax County, VA

  17. Photographic copy of reproduced photograph dated 1942. Exterior view, west ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Photographic copy of reproduced photograph dated 1942. Exterior view, west elevation. Building camouflaged during World War II. - Grand Central Air Terminal, 1310 Air Way, Glendale, Los Angeles County, CA

  18. Analyticity without Differentiability

    ERIC Educational Resources Information Center

    Kirillova, Evgenia; Spindler, Karlheinz

    2008-01-01

    In this article we derive all salient properties of analytic functions, including the analytic version of the inverse function theorem, using only the most elementary convergence properties of series. Not even the notion of differentiability is required to do so. Instead, analytical arguments are replaced by combinatorial arguments exhibiting…

  19. Comment on "Estimating the reproducibility of psychological science".

    PubMed

    Gilbert, Daniel T; King, Gary; Pettigrew, Stephen; Wilson, Timothy D

    2016-03-01

    A paper from the Open Science Collaboration (Research Articles, 28 August 2015, aac4716) attempting to replicate 100 published studies suggests that the reproducibility of psychological science is surprisingly low. We show that this article contains three statistical errors and provides no support for such a conclusion. Indeed, the data are consistent with the opposite conclusion, namely, that the reproducibility of psychological science is quite high.

  20. On The Reproducibility of Seasonal Land-surface Climate

    SciTech Connect

    Phillips, T J

    2004-10-22

    The sensitivity of the continental seasonal climate to initial conditions is estimated from an ensemble of decadal simulations of an atmospheric general circulation model with the same specifications of radiative forcings and monthly ocean boundary conditions, but with different initial states of atmosphere and land. As measures of the "reproducibility" of continental climate for different initial conditions, spatio-temporal correlations are computed across paired realizations of eleven model land-surface variables in which the seasonal cycle is either included or excluded--the former case being pertinent to climate simulation, and the latter to seasonal anomaly prediction. It is found that the land-surface variables which include the seasonal cycle are impacted only marginally by changes in initial conditions; moreover, their seasonal climatologies exhibit high spatial reproducibility. In contrast, the reproducibility of a seasonal land-surface anomaly is generally low, although it is substantially higher in the Tropics; its spatial reproducibility also markedly fluctuates in tandem with warm and cold phases of the El Nino/Southern Oscillation. However, the overall degree of reproducibility depends strongly on the particular land-surface anomaly considered. It is also shown that the predictability of a land-surface anomaly implied by its reproducibility statistics is consistent with what is inferred from more conventional predictability metrics. Implications of these results for climate model intercomparison projects and for operational forecasts of seasonal continental climate also are elaborated.

  1. Paper-based microfluidic approach for surface-enhanced Raman spectroscopy and highly reproducible detection of proteins beyond picomolar concentration.

    PubMed

    Saha, Arindam; Jana, Nikhil R

    2015-01-14

    Although the microfluidic approach is widely used in various point-of-care diagnostics, its implementation in surface-enhanced Raman spectroscopy (SERS)-based detection is challenging. This is because the SERS signal depends on plasmonic nanoparticle aggregation to generate stable electromagnetic hot spots, a condition that is difficult to achieve in currently available microfluidic platforms. Here we show that SERS can be adapted using a simple paper-based microfluidic system where both the plasmonic nanomaterials and the analyte are used in the mobile phase. This approach allows analyte-induced controlled particle aggregation and electromagnetic hot spot generation inside the microfluidic channel, with a resultant SERS signal that is highly reproducible and sensitive. The approach has been used for reproducible detection of proteins at pico- to femtomolar concentrations. The presented approach is simple, rapid, and cost-effective, and requires a low sample volume. The method can be extended to SERS-based detection of other biomolecules.

  2. L-286 Acceptance Test Record

    SciTech Connect

    HARMON, B.C.

    2000-01-14

    This document provides a detailed account of how the acceptance testing was conducted for Project L-286, "200E Area Sanitary Water Plant Effluent Stream Reduction". The testing of the L-286 instrumentation system was conducted under the direct supervision

  3. Accepted scientific research works (abstracts).

    PubMed

    2014-01-01

    These are the 39 accepted abstracts for IAYT's Symposium on Yoga Research (SYR) September 24-24, 2014 at the Kripalu Center for Yoga & Health and published in the Final Program Guide and Abstracts. PMID:25645134

  4. Quality assurance management plan special analytical support

    SciTech Connect

    Myers, M.L.

    1997-01-30

    It is the policy of Special Analytical Support (SAS) that the analytical aspects of all environmental data generated and processed in the laboratory, subject to the Environmental Protection Agency (EPA), U.S. Department of Energy (DOE), WDOE or other project specific requirements, be of known and acceptable quality. It is the intention of this QAPP to establish and assure that an effective quality controlled management system is maintained in order to meet the quality requirements of the intended use(s) of the data.

  5. Reproducibility of Resting State Connectivity in Patients with Stable Multiple Sclerosis

    PubMed Central

    Pinter, Daniela; Beckmann, Christian; Koini, Marisa; Pirker, Eva; Filippini, Nicola; Pichler, Alexander; Fuchs, Siegrid; Fazekas, Franz; Enzinger, Christian

    2016-01-01

    Given increasing efforts to use resting-state fMRI (rfMRI) as a biomarker of disease progression in multiple sclerosis (MS), we here explored the reproducibility of longitudinal rfMRI over three months in patients with clinically and radiologically stable MS. To pursue this aim, two approaches were applied in nine rfMRI networks: First, the intraclass correlation coefficient (ICC 3,1) was assessed for the mean functional connectivity maps across the entire network and a region of interest (ROI). Second, the ratio of overlap between Z-thresholded connectivity maps for each network was assessed. We quantified between-session functional reproducibility of rfMRI for 20 patients with stable MS and 14 healthy controls (HC). Nine rfMRI networks (RSNs) were examined at baseline and after 3 months of follow-up: three visual RSNs, the default-mode network, sensorimotor-, auditory-, executive control, and the left and right fronto-parietal RSN. ROI analyses were constrained to thresholded overlap masks for each individual (Z>0) at baseline and follow-up. In both stable MS and HC, mean functional connectivity across the entire network did not reach acceptable ICCs for several networks (ICC<0.40), but we found a high reproducibility of ROI ICCs and of the ratio of overlap. ROI ICCs of all nine networks were between 0.98 and 0.99 for HC and ranged from 0.88 to 0.99 in patients with MS, respectively. The ratio of overlap for all networks was similar for both groups, ranging from 0.60 to 0.75. Our findings attest to a high reproducibility of rfMRI networks not only in HC but also in patients with stable MS when applying ROI analysis. This supports the utility of rfMRI to monitor functional changes related to disease progression or therapeutic interventions in MS. PMID:27007237
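
    The two reproducibility measures used above, ICC(3,1) on mean connectivity values and a ratio of overlap between thresholded maps, can be sketched in a few lines. The ICC follows the Shrout and Fleiss two-way mixed formulation; the overlap ratio is computed here as intersection over union, which may differ from the exact definition used in the study, and all data below are synthetic.

```python
# Hedged sketch of ICC(3,1) for an (n subjects x k sessions) matrix and an
# intersection-over-union overlap ratio for two thresholded Z maps. Synthetic data.
import numpy as np

def icc_3_1(data):
    """ICC(3,1), Shrout & Fleiss two-way mixed model, single measurement."""
    n, k = data.shape
    grand = data.mean()
    ss_subj = k * np.sum((data.mean(axis=1) - grand) ** 2)
    ss_sess = n * np.sum((data.mean(axis=0) - grand) ** 2)
    ss_err = np.sum((data - grand) ** 2) - ss_subj - ss_sess
    ms_subj = ss_subj / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_subj - ms_err) / (ms_subj + (k - 1) * ms_err)

def overlap_ratio(map_a, map_b, thresh=0.0):
    """Intersection-over-union of two maps thresholded at `thresh`."""
    a, b = map_a > thresh, map_b > thresh
    return np.logical_and(a, b).sum() / np.logical_or(a, b).sum()

rng = np.random.default_rng(2)
subject_effect = rng.normal(size=(20, 1))
sessions = subject_effect + 0.3 * rng.normal(size=(20, 2))  # 20 subjects x 2 sessions
print(icc_3_1(sessions))                                    # high ICC (~0.9)

z1 = rng.normal(size=1000)
z2 = z1 + 0.5 * rng.normal(size=1000)                       # two correlated "Z maps"
print(overlap_ratio(z1, z2))
```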

  6. Reproducibility of self-paced treadmill performance of trained endurance runners.

    PubMed

    Schabort, E J; Hopkins, W G; Hawley, J A

    1998-01-01

    The reproducibility of performance in a laboratory test impacts on the statistical power of that test to detect changes of performance in experiments. The purpose of this study was to determine the reproducibility of performance of distance runners completing a 60 min time trial (TT) on a motor-driven treadmill. Eight trained distance runners (age 27 +/- 7 yrs, peak oxygen consumption [VO2peak] 66 +/- 5 ml x min(-1) x kg(-1), mean +/- SD) performed the TT on three occasions separated by 7-10 days. Throughout each TT the runners controlled the speed of the treadmill and could view current speed and elapsed time, but they did not know the elapsed or final distance. On the basis of heart-rate, it is estimated that the subjects ran at an average intensity equivalent to 80-83% of VO2peak. The distance run in 1 h did not vary substantially between trials (16.2 +/- 1.4 km, 15.9 +/- 1.4 km, and 16.1 +/- 1.2 km for TTs 1-3 respectively, p = 0.5). The coefficient of variation (CV) for individual runners was 2.7% (95% CI = 1.8-4.0%) and the test-retest reliability expressed as an intraclass correlation coefficient was 0.90 (95% CI = 0.72-0.98). Reproducibility of performance in this test was therefore acceptable. However, higher reproducibility is required for experimental studies aimed at detecting the smallest worthwhile changes in performance with realistic sample sizes. PMID:9506800

  7. Reusable plasmonic aptasensors: using a single nanoparticle to establish a calibration curve and to detect analytes.

    PubMed

    Guo, Longhua; Kim, Dong-Hwan

    2011-07-01

    We demonstrate plasmonic aptasensors that allow a single nanoparticle (NP) to generate a calibration curve and to detect analytes. The proposed reusable aptasensors have significant advantages over conventional single-NP based assays in terms of sensitivity and reproducibility.

  8. Acceptability of bio-engineered vaccines.

    PubMed

    Danner, K

    1997-01-01

    For hundreds of years, bacterial and viral vaccines have been, in a way, bioengineered and were generally well received by the public, the authorities, and the medical profession. Today, additional tools, e.g. molecular biology, enable new approaches to the development of better and safer products. Various vaccines derived from gene technology have now been licensed for commercial use and are acknowledged within the scientific community. Acceptance by the public and by politicians is, however, negatively influenced by the discussions encompassing gene manipulation in man and animals, transgenic plants, and "novel food". Lack of information leads to confusion and fear. Concurrently, the absence of spectacular and life-threatening epidemics limits the perceived value of immune prophylaxis and its benefits. Scientists in institutes and industry are in a position to stimulate acceptability of bio-engineered vaccines by following some simple rules: (1) adherence to the principles of safety; (2) establishment of analytical and control methods; (3) well-functioning regulatory and reporting systems; (4) demonstration of usefulness and economic benefits; (5) open communication; and (6) correct and prudent wording. PMID:9023035

  9. Investigations of Some Liquid Matrixes for Analyte Quantification by MALDI

    NASA Astrophysics Data System (ADS)

    Moon, Jeong Hee; Park, Kyung Man; Ahn, Sung Hee; Lee, Seong Hoon; Kim, Myung Soo

    2015-06-01

    Sample inhomogeneity is one of the obstacles preventing the generation of reproducible mass spectra by MALDI and to their use for the purpose of analyte quantification. As a potential solution to this problem, we investigated MALDI with some liquid matrixes prepared by nonstoichiometric mixing of acids and bases. Out of 27 combinations of acids and bases, liquid matrixes could be produced from seven. When the overall spectral features were considered, two liquid matrixes using α-cyano-4-hydroxycinnamic acid as the acid and 3-aminoquinoline and N,N-diethylaniline as bases were the best choices. In our previous study of MALDI with solid matrixes, we found that three requirements had to be met for the generation of reproducible spectra and for analyte quantification: (1) controlling the temperature by fixing the total ion count, (2) plotting the analyte-to-matrix ion ratio versus the analyte concentration as the calibration curve, and (3) keeping the matrix suppression below a critical value. We found that the same requirements had to be met in MALDI with liquid matrixes as well. In particular, although the liquid matrixes tested here were homogeneous, they failed to display spot-to-spot spectral reproducibility unless the first requirement above was met. We also found that analyte-derived ions could not be produced efficiently by MALDI with the above liquid matrixes unless the analyte was sufficiently basic. In this sense, MALDI processes with solid and liquid matrixes should be regarded as complementary techniques rather than as competing ones.
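
    Requirement (2) above, plotting the analyte-to-matrix ion ratio against analyte concentration as the calibration curve, amounts to a simple fit-and-invert step. The concentrations, ion ratios, and the assumption of a straight-line response below are illustrative only; the working range and linearity have to be established experimentally.

```python
# Hedged sketch: calibrate the analyte/matrix ion ratio against concentration,
# then invert the line to estimate an unknown. All numbers are invented.
import numpy as np

conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0])      # analyte concentration (arbitrary units)
ratio = np.array([0.05, 0.11, 0.26, 0.49, 1.02])  # analyte-to-matrix ion ratio

slope, intercept = np.polyfit(conc, ratio, 1)     # calibration line

def estimate_concentration(measured_ratio):
    """Invert the calibration line for an unknown sample's ion ratio."""
    return (measured_ratio - intercept) / slope

print(estimate_concentration(0.30))               # ~6 concentration units
```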

  10. Reproducibility of the Structural Connectome Reconstruction across Diffusion Methods.

    PubMed

    Prčkovska, Vesna; Rodrigues, Paulo; Puigdellivol Sanchez, Ana; Ramos, Marc; Andorra, Magi; Martinez-Heras, Eloy; Falcon, Carles; Prats-Galino, Albert; Villoslada, Pablo

    2016-01-01

    Analysis of the structural connectomes can lead to powerful insights about the brain's organization and damage. However, the accuracy and reproducibility of constructing the structural connectome done with different acquisition and reconstruction techniques is not well defined. In this work, we evaluated the reproducibility of the structural connectome techniques by performing test-retest (same day) and longitudinal studies (after 1 month) as well as analyzing graph-based measures on the data acquired from 22 healthy volunteers (6 subjects were used for the longitudinal study). We compared connectivity matrices and tract reconstructions obtained with the most typical acquisition schemes used in clinical application: diffusion tensor imaging (DTI), high angular resolution diffusion imaging (HARDI), and diffusion spectrum imaging (DSI). We observed that all techniques showed high reproducibility in the test-retest analysis (correlation >.9). However, HARDI was the only technique with low variability (2%) in the longitudinal assessment (1-month interval). The intraclass coefficient analysis showed the highest reproducibility for the DTI connectome, however, with more sparse connections than HARDI and DSI. Qualitative (neuroanatomical) assessment of selected tracts confirmed the quantitative results showing that HARDI managed to detect most of the analyzed fiber groups and fanning fibers. In conclusion, we found that HARDI acquisition showed the most balanced trade-off between high reproducibility of the connectome, higher rate of path detection and of fanning fibers, and intermediate acquisition times (10-15 minutes), although at the cost of higher appearance of aberrant fibers. PMID:26464179

  11. Using prediction markets to estimate the reproducibility of scientific research.

    PubMed

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A; Johannesson, Magnus

    2015-12-15

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants' individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a "statistically significant" finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications.

  12. Reproducibility of regional brain metabolic responses to lorazepam

    SciTech Connect

    Wang, G.J.; Volkow, N.D.; Overall, J. |

    1996-10-01

    Changes in regional brain glucose metabolism in response to benzodiazepine agonists have been used as indicators of benzodiazepine-GABA receptor function. The purpose of this study was to assess the reproducibility of these responses. Sixteen healthy right-handed men underwent scanning with PET and [18F]fluorodeoxyglucose (FDG) twice: before placebo and before lorazepam (30 µg/kg). The same double FDG procedure was repeated 6-8 wk later on the same men to assess test-retest reproducibility. The regional absolute brain metabolic values obtained during the second evaluation were significantly lower than those obtained from the first evaluation regardless of condition (p ≤ 0.001). Lorazepam significantly and consistently decreased whole-brain metabolism; the magnitude and regional pattern of the changes were comparable for both studies (12.3% ± 6.9% and 13.7% ± 7.4%). Lorazepam effects were largest in the thalamus (22.2% ± 8.6% and 22.4% ± 6.9%) and occipital cortex (19% ± 8.9% and 21.8% ± 8.9%). Relative metabolic measures were highly reproducible for both the pharmacologic and replication conditions. This study measured the test-retest reproducibility of regional brain metabolic responses, and although the global and regional metabolic values were significantly lower for the repeated evaluation, the response to lorazepam was highly reproducible.

  13. Using prediction markets to estimate the reproducibility of scientific research

    PubMed Central

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A.; Johannesson, Magnus

    2015-01-01

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants’ individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a “statistically significant” finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications. PMID:26553988

  14. Self-reproducing systems: structure, niche relations and evolution.

    PubMed

    Sharov, A A

    1991-01-01

    A formal definition of a self-reproducing system is proposed using Petri nets. A potential self-reproducing system is a set of places in the Petri net such that the number of tokens in each place increases due to some sequence of internal transitions (a transition is called internal to the marked subset of places if at least one of its starting places and one of its terminating places belongs to that subset). An actual self-reproducing system is a system that compensates the outflow of its components by reproduction. In a suitable environment every potential self-reproducing system becomes an actual one. Each Petri net can be considered as an ecosystem with the web of ecological niches bound together with trophic and other relations. The stationary dynamics of the ecosystem is characterized by the set of filled niches. The process of evolution is described in terms of niche composition change. Perspectives of the theory of self-reproducing systems in biology are discussed.
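
    A minimal sketch of the definition above, assuming a toy Petri net: a transition is internal to a subset of places if at least one of its input places and one of its output places lie in the subset, and a potential self-reproducing system is a subset whose places all gain tokens under some firing sequence of its internal transitions. The net, the subset, and the firing sequence below are illustrative, and the search over firing sequences required by the full definition is omitted.

        from collections import Counter

        # Toy Petri net: transition name -> (input places, output places)
        transitions = {
            "t1": ({"A"}, {"A", "B"}),   # A catalyses production of B
            "t2": ({"B"}, {"A"}),        # B is converted back into A
            "t3": ({"B"}, {"C"}),        # outflow leaving the subset
        }

        def internal_transitions(subset):
            """Transitions with at least one input and one output place in `subset`."""
            return {t for t, (ins, outs) in transitions.items()
                    if ins & subset and outs & subset}

        def net_token_gain(subset, firing_sequence):
            """Net change in token counts on `subset` after firing the given sequence."""
            gain = Counter({p: 0 for p in subset})
            for t in firing_sequence:
                ins, outs = transitions[t]
                for p in ins & subset:
                    gain[p] -= 1
                for p in outs & subset:
                    gain[p] += 1
            return dict(gain)

        S = {"A", "B"}
        print(internal_transitions(S))                  # {'t1', 't2'}; t3 leads out of the subset
        print(net_token_gain(S, ["t1", "t1", "t2"]))    # every place in S gains tokens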

  15. Respiratory effort correction strategies to improve the reproducibility of lung expansion measurements

    SciTech Connect

    Du, Kaifang; Reinhardt, Joseph M.; Christensen, Gary E.; Ding, Kai; Bayouth, John E.

    2013-12-15

    Purpose: Four-dimensional computed tomography (4DCT) can be used to make measurements of pulmonary function longitudinally. The sensitivity of such measurements to identify change depends on measurement uncertainty. Previously, intrasubject reproducibility of Jacobian-based measures of lung tissue expansion was studied in two repeat prior-RT 4DCT human acquisitions. Differences in respiratory effort, such as breathing amplitude and frequency, may affect longitudinal function assessment. In this study, the authors present normalization schemes that correct ventilation images for variations in respiratory effort and assess the improvement in reproducibility after effort correction. Methods: Repeat 4DCT image data acquired within a short time interval from 24 patients prior to radiation therapy (RT) were used for this analysis. Using a tissue volume preserving deformable image registration algorithm, Jacobian ventilation maps from the two scanning sessions were computed and compared on the same coordinates for reproducibility analysis. In addition to computing the ventilation maps from end expiration to end inspiration, the authors investigated effort normalization strategies that use other intermediate inspiration phases, based on the principles of equivalent tidal volume (ETV) and equivalent lung volume (ELV). Scatter plots and mean square error of the repeat ventilation maps and the Jacobian ratio map were generated for four conditions: no effort correction, global normalization, ETV, and ELV. In addition, a gamma pass rate was calculated from a modified gamma index evaluation between the two ventilation maps, using acceptance criteria of 2 mm distance-to-agreement and 5% ventilation difference. Results: The pattern of regional pulmonary ventilation changes as lung volume changes. All effort correction strategies improved reproducibility when changes in respiratory effort were greater than 150 cc (p < 0.005 with regard to the gamma pass rate). Improvement of reproducibility was
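
    A brute-force sketch of a gamma-index comparison of two ventilation maps with the 2 mm / 5% criteria mentioned above; the voxel size, map values, and search window are illustrative assumptions, and the authors' actual implementation may differ.

        import numpy as np

        def gamma_pass_rate(v1, v2, voxel_mm=2.0, dta_mm=2.0, dv=0.05):
            """Fraction of voxels whose gamma index (distance-to-agreement plus
            ventilation-difference criterion) is <= 1 when comparing v1 to v2."""
            radius = int(np.ceil(dta_mm / voxel_mm))
            passed, total = 0, 0
            for idx in np.ndindex(v1.shape):
                best = np.inf
                for off in np.ndindex((2 * radius + 1,) * 3):
                    d = np.array(off) - radius
                    j = np.array(idx) + d
                    if np.any(j < 0) or np.any(j >= v1.shape):
                        continue
                    dist_term = np.sum((d * voxel_mm) ** 2) / dta_mm ** 2
                    diff_term = (v1[idx] - v2[tuple(j)]) ** 2 / dv ** 2
                    best = min(best, dist_term + diff_term)
                passed += best <= 1.0
                total += 1
            return passed / total

        # Illustrative Jacobian-like maps: second session = first plus small noise
        rng = np.random.default_rng(1)
        vent1 = rng.uniform(0.9, 1.3, size=(12, 12, 12))
        vent2 = vent1 + rng.normal(0, 0.02, vent1.shape)
        print(f"gamma pass rate: {gamma_pass_rate(vent1, vent2):.1%}")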

  16. 21 CFR 530.40 - Safe levels and availability of analytical methods.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 6 2010-04-01 false Safe levels and availability of analytical methods. (a) In accordance with § 530.22, the following safe ... In accordance with § 530.22, the following analytical methods have been accepted by FDA: ...

  17. Analytical Chemistry in Russia.

    PubMed

    Zolotov, Yuri

    2016-09-01

    Research in Russian analytical chemistry (AC) is carried out on a significant scale, and the analytical service solves practical tasks of geological survey, environmental protection, medicine, industry, agriculture, etc. The education system trains highly skilled professionals in AC. The development and especially the manufacturing of analytical instruments should be improved; in spite of this, there are several good domestic instruments, and others satisfy some requirements. Russian AC has rather good historical roots.

  18. From requirements to acceptance tests

    NASA Technical Reports Server (NTRS)

    Baize, Lionel; Pasquier, Helene

    1993-01-01

    From user requirements definition to the accepted software system, software project management wants to be sure that the system will meet the requirements. For the development of a telecommunications satellite Control Centre, C.N.E.S. has used new rules to make the use of the tracing matrix easier. From requirements to acceptance tests, each item of a document must have an identifier. A unique matrix traces the system and allows tracking the consequences of a change in the requirements. A tool has been developed to import documents into a relational database. Each record of the database corresponds to an item of a document; the access key is the item identifier. The tracing matrix is also processed, automatically providing links between the different documents. It enables reading traced items on the same screen. For example, one can read simultaneously the User Requirements items, the corresponding Software Requirements items, and the Acceptance Tests.

  19. Benchmarking contactless acquisition sensor reproducibility for latent fingerprint trace evidence

    NASA Astrophysics Data System (ADS)

    Hildebrandt, Mario; Dittmann, Jana

    2015-03-01

    Optical, nanometer-range, contactless, non-destructive sensor devices are promising acquisition techniques in crime scene trace forensics, e.g. for digitizing latent fingerprint traces. Before new approaches are introduced in crime investigations, innovations need to be positively tested and quality assured. In this paper we investigate sensor reproducibility by studying different scans from four sensors: two chromatic white light sensors (CWL600/CWL1mm), one confocal laser scanning microscope, and one NIR/VIS/UV reflection spectrometer. First, we perform intra-sensor reproducibility testing for the CWL600 with a privacy-conforming test set of artificial-sweat-printed, computer-generated fingerprints. We use 24 different fingerprint patterns as original samples (printing samples/templates) for printing with artificial sweat (physical trace samples) and their acquisition with the contactless sensor, resulting in 96 sensor images, called scans or acquired samples. The second test set, for inter-sensor reproducibility assessment, consists of the first three patterns from the first test set, acquired in two consecutive scans using each device. We suggest using a simple feature set in the spatial and frequency domains, known from signal processing, and test its suitability with six different classifiers that classify scan data into small differences (reproducible) and large differences (non-reproducible). Furthermore, we compare the classification results with biometric verification scores (calculated with NBIS, with a threshold of 40) as a biometric reproducibility score. The Bagging classifier is the most reliable classifier in nearly all cases in our experiments, and the results are also confirmed by the biometric matching rates.
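
    The general classification idea, sketched with illustrative data rather than the paper's scans: compute a few spatial- and frequency-domain difference features for a pair of acquisitions and train a Bagging classifier to separate reproducible from non-reproducible pairs. Feature choice, image sizes, and noise levels are assumptions, not the authors' pipeline.

        import numpy as np
        from sklearn.ensemble import BaggingClassifier

        def scan_pair_features(a, b):
            """Simple spatial- and frequency-domain difference features for two scans."""
            diff = a - b
            spec = np.abs(np.fft.fft2(a)) - np.abs(np.fft.fft2(b))
            return [diff.mean(), diff.std(), np.abs(diff).max(), spec.mean(), spec.std()]

        # Illustrative training pairs: label 1 = small differences (reproducible)
        rng = np.random.default_rng(2)
        X, y = [], []
        for _ in range(60):
            base = rng.random((64, 64))
            X.append(scan_pair_features(base, base + rng.normal(0, 0.01, base.shape)))
            y.append(1)
            X.append(scan_pair_features(base, base + rng.normal(0, 0.30, base.shape)))
            y.append(0)

        clf = BaggingClassifier(n_estimators=25, random_state=0).fit(X, y)
        print("training accuracy:", clf.score(X, y))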

  20. Language-Agnostic Reproducible Data Analysis Using Literate Programming

    PubMed Central

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators to the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir. PMID:27711123
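
    A minimal sketch of the literate-programming mechanism the article relies on: the code chunks are extracted ("tangled") from the literate source so that only they are executed, while the explanatory narrative stays alongside them in the same document. The noweb-style chunk markers used here are illustrative and are not Lir's actual syntax.

        import re

        LITERATE_SOURCE = """\
        We first load the measurements and keep complete rows only.

        <<load-data>>=
        import pandas as pd
        data = pd.read_csv("measurements.csv").dropna()
        @

        Then we summarise the column of interest.

        <<summarise>>=
        print(data["value"].describe())
        @
        """

        def tangle(literate_text):
            """Extract the named code chunks (<<name>>= ... @) in order of appearance."""
            chunks = re.findall(r"<<(.+?)>>=\n(.*?)\n\s*@", literate_text, flags=re.S)
            return "\n".join(body for _, body in chunks)

        print(tangle(LITERATE_SOURCE))   # the executable program, without the narrative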

  1. Science Update: Analytical Chemistry.

    ERIC Educational Resources Information Center

    Worthy, Ward

    1980-01-01

    Briefly discusses new instrumentation in the field of analytical chemistry. Advances in liquid chromatography, photoacoustic spectroscopy, the use of lasers, and mass spectrometry are also discussed. (CS)

  2. Single-analyte to multianalyte fluorescence sensors

    NASA Astrophysics Data System (ADS)

    Lavigne, John J.; Metzger, Axel; Niikura, Kenichi; Cabell, Larry A.; Savoy, Steven M.; Yoo, J. S.; McDevitt, John T.; Neikirk, Dean P.; Shear, Jason B.; Anslyn, Eric V.

    1999-05-01

    The rational design of small molecules for the selective complexation of analytes has reached a level of sophistication such that there exists a high degree of prediction. An effective strategy for transforming these hosts into sensors involves covalently attaching a fluorophore to the receptor which displays some fluorescence modulation when analyte is bound. Competition methods, such as those used with antibodies, are also amenable to these synthetic receptors, yet there are few examples. In our laboratories, the use of common dyes in competition assays with small molecules has proven very effective. For example, an assay for citrate in beverages and an assay for the secondary messenger IP3 in cells have been developed. Another approach we have explored focuses on multi-analyte sensor arrays that attempt to mimic the mammalian sense of taste. Our system utilizes polymer resin beads with the desired sensors covalently attached. These functionalized microspheres are then immobilized into micromachined wells on a silicon chip, thereby creating our taste buds. Exposure of the resin to analyte causes a change in the transmittance of the bead. This change can be fluorescent or colorimetric. Optical interrogation of the microspheres, by illuminating from one side of the wafer and collecting the signal on the other, results in an image. These data streams are collected using a CCD camera, which creates red, green and blue (RGB) patterns that are distinct and reproducible for their environments. Analysis of these data can identify and quantify the analytes present.

  3. Next-generation sequencing data interpretation: enhancing reproducibility and accessibility.

    PubMed

    Nekrutenko, Anton; Taylor, James

    2012-09-01

    Areas of life sciences research that were previously distant from each other in ideology, analysis practices and toolkits, such as microbial ecology and personalized medicine, have all embraced techniques that rely on next-generation sequencing instruments. Yet the capacity to generate the data greatly outpaces our ability to analyse it. Existing sequencing technologies are more mature and accessible than the methodologies that are available for individual researchers to move, store, analyse and present data in a fashion that is transparent and reproducible. Here we discuss currently pressing issues with analysis, interpretation, reproducibility and accessibility of these data, and we present promising solutions and venture into potential future developments.

  4. Internal R and D task summary report: analytical methods development

    SciTech Connect

    Schweighardt, F.K.

    1983-07-01

    International Coal Refining Company (ICRC) conducted two research programs to develop analytical procedures for characterizing the feed, intermediates,and products of the proposed SRC-I Demonstration Plant. The major conclusion is that standard analytical methods must be defined and assigned statistical error limits of precision and reproducibility early in development. Comparing all SRC-I data or data from different processes is complex and expensive if common data correlation procedures are not followed. ICRC recommends that processes be audited analytically and statistical analyses generated as quickly as possible, in order to quantify process-dependent and -independent variables. 16 references, 10 figures, 20 tables.

  5. pH-Triggered Molecular Alignment for Reproducible SERS Detection via an AuNP/Nanocellulose Platform

    NASA Astrophysics Data System (ADS)

    Wei, Haoran; Vikesland, Peter J.

    2015-12-01

    The low affinity of neutral and hydrophobic molecules towards noble metal surfaces hinders their detection by surface-enhanced Raman spectroscopy (SERS). Herein, we present a method to enhance gold nanoparticle (AuNP) surface affinity by lowering the suspension pH below the analyte pKa. We developed an AuNP/bacterial cellulose (BC) nanocomposite platform and applied it to two common pollutants, carbamazepine (CBZ) and atrazine (ATZ) with pKa values of 2.3 and 1.7, respectively. Simple mixing of the analytes with AuNP/BC at pH < pKa resulted in consistent electrostatic alignment of the CBZ and ATZ molecules across the nanocomposite and highly reproducible SERS spectra. Limits of detection of 3 nM and 11 nM for CBZ and ATZ, respectively, were attained. Tests with additional analytes (melamine, 2,4-dichloroaniline, 4-chloroaniline, 3-bromoaniline, and 3-nitroaniline) further illustrate that the AuNP/BC platform provides reproducible analyte detection and quantification while avoiding the uncontrolled aggregation and flocculation of AuNPs that often hinder low pH detection.

  6. pH-Triggered Molecular Alignment for Reproducible SERS Detection via an AuNP/Nanocellulose Platform

    PubMed Central

    Wei, Haoran; Vikesland, Peter J.

    2015-01-01

    The low affinity of neutral and hydrophobic molecules towards noble metal surfaces hinders their detection by surface-enhanced Raman spectroscopy (SERS). Herein, we present a method to enhance gold nanoparticle (AuNP) surface affinity by lowering the suspension pH below the analyte pKa. We developed an AuNP/bacterial cellulose (BC) nanocomposite platform and applied it to two common pollutants, carbamazepine (CBZ) and atrazine (ATZ) with pKa values of 2.3 and 1.7, respectively. Simple mixing of the analytes with AuNP/BC at pH < pKa resulted in consistent electrostatic alignment of the CBZ and ATZ molecules across the nanocomposite and highly reproducible SERS spectra. Limits of detection of 3 nM and 11 nM for CBZ and ATZ, respectively, were attained. Tests with additional analytes (melamine, 2,4-dichloroaniline, 4-chloroaniline, 3-bromoaniline, and 3-nitroaniline) further illustrate that the AuNP/BC platform provides reproducible analyte detection and quantification while avoiding the uncontrolled aggregation and flocculation of AuNPs that often hinder low pH detection. PMID:26658696

  7. pH-Triggered Molecular Alignment for Reproducible SERS Detection via an AuNP/Nanocellulose Platform.

    PubMed

    Wei, Haoran; Vikesland, Peter J

    2015-01-01

    The low affinity of neutral and hydrophobic molecules towards noble metal surfaces hinders their detection by surface-enhanced Raman spectroscopy (SERS). Herein, we present a method to enhance gold nanoparticle (AuNP) surface affinity by lowering the suspension pH below the analyte pKa. We developed an AuNP/bacterial cellulose (BC) nanocomposite platform and applied it to two common pollutants, carbamazepine (CBZ) and atrazine (ATZ) with pKa values of 2.3 and 1.7, respectively. Simple mixing of the analytes with AuNP/BC at pH < pKa resulted in consistent electrostatic alignment of the CBZ and ATZ molecules across the nanocomposite and highly reproducible SERS spectra. Limits of detection of 3 nM and 11 nM for CBZ and ATZ, respectively, were attained. Tests with additional analytes (melamine, 2,4-dichloroaniline, 4-chloroaniline, 3-bromoaniline, and 3-nitroaniline) further illustrate that the AuNP/BC platform provides reproducible analyte detection and quantification while avoiding the uncontrolled aggregation and flocculation of AuNPs that often hinder low pH detection. PMID:26658696

  8. Analytic theories of allometric scaling.

    PubMed

    Agutter, Paul S; Tuszynski, Jack A

    2011-04-01

    During the 13 years since it was first advanced, the fractal network theory (FNT), an analytic theory of allometric scaling, has been subjected to a wide range of methodological, mathematical and empirical criticisms, not all of which have been answered satisfactorily. FNT presumes a two-variable power-law relationship between metabolic rate and body mass. This assumption has been widely accepted in the past, but a growing body of evidence during the past quarter century has raised questions about its general validity. There is now a need for alternative theories of metabolic scaling that are consistent with empirical observations over a broad range of biological applications. In this article, we briefly review the limitations of FNT, examine the evidence that the two-variable power-law assumption is invalid, and outline alternative perspectives. In particular, we discuss quantum metabolism (QM), an analytic theory based on molecular-cellular processes. QM predicts the large variations in scaling exponent that are found empirically and also predicts the temperature dependence of the proportionality constant, issues that have eluded models such as FNT that are based on macroscopic and network properties of organisms.

  9. Analytic cognitive style predicts religious and paranormal belief.

    PubMed

    Pennycook, Gordon; Cheyne, James Allan; Seli, Paul; Koehler, Derek J; Fugelsang, Jonathan A

    2012-06-01

    An analytic cognitive style denotes a propensity to set aside highly salient intuitions when engaging in problem solving. We assess the hypothesis that an analytic cognitive style is associated with a history of questioning, altering, and rejecting (i.e., unbelieving) supernatural claims, both religious and paranormal. In two studies, we examined associations of God beliefs, religious engagement (attendance at religious services, praying, etc.), conventional religious beliefs (heaven, miracles, etc.) and paranormal beliefs (extrasensory perception, levitation, etc.) with performance measures of cognitive ability and analytic cognitive style. An analytic cognitive style negatively predicted both religious and paranormal beliefs when controlling for cognitive ability as well as religious engagement, sex, age, political ideology, and education. Participants more willing to engage in analytic reasoning were less likely to endorse supernatural beliefs. Further, an association between analytic cognitive style and religious engagement was mediated by religious beliefs, suggesting that an analytic cognitive style negatively affects religious engagement via lower acceptance of conventional religious beliefs. Results for types of God belief indicate that the association between an analytic cognitive style and God beliefs is more nuanced than mere acceptance and rejection, but also includes adopting less conventional God beliefs, such as Pantheism or Deism. Our data are consistent with the idea that two people who share the same cognitive ability, education, political ideology, sex, age and level of religious engagement can acquire very different sets of beliefs about the world if they differ in their propensity to think analytically.

  10. Unambiguous characterization of analytical markers in complex, seized opiate samples using an enhanced ion mobility trace detector-mass spectrometer.

    PubMed

    Liuni, Peter; Romanov, Vladimir; Binette, Marie-Josée; Zaknoun, Hafid; Tam, Maggie; Pilon, Pierre; Hendrikse, Jan; Wilson, Derek J

    2014-11-01

    Ion mobility spectroscopy (IMS)-based trace-compound detectors (TCDs) are powerful and widely implemented tools for the detection of illicit substances. They combine high sensitivity, reproducibility, rapid analysis time, and resistance to dirt with an acceptable false alarm rate. The analytical specificity of TCD-IMS instruments for a given analyte depends strongly on a detailed knowledge of the ion chemistry involved, as well as the ability to translate this knowledge into field-robust analytical methods. In this work, we introduce an enhanced hybrid TCD-IMS/mass spectrometer (TCD-IMS/MS) that combines the strengths of ion-mobility-based target compound detection with unambiguous identification by tandem MS. Building on earlier efforts along these lines (Kozole et al., Anal. Chem. 2011, 83, 8596-8603), the current instrument is capable of positive and negative-mode analyses with tightly controlled gating between the IMS and MS modules and direct measurement of ion mobility profiles. We demonstrate the unique capabilities of this instrument using four samples of opium seized by the Canada Border Services Agency (CBSA), consisting of a mixture of opioid alkaloids and other naturally occurring compounds typically found in these samples. Although many analytical methods have been developed for analyzing naturally occurring opiates, this is the first detailed ion mobility study on seized opium samples. This work demonstrates all available analytical modes for the new IMS-MS system including "single-gate", "dual-gate", MS/MS, and precursor ion scan methods. Using a combination of these modes, we unambiguously identify all signals in the IMS spectra, including previously uncharacterized minor peaks arising from compounds that are common in raw opium. PMID:25302672

  11. Mass spectrometric assessment and analytical methods for quantitation of the new herbicide aminocyclopyrachlor and its methyl analogue in soil and water.

    PubMed

    Nanita, Sergio C; Pentz, Anne M; Grant, Joann; Vogl, Emily; Devine, Timothy J; Henze, Robert M

    2009-01-15

    Analytical methods have been developed for the detection and quantitation of a new herbicide active ingredient, aminocyclopyrachlor, and its analogue aminocyclopyrachlor methyl in environmental samples. The analytes were purified from soil extracts and water samples using solid phase extraction based on mixed-mode cation exchange/reverse phase retention. Analyte identification and quantitative analyses were performed by high performance liquid chromatography coupled to tandem mass spectrometry by an electrospray ionization source. External standards prepared in neat solvents were used for quantitation, providing acceptable accuracy, with no matrix effects observed during method validation. The method limits of quantitation (LOQ) were 0.10 ng/mL (ppb, parts-per-billion) in water and 1.0 ng/g in soil for both compounds. The limit of detection (LOD) in water was estimated to be 20 ng/L (ppt, parts-per-trillion) for aminocyclopyrachlor and 1 ng/L for aminocyclopyrachlor methyl, while LODs in soil were 100 ng/kg and 10 ng/kg for aminocyclopyrachlor and aminocyclopyrachlor methyl, respectively. The stability of both compounds in various solvents was evaluated as part of method development. Tandem mass spectrometry experiments were also conducted to investigate the gas-phase fragmentation of aminocyclopyrachlor and its methyl analogue, and the results are reported. A statistical analysis of method validation data generated at two laboratories by multiple chemists authenticates the ruggedness and good reproducibility of the analytical procedures tested.

  12. Imaginary Companions and Peer Acceptance

    ERIC Educational Resources Information Center

    Gleason, Tracy R.

    2004-01-01

    Early research on imaginary companions suggests that children who create them do so to compensate for poor social relationships. Consequently, the peer acceptance of children with imaginary companions was compared to that of their peers. Sociometrics were conducted on 88 preschool-aged children; 11 had invisible companions, 16 had personified…

  13. Acceptance of Others (Number Form).

    ERIC Educational Resources Information Center

    Masters, James R.; Laverty, Grace E.

    As part of the instrumentation to assess the effectiveness of the Schools Without Failure (SWF) program in 10 elementary schools in the New Castle, Pa. School District, the Acceptance of Others (Number Form) was prepared to determine pupil's attitudes toward classmates. Given a list of all class members, pupils are asked to circle a number from 1…

  14. W-025, acceptance test report

    SciTech Connect

    Roscha, V.

    1994-10-04

    This acceptance test report (ATR) has been prepared to establish the results of the field testing conducted on W-025 to demonstrate that the electrical/instrumentation systems functioned as intended by design. This is part of the RMW Land Disposal Facility.

  15. Euthanasia Acceptance: An Attitudinal Inquiry.

    ERIC Educational Resources Information Center

    Klopfer, Fredrick J.; Price, William F.

    The study presented was conducted to examine potential relationships between attitudes regarding the dying process, including acceptance of euthanasia, and other attitudinal or demographic attributes. The data of the survey was comprised of responses given by 331 respondents to a door-to-door interview. Results are discussed in terms of preferred…

  16. Helping Our Children Accept Themselves.

    ERIC Educational Resources Information Center

    Gamble, Mae

    1984-01-01

    Parents of a child with muscular dystrophy recount their reactions to learning of the diagnosis, their gradual acceptance, and their son's resistance, which was gradually lessened when he was provided with more information and treated more normally as a member of the family. (CL)

  17. Acceptance and Commitment Therapy: Introduction

    ERIC Educational Resources Information Center

    Twohig, Michael P.

    2012-01-01

    This is the introductory article to a special series in Cognitive and Behavioral Practice on Acceptance and Commitment Therapy (ACT). Instead of each article herein reviewing the basics of ACT, this article contains that review. This article provides a description of where ACT fits within the larger category of cognitive behavior therapy (CBT):…

  18. Who accepts first aid training?

    PubMed

    Pearn, J; Dawson, B; Leditschke, F; Petrie, G; Nixon, J

    1980-09-01

    The percentage of individuals trained in first aid skills in the general community is inadequate. We report here a study to investigate factors which influence motivation to accept voluntary training in first aid. A group of 700 randomly selected owners of inground swimming pools (a parental high-risk group) was offered a course of formal first aid instruction. Nine per cent attended the offered training course. The time commitment involved in traditional courses (eight training nights spread over four weeks) is not a deterrent, the same percentage accepting such courses as accepting a course of one night's instruction. Cost is an important deterrent factor, consumer resistance rising above 15 cost units (one cost unit = the price of a loaf of bread). The level of competent first aid training within the community can be raised by (a) keeping to traditional course content, but (b) ensuring a higher acceptance rate of first aid courses through a new approach to publicity campaigns that convinces prospective students of the real worth of first aid training. Questions concerning who should be taught first aid, and factors influencing motivation, are discussed.

  19. Keeping the Bar Low: Why Russia's Nonresident Fathers Accept Narrow Fatherhood Ideals

    ERIC Educational Resources Information Center

    Utrata, Jennifer

    2008-01-01

    Although most Russian nonresident fathers feel torn between old and new ideals of fatherhood, they end up accepting older, narrow ideals. Fathers reproduce the dominant gender discourse, which deems men irresponsible and infantile and diminishes the importance of fathers. On the basis of extensive fieldwork, including in-depth interviews (N = 21)…

  20. Learning Analytics Considered Harmful

    ERIC Educational Resources Information Center

    Dringus, Laurie P.

    2012-01-01

    This essay is written to present a prospective stance on how learning analytics, as a core evaluative approach, must help instructors uncover the important trends and evidence of quality learner data in the online course. A critique is presented of strategic and tactical issues of learning analytics. The approach to the critique is taken through…

  1. Validating Analytical Methods

    ERIC Educational Resources Information Center

    Ember, Lois R.

    1977-01-01

    The procedures utilized by the Association of Official Analytical Chemists (AOAC) to develop, evaluate, and validate analytical methods for the analysis of chemical pollutants are detailed. Methods validated by AOAC are used by the EPA and FDA in their enforcement programs and are granted preferential treatment by the courts. (BT)

  2. Analytical mass spectrometry

    SciTech Connect

    Not Available

    1990-01-01

    This 43rd Annual Summer Symposium on Analytical Chemistry was held July 24--27, 1990 at Oak Ridge, TN and contained sessions on the following topics: Fundamentals of Analytical Mass Spectrometry (MS), MS in the National Laboratories, Lasers and Fourier Transform Methods, Future of MS, New Ionization and LC/MS Methods, and an extra session. (WET)

  3. Analytical mass spectrometry. Abstracts

    SciTech Connect

    Not Available

    1990-12-31

    This 43rd Annual Summer Symposium on Analytical Chemistry was held July 24--27, 1990 at Oak Ridge, TN and contained sessions on the following topics: Fundamentals of Analytical Mass Spectrometry (MS), MS in the National Laboratories, Lasers and Fourier Transform Methods, Future of MS, New Ionization and LC/MS Methods, and an extra session. (WET)

  4. Extreme Scale Visual Analytics

    SciTech Connect

    Wong, Pak C.; Shen, Han-Wei; Pascucci, Valerio

    2012-05-08

    Extreme-scale visual analytics (VA) is about applying VA to extreme-scale data. The articles in this special issue examine advances related to extreme-scale VA problems, their analytical and computational challenges, and their real-world applications.

  5. Signals: Applying Academic Analytics

    ERIC Educational Resources Information Center

    Arnold, Kimberly E.

    2010-01-01

    Academic analytics helps address the public's desire for institutional accountability with regard to student success, given the widespread concern over the cost of higher education and the difficult economic and budgetary conditions prevailing worldwide. Purdue University's Signals project applies the principles of analytics widely used in…

  6. Teaching the Analytical Life

    ERIC Educational Resources Information Center

    Jackson, Brian

    2010-01-01

    Using a survey of 138 writing programs, I argue that we must be more explicit about what we think students should get out of analysis to make it more likely that students will transfer their analytical skills to different settings. To ensure our students take analytical skills with them at the end of the semester, we must simplify the task we…

  7. ReproPhylo: An Environment for Reproducible Phylogenomics

    PubMed Central

    Szitenberg, Amir; John, Max; Blaxter, Mark L.; Lunt, David H.

    2015-01-01

    The reproducibility of experiments is key to the scientific process, and particularly necessary for accurate reporting of analyses in data-rich fields such as phylogenomics. We present ReproPhylo, a phylogenomic analysis environment developed to ensure experimental reproducibility, to facilitate the handling of large-scale data, and to assist methodological experimentation. Reproducibility, and instantaneous repeatability, is built in to the ReproPhylo system and does not require user intervention or configuration because it stores the experimental workflow as a single, serialized Python object containing explicit provenance and environment information. This ‘single file’ approach ensures the persistence of provenance across iterations of the analysis, with changes automatically managed by the version control program Git. This file, along with a Git repository, are the primary reproducibility outputs of the program. In addition, ReproPhylo produces an extensive human-readable report and generates a comprehensive experimental archive file, both of which are suitable for submission with publications. The system facilitates thorough experimental exploration of both parameters and data. ReproPhylo is a platform independent CC0 Python module and is easily installed as a Docker image or a WinPython self-sufficient package, with a Jupyter Notebook GUI, or as a slimmer version in a Galaxy distribution. PMID:26335558
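
    A minimal sketch of the general pattern described above, not ReproPhylo's actual API: the complete analysis state lives in a single serialized Python object, provenance is appended on every save, and each iteration is committed to Git. File names, the workflow fields, and the git invocation are assumptions.

        import pickle, platform, subprocess, sys
        from datetime import datetime, timezone

        def save_and_commit(state, path="experiment.pkl", message="update analysis"):
            """Serialize the full analysis state and record the change in Git."""
            state["provenance"].append({
                "timestamp": datetime.now(timezone.utc).isoformat(),
                "python": sys.version.split()[0],
                "platform": platform.platform(),
            })
            with open(path, "wb") as fh:
                pickle.dump(state, fh)
            subprocess.run(["git", "add", path], check=True)
            subprocess.run(["git", "commit", "-m", message], check=True)

        # Illustrative workflow object: parameters, inputs, results and provenance together
        workflow = {
            "alignment_params": {"program": "mafft", "options": "--auto"},
            "input_files": ["loci.fasta"],
            "results": {},
            "provenance": [],
        }
        # save_and_commit(workflow, message="initial alignment settings")  # requires a Git repo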

  8. The reproducibility of dipping status: beyond the cutoff points.

    PubMed

    Chaves, Hilton; Campello de Souza, Fernando Menezes; Krieger, Eduardo Moacyr

    2005-08-01

    A limited reproducibility has been ascribed to 24-h ambulatory blood pressure monitoring, especially in relation to the dipper and nondipper phenomena. This study examined the reproducibility of 24-h ambulatory blood pressure monitoring in three recordings of pressure at intervals of 8-15 days in 101 study participants (73% treated hypertensive patients) residing in the city of Recife, Pernambuco, Brazil. SpaceLabs 90207 monitors were used, and the minimum number of valid measurements was 80. No significant differences were found between the mean systolic and diastolic pressures, between the second and third recordings when the normotensive and hypertensive patients were assessed jointly (P=0.44). Likewise, no significant differences were present when the normotensive patients were analyzed separately (P=0.96). In the hypertensive group, a significant difference existed between only the first and second ambulatory blood pressure readings (135.1 vs. 132.9 mmHg, respectively; P=0.0005). Regarding declines in pressure during sleep, no significant differences occurred when continuous percentage values were considered (P=0.27). The values obtained from 24-h ambulatory blood pressure monitoring are reproducible when tested at intervals of 8-15 days. Small differences, when significantly present, always involved the first ambulatory blood pressure monitoring. The reproducibility of the dipper and nondipper patterns is of greater complexity because it considers cutoff points rather than continuous ones to characterize these states.

  9. Artificially reproduced image of earth photographed by UV camera

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A reproduction of a color enhancement of a picture photographed in far-ultraviolet light by Astronaut John W. Young, Apollo 16 commander, showing the Earth. Note this is an artificially reproduced image. The three auroral belts, the sunlit atmosphere and background stars are visible.

  10. ReproPhylo: An Environment for Reproducible Phylogenomics.

    PubMed

    Szitenberg, Amir; John, Max; Blaxter, Mark L; Lunt, David H

    2015-09-01

    The reproducibility of experiments is key to the scientific process, and particularly necessary for accurate reporting of analyses in data-rich fields such as phylogenomics. We present ReproPhylo, a phylogenomic analysis environment developed to ensure experimental reproducibility, to facilitate the handling of large-scale data, and to assist methodological experimentation. Reproducibility, and instantaneous repeatability, is built in to the ReproPhylo system and does not require user intervention or configuration because it stores the experimental workflow as a single, serialized Python object containing explicit provenance and environment information. This 'single file' approach ensures the persistence of provenance across iterations of the analysis, with changes automatically managed by the version control program Git. This file, along with a Git repository, are the primary reproducibility outputs of the program. In addition, ReproPhylo produces an extensive human-readable report and generates a comprehensive experimental archive file, both of which are suitable for submission with publications. The system facilitates thorough experimental exploration of both parameters and data. ReproPhylo is a platform independent CC0 Python module and is easily installed as a Docker image or a WinPython self-sufficient package, with a Jupyter Notebook GUI, or as a slimmer version in a Galaxy distribution. PMID:26335558

  11. Latin America Today: An Atlas of Reproducible Pages. Revised Edition.

    ERIC Educational Resources Information Center

    World Eagle, Inc., Wellesley, MA.

    This document contains reproducible maps, charts and graphs of Latin America for use by teachers and students. The maps are divided into five categories (1) the land; (2) peoples, countries, cities, and governments; (3) the national economies, product, trade, agriculture, and resources; (4) energy, education, employment, illicit drugs, consumer…

  12. Reproducibility of polycarbonate reference material in toxicity evaluation

    NASA Technical Reports Server (NTRS)

    Hilado, C. J.; Huttlinger, P. A.

    1981-01-01

    A specific lot of bisphenol A polycarbonate has been used for almost four years as the reference material for the NASA-USF-PSC toxicity screening test method. The reproducibility of the test results over this period of time indicates that certain plastics may be more suitable reference materials than the more traditional cellulosic materials.

  13. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 2 2010-10-01 false Piping joints (reproduces 110). Section 56.30-3, COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED), MARINE ENGINEERING, PIPING SYSTEMS AND ... joint tightness, mechanical strength and the nature of the fluid handled ...

  14. Slide rule-type color chart predicts reproduced photo tones

    NASA Technical Reports Server (NTRS)

    Griffin, J. D.

    1966-01-01

    Slide rule-type color chart determines the final reproduced gray tones in the production of briefing charts that are photographed in black and white. The chart shows both the color, by drafting-paint manufacturer's name and mixture number, and the gray tone resulting from black and white photographic reproduction.

  15. The reproducibility of dipping status: beyond the cutoff points.

    PubMed

    Chaves, Hilton; Campello de Souza, Fernando Menezes; Krieger, Eduardo Moacyr

    2005-08-01

    A limited reproducibility has been ascribed to 24-h ambulatory blood pressure monitoring, especially in relation to the dipper and nondipper phenomena. This study examined the reproducibility of 24-h ambulatory blood pressure monitoring in three recordings of pressure at intervals of 8-15 days in 101 study participants (73% treated hypertensive patients) residing in the city of Recife, Pernambuco, Brazil. SpaceLabs 90207 monitors were used, and the minimum number of valid measurements was 80. No significant differences were found between the mean systolic and diastolic pressures, between the second and third recordings when the normotensive and hypertensive patients were assessed jointly (P=0.44). Likewise, no significant differences were present when the normotensive patients were analyzed separately (P=0.96). In the hypertensive group, a significant difference existed between only the first and second ambulatory blood pressure readings (135.1 vs. 132.9 mmHg, respectively; P=0.0005). Regarding declines in pressure during sleep, no significant differences occurred when continuous percentage values were considered (P=0.27). The values obtained from 24-h ambulatory blood pressure monitoring are reproducible when tested at intervals of 8-15 days. Small differences, when significantly present, always involved the first ambulatory blood pressure monitoring. The reproducibility of the dipper and nondipper patterns is of greater complexity because it considers cutoff points rather than continuous ones to characterize these states. PMID:16077266

  16. A simple and reproducible breast cancer prognostic test

    PubMed Central

    2013-01-01

    Background A small number of prognostic and predictive tests based on gene expression are currently offered as reference laboratory tests. In contrast to such success stories, a number of flaws and errors have recently been identified in other genomic-based predictors and the success rate for developing clinically useful genomic signatures is low. These errors have led to widespread concerns about the protocols for conducting and reporting of computational research. As a result, a need has emerged for a template for reproducible development of genomic signatures that incorporates full transparency, data sharing and statistical robustness. Results Here we present the first fully reproducible analysis of the data used to train and test MammaPrint, an FDA-cleared prognostic test for breast cancer based on a 70-gene expression signature. We provide all the software and documentation necessary for researchers to build and evaluate genomic classifiers based on these data. As an example of the utility of this reproducible research resource, we develop a simple prognostic classifier that uses only 16 genes from the MammaPrint signature and is equally accurate in predicting 5-year disease free survival. Conclusions Our study provides a prototypic example for reproducible development of computational algorithms for learning prognostic biomarkers in the era of personalized medicine. PMID:23682826

  17. The United States Today: An Atlas of Reproducible Pages.

    ERIC Educational Resources Information Center

    World Eagle, Inc., Wellesley, MA.

    Black and white maps, graphs and tables that may be reproduced are presented in this volume focusing on the United States. Some of the features of the United States depicted are: size, population, agriculture and resources, manufactures, trade, citizenship, employment, income, poverty, the federal budget, energy, health, education, crime, and the…

  18. Reproducibility of Tactile Assessments for Children with Unilateral Cerebral Palsy

    ERIC Educational Resources Information Center

    Auld, Megan Louise; Ware, Robert S.; Boyd, Roslyn Nancy; Moseley, G. Lorimer; Johnston, Leanne Marie

    2012-01-01

    A systematic review identified tactile assessments used in children with cerebral palsy (CP), but their reproducibility is unknown. Sixteen children with unilateral CP and 31 typically developing children (TDC) were assessed 2-4 weeks apart. Test-retest percent agreements within one point for children with unilateral CP (and TDC) were…

  19. Regional cerebral blood flow utilizing the gamma camera and xenon inhalation: reproducibility and clinical applications

    SciTech Connect

    Fox, R.A.; Knuckey, N.W.; Fleay, R.F.; Stokes, B.A.; Van der Schaaf, A.; Surveyor, I.

    1985-11-01

    A modified collimator and standard gamma camera have been used to measure regional cerebral blood flow following inhalation of radioactive xenon. The collimator and a simplified analysis technique enable excellent statistical accuracy to be achieved with acceptable precision in the measurement of grey matter blood flow. The validity of the analysis was supported by computer modelling and patient measurements. Sixty-one patients with subarachnoid hemorrhage, cerebrovascular disease or dementia were retested to determine the reproducibility of our method. The measured coefficient of variation was 6.5%. Of forty-six patients who had a proven subarachnoid hemorrhage, 15 subsequently developed cerebral ischemia. These showed a CBF of 42 ± 6 ml·min⁻¹·(100 g brain)⁻¹ compared with 49 ± 11 ml·min⁻¹·(100 g brain)⁻¹ for the remainder. There is evidence that decreasing blood flow and low initial flow correlate with the subsequent onset of cerebral ischemia.

  20. Quo vadis, analytical chemistry?

    PubMed

    Valcárcel, Miguel

    2016-01-01

    This paper presents an open, personal, fresh approach to the future of Analytical Chemistry in the context of the deep changes Science and Technology are anticipated to experience. Its main aim is to challenge young analytical chemists because the future of our scientific discipline is in their hands. A description of not entirely accurate overall conceptions of our discipline, both past and present, that should be avoided is followed by a flexible, integral definition of Analytical Chemistry and its cornerstones (viz., aims and objectives, quality trade-offs, the third basic analytical reference, the information hierarchy, social responsibility, independent research, transfer of knowledge and technology, interfaces to other scientific-technical disciplines, and well-oriented education). Obsolete paradigms, as well as more accurate general and specific ones that can be expected to provide the framework for our discipline in the coming years, are described. Finally, the three possible responses of analytical chemists to the proposed changes in our discipline are discussed. PMID:26631024

  1. Quo vadis, analytical chemistry?

    PubMed

    Valcárcel, Miguel

    2016-01-01

    This paper presents an open, personal, fresh approach to the future of Analytical Chemistry in the context of the deep changes Science and Technology are anticipated to experience. Its main aim is to challenge young analytical chemists because the future of our scientific discipline is in their hands. A description of not entirely accurate overall conceptions of our discipline, both past and present, that should be avoided is followed by a flexible, integral definition of Analytical Chemistry and its cornerstones (viz., aims and objectives, quality trade-offs, the third basic analytical reference, the information hierarchy, social responsibility, independent research, transfer of knowledge and technology, interfaces to other scientific-technical disciplines, and well-oriented education). Obsolete paradigms, as well as more accurate general and specific ones that can be expected to provide the framework for our discipline in the coming years, are described. Finally, the three possible responses of analytical chemists to the proposed changes in our discipline are discussed.

  2. Tract Specific Reproducibility of Tractography Based Morphology and Diffusion Metrics

    PubMed Central

    Besseling, René M. H.; Jansen, Jacobus F. A.; Overvliet, Geke M.; Vaessen, Maarten J.; Braakman, Hilde M. H.; Hofman, Paul A. M.; Aldenkamp, Albert P.; Backes, Walter H.

    2012-01-01

    Introduction: The reproducibility of tractography is important to determine its sensitivity to pathological abnormalities. The reproducibility of tract morphology has not yet been systematically studied and the recently developed tractography contrast Tract Density Imaging (TDI) has not yet been assessed at the tract specific level. Materials and Methods: Diffusion tensor imaging (DTI) and probabilistic constrained spherical deconvolution (CSD) tractography are performed twice in 9 healthy subjects. Tractography is based on common space seed and target regions and performed for several major white matter tracts. Tractograms are converted to tract segmentations and inter-session reproducibility of tract morphology is assessed using the Dice similarity coefficient (DSC). The coefficient of variation (COV) and intraclass correlation coefficient (ICC) are calculated for the following tract metrics: fractional anisotropy (FA), apparent diffusion coefficient (ADC), volume, and TDI. Analyses are performed both for proximal (deep white matter) and extended (including subcortical white matter) tract segmentations. Results: Proximal DSC values were 0.70–0.92. DSC values were 5–10% lower in extended compared to proximal segmentations. COV/ICC values of FA, ADC, volume and TDI were 1–4%/0.65–0.94, 2–4%/0.62–0.94, 3–22%/0.53–0.96 and 8–31%/0.48–0.70, respectively, with the lower COV and higher ICC values found in the proximal segmentations. Conclusion: For all investigated metrics, reproducibility depended on the segmented tract. FA and ADC had relatively low COV and relatively high ICC, indicating clinical potential. Volume had higher COV but its moderate to high ICC values in most tracts still suggest subject-differentiating power. Tract TDI had high COV and relatively low ICC, which reflects unfavorable reproducibility. PMID:22485157
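
    For reference, the two reproducibility measures central to this record can be computed as in the sketch below: the Dice similarity coefficient between two binary tract segmentations and a within-subject coefficient of variation across two sessions. The arrays are illustrative, not study data.

        import numpy as np

        def dice(seg1, seg2):
            """Dice similarity coefficient of two binary tract segmentations."""
            seg1, seg2 = seg1.astype(bool), seg2.astype(bool)
            return 2.0 * np.logical_and(seg1, seg2).sum() / (seg1.sum() + seg2.sum())

        def within_subject_cov(session1, session2):
            """Mean within-subject coefficient of variation (%) of a tract metric."""
            pairs = np.stack([session1, session2], axis=1)
            return 100.0 * np.mean(pairs.std(axis=1, ddof=1) / pairs.mean(axis=1))

        rng = np.random.default_rng(3)
        fa1 = rng.normal(0.45, 0.03, 9)                 # FA in 9 subjects, session 1
        fa2 = fa1 + rng.normal(0.0, 0.01, 9)            # session 2
        print(f"FA COV: {within_subject_cov(fa1, fa2):.1f}%")

        segA = rng.random((20, 20, 20)) > 0.7           # toy binary segmentations
        segB = segA.copy(); segB[:2] = ~segB[:2]
        print(f"Dice: {dice(segA, segB):.2f}")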

  3. Reproducibility of dual-photon absorptiometry using a clinical phantom

    SciTech Connect

    DaCosta, M.; DeLaney, M.; Goldsmith, S.J.

    1985-05-01

    The use of dual-photon absorptiometry (DPA) bone mineral density (BMD) measurements for the diagnosis and monitoring of therapy of osteoporosis has been established. The objective of this study is to determine the reproducibility of DPA measurements. A phantom was constructed using a section of human bony pelvis and lumbosacral spine. Provisions were made to mimic changes in patient girth. To evaluate the DPA reproducibility within a single day, 12 consecutive studies were performed on the phantom using standard acquisition and processing procedures. The mean BMD ± 1 SD in g/cm² (BMD-bar) of lumbar vertebrae 2-4 was 0.771 ± 0.007, with a 0.97% coefficient of variation (1 SD) (CV). This evaluation was repeated 7 times over the next 4 months, with 3 to 6 studies performed each time; the maximum CV found was 1.93%. To evaluate DPA reproducibility over time, phantom studies were performed over a 7-month period which included a 153-Gd source change. The BMD-bar was 0.770 ± 0.017 with a 2.15% CV. DPA reproducibility with changes in patient girth was evaluated by performing the phantom studies at water depths of 12.5, 17.0 and 20.0 cm. Five studies at each depth were performed using standard acquisition and processing procedures. The BMD-bar was 0.779 ± 0.012 with a 1.151% CV. Based on these results, BMD measurements by DPA are reproducible within 2%. This reliability is maintained for studies performed over an extended period of time and is independent of changes in patient girth.

  4. Repeatability and Reproducibility of Anterior Segment Measurements in Normal Eyes Using Dual Scheimpflug Analyzer

    PubMed Central

    Altıparmak, Zeynep; Yağcı, Ramazan; Güler, Emre; Arslanyılmaz, Zeynel; Canbal, Metin; Hepşen, İbrahim F.

    2015-01-01

    Objectives: To assess the repeatability and reproducibility of anterior segment measurements including aberrometric measurements provided by a dual Scheimpflug analyzer (Galilei) system in normal eyes. Materials and Methods: Three repeated consecutive measurements were taken by two independent examiners. The following were evaluated: total corneal power and posterior corneal power, corneal higher-order wavefront aberrations (6.0 mm pupil), pachymetry at the central, paracentral, and peripheral zones, and anterior chamber depth (ACD). Repeatability was assessed by calculating the within-subject standard deviation, precision, repeatability, and intraclass correlation coefficient (ICC). Bland-Altman analysis was used for assessing reproducibility. Results: Thirty eyes of 30 patients were included. The best ICC values were for corneal pachymetry and ACD. For both observers, acceptable ICC was also achieved for the other parameters, the only exceptions being posterior corneal astigmatism and total high order aberration. The 95% LoA (Limits of Agreement) values for all measurements showed small variability between the two examiners. Conclusion: The Galilei system provided reliable measurements of anterior segment parameters. Therefore, the instrument can be confidently used for routine clinical use and research purposes. PMID:27800242

  5. Evaluation of measurement reproducibility using the standard-sites data, 1994 Fernald field characterization demonstration project

    SciTech Connect

    Rautman, C.A.

    1996-02-01

    The US Department of Energy conducted the 1994 Fernald (Ohio) field characterization demonstration project to evaluate the performance of a group of both industry-standard and proposed alternative technologies in describing the nature and extent of uranium contamination in surficial soils. Detector stability and measurement reproducibility under the actual operating conditions encountered in the field are critical to establishing the credibility of the proposed alternative characterization methods. Comparability of measured uranium activities to those reported by conventional, US Environmental Protection Agency (EPA)-certified laboratory methods is also required. The eleven (11) technologies demonstrated included (1) EPA-standard soil sampling and laboratory mass-spectroscopy analyses, and currently accepted field-screening techniques using (2) sodium-iodide scintillometers, (3) FIDLER low-energy scintillometers, and (4) a field-portable x-ray fluorescence spectrometer. Proposed advanced characterization techniques included (5) alpha-track detectors, (6) a high-energy beta scintillometer, (7) electret ionization chambers, (8) and (9) a high-resolution gamma-ray spectrometer in two different configurations, (10) a field-adapted laser ablation-inductively coupled plasma-atomic emission spectroscopy (ICP-AES) technique, and (11) a long-range alpha detector. Measurement reproducibility and the accuracy of each method were tested by acquiring numerous replicate measurements of total uranium activity at each of two "standard sites" located within the main field demonstration area. Meteorological variables, including temperature, relative humidity, and 24-hour rainfall quantities, were also recorded in conjunction with the standard-sites measurements.

  6. Characterizing the Reproducibility and Reliability of Dietary Patterns among Yup’ik Alaska Native People

    PubMed Central

    Ryman, Tove K.; Boyer, Bert B.; Hopkins, Scarlett; Philip, Jacques; O’Brien, Diane; Thummel, Kenneth; Austin, Melissa A.

    2015-01-01

    Food frequency questionnaire (FFQ) data can be used to characterize dietary patterns for diet-disease association studies. Among a sample of Yup’ik people from Southwest Alaska, we evaluated three previously defined dietary patterns: “subsistence foods” and market-based “processed foods” and “fruits and vegetables”. We tested the reproducibility and reliability of the dietary patterns and tested associations of the patterns with dietary biomarkers and participant characteristics. We analyzed data from adult study participants who completed at least one FFQ with the Center for Alaska Native Health Research 9/2009–5/2013. To test reproducibility we conducted a confirmatory factor analysis (CFA) of a hypothesized model using 18 foods to measure the dietary patterns (n=272). To test the reliability of the dietary patterns, we used CFA to measure the composite reliability (n=272) and intraclass correlation coefficients for test-retest reliability (n=113). Finally, to test associations we used linear regression (n=637). All CFA factor loadings, except one, indicated acceptable correlations between foods and dietary patterns (r > 0.40) and model fit criteria were greater than 0.90. Composite and test-retest reliability of dietary patterns were respectively 0.56 and 0.34 for subsistence foods, 0.73 and 0.66 for processed foods, and 0.72 and 0.54 for fruits and vegetables. In the multi-predictor analysis, dietary patterns were significantly associated with dietary biomarkers, community location, age, sex, and self-reported lifestyle. This analysis confirmed the reproducibility and reliability of the dietary patterns in this study population. These dietary patterns can be used for future research and development of dietary interventions in this underserved population. PMID:25656871

  7. Characterising the reproducibility and reliability of dietary patterns among Yup'ik Alaska Native people.

    PubMed

    Ryman, Tove K; Boyer, Bert B; Hopkins, Scarlett; Philip, Jacques; O'Brien, Diane; Thummel, Kenneth; Austin, Melissa A

    2015-02-28

    FFQ data can be used to characterise dietary patterns for diet-disease association studies. In the present study, we evaluated three previously defined dietary patterns--'subsistence foods', market-based 'processed foods' and 'fruits and vegetables'--among a sample of Yup'ik people from Southwest Alaska. We tested the reproducibility and reliability of the dietary patterns, as well as the associations of these patterns with dietary biomarkers and participant characteristics. We analysed data from adult study participants who completed at least one FFQ with the Center for Alaska Native Health Research 9/2009-5/2013. To test the reproducibility of the dietary patterns, we conducted a confirmatory factor analysis (CFA) of a hypothesised model using eighteen food items to measure the dietary patterns (n 272). To test the reliability of the dietary patterns, we used the CFA to measure composite reliability (n 272) and intra-class correlation coefficients for test-retest reliability (n 113). Finally, to test the associations, we used linear regression (n 637). All factor loadings, except one, in CFA indicated acceptable correlations between foods and dietary patterns (r>0·40), and model-fit criteria were >0·90. Composite and test-retest reliability of the dietary patterns were, respectively, 0·56 and 0·34 for 'subsistence foods', 0·73 and 0·66 for 'processed foods', and 0·72 and 0·54 for 'fruits and vegetables'. In the multi-predictor analysis, the dietary patterns were significantly associated with dietary biomarkers, community location, age, sex and self-reported lifestyle. This analysis confirmed the reproducibility and reliability of the dietary patterns in the present study population. These dietary patterns can be used for future research and development of dietary interventions in this underserved population. PMID:25656871
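
    The composite reliability values reported in the two records above can be obtained from standardized factor loadings; a common closed form (assuming uncorrelated item errors) is CR = (Σλ)² / [(Σλ)² + Σ(1 − λ²)]. The sketch below uses hypothetical loadings, not the study's CFA output.

    ```python
    import numpy as np

    def composite_reliability(loadings):
        """Composite reliability from standardized factor loadings, assuming
        uncorrelated errors: (sum λ)^2 / ((sum λ)^2 + sum(1 - λ^2))."""
        lam = np.asarray(loadings, dtype=float)
        num = lam.sum() ** 2
        return num / (num + np.sum(1.0 - lam ** 2))

    # Hypothetical standardized loadings for a six-item "processed foods" pattern
    print(round(composite_reliability([0.55, 0.62, 0.70, 0.48, 0.66, 0.58]), 2))
    ```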

  8. Electrochemiluminescence detection in microfluidic cloth-based analytical devices.

    PubMed

    Guan, Wenrong; Liu, Min; Zhang, Chunsun

    2016-01-15

    This work describes the first approach to combining microfluidic cloth-based analytical devices (μCADs) with electrochemiluminescence (ECL) detection. Wax screen-printing is employed to make cloth-based microfluidic chambers which are patterned with carbon screen-printed electrodes (SPEs) to create truly disposable, simple, inexpensive sensors which can be read with a low-cost, portable charge coupled device (CCD) imaging sensing system. The two most commonly used ECL systems of tris(2,2'-bipyridyl)ruthenium(II)/tri-n-propylamine (Ru(bpy)3(2+)/TPA) and 3-aminophthalhydrazide/hydrogen peroxide (luminol/H2O2) are applied to demonstrate the quantitative ability of the ECL μCADs. In this study, the proposed devices have successfully achieved the determination of TPA over a linear range from 2.5 to 2500 μM with a detection limit of 1.265 μM. In addition, the detection of H2O2 can be performed in the linear range of 0.05-2.0 mM, with a detection limit of 0.027 mM. It has been shown that the ECL emission on the wax-patterned cloth device has an acceptable sensitivity, stability and reproducibility. Finally, the applicability of cloth-based ECL is demonstrated for determination of glucose in phosphate buffer solution (PBS) and artificial urine (AU) samples, with detection limits of 0.032 mM and 0.038 mM, respectively. It can be foreseen, therefore, that μCADs with ECL detection could provide a new sensing platform for point-of-care testing, public health, food safety detection and environmental monitoring in remote regions, developing or developed countries. PMID:26319168
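
    Detection limits of the kind quoted above are commonly estimated from a linear calibration curve as 3 × (standard deviation of the blank) / slope; the abstract does not state which convention the authors used, so the sketch below is only a generic illustration with made-up calibration data.

    ```python
    import numpy as np

    # Hypothetical calibration: concentrations (uM) and ECL intensities (a.u.)
    conc = np.array([0.0, 2.5, 25.0, 250.0, 1000.0, 2500.0])
    intensity = np.array([5.0, 9.1, 44.8, 410.0, 1630.0, 4070.0])

    slope, intercept = np.polyfit(conc, intensity, 1)   # linear calibration fit
    blank_sd = 0.8                                      # SD of replicate blanks (hypothetical)

    lod = 3.0 * blank_sd / slope                        # 3-sigma detection limit
    loq = 10.0 * blank_sd / slope                       # 10-sigma quantification limit
    print(f"slope={slope:.3f} a.u./uM  LOD={lod:.3f} uM  LOQ={loq:.3f} uM")
    ```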

  9. Analytic representations with theta functions for systems on ℤ(d) and on 𝕊

    NASA Astrophysics Data System (ADS)

    Evangelides, P.; Lei, C.; Vourdas, A.

    2015-07-01

    An analytic representation with theta functions on a torus, for systems with variables in ℤ(d), is considered. Another analytic representation with theta functions on a strip, for systems with positions in a circle 𝕊 and momenta in ℤ, is also considered. The reproducing kernel formalism for these two systems is studied. Wigner and Weyl functions in this language are also studied.

  10. CIEF method optimization: development of robust and reproducible protein reagent characterization in the clinical immunodiagnostic industry.

    PubMed

    Bonn, Ryan; Rampal, Sushma; Rae, Tracey; Fishpaugh, Jeffrey

    2013-03-01

    Several method parameters have been refined for application of CIEF methods to provide optimal capillary robustness and performance longevity while maintaining desired analytical output for the ever-increasing characterization scrutiny of protein reagents used in clinical assay formulations. Demonstrated here are significant modifications to the existing protocols in order to attain a robust, reproducible method that achieves as much as a 20-fold increase in the number of consecutive runs before capillary degradation. This matters not only for the rudimentary analysis of the acidic and basic components of a monoclonal antibody isoform profile: comprehensive identification of each individual isoform, yielding a characteristic fingerprint, is also necessary to distinguish minor differences between multiple proteins and to identify them unambiguously. In order to maintain the integrity of these modifications, extensive studies were conducted on an implemented system suitability standard protein with specifically defined parameters indicating either sufficient or poor separation performance.

  11. Composting in small laboratory pilots: Performance and reproducibility

    SciTech Connect

    Lashermes, G.; Barriuso, E.; Le Villio-Poitrenaud, M.; Houot, S.

    2012-02-15

    Highlights: • We design an innovative small-scale composting device including six 4-l reactors. • We investigate the performance and reproducibility of composting on a small scale. • Thermophilic conditions are established by self-heating in all replicates. • Biochemical transformations, organic matter losses and stabilisation are realistic. • The organic matter evolution exhibits good reproducibility for all six replicates. - Abstract: Small-scale reactors (<10 l) have been employed in composting research, but few attempts have assessed the performance of composting considering the transformations of organic matter. Moreover, composting at small scales is often performed by imposing a fixed temperature, thus creating artificial conditions, and the reproducibility of composting has rarely been reported. The objectives of this study are to design an innovative small-scale composting device safeguarding self-heating to drive the composting process and to assess the performance and reproducibility of composting in small-scale pilots. The experimental setup included six 4-l reactors used for composting a mixture of sewage sludge and green wastes. The performance of the process was assessed by monitoring the temperature, O2 consumption and CO2 emissions, and characterising the biochemical evolution of organic matter. A good reproducibility was found for the six replicates with coefficients of variation for all parameters generally lower than 19%. An intense self-heating ensured the existence of a spontaneous thermophilic phase in all reactors. The average loss of total organic matter (TOM) was 46% of the initial content. Compared to the initial mixture, the hot water soluble fraction decreased by 62%, the hemicellulose-like fraction by 68%, the cellulose-like fraction by 50% and the lignin-like fractions by 12% in the final

  12. Accepting the T3D

    SciTech Connect

    Rich, D.O.; Pope, S.C.; DeLapp, J.G.

    1994-10-01

    In April, a 128 PE Cray T3D was installed at Los Alamos National Laboratory's Advanced Computing Laboratory as part of the DOE's High-Performance Parallel Processor Program (H4P). In conjunction with CRI, the authors implemented a 30 day acceptance test. The test was constructed in part to help them understand the strengths and weaknesses of the T3D. In this paper, they briefly describe the H4P and its goals. They discuss the design and implementation of the T3D acceptance test and detail issues that arose during the test. They conclude with a set of system requirements that must be addressed as the T3D system evolves.

  13. Sweeteners: consumer acceptance in tea.

    PubMed

    Sprowl, D J; Ehrcke, L A

    1984-09-01

    Sucrose, fructose, aspartame, and saccharin were compared for consumer preference, aftertaste, and cost to determine acceptability of the sweeteners. A 23-member taste panel evaluated tea samples for preference and aftertaste. The mean retail costs of the sweeteners were calculated and adjusted to take sweetening power into consideration. Sucrose was the least expensive and most preferred sweetener. No significant difference in preference for fructose and aspartame was found, but both sweeteners were rated significantly lower than sucrose. Saccharin was the most disliked sweetener. Fructose was the most expensive sweetener and aspartame the next most expensive. Scores for aftertaste followed the same pattern as those for preference. Thus, a strong, unpleasant aftertaste seems to be associated with a dislike for a sweetener. From the results of this study, it seems that there is no completely acceptable low-calorie substitute for sucrose available to consumers.

  14. Toward Transparent and Reproducible Science: Using Open Source "Big Data" Tools for Water Resources Assessment

    NASA Astrophysics Data System (ADS)

    Buytaert, W.; Zulkafli, Z. D.; Vitolo, C.

    2014-12-01

    Transparency and reproducibility are fundamental properties of good science. In the current era of large and diverse datasets and long and complex workflows for data analysis and inference, ensuring such transparency and reproducibility is challenging. Hydrological science is a good case in point, because the discipline typically uses a large variety of datasets ranging from local observations to large-scale remotely sensed products. These data are often obtained from various different sources, and integrated using complex yet uncertain modelling tools. In this paper, we present and discuss methods of ensuring transparency and reproducibility in scientific workflows for hydrological data analysis for the purpose of water resources assessment, using relevant examples of emerging open source "big data" tools. First, we discuss standards for data storage, access, and processing that allow improving the modularity of a hydrological analysis workflow. In particular, standards emerging from the Open Geospatial Consortium, such as the Sensor Observation Service and the Web Coverage Service, hold promise. However, some bottlenecks, such as the availability of data models and the ability to work with spatio-temporal subsets of large datasets, need further development. Next, we focus on available methods to build transparent data processing workflows. Again, standards such as OGC's Web Processing Service are being developed to facilitate web-based analytics. Yet, in practice, the experimental nature of these standards and web services in general often requires a more pragmatic approach. The availability of web technologies in popular open source data analysis environments such as R and Python often makes them an attractive solution for workflow creation and sharing. Lastly, we elaborate on the potential that open source solutions hold in the context of participatory approaches to data collection and knowledge generation. Using examples from the tropical Andes and the Himalayas, we

  15. Acceptability of reactors in space

    SciTech Connect

    Buden, D.

    1981-01-01

    Reactors are the key to our future expansion into space. However, there has been some confusion in the public as to whether they are a safe and acceptable technology for use in space. The answer to these questions is explored. The US position is that, when reactors are the preferred technical choice, they can be used safely. In fact, it does not appear that reactors add measurably to the risk associated with the Space Transportation System.

  16. Acceptability of reactors in space

    SciTech Connect

    Buden, D.

    1981-04-01

    Reactors are the key to our future expansion into space. However, there has been some confusion in the public as to whether they are a safe and acceptable technology for use in space. The answer to these questions is explored. The US position is that, when reactors are the preferred technical choice, they can be used safely. In fact, it does not appear that reactors add measurably to the risk associated with the Space Transportation System.

  17. The Dutch motor skills assessment as tool for talent development in table tennis: a reproducibility and validity study.

    PubMed

    Faber, Irene R; Nijhuis-Van Der Sanden, Maria W G; Elferink-Gemser, Marije T; Oosterveld, Frits G J

    2015-01-01

    A motor skills assessment could be helpful in talent development by estimating essential perceptuo-motor skills of young players, which are considered requisite to develop excellent technical and tactical qualities. The Netherlands Table Tennis Association uses a motor skills assessment in their talent development programme consisting of eight items measuring perceptuo-motor skills specific to table tennis under varying conditions. This study aimed to investigate this assessment regarding its reproducibility, internal consistency, underlying dimensions and concurrent validity in 113 young table tennis players (6-10 years). Intraclass correlation coefficients of six test items met the criteria of 0.7 with coefficients of variation between 3% and 8%. Cronbach's alpha valued 0.853 for internal consistency. The principal components analysis distinguished two conceptually meaningful factors: "ball control" and "gross motor function." Concurrent validity analyses demonstrated moderate associations between the motor skills assessment's results and national ranking; boys r = -0.53 (P < 0.001) and girls r = -0.45 (P = 0.015). In conclusion, this evaluation demonstrated six test items with acceptable reproducibility, good internal consistency and good prospects for validity. Two test items need revision to upgrade reproducibility. Since the motor skills assessment seems to be a reproducible, objective part of a talent development programme, more longitudinal studies are required to investigate its predictive validity. PMID:25482916

  18. The Dutch motor skills assessment as tool for talent development in table tennis: a reproducibility and validity study.

    PubMed

    Faber, Irene R; Nijhuis-Van Der Sanden, Maria W G; Elferink-Gemser, Marije T; Oosterveld, Frits G J

    2015-01-01

    A motor skills assessment could be helpful in talent development by estimating essential perceptuo-motor skills of young players, which are considered requisite to develop excellent technical and tactical qualities. The Netherlands Table Tennis Association uses a motor skills assessment in their talent development programme consisting of eight items measuring perceptuo-motor skills specific to table tennis under varying conditions. This study aimed to investigate this assessment regarding its reproducibility, internal consistency, underlying dimensions and concurrent validity in 113 young table tennis players (6-10 years). Intraclass correlation coefficients of six test items met the criteria of 0.7 with coefficients of variation between 3% and 8%. Cronbach's alpha valued 0.853 for internal consistency. The principal components analysis distinguished two conceptually meaningful factors: "ball control" and "gross motor function." Concurrent validity analyses demonstrated moderate associations between the motor skills assessment's results and national ranking; boys r = -0.53 (P < 0.001) and girls r = -0.45 (P = 0.015). In conclusion, this evaluation demonstrated six test items with acceptable reproducibility, good internal consistency and good prospects for validity. Two test items need revision to upgrade reproducibility. Since the motor skills assessment seems to be a reproducible, objective part of a talent development programme, more longitudinal studies are required to investigate its predictive validity.

  19. Quality assurance management plan (QAPP) special analytical support (SAS)

    SciTech Connect

    LOCKREM, L.L.

    1999-05-20

    It is the policy of Special Analytical Support (SAS) that the analytical aspects of all environmental data generated and processed in the laboratory, subject to the Environmental Protection Agency (EPA), U.S. Department of Energy or other project-specific requirements, be of known and acceptable quality. It is the intention of this QAPP to establish and assure that an effective, quality-controlled management system is maintained in order to meet the quality requirements of the intended use(s) of the data.

  20. Simple, analytical criteria for the sequencing of distillation columns

    SciTech Connect

    Malone, M.F.; Douglas, J.M.; Glinos, K.; Marquez, F.E.

    1985-04-01

    A quantitative criterion for the selection of simple distillation sequences is derived for ideal mixtures. A simple cost model, along with a short-cut solution of Underwood's equations, gives an analytical form for the total vapor rate, which is the key design variable. The results for column sequencing that are based on the analytical criterion agree well with more exact solutions, but they indicate that in numerous situations the commonly accepted heuristics are incorrect.
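
    The short-cut referred to above rests on Underwood's equations: the feed equation Σ αᵢzᵢ/(αᵢ − θ) = 1 − q is solved for the root θ lying between the relative volatilities of the two key components, and the minimum vapour rate then follows from V_min = Σ αᵢdᵢ/(αᵢ − θ), where dᵢ is the distillate flow of component i. The sketch below illustrates that calculation for a hypothetical ternary split; it is not the authors' cost model or sequencing criterion.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    # Hypothetical sharp A/BC split of a ternary feed (basis: 100 kmol/h feed).
    # alpha: relative volatilities (light, intermediate, heavy); z: feed fractions;
    # d: distillate flows of each component; q = 1 for a saturated-liquid feed.
    alpha = np.array([4.0, 2.0, 1.0])
    z = np.array([0.3, 0.3, 0.4])
    d = np.array([29.7, 0.3, 0.0])
    q = 1.0

    def feed_equation(theta):
        # Underwood feed equation: sum(alpha_i * z_i / (alpha_i - theta)) = 1 - q
        return np.sum(alpha * z / (alpha - theta)) - (1.0 - q)

    # The relevant root lies between the heavy-key and light-key volatilities (2 and 4)
    theta = brentq(feed_equation, 2.0 + 1e-6, 4.0 - 1e-6)

    # Minimum vapour rate from Underwood's second equation
    v_min = np.sum(alpha * d / (alpha - theta))
    print(f"theta = {theta:.4f}, V_min = {v_min:.1f} kmol/h")
    ```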

  1. 48 CFR 12.402 - Acceptance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Acceptance. 12.402 Section... Acceptance. (a) The acceptance paragraph in 52.212-4 is based upon the assumption that the Government will rely on the contractor's assurances that the commercial item tendered for acceptance conforms to...

  2. Analytic thinking reduces belief in conspiracy theories.

    PubMed

    Swami, Viren; Voracek, Martin; Stieger, Stefan; Tran, Ulrich S; Furnham, Adrian

    2014-12-01

    Belief in conspiracy theories has been associated with a range of negative health, civic, and social outcomes, requiring reliable methods of reducing such belief. Thinking dispositions have been highlighted as one possible factor associated with belief in conspiracy theories, but actual relationships have only been infrequently studied. In Study 1, we examined associations between belief in conspiracy theories and a range of measures of thinking dispositions in a British sample (N=990). Results indicated that a stronger belief in conspiracy theories was significantly associated with lower analytic thinking and open-mindedness and greater intuitive thinking. In Studies 2-4, we examined the causational role played by analytic thinking in relation to conspiracist ideation. In Study 2 (N=112), we showed that a verbal fluency task that elicited analytic thinking reduced belief in conspiracy theories. In Study 3 (N=189), we found that an alternative method of eliciting analytic thinking, which related to cognitive disfluency, was effective at reducing conspiracist ideation in a student sample. In Study 4, we replicated the results of Study 3 among a general population sample (N=140) in relation to generic conspiracist ideation and belief in conspiracy theories about the July 7, 2005, bombings in London. Our results highlight the potential utility of supporting attempts to promote analytic thinking as a means of countering the widespread acceptance of conspiracy theories. PMID:25217762

  3. Analytic thinking reduces belief in conspiracy theories.

    PubMed

    Swami, Viren; Voracek, Martin; Stieger, Stefan; Tran, Ulrich S; Furnham, Adrian

    2014-12-01

    Belief in conspiracy theories has been associated with a range of negative health, civic, and social outcomes, requiring reliable methods of reducing such belief. Thinking dispositions have been highlighted as one possible factor associated with belief in conspiracy theories, but actual relationships have only been infrequently studied. In Study 1, we examined associations between belief in conspiracy theories and a range of measures of thinking dispositions in a British sample (N=990). Results indicated that a stronger belief in conspiracy theories was significantly associated with lower analytic thinking and open-mindedness and greater intuitive thinking. In Studies 2-4, we examined the causational role played by analytic thinking in relation to conspiracist ideation. In Study 2 (N=112), we showed that a verbal fluency task that elicited analytic thinking reduced belief in conspiracy theories. In Study 3 (N=189), we found that an alternative method of eliciting analytic thinking, which related to cognitive disfluency, was effective at reducing conspiracist ideation in a student sample. In Study 4, we replicated the results of Study 3 among a general population sample (N=140) in relation to generic conspiracist ideation and belief in conspiracy theories about the July 7, 2005, bombings in London. Our results highlight the potential utility of supporting attempts to promote analytic thinking as a means of countering the widespread acceptance of conspiracy theories.

  4. An exploration of graph metric reproducibility in complex brain networks

    PubMed Central

    Telesford, Qawi K.; Burdette, Jonathan H.; Laurienti, Paul J.

    2013-01-01

    The application of graph theory to brain networks has become increasingly popular in the neuroimaging community. These investigations and analyses have led to a greater understanding of the brain's complex organization. More importantly, it has become a useful tool for studying the brain under various states and conditions. With the ever expanding popularity of network science in the neuroimaging community, there is increasing interest to validate the measurements and calculations derived from brain networks. Underpinning these studies is the desire to use brain networks in longitudinal studies or as clinical biomarkers to understand changes in the brain. A highly reproducible tool for brain imaging could potentially prove useful as a clinical tool. In this review, we examine recent studies in network reproducibility and their implications for analysis of brain networks. PMID:23717257

  5. Properties of galaxies reproduced by a hydrodynamic simulation.

    PubMed

    Vogelsberger, M; Genel, S; Springel, V; Torrey, P; Sijacki, D; Xu, D; Snyder, G; Bird, S; Nelson, D; Hernquist, L

    2014-05-01

    Previous simulations of the growth of cosmic structures have broadly reproduced the 'cosmic web' of galaxies that we see in the Universe, but failed to create a mixed population of elliptical and spiral galaxies, because of numerical inaccuracies and incomplete physical models. Moreover, they were unable to track the small-scale evolution of gas and stars to the present epoch within a representative portion of the Universe. Here we report a simulation that starts 12 million years after the Big Bang, and traces 13 billion years of cosmic evolution with 12 billion resolution elements in a cube of 106.5 megaparsecs a side. It yields a reasonable population of ellipticals and spirals, reproduces the observed distribution of galaxies in clusters and characteristics of hydrogen on large scales, and at the same time matches the 'metal' and hydrogen content of galaxies on small scales.

  6. Implementation of a portable and reproducible parallel pseudorandom number generator

    SciTech Connect

    Pryor, D.V.; Cuccaro, S.A.; Mascagni, M.; Robinson, M.L.

    1994-12-31

    The authors describe in detail the parallel implementation of a family of additive lagged-Fibonacci pseudorandom number generators. The theoretical structure of these generators is exploited to preserve their well-known randomness properties and to provide a parallel system of distinct cycles. The algorithm presented here solves the reproducibility problem for a far larger class of parallel Monte Carlo applications than has been previously possible. In particular, Monte Carlo applications that undergo "splitting" can be coded to be reproducible, independent both of the number of processors and the execution order of the parallel processes. A library of portable C routines (available from the authors) that implements these ideas is also described.
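
    For orientation, an additive lagged-Fibonacci generator follows the recurrence x(n) = (x(n−j) + x(n−k)) mod 2^m with lags k > j. The sketch below shows only the serial recurrence with illustrative lags (5, 17); the parallel seeding scheme, cycle assignment, and the specific parameters of the library described above are not reproduced here.

    ```python
    class AdditiveLaggedFibonacci:
        """Serial additive lagged-Fibonacci generator:
        x[n] = (x[n-j] + x[n-k]) mod 2**m, with lags k > j."""

        def __init__(self, seed_state, j=5, k=17, m=32):
            assert len(seed_state) == k and k > j
            self.state = list(seed_state)   # last k values, oldest first
            self.j, self.k, self.mod = j, k, 1 << m

        def next(self):
            x = (self.state[-self.j] + self.state[-self.k]) % self.mod
            self.state.append(x)
            self.state.pop(0)
            return x

    # Illustrative seed; the seed must contain at least one odd value
    gen = AdditiveLaggedFibonacci(seed_state=[(i * 2654435761) % (1 << 32) | 1
                                              for i in range(17)])
    print([gen.next() for _ in range(5)])
    ```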

  7. Reproducing kernel particle method for free and forced vibration analysis

    NASA Astrophysics Data System (ADS)

    Zhou, J. X.; Zhang, H. Y.; Zhang, L.

    2005-01-01

    A reproducing kernel particle method (RKPM) is presented to analyze the natural frequencies of Euler-Bernoulli beams as well as Kirchhoff plates. In addition, RKPM is also used to predict the forced vibration responses of buried pipelines due to longitudinal travelling waves. Two different approaches, Lagrange multipliers and the transformation method, are employed to enforce essential boundary conditions. Based on the reproducing kernel approximation, the domain of interest is discretized by a set of particles without the employment of a structured mesh, which constitutes an advantage over the finite element method. Meanwhile, RKPM also exhibits advantages over the classical Rayleigh-Ritz method and its counterparts. Numerical results presented here demonstrate the effectiveness of this novel approach for both free and forced vibration analysis.
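
    For context, the reproducing kernel approximation underlying RKPM can be written in the generic one-dimensional form below (standard textbook notation, not necessarily the authors'): the shape functions Ψ_I combine a kernel Φ_a of support a with a correction function C chosen so that the approximation exactly reproduces polynomials up to order n.

    ```latex
    % Reproducing kernel approximation (generic 1D form)
    u^{h}(x) = \sum_{I=1}^{NP} \Psi_I(x)\, d_I ,
    \qquad
    \Psi_I(x) = C(x;\, x - x_I)\, \Phi_a(x - x_I) ,
    \qquad
    C(x;\, x - x_I) = \mathbf{H}^{T}(x - x_I)\, \mathbf{b}(x) .

    % Reproducing (consistency) conditions that determine b(x):
    \sum_{I=1}^{NP} \Psi_I(x)\, x_I^{\,p} = x^{p} ,
    \qquad p = 0, 1, \dots, n .
    ```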

  8. Properties of galaxies reproduced by a hydrodynamic simulation.

    PubMed

    Vogelsberger, M; Genel, S; Springel, V; Torrey, P; Sijacki, D; Xu, D; Snyder, G; Bird, S; Nelson, D; Hernquist, L

    2014-05-01

    Previous simulations of the growth of cosmic structures have broadly reproduced the 'cosmic web' of galaxies that we see in the Universe, but failed to create a mixed population of elliptical and spiral galaxies, because of numerical inaccuracies and incomplete physical models. Moreover, they were unable to track the small-scale evolution of gas and stars to the present epoch within a representative portion of the Universe. Here we report a simulation that starts 12 million years after the Big Bang, and traces 13 billion years of cosmic evolution with 12 billion resolution elements in a cube of 106.5 megaparsecs a side. It yields a reasonable population of ellipticals and spirals, reproduces the observed distribution of galaxies in clusters and characteristics of hydrogen on large scales, and at the same time matches the 'metal' and hydrogen content of galaxies on small scales. PMID:24805343

  9. MASSIVE DATA, THE DIGITIZATION OF SCIENCE, AND REPRODUCIBILITY OF RESULTS

    SciTech Connect

    2010-07-02

    As the scientific enterprise becomes increasingly computational and data-driven, the nature of the information communicated must change. Without inclusion of the code and data with published computational results, we are engendering a credibility crisis in science. Controversies such as ClimateGate, the microarray-based drug sensitivity clinical trials under investigation at Duke University, and retractions from prominent journals due to unverified code suggest the need for greater transparency in our computational science. In this talk I argue that the scientific method be restored to (1) a focus on error control as central to scientific communication and (2) complete communication of the underlying methodology producing the results, i.e., reproducibility. I outline barriers to these goals based on recent survey work (Stodden 2010), and suggest solutions such as the “Reproducible Research Standard” (Stodden 2009), giving open licensing options designed to create an intellectual property framework for scientists consonant with longstanding scientific norms.

  10. Pressure Stabilizer for Reproducible Picoinjection in Droplet Microfluidic Systems

    PubMed Central

    Rhee, Minsoung; Light, Yooli K.; Yilmaz, Suzan; Adams, Paul D.; Saxena, Deepak

    2014-01-01

    Picoinjection is a promising technique to add reagents into pre-formed emulsion droplets on chip; however, it is sensitive to pressure fluctuation, making stable operation of the picoinjector challenging. We present a chip architecture using a simple pressure stabilizer for consistent and highly reproducible picoinjection in multi-step biochemical assays with droplets. Incorporation of the stabilizer immediately upstream of a picoinjector or a combination of injectors greatly reduces pressure fluctuations enabling reproducible and effective picoinjection in systems where the pressure varies actively during operation. We demonstrate the effectiveness of the pressure stabilizer for an integrated platform for on-demand encapsulation of bacterial cells followed by picoinjection of reagents for lysing the encapsulated cells. The pressure stabilizer was also used for picoinjection of multiple displacement amplification (MDA) reagents to achieve genomic DNA amplification of lysed bacterial cells. PMID:25270338

  11. MASSIVE DATA, THE DIGITIZATION OF SCIENCE, AND REPRODUCIBILITY OF RESULTS

    ScienceCinema

    None

    2016-07-12

    As the scientific enterprise becomes increasingly computational and data-driven, the nature of the information communicated must change. Without inclusion of the code and data with published computational results, we are engendering a credibility crisis in science. Controversies such as ClimateGate, the microarray-based drug sensitivity clinical trials under investigation at Duke University, and retractions from prominent journals due to unverified code suggest the need for greater transparency in our computational science. In this talk I argue that the scientific method be restored to (1) a focus on error control as central to scientific communication and (2) complete communication of the underlying methodology producing the results, i.e., reproducibility. I outline barriers to these goals based on recent survey work (Stodden 2010), and suggest solutions such as the “Reproducible Research Standard” (Stodden 2009), giving open licensing options designed to create an intellectual property framework for scientists consonant with longstanding scientific norms.

  12. Intersubject variability and reproducibility of 15O PET studies.

    PubMed

    Coles, Jonathan P; Fryer, Tim D; Bradley, Peter G; Nortje, Jurgens; Smielewski, Peter; Rice, Kenneth; Clark, John C; Pickard, John D; Menon, David K

    2006-01-01

    Oxygen-15 positron emission tomography (15O PET) can provide important data regarding patients with head injury. We provide reference data on intersubject variability and reproducibility of cerebral blood flow (CBF), cerebral blood volume (CBV), cerebral metabolism (CMRO2) and oxygen extraction fraction (OEF) in patients and healthy controls, and explored alternative ways of assessing reproducibility within the context of a single PET study. In addition, we used independent measurements of CBF and CMRO2 to investigate the effect of mathematical correlation on the relationship between flow and metabolism. In patients, intersubject coefficients of variation (CoV) for CBF, CMRO2 and OEF were larger than in controls (32.9%+/-2.2%, 23.2%+/-2.0% and 22.5%+/-3.4% versus 13.5%+/-1.4%, 12.8%+/-1.1% and 7.3%+/-1.2%), while CoV for CBV were lower (15.2%+/-2.1% versus 22.5%+/-2.8%) (P<0.001). The CoV for the test-retest reproducibility of CBF, CBV, CMRO2 and OEF in patients were 2.1%+/-1.5%, 3.8%+/-3.0%, 3.7%+/-3.0% and 4.6%+/-3.5%, respectively. These were much lower than the intersubject CoV figures, and were similar to alternative measures of reproducibility obtained by fractionating data from a single study. The physiological relationship between flow and metabolism was preserved even when mathematically independent measures were used for analysis. These data provide a context for the design and interpretation of interventional PET studies. While ideally each centre should develop its own bank of such data, the figures provided will allow initial generic approximations of sample size for such studies.

  13. Composting in small laboratory pilots: performance and reproducibility.

    PubMed

    Lashermes, G; Barriuso, E; Le Villio-Poitrenaud, M; Houot, S

    2012-02-01

    Small-scale reactors (<10 l) have been employed in composting research, but few attempts have assessed the performance of composting considering the transformations of organic matter. Moreover, composting at small scales is often performed by imposing a fixed temperature, thus creating artificial conditions, and the reproducibility of composting has rarely been reported. The objectives of this study are to design an innovative small-scale composting device safeguarding self-heating to drive the composting process and to assess the performance and reproducibility of composting in small-scale pilots. The experimental setup included six 4-l reactors used for composting a mixture of sewage sludge and green wastes. The performance of the process was assessed by monitoring the temperature, O(2) consumption and CO(2) emissions, and characterising the biochemical evolution of organic matter. A good reproducibility was found for the six replicates with coefficients of variation for all parameters generally lower than 19%. An intense self-heating ensured the existence of a spontaneous thermophilic phase in all reactors. The average loss of total organic matter (TOM) was 46% of the initial content. Compared to the initial mixture, the hot water soluble fraction decreased by 62%, the hemicellulose-like fraction by 68%, the cellulose-like fraction by 50% and the lignin-like fractions by 12% in the final compost. The TOM losses, compost stabilisation and evolution of the biochemical fractions were similar to those observed in large reactors or in on-site experiments, except for lignin degradation, which was less extensive than in full-scale systems. The reproducibility of the process and the quality of the final compost make it possible to propose the use of this experimental device for research requiring a mass reduction of the initial composted waste mixtures. PMID:21982279
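
    The replicate-to-replicate variability figures above reduce to coefficients of variation (CV = standard deviation / mean); a minimal sketch, with hypothetical values for the six reactors rather than the study's measurements, follows.

    ```python
    import numpy as np

    def cv_percent(x):
        """Coefficient of variation (%) across replicate reactors."""
        x = np.asarray(x, dtype=float)
        return 100.0 * x.std(ddof=1) / x.mean()

    # Hypothetical total-organic-matter losses (%) in six replicate 4-l reactors
    tom_loss = [44.0, 47.5, 46.2, 45.1, 48.0, 45.8]
    print(f"mean = {np.mean(tom_loss):.1f}%, CV = {cv_percent(tom_loss):.1f}%")
    ```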

  14. Highly reproducible Bragg grating acousto-ultrasonic contact transducers

    NASA Astrophysics Data System (ADS)

    Saxena, Indu Fiesler; Guzman, Narciso; Lieberman, Robert A.

    2014-09-01

    Fiber optic acousto-ultrasonic transducers offer numerous applications as embedded sensors for impact and damage detection in industrial and aerospace settings as well as for non-destructive evaluation. Superficial contact transducers based on a sheet of fiber optic Bragg gratings have been demonstrated for guided-wave ultrasound measurements. It is reported here that this measurement method provides highly reproducible guided ultrasound data for the test composite component, even though the optical fiber transducers are not permanently embedded in it.

  15. Reproducing the assembly of massive galaxies within the hierarchical cosmogony

    NASA Astrophysics Data System (ADS)

    Fontanot, Fabio; Monaco, Pierluigi; Silva, Laura; Grazian, Andrea

    2007-12-01

    In order to gain insight into the physical mechanisms leading to the formation of stars and their assembly in galaxies, we compare the predictions of the MOdel for the Rise of GAlaxies aNd Active nuclei (MORGANA) to the properties of K- and 850-μm-selected galaxies (such as number counts, redshift distributions and luminosity functions) by combining MORGANA with the spectrophotometric model GRASIL. We find that it is possible to reproduce the K- and 850-μm-band data sets at the same time and with a standard Salpeter initial mass function, and ascribe this success to our improved modelling of cooling in DM haloes. We then predict that massively star-forming discs are common at z ~ 2 and dominate the star formation rate, but most of them merge with other galaxies within ~100 Myr. Our preferred model produces an overabundance of bright galaxies at z < 1; this overabundance might be connected to the build-up of the diffuse stellar component in galaxy clusters, as suggested by Monaco et al., but a naive implementation of the mechanism suggested in that paper does not produce a sufficient slowdown of the evolution of these objects. Moreover, our model overpredicts the number of 10^10-10^11 Msolar galaxies at z ~ 1; this is a common behaviour of theoretical models as shown by Fontana et al. These findings show that, while the overall build-up of the stellar mass is correctly reproduced by galaxy formation models, the `downsizing' trend of galaxies is not fully reproduced yet. This hints at some missing feedback mechanism needed to reproduce, at the same time, the formation of both the massive and the small galaxies.

  16. Reproducibility of graph metrics of human brain structural networks.

    PubMed

    Duda, Jeffrey T; Cook, Philip A; Gee, James C

    2014-01-01

    Recent interest in human brain connectivity has led to the application of graph theoretical analysis to human brain structural networks, in particular white matter connectivity inferred from diffusion imaging and fiber tractography. While these methods have been used to study a variety of patient populations, there has been less examination of the reproducibility of these methods. A number of tractography algorithms exist and many of these are known to be sensitive to user-selected parameters. The methods used to derive a connectivity matrix from fiber tractography output may also influence the resulting graph metrics. Here we examine how these algorithm and parameter choices influence the reproducibility of proposed graph metrics on a publicly available test-retest dataset consisting of 21 healthy adults. The dice coefficient is used to examine topological similarity of constant density subgraphs both within and between subjects. Seven graph metrics are examined here: mean clustering coefficient, characteristic path length, largest connected component size, assortativity, global efficiency, local efficiency, and rich club coefficient. The reproducibility of these network summary measures is examined using the intraclass correlation coefficient (ICC). Graph curves are created by treating the graph metrics as functions of a parameter such as graph density. Functional data analysis techniques are used to examine differences in graph measures that result from the choice of fiber tracking algorithm. The graph metrics consistently showed good levels of reproducibility as measured with ICC, with the exception of some instability at low graph density levels. The global and local efficiency measures were the most robust to the choice of fiber tracking algorithm.
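
    The Dice coefficient used above to compare constant-density subgraphs is 2|A∩B|/(|A| + |B|) applied to the two edge sets. The sketch below thresholds two hypothetical weighted connectivity matrices to a fixed edge density and compares them; it stands in for, but is not, the study's tractography-derived networks.

    ```python
    import numpy as np

    def top_density_edges(weights, density):
        """Return the edge set of the strongest edges at a given density
        for a symmetric weight matrix (self-loops ignored)."""
        n = weights.shape[0]
        iu = np.triu_indices(n, k=1)
        n_keep = int(round(density * len(iu[0])))
        order = np.argsort(weights[iu])[::-1][:n_keep]
        return {(iu[0][i], iu[1][i]) for i in order}

    def dice(a, b):
        """Dice coefficient between two edge sets."""
        return 2.0 * len(a & b) / (len(a) + len(b))

    rng = np.random.default_rng(1)
    w1 = rng.random((10, 10)); w1 = (w1 + w1.T) / 2                 # "test" scan (hypothetical)
    w2 = w1 + rng.normal(0, 0.05, w1.shape); w2 = (w2 + w2.T) / 2   # "retest" scan

    e1, e2 = (top_density_edges(w, density=0.2) for w in (w1, w2))
    print(f"Dice similarity at 20% density: {dice(e1, e2):.2f}")
    ```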

  17. Enzymes in Analytical Chemistry.

    ERIC Educational Resources Information Center

    Fishman, Myer M.

    1980-01-01

    Presents tabular information concerning recent research in the field of enzymes in analytic chemistry, with methods, substrate or reaction catalyzed, assay, comments and references listed. The table refers to 128 references. Also listed are 13 general citations. (CS)

  18. Mechanostructure and composition of highly reproducible decellularized liver matrices.

    PubMed

    Mattei, G; Di Patria, V; Tirella, A; Alaimo, A; Elia, G; Corti, A; Paolicchi, A; Ahluwalia, A

    2014-02-01

    Despite the increasing number of papers on decellularized scaffolds, there is little consensus on the optimum method of decellularizing biological tissue such that the micro-architecture and protein content of the matrix are conserved as far as possible. Focusing on the liver, the aim of this study was therefore to develop a method for the production of well-characterized and reproducible matrices that best preserves the structure and composition of the native extracellular matrix (ECM). Given the importance of matrix stiffness in regulating cell response, the mechanical properties of the decellularized tissue were also considered. The testing and analysis framework is based on the characterization of decellularized and untreated samples in the same reproducible initial state (i.e., the equilibrium swollen state). Decellularized ECM (dECM) samples were characterized using biochemical, histological, mechanical and structural analyses to identify the best procedure to ensure complete cell removal while preserving most of the native ECM structure and composition. Using this method, sterile decellularized porcine ECM with highly conserved intra-lobular micro-structure and protein content were obtained in a consistent and reproducible manner using the equilibrium swollen state of tissue or matrix as a reference. A significant reduction in the compressive elastic modulus was observed for liver dECM with respect to native tissue, suggesting a re-examination of design parameters for ECM-mimicking scaffolds for engineering tissues in vitro.

  19. An evaluation of RAPD fragment reproducibility and nature.

    PubMed

    Pérez, T; Albornoz, J; Domínguez, A

    1998-10-01

    Random amplified polymorphic DNA (RAPD) fragment reproducibility was assayed in three animal species: red deer (Cervus elaphus), wild boar (Sus scrofa) and fruit fly (Drosophila melanogaster). Ten 10-mer primers (Operon) were tested in two replicate reactions per individual under different stringency conditions (annealing temperatures of 35 degrees C or 45 degrees C). Two estimates were generated from the data: autosimilarity, which tests the reproducibility of overall banding patterns, and band repeatability, which tests the reproducibility of specific bands. Autosimilarity (the similarity of individuals with themselves) was lower than 1 for all three species ranging between values of 0.66 for Drosophila at 45 degrees C and 0.88 for wild boar at 35 degrees C. Band repeatability was estimated as the proportion of individuals showing homologous bands in both replicates. The fraction of repeatable bands was 23% for deer, 36% for boar and 26% for fruit fly, all at an annealing temperature of 35 degrees C. Raising the annealing temperature did not improve repeatability. Phage lambda DNA was subjected to amplification and the pattern of bands compared with theoretical expectations based on nucleotide sequence. Observed fragments could not be related to expected ones, even if a 2 bp mismatch is allowed. Therefore, the nature of genetic variation uncovered by the RAPD method is unclear. These data demonstrate that prudence should guide inferences about population structure and nucleotide divergence based on RAPD markers. PMID:9787445

  20. Representativity and reproducibility of DNA malignancy grading in different carcinomas.

    PubMed

    Böcking, A; Chatelain, R; Homge, M; Daniel, R; Gillissen, A; Wohltmann, D

    1989-04-01

    The reproducibility of the determination of the "DNA malignancy grade" (DNA-MG) was tested in 56 carcinomas of the colon, breast and lung while its representativity was tested on 195 slides from 65 tumors of the colon, breast and lung. DNA measurements were performed on Feulgen-stained smears with the TAS Plus TV-based image analysis system combined with an automated microscope. The variance of the DNA values of tumor cells around the 2c peak, the "2c deviation index" (2cDI), was taken as a basis for the computation of the DNA-MG, which ranges on a continuous scale from 0.01 to 3.00. The representativity, analyzed by comparison of the DNA-MGs measured in three different areas of the same tumor greater than or equal to 1.5 cm apart from each other, yielded an 81% agreement. No significant differences between DNA-MGs of these areas were found. The intraobserver and interobserver reproducibilities of the DNA grading system, investigated by repeated DNA measurements, were 83.9% and 82.2%, respectively. In comparison, histopathologic grading of the 27 breast cancers studied yielded 65% intraobserver and 57% interobserver reproducibilities and 66% representativity.
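
    The abstract above defines the 2c deviation index (2cDI) as the variance of single-cell DNA values around the 2c peak; a minimal sketch of that quantity, using hypothetical DNA content values in c units, follows. The mapping from 2cDI to the 0.01-3.00 DNA-MG scale is not given in the record and is omitted here.

    ```python
    import numpy as np

    def two_c_deviation_index(dna_c_values):
        """2c deviation index (2cDI): variance of single-cell DNA values
        around the normal diploid 2c peak (values expressed in c units)."""
        c = np.asarray(dna_c_values, dtype=float)
        return np.mean((c - 2.0) ** 2)

    # Hypothetical DNA content measurements (c units) for 10 tumour cell nuclei
    cells = [2.1, 2.4, 3.8, 2.0, 5.1, 2.2, 4.4, 2.9, 2.3, 3.5]
    print(f"2cDI = {two_c_deviation_index(cells):.2f}")
    ```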

  1. An evaluation of RAPD fragment reproducibility and nature.

    PubMed

    Pérez, T; Albornoz, J; Domínguez, A

    1998-10-01

    Random amplified polymorphic DNA (RAPD) fragment reproducibility was assayed in three animal species: red deer (Cervus elaphus), wild boar (Sus scrofa) and fruit fly (Drosophila melanogaster). Ten 10-mer primers (Operon) were tested in two replicate reactions per individual under different stringency conditions (annealing temperatures of 35 degrees C or 45 degrees C). Two estimates were generated from the data: autosimilarity, which tests the reproducibility of overall banding patterns, and band repeatability, which tests the reproducibility of specific bands. Autosimilarity (the similarity of individuals with themselves) was lower than 1 for all three species ranging between values of 0.66 for Drosophila at 45 degrees C and 0.88 for wild boar at 35 degrees C. Band repeatability was estimated as the proportion of individuals showing homologous bands in both replicates. The fraction of repeatable bands was 23% for deer, 36% for boar and 26% for fruit fly, all at an annealing temperature of 35 degrees C. Raising the annealing temperature did not improve repeatability. Phage lambda DNA was subjected to amplification and the pattern of bands compared with theoretical expectations based on nucleotide sequence. Observed fragments could not be related to expected ones, even if a 2 bp mismatch is allowed. Therefore, the nature of genetic variation uncovered by the RAPD method is unclear. These data demonstrate that prudence should guide inferences about population structure and nucleotide divergence based on RAPD markers.

  2. Reproducibility of LCA models of crude oil production.

    PubMed

    Vafi, Kourosh; Brandt, Adam R

    2014-11-01

    Scientific models are ideally reproducible, with results that converge despite varying methods. In practice, divergence between models often remains due to varied assumptions, incompleteness, or simply because of avoidable flaws. We examine LCA greenhouse gas (GHG) emissions models to test the reproducibility of their estimates for well-to-refinery inlet gate (WTR) GHG emissions. We use the Oil Production Greenhouse gas Emissions Estimator (OPGEE), an open source engineering-based life cycle assessment (LCA) model, as the reference model for this analysis. We study seven previous studies based on six models. We examine the reproducibility of prior results by successive experiments that align model assumptions and boundaries. The root-mean-square error (RMSE) between results varies between ∼1 and 8 g CO2 eq/MJ LHV when model inputs are not aligned. After model alignment, RMSE generally decreases only slightly. The proprietary nature of some of the models hinders explanations for divergence between the results. Because verification of the results of LCA GHG emissions is often not possible by direct measurement, we recommend the development of open source models for use in energy policy. Such practice will lead to iterative scientific review, improvement of models, and more reliable understanding of emissions.
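
    The root-mean-square error used above to compare paired model estimates is a one-liner; the sketch below uses hypothetical well-to-refinery emissions values, not the study's results.

    ```python
    import numpy as np

    def rmse(a, b):
        """Root-mean-square error between paired model estimates."""
        a, b = np.asarray(a, float), np.asarray(b, float)
        return np.sqrt(np.mean((a - b) ** 2))

    # Hypothetical WTR emissions estimates (g CO2 eq/MJ LHV) for five fields
    opgee_like = [6.2, 9.8, 4.5, 12.1, 7.3]
    other_model = [5.0, 11.2, 4.1, 14.0, 6.5]
    print(f"RMSE = {rmse(opgee_like, other_model):.2f} g CO2 eq/MJ")
    ```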

  3. Git can facilitate greater reproducibility and increased transparency in science

    PubMed Central

    2013-01-01

    Background Reproducibility is the hallmark of good science. Maintaining a high degree of transparency in scientific reporting is essential not just for gaining trust and credibility within the scientific community but also for facilitating the development of new ideas. Sharing data and computer code associated with publications is becoming increasingly common, motivated partly in response to data deposition requirements from journals and mandates from funders. Despite this increase in transparency, it is still difficult to reproduce or build upon the findings of most scientific publications without access to a more complete workflow. Findings Version control systems (VCS), which have long been used to maintain code repositories in the software industry, are now finding new applications in science. One such open source VCS, Git, provides a lightweight yet robust framework that is ideal for managing the full suite of research outputs such as datasets, statistical code, figures, lab notes, and manuscripts. For individual researchers, Git provides a powerful way to track and compare versions, retrace errors, explore new approaches in a structured manner, while maintaining a full audit trail. For larger collaborative efforts, Git and Git hosting services make it possible for everyone to work asynchronously and merge their contributions at any time, all the while maintaining a complete authorship trail. In this paper I provide an overview of Git along with use-cases that highlight how this tool can be leveraged to make science more reproducible and transparent, foster new collaborations, and support novel uses. PMID:23448176

  4. Interrater reproducibility of clinical tests for rotator cuff lesions

    PubMed Central

    Ostor, A; Richards, C; Prevost, A; Hazleman, B; Speed, C

    2004-01-01

    Background: Rotator cuff lesions are common in the community but reproducibility of tests for shoulder assessment has not been adequately appraised and there is no uniform approach to their use. Objective: To study interrater reproducibility of standard tests for shoulder evaluation among a rheumatology specialist, rheumatology trainee, and research nurse. Methods: 136 patients were reviewed over 12 months at a major teaching hospital. The three assessors examined each patient in random order and were unaware of each other's evaluation. Each shoulder was examined in a standard manner by recognised tests for specific lesions and a diagnostic algorithm was used. Between-observer agreement was determined by calculating Cohen's κ coefficients (measuring agreement beyond that expected by chance). Results: Fair to substantial agreement was obtained for the observations of tenderness, painful arc, and external rotation. Tests for supraspinatus and subscapularis also showed at least fair agreement between observers. 40/55 (73%) κ coefficient assessments were rated at >0.2, indicating at least fair concordance between observers; 21/55 (38%) were rated at >0.4, indicating at least moderate concordance between observers. Conclusion: The reproducibility of certain tests, employed by observers of varying experience, in the assessment of the rotator cuff and general shoulder disease was determined. This has implications for delegation of shoulder assessment to nurse specialists, the development of a simplified evaluation schedule for general practitioners, and uniformity in epidemiological research studies. PMID:15361389
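
    Cohen's κ, as used above, corrects observed agreement for the agreement expected by chance: κ = (p_o − p_e) / (1 − p_e). The sketch below shows a minimal two-rater computation on hypothetical positive/negative test calls, not the study's data.

    ```python
    import numpy as np

    def cohens_kappa(rater1, rater2):
        """Cohen's kappa for two raters over the same subjects (categorical labels)."""
        r1, r2 = np.asarray(rater1), np.asarray(rater2)
        labels = np.union1d(r1, r2)
        p_o = np.mean(r1 == r2)                                         # observed agreement
        p_e = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in labels)  # chance agreement
        return (p_o - p_e) / (1.0 - p_e)

    # Hypothetical positive/negative calls for a painful-arc test on 10 shoulders
    nurse   = ["pos", "neg", "pos", "pos", "neg", "neg", "pos", "neg", "pos", "neg"]
    trainee = ["pos", "neg", "neg", "pos", "neg", "neg", "pos", "pos", "pos", "neg"]
    print(f"kappa = {cohens_kappa(nurse, trainee):.2f}")
    ```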

  5. Dosimetric algorithm to reproduce isodose curves obtained from a LINAC.

    PubMed

    Estrada Espinosa, Julio Cesar; Martínez Ovalle, Segundo Agustín; Pereira Benavides, Cinthia Kotzian

    2014-01-01

    In this work, isodose curves are obtained by the use of a new dosimetric algorithm using numerical data from percentage depth dose (PDD) and the maximum absorbed dose profile, calculated by Monte Carlo for an 18 MV LINAC. The software allows the absorbed dose percentage to be reproduced throughout the irradiated volume quickly and with a good approximation. To validate the results, an 18 MV LINAC with its complete geometry and a water phantom were constructed. On this construction, the various simulations were run with the MCNPX code to obtain the PDD and profiles for all depths of the radiation beam. These results were then used by the algorithm to produce the dose percentages at any point of the irradiated volume. The absorbed dose was also reproduced for any voxel size at any point of the irradiated volume, even when the voxels are taken to be the size of a pixel. The dosimetric algorithm is able to reproduce the absorbed dose induced by a radiation beam in a water phantom, considering the PDD and profiles, whose maximum percent value lies in the build-up region. Calculation time for the algorithm is only a few seconds, compared with the days required when the calculation is carried out by Monte Carlo. PMID:25045398

  6. Dosimetric Algorithm to Reproduce Isodose Curves Obtained from a LINAC

    PubMed Central

    Estrada Espinosa, Julio Cesar; Martínez Ovalle, Segundo Agustín; Pereira Benavides, Cinthia Kotzian

    2014-01-01

    In this work, isodose curves are obtained by the use of a new dosimetric algorithm using numerical data from percentage depth dose (PDD) and the maximum absorbed dose profile, calculated by Monte Carlo for an 18 MV LINAC. The software allows the absorbed dose percentage to be reproduced throughout the irradiated volume quickly and with a good approximation. To validate the results, an 18 MV LINAC with its complete geometry and a water phantom were constructed. On this construction, the various simulations were run with the MCNPX code to obtain the PDD and profiles for all depths of the radiation beam. These results were then used by the algorithm to produce the dose percentages at any point of the irradiated volume. The absorbed dose was also reproduced for any voxel size at any point of the irradiated volume, even when the voxels are taken to be the size of a pixel. The dosimetric algorithm is able to reproduce the absorbed dose induced by a radiation beam in a water phantom, considering the PDD and profiles, whose maximum percent value lies in the build-up region. Calculation time for the algorithm is only a few seconds, compared with the days required when the calculation is carried out by Monte Carlo. PMID:25045398
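
    A common simplification of the reconstruction described in the two records above estimates the relative dose at a lateral position x and depth z as PDD(z) × OAR(x), where OAR is the off-axis ratio taken from a normalized beam profile. The sketch below uses that simplification with made-up curves; it is an illustration only, not the authors' algorithm or their Monte Carlo data.

    ```python
    import numpy as np

    # Hypothetical 18 MV PDD (%) versus depth (cm) and off-axis ratio versus x (cm)
    depth_cm = np.array([0.0, 1.0, 3.2, 5.0, 10.0, 20.0, 30.0])
    pdd      = np.array([35., 80., 100., 96., 79., 52., 34.])        # max near 3.2 cm
    x_cm     = np.array([-8., -5., -4., 0., 4., 5., 8.])
    oar      = np.array([0.03, 0.52, 0.97, 1.00, 0.97, 0.52, 0.03])  # normalized profile

    def relative_dose(x, z):
        """Percent dose at lateral position x and depth z, relative to the
        maximum on the central axis (simple PDD x off-axis-ratio model)."""
        return np.interp(z, depth_cm, pdd) * np.interp(x, x_cm, oar)

    print(f"dose at (x=2 cm, z=10 cm): {relative_dose(2.0, 10.0):.1f}%")
    ```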

  7. Planar heterojunction perovskite solar cells with superior reproducibility.

    PubMed

    Jeon, Ye-Jin; Lee, Sehyun; Kang, Rira; Kim, Jueng-Eun; Yeo, Jun-Seok; Lee, Seung-Hoon; Kim, Seok-Soon; Yun, Jin-Mun; Kim, Dong-Yu

    2014-01-01

    Perovskite solar cells (PeSCs) have been considered one of the competitive next generation power sources. To date, light-to-electric conversion efficiencies have rapidly increased to over 10%, and further improvements are expected. However, the poor device reproducibility of PeSCs ascribed to their inhomogeneously covered film morphology has hindered their practical application. Here, we demonstrate high-performance PeSCs with superior reproducibility by introducing small amounts of N-cyclohexyl-2-pyrrolidone (CHP) as a morphology controller into N,N-dimethylformamide (DMF). As a result, highly homogeneous film morphology, similar to that achieved by vacuum-deposition methods, as well as a high PCE of 10% and an extremely small performance deviation within 0.14% were achieved. This study represents a method for realizing efficient and reproducible planar heterojunction (PHJ) PeSCs through morphology control, taking a major step forward in the low-cost and rapid production of PeSCs by solving one of the biggest problems of PHJ perovskite photovoltaic technology through a facile method. PMID:25377945

  8. Planar heterojunction perovskite solar cells with superior reproducibility

    PubMed Central

    Jeon, Ye-Jin; Lee, Sehyun; Kang, Rira; Kim, Jueng-Eun; Yeo, Jun-Seok; Lee, Seung-Hoon; Kim, Seok-Soon; Yun, Jin-Mun; Kim, Dong-Yu

    2014-01-01

    Perovskite solar cells (PeSCs) have been considered one of the competitive next-generation power sources. To date, light-to-electricity conversion efficiencies have rapidly increased to over 10%, and further improvements are expected. However, the poor device reproducibility of PeSCs, ascribed to their inhomogeneously covered film morphology, has hindered their practical application. Here, we demonstrate high-performance PeSCs with superior reproducibility by introducing small amounts of N-cyclohexyl-2-pyrrolidone (CHP) as a morphology controller into N,N-dimethylformamide (DMF). As a result, highly homogeneous film morphology, similar to that achieved by vacuum-deposition methods, as well as a high power conversion efficiency (PCE) of 10% and an extremely small performance deviation within 0.14%, were achieved. This study represents a method for realizing efficient and reproducible planar heterojunction (PHJ) PeSCs through morphology control, taking a major step forward in the low-cost and rapid production of PeSCs by solving one of the biggest problems of PHJ perovskite photovoltaic technology through a facile method. PMID:25377945

  9. Extreme Scale Visual Analytics

    SciTech Connect

    Steed, Chad A; Potok, Thomas E; Pullum, Laura L; Ramanathan, Arvind; Shipman, Galen M; Thornton, Peter E; Potok, Thomas E

    2013-01-01

    Given the scale and complexity of today's data, visual analytics is rapidly becoming a necessity rather than an option for comprehensive exploratory analysis. In this paper, we provide an overview of three applications of visual analytics for addressing the challenges of analyzing climate, text streams, and biosurveillance data. These systems feature varying levels of interaction and high performance computing technology integration to permit exploratory analysis of large and complex data of global significance.

  10. General theory of experiment containing reproducible data: The reduction to an ideal experiment

    NASA Astrophysics Data System (ADS)

    Nigmatullin, Raoul R.; Zhang, Wei; Striccoli, Domenico

    2015-10-01

    The authors propose a general theory that treats all experiments involving measurements of reproducible data within one unified scheme. The suggested algorithm contains no unjustified suppositions, and the final function extracted from the measurements can be compared with the hypothesis suggested by the theory adopted to explain the object or phenomenon studied. This true function is free from the influence of the apparatus (instrumental) function and, when no "best fit" or most acceptable hypothesis is available, can be represented as a segment of a Fourier series. The discrete set of decomposition coefficients describes the final function quantitatively and can serve as an intermediate model that coincides with the amplitude-frequency response (AFR) of the object studied; theoreticians can also use it to compare a proposed theory with experimental observations. Two examples (Raman spectra of distilled water and packet exchange between two wireless sensor nodes) confirm the basic elements of this general theory. The following important conclusions follow from it: 1. Prony's decomposition should be used to detect quasi-periodic processes and to describe reproducible data quantitatively. 2. A segment of the Fourier series should be used as the fitting function for observable data corresponding to an ideal experiment; the transition from the initial Prony decomposition to the conventional Fourier transform also implies elimination of the apparatus function, which plays an important role in reproducible data processing. 3. The suggested theory will be helpful for creating a unified metrological standard (UMS) for comparing similar data obtained from the same object in different laboratories with different equipment. 4. Many cases when the conventional theory confirms the experimental
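
    As a hedged illustration of conclusion 2, the sketch below fits a truncated Fourier series segment to averaged (reproducible) measurements by least squares; the coefficient vector then serves as a compact quantitative description of the final function. The grid, the number of harmonics and the synthetic data are assumptions, and the snippet does not implement the authors' full procedure (which starts from Prony's decomposition).

```python
import numpy as np

def fourier_segment_fit(t, y, n_harmonics=3):
    """Least-squares fit of a truncated Fourier series
    y(t) ~ a0 + sum_k [a_k cos(2*pi*k*(t-t0)/T) + b_k sin(2*pi*k*(t-t0)/T)].
    The coefficient vector is a compact quantitative description of the
    measured (reproducible) curve."""
    T = t[-1] - t[0]
    columns = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        columns.append(np.cos(2 * np.pi * k * (t - t[0]) / T))
        columns.append(np.sin(2 * np.pi * k * (t - t[0]) / T))
    design = np.column_stack(columns)
    coeffs, *_ = np.linalg.lstsq(design, y, rcond=None)
    return coeffs, design @ coeffs  # coefficients and the fitted curve

# Illustrative use on synthetic averaged measurements on a grid t.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 200)
y_mean = 1.0 + 0.5 * np.sin(2 * np.pi * t / 10.0) + 0.05 * rng.standard_normal(t.size)
coeffs, y_fit = fourier_segment_fit(t, y_mean)
print(coeffs[:3])  # a0, a1, b1
```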

  11. Reproducibility and quantitation of amplicon sequencing-based detection.

    PubMed

    Zhou, Jizhong; Wu, Liyou; Deng, Ye; Zhi, Xiaoyang; Jiang, Yi-Huei; Tu, Qichao; Xie, Jianping; Van Nostrand, Joy D; He, Zhili; Yang, Yunfeng

    2011-08-01

    To determine the reproducibility and quantitation of the amplicon sequencing-based detection approach for analyzing microbial community structure, a total of 24 microbial communities from a long-term global change experimental site were examined. Genomic DNA obtained from each community was used to amplify 16S rRNA genes with two or three barcode tags as technical replicates in the presence of a small quantity (0.1% wt/wt) of genomic DNA from Shewanella oneidensis MR-1 as the control. The technical reproducibility of the amplicon sequencing-based detection approach is quite low, with an average operational taxonomic unit (OTU) overlap of 17.2%±2.3% between two technical replicates, and 8.2%±2.3% among three technical replicates, which is most likely due to problems associated with random sampling processes. Such variations in technical replicates could have substantial effects on estimating β-diversity but less on α-diversity. A high variation was also observed in the control across different samples (for example, 66.7-fold for the forward primer), suggesting that the amplicon sequencing-based detection approach could not be quantitative. In addition, various strategies were examined to improve the comparability of amplicon sequencing data, such as increasing biological replicates, and removing singleton sequences and less-representative OTUs across biological replicates. Finally, as expected, various statistical analyses with preprocessed experimental data revealed clear differences in the composition and structure of microbial communities between warming and non-warming, or between clipping and non-clipping. Taken together, these results suggest that amplicon sequencing-based detection is useful in analyzing microbial community structure even though it is not reproducible and quantitative. However, great caution should be taken in experimental design and data interpretation when the amplicon sequencing-based detection approach is used for quantitative
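
    A minimal sketch of how the overlap between technical replicates might be computed is shown below. The OTU identifiers are hypothetical, and the shared-over-union definition of overlap is an assumption for illustration rather than necessarily the authors' exact metric.

```python
# Percentage of OTUs shared between two technical replicates of one sample.
# The OTU identifiers are hypothetical; the overlap is defined here as the
# shared OTUs relative to all OTUs detected in either replicate.
replicate_a = {"OTU_1", "OTU_2", "OTU_3", "OTU_7", "OTU_9"}
replicate_b = {"OTU_2", "OTU_3", "OTU_5", "OTU_9", "OTU_11"}

shared = replicate_a & replicate_b
union = replicate_a | replicate_b
overlap_pct = 100.0 * len(shared) / len(union)
print(f"OTU overlap: {overlap_pct:.1f}%")  # 3 shared out of 7 -> ~42.9%
```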

  12. Validity and Reproducibility of a Spanish Dietary History

    PubMed Central

    Guallar-Castillón, Pilar; Sagardui-Villamor, Jon; Balboa-Castillo, Teresa; Sala-Vila, Aleix; Ariza Astolfi, Mª José; Sarrión Pelous, Mª Dolores; León-Muñoz, Luz María; Graciani, Auxiliadora; Laclaustra, Martín; Benito, Cristina; Banegas, José Ramón; Artalejo, Fernando Rodríguez

    2014-01-01

    Objective To assess the validity and reproducibility of food and nutrient intake estimated with the electronic diet history of ENRICA (DH-E), which collects information on numerous aspects of the Spanish diet. Methods The validity of food and nutrient intake was estimated using Pearson correlation coefficients between the DH-E and the mean of seven 24-hour recalls collected every 2 months over the previous year. The reproducibility was estimated using intraclass correlation coefficients between two DH-E administered one year apart. Results The correlation coefficients between the DH-E and the mean of seven 24-hour recalls for the main food groups were cereals (r = 0.66), meat (r = 0.66), fish (r = 0.42), vegetables (r = 0.62) and fruits (r = 0.44). The mean correlation coefficient for all 15 food groups considered was 0.53. The correlations for macronutrients were: energy (r = 0.76), proteins (r = 0.58), lipids (r = 0.73), saturated fat (r = 0.73), monounsaturated fat (r = 0.59), polyunsaturated fat (r = 0.57), and carbohydrates (r = 0.66). The mean correlation coefficient for all 41 nutrients studied was 0.55. The intraclass correlation coefficient between the two DH-E was greater than 0.40 for most foods and nutrients. Conclusions The DH-E shows good validity and reproducibility for estimating usual intake of foods and nutrients. PMID:24465878
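
    For readers who want to reproduce this style of validation, the sketch below computes a Pearson correlation (validity) and a simple one-way intraclass correlation (reproducibility) with NumPy and SciPy on hypothetical intake values; the ICC variant and all numbers are assumptions, not the study's data or exact statistical model.

```python
import numpy as np
from scipy.stats import pearsonr

def icc_oneway(data):
    """One-way random-effects ICC(1,1) for an n_subjects x k_measurements array.
    Chosen as a simple illustrative form; the study does not state which
    ICC variant was used."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    subject_means = data.mean(axis=1)
    grand_mean = data.mean()
    msb = k * np.sum((subject_means - grand_mean) ** 2) / (n - 1)       # between subjects
    msw = np.sum((data - subject_means[:, None]) ** 2) / (n * (k - 1))  # within subjects
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical energy intakes (kcal/day): diet history vs. mean of 24-hour
# recalls (validity), and two administrations one year apart (reproducibility).
dh_e       = [2100, 1850, 2400, 1990, 2250, 1700]
recalls    = [2000, 1900, 2300, 2050, 2150, 1750]
dh_e_year2 = [2050, 1900, 2350, 1950, 2300, 1720]

r, _ = pearsonr(dh_e, recalls)
icc = icc_oneway(np.column_stack([dh_e, dh_e_year2]))
print(f"validity (Pearson r): {r:.2f}")
print(f"reproducibility (ICC): {icc:.2f}")
```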

  13. Repeatability and Reproducibility of Decisions by Latent Fingerprint Examiners

    PubMed Central

    Ulery, Bradford T.; Hicklin, R. Austin; Buscaglia, JoAnn; Roberts, Maria Antonia

    2012-01-01

    The interpretation of forensic fingerprint evidence relies on the expertise of latent print examiners. We tested latent print examiners on the extent to which they reached consistent decisions. This study assessed intra-examiner repeatability by retesting 72 examiners on comparisons of latent and exemplar fingerprints, after an interval of approximately seven months; each examiner was reassigned 25 image pairs for comparison, out of a total pool of 744 image pairs. We compare these repeatability results with reproducibility (inter-examiner) results derived from our previous study. Examiners repeated 89.1% of their individualization decisions, and 90.1% of their exclusion decisions; most of the changed decisions resulted in inconclusive decisions. Repeatability of comparison decisions (individualization, exclusion, inconclusive) was 90.0% for mated pairs, and 85.9% for nonmated pairs. Repeatability and reproducibility were notably lower for comparisons assessed by the examiners as “difficult” than for “easy” or “moderate” comparisons, indicating that examiners' assessments of difficulty may be useful for quality assurance. No false positive errors were repeated (n = 4); 30% of false negative errors were repeated. One percent of latent value decisions were completely reversed (no value even for exclusion vs. of value for individualization). Most of the inter- and intra-examiner variability concerned whether the examiners considered the information available to be sufficient to reach a conclusion; this variability was concentrated on specific image pairs such that repeatability and reproducibility were very high on some comparisons and very low on others. Much of the variability appears to be due to making categorical decisions in borderline cases. PMID:22427888

  14. Reproducibility and Transparency in Ocean-Climate Modeling

    NASA Astrophysics Data System (ADS)

    Hannah, N.; Adcroft, A.; Hallberg, R.; Griffies, S. M.

    2015-12-01

    Reproducibility is a cornerstone of the scientific method. Within geophysical modeling and simulation, achieving reproducibility can be difficult, especially given the complexity of numerical codes, enormous and disparate data sets, and the variety of supercomputing technology. We have made progress on this problem in the context of a large project: the development of new ocean and sea ice models, MOM6 and SIS2. Here we present useful techniques and experience. We use version control not only for code but for the entire experiment working directory, including configuration (run-time parameters, component versions), input data and checksums on experiment output. This allows us to document when the solutions to experiments change, whether due to code updates or changes in input data. To avoid distributing large input datasets we provide the tools for generating these from the sources, rather than providing raw input data. Bugs can be a source of non-determinism and hence irreproducibility, e.g. reading from or branching on uninitialized memory. To expose these we routinely run system tests, using a memory debugger, multiple compilers and different machines. Additional confidence in the code comes from specialised tests, for example automated dimensional analysis and domain transformations. This has entailed adopting a code style where we deliberately restrict what a compiler can do when re-arranging mathematical expressions. In the spirit of open science, all development is in the public domain. This leads to a positive feedback, where increased transparency and reproducibility make using the model easier for external collaborators, who in turn provide valuable contributions. To facilitate users installing and running the model, we provide (version-controlled) digital notebooks that illustrate and record analysis of output. These have the dual role of providing a gross, platform-independent testing capability and a means to document model output and analysis.
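
    A minimal sketch of the output-checksum idea is shown below, assuming a directory of run outputs whose SHA-256 digests are written to a small manifest that can be committed alongside the run-time configuration; the paths and manifest format are hypothetical and are not MOM6/SIS2 tooling.

```python
import hashlib
import json
from pathlib import Path

def checksum_outputs(output_dir, manifest_path="checksums.json"):
    """Compute SHA-256 checksums for every file under an experiment's output
    directory and write them to a manifest that can be committed alongside
    the run-time configuration. Comparing manifests between commits shows
    whether the solutions have changed."""
    output_dir = Path(output_dir)
    manifest = {}
    for path in sorted(output_dir.rglob("*")):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            manifest[str(path.relative_to(output_dir))] = digest
    Path(manifest_path).write_text(json.dumps(manifest, indent=2))
    return manifest

# Hypothetical usage after a model run:
# checksum_outputs("experiments/ocean_only/run_001")
```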

  15. Venusian Polar Vortex reproduced by a general circulation model

    NASA Astrophysics Data System (ADS)

    Ando, Hiroki; Sugimoto, Norihiko; Takagi, Masahiro

    2016-10-01

    Unlike the polar vortices observed in the atmospheres of Earth, Mars and Titan, the observed Venus polar vortex is warmer than the mid-latitudes at cloud-top levels (~65 km). This warm polar vortex is zonally surrounded by a cold latitude band located at ~60 degrees latitude, a unique feature of the Venus atmosphere called the 'cold collar' [e.g. Taylor et al. 1980; Piccioni et al. 2007]. Although these structures have been seen in numerous previous observations, their formation mechanism is still unknown. In addition, an axi-asymmetric feature is always present in the warm polar vortex; it changes with time and sometimes shows a hot polar dipole or S-shaped structure, as shown by many infrared measurements [e.g. Garate-Lopez et al. 2013; 2015], but its vertical structure has not been investigated. To address these problems, we performed a numerical simulation of the Venus atmospheric circulation using a general circulation model named AFES for Venus [Sugimoto et al. 2014] and reproduced these puzzling features. The reproduced atmospheric structures and the axi-asymmetric feature are compared with previous observational results. In addition, a quasi-periodic fluctuation of the zonal-mean zonal wind is also seen in the Venus polar vortex reproduced in our model. This might explain some observational results [e.g. Luz et al. 2007] and implies that a polar vacillation, similar to that in the Earth's polar atmosphere, might also occur in the Venus atmosphere. We will also show some initial results on this point in this presentation.

  16. Reproducing continuous radio blackout using glow discharge plasma

    SciTech Connect

    Xie, Kai; Li, Xiaoping; Liu, Donglin; Shao, Mingxu; Zhang, Hanlu

    2013-10-15

    A novel plasma generator is described that offers large-scale, continuous, non-magnetized plasma with a 30-cm-diameter hollow structure, which provides a path for an electromagnetic wave. The plasma is excited by a low-pressure glow discharge, with electron densities ranging from 10⁹ to 2.5 × 10¹¹ cm⁻³. An electromagnetic wave propagation experiment reproduced a continuous radio blackout in UHF-, L-, and S-bands. The results are consistent with theoretical expectations. The proposed method is suitable for simulating a plasma sheath and for research on communications, navigation, electromagnetic mitigation, and antenna compensation in plasma sheaths.

  17. Data quality in predictive toxicology: reproducibility of rodent carcinogenicity experiments.

    PubMed Central

    Gottmann, E; Kramer, S; Pfahringer, B; Helma, C

    2001-01-01

    We compared 121 replicate rodent carcinogenicity assays from the two parts (National Cancer Institute/National Toxicology Program and literature) of the Carcinogenic Potency Database (CPDB) to estimate the reliability of these experiments. We estimated a concordance of 57% between the overall rodent carcinogenicity classifications from both sources. This value did not improve substantially when additional biologic information (species, sex, strain, target organs) was considered. These results indicate that rodent carcinogenicity assays are much less reproducible than previously expected, an effect that should be considered in the development of structure-activity relationship models and the risk assessment process. PMID:11401763

  18. Inter-study reproducibility of cardiovascular magnetic resonance tagging

    PubMed Central

    2013-01-01

    Background The aim of this study is to determine the test-retest reliability of the measurement of regional myocardial function by cardiovascular magnetic resonance (CMR) tagging using spatial modulation of magnetization. Methods Twenty-five participants underwent CMR tagging twice over 12 ± 7 days. To assess the role of slice orientation on strain measurement, two healthy volunteers had a first exam, followed by image acquisition repeated with slices rotated ±15 degrees out of true short axis, followed by a second exam in the true short axis plane. To assess the role of slice location, two healthy volunteers had whole heart tagging. The harmonic phase (HARP) method was used to analyze the tagged images. Peak midwall circumferential strain (Ecc), radial strain (Err), Lambda 1, Lambda 2, and Angle α were determined in basal, mid and apical slices. LV torsion, systolic and early diastolic circumferential strain and torsion rates were also determined. Results LV Ecc and torsion had excellent intra-, interobserver, and inter-study intra-class correlation coefficients (ICC range, 0.7 to 0.9). Err, Lambda 1, Lambda 2 and Angle had better intra- and interobserver ICCs than inter-study ICCs; Angle had the lowest inter-study reproducibility. Torsion rates had superior intra-, interobserver, and inter-study reproducibility to strain rates. The measurements of LV Ecc were comparable in all three slices with different short axis orientations (standard deviation of mean Ecc was 0.09, 0.18 and 0.16 at basal, mid and apical slices, respectively). The mean difference in LV Ecc between slices was more pronounced in most of the basal slices compared to the rest of the heart. Conclusions Intraobserver and interobserver reproducibility of all strain and torsion parameters was excellent. Inter-study reproducibility of CMR tagging by SPAMM varied between parameters as described above and was superior for Ecc and LV torsion. The variation in LV Ecc

  19. Towards interoperable and reproducible QSAR analyses: Exchange of datasets

    PubMed Central

    2010-01-01

    Background QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaborations and re-use of data. Results We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Conclusions Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusions regarding descriptors by defining them crisply. This makes it easy to join, extend, and combine datasets

  20. Multi-Parametric Neuroimaging Reproducibility: A 3T Resource Study

    PubMed Central

    Landman, Bennett A.; Huang, Alan J.; Gifford, Aliya; Vikram, Deepti S.; Lim, Issel Anne L.; Farrell, Jonathan A.D.; Bogovic, John A.; Hua, Jun; Chen, Min; Jarso, Samson; Smith, Seth A.; Joel, Suresh; Mori, Susumu; Pekar, James J.; Barker, Peter B.; Prince, Jerry L.; van Zijl, Peter C.M.

    2010-01-01

    Modern MRI image processing methods have yielded quantitative, morphometric, functional, and structural assessments of the human brain. These analyses typically exploit carefully optimized protocols for specific imaging targets. Algorithm investigators have several excellent public data resources to use to test, develop, and optimize their methods. Recently, there has been an increasing focus on combining MRI protocols in multi-parametric studies. Notably, these have included innovative approaches for fusing connectivity inferences with functional and/or anatomical characterizations. Yet, validation of the reproducibility of these interesting and novel methods has been severely hampered by the limited availability of appropriate multi-parametric data. We present an imaging protocol optimized to include state-of-the-art assessment of brain function, structure, micro-architecture, and quantitative parameters within a clinically feasible 60 minute protocol on a 3T MRI scanner. We present scan-rescan reproducibility of these imaging contrasts based on 21 healthy volunteers (11 M/10 F, 22–61 y/o). The cortical gray matter, cortical white matter, ventricular cerebrospinal fluid, thalamus, putamen, caudate, cerebellar gray matter, cerebellar white matter, and brainstem were identified with mean volume-wise reproducibility of 3.5%. We tabulate the mean intensity, variability and reproducibility of each contrast in a region of interest approach, which is essential for prospective study planning and retrospective power analysis considerations. Anatomy was highly consistent on structural acquisition (~1–5% variability), while variation on diffusion and several other quantitative scans was higher (~<10%). Some sequences are particularly variable in specific structures (ASL exhibited variation of 28% in the cerebral white matter) or in thin structures (quantitative T2 varied by up to 73% in the caudate) due, in large part, to variability in automated ROI placement. The

  1. Meal Replacement Mass Reduction and Integration Acceptability Study

    NASA Technical Reports Server (NTRS)

    Sirmons, T.; Cooper, M.; Douglas, G.; Barrett, A.; Richardson, M.; Arias, D.; Schneiderman, J.; Slack, K.; Ploutz-Snyder, R.

    2016-01-01

    NASA, in planning for long duration missions, has an imperative to provide a food system with the necessary nutrition, acceptability, and safety to ensure sustainment of crew health and performance. The Orion Multi-Purpose Crew Vehicle (MPCV) and future exploration missions are mass constrained; therefore we are challenged to reduce the mass of the food system by 10% while maintaining safety, nutrition, and acceptability for exploration missions. Food bars have previously been used to supplement meals in the Skylab food system, indicating that regular consumption of bars will be acceptable. However, commercially available products do not meet the requirements for a full meal replacement in the spaceflight food system. The purpose of this task is to develop a variety of nutritionally balanced breakfast replacement bars, which meet spaceflight nutritional, microbiological, sensorial, and shelf-life requirements, while enabling a 10% food mass savings. To date, six nutrient-dense meal replacement bars have been developed, using both traditional methods of compression as well as novel ultrasonic compression technologies developed by Creative Resonance Inc. (Phoenix, AZ). All bars will be prioritized based on acceptability and the four top candidates will be evaluated in the Human Exploration Research Analog (HERA) to assess the frequency with which actual meal replacement options may be implemented. Specifically, overall impact to mood, satiety, dietary discomfort, and satisfaction with food will be analyzed to inform successful implementation strategies. In addition, these bars will be evaluated based on final product sensory acceptability, nutritional stability, qualitative stability of analytical measurements (i.e. water activity and texture), and microbiological compliance over two years of storage at room temperature and potential temperature abuse conditions to predict long-term acceptability. It is expected that this work will enable a successful meal

  2. Advances in analytical chemistry

    NASA Technical Reports Server (NTRS)

    Arendale, W. F.; Congo, Richard T.; Nielsen, Bruce J.

    1991-01-01

    Implementation of computer programs based on multivariate statistical algorithms makes possible obtaining reliable information from long data vectors that contain large amounts of extraneous information, for example, noise and/or analytes that we do not wish to control. Three examples are described. Each of these applications requires the use of techniques characteristic of modern analytical chemistry. The first example, using a quantitative or analytical model, describes the determination of the acid dissociation constant for 2,2'-pyridyl thiophene using archived data. The second example describes an investigation to determine the active biocidal species of iodine in aqueous solutions. The third example is taken from a research program directed toward advanced fiber-optic chemical sensors. The second and third examples require heuristic or empirical models.

  3. Competing on talent analytics.

    PubMed

    Davenport, Thomas H; Harris, Jeanne; Shapiro, Jeremy

    2010-10-01

    Do investments in your employees actually affect workforce performance? Who are your top performers? How can you empower and motivate other employees to excel? Leading-edge companies such as Google, Best Buy, Procter & Gamble, and Sysco use sophisticated data-collection technology and analysis to answer these questions, leveraging a range of analytics to improve the way they attract and retain talent, connect their employee data to business performance, differentiate themselves from competitors, and more. The authors present the six key ways in which companies track, analyze, and use data about their people, ranging from a simple baseline of metrics to monitor the organization's overall health to custom modeling for predicting future head count depending on various "what if" scenarios. They go on to show that companies competing on talent analytics manage data and technology at an enterprise level, support what analytical leaders do, choose realistic targets for analysis, and hire analysts with strong interpersonal skills as well as broad expertise.

  4. Transforming labor-management practices through real-time analytics.

    PubMed

    Nippert, Kathye Habig; Graves, Brian

    2012-06-01

    Catholic Health Partners (CHP) decentralized productivity management, giving its regional executives greater control over their productivity tools and data. CHP retained centralized management of its benchmarking and analytics and created an enterprise database with standardized information. CHP's stakeholders shared accountability and accepted greater responsibility for labor-management decisions.

  5. Paper-based inkjet-printed microfluidic analytical devices.

    PubMed

    Yamada, Kentaro; Henares, Terence G; Suzuki, Koji; Citterio, Daniel

    2015-04-27

    Rapid, precise, and reproducible deposition of a broad variety of functional materials, including analytical assay reagents and biomolecules, has made inkjet printing an effective tool for the fabrication of microanalytical devices. A ubiquitous office device as simple as a standard desktop printer with its multiple ink cartridges can be used for this purpose. This Review discusses the combination of inkjet printing technology with paper as a printing substrate for the fabrication of microfluidic paper-based analytical devices (μPADs), which have developed into a fast-growing new field in analytical chemistry. After introducing the fundamentals of μPADs and inkjet printing, it touches on topics such as the microfluidic patterning of paper, tailored arrangement of materials, and functionalities achievable exclusively by the inkjet deposition of analytical assay components, before concluding with an outlook on future perspectives.

  6. Frontiers in analytical chemistry

    SciTech Connect

    Amato, I.

    1988-12-15

    Doing more with less was the modus operandi of R. Buckminster Fuller, the late science genius and inventor of such things as the geodesic dome. In late September, chemists described their own version of this maxim, learning more chemistry from less material and in less time, in a symposium titled Frontiers in Analytical Chemistry at the 196th National Meeting of the American Chemical Society in Los Angeles. Symposium organizer Allen J. Bard of the University of Texas at Austin assembled six speakers, himself among them, to survey widely differing areas of analytical chemistry.

  7. Monitoring the analytic surface.

    PubMed

    Spence, D P; Mayes, L C; Dahl, H

    1994-01-01

    How do we listen during an analytic hour? Systematic analysis of the speech patterns of one patient (Mrs. C.) strongly suggests that the clustering of shared pronouns (e.g., you/me) represents an important aspect of the analytic surface, preconsciously sensed by the analyst and used by him to determine when to intervene. Sensitivity to these patterns increases over the course of treatment, and in a final block of 10 hours shows a striking degree of contingent responsivity: specific utterances by the patient are consistently echoed by the analyst's interventions. PMID:8182248

  8. Standardization of DOE Disposal Facilities Waste Acceptance Processes

    SciTech Connect

    Shrader, T. A.; Macbeth, P. J.

    2002-02-26

    On February 25, 2000, the U.S. Department of Energy (DOE) issued the Record of Decision (ROD) for the Waste Management Programmatic Environmental Impact Statement (WM PEIS) for low-level and mixed low-level wastes (LLW/MLLW) treatment and disposal. The ROD designated the disposal sites at Hanford and the Nevada Test Site (NTS) to dispose of LLW/MLLW from sites without their own disposal facilities. DOE's Richland Operations Office (RL) and the National Nuclear Security Administration's Nevada Operations Office (NV) have been charged with effectively implementing the ROD. To accomplish this task NV and RL, assisted by their operating contractors Bechtel Nevada (BN), Fluor Hanford (FH), and Bechtel Hanford (BH) assembled a task team to systematically map out and evaluate the current waste acceptance processes and develop an integrated, standardized process for the acceptance of LLW/MLLW. A structured, systematic, analytical process using the Six Sigma system identified disposal process improvements and quantified the associated efficiency gains to guide changes to be implemented. The review concluded that a unified and integrated Hanford/NTS Waste Acceptance Process would be a benefit to the DOE Complex, particularly the waste generators. The Six Sigma review developed quantitative metrics to address waste acceptance process efficiency improvements, and provides an initial look at development of comparable waste disposal cost models between the two disposal sites to allow quantification of the proposed improvements.

  9. Standardization of DOE Disposal Facilities Waste Acceptance Process

    SciTech Connect

    SHRADER, T.; MACBETH, P.

    2002-01-01

    On February 25, 2000, the U.S. Department of Energy (DOE) issued the Record of Decision (ROD) for the Waste Management Programmatic Environmental Impact Statement (WM PEIS) for low-level and mixed low-level wastes (LLW/MLLW) treatment and disposal. The ROD designated the disposal sites at Hanford and the Nevada Test Site (NTS) to dispose of LLW/MLLW from sites without their own disposal facilities. DOE's Richland Operations Office (RL) and the National Nuclear Security Administration's Nevada Operations Office (NV) have been charged with effectively implementing the ROD. To accomplish this task NV and RL, assisted by their operating contractors Bechtel Nevada (BN), Fluor Hanford (FH), and Bechtel Hanford (BH) assembled a task team to systematically map out and evaluate the current waste acceptance processes and develop an integrated, standardized process for the acceptance of LLW/MLLW. A structured, systematic, analytical process using the Six Sigma system identified disposal process improvements and quantified the associated efficiency gains to guide changes to be implemented. The review concluded that a unified and integrated Hanford/NTS Waste Acceptance Process would be a benefit to the DOE Complex, particularly the waste generators. The Six Sigma review developed quantitative metrics to address waste acceptance process efficiency improvements, and provides an initial look at development of comparable waste disposal cost models between the two disposal sites to allow quantification of the proposed improvements.

  10. Reproducibility of the anti-Factor Xa and anti-Factor IIa assays applied to enoxaparin solution.

    PubMed

    Martinez, Céline; Savadogo, Adama; Agut, Christophe; Anger, Pascal

    2013-01-01

    Enoxaparin is a widely used subcutaneously administered antithrombotic agent comprising a complex mixture of glycosaminoglycan chains. Owing to this complexity, its antithrombotic potency cannot be defined by physicochemical methods and is therefore evaluated using an enzymatic assay of anti-Xa and anti-IIa activity. Maintaining consistent anti-Xa activity in the final medicinal product allows physicians to ensure administration of the appropriate dosage to their patients. Bioassays are usually complex and display poorer reproducibility than physicochemical tests such as HPLC assays. Here, we describe the implementation of a common robotic platform and standard release potency testing procedures for enoxaparin sodium injection (Lovenox, Sanofi, Paris, France) products at seven quality control sites within Sanofi. Qualification and analytical procedures, as well as data handling, were optimized and harmonized to improve assay reproducibility. An inter-laboratory study was performed in routine-release conditions. The coefficients of variation for repeatability and reproducibility in assessments of anti-Xa activity were 1.0% and 1.2%, respectively. The tolerance interval in reproducibility precision conditions, expressed as percentage potency, was 96.8-103.2% of the drug product target of 10,000 IU/ml, comparing favorably with the United States of America Pharmacopeia specification (90-110%). The maximum difference between assays in two different laboratories is expected to be 4.1%. The reproducibility characteristics of anti-IIa activity assessments were found to be similar. These results demonstrate the effectiveness of the standardization process established and allow for further improvements to quality control in Lovenox manufacture. This process guarantees closeness between actual and target potencies, as exemplified by the results of release assays obtained during a three-year period.
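
    As a hedged illustration of how repeatability (within-laboratory) and reproducibility (between-laboratory) coefficients of variation can be estimated from an inter-laboratory study, the sketch below applies a simple one-way variance decomposition (in the spirit of ISO 5725) to hypothetical anti-Xa potencies with equal replicates per laboratory; it is not the validated procedure described in the paper.

```python
import numpy as np

# Hypothetical anti-Xa potencies (IU/ml) from replicate assays in three labs;
# equal numbers of replicates per laboratory are assumed.
lab_results = {
    "lab_1": [10010, 9950, 10080],
    "lab_2": [10120, 10060, 10150],
    "lab_3": [9900, 9980, 9940],
}

values = [np.asarray(v, dtype=float) for v in lab_results.values()]
n_rep = len(values[0])
grand_mean = np.mean(np.concatenate(values))

# Repeatability: pooled within-laboratory variance.
s2_within = np.mean([v.var(ddof=1) for v in values])
# Reproducibility: between-laboratory variance component added to the within term.
lab_means = np.array([v.mean() for v in values])
s2_between = max(lab_means.var(ddof=1) - s2_within / n_rep, 0.0)
s2_repro = s2_within + s2_between

print(f"repeatability CV:   {100 * np.sqrt(s2_within) / grand_mean:.2f}%")
print(f"reproducibility CV: {100 * np.sqrt(s2_repro) / grand_mean:.2f}%")
```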

  11. Reproducibility of the anti-Factor Xa and anti-Factor IIa assays applied to enoxaparin solution.

    PubMed

    Martinez, Céline; Savadogo, Adama; Agut, Christophe; Anger, Pascal

    2013-01-01

    Enoxaparin is a widely used subcutaneously administered antithrombotic agent comprising a complex mixture of glycosaminoglycan chains. Owing to this complexity, its antithrombotic potency cannot be defined by physicochemical methods and is therefore evaluated using an enzymatic assay of anti-Xa and anti-IIa activity. Maintaining consistent anti-Xa activity in the final medicinal product allows physicians to ensure administration of the appropriate dosage to their patients. Bioassays are usually complex and display poorer reproducibility than physicochemical tests such as HPLC assays. Here, we describe the implementation of a common robotic platform and standard release potency testing procedures for enoxaparin sodium injection (Lovenox, Sanofi, Paris, France) products at seven quality control sites within Sanofi. Qualification and analytical procedures, as well as data handling, were optimized and harmonized to improve assay reproducibility. An inter-laboratory study was performed in routine-release conditions. The coefficients of variation for repeatability and reproducibility in assessments of anti-Xa activity were 1.0% and 1.2%, respectively. The tolerance interval in reproducibility precision conditions, expressed as percentage potency, was 96.8-103.2% of the drug product target of 10,000 IU/ml, comparing favorably with the United States of America Pharmacopeia specification (90-110%). The maximum difference between assays in two different laboratories is expected to be 4.1%. The reproducibility characteristics of anti-IIa activity assessments were found to be similar. These results demonstrate the effectiveness of the standardization process established and allow for further improvements to quality control in Lovenox manufacture. This process guarantees closeness between actual and target potencies, as exemplified by the results of release assays obtained during a three-year period. PMID:23644908

  12. Robust estimation of fractal measures for characterizing the structural complexity of the human brain: optimization and reproducibility.

    PubMed

    Goñi, Joaquín; Sporns, Olaf; Cheng, Hu; Aznárez-Sanado, Maite; Wang, Yang; Josa, Santiago; Arrondo, Gonzalo; Mathews, Vincent P; Hummer, Tom A; Kronenberger, William G; Avena-Koenigsberger, Andrea; Saykin, Andrew J; Pastor, María A

    2013-12-01

    High-resolution isotropic three-dimensional reconstructions of human brain gray and white matter structures can be characterized to quantify aspects of their shape, volume and topological complexity. In particular, methods based on fractal analysis have been applied in neuroimaging studies to quantify the structural complexity of the brain in both healthy and impaired conditions. The usefulness of such measures for characterizing individual differences in brain structure critically depends on their within-subject reproducibility in order to allow the robust detection of between-subject differences. This study analyzes key analytic parameters of three fractal-based methods that rely on the box-counting algorithm with the aim to maximize within-subject reproducibility of the fractal characterizations of different brain objects, including the pial surface, the cortical ribbon volume, the white matter volume and the gray matter/white matter boundary. Two separate datasets originating from different imaging centers were analyzed, comprising 50 subjects with three and 24 subjects with four successive scanning sessions per subject, respectively. The reproducibility of fractal measures was statistically assessed by computing their intra-class correlations. Results reveal differences between different fractal estimators and allow the identification of several parameters that are critical for high reproducibility. Highest reproducibility with intra-class correlations in the range of 0.9-0.95 is achieved with the correlation dimension. Further analyses of the fractal dimensions of parcellated cortical and subcortical gray matter regions suggest robustly estimated and region-specific patterns of individual variability. These results are valuable for defining appropriate parameter configurations when studying changes in fractal descriptors of human brain structure, for instance in studies of neurological diseases that do not allow repeated measurements or for disease
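
    A minimal sketch of the box-counting estimator on a binary 3D mask is given below; the box sizes, the padding strategy and the synthetic volume are assumptions for illustration, and they correspond to exactly the kind of analytic parameters whose choice the study shows to affect reproducibility.

```python
import numpy as np

def box_counting_dimension(mask, box_sizes=(2, 4, 8, 16, 32)):
    """Estimate the box-counting fractal dimension of a binary 3D mask
    (e.g. a segmented brain structure). For each box size s, count the boxes
    containing at least one foreground voxel, then regress log N(s) on
    log(1/s); the slope is the dimension estimate."""
    counts = []
    for s in box_sizes:
        # Pad so every dimension is divisible by the box size, then block-reduce.
        pad = [(0, (-dim) % s) for dim in mask.shape]
        m = np.pad(mask, pad)
        blocks = m.reshape(m.shape[0] // s, s, m.shape[1] // s, s, m.shape[2] // s, s)
        counts.append(blocks.any(axis=(1, 3, 5)).sum())
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)), np.log(counts), 1)
    return slope

# Illustrative use on a random binary volume (a real study would use a
# segmented anatomical mask).
rng = np.random.default_rng(0)
volume = rng.random((64, 64, 64)) > 0.7
print(f"estimated box-counting dimension: {box_counting_dimension(volume):.2f}")
```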

  13. A Bayesian Perspective on the Reproducibility Project: Psychology.

    PubMed

    Etz, Alexander; Vandekerckhove, Joachim

    2016-01-01

    We revisit the results of the recent Reproducibility Project: Psychology by the Open Science Collaboration. We compute Bayes factors, a quantity that can be used to express comparative evidence for a hypothesis but also for the null hypothesis, for a large subset (N = 72) of the original papers and their corresponding replication attempts. In our computation, we take into account the likely scenario that publication bias had distorted the originally published results. Overall, 75% of studies gave qualitatively similar results in terms of the amount of evidence provided. However, the evidence was often weak (i.e., Bayes factor < 10). The majority of the studies (64%) did not provide strong evidence for either the null or the alternative hypothesis in either the original or the replication, and no replication attempts provided strong evidence in favor of the null. In all cases where the original paper provided strong evidence but the replication did not (15%), the sample size in the replication was smaller than the original. Where the replication provided strong evidence but the original did not (10%), the replication sample size was larger. We conclude that the apparent failure of the Reproducibility Project to replicate many target effects can be adequately explained by overestimation of effect sizes (or overestimation of evidence against the null hypothesis) due to small sample sizes and publication bias in the psychological literature. We further conclude that traditional sample sizes are insufficient and that a more widespread adoption of Bayesian methods is desirable. PMID:26919473
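
    The authors computed formal Bayes factors; purely as a rough illustration of the idea, the sketch below uses the common BIC approximation to the Bayes factor for a two-group mean comparison on hypothetical data. It is an assumption-laden stand-in, not the paper's method.

```python
import numpy as np

def bic_bayes_factor_two_sample(x, y):
    """Rough BIC approximation of the Bayes factor BF10 (alternative over null)
    for a difference in means between two groups: BF10 ~ exp((BIC0 - BIC1) / 2),
    where model 0 assumes a common mean and model 1 assumes group-specific means."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    data = np.concatenate([x, y])
    n = data.size
    rss0 = np.sum((data - data.mean()) ** 2)                          # one mean parameter
    rss1 = np.sum((x - x.mean()) ** 2) + np.sum((y - y.mean()) ** 2)  # two mean parameters
    bic0 = n * np.log(rss0 / n) + 1 * np.log(n)
    bic1 = n * np.log(rss1 / n) + 2 * np.log(n)
    return np.exp((bic0 - bic1) / 2.0)

# Hypothetical treatment and control scores from a single study (illustrative only).
rng = np.random.default_rng(1)
treatment = rng.normal(0.4, 1.0, 30)
control = rng.normal(0.0, 1.0, 30)
print(f"approximate BF10: {bic_bayes_factor_two_sample(treatment, control):.2f}")
```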

  14. Selection on soil microbiomes reveals reproducible impacts on plant function.

    PubMed

    Panke-Buisse, Kevin; Poole, Angela C; Goodrich, Julia K; Ley, Ruth E; Kao-Kniffin, Jenny

    2015-04-01

    Soil microorganisms found in the root zone impact plant growth and development, but the potential to harness these benefits is hampered by the sheer abundance and diversity of the players influencing desirable plant traits. Here, we report a high level of reproducibility of soil microbiomes in altering plant flowering time and soil functions when partnered within and between plant hosts. We used a multi-generation experimental system using Arabidopsis thaliana Col to select for soil microbiomes inducing earlier or later flowering times of their hosts. We then inoculated the selected microbiomes from the tenth generation of plantings into the soils of three additional A. thaliana genotypes (Ler, Be, RLD) and a related crucifer (Brassica rapa). With the exception of Ler, all other plant hosts showed a shift in flowering time corresponding with the inoculation of early- or late-flowering microbiomes. Analysis of the soil microbial community using 16S rRNA gene sequencing showed distinct microbiota profiles assembling by flowering time treatment. Plant hosts grown with the late-flowering-associated microbiomes showed consequent increases in inflorescence biomass for three A. thaliana genotypes and an increase in total biomass for B. rapa. The increase in biomass was correlated with two- to five-fold enhancement of microbial extracellular enzyme activities associated with nitrogen mineralization in soils. The reproducibility of the flowering phenotype across plant hosts suggests that microbiomes can be selected to modify plant traits and coordinate changes in soil resource pools.

  15. Selection on soil microbiomes reveals reproducible impacts on plant function.

    PubMed

    Panke-Buisse, Kevin; Poole, Angela C; Goodrich, Julia K; Ley, Ruth E; Kao-Kniffin, Jenny

    2015-04-01

    Soil microorganisms found in the root zone impact plant growth and development, but the potential to harness these benefits is hampered by the sheer abundance and diversity of the players influencing desirable plant traits. Here, we report a high level of reproducibility of soil microbiomes in altering plant flowering time and soil functions when partnered within and between plant hosts. We used a multi-generation experimental system using Arabidopsis thaliana Col to select for soil microbiomes inducing earlier or later flowering times of their hosts. We then inoculated the selected microbiomes from the tenth generation of plantings into the soils of three additional A. thaliana genotypes (Ler, Be, RLD) and a related crucifer (Brassica rapa). With the exception of Ler, all other plant hosts showed a shift in flowering time corresponding with the inoculation of early- or late-flowering microbiomes. Analysis of the soil microbial community using 16S rRNA gene sequencing showed distinct microbiota profiles assembling by flowering time treatment. Plant hosts grown with the late-flowering-associated microbiomes showed consequent increases in inflorescence biomass for three A. thaliana genotypes and an increase in total biomass for B. rapa. The increase in biomass was correlated with two- to five-fold enhancement of microbial extracellular enzyme activities associated with nitrogen mineralization in soils. The reproducibility of the flowering phenotype across plant hosts suggests that microbiomes can be selected to modify plant traits and coordinate changes in soil resource pools. PMID:25350154

  16. A Bayesian Perspective on the Reproducibility Project: Psychology.

    PubMed

    Etz, Alexander; Vandekerckhove, Joachim

    2016-01-01

    We revisit the results of the recent Reproducibility Project: Psychology by the Open Science Collaboration. We compute Bayes factors, a quantity that can be used to express comparative evidence for a hypothesis but also for the null hypothesis, for a large subset (N = 72) of the original papers and their corresponding replication attempts. In our computation, we take into account the likely scenario that publication bias had distorted the originally published results. Overall, 75% of studies gave qualitatively similar results in terms of the amount of evidence provided. However, the evidence was often weak (i.e., Bayes factor < 10). The majority of the studies (64%) did not provide strong evidence for either the null or the alternative hypothesis in either the original or the replication, and no replication attempts provided strong evidence in favor of the null. In all cases where the original paper provided strong evidence but the replication did not (15%), the sample size in the replication was smaller than the original. Where the replication provided strong evidence but the original did not (10%), the replication sample size was larger. We conclude that the apparent failure of the Reproducibility Project to replicate many target effects can be adequately explained by overestimation of effect sizes (or overestimation of evidence against the null hypothesis) due to small sample sizes and publication bias in the psychological literature. We further conclude that traditional sample sizes are insufficient and that a more widespread adoption of Bayesian methods is desirable.

  17. [Study of the validity and reproducibility of passive ozone monitors].

    PubMed

    Cortez-Lugo, M; Romieu, I; Palazuelos-Rendón, E; Hernández-Avila, M

    1995-01-01

    The aim of this study was to evaluate the validity and reproducibility of ozone measurements obtained with passive ozone monitors against those registered with a continuous ozone monitor, to determine the applicability of passive monitors in epidemiological research. The study was carried out during November and December 1992. Indoor and outdoor classroom air ozone concentrations were analyzed using 28 passive monitors and a continuous monitor. The correlation between the two types of measurement was highly significant (r = 0.089, p < 0.001), indicating very good validity. Also, the correlation between the measurements obtained with two different passive monitors exposed concurrently was very high (r = 0.97, p < 0.001), indicating good reproducibility of the passive monitors' measurements. The relative error between the concentrations measured by the passive monitors and those from the continuous monitor tended to decrease with increasing ozone concentrations. The results suggest that passive monitors should be used to determine cumulative ozone exposures exceeding 100 ppb, corresponding to exposure periods greater than five days, when used to analyze indoor air.

  18. Reproducible quantitative proteotype data matrices for systems biology

    PubMed Central

    Röst, Hannes L.; Malmström, Lars; Aebersold, Ruedi

    2015-01-01

    Historically, many mass spectrometry–based proteomic studies have aimed at compiling an inventory of protein compounds present in a biological sample, with the long-term objective of creating a proteome map of a species. However, to answer fundamental questions about the behavior of biological systems at the protein level, accurate and unbiased quantitative data are required in addition to a list of all protein components. Fueled by advances in mass spectrometry, the proteomics field has thus recently shifted focus toward the reproducible quantification of proteins across a large number of biological samples. This provides the foundation to move away from pure enumeration of identified proteins toward quantitative matrices of many proteins measured across multiple samples. It is argued here that data matrices consisting of highly reproducible, quantitative, and unbiased proteomic measurements across a high number of conditions, referred to here as quantitative proteotype maps, will become the fundamental currency in the field and provide the starting point for downstream biological analysis. Such proteotype data matrices, for example, are generated by the measurement of large patient cohorts, time series, or multiple experimental perturbations. They are expected to have a large effect on systems biology and personalized medicine approaches that investigate the dynamic behavior of biological systems across multiple perturbations, time points, and individuals. PMID:26543201

  19. Data reproducibility of pace strategy in a laboratory test run.

    PubMed

    de França, Elias; Xavier, Ana Paula; Hirota, Vinicius Barroso; Côrrea, Sônia Cavalcanti; Caperuto, Érico Chagas

    2016-06-01

    This data paper contains data related to a reproducibility test of running pacing strategy in an intermittent running test to exhaustion. Ten participants underwent a crossover study (test and retest) with an intermittent running test composed of three-minute sets (at 1 km/h above the Onset of Blood Lactate Accumulation) until volitional exhaustion. To assess changes in pacing strategy, in the first test participants chose the rest time interval (RTI) between sets (ranging from 30 to 60 s), and in the second test the RTI could be at most the value chosen in the first test, or shorter if desired. To verify the reproducibility of the test, rating of perceived exertion (RPE), heart rate (HR) and blood plasma lactate concentration ([La]p) were collected at rest, immediately after each set and at the end of the tests. RTI, RPE, HR, [La]p and time to exhaustion were not statistically different (p>0.05) between test and retest and showed good intraclass correlation. PMID:27081672

  20. Reproducibility of UAV-based photogrammetric surface models

    NASA Astrophysics Data System (ADS)

    Anders, Niels; Smith, Mike; Cammeraat, Erik; Keesstra, Saskia

    2016-04-01

    Soil erosion, rapid geomorphological change and vegetation degradation are major threats to the human and natural environment in many regions. Unmanned Aerial Vehicles (UAVs) and Structure-from-Motion (SfM) photogrammetry are invaluable tools for the collection of highly detailed aerial imagery and subsequent low cost production of 3D landscapes for an assessment of landscape change. Despite the widespread use of UAVs for image acquisition in monitoring applications, the reproducibility of UAV data products has not been explored in detail. This paper investigates this reproducibility by comparing the surface models and orthophotos derived from different UAV flights that vary in flight direction and altitude. The study area is located near Lorca, Murcia, SE Spain, which is a semi-arid medium-relief locale. The area comprises terraced agricultural fields that have been abandoned for about 40 years and have suffered subsequent damage through piping and gully erosion. In this work we focused on variation in cell size, vertical and horizontal accuracy, and horizontal positioning of recognizable landscape features. The results suggest that flight altitude has a significant impact on reconstructed point density and related cell size, whilst flight direction affects the spatial distribution of vertical accuracy. The horizontal positioning of landscape features is relatively consistent between the different flights. We conclude that UAV data products are suitable for monitoring campaigns for land cover purposes or geomorphological mapping, but special care is required when used for monitoring changes in elevation.

  1. Reproducible quantitative proteotype data matrices for systems biology.

    PubMed

    Röst, Hannes L; Malmström, Lars; Aebersold, Ruedi

    2015-11-01

    Historically, many mass spectrometry-based proteomic studies have aimed at compiling an inventory of protein compounds present in a biological sample, with the long-term objective of creating a proteome map of a species. However, to answer fundamental questions about the behavior of biological systems at the protein level, accurate and unbiased quantitative data are required in addition to a list of all protein components. Fueled by advances in mass spectrometry, the proteomics field has thus recently shifted focus toward the reproducible quantification of proteins across a large number of biological samples. This provides the foundation to move away from pure enumeration of identified proteins toward quantitative matrices of many proteins measured across multiple samples. It is argued here that data matrices consisting of highly reproducible, quantitative, and unbiased proteomic measurements across a high number of conditions, referred to here as quantitative proteotype maps, will become the fundamental currency in the field and provide the starting point for downstream biological analysis. Such proteotype data matrices, for example, are generated by the measurement of large patient cohorts, time series, or multiple experimental perturbations. They are expected to have a large effect on systems biology and personalized medicine approaches that investigate the dynamic behavior of biological systems across multiple perturbations, time points, and individuals.

  2. Reproducible Research Practices and Transparency across the Biomedical Literature

    PubMed Central

    Khoury, Muin J.; Schully, Sheri D.; Ioannidis, John P. A.

    2016-01-01

    There is a growing movement to encourage reproducibility and transparency practices in the scientific community, including public access to raw data and protocols, the conduct of replication studies, systematic integration of evidence in systematic reviews, and the documentation of funding and potential conflicts of interest. In this survey, we assessed the current status of reproducibility and transparency addressing these indicators in a random sample of 441 biomedical journal articles published in 2000–2014. Only one study provided a full protocol and none made all raw data directly available. Replication studies were rare (n = 4), and only 16 studies had their data included in a subsequent systematic review or meta-analysis. The majority of studies did not mention anything about funding or conflicts of interest. The percentage of articles with no statement of conflict decreased substantially between 2000 and 2014 (94.4% in 2000 to 34.6% in 2014); the percentage of articles reporting statements of conflicts (0% in 2000, 15.4% in 2014) or no conflicts (5.6% in 2000, 50.0% in 2014) increased. Articles published in journals in the clinical medicine category versus other fields were almost twice as likely to not include any information on funding and to have private funding. This study provides baseline data to compare future progress in improving these indicators in the scientific literature. PMID:26726926

  3. Selection on soil microbiomes reveals reproducible impacts on plant function

    PubMed Central

    Panke-Buisse, Kevin; Poole, Angela C; Goodrich, Julia K; Ley, Ruth E; Kao-Kniffin, Jenny

    2015-01-01

    Soil microorganisms found in the root zone impact plant growth and development, but the potential to harness these benefits is hampered by the sheer abundance and diversity of the players influencing desirable plant traits. Here, we report a high level of reproducibility of soil microbiomes in altering plant flowering time and soil functions when partnered within and between plant hosts. We used a multi-generation experimental system using Arabidopsis thaliana Col to select for soil microbiomes inducing earlier or later flowering times of their hosts. We then inoculated the selected microbiomes from the tenth generation of plantings into the soils of three additional A. thaliana genotypes (Ler, Be, RLD) and a related crucifer (Brassica rapa). With the exception of Ler, all other plant hosts showed a shift in flowering time corresponding with the inoculation of early- or late-flowering microbiomes. Analysis of the soil microbial community using 16 S rRNA gene sequencing showed distinct microbiota profiles assembling by flowering time treatment. Plant hosts grown with the late-flowering-associated microbiomes showed consequent increases in inflorescence biomass for three A. thaliana genotypes and an increase in total biomass for B. rapa. The increase in biomass was correlated with two- to five-fold enhancement of microbial extracellular enzyme activities associated with nitrogen mineralization in soils. The reproducibility of the flowering phenotype across plant hosts suggests that microbiomes can be selected to modify plant traits and coordinate changes in soil resource pools. PMID:25350154

  4. Fluctuation-Driven Neural Dynamics Reproduce Drosophila Locomotor Patterns

    PubMed Central

    Cruchet, Steeve; Gustafson, Kyle; Benton, Richard; Floreano, Dario

    2015-01-01

    The neural mechanisms determining the timing of even simple actions, such as when to walk or rest, are largely mysterious. One intriguing, but untested, hypothesis posits a role for ongoing activity fluctuations in neurons of central action selection circuits that drive animal behavior from moment to moment. To examine how fluctuating activity can contribute to action timing, we paired high-resolution measurements of freely walking Drosophila melanogaster with data-driven neural network modeling and dynamical systems analysis. We generated fluctuation-driven network models whose outputs—locomotor bouts—matched those measured from sensory-deprived Drosophila. From these models, we identified those that could also reproduce a second, unrelated dataset: the complex time-course of odor-evoked walking for genetically diverse Drosophila strains. Dynamical models that best reproduced both Drosophila basal and odor-evoked locomotor patterns exhibited specific characteristics. First, ongoing fluctuations were required. In a stochastic resonance-like manner, these fluctuations allowed neural activity to escape stable equilibria and to exceed a threshold for locomotion. Second, odor-induced shifts of equilibria in these models caused a depression in locomotor frequency following olfactory stimulation. Our models predict that activity fluctuations in action selection circuits cause behavioral output to more closely match sensory drive and may therefore enhance navigation in complex sensory environments. Together these data reveal how simple neural dynamics, when coupled with activity fluctuations, can give rise to complex patterns of animal behavior. PMID:26600381
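
    To illustrate the fluctuation-driven threshold-crossing idea summarized in this record, here is a minimal, hypothetical sketch: an Ornstein-Uhlenbeck process relaxing toward a stable equilibrium, whose ongoing noise occasionally pushes activity above a "locomotion" threshold. It is not the authors' network model, and all parameters are illustrative.

```python
# Minimal sketch (not the authors' model): an Ornstein-Uhlenbeck process whose
# noise-driven excursions above a threshold stand in for locomotor bouts.
import numpy as np

def simulate_bouts(tau=1.0, sigma=0.4, threshold=1.0, dt=0.01, steps=200_000, seed=1):
    rng = np.random.default_rng(seed)
    x = 0.0                                  # activity of the action-selection unit
    walking = np.zeros(steps, dtype=bool)
    for i in range(steps):
        # relaxation toward the stable equilibrium at 0, plus ongoing fluctuations
        x += (-x / tau) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        walking[i] = x > threshold
    return walking

if __name__ == "__main__":
    walking = simulate_bouts()
    print(f"fraction of time 'walking': {walking.mean():.3f}")
    # number of discrete bouts = rising edges of the walking indicator
    bouts = np.sum(np.diff(walking.astype(int)) == 1)
    print(f"number of bouts: {bouts}")
```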

  5. Data reproducibility of pace strategy in a laboratory test run

    PubMed Central

    de França, Elias; Xavier, Ana Paula; Hirota, Vinicius Barroso; Côrrea, Sônia Cavalcanti; Caperuto, Érico Chagas

    2016-01-01

    This data paper contains data related to a reproducibility test for running pacing strategy in an intermittent running test until exhaustion. Ten participants underwent a crossover study (test and retest) with an intermittent running test. The test was composed of three-minute sets (at 1 km/h above Onset Blood Lactate Accumulation) until volitional exhaustion. To assess pace strategy change, in the first test participants chose the rest time interval (RTI) between sets (ranging from 30 to 60 s) and in the second test the maximum RTI values were either the RTI chosen in the first test (maximum RTI value), or less if desired. To verify the reproducibility of the test, rating of perceived exertion (RPE), heart rate (HR) and blood plasma lactate concentration ([La]p) were collected at rest, immediately after each set and at the end of the tests. RTI, RPE, HR, [La]p, and time to exhaustion were not statistically different (p>0.05) between test and retest and showed good intraclass correlation. PMID:27081672

  6. Effect of Soil Moisture Content on the Splash Phenomenon Reproducibility

    PubMed Central

    Ryżak, Magdalena; Bieganowski, Andrzej; Polakowski, Cezary

    2015-01-01

    One of the methods for testing splash (the first phase of water erosion) may be an analysis of photos taken using so-called high-speed cameras. The aim of this study was to determine the reproducibility of measurements using a single drop splash of simulated precipitation. The drops producing the splash fell from a height of 1.5 m. Tests were carried out using two types of soil: Eutric Cambisol (loamy silt) and Orthic Luvisol (sandy loam); three initial pressure heads were applied: 16 kPa, 3.1 kPa, and 0.1 kPa. Images for one, five, and 10 drops were recorded at a rate of 2000 frames per second. It was found that (i) the dispersion of soil caused by the striking of the 1st drop was significantly different from the splash impact caused by subsequent drops; (ii) with every drop, the splash phenomenon proceeded more reproducibly, that is, the number of particles of soil and/or water that splashed was increasingly close to each other; (iii) the number of particles that were detached during the splash was strongly correlated with its surface area; and (iv) the higher the water film on the surface, the smaller the width of the crown. PMID:25785859

  7. Reproducibility of the 6-minute walk test in obese adults.

    PubMed

    Beriault, K; Carpentier, A C; Gagnon, C; Ménard, J; Baillargeon, J-P; Ardilouze, J-L; Langlois, M-F

    2009-10-01

    The six-minute walk test (6MWT) is an inexpensive, quick and safe tool to evaluate the functional capacity of patients with heart failure and chronic obstructive pulmonary disease. The aim of this study was to determine the reproducibility of the 6MWT in overweight and obese individuals. We thus undertook a prospective repeated-measure validity study taking place in our academic weight management outpatient clinic. The 6MWT was conducted twice on the same day in 21 overweight or obese adult subjects (15 females and 6 males). Repeatability of walking distance was the primary outcome. Anthropometric measures, blood pressure and heart rate were also recorded. Participants' mean BMI was 37.2 ± 9.8 kg/m² (range: 27.0-62.3 kg/m²). Walking distances in the morning (mean = 452 ± 90 m) and in the afternoon (mean = 458 ± 97 m) were highly correlated (r = 0.948; 95% Confidence Interval 0.877-0.978; p < 0.001). Walking distance was negatively correlated with BMI (r = -0.47, p = 0.03), waist circumference (r = -0.43, p = 0.05) and pre-test heart rate (r = -0.54, p = 0.01). Our findings indicate that the 6MWT is highly reproducible in obese subjects and could thus be used as a fitness indicator in clinical studies and clinical care in this population.

  8. Computer acceptance of older adults.

    PubMed

    Nägle, Sibylle; Schmidt, Ludger

    2012-01-01

    Even though computers play a massive role in the everyday life of modern societies, older adults, and especially older women, are less likely to use a computer, and they perform fewer activities on it than younger adults. To get a better understanding of the factors affecting older adults' intention towards and usage of computers, the Unified Theory of Acceptance and Usage of Technology (UTAUT) was applied as part of a more extensive study with 52 users and non-users of computers, ranging in age from 50 to 90 years. The model covers various aspects of computer usage in old age via four key constructs, namely performance expectancy, effort expectancy, social influences, and facilitating conditions, as well as the variables gender, age, experience, and voluntariness of use. Interestingly, next to performance expectancy, facilitating conditions showed the strongest correlation with use as well as with intention. Effort expectancy showed no significant correlation with the intention of older adults to use a computer.

  9. Automated analytical microarrays: a critical review.

    PubMed

    Seidel, Michael; Niessner, Reinhard

    2008-07-01

    Microarrays provide a powerful analytical tool for the simultaneous detection of multiple analytes in a single experiment. The specific affinity reaction of nucleic acids (hybridization) and antibodies towards antigens is the most common bioanalytical method for generating multiplexed quantitative results. Nucleic acid-based analysis is restricted to the detection of cells and viruses. Antibodies are more universal biomolecular receptors that selectively bind small molecules such as pesticides, small toxins, and pharmaceuticals and to biopolymers (e.g. toxins, allergens) and complex biological structures like bacterial cells and viruses. By producing an appropriate antibody, the corresponding antigenic analyte can be detected on a multiplexed immunoanalytical microarray. Food and water analysis along with clinical diagnostics constitute potential application fields for multiplexed analysis. Diverse fluorescence, chemiluminescence, electrochemical, and label-free microarray readout systems have been developed in the last decade. Some of them are constructed as flow-through microarrays by combination with a fluidic system. Microarrays have the potential to become widely accepted as a system for analytical applications, provided that robust and validated results on fully automated platforms are successfully generated. This review gives an overview of the current research on microarrays with the focus on automated systems and quantitative multiplexed applications.

  10. Analytical advances in pharmaceutical impurity profiling.

    PubMed

    Holm, René; Elder, David P

    2016-05-25

    Impurities will be present in all drug substances and drug products, i.e. nothing is 100% pure if one looks in enough depth. The current regulatory guidance on impurities accepts this, and for drug products with a dose of less than 2 g/day, identification of impurities is set at the 0.1% level and above (ICH Q3B(R2), 2006). For some impurities, this is a simple undertaking, as generally available analytical techniques can address the prevailing analytical challenges; for others it may be much more challenging, requiring more sophisticated analytical approaches. The present review provides an insight into the current development of analytical techniques to investigate and quantify impurities in drug substances and drug products, with a discussion of progress, particularly within the field of chromatography, to ensure separation and quantification of related impurities. Further, a section is devoted to the identification of classical impurities, while inorganic (metal residue) and solid-state impurities are also discussed. Risk control strategies for pharmaceutical impurities, aligned with several of the ICH guidelines, are also discussed.

  11. 39 CFR 3050.10 - Analytical principles to be applied in the Postal Service's annual periodic reports to the...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 39 Postal Service 1 2010-07-01 2010-07-01 false Analytical principles to be applied in the Postal... REGULATORY COMMISSION PERSONNEL PERIODIC REPORTING § 3050.10 Analytical principles to be applied in the... Commission, the Postal Service shall use only accepted analytical principles. With respect to its...

  12. Reproducing Phenomenology of Peroxidation Kinetics via Model Optimization

    NASA Astrophysics Data System (ADS)

    Ruslanov, Anatole D.; Bashylau, Anton V.

    2010-06-01

    We studied mathematical modeling of lipid peroxidation using a biochemical model system of iron (II)-ascorbate-dependent lipid peroxidation of rat hepatocyte mitochondrial fractions. We found that antioxidants extracted from plants demonstrate a high intensity of peroxidation inhibition. We simplified the system of differential equations that describes the kinetics of the mathematical model to a first order equation, which can be solved analytically. Moreover, we endeavor to algorithmically and heuristically recreate the processes and construct an environment that closely resembles the corresponding natural system. Our results demonstrate that it is possible to theoretically predict both the kinetics of oxidation and the intensity of inhibition without resorting to analytical and biochemical research, which is important for cost-effective discovery and development of medical agents with antioxidant action from the medicinal plants.
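
    As a worked illustration of the reduction to a first-order equation mentioned in this record, the sketch below compares the analytical solution of a generic first-order decay, C(t) = C0·e^(-kt), with a numerical integration; the rate constant is hypothetical and unrelated to the actual peroxidation model.

```python
# Minimal sketch: a generic first-order kinetics equation dC/dt = -k*C, with its
# analytical solution C(t) = C0*exp(-k*t) checked against numerical integration.
# The rate constants of the actual peroxidation model are not given here.
import numpy as np
from scipy.integrate import solve_ivp

k, c0 = 0.15, 1.0                        # hypothetical rate constant and initial value
t = np.linspace(0.0, 30.0, 301)

analytical = c0 * np.exp(-k * t)
numerical = solve_ivp(lambda _t, c: -k * c, (t[0], t[-1]), [c0],
                      t_eval=t, rtol=1e-8).y[0]

print(f"max |analytical - numerical| = {np.max(np.abs(analytical - numerical)):.2e}")
```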

  13. Analytical Services Management System

    SciTech Connect

    Church, Shane; Nigbor, Mike; Hillman, Daniel

    2005-03-30

    Analytical Services Management System (ASMS) provides sample management services. Sample management includes sample planning for analytical requests, sample tracking for shipping and receiving by the laboratory, receipt of the analytical data deliverable, processing the deliverable and payment of the laboratory conducting the analyses. ASMS is a web based application that provides the ability to manage these activities at multiple locations for different customers. ASMS provides for the assignment of single to multiple samples for standard chemical and radiochemical analyses. ASMS is a flexible system which allows the users to request analyses by line item code. Line item codes are selected based on the Basic Ordering Agreement (BOA) format for contracting with participating laboratories. ASMS also allows contracting with non-BOA laboratories using a similar line item code contracting format for their services. ASMS allows sample and analysis tracking from sample planning and collection in the field through sample shipment, laboratory sample receipt, laboratory analysis and submittal of the requested analyses, electronic data transfer, and payment of the laboratories for the completed analyses. The software when in operation contains business sensitive material that is used as a principal portion of the Kaiser Analytical Management Services business model. The software version provided is the most recent version, however the copy of the application does not contain business sensitive data from the associated Oracle tables such as contract information or price per line item code.

  14. Analytical Services Management System

    2005-03-30

    Analytical Services Management System (ASMS) provides sample management services. Sample management includes sample planning for analytical requests, sample tracking for shipping and receiving by the laboratory, receipt of the analytical data deliverable, processing the deliverable and payment of the laboratory conducting the analyses. ASMS is a web based application that provides the ability to manage these activities at multiple locations for different customers. ASMS provides for the assignment of single to multiple samples for standard chemical and radiochemical analyses. ASMS is a flexible system which allows the users to request analyses by line item code. Line item codes are selected based on the Basic Ordering Agreement (BOA) format for contracting with participating laboratories. ASMS also allows contracting with non-BOA laboratories using a similar line item code contracting format for their services. ASMS allows sample and analysis tracking from sample planning and collection in the field through sample shipment, laboratory sample receipt, laboratory analysis and submittal of the requested analyses, electronic data transfer, and payment of the laboratories for the completed analyses. The software when in operation contains business sensitive material that is used as a principal portion of the Kaiser Analytical Management Services business model. The software version provided is the most recent version, however the copy of the application does not contain business sensitive data from the associated Oracle tables such as contract information or price per line item code.

  15. Analytical Chemistry Laboratory

    NASA Technical Reports Server (NTRS)

    Anderson, Mark

    2013-01-01

    The Analytical Chemistry and Material Development Group maintains a capability in chemical analysis, materials R&D failure analysis and contamination control. The uniquely qualified staff and facility support the needs of flight projects, science instrument development and various technical tasks, as well as Cal Tech.

  16. Analytics: Changing the Conversation

    ERIC Educational Resources Information Center

    Oblinger, Diana G.

    2013-01-01

    In this third and concluding discussion on analytics, the author notes that we live in an information culture. We are accustomed to having information instantly available and accessible, along with feedback and recommendations. We want to know what people think and like (or dislike). We want to know how we compare with "others like me."…

  17. Extreme Rainfall Events Over Southern Africa: Assessment of a Climate Model to Reproduce Daily Extremes

    NASA Astrophysics Data System (ADS)

    Williams, C.; Kniveton, D.; Layberry, R.

    2007-12-01

    It is increasingly accepted that any possible climate change will not only have an influence on mean climate but may also significantly alter climatic variability. This issue is of particular importance for environmentally vulnerable regions such as southern Africa. The subcontinent is considered especially vulnerable to extreme events, due to a number of factors including extensive poverty, disease and political instability. Rainfall variability and the identification of rainfall extremes are functions of scale, so high spatial and temporal resolution data are preferred to identify extreme events and accurately predict future variability. The majority of previous climate model verification studies have compared model output with observational data at monthly timescales. In this research, the assessment of a state-of-the-art climate model to simulate climate at daily timescales is carried out using satellite-derived rainfall data from the Microwave Infra-Red Algorithm (MIRA). This dataset covers the period from 1993-2002 and the whole of southern Africa at a spatial resolution of 0.1 degree longitude/latitude. Once the model's ability to reproduce extremes has been assessed, idealised regions of SST anomalies are used to force the model, with the overall aim of investigating the ways in which SST anomalies influence rainfall extremes over southern Africa. In this paper, results from sensitivity testing of the UK Meteorological Office Hadley Centre's climate model's domain size are firstly presented. Then simulations of current climate from the model, operating in both regional and global mode, are compared to the MIRA dataset at daily timescales. Thirdly, the ability of the model to reproduce daily rainfall extremes will be assessed, again by a comparison with extremes from the MIRA dataset. Finally, the results from the idealised SST experiments are briefly presented, suggesting associations between rainfall extremes and both local and remote SST anomalies.

  18. A novel highly reproducible and lethal nonhuman primate model for orthopox virus infection.

    PubMed

    Kramski, Marit; Mätz-Rensing, Kerstin; Stahl-Hennig, Christiane; Kaup, Franz-Josef; Nitsche, Andreas; Pauli, Georg; Ellerbrok, Heinz

    2010-01-01

    The intentional re-introduction of Variola virus (VARV), the agent of smallpox, into the human population is of great concern due to its bio-terroristic potential. Moreover, zoonotic infections with Cowpox (CPXV) and Monkeypox virus (MPXV) cause severe diseases in humans. Smallpox vaccines presently available can have severe adverse effects that are no longer acceptable. The efficacy and safety of new vaccines and antiviral drugs for use in humans can only be demonstrated in animal models. The existing nonhuman primate models, using VARV and MPXV, need very high viral doses that have to be applied intravenously or intratracheally to induce a lethal infection in macaques. To overcome these drawbacks, the infectivity and pathogenicity of a particular CPXV was evaluated in the common marmoset (Callithrix jacchus). A CPXV named calpox virus was isolated from a lethal orthopox virus (OPV) outbreak in New World monkeys. We demonstrated that marmosets infected with calpox virus, not only via the intravenous but also the intranasal route, reproducibly develop symptoms resembling smallpox in humans. Infected animals died within 1-3 days after onset of symptoms, even when very low infectious viral doses of 5×10² pfu were applied intranasally. Infectious virus was demonstrated in blood, saliva and all organs analyzed. We present the first characterization of a new OPV infection model inducing a disease in common marmosets comparable to smallpox in humans. Intranasal virus inoculation mimicking the natural route of smallpox infection led to reproducible infection. In vivo titration resulted in an MID50 (minimal monkey infectious dose 50%) of 8.3×10² pfu of calpox virus, which is approximately 10,000-fold lower than the MPXV and VARV doses applied in the macaque models. Therefore, the calpox virus/marmoset model is a suitable nonhuman primate model for the validation of vaccines and antiviral drugs. Furthermore, this model can help study mechanisms of OPV pathogenesis.

  19. A Novel Semiautomated Fractional Limb Volume Tool for Rapid and Reproducible Fetal Soft Tissue Assessment.

    PubMed

    Mack, Lauren M; Kim, Sung Yoon; Lee, Sungmin; Sangi-Haghpeykar, Haleh; Lee, Wesley

    2016-07-01

    The purpose of this study was to document the reproducibility and efficiency of a semiautomated image analysis tool that rapidly provides fetal fractional limb volume measurements. Fifty pregnant women underwent 3-dimensional sonographic examinations for fractional arm and thigh volumes at a mean menstrual age of 31.3 weeks. Manual and semiautomated fractional limb volume measurements were calculated, with the semiautomated measurements calculated by novel software (5D Limb Vol; Samsung Medison, Seoul, Korea). The software applies an image transformation method based on the major axis length, minor axis length, and limb center coordinates. A transformed image is used to perform a global optimization technique for determination of an optimal limb soft tissue boundary. Bland-Altman analysis defined bias with 95% limits of agreement (LOA) between methods, and timing differences between manual versus automated methods were compared by a paired t test. Bland-Altman analysis indicated an acceptable bias with 95% LOA between the manual and semiautomated methods: mean arm volume ± SD, 1.7% ± 4.6% (95% LOA, -7.3% to 10.7%); and mean thigh volume, 0.0% ± 3.8% (95% LOA, -7.5% to 7.5%). The computer-assisted software completed measurements about 5 times faster compared to manual tracings. In conclusion, semiautomated fractional limb volume measurements are significantly faster to calculate when compared to a manual procedure. These results are reproducible and are likely to reduce operator dependency. The addition of computer-assisted fractional limb volume to standard biometry may improve the precision of estimated fetal weight by adding a soft tissue component to the weight estimation process. PMID:27269002
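
    For readers who want the arithmetic behind the reported bias and 95% limits of agreement, a minimal sketch follows, assuming the usual bias ± 1.96·SD convention on percent differences and entirely synthetic paired measurements.

```python
# Minimal sketch: Bland-Altman bias and 95% limits of agreement for paired
# manual vs. semiautomated volume measurements, expressed as percent differences
# (hypothetical data; the bias +/- 1.96*SD convention for the LOA is assumed).
import numpy as np

def bland_altman_percent(manual, semi):
    manual, semi = np.asarray(manual, float), np.asarray(semi, float)
    pct_diff = 100.0 * (semi - manual) / ((semi + manual) / 2.0)
    bias = pct_diff.mean()
    sd = pct_diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    manual = rng.uniform(20, 60, size=50)              # e.g. thigh volumes (mL)
    semi = manual * rng.normal(1.01, 0.04, size=50)    # semiautomated re-measurement
    bias, (lo, hi) = bland_altman_percent(manual, semi)
    print(f"bias = {bias:.1f}%, 95% LOA = ({lo:.1f}%, {hi:.1f}%)")
```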

  20. A Novel Semiautomated Fractional Limb Volume Tool for Rapid and Reproducible Fetal Soft Tissue Assessment.

    PubMed

    Mack, Lauren M; Kim, Sung Yoon; Lee, Sungmin; Sangi-Haghpeykar, Haleh; Lee, Wesley

    2016-07-01

    The purpose of this study was to document the reproducibility and efficiency of a semiautomated image analysis tool that rapidly provides fetal fractional limb volume measurements. Fifty pregnant women underwent 3-dimensional sonographic examinations for fractional arm and thigh volumes at a mean menstrual age of 31.3 weeks. Manual and semiautomated fractional limb volume measurements were calculated, with the semiautomated measurements calculated by novel software (5D Limb Vol; Samsung Medison, Seoul, Korea). The software applies an image transformation method based on the major axis length, minor axis length, and limb center coordinates. A transformed image is used to perform a global optimization technique for determination of an optimal limb soft tissue boundary. Bland-Altman analysis defined bias with 95% limits of agreement (LOA) between methods, and timing differences between manual versus automated methods were compared by a paired t test. Bland-Altman analysis indicated an acceptable bias with 95% LOA between the manual and semiautomated methods: mean arm volume ± SD, 1.7% ± 4.6% (95% LOA, -7.3% to 10.7%); and mean thigh volume, 0.0% ± 3.8% (95% LOA, -7.5% to 7.5%). The computer-assisted software completed measurements about 5 times faster compared to manual tracings. In conclusion, semiautomated fractional limb volume measurements are significantly faster to calculate when compared to a manual procedure. These results are reproducible and are likely to reduce operator dependency. The addition of computer-assisted fractional limb volume to standard biometry may improve the precision of estimated fetal weight by adding a soft tissue component to the weight estimation process.

  1. 48 CFR 2911.103 - Market acceptance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... offered have either achieved commercial market acceptance or been satisfactorily supplied to an agency... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Market acceptance. 2911... DESCRIBING AGENCY NEEDS Selecting And Developing Requirements Documents 2911.103 Market acceptance....

  2. 48 CFR 11.103 - Market acceptance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Market acceptance. 11.103... DESCRIBING AGENCY NEEDS Selecting and Developing Requirements Documents 11.103 Market acceptance. (a) Section... either— (i) Achieved commercial market acceptance; or (ii) Been satisfactorily supplied to an...

  3. 48 CFR 2911.103 - Market acceptance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 7 2011-10-01 2011-10-01 false Market acceptance. 2911... DESCRIBING AGENCY NEEDS Selecting And Developing Requirements Documents 2911.103 Market acceptance. The... offered have either achieved commercial market acceptance or been satisfactorily supplied to an...

  4. Older Adults' Acceptance of Information Technology

    ERIC Educational Resources Information Center

    Wang, Lin; Rau, Pei-Luen Patrick; Salvendy, Gavriel

    2011-01-01

    This study investigated variables contributing to older adults' information technology acceptance through a survey, which was used to find factors explaining and predicting older adults' information technology acceptance behaviors. Four factors, including needs satisfaction, perceived usability, support availability, and public acceptance, were…

  5. Apollo experience report environmental acceptance testing

    NASA Technical Reports Server (NTRS)

    Laubach, C. H. M.

    1976-01-01

    Environmental acceptance testing was used extensively to screen selected spacecraft hardware for workmanship defects and manufacturing flaws. The minimum acceptance levels and durations and methods for their establishment are described. Component selection and test monitoring, as well as test implementation requirements, are included. Apollo spacecraft environmental acceptance test results are summarized, and recommendations for future programs are presented.

  6. 48 CFR 245.606-3 - Acceptance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Acceptance. 245.606-3..., DEPARTMENT OF DEFENSE CONTRACT MANAGEMENT GOVERNMENT PROPERTY Reporting, Redistribution, and Disposal of Contractor Inventory 245.606-3 Acceptance. (a) If the schedules are acceptable, the plant clearance...

  7. 46 CFR 28.73 - Accepted organizations.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 1 2012-10-01 2012-10-01 false Accepted organizations. 28.73 Section 28.73 Shipping... INDUSTRY VESSELS General Provisions § 28.73 Accepted organizations. An organization desiring to be designated by the Commandant as an accepted organization must request such designation in writing. As...

  8. 46 CFR 28.73 - Accepted organizations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 1 2011-10-01 2011-10-01 false Accepted organizations. 28.73 Section 28.73 Shipping... INDUSTRY VESSELS General Provisions § 28.73 Accepted organizations. An organization desiring to be designated by the Commandant as an accepted organization must request such designation in writing. As...

  9. 46 CFR 28.73 - Accepted organizations.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 1 2010-10-01 2010-10-01 false Accepted organizations. 28.73 Section 28.73 Shipping... INDUSTRY VESSELS General Provisions § 28.73 Accepted organizations. An organization desiring to be designated by the Commandant as an accepted organization must request such designation in writing. As...

  10. 46 CFR 28.73 - Accepted organizations.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 1 2013-10-01 2013-10-01 false Accepted organizations. 28.73 Section 28.73 Shipping... INDUSTRY VESSELS General Provisions § 28.73 Accepted organizations. An organization desiring to be designated by the Commandant as an accepted organization must request such designation in writing. As...

  11. 46 CFR 28.73 - Accepted organizations.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 1 2014-10-01 2014-10-01 false Accepted organizations. 28.73 Section 28.73 Shipping... INDUSTRY VESSELS General Provisions § 28.73 Accepted organizations. An organization desiring to be designated by the Commandant as an accepted organization must request such designation in writing. As...

  12. Formulation of a candidate glass for use as an acceptance test standard material

    SciTech Connect

    Ebert, W.L.; Strachan, D.M.; Wolf, S.F.

    1998-04-01

    In this report, the authors discuss the formulation of a glass that will be used in a laboratory testing program designed to measure the precision of test methods identified in the privatization contracts for the immobilization of Hanford low-activity wastes. Tests will be conducted with that glass to measure the reproducibility of tests and analyses that must be performed by glass producers as a part of the product acceptance procedure. Test results will be used to determine if the contractually required tests and analyses are adequate for evaluating the acceptability of likely immobilized low-activity waste (ILAW) products. They will also be used to evaluate if the glass designed for use in these tests can be used as an analytical standard test material for verifying results reported by vendors for tests with ILAW products. The results of those tests and analyses will be presented in a separate report. The purpose of this report is to document the strategy used to formulate the glass to be used in the testing program. The low-activity waste reference glass LRM that will be used in the testing program was formulated to be compositionally similar to ILAW products to be made with wastes from Hanford. Since the ILAW product compositions have not been disclosed by the vendors participating in the Hanford privatization project, the composition of LRM was formulated based on a simulated Hanford waste stream and amounts of added glass-forming chemicals typical for vitrified waste forms. The major components are 54 mass % SiO2, 20 mass % Na2O, 10 mass % Al2O3, 8 mass % B2O3, and 1.5 mass % K2O. Small amounts of other chemicals not present in Hanford wastes were also included in the glass, since they may be included as chemical additives in ILAW products. This was done so that the use of LRM as a composition standard could be evaluated. Radionuclides were not included in LRM because a nonradioactive material was desired.

  13. Reproducibility of an aerobic endurance test for nonexpert swimmers

    PubMed Central

    Veronese da Costa, Adalberto; Costa, Manoel da Cunha; Carlos, Daniel Medeiros; Guerra, Luis Marcos de Medeiros; Silva, Antônio José; Barbosa, Tiago Manoel Cabral dos Santos

    2012-01-01

    Background: This study aimed to verify the reproducibility of an aerobic test to determine nonexpert swimmers' endurance. Methods: The sample consisted of 24 male swimmers (age: 22.79 ± 3.90 years; weight: 74.72 ± 11.44 kg; height: 172.58 ± 4.99 cm; and fat percentage: 15.19% ± 3.21%), who swim for 1 hour three times a week. A new instrument was used in this study (a Progressive Swim Test): the swimmer wore an underwater MP3 player and increased their swimming speed on hearing a beep after every 25 meters. Each swimmer's heart rate was recorded before the test (BHR) and again after the test (AHR). The rate of perceived exertion (RPE) and the number of laps performed (NLP) were also recorded. The sample size was estimated using G*Power software (v 3.0.10; Franz Faul, Kiel University, Kiel, Germany). The descriptive values were expressed as mean and standard deviation. After confirming the normality of the data using both the Shapiro–Wilk and Levene tests, a paired t-test was performed to compare the data. The Pearson's linear correlation (r) and intraclass correlation coefficient (ICC) tests were used to determine relative reproducibility. The standard error of measurement (SEM) and the coefficient of variation (CV) were used to determine absolute reproducibility. The limits of agreement and the bias of the absolute and relative values between days were determined by Bland–Altman plots. All values had a significance level of P < 0.05. Results: There were significant differences in AHR (P = 0.03) and NLP (P = 0.01) between the 2 days of testing. The obtained values were r > 0.50 and ICC > 0.66. The SEM had a variation of ±2% and the CV was <10%. Most cases were within the upper and lower limits of Bland–Altman plots, suggesting correlation of the results. The applicability of NLP showed greater robustness (r and ICC > 0.90; SEM < 1%; CV < 3%), indicating that the other variables can be used to predict incremental changes in the physiological condition
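
    A minimal sketch of the absolute-reproducibility statistics named in this record (SEM and CV) is given below, assuming one common convention in which the typical error equals the standard deviation of the test-retest differences divided by √2; the data are synthetic.

```python
# Minimal sketch of absolute-reproducibility statistics for a test-retest design,
# using one common convention: typical error (SEM) = SD(differences)/sqrt(2)
# and CV = 100 * SEM / grand mean. Data are hypothetical.
import numpy as np

def absolute_reproducibility(test, retest):
    test, retest = np.asarray(test, float), np.asarray(retest, float)
    diff = retest - test
    sem = diff.std(ddof=1) / np.sqrt(2.0)          # standard error of measurement
    cv = 100.0 * sem / np.concatenate([test, retest]).mean()
    return sem, cv

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    laps_day1 = rng.integers(8, 16, size=24).astype(float)   # e.g. number of laps
    laps_day2 = laps_day1 + rng.normal(0.0, 0.5, size=24)
    sem, cv = absolute_reproducibility(laps_day1, laps_day2)
    print(f"SEM = {sem:.2f} laps, CV = {cv:.1f}%")
```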

  14. Paleomagnetic analysis of curved thrust belts reproduced by physical models

    NASA Astrophysics Data System (ADS)

    Costa, Elisabetta; Speranza, Fabio

    2003-12-01

    This paper presents a new methodology for studying the evolution of curved mountain belts by means of paleomagnetic analyses performed on analogue models. Eleven models were designed aimed at reproducing various tectonic settings in thin-skinned tectonics. Our models analyze in particular those features reported in the literature as possible causes for peculiar rotational patterns in the outermost as well as in the more internal fronts. In all the models the sedimentary cover was reproduced by frictional low-cohesion materials (sand and glass micro-beads), which detached either on frictional or on viscous layers. These latter were reproduced in the models by silicone. The sand forming the models has been previously mixed with magnetite-dominated powder. Before deformation, the models were magnetized by means of two permanent magnets generating within each model a quasi-linear magnetic field of intensity variable between 20 and 100 mT. After deformation, the models were cut into closely spaced vertical sections and sampled by means of 1×1-cm Plexiglas cylinders at several locations along curved fronts. Care was taken to collect paleomagnetic samples only within virtually undeformed thrust sheets, avoiding zones affected by pervasive shear. Afterwards, the natural remanent magnetization of these samples was measured, and alternating field demagnetization was used to isolate the principal components. The characteristic components of magnetization isolated were used to estimate the vertical-axis rotations occurring during model deformation. We find that indenters pushing into deforming belts from behind form non-rotational curved outer fronts. The more internal fronts show oroclinal-type rotations of a smaller magnitude than that expected for a perfect orocline. Lateral symmetrical obstacles in the foreland colliding with forward propagating belts produce non-rotational outer curved fronts as well, whereas in between and inside the obstacles a perfect orocline forms

  15. Building Consensus on Community Standards for Reproducible Science

    NASA Astrophysics Data System (ADS)

    Lehnert, K. A.; Nielsen, R. L.

    2015-12-01

    As geochemists, the traditional model by which standard methods for generating, presenting, and using data have been generated relied on input from the community, the results of seminal studies, a variety of authoritative bodies, and has required a great deal of time. The rate of technological and related policy change has accelerated to the point that this historical model does not satisfy the needs of the community, publishers, or funders. The development of a new mechanism for building consensus raises a number of questions: Which aspects of our data are the focus of reproducibility standards? Who sets the standards? How do we subdivide the development of the consensus? We propose an open, transparent, and inclusive approach to the development of data and reproducibility standards that is organized around specific sub-disciplines and driven by the community of practitioners in those sub-disciplines. It should involve editors, program managers, and representatives of domain data facilities as well as professional societies, but avoid any single group to be the final authority. A successful example of this model is the Editors Roundtable, a cross section of editors, funders, and data facility managers that discussed and agreed on leading practices for the reporting of geochemical data in publications, including accessibility and format of the data, data quality information, and metadata and identifiers for samples (Goldstein et al., 2014). We argue that development of data and reproducibility standards needs to heavily rely on representatives from the community of practitioners to set priorities and provide perspective. Groups of editors, practicing scientists, and other stakeholders would be assigned the task of reviewing existing practices and recommending changes as deemed necessary. They would weigh the costs and benefits of changing the standards for that community, propose appropriate tools to facilitate those changes, work through the professional societies

  16. In vivo reproducibility of robotic probe placement for an integrated US-CT image-guided radiation therapy system

    NASA Astrophysics Data System (ADS)

    Lediju Bell, Muyinatu A.; Sen, H. Tutkun; Iordachita, Iulian; Kazanzides, Peter; Wong, John

    2014-03-01

    Radiation therapy is used to treat cancer by delivering high-dose radiation to a pre-defined target volume. Ultrasound (US) has the potential to provide real-time, image-guidance of radiation therapy to identify when a target moves outside of the treatment volume (e.g. due to breathing), but the associated probe-induced tissue deformation causes local anatomical deviations from the treatment plan. If the US probe is placed to achieve similar tissue deformations in the CT images required for treatment planning, its presence causes streak artifacts that will interfere with treatment planning calculations. To overcome these challenges, we propose robot-assisted placement of a real ultrasound probe, followed by probe removal and replacement with a geometrically-identical, CT-compatible model probe. This work is the first to investigate in vivo deformation reproducibility with the proposed approach. A dog's prostate, liver, and pancreas were each implanted with three 2.38-mm spherical metallic markers, and the US probe was placed to visualize the implanted markers in each organ. The real and model probes were automatically removed and returned to the same position (i.e. position control), and CT images were acquired with each probe placement. The model probe was also removed and returned with the same normal force measured with the real US probe (i.e. force control). Marker positions in CT images were analyzed to determine reproducibility, and a corollary reproducibility study was performed on ex vivo tissue. In vivo results indicate that tissue deformations with the real probe were repeatable under position control for the prostate, liver, and pancreas, with median 3D reproducibility of 0.3 mm, 0.3 mm, and 1.6 mm, respectively, compared to 0.6 mm for the ex vivo tissue. For the prostate, the mean 3D tissue displacement errors between the real and model probes were 0.2 mm under position control and 0.6 mm under force control, which are both within acceptable

  17. Reproducible, Scalable Fusion Gene Detection from RNA-Seq.

    PubMed

    Arsenijevic, Vladan; Davis-Dusenbery, Brandi N

    2016-01-01

    Chromosomal rearrangements resulting in the creation of novel gene products, termed fusion genes, have been identified as driving events in the development of multiple types of cancer. As these gene products typically do not exist in normal cells, they represent valuable prognostic and therapeutic targets. Advances in next-generation sequencing and computational approaches have greatly improved our ability to detect and identify fusion genes. Nevertheless, these approaches require significant computational resources. Here we describe an approach which leverages cloud computing technologies to perform fusion gene detection from RNA sequencing data at any scale. We additionally highlight methods to enhance reproducibility of bioinformatics analyses which may be applied to any next-generation sequencing experiment. PMID:26667464

  18. [Expansion of undergraduate nursing and the labor market: reproducing inequalities?].

    PubMed

    Silva, Kênia Lara; de Sena, Roseni Rosângela; Tavares, Tatiana Silva; Wan der Maas, Lucas

    2012-01-01

    This study aimed to analyze the relationship between the increase in the number of degree courses in nursing and the nursing job market. It is a descriptive exploratory study with a quantitative approach, which used data on Undergraduate Nursing courses, supply of nurses, connection with health facilities, and formal jobs in nursing in the state of Minas Gerais. The evolution of Undergraduate Nursing courses reveals a supply and demand decline in recent years. This context is shaped by a nursing labor market marked by the contradiction of a quantitative surplus of professionals, particularly in the state's less developed areas, alongside a low percentage of nurses available to care for the population's health. These characteristics of the nursing labor market reproduce inequalities; furthermore, aspects such as the regulation of nursing education and the creation of new jobs need to be discussed further.

  19. Extended Eden model reproduces growth of an acellular slime mold

    NASA Astrophysics Data System (ADS)

    Wagner, Geri; Halvorsrud, Ragnhild; Meakin, Paul

    1999-11-01

    A stochastic growth model was used to simulate the growth of the acellular slime mold Physarum polycephalum on substrates where the nutrients were confined in separate drops. Growth of Physarum on such substrates was previously studied experimentally and found to produce a range of different growth patterns [Phys. Rev. E 57, 941 (1998)]. The model represented the aging of cluster sites and differed from the original Eden model in that the occupation probability of perimeter sites depended on the time of occupation of adjacent cluster sites. This feature led to a bias in the selection of growth directions. A moderate degree of persistence was found to be crucial to reproduce the biological growth patterns under various conditions. Persistence in growth combined quick propagation in heterogeneous environments with a high probability of locating sources of nutrients.
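
    The growth rule described above can be illustrated with a short, hypothetical sketch: an Eden-type lattice model in which the probability of occupying a perimeter site is weighted by the age of the adjacent cluster sites, so that growth persists in the direction of recently added sites. The weighting function and parameters are illustrative, not those of the published model.

```python
# Minimal, hypothetical sketch of an Eden-type growth model with site aging:
# perimeter sites adjacent to recently occupied cluster sites are favoured,
# which biases growth direction (persistence).
import numpy as np

def eden_with_aging(size=101, steps=500, tau=20.0, seed=4):
    rng = np.random.default_rng(seed)
    t_occ = np.full((size, size), -1, dtype=int)   # occupation time, -1 = empty
    t_occ[size // 2, size // 2] = 0                # seed site at the centre
    nbrs = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    for step in range(1, steps + 1):
        perimeter, weights = [], []
        for i, j in np.argwhere(t_occ >= 0):
            age = step - t_occ[i, j]
            for di, dj in nbrs:
                ni, nj = i + di, j + dj
                if 0 <= ni < size and 0 <= nj < size and t_occ[ni, nj] < 0:
                    # each occupied neighbour adds a weight that decays with its age,
                    # so perimeter sites next to young cluster sites are favoured
                    perimeter.append((ni, nj))
                    weights.append(np.exp(-age / tau))
        w = np.array(weights) / np.sum(weights)
        ni, nj = perimeter[rng.choice(len(perimeter), p=w)]
        t_occ[ni, nj] = step
    return t_occ

if __name__ == "__main__":
    cluster = eden_with_aging()
    print("occupied sites:", int(np.sum(cluster >= 0)))
```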

  20. Reproducibility in density functional theory calculations of solids.

    PubMed

    Lejaeghere, Kurt; Bihlmayer, Gustav; Björkman, Torbjörn; Blaha, Peter; Blügel, Stefan; Blum, Volker; Caliste, Damien; Castelli, Ivano E; Clark, Stewart J; Dal Corso, Andrea; de Gironcoli, Stefano; Deutsch, Thierry; Dewhurst, John Kay; Di Marco, Igor; Draxl, Claudia; Dułak, Marcin; Eriksson, Olle; Flores-Livas, José A; Garrity, Kevin F; Genovese, Luigi; Giannozzi, Paolo; Giantomassi, Matteo; Goedecker, Stefan; Gonze, Xavier; Grånäs, Oscar; Gross, E K U; Gulans, Andris; Gygi, François; Hamann, D R; Hasnip, Phil J; Holzwarth, N A W; Iuşan, Diana; Jochym, Dominik B; Jollet, François; Jones, Daniel; Kresse, Georg; Koepernik, Klaus; Küçükbenli, Emine; Kvashnin, Yaroslav O; Locht, Inka L M; Lubeck, Sven; Marsman, Martijn; Marzari, Nicola; Nitzsche, Ulrike; Nordström, Lars; Ozaki, Taisuke; Paulatto, Lorenzo; Pickard, Chris J; Poelmans, Ward; Probert, Matt I J; Refson, Keith; Richter, Manuel; Rignanese, Gian-Marco; Saha, Santanu; Scheffler, Matthias; Schlipf, Martin; Schwarz, Karlheinz; Sharma, Sangeeta; Tavazza, Francesca; Thunström, Patrik; Tkatchenko, Alexandre; Torrent, Marc; Vanderbilt, David; van Setten, Michiel J; Van Speybroeck, Veronique; Wills, John M; Yates, Jonathan R; Zhang, Guo-Xu; Cottenier, Stefaan

    2016-03-25

    The widespread popularity of density functional theory has given rise to an extensive range of dedicated codes for predicting molecular and crystalline properties. However, each code implements the formalism in a different way, raising questions about the reproducibility of such predictions. We report the results of a community-wide effort that compared 15 solid-state codes, using 40 different potentials or basis set types, to assess the quality of the Perdew-Burke-Ernzerhof equations of state for 71 elemental crystals. We conclude that predictions from recent codes and pseudopotentials agree very well, with pairwise differences that are comparable to those between different high-precision experiments. Older methods, however, have less precise agreement. Our benchmark provides a framework for users and developers to document the precision of new applications and methodological improvements.
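
    As a hedged illustration of how two codes' equations of state can be compared, the sketch below computes an RMS energy difference between two energy-volume curves on a shared, uniform volume grid; the published protocol additionally fits Birch-Murnaghan equations of state before comparing, and the data here are synthetic.

```python
# Hedged sketch of a Delta-style comparison: the RMS difference between two codes'
# energy-volume curves sampled on the same uniform volume grid (illustrative data).
import numpy as np

def delta_like_metric(energy_a, energy_b):
    """RMS energy difference per atom between two E(V) curves on a uniform V grid."""
    diff = np.asarray(energy_a) - np.asarray(energy_b)
    return np.sqrt(np.mean(diff ** 2))

if __name__ == "__main__":
    v = np.linspace(0.94, 1.06, 61) * 20.0        # volumes around equilibrium (Å³/atom)
    e_code_a = 0.05 * (v - 20.0) ** 2             # toy parabolic E(V), eV/atom
    e_code_b = 0.05 * (v - 20.02) ** 2 + 1e-4     # slightly shifted curve
    print(f"Δ-like value: {delta_like_metric(e_code_a, e_code_b) * 1000:.3f} meV/atom")
```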

  1. GigaDB: promoting data dissemination and reproducibility

    PubMed Central

    Sneddon, Tam P.; Si Zhe, Xiao; Edmunds, Scott C.; Li, Peter; Goodman, Laurie; Hunter, Christopher I.

    2014-01-01

    Often papers are published where the underlying data supporting the research are not made available because of the limitations of making such large data sets publicly and permanently accessible. Even if the raw data are deposited in public archives, the essential analysis intermediaries, scripts or software are frequently not made available, meaning the science is not reproducible. The GigaScience journal is attempting to address this issue with the associated data storage and dissemination portal, the GigaScience database (GigaDB). Here we present the current version of GigaDB and reveal plans for the next generation of improvements. However, most importantly, we are soliciting responses from you, the users, to ensure that future developments are focused on the data storage and dissemination issues that still need resolving. Database URL: http://www.gigadb.org PMID:24622612

  2. Reproducibility in density functional theory calculations of solids.

    PubMed

    Lejaeghere, Kurt; Bihlmayer, Gustav; Björkman, Torbjörn; Blaha, Peter; Blügel, Stefan; Blum, Volker; Caliste, Damien; Castelli, Ivano E; Clark, Stewart J; Dal Corso, Andrea; de Gironcoli, Stefano; Deutsch, Thierry; Dewhurst, John Kay; Di Marco, Igor; Draxl, Claudia; Dułak, Marcin; Eriksson, Olle; Flores-Livas, José A; Garrity, Kevin F; Genovese, Luigi; Giannozzi, Paolo; Giantomassi, Matteo; Goedecker, Stefan; Gonze, Xavier; Grånäs, Oscar; Gross, E K U; Gulans, Andris; Gygi, François; Hamann, D R; Hasnip, Phil J; Holzwarth, N A W; Iuşan, Diana; Jochym, Dominik B; Jollet, François; Jones, Daniel; Kresse, Georg; Koepernik, Klaus; Küçükbenli, Emine; Kvashnin, Yaroslav O; Locht, Inka L M; Lubeck, Sven; Marsman, Martijn; Marzari, Nicola; Nitzsche, Ulrike; Nordström, Lars; Ozaki, Taisuke; Paulatto, Lorenzo; Pickard, Chris J; Poelmans, Ward; Probert, Matt I J; Refson, Keith; Richter, Manuel; Rignanese, Gian-Marco; Saha, Santanu; Scheffler, Matthias; Schlipf, Martin; Schwarz, Karlheinz; Sharma, Sangeeta; Tavazza, Francesca; Thunström, Patrik; Tkatchenko, Alexandre; Torrent, Marc; Vanderbilt, David; van Setten, Michiel J; Van Speybroeck, Veronique; Wills, John M; Yates, Jonathan R; Zhang, Guo-Xu; Cottenier, Stefaan

    2016-03-25

    The widespread popularity of density functional theory has given rise to an extensive range of dedicated codes for predicting molecular and crystalline properties. However, each code implements the formalism in a different way, raising questions about the reproducibility of such predictions. We report the results of a community-wide effort that compared 15 solid-state codes, using 40 different potentials or basis set types, to assess the quality of the Perdew-Burke-Ernzerhof equations of state for 71 elemental crystals. We conclude that predictions from recent codes and pseudopotentials agree very well, with pairwise differences that are comparable to those between different high-precision experiments. Older methods, however, have less precise agreement. Our benchmark provides a framework for users and developers to document the precision of new applications and methodological improvements. PMID:27013736

  3. geoknife: Reproducible web-processing of large gridded datasets

    USGS Publications Warehouse

    Read, Jordan S.; Walker, Jordan I.; Appling, Alison P.; Blodgett, David L.; Read, Emily Kara; Winslow, Luke A.

    2016-01-01

    Geoprocessing of large gridded data according to overlap with irregular landscape features is common to many large-scale ecological analyses. The geoknife R package was created to facilitate reproducible analyses of gridded datasets found on the U.S. Geological Survey Geo Data Portal web application or elsewhere, using a web-enabled workflow that eliminates the need to download and store large datasets that are reliably hosted on the Internet. The package provides access to several data subset and summarization algorithms that are available on remote web processing servers. Outputs from geoknife include spatial and temporal data subsets, spatially-averaged time series values filtered by user-specified areas of interest, and categorical coverage fractions for various land-use types.

  4. Whole blood metal ion measurement reproducibility between different laboratories.

    PubMed

    Rahmé, Michel; Lavigne, Martin; Barry, Janie; Cirtiu, Ciprian Mihai; Bélanger, Patrick; Vendittoli, Pascal-André

    2014-11-01

    Monitoring patients' metal ion blood concentrations can be useful in cases of problematic metal-on-metal hip implants. Our objective was to evaluate the reproducibility of metal ion level values measured by two different laboratories. Whole blood samples were collected in 46 patients with metal-on-metal hip arthroplasty. For each patient, two whole blood samples were collected and analyzed by two laboratories. Laboratory 1 had higher results than laboratory 2. There was a clinically significant absolute difference between the two laboratories, above the predetermined threshold, in 35% of Cr samples and 38% of Co samples. Not all laboratories use the same technologies for their measurements. Therefore, the decision to revise a metal-on-metal hip arthroplasty should rely on metal ion trends, with measurements performed in the same laboratory.

  5. Robust tissue classification for reproducible wound assessment in telemedicine environments

    NASA Astrophysics Data System (ADS)

    Wannous, Hazem; Treuillet, Sylvie; Lucas, Yves

    2010-04-01

    In telemedicine environments, a standardized and reproducible assessment of wounds, using a simple free-handled digital camera, is an essential requirement. However, to ensure robust tissue classification, particular attention must be paid to the complete design of the color processing chain. We introduce the key steps, including color correction, merging of expert labeling, and segmentation-driven classification based on support vector machines. The tool thus developed ensures stability under changes in lighting conditions, viewpoint, and camera, to achieve accurate and robust classification of skin tissues. Clinical tests demonstrate that such an advanced tool, which forms part of a complete 3-D and color wound assessment system, significantly improves the monitoring of the healing process. It achieves an overlap score of 79.3% against 69.1% for a single expert, after mapping on the medical reference developed from the image labeling by a college of experts.
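
    A minimal sketch of segmentation-driven classification with a support vector machine is shown below, using scikit-learn and synthetic region-level colour features; it omits the colour correction and expert-label merging steps that the described system includes.

```python
# Minimal sketch (not the authors' pipeline): an RBF-kernel SVM classifying
# region-level colour features into skin-tissue classes, on synthetic data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(5)
# hypothetical mean RGB features per segmented region, one cluster per tissue class
centers = np.array([[170, 60, 60],    # granulation
                    [200, 190, 120],  # slough
                    [40, 35, 30]],    # necrosis
                   dtype=float)
X = np.vstack([c + rng.normal(0, 15, size=(200, 3)) for c in centers])
y = np.repeat(np.arange(3), 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```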

  6. Reproducing the kinematics of damped Lyman α systems

    NASA Astrophysics Data System (ADS)

    Bird, Simeon; Haehnelt, Martin; Neeleman, Marcel; Genel, Shy; Vogelsberger, Mark; Hernquist, Lars

    2015-02-01

    We examine the kinematic structure of damped Lyman α systems (DLAs) in a series of cosmological hydrodynamic simulations using the AREPO code. We are able to match the distribution of velocity widths of associated low-ionization metal absorbers substantially better than earlier work. Our simulations produce a population of DLAs dominated by haloes with virial velocities around 70 km s-1, consistent with a picture of relatively small, faint objects. In addition, we reproduce the observed correlation between velocity width and metallicity and the equivalent width distribution of Si II. Some discrepancies of moderate statistical significance remain; too many of our spectra show absorption concentrated at the edge of the profile and there are slight differences in the exact shape of the velocity width distribution. We show that the improvement over previous work is mostly due to our strong feedback from star formation and our detailed modelling of the metal ionization state.

  7. Initial evaluations of the reproducibility of vapor-diffusion crystallization.

    PubMed

    Newman, Janet; Xu, Jian; Willis, Michael C

    2007-07-01

    Experiments were set up to test how the crystallization drop size affects the crystallization process; in the test cases studied, increasing the drop size led to increasing numbers of crystals. Other data produced from a high-throughput automation-system run were analyzed in order to gauge the effect of replication on the success of crystallization screening. With over 40-fold multiplicity, lysozyme was found to crystallize in over half of the conditions in a standard 96-condition screen. However, despite the fact that industry-standard lysozyme was used in our tests, it was rare that we obtained crystals reproducibly; this suggests that replication whilst screening might improve the success rate of macromolecular crystallization.

  8. The Vienna LTE simulators - Enabling reproducibility in wireless communications research

    NASA Astrophysics Data System (ADS)

    Mehlführer, Christian; Colom Ikuno, Josep; Šimko, Michal; Schwarz, Stefan; Wrulich, Martin; Rupp, Markus

    2011-12-01

    In this article, we introduce MATLAB-based link and system level simulation environments for UMTS Long-Term Evolution (LTE). The source codes of both simulators are available under an academic non-commercial use license, allowing researchers full access to standard-compliant simulation environments. Owing to the open source availability, the simulators enable reproducible research in wireless communications and comparison of novel algorithms. In this study, we explain how link and system level simulations are connected and show how the link level simulator serves as a reference to design the system level simulator. We compare the accuracy of the PHY modeling at system level by means of simulations performed both with bit-accurate link level simulations and PHY-model-based system level simulations. We highlight some of the currently most interesting research questions for LTE, and explain by some research examples how our simulators can be applied.

  9. Investigating the reproducibility of a complex multifocal radiosurgery treatment

    PubMed Central

    Niebanck, M; Juang, T; Newton, J; Adamovics, J; Wang, Z; Oldham, M

    2013-01-01

    Stereotactic radiosurgery has become a widely used technique to treat solid tumors and secondary metastases of the brain. Multiple targets can be simultaneously treated with a single isocenter in order to reduce set-up time and improve patient comfort and workflow. In this study, a 5-arc multifocal RapidArc treatment was delivered to multiple PRESAGE® dosimeters in order to explore the repeatability of the treatment. The three delivery measurements agreed well with each other, with less than 3% standard deviation of dose in the target. The deliveries also agreed well with the treatment plan, with gamma passing rates greater than 90% (5% dose-difference and 2 mm distance-to-agreement criteria). The optical-CT PRESAGE® system provided a reproducible measurement for treatment verification, as long as measurements were made immediately following treatment. PMID:27081397
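
    The gamma passing rates quoted above combine a dose-difference and a distance-to-agreement criterion. As a hedged illustration only, the sketch below computes a simplified one-dimensional global gamma passing rate for a 5%/2 mm criterion on synthetic dose profiles; it is not the dosimetry software used in the study.

```python
# Simplified 1-D global gamma analysis (5% dose difference, 2 mm DTA).
# Illustrative sketch only, not the analysis tools used in the study.
import numpy as np

def gamma_pass_rate(ref_dose, eval_dose, positions, dd=0.05, dta=2.0):
    """Fraction of reference points with gamma <= 1 (global normalization)."""
    norm = dd * ref_dose.max()  # global dose-difference criterion
    passed = 0
    for x_r, d_r in zip(positions, ref_dose):
        # Gamma at this point: minimum over all evaluated points of the
        # combined dose-difference / distance-to-agreement metric.
        gamma_sq = ((positions - x_r) / dta) ** 2 + ((eval_dose - d_r) / norm) ** 2
        if np.sqrt(gamma_sq.min()) <= 1.0:
            passed += 1
    return passed / len(ref_dose)

# Toy example: a Gaussian dose profile and a slightly shifted, rescaled copy.
x = np.linspace(-50, 50, 501)                    # mm, 0.2 mm spacing
reference = 2.0 * np.exp(-x**2 / (2 * 10.0**2))  # Gy
measured = 2.04 * np.exp(-(x - 0.5)**2 / (2 * 10.0**2))
print(f"gamma passing rate: {gamma_pass_rate(reference, measured, x):.3f}")
```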

  10. Reproducing kernel Hilbert space based single infrared image super resolution

    NASA Astrophysics Data System (ADS)

    Chen, Liangliang; Deng, Liangjian; Shen, Wei; Xi, Ning; Zhou, Zhanxin; Song, Bo; Yang, Yongliang; Cheng, Yu; Dong, Lixin

    2016-07-01

    The spatial resolution of infrared (IR) images is limited by lens optical diffraction, sensor array pitch size and pixel dimension. In this work, a robust model is proposed to reconstruct a high resolution infrared image from a single low resolution sampling; the image features are discussed and classified as reflective, cooled emissive and uncooled emissive according to the infrared irradiation source. A spline-based reproducing kernel Hilbert space and an approximate Heaviside function are deployed to model the smooth part and the edge component of the image, respectively. By adjusting the parameters of the Heaviside function, the proposed model can enhance distinct parts of the image. The experimental results show that the model is applicable to both reflective and emissive low resolution infrared images and improves thermal contrast. The overall outcome is a high resolution IR image, which gives the IR camera better measurement accuracy and reveals more detail at long distance.
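
    The model combines a reproducing-kernel term for the smooth component with an approximate Heaviside function for edges. The one-dimensional sketch below illustrates that combination; the Gaussian kernel, the known edge location, and all parameter values are assumptions for illustration (the paper itself uses a spline reproducing kernel on images).

```python
# 1-D sketch of the two model ingredients: kernel (RKHS) regression for the
# smooth component and an approximate Heaviside function for an edge.
# Gaussian kernel, edge location, and parameters are illustrative assumptions.
import numpy as np

def gaussian_kernel(x, y, sigma=0.1):
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2 * sigma ** 2))

def approx_heaviside(x, x0, eps=0.02):
    """Smooth step centered at x0; eps controls edge sharpness."""
    return 1.0 / (1.0 + np.exp(-(x - x0) / eps))

# Low-resolution samples of a signal with a smooth part plus a sharp edge.
x_lo = np.linspace(0, 1, 20)
y_lo = np.sin(2 * np.pi * x_lo) + 1.5 * (x_lo > 0.6)

# Fit the smooth part by regularized kernel regression after subtracting an
# explicit edge term placed at the (here assumed known) discontinuity.
edge_lo = 1.5 * approx_heaviside(x_lo, 0.6)
K = gaussian_kernel(x_lo, x_lo)
alpha = np.linalg.solve(K + 1e-3 * np.eye(len(x_lo)), y_lo - edge_lo)

# Evaluate on a fine grid: smooth RKHS part + edge component.
x_hi = np.linspace(0, 1, 200)
y_hi = gaussian_kernel(x_hi, x_lo) @ alpha + 1.5 * approx_heaviside(x_hi, 0.6)
print(y_hi.shape)  # (200,) upsampled reconstruction
```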

  11. New model for datasets citation and extraction reproducibility in VAMDC

    NASA Astrophysics Data System (ADS)

    Zwölf, Carlo Maria; Moreau, Nicolas; Dubernet, Marie-Lise

    2016-09-01

    In this paper we present a new paradigm for the identification of datasets extracted from the Virtual Atomic and Molecular Data Centre (VAMDC) e-science infrastructure. Such identification includes information on the origin and version of the datasets, references associated to individual data in the datasets, as well as timestamps linked to the extraction procedure. This paradigm is described through the modifications of the language used to exchange data within the VAMDC and through the services that will implement those modifications. This new paradigm should enforce traceability of datasets, favor reproducibility of datasets extraction, and facilitate the systematic citation of the authors having originally measured and/or calculated the extracted atomic and molecular data.

  12. Accurate and reproducible determination of lignin molar mass by acetobromination.

    PubMed

    Asikkala, Janne; Tamminen, Tarja; Argyropoulos, Dimitris S

    2012-09-12

    The accurate and reproducible determination of lignin molar mass by using size exclusion chromatography (SEC) is challenging. The lignin association effects, known to dominate underivatized lignins, have been thoroughly addressed by reaction with acetyl bromide in an excess of glacial acetic acid. The combination of a concerted acetylation with the introduction of bromine within the lignin alkyl side chains is thought to be responsible for the excellent solubilization characteristics that acetobromination imparts to a variety of lignin samples. The proposed methodology was compared and contrasted with traditional lignin derivatization methods. In addition, side reactions that could possibly be induced under the acetobromination conditions were explored with native softwood (milled wood lignin, MWL) and technical (kraft) lignin. These efforts support the use of room-temperature acetobromination as a facile, effective, and universal lignin derivatization method to be employed prior to SEC measurements. PMID:22870925

  13. Evaluation of Statistical Downscaling Skill at Reproducing Extreme Events

    NASA Astrophysics Data System (ADS)

    McGinnis, S. A.; Tye, M. R.; Nychka, D. W.; Mearns, L. O.

    2015-12-01

    Climate model outputs usually have much coarser spatial resolution than is needed by impacts models. Although higher resolution can be achieved using regional climate models for dynamical downscaling, further downscaling is often required. The final resolution gap is often closed with a combination of spatial interpolation and bias correction, which constitutes a form of statistical downscaling. We use this technique to downscale regional climate model data and evaluate its skill in reproducing extreme events. We downscale output from the North American Regional Climate Change Assessment Program (NARCCAP) dataset from its native 50-km spatial resolution to the 4-km resolution of the University of Idaho's METDATA gridded surface meteorological dataset, which derives from the PRISM and NLDAS-2 observational datasets. We operate on the major variables used in impacts analysis at a daily timescale: daily minimum and maximum temperature, precipitation, humidity, pressure, solar radiation, and winds. To interpolate the data, we use the patch recovery method from the Earth System Modeling Framework (ESMF) regridding package. We then bias correct the data using Kernel Density Distribution Mapping (KDDM), which has been shown to exhibit superior overall performance across multiple metrics. Finally, we evaluate the skill of this technique in reproducing extreme events by comparing raw and downscaled output with meteorological station data in different bioclimatic regions according to the skill scores defined by Perkins et al. in 2013 for evaluation of AR4 climate models. We also investigate techniques for improving bias correction of values in the tails of the distributions. These techniques include binned kernel density estimation, logspline kernel density estimation, and transfer functions constructed by fitting the tails with a generalized Pareto distribution.
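
    As a hedged illustration of distribution-mapping bias correction in the spirit of KDDM, the sketch below estimates smooth CDFs of model output and observations with Gaussian kernel density estimates and maps model values through them. The synthetic data and implementation details are assumptions, not the NARCCAP processing code.

```python
# Illustrative distribution-mapping bias correction in the spirit of KDDM:
# estimate smooth CDFs of model and observed data with Gaussian KDEs, then
# map each model value through F_model and the inverse of F_obs.
# Simplified sketch only, not the code used in the study.
import numpy as np
from scipy.stats import gaussian_kde

def kde_cdf(sample, grid):
    """Numerical CDF of a Gaussian KDE evaluated on a grid."""
    pdf = gaussian_kde(sample)(grid)
    cdf = np.cumsum(pdf)
    return cdf / cdf[-1]

rng = np.random.default_rng(1)
obs = rng.gamma(shape=2.0, scale=3.0, size=2000)         # "observed" daily values
model = rng.gamma(shape=2.0, scale=3.0, size=2000) + 2   # biased model output

grid = np.linspace(min(obs.min(), model.min()), max(obs.max(), model.max()), 1000)
F_model = kde_cdf(model, grid)
F_obs = kde_cdf(obs, grid)

# Map model values: value -> model CDF probability -> observed quantile.
p = np.interp(model, grid, F_model)
corrected = np.interp(p, F_obs, grid)

print(f"mean bias before: {model.mean() - obs.mean():+.2f}")
print(f"mean bias after:  {corrected.mean() - obs.mean():+.2f}")
```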

  14. Reproducibility of MRI segmentation using a feature space method

    NASA Astrophysics Data System (ADS)

    Soltanian-Zadeh, Hamid; Windham, Joe P.; Scarpace, Lisa; Murnock, Tanya

    1998-06-01

    This paper presents reproducibility studies for the segmentation results obtained by our optimal MRI feature space method. The steps of the work accomplished are as follows. (1) Eleven patients with brain tumors were imaged by a 1.5 T General Electric Signa MRI System. Four T2-weighted and two T1-weighted images (before and after Gadolinium injection) were acquired for each patient. (2) Images of a slice through the center of the tumor were selected for processing. (3) Patient information was removed from the image headers and new names (unrecognizable by the image analysts) were given to the images. These images were blindly analyzed by the image analysts. (4) Segmentation results obtained by the two image analysts at two time points were compared to assess the reproducibility of the segmentation method. For each tissue segmented in each patient study, a comparison was done by the kappa statistic and a similarity measure (an approximation of the kappa statistic used by other researchers), to evaluate the number of pixels that were in both of the segmentation results obtained by the two image analysts (agreement) relative to the number of pixels that were not in both (disagreement). An overall agreement comparison was done by finding means and standard deviations of the kappa statistic and the similarity measure for each tissue type in the studies. The kappa statistic for white matter was the largest (0.80), followed by those of gray matter (0.68), partial volume (0.67), total lesion (0.66), and CSF (0.44). The similarity measure showed the same trend but was always higher than the kappa statistic. It was 0.85 for white matter, 0.77 for gray matter, 0.73 for partial volume, 0.72 for total lesion, and 0.47 for CSF.
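
    To make the agreement measures concrete, the sketch below computes Cohen's kappa and a Dice-style similarity for two binary segmentation masks. The Dice coefficient stands in here for the paper's similarity measure (described only as an approximation of kappa), and the masks are synthetic.

```python
# Sketch: agreement between two analysts' binary segmentations of one tissue.
# Cohen's kappa and the Dice coefficient are shown; Dice serves only as an
# illustrative stand-in for the paper's kappa-approximating similarity measure.
import numpy as np

def cohens_kappa(a, b):
    """Kappa for two binary masks (1 = tissue, 0 = background)."""
    a, b = a.ravel().astype(bool), b.ravel().astype(bool)
    p_observed = np.mean(a == b)
    p_chance = a.mean() * b.mean() + (1 - a.mean()) * (1 - b.mean())
    return (p_observed - p_chance) / (1 - p_chance)

def dice(a, b):
    a, b = a.ravel().astype(bool), b.ravel().astype(bool)
    return 2 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

# Two synthetic 64x64 masks that mostly overlap.
rng = np.random.default_rng(2)
mask1 = np.zeros((64, 64), dtype=int)
mask1[16:48, 16:48] = 1
mask2 = mask1.copy()
mask2[rng.random(mask1.shape) < 0.05] ^= 1   # 5% random disagreement

print(f"kappa = {cohens_kappa(mask1, mask2):.2f}, dice = {dice(mask1, mask2):.2f}")
```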

  15. Galaxy Zoo: reproducing galaxy morphologies via machine learning

    NASA Astrophysics Data System (ADS)

    Banerji, Manda; Lahav, Ofer; Lintott, Chris J.; Abdalla, Filipe B.; Schawinski, Kevin; Bamford, Steven P.; Andreescu, Dan; Murray, Phil; Raddick, M. Jordan; Slosar, Anze; Szalay, Alex; Thomas, Daniel; Vandenberg, Jan

    2010-07-01

    We present morphological classifications obtained using machine learning for objects in the Sloan Digital Sky Survey DR6 that have been classified by Galaxy Zoo into three classes, namely early types, spirals and point sources/artefacts. An artificial neural network is trained on a subset of objects classified by the human eye, and we test whether the machine-learning algorithm can reproduce the human classifications for the rest of the sample. We find that the success of the neural network in matching the human classifications depends crucially on the set of input parameters chosen for the machine-learning algorithm. The colours and parameters associated with profile fitting are reasonably effective at separating the objects into the three classes. However, these results are considerably improved when adding adaptive shape parameters as well as concentration and texture. The adaptive moments, concentration and texture parameters alone cannot distinguish between early type galaxies and the point sources/artefacts. Using a set of 12 parameters, the neural network is able to reproduce the human classifications to better than 90 per cent for all three morphological classes. We find that using a training set that is incomplete in magnitude does not degrade our results given our particular choice of the input parameters to the network. We conclude that it is promising to use machine-learning algorithms to perform morphological classification for the next generation of wide-field imaging surveys and that the Galaxy Zoo catalogue provides an invaluable training set for such purposes. This publication has been made possible by the participation of more than 100000 volunteers in the Galaxy Zoo project. Their contributions are individually acknowledged at http://www.galaxyzoo.org/Volunteers.aspx.
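
    As a rough sketch of this approach, the code below trains a small neural network to reproduce class labels from two photometric-style parameters. The synthetic features and the scikit-learn MLP are stand-ins for the survey parameters and the artificial neural network used in the paper.

```python
# Sketch: train a small neural network to reproduce crowd-sourced morphology
# labels from photometric parameters. Features are synthetic; the scikit-learn
# MLP is only a stand-in for the artificial neural network in the paper.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
n = 3000

# Synthetic "colour + concentration" features for three classes:
# 0 = early type, 1 = spiral, 2 = point source/artefact.
early  = rng.normal([2.5, 3.0], 0.3, size=(n, 2))
spiral = rng.normal([1.2, 2.2], 0.3, size=(n, 2))
point  = rng.normal([0.8, 3.5], 0.3, size=(n, 2))
X = np.vstack([early, spiral, point])
y = np.repeat([0, 1, 2], n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
net = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(16,), max_iter=500,
                                  random_state=0))
net.fit(X_train, y_train)
print(f"agreement with 'human' labels: {net.score(X_test, y_test):.2%}")
```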

  16. A workflow for reproducing mean benthic gas fluxes

    NASA Astrophysics Data System (ADS)

    Fulweiler, Robinson W.; Emery, Hollie E.; Maguire, Timothy J.

    2016-08-01

    Long-term data sets provide unique opportunities to examine temporal variability of key ecosystem processes. The need for such data sets is becoming increasingly important as we try to quantify the impact of human activities across various scales and in some cases, as we try to determine the success of management interventions. Unfortunately, long-term benthic flux data sets for coastal ecosystems are rare and curating them is a challenge. If we wish to make our data available to others now and into the future, however, then we need to provide mechanisms that allow others to understand our methods, access the data, reproduce the results, and see updates as they become available. Here we use techniques, learned through the EarthCube Ontosoft Geoscience Paper of the Future project, to develop best practices to allow us to share a long-term data set of directly measured net sediment N2 fluxes and sediment oxygen demand at two sites in Narragansett Bay, Rhode Island (USA). This technical report describes the process we used, the challenges we faced, and the steps we will take in the future to ensure transparency and reproducibility. By developing these data and software sharing tools we hope to help disseminate well-curated data with provenance as well as products from these data, so that the community can better assess how this temperate estuary has changed over time. We also hope to provide a data sharing model for others to follow so that long-term estuarine data are more easily shared and not lost over time.

  18. Acceptability of GM foods among Pakistani consumers.

    PubMed

    Ali, Akhter; Rahut, Dil Bahadur; Imtiaz, Muhammad

    2016-04-01

    In Pakistan, the majority of consumers do not have information about genetically modified (GM) foods. In developing countries, and particularly in Pakistan, few studies have focused on consumers' acceptability of GM foods. Using a comprehensive primary dataset collected from 320 consumers in Pakistan in 2013, this study analyzes the determinants of consumers' acceptability of GM foods. The data were analyzed by employing bivariate probit and censored least absolute deviation (CLAD) models. The empirical results indicated that urban consumers are more aware of GM foods than rural consumers. Acceptance of GM foods was higher among female consumers than among male consumers. In addition, older consumers were more willing to accept GM foods than young consumers. The acceptability of GM foods was also higher among wealthier households. Low price is the key factor leading to the acceptability of GM foods. The acceptability of GM foods also reduces the risks among Pakistani consumers. PMID:27494790

  19. Relative Validity and Reproducibility of a Quantitative Food Frequency Questionnaire for Adolescents with Type 1 Diabetes: Validity of a Food Frequency Questionnaire

    PubMed Central

    Marques, Rosana de Moraes Borges; de Oliveira, Amanda Cristine; Teles, Sheylle Almeida da Silva; Stringuini, Maria Luiza Ferreira; Fornés, Nélida Shimid

    2014-01-01

    Background. Food frequency questionnaires are used to assess dietary intake in epidemiological studies. Objective. The aim of the study was to assess the relative validity and reproducibility of a quantitative food frequency questionnaire (QFFQ) for adolescents with type 1 diabetes. Methods. Validity was evaluated by comparing the data generated by the QFFQ to those of 24-hour recalls (24 hrs). The QFFQ was applied twice per patient to assess reproducibility. Statistical analysis included t-tests, Pearson correlation coefficients where appropriate, correction of measurements for chance agreement by the weighted kappa method, intraclass correlation coefficients, and Bland-Altman plots (P < 0.05). Results. The total energy and nutrient intakes estimated by the QFFQ were significantly higher than those from the 24 hrs. Pearson correlation coefficients for energy-adjusted, deattenuated data ranged from 0.32 (protein) to 0.75 (lipid, unsaturated fat and calcium). Weighted kappa values ranged from 0.15 (vitamin C) to 0.45 (calcium). Bland-Altman plots indicated acceptable validity. As for reproducibility, intraclass correlation coefficients ranged from 0.24 (calcium) to 0.65 (lipid), and the Bland-Altman plots showed good agreement between the two questionnaires. Conclusion. The QFFQ showed an acceptable ability to correctly classify adolescents with type 1 diabetes according to their levels of dietary intake, with good reproducibility. PMID:25250051
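
    The reproducibility statistics reported above can be illustrated with a short calculation: the sketch below computes an ICC(2,1) from a two-way ANOVA decomposition and Bland-Altman limits of agreement for two administrations of a questionnaire. The intake data are synthetic and the specific ICC form is an assumption for illustration.

```python
# Sketch: reproducibility statistics for two administrations of a questionnaire
# (synthetic intake data): ICC(2,1) from a two-way ANOVA decomposition and
# Bland-Altman limits of agreement. Numbers are illustrative, not study data.
import numpy as np

def icc_2_1(Y):
    """Two-way random-effects, absolute-agreement, single-measure ICC."""
    n, k = Y.shape
    grand = Y.mean()
    ms_rows = k * ((Y.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    ms_cols = n * ((Y.mean(axis=0) - grand) ** 2).sum() / (k - 1)
    ss_err = ((Y - Y.mean(axis=1, keepdims=True)
                 - Y.mean(axis=0, keepdims=True) + grand) ** 2).sum()
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

rng = np.random.default_rng(4)
true_intake = rng.normal(2000, 400, size=60)        # kcal/day
qffq1 = true_intake + rng.normal(0, 150, size=60)   # first administration
qffq2 = true_intake + rng.normal(0, 150, size=60)   # second administration

diff = qffq1 - qffq2
loa = (diff.mean() - 1.96 * diff.std(ddof=1),
       diff.mean() + 1.96 * diff.std(ddof=1))
print(f"ICC(2,1) = {icc_2_1(np.column_stack([qffq1, qffq2])):.2f}")
print(f"Bland-Altman limits of agreement: {loa[0]:.0f} to {loa[1]:.0f} kcal/day")
```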

  20. Reproducibility and Variability of the Cost Functions Reconstructed from Experimental Recordings in Multi-Finger Prehension

    PubMed Central

    Niu, Xun; Latash, Mark L.; Zatsiorsky, Vladimir M.

    2012-01-01

    The main goal of the study is to examine whether the cost (objective) functions reconstructed from experimental recordings in multi-finger prehension tasks are reproducible over time, i.e., whether the functions reflect stable preferences of the subjects and can be considered personal characteristics of motor coordination. Young, healthy participants grasped an instrumented handle with varied values of external torque, load and target grasping force and repeated the trials on three days: Day 1, Day 2, and Day 7. By following Analytical Inverse Optimization (ANIO) computation procedures, the cost functions for individual subjects were reconstructed from the experimental recordings (individual finger forces) for each day. The cost functions represented second-order polynomials of finger forces with non-zero linear terms. To check whether the obtained cost functions were reproducible over time a cross-validation was performed: a cost function obtained on Day i was applied to experimental data observed on Day j (i≠j). In spite of the observed day-to-day variability of the performance and the cost functions, the ANIO reconstructed cost functions were found to be reproducible over time: application of a cost function Ci to the data of Day j (i≠j) resulted in smaller deviations from the experimental observations than using other commonly used cost functions. Other findings are: (a) The 2nd order coefficients Ki of the cost function showed negative linear relations with finger force magnitudes. This fact may be interpreted as encouraging involvement of stronger fingers in tasks requiring higher total force magnitude production. (b) The finger forces were distributed on a 2-dimensional plane in the 4-dimensional finger force space, which has been confirmed for all subjects and all testing sessions. (c) The discovered principal components in the principal component analysis of the finger forces agreed well with the principle of superposition, i.e. the complex action of
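
    Finding (b), that the finger forces lie close to a two-dimensional plane in the four-dimensional force space, can be checked with a simple principal component analysis, as in the hedged sketch below. The forces are synthetic and the code does not implement the ANIO reconstruction itself.

```python
# Sketch for finding (b): check that 4-D finger-force data lie close to a
# 2-D plane via PCA (fraction of variance in the first two components).
# The forces below are synthetic; this is not the ANIO reconstruction itself.
import numpy as np

rng = np.random.default_rng(5)
n_trials = 200

# Generate forces that, by construction, live mostly in a 2-D subspace of R^4.
basis = rng.normal(size=(2, 4))
coords = rng.normal(size=(n_trials, 2)) * [3.0, 1.5]
forces = 10.0 + coords @ basis + rng.normal(0, 0.2, size=(n_trials, 4))  # N

# PCA via SVD of the mean-centred force matrix.
centered = forces - forces.mean(axis=0)
_, s, _ = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / (s**2).sum()
print(f"variance captured by first two PCs: {explained[:2].sum():.1%}")
```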

  1. Analytical caustic surfaces

    NASA Technical Reports Server (NTRS)

    Schmidt, R. F.

    1987-01-01

    This document discusses the determination of caustic surfaces in terms of rays, reflectors, and wavefronts. Analytical caustics are obtained as a family of lines, a set of points, and several types of equations for geometries encountered in optics and microwave applications. Standard methods of differential geometry are applied under different approaches: directly to reflector surfaces, and alternatively, to wavefronts, to obtain analytical caustics of two sheets or branches. Gauss/Seidel aberrations are introduced into the wavefront approach, forcing the retention of all three coefficients of both the first- and the second-fundamental forms of differential geometry. An existing method for obtaining caustic surfaces through exploitation of the singularities in flux density is examined, and several constant-intensity contour maps are developed using only the intrinsic Gaussian, mean, and normal curvatures of the reflector. Numerous references are provided for extending the material of the present document to the morphologies of caustics and their associated diffraction patterns.

  2. Requirements for Predictive Analytics

    SciTech Connect

    Troy Hiltbrand

    2012-03-01

    It is important to have a clear understanding of how traditional Business Intelligence (BI) and analytics are different and how they fit together in optimizing organizational decision making. With traditional BI, activities are focused primarily on providing context to enhance a known set of information through aggregation, data cleansing and delivery mechanisms. As these organizations mature their BI ecosystems, they achieve a clearer picture of the key performance indicators signaling the relative health of their operations. Organizations that embark on activities surrounding predictive analytics and data mining go beyond simply presenting the data in a manner that will allow decision makers to have a complete context around the information. These organizations generate models based on known information and then apply other organizational data against these models to reveal unknown information.

  3. Analytic ICF Hohlraum Energetics

    SciTech Connect

    Rosen, M D; Hammer, J

    2003-08-27

    We apply recent analytic solutions to the radiation diffusion equation to problems of interest for ICF hohlraums. The solutions provide quantitative values for absorbed energy which are of use for generating a desired radiation temperature vs. time within the hohlraum. Comparison of supersonic and subsonic solutions (heat front velocity faster or slower, respectively, than the speed of sound in the x-ray heated material) suggests that there may be some advantage in using high Z metallic foams as hohlraum wall material to reduce hydrodynamic losses, and hence, net absorbed energy by the walls. Analytic and numerical calculations suggest that the loss per unit area might be reduced by approximately 20% through use of foam hohlraum walls. Reduced hydrodynamic motion of the wall material may also reduce symmetry swings, as found for heavy ion targets.

  4. Nuclear analytical chemistry

    SciTech Connect

    Brune, D.; Forkman, B.; Persson, B.

    1984-01-01

    This book covers the general theories and techniques of nuclear chemical analysis, directed at applications in analytical chemistry, nuclear medicine, radiophysics, agriculture, environmental sciences, geological exploration, industrial process control, etc. The main principles of nuclear physics and nuclear detection on which the analysis is based are briefly outlined. An attempt is made to emphasise the fundamentals of activation analysis, detection and activation methods, as well as their applications. The book provides guidance in analytical chemistry, agriculture, environmental and biomedical sciences, etc. The contents include: the nuclear periodic system; nuclear decay; nuclear reactions; nuclear radiation sources; interaction of radiation with matter; principles of radiation detectors; nuclear electronics; statistical methods and spectral analysis; methods of radiation detection; neutron activation analysis; charged particle activation analysis; photon activation analysis; sample preparation and chemical separation; nuclear chemical analysis in biological and medical research; the use of nuclear chemical analysis in the field of criminology; nuclear chemical analysis in environmental sciences, geology and mineral exploration; and radiation protection.

  5. Analytic holographic superconductor

    NASA Astrophysics Data System (ADS)

    Herzog, Christopher P.

    2010-06-01

    We investigate a holographic superconductor that admits an analytic treatment near the phase transition. In the dual 3+1-dimensional field theory, the phase transition occurs when a scalar operator of scaling dimension two gets a vacuum expectation value. We calculate current-current correlation functions along with the speed of second sound near the critical temperature. We also make some remarks about critical exponents. An analytic treatment is possible because an underlying Heun equation describing the zero mode of the phase transition has a polynomial solution. Amusingly, the treatment here may generalize for an order parameter with any integer spin, and we propose a Lagrangian for a spin-two holographic superconductor.

  6. Avatars in Analytical Gaming

    SciTech Connect

    Cowell, Andrew J.; Cowell, Amanda K.

    2009-08-29

    This paper discusses the design and use of anthropomorphic computer characters as nonplayer characters (NPCs) within analytical games. These new environments allow avatars to play a central role in supporting training and education goals instead of playing the supporting cast role. This new 'science' of gaming, driven by high-powered but inexpensive computers, dedicated graphics processors and realistic game engines, enables game developers to create learning and training opportunities on par with expensive real-world training scenarios. However, care and attention need to be placed on how avatars are represented and thus perceived. A taxonomy of non-verbal behavior is presented and its application to analytical gaming discussed.

  7. Industrial Analytics Corporation

    SciTech Connect

    Industrial Analytics Corporation

    2004-01-30

    The lost foam casting process is sensitive to the properties of the EPS patterns used for the casting operation. In this project Industrial Analytics Corporation (IAC) has developed a new low voltage x-ray instrument for x-ray radiography of very low mass EPS patterns. IAC has also developed a transmitted visible light method for characterizing the properties of EPS patterns. The systems developed are also applicable to other low density materials including graphite foams.

  8. Competing on analytics.

    PubMed

    Davenport, Thomas H

    2006-01-01

    We all know the power of the killer app. It's not just a support tool; it's a strategic weapon. Companies questing for killer apps generally focus all their firepower on the one area that promises to create the greatest competitive advantage. But a new breed of organization has upped the stakes: Amazon, Harrah's, Capital One, and the Boston Red Sox have all dominated their fields by deploying industrial-strength analytics across a wide variety of activities. At a time when firms in many industries offer similar products and use comparable technologies, business processes are among the few remaining points of differentiation--and analytics competitors wring every last drop of value from those processes. Employees hired for their expertise with numbers or trained to recognize their importance are armed with the best evidence and the best quantitative tools. As a result, they make the best decisions. In companies that compete on analytics, senior executives make it clear--from the top down--that analytics is central to strategy. Such organizations launch multiple initiatives involving complex data and statistical analysis, and quantitative activity is managed at the enterprise (not departmental) level. In this article, professor Thomas H. Davenport lays out the characteristics and practices of these statistical masters and describes some of the very substantial changes other companies must undergo in order to compete on quantitative turf. As one would expect, the transformation requires a significant investment in technology, the accumulation of massive stores of data, and the formulation of company-wide strategies for managing the data. But, at least as important, it also requires executives' vocal, unswerving commitment and willingness to change the way employees think, work, and are treated.

  9. [Acceptance and mindfulness-based cognitive-behavioral therapies].

    PubMed

    Ngô, Thanh-Lan

    2013-01-01

    achieve specific goals. They focus on the present moment rather than on historical causes. However, they also present significant differences: control vs acceptance of thoughts, focus on cognition vs behavior, focus on the relationship between the individual and his thoughts vs cognitive content, goal of modifying dysfunctional beliefs vs metacognitive processes, use of experiential vs didactic methods, focus on symptoms vs quality of life, strategies used before vs after the unfolding of full emotional response. The main interventions based on mindfulness meditation and acceptance are: Acceptance and Commitment Therapy, Functional Analytic Therapy, the expanded model of Behavioral Activation, Metacognitive Therapy, Mindfulness based Cognitive Therapy, Dialectic Behavior Therapy, Integrative Behavioral Couples Therapy and Compassionate Mind Training. These are described in this article. They offer concepts and techniques which might enhance therapeutic efficacy. They teach a new way to deploy attention and to enter into a relationship with current experience (for example, defusion) in order to diminish cognitive reactivity, a maintenance factor for psychopathology, and to enhance psychological flexibility. The focus on cognitive process, metacognition as well as cognitive content might yield additional benefits in therapy. It is possible to combine traditional CBT with third wave approaches by using psychoeducation and cognitive restructuring in the beginning phases of therapy in order to establish thought bias and to then encourage acceptance of internal experiences as well as exposure to feared stimuli rather than to continue to use cognitive restructuring techniques. Traditional CBT and third wave approaches seem to impact different processes: the former enhance the capacity to observe and describe experiences and the latter diminish experiential avoidance and increase conscious action as well as acceptance. The identification of personal values helps to motivate the

  10. Acceptance in Romantic Relationships: The Frequency and Acceptability of Partner Behavior Inventory

    ERIC Educational Resources Information Center

    Doss, Brian D.; Christensen, Andrew

    2006-01-01

    Despite the recent emphasis on acceptance in romantic relationships, no validated measure of relationship acceptance presently exists. To fill this gap, the 20-item Frequency and Acceptability of Partner Behavior Inventory (FAPBI; A. Christensen & N. S. Jacobson, 1997) was created to assess separately the acceptability and frequency of both…

  11. Phase-matching loci and angular acceptance of non-collinear optical parametric amplification.

    PubMed

    Trophème, Benoît; Boulanger, Benoit; Mennerat, Gabriel

    2012-11-19

    A general study of phase-matching loci and associated angular acceptances is performed in the case of non-collinear parametric amplification. Numerical and analytical calculations, as well as measurements, are described for the uniaxial BBO crystal and the biaxial LBO crystal.
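
    For readers unfamiliar with the geometry, the non-collinear phase-matching condition in its standard textbook form (not a result specific to this paper) is:

```latex
% Energy conservation and non-collinear momentum conservation for optical
% parametric amplification; \alpha is the internal signal-idler angle.
\begin{aligned}
  \omega_p &= \omega_s + \omega_i, \\
  \vec{k}_p &= \vec{k}_s + \vec{k}_i
  \;\;\Longrightarrow\;\;
  k_p^2 = k_s^2 + k_i^2 + 2\,k_s k_i \cos\alpha .
\end{aligned}
```

    The angular acceptance then follows from how quickly the phase mismatch grows as the beam angles depart from this condition.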

  12. Reproducible production of a PEGylated dual-acting peptide for diabetes.

    PubMed

    Tom, Irene; Lee, Vivian; Dumas, Michael; Madanat, Melanie; Ouyang, Jun; Severs, Joanne; Andersen, John; Buxton, Joanne M; Whelan, James P; Pan, Clark Q

    2007-01-01

    A PEGylated glucagon-like peptide-1 (GLP-1) agonist and glucagon antagonist hybrid peptide was engineered as a potential treatment for type 2 diabetes. To support preclinical development of this PEGylated dual-acting peptide for diabetes (DAPD), we developed a reproducible method for PEGylation, purification, and analysis. Optimal conditions for site-specific PEGylation with 22 and 43 kDa maleimide-polyethylene glycol (maleimide-PEG) polymers were identified by evaluating pH, reaction time, and reactant molar ratio parameters. A 3-step purification process was developed and successfully implemented to purify PEGylated DAPD and remove excess uncoupled PEG and free peptide. Five lots of 43 kDa PEGylated DAPD with starting peptide amounts of 100 mg were produced with overall yields of 53% to 71%. Analytical characterization by N-terminal sequencing, amino acid analysis, matrix-assisted laser desorption/ionization mass spectrometry, and GLP-1 receptor activation assay confirmed site-specific attachment of PEG at the engineered cysteine residue, expected molecular weight, correct amino acid sequence and composition, and consistent functional activity. Purity and safety analysis by sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE), analytical ion-exchange chromatography, reversed-phase high-performance liquid chromatography, and limulus amebocyte lysate test showed that the final products contained <1% free peptide, <5% uncoupled PEG, and <0.2 endotoxin units per milligram of peptide. These results demonstrate that the PEGylation and purification process we developed was consistent and effective in producing PEGylated DAPD preclinical materials at the 100 mg (peptide weight basis) or 1.2 g (drug substance weight basis) scale. PMID:17907763

  13. Visual Analytics: How Much Visualization and How Much Analytics?

    SciTech Connect

    Keim, Daniel; Mansmann, Florian; Thomas, James J.

    2009-12-16

    The term Visual Analytics has been around for almost five years by now, but still there are on-going discussions about what it actually is and in particular what is new about it. The core of our view on Visual Analytics is the new enabling and accessible analytic reasoning interactions supported by the combination of automated and visual analytics. In this paper, we outline the scope of Visual Analytics using two problem and three methodological classes in order to work out the need for and purpose of Visual Analytics. Thereby, the respective methods are explained plus examples of analytic reasoning interaction leading to a glimpse into the future of how Visual Analytics methods will enable us to go beyond what is possible when separately using the two methods.

  14. A fast object-oriented Matlab implementation of the Reproducing Kernel Particle Method

    NASA Astrophysics Data System (ADS)

    Barbieri, Ettore; Meo, Michele

    2012-05-01

    Novel numerical methods, known as Meshless Methods or Meshfree Methods and, in a wider perspective, Partition of Unity Methods, promise to overcome most of the disadvantages of the traditional finite element techniques. The absence of a mesh makes meshfree methods very attractive for problems involving large deformations, moving boundaries and crack propagation. However, meshfree methods still have significant limitations that prevent their acceptance among researchers and engineers, namely the computational costs. This paper presents an in-depth analysis of computational techniques to speed up the computation of the shape functions in the Reproducing Kernel Particle Method and Moving Least Squares, with particular focus on their bottlenecks, such as the neighbour search, the inversion of the moment matrix and the assembly of the stiffness matrix. The paper presents numerous computational solutions aimed at a considerable reduction of the computational times: the use of kd-trees for the neighbour search, sparse indexing of the nodes-points connectivity and, most importantly, the explicit and vectorized inversion of the moment matrix without using loops and numerical routines.
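
    The neighbour search mentioned above is the step most easily illustrated outside MATLAB. The sketch below shows the same kd-tree idea using scipy's cKDTree; the uniform support radius and point counts are assumptions, and this is not the authors' code.

```python
# Sketch of the kd-tree neighbour search used to find, for each evaluation
# point, the particles whose kernel supports cover it. The paper's code is
# MATLAB; scipy's cKDTree is used here purely as an illustration.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(6)
nodes = rng.random((5000, 2))        # meshfree particle positions in 2-D
support_radius = 0.05                # kernel support size (assumed uniform)

tree = cKDTree(nodes)
eval_points = rng.random((1000, 2))  # quadrature / evaluation points

# For every evaluation point, indices of all nodes within the support radius.
neighbours = tree.query_ball_point(eval_points, r=support_radius)
counts = [len(idx) for idx in neighbours]
print(f"average neighbours per evaluation point: {np.mean(counts):.1f}")
```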

  15. Model for a reproducible curriculum infrastructure to provide international nurse anesthesia continuing education.

    PubMed

    Collins, Shawn Bryant

    2011-12-01

    There are no set standards for nurse anesthesia education in developing countries, yet one of the keys to the standards in global professional practice is competency assurance for individuals. Nurse anesthetists in developing countries have difficulty obtaining educational materials. These difficulties include, but are not limited to, financial constraints, lack of anesthesia textbooks, and distance from educational sites. There is increasing evidence that the application of knowledge in developing countries is failing. One reason is that many anesthetists in developing countries are trained for considerably less than acceptable time periods and are often supervised by poorly trained practitioners, who then pass on less-than-desirable practice skills, thus exacerbating difficulties. Sustainability of development can come only through anesthetists who are both well trained and able to pass on their training to others. The international nurse anesthesia continuing education project was developed in response to the difficulty that nurse anesthetists in developing countries face in accessing continuing education. The purpose of this project was to develop a nonprofit, volunteer-based model for providing nurse anesthesia continuing education that can be reproduced and used in any developing country.

  16. Improving the Reproducibility of the Radial Argon Concentration in Beryllium Shells

    SciTech Connect

    Youngblood, K. P.; Alford, C.; Bhandarkar, S.; Hayes, J.; Moreno, K. A.; Nikroo, A.; Xu, H.

    2011-01-01

    We used sputter coating of beryllium on spherical mandrels at Lawrence Livermore National Laboratory and at General Atomics to produce graded, copper doped beryllium shells. While these coatings have consistent microstructure and acceptable void content, different coaters produced different results with respect to argon implantation. Each individual system met the requirements for argon implantation, but the deviation from one system to another and from run to run exceeded the variability requirements as specified by the National Ignition Facility target design requirements. We redesigned the fixturing within one system to improve reproducibility. Furthermore, we reconfigured the coaters so that the vertical and lateral alignments of the shells under the gun varied <1 mm between systems. After this process, the systems were able to produce beryllium capsules with radial argon profiles that met specifications and were consistent from run to run and from system to system. During this process we gained insight into the beryllium coating process. Finally, the radial argon variation was shown to be dependent on sputter target thickness. We also found that the argon content in the shells was extremely dependent on the position of the shells with respect to the gun.

  17. Comparability and reproducibility of adult male anogenital distance measurements for two different methods.

    PubMed

    Mendiola, J; Oñate-Celdrán, J; Samper-Mateo, P; Arense-Gonzalo, J J; Torres-Roca, M; Sánchez-Rodríguez, C; García-Escudero, D; Fontana-Compiano, L O; Eisenberg, M L; Swan, S H; Torres-Cantero, A M

    2016-07-01

    The distance from the genitals to the anus, the anogenital distance, reflects androgen concentration during prenatal development in mammals. The use of anogenital distance in human studies is still very limited, and the quality and consistency of measurements is an important methodological issue. The aim of this study was to assess the feasibility and reproducibility of adult male anogenital distance measurements by two different methods. All men were attending an outpatient clinic at a university hospital, underwent an andrological examination, and completed a brief questionnaire. Two variants of anogenital distance [from the anus to the posterior base of the scrotum (AGDAS) and to the cephalad insertion of the penis (AGDAP)] were assessed by two methods (lithotomy or frog-legged position) in 70 men. Within and between coefficients of variation, intra-class correlation coefficients, two-way repeated-measures analysis of variance, and scatter and Bland-Altman plots were calculated. The two methods produced similar values for AGDAP but different estimates for AGDAS. Nonetheless, the overall agreement (ICC ≥ 0.80) was acceptable for both measures. Therefore, both methods are internally consistent and adequate for epidemiological studies, and may be used depending on the available medical resources, clinical setting, and populations. PMID:27153294

  18. Reproducibility of Dynamic MR Imaging Pelvic Measurements: A Multi-institutional Study

    PubMed Central

    Lockhart, Mark E.; Fielding, Julia R.; Richter, Holly E.; Brubaker, Linda; Salomon, Caryl G.; Ye, Wen; Hakim, Christiane M.; Wai, Clifford Y.; Stolpen, Alan H.; Weber, Anne M.

    2008-01-01

    Purpose: To assess the reproducibility of bone and soft-tissue pelvimetry measurements obtained from dynamic magnetic resonance (MR) imaging studies in primiparous women across multiple centers. Materials and Methods: All subjects prospectively gave consent for participation in this institutional review board–approved, HIPAA-compliant study. At six clinical sites, standardized dynamic pelvic 1.5-T multiplanar T2-weighted MR imaging was performed in three groups of primiparous women at 6–12 months after birth: Group 1, vaginal delivery with anal sphincter tear (n = 93); group 2, vaginal delivery without anal sphincter tear (n = 79); and group 3, cesarean delivery without labor (n = 26). After standardized central training, blinded readers at separate clinical sites and a blinded expert central reader measured nine bone and 10 soft-tissue pelvimetry parameters. Subsequently, three readers underwent additional standardized training, and reread 20 MR imaging studies. Measurement variability was assessed by using intraclass correlation for agreement between the clinical site and central readers. Acceptable agreement was defined as an intraclass correlation coefficient (ICC) of at least 0.7. Results: There was acceptable agreement (ICC range, 0.71–0.93) for eight of 19 MR imaging parameters at initial readings of 198 subjects. The remaining parameters had an ICC range of 0.13–0.66. Additional training reduced measurement variability: Twelve of 19 parameters had acceptable agreement (ICC range, 0.70–0.92). Correlations were greater for bone (ICC, ≥0.70 in five [initial readings] and eight of nine [rereadings] variables) than for soft-tissue measurements (ICC, ≥0.70 in three [initial readings] of 10 and four [rereadings] of 10 readings, respectively). Conclusion: Despite standardized central training, there is high variability of pelvic MR imaging measurements among readers, particularly for soft-tissue structures. Although slightly improved with additional

  19. ANALYTIC MODELING OF THE MORETON WAVE KINEMATICS

    SciTech Connect

    Temmer, M.; Veronig, A. M.

    2009-09-10

    The issue of whether Moreton waves are flare-ignited or coronal mass ejection (CME)-driven, or a combination of both, is still a matter of debate. We develop an analytical model describing the evolution of a large-amplitude coronal wave emitted by the expansion of a circular source surface in order to mimic the evolution of a Moreton wave. The model results are confronted with observations of a strong Moreton wave observed in association with the X3.8/3B flare/CME event from 2005 January 17. Using different input parameters for the expansion of the source region, either derived from the real CME observations (assuming that the upward moving CME drives the wave) or synthetically generated scenarios (expanding flare region, lateral expansion of the CME flanks), we calculate the kinematics of the associated Moreton wave signature. We then determine the model input parameters that best fit the observed Moreton wave kinematics. Using the measured kinematics of the upward moving CME as the model input, we are not able to reproduce the observed Moreton wave kinematics. The observations of the Moreton wave can be reproduced only by applying a strong and impulsive acceleration for the source region expansion acting in a piston mechanism scenario. Based on these results we propose that the expansion of the flaring region or the lateral expansion of the CME flanks is more likely the driver of the Moreton wave than the upward moving CME front.

  20. Treatment acceptability among mexican american parents.

    PubMed

    Borrego, Joaquin; Ibanez, Elizabeth S; Spendlove, Stuart J; Pemberton, Joy R

    2007-09-01

    There is a void in the literature with regard to Hispanic parents' views about common interventions for children with behavior problems. The purpose of this study was to examine the treatment acceptability of child management techniques in a Mexican American sample. Parents' acculturation was also examined to determine if it would account for differences in treatment acceptability. Mexican American parents found response cost, a punishment-based technique, more acceptable than positive reinforcement-based techniques (e.g., differential attention). Results suggest that Mexican American parents' acculturation has little impact on acceptability of child management interventions. No association was found between mothers' acculturation and treatment acceptability. However, more acculturated Mexican American fathers viewed token economy as more acceptable than less acculturated fathers. Results are discussed in the context of clinical work and research with Mexican Americans.

  1. Developmental pesticide exposure reproduces features of attention deficit hyperactivity disorder

    PubMed Central

    Richardson, Jason R.; Taylor, Michele M.; Shalat, Stuart L.; Guillot, Thomas S.; Caudle, W. Michael; Hossain, Muhammad M.; Mathews, Tiffany A.; Jones, Sara R.; Cory-Slechta, Deborah A.; Miller, Gary W.

    2015-01-01

    Attention-deficit hyperactivity disorder (ADHD) is estimated to affect 8–12% of school-age children worldwide. ADHD is a complex disorder with significant genetic contributions. However, no single gene has been linked to a significant percentage of cases, suggesting that environmental factors may contribute to ADHD. Here, we used behavioral, molecular, and neurochemical techniques to characterize the effects of developmental exposure to the pyrethroid pesticide deltamethrin. We also used epidemiologic methods to determine whether there is an association between pyrethroid exposure and diagnosis of ADHD. Mice exposed to the pyrethroid pesticide deltamethrin during development exhibit several features reminiscent of ADHD, including elevated dopamine transporter (DAT) levels, hyperactivity, working memory and attention deficits, and impulsive-like behavior. Increased DAT and D1 dopamine receptor levels appear to be responsible for the behavioral deficits. Epidemiologic data reveal that children aged 6–15 with detectable levels of pyrethroid metabolites in their urine were more than twice as likely to be diagnosed with ADHD. Our epidemiologic finding, combined with the recapitulation of ADHD behavior in pesticide-treated mice, provides a mechanistic basis to suggest that developmental pyrethroid exposure is a risk factor for ADHD.—Richardson, J. R., Taylor, M. M., Shalat, S. L., Guillot III, T. S., Caudle, W. M., Hossain, M. M., Mathews, T. A., Jones, S. R., Cory-Slechta, D. A., Miller, G. W. Developmental pesticide exposure reproduces features of attention deficit hyperactivity disorder. PMID:25630971

  2. How to Obtain Reproducible Results for Lithium Sulfur Batteries

    SciTech Connect

    Zheng, Jianming; Lu, Dongping; Gu, Meng; Wang, Chong M.; Zhang, Jiguang; Liu, Jun; Xiao, Jie

    2013-01-01

    The basic requirements for obtaining reliable Li-S battery data are discussed in this work. Unlike in Li-ion batteries, an electrolyte-rich environment significantly affects the cycling stability of Li-S batteries prepared and tested under the same conditions. The reason has been assigned to the different concentrations of polysulfide-containing electrolytes in the cells, which have profound influences on both the sulfur cathode and the lithium anode. At an optimized S/E ratio of 50 g L-1, a good balance among electrolyte viscosity, wetting ability, diffusion rate of dissolved polysulfide, and nucleation/growth of short-chain Li2S/Li2S2 has been established, along with largely reduced contamination on the lithium anode side. Accordingly, good cyclability, high reversible capacity and Coulombic efficiency are achieved in a Li-S cell with controlled S/E ratio without any additive. Other factors, such as sulfur content in the composite and sulfur loading on the electrode, also need careful consideration in the Li-S system in order to generate reproducible results and gauge the various methods used to improve Li-S battery technology.

  3. Reproducibility and reliability of fetal cardiac time intervals using magnetocardiography.

    PubMed

    van Leeuwen, P; Lange, S; Klein, A; Geue, D; Zhang, Y; Krause, H J; Grönemeyer, D

    2004-04-01

    We investigated several factors which may affect the accuracy of fetal cardiac time intervals (CTI) determined in magnetocardiographic (MCG) recordings: observer differences, the number of available recording sites and the type of sensor used in acquisition. In 253 fetal MCG recordings, acquired using different biomagnetometer devices between the 15th and 42nd weeks of gestation, P-wave, QRS complex and T-wave onsets and ends were identified in signal averaged data sets independently by different observers. Using a defined procedure for setting signal events, interobserver reliability was high. Increasing the number of registration sites led to more accurate identification of the events. The differences in wave morphology between magnetometer and gradiometer configurations led to deviations in timing whereas the differences between low and high temperature devices seemed to be primarily due to noise. Signal-to-noise ratio played an important overall role in the accurate determination of CTI and changes in signal amplitude associated with fetal maturation may largely explain the effects of gestational age on reproducibility. As fetal CTI may be of value in the identification of pathologies such as intrauterine growth retardation or fetal cardiac hypertrophy, their reliable estimation will be enhanced by strategies which take these factors into account.

  4. A silicon retina that reproduces signals in the optic nerve.

    PubMed

    Zaghloul, Kareem A; Boahen, Kwabena

    2006-12-01

    Prosthetic devices may someday be used to treat lesions of the central nervous system. Similar to neural circuits, these prosthetic devices should adapt their properties over time, independent of external control. Here we describe an artificial retina, constructed in silicon using single-transistor synaptic primitives, with two forms of locally controlled adaptation: luminance adaptation and contrast gain control. Both forms of adaptation rely on local modulation of synaptic strength, thus meeting the criteria of internal control. Our device is the first to reproduce the responses of the four major ganglion cell types that drive visual cortex, producing 3600 spiking outputs in total. We demonstrate how the responses of our device's ganglion cells compare to those measured from the mammalian retina. Replicating the retina's synaptic organization in our chip made it possible to perform these computations using a hundred times less energy than a microprocessor, and to match the mammalian retina in size and weight. With this level of efficiency and autonomy, it is now possible to develop fully implantable intraocular prostheses.

  5. High Reproducibility of ELISPOT Counts from Nine Different Laboratories.

    PubMed

    Sundararaman, Srividya; Karulin, Alexey Y; Ansari, Tameem; BenHamouda, Nadine; Gottwein, Judith; Laxmanan, Sreenivas; Levine, Steven M; Loffredo, John T; McArdle, Stephanie; Neudoerfl, Christine; Roen, Diana; Silina, Karina; Welch, Mackenzie; Lehmann, Paul V

    2015-01-01

    The primary goal of immune monitoring with ELISPOT is to measure the number of T cells, specific for any antigen, accurately and reproducibly between different laboratories. In ELISPOT assays, antigen-specific T cells secrete cytokines, forming spots of different sizes on a membrane with variable background intensities. Due to the subjective nature of judging maximal and minimal spot sizes, different investigators come up with different numbers. This study aims to determine whether statistics-based, automated size-gating can harmonize the number of spot counts calculated between different laboratories. We plated PBMC at four different concentrations, 24 replicates each, in an IFN-γ ELISPOT assay with HCMV pp65 antigen. The ELISPOT plate and an image file of the plate were counted in nine different laboratories using ImmunoSpot® Analyzers by (A) Basic Count™, relying on subjective counting parameters set by the respective investigators, and (B) SmartCount™, an automated counting protocol of the ImmunoSpot® Software that uses statistics-based spot size auto-gating with spot intensity auto-thresholding. The average coefficient of variation (CV) for the mean values between independent laboratories was 26.7% when counting with Basic Count™, and 6.7% when counting with SmartCount™. Our data indicate that SmartCount™ allows harmonization of ELISPOT counting results between different laboratories and investigators. PMID:25585297
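
    As a minimal illustration of the headline statistic, the sketch below computes a between-laboratory coefficient of variation of mean spot counts from synthetic replicate wells; the counts and the lab-bias model are assumptions, not study data.

```python
# Sketch: between-laboratory coefficient of variation (CV) of mean spot counts
# for one PBMC concentration. The counts below are synthetic illustrations.
import numpy as np

rng = np.random.default_rng(7)
n_labs, n_replicates = 9, 24
true_mean = 120  # spots per well at this PBMC input

# Each lab's 24 replicate wells, with a lab-specific counting offset.
lab_bias = rng.normal(0, 8, size=n_labs)  # differs by counting method
counts = rng.poisson(true_mean, size=(n_labs, n_replicates)) + lab_bias[:, None]

lab_means = counts.mean(axis=1)
cv_between_labs = lab_means.std(ddof=1) / lab_means.mean() * 100
print(f"between-laboratory CV of mean counts: {cv_between_labs:.1f}%")
```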

  6. Reproducing Natural Spider Silks’ Copolymer Behavior in Synthetic Silk Mimics

    PubMed Central

    An, Bo; Jenkins, Janelle E.; Sampath, Sujatha; Holland, Gregory P.; Hinman, Mike; Yarger, Jeffery L.; Lewis, Randolph

    2012-01-01

    Dragline silk from orb-weaving spiders is a copolymer of two large proteins, major ampullate spidroin 1 (MaSp1) and 2 (MaSp2). The ratio of these proteins is known to have a large variation across different species of orb-weaving spiders. NMR results from gland material of two different species of spiders, N. clavipes and A. aurantia, indicates that MaSp1 proteins are more easily formed into β-sheet nanostructures, while MaSp2 proteins form random coil and helical structures. To test if this behavior of natural silk proteins could be reproduced by recombinantly produced spider silk mimic protein, recombinant MaSp1/MaSp2 mixed fibers as well as chimeric silk fibers from MaSp1 and MaSp2 sequences in a single protein were produced based on the variable ratio and conserved motifs of MaSp1 and MaSp2 in native silk fiber. Mechanical properties, solid-state NMR, and XRD results of tested synthetic fibers indicate the differing roles of MaSp1 and MaSp2 in the fiber and verify the importance of postspin stretching treatment in helping the fiber to form the proper spatial structure. PMID:23110450

  7. Repeatability and reproducibility of aquatic testing with zinc dithiophosphate

    SciTech Connect

    Hooter, D.L.; Hoke, D.I.; Kraska, R.C.; Wojewodka, R.A.

    1994-12-31

    This testing program was designed to characterize the repeatability and reproducibility of aquatic screening studies with a water-insoluble chemical substance. Zinc dithiophosphate was selected for its limited water solubility and moderate aquatic toxicity. Acute tests were conducted using fathead minnows and Daphnia magna, according to guidelines developed to minimize random sources of non-repeatability. Zinc dithiophosphate was exposed to the organisms in static tests using an oil-water dispersion method for the fathead minnows, and a water-accommodated-fraction method for the Daphnia magna. Testing was conducted in moderately hard water with pre-determined nominal concentrations of 0.1, 1.0, 10.0, 100.0, and 1000.0 ppm or ppm WAF. Twenty-four studies were contracted among three separate commercial contract laboratories. The program results demonstrate the diverse range of intralaboratory and interlaboratory variability based on the organism type, and emphasize the need for further study and caution in the design and implementation of aquatic testing for insoluble materials.

  8. Diet rapidly and reproducibly alters the human gut microbiome.

    PubMed

    David, Lawrence A; Maurice, Corinne F; Carmody, Rachel N; Gootenberg, David B; Button, Julie E; Wolfe, Benjamin E; Ling, Alisha V; Devlin, A Sloan; Varma, Yug; Fischbach, Michael A; Biddinger, Sudha B; Dutton, Rachel J; Turnbaugh, Peter J

    2014-01-23

    Long-term dietary intake influences the structure and activity of the trillions of microorganisms residing in the human gut, but it remains unclear how rapidly and reproducibly the human gut microbiome responds to short-term macronutrient change. Here we show that the short-term consumption of diets composed entirely of animal or plant products alters microbial community structure and overwhelms inter-individual differences in microbial gene expression. The animal-based diet increased the abundance of bile-tolerant microorganisms (Alistipes, Bilophila and Bacteroides) and decreased the levels of Firmicutes that metabolize dietary plant polysaccharides (Roseburia, Eubacterium rectale and Ruminococcus bromii). Microbial activity mirrored differences between herbivorous and carnivorous mammals, reflecting trade-offs between carbohydrate and protein fermentation. Foodborne microbes from both diets transiently colonized the gut, including bacteria, fungi and even viruses. Finally, increases in the abundance and activity of Bilophila wadsworthia on the animal-based diet support a link between dietary fat, bile acids and the outgrowth of microorganisms capable of triggering inflammatory bowel disease. In concert, these results demonstrate that the gut microbiome can rapidly respond to altered diet, potentially facilitating the diversity of human dietary lifestyles.

  9. Virtual Raters for Reproducible and Objective Assessments in Radiology.

    PubMed

    Kleesiek, Jens; Petersen, Jens; Döring, Markus; Maier-Hein, Klaus; Köthe, Ullrich; Wick, Wolfgang; Hamprecht, Fred A; Bendszus, Martin; Biller, Armin

    2016-04-27

    Volumetric measurements in radiologic images are important for monitoring tumor growth and treatment response. To make these measurements more reproducible and objective we introduce the concept of virtual raters (VRs). A virtual rater is obtained by combining the knowledge of machine-learning algorithms trained with past annotations of multiple human raters with the instantaneous rating of one human expert; it is thus virtually guided by several experts. To evaluate the approach we perform experiments with multi-channel magnetic resonance imaging (MRI) data sets. In addition to gross tumor volume (GTV), we also investigate subcategories such as edema, contrast-enhancing and non-enhancing tumor. The first data set consists of N = 71 longitudinal follow-up scans of 15 patients suffering from glioblastoma (GB). The second data set comprises N = 30 scans of low- and high-grade gliomas. For comparison we computed the Pearson correlation, intra-class correlation coefficient (ICC) and Dice score. Virtual raters consistently improve inter- and intra-rater agreement. Comparing the 2D Response Assessment in Neuro-Oncology (RANO) measurements to the volumetric measurements of the virtual raters yields a deviating rating in one-third of the cases. Hence, we believe that our approach will have an impact on the evaluation of clinical studies as well as on routine imaging diagnostics.
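
    The agreement metrics named above (Pearson correlation, ICC, Dice score) are standard; the Dice overlap between two binary segmentation masks in particular is a one-line formula. A minimal sketch, assuming hypothetical NumPy masks standing in for a human rater and a virtual rater rather than the authors' code or data:

      import numpy as np

      def dice_score(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
          """Dice overlap between two binary segmentation masks."""
          a = mask_a.astype(bool)
          b = mask_b.astype(bool)
          denom = a.sum() + b.sum()
          if denom == 0:
              return 1.0  # both masks empty: treat as perfect agreement
          return 2.0 * np.logical_and(a, b).sum() / denom

      # Illustrative 2D masks standing in for rater and virtual-rater segmentations.
      rater = np.zeros((64, 64), dtype=bool)
      rater[20:40, 20:40] = True
      virtual = np.zeros((64, 64), dtype=bool)
      virtual[22:42, 22:42] = True
      print(f"Dice = {dice_score(rater, virtual):.3f}")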

  10. Accurate measurements of dynamics and reproducibility in small genetic networks

    PubMed Central

    Dubuis, Julien O; Samanta, Reba; Gregor, Thomas

    2013-01-01

    Quantification of gene expression has become a central tool for understanding genetic networks. In many systems, the only viable way to measure protein levels is by immunofluorescence, which is notorious for its limited accuracy. Using the early Drosophila embryo as an example, we show that careful identification and control of experimental error allows for highly accurate gene expression measurements. We generated antibodies in different host species, allowing for simultaneous staining of four Drosophila gap genes in individual embryos. Careful error analysis of hundreds of expression profiles reveals that less than ∼20% of the observed embryo-to-embryo fluctuations stem from experimental error. These measurements make it possible to extract not only very accurate mean gene expression profiles but also their naturally occurring fluctuations of biological origin and corresponding cross-correlations. We use this analysis to extract gap gene profile dynamics with ∼1 min accuracy. The combination of these new measurements and analysis techniques reveals a twofold increase in profile reproducibility owing to a collective network dynamics that relays positional accuracy from the maternal gradients to the pair-rule genes. PMID:23340845

  11. Reproducibility of tactile assessments for children with unilateral cerebral palsy.

    PubMed

    Auld, Megan Louise; Ware, Robert S; Boyd, Roslyn Nancy; Moseley, G Lorimer; Johnston, Leanne Marie

    2012-05-01

    A systematic review identified tactile assessments used in children with cerebral palsy (CP), but their reproducibility is unknown. Sixteen children with unilateral CP and 31 typically developing children (TDC) were assessed 2-4 weeks apart. Test-retest percent agreements within one point for children with unilateral CP (and TDC) were Semmes-Weinstein monofilaments: 75% (90%); single-point localization: 69% (97%); static two-point discrimination: 93% (97%); and moving two-point discrimination: 87% (97%). Test-retest reliability for registration and unilateral spatial tactile perception tests was high in children with CP (intraclass correlation coefficient [ICC] = 0.79-0.96). Two tests, double simultaneous and tactile texture perception, demonstrated a learning effect for children with CP. Stereognosis had a ceiling effect for TDC (ICC = 0) and showed variability for children with CP (% exact agreement = 47%-50%). The Semmes-Weinstein monofilaments, single-point localization, and both static and moving two-point discrimination are recommended for use in practice and research. Although recommended to provide a comprehensive assessment, the measures of double simultaneous, stereognosis, and tactile texture perception may not be responsive to change over time in children with unilateral CP.
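
    The test-retest statistics quoted above (percent agreement within one point, ICC) can be computed directly from paired scores. A minimal sketch of within-one-point agreement and a one-way random-effects ICC(1,1), assuming hypothetical paired test/retest scores rather than the study's data:

      import numpy as np

      def pct_agreement_within(test, retest, tolerance=1):
          """Percentage of paired scores whose difference is within `tolerance` points."""
          test, retest = np.asarray(test), np.asarray(retest)
          return 100.0 * np.mean(np.abs(test - retest) <= tolerance)

      def icc_oneway(test, retest):
          """One-way random-effects ICC(1,1) for two measurements per subject."""
          scores = np.column_stack([test, retest]).astype(float)
          n, k = scores.shape
          subject_means = scores.mean(axis=1)
          grand_mean = scores.mean()
          msb = k * np.sum((subject_means - grand_mean) ** 2) / (n - 1)   # between-subject mean square
          msw = np.sum((scores - subject_means[:, None]) ** 2) / (n * (k - 1))  # within-subject mean square
          return (msb - msw) / (msb + (k - 1) * msw)

      # Hypothetical monofilament scores at two sessions for 10 children (illustrative only).
      test   = [3, 4, 2, 5, 3, 4, 2, 3, 5, 4]
      retest = [3, 5, 2, 4, 3, 4, 3, 3, 5, 4]
      print(f"agreement within 1 point: {pct_agreement_within(test, retest):.0f}%")
      print(f"ICC(1,1): {icc_oneway(test, retest):.2f}")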

  12. Magnetofection: A Reproducible Method for Gene Delivery to Melanoma Cells

    PubMed Central

    Prosen, Lara; Prijic, Sara; Music, Branka; Lavrencak, Jaka; Cemazar, Maja; Sersa, Gregor

    2013-01-01

    Magnetofection is a nanoparticle-mediated approach for transfection of cells, tissues, and tumors. Specific interest lies in using superparamagnetic iron oxide nanoparticles (SPIONs) as a delivery system for therapeutic genes. Magnetofection has already been described in some proof-of-principle studies; however, fine tuning of the synthesis of SPIONs is necessary for its broader application. Physicochemical properties of SPIONs, synthesized by co-precipitation in an alkaline aqueous medium, were tested after varying different parameters of the synthesis procedure. The storage time of the iron(II) sulfate salt, the type of purified water, and the synthesis temperature did not affect the physicochemical properties of the SPIONs. Likewise, varying the parameters of the synthesis procedure did not influence magnetofection efficacy. However, for pronounced expression of the gene encoded by the plasmid DNA it was crucial to functionalize poly(acrylic) acid-stabilized SPIONs (SPIONs-PAA) with polyethyleneimine (PEI) without adjusting the inherently alkaline pH of its aqueous solution to physiological pH. In conclusion, the co-precipitation of iron(II) and iron(III) sulfate salts with subsequent PAA stabilization, PEI functionalization, and plasmid DNA binding is a robust method resulting in reproducible and efficient magnetofection. To achieve high gene expression, however, the pH of the PEI solution used for SPIONs-PAA functionalization is critical and should remain in the alkaline range. PMID:23862136

  13. Reproducibility of Vibrionaceae population structure in coastal bacterioplankton

    PubMed Central

    Szabo, Gitta; Preheim, Sarah P; Kauffman, Kathryn M; David, Lawrence A; Shapiro, Jesse; Alm, Eric J; Polz, Martin F

    2013-01-01

    How reproducibly microbial populations assemble in the wild remains poorly understood. Here, we assess evidence for ecological specialization and for the predictability of fine-scale population structure and habitat association in coastal ocean Vibrionaceae across years. We compare Vibrionaceae lifestyles in the bacterioplankton (combinations of free-living, particle, or zooplankton associations) measured with the same sampling scheme in 2006 and 2009 to assess whether the same groups show the same environmental association year after year. This reveals complex dynamics, with populations falling primarily into two categories: (i) nearly equally represented in each of the two samplings and (ii) highly skewed, often to an extent that they appear exclusive to one or the other sampling time. Importantly, populations recovered at the same abundance in both samplings occupied highly similar habitats, suggesting predictable and robust environmental association, while skewed abundances of some populations may be triggered by shifts in ecological conditions. The latter is supported by differences in the composition of large eukaryotic plankton between years, with samples in 2006 being dominated by copepods and those in 2009 by diatoms. Overall, the comparison supports highly predictable population-habitat linkage but highlights the fact that complex, and often unmeasured, environmental dynamics in habitat occurrence may have strong effects on population dynamics. PMID:23178668

  14. Reproducibility of cold provocation in patients with Raynaud's phenomenon.

    PubMed

    Wigley, F M; Malamet, R; Wise, R A

    1987-08-01

    Twenty-five patients with Raynaud's phenomenon had serial cold challenges during a double-blinded drug trial. The data were analyzed to determine the reproducibility of cold provocation in inducing critical closure of the digital artery in patients with Raynaud's phenomenon. Finger systolic pressure (FSP) was measured after local digital cooling using a digital strain gauge placed around the distal phalanx. Nineteen of 25 patients completed the study. The proportion of patients in whom a Raynaud's attack could be induced decreased with each successive cold challenge, from 74% at the initial challenge to 42% at the third challenge. A lower temperature was required to induce a Raynaud's attack at the last challenge (10.6 +/- 0.6 degrees C) compared with the first cold challenge (13.2 +/- 1.0 degrees C). Our data demonstrate adaptation to a laboratory cold challenge through the winter months in patients with Raynaud's phenomenon and show that this adaptation is an important factor in objectively assessing drug efficacy in the treatment of Raynaud's phenomenon.

  15. Reproducing natural spider silks' copolymer behavior in synthetic silk mimics.

    PubMed

    An, Bo; Jenkins, Janelle E; Sampath, Sujatha; Holland, Gregory P; Hinman, Mike; Yarger, Jeffery L; Lewis, Randolph

    2012-12-10

    Dragline silk from orb-weaving spiders is a copolymer of two large proteins, major ampullate spidroin 1 (MaSp1) and 2 (MaSp2). The ratio of these proteins is known to vary widely across different species of orb-weaving spiders. NMR results from gland material of two different species of spiders, N. clavipes and A. aurantia, indicate that MaSp1 proteins are more easily formed into β-sheet nanostructures, while MaSp2 proteins form random coil and helical structures. To test whether this behavior of natural silk proteins could be reproduced by recombinantly produced spider silk mimic proteins, recombinant MaSp1/MaSp2 mixed fibers as well as chimeric silk fibers combining MaSp1 and MaSp2 sequences in a single protein were produced, based on the variable ratio and conserved motifs of MaSp1 and MaSp2 in native silk fiber. Mechanical properties, solid-state NMR, and XRD results for the tested synthetic fibers indicate the differing roles of MaSp1 and MaSp2 in the fiber and verify the importance of postspin stretching treatment in helping the fiber form the proper spatial structure. PMID:23110450

  16. Reproducing Natural Spider Silks' Copolymer Behavior in Synthetic Silk Mimics

    SciTech Connect

    An, Bo; Jenkins, Janelle E; Sampath, Sujatha; Holland, Gregory P; Hinman, Mike; Yarger, Jeffery L; Lewis, Randolph

    2012-10-30

    Dragline silk from orb-weaving spiders is a copolymer of two large proteins, major ampullate spidroin 1 (MaSp1) and 2 (MaSp2). The ratio of these proteins is known to vary widely across different species of orb-weaving spiders. NMR results from gland material of two different species of spiders, N. clavipes and A. aurantia, indicate that MaSp1 proteins are more easily formed into β-sheet nanostructures, while MaSp2 proteins form random coil and helical structures. To test whether this behavior of natural silk proteins could be reproduced by recombinantly produced spider silk mimic proteins, recombinant MaSp1/MaSp2 mixed fibers as well as chimeric silk fibers combining MaSp1 and MaSp2 sequences in a single protein were produced, based on the variable ratio and conserved motifs of MaSp1 and MaSp2 in native silk fiber. Mechanical properties, solid-state NMR, and XRD results for the tested synthetic fibers indicate the differing roles of MaSp1 and MaSp2 in the fiber and verify the importance of postspin stretching treatment in helping the fiber form the proper spatial structure.

  17. A silicon retina that reproduces signals in the optic nerve

    NASA Astrophysics Data System (ADS)

    Zaghloul, Kareem A.; Boahen, Kwabena

    2006-12-01

    Prosthetic devices may someday be used to treat lesions of the central nervous system. Similar to neural circuits, these prosthetic devices should adapt their properties over time, independent of external control. Here we describe an artificial retina, constructed in silicon using single-transistor synaptic primitives, with two forms of locally controlled adaptation: luminance adaptation and contrast gain control. Both forms of adaptation rely on local modulation of synaptic strength, thus meeting the criteria of internal control. Our device is the first to reproduce the responses of the four major ganglion cell types that drive visual cortex, producing 3600 spiking outputs in total. We demonstrate how the responses of our device's ganglion cells compare to those measured from the mammalian retina. Replicating the retina's synaptic organization in our chip made it possible to perform these computations using a hundred times less energy than a microprocessor—and to match the mammalian retina in size and weight. With this level of efficiency and autonomy, it is now possible to develop fully implantable intraocular prostheses.

  18. Stochastic simulations of minimal self-reproducing cellular systems.

    PubMed

    Mavelli, Fabio; Ruiz-Mirazo, Kepa

    2007-10-29

    This paper is a theoretical attempt to gain insight into the problem of how self-assembling vesicles (closed bilayer structures) could progressively turn into minimal self-producing and self-reproducing cells, i.e. into interesting candidates for (proto)biological systems. With this aim, we make use of a recently developed object-oriented platform to carry out stochastic simulations of chemical reaction networks that take place in dynamic cellular compartments. We apply this new tool to study the behaviour of different minimal cell models, making realistic assumptions about the physico-chemical processes and conditions involved (e.g. thermodynamic equilibrium/non-equilibrium, variable volume-to-surface relationship, osmotic pressure, solute diffusion across the membrane due to concentration gradients, buffering effect). The new programming platform has been designed to analyse not only how a single protometabolic cell could maintain itself, grow or divide, but also how a collection of these cells could 'evolve' as a result of their mutual interactions in a common environment. PMID:17510021

  19. Virtual Raters for Reproducible and Objective Assessments in Radiology.

    PubMed

    Kleesiek, Jens; Petersen, Jens; Döring, Markus; Maier-Hein, Klaus; Köthe, Ullrich; Wick, Wolfgang; Hamprecht, Fred A; Bendszus, Martin; Biller, Armin

    2016-01-01

    Volumetric measurements in radiologic images are important for monitoring tumor growth and treatment response. To make these measurements more reproducible and objective we introduce the concept of virtual raters (VRs). A virtual rater is obtained by combining the knowledge of machine-learning algorithms trained with past annotations of multiple human raters with the instantaneous rating of one human expert; it is thus virtually guided by several experts. To evaluate the approach we perform experiments with multi-channel magnetic resonance imaging (MRI) data sets. In addition to gross tumor volume (GTV), we also investigate subcategories such as edema, contrast-enhancing and non-enhancing tumor. The first data set consists of N = 71 longitudinal follow-up scans of 15 patients suffering from glioblastoma (GB). The second data set comprises N = 30 scans of low- and high-grade gliomas. For comparison we computed the Pearson correlation, intra-class correlation coefficient (ICC) and Dice score. Virtual raters consistently improve inter- and intra-rater agreement. Comparing the 2D Response Assessment in Neuro-Oncology (RANO) measurements to the volumetric measurements of the virtual raters yields a deviating rating in one-third of the cases. Hence, we believe that our approach will have an impact on the evaluation of clinical studies as well as on routine imaging diagnostics. PMID:27118379

  20. Resting Functional Connectivity of Language Networks: Characterization and Reproducibility

    PubMed Central

    Tomasi, Dardo; Volkow, Nora D.

    2011-01-01

    The neural basis of language comprehension and production has been associated with superior temporal (Wernicke's) and inferior frontal (Broca's) cortical areas, respectively. However, recent resting-state functional connectivity (RSFC) and lesion studies implicate a more extended network in language processing. Using a large RSFC dataset from 970 healthy subjects and seed regions in Broca's and Wernicke's areas, we recapitulate this extended network, which includes adjoining prefrontal, temporal and parietal regions as well as the bilateral caudate and the left putamen/globus pallidus and subthalamic nucleus. We also show that the language network has a predominance of short-range functional connectivity (except the posterior Wernicke's area, which exhibited predominantly long-range connectivity), consistent with reliance on local processing. The long-range connectivity was predominantly left-lateralized (except the anterior Wernicke's area, which exhibited rightward lateralization). The language network also exhibited anticorrelated activity with the auditory (only for Wernicke's area) and visual cortices, which suggests integrated sequential activity with regions involved in listening to or reading words. Assessment of the intra-subject reproducibility of this network and its characterization in individuals with language dysfunction are needed to determine its potential as a biomarker for language disorders. PMID:22212597

  1. Optimizing reproducibility evaluation for random amplified polymorphic DNA markers.

    PubMed

    Ramos, J R; Telles, M P C; Diniz-Filho, J A F; Soares, T N; Melo, D B; Oliveira, G

    2008-01-01

    The random amplified polymorphic DNA (RAPD) technique is often criticized because it usually shows low levels of repeatability and can thus generate spurious bands. These problems can be partially overcome by rigid laboratory protocols and by performing repeatability tests. However, because it is expensive and time-consuming to obtain genetic data twice for all individuals, a few randomly chosen individuals are usually selected for a priori repeatability analysis, introducing a potential bias in genetic parameter estimates. We developed a procedure to optimize repeatability analysis based on RAPD data, which was applied to evaluate genetic variability in three local populations of Tibouchina papyrus, an endemic Cerrado plant found in elevated rocky fields in Brazil. We used a simulated annealing procedure to select the smallest number of individuals that together contain all bands, and repeated the analyses only for those bands that were reproduced in these individuals. We compared genetic parameter estimates obtained with the HICKORY and POPGENE software packages on an unreduced data set and on data sets in which bands were eliminated based on the repeatability of individuals selected by simulated annealing or based on three randomly selected individuals. Genetic parameter estimates were very similar when we used the optimization procedure to reduce the number of bands analyzed, but, as expected, selecting only three individuals to evaluate the repeatability of bands produced very different estimates. We conclude that the problems of repeatability attributed to RAPD markers could be due to bias in the selection of loci and primers and not necessarily to the RAPD technique per se. PMID:19065774
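
    The core of the optimization described above is choosing the smallest subset of individuals whose combined banding patterns contain every band scored in the population. The authors used simulated annealing; the sketch below uses a simple greedy set-cover heuristic on a hypothetical 0/1 band matrix just to make the objective concrete (the data, matrix size, and function name are illustrative assumptions, not the authors' implementation).

      import numpy as np

      # Hypothetical 0/1 band matrix: rows = individuals, columns = RAPD bands.
      rng = np.random.default_rng(0)
      bands = rng.integers(0, 2, size=(20, 40))

      def greedy_band_cover(band_matrix: np.ndarray) -> list[int]:
          """Pick individuals until every band present in the sample is covered.

          The original study used simulated annealing for this selection; a greedy
          set-cover heuristic illustrates the same objective in a few lines.
          """
          target = band_matrix.any(axis=0)            # bands present in the population
          covered = np.zeros_like(target, dtype=bool)
          chosen: list[int] = []
          while not np.array_equal(covered, target):
              gains = (band_matrix.astype(bool) & ~covered & target).sum(axis=1)
              best = int(gains.argmax())
              if gains[best] == 0:
                  break
              chosen.append(best)
              covered |= band_matrix[best].astype(bool)
          return chosen

      subset = greedy_band_cover(bands)
      print(f"{len(subset)} individuals cover all {int(bands.any(axis=0).sum())} bands: {subset}")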

  2. Diet rapidly and reproducibly alters the human gut microbiome.

    PubMed

    David, Lawrence A; Maurice, Corinne F; Carmody, Rachel N; Gootenberg, David B; Button, Julie E; Wolfe, Benjamin E; Ling, Alisha V; Devlin, A Sloan; Varma, Yug; Fischbach, Michael A; Biddinger, Sudha B; Dutton, Rachel J; Turnbaugh, Peter J

    2014-01-23

    Long-term dietary intake influences the structure and activity of the trillions of microorganisms residing in the human gut, but it remains unclear how rapidly and reproducibly the human gut microbiome responds to short-term macronutrient change. Here we show that the short-term consumption of diets composed entirely of animal or plant products alters microbial community structure and overwhelms inter-individual differences in microbial gene expression. The animal-based diet increased the abundance of bile-tolerant microorganisms (Alistipes, Bilophila and Bacteroides) and decreased the levels of Firmicutes that metabolize dietary plant polysaccharides (Roseburia, Eubacterium rectale and Ruminococcus bromii). Microbial activity mirrored differences between herbivorous and carnivorous mammals, reflecting trade-offs between carbohydrate and protein fermentation. Foodborne microbes from both diets transiently colonized the gut, including bacteria, fungi and even viruses. Finally, increases in the abundance and activity of Bilophila wadsworthia on the animal-based diet support a link between dietary fat, bile acids and the outgrowth of microorganisms capable of triggering inflammatory bowel disease. In concert, these results demonstrate that the gut microbiome can rapidly respond to altered diet, potentially facilitating the diversity of human dietary lifestyles. PMID:24336217

  3. Reproducibility of Differential Proteomic Technologies in CPTAC Fractionated Xenografts

    PubMed Central

    2015-01-01

    The NCI Clinical Proteomic Tumor Analysis Consortium (CPTAC) employed a pair of reference xenograft proteomes for initial platform validation and ongoing quality control of its data collection for The Cancer Genome Atlas (TCGA) tumors. These two xenografts, representing basal and luminal-B human breast cancer, were fractionated and analyzed on six mass spectrometers in a total of 46 replicates divided between iTRAQ and label-free technologies, spanning a total of 1095 LC–MS/MS experiments. These data represent a unique opportunity to evaluate the stability of proteomic differentiation by mass spectrometry over many months for individual instruments or across instruments running dissimilar workflows. We evaluated iTRAQ reporter ions, label-free spectral counts, and label-free extracted ion chromatograms as strategies for data interpretation (source code is available from http://homepages.uc.edu/~wang2x7/Research.htm). From these assessments, we found that differential genes from a single replicate were confirmed by other replicates on the same instrument from 61 to 93% of the time. When comparing across different instruments and quantitative technologies, using multiple replicates, differential genes were reproduced by other data sets from 67 to 99% of the time. Projecting gene differences onto biological pathways and networks increased the degree of similarity. These overlaps send an encouraging message about the maturity of technologies for proteomic differentiation. PMID:26653538
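
    The replicate-to-replicate confirmation rates quoted above reduce to a simple set overlap between lists of differential genes. A minimal sketch with hypothetical gene lists standing in for the CPTAC results (the gene names and function name are illustrative assumptions):

      def confirmation_rate(reference_hits: set[str], other_hits: set[str]) -> float:
          """Percentage of differential genes from one replicate confirmed by another."""
          if not reference_hits:
              return float("nan")
          return 100.0 * len(reference_hits & other_hits) / len(reference_hits)

      # Hypothetical differential-gene calls from two replicate analyses.
      rep1 = {"TP53", "ERBB2", "ESR1", "MYC", "GATA3"}
      rep2 = {"TP53", "ERBB2", "ESR1", "KRT18"}
      print(f"{confirmation_rate(rep1, rep2):.0f}% of rep1 hits confirmed by rep2")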

  4. Diet rapidly and reproducibly alters the human gut microbiome

    PubMed Central

    David, Lawrence A.; Maurice, Corinne F.; Carmody, Rachel N.; Gootenberg, David B.; Button, Julie E.; Wolfe, Benjamin E.; Ling, Alisha V.; Devlin, A. Sloan; Varma, Yug; Fischbach, Michael A.; Biddinger, Sudha B.; Dutton, Rachel J.; Turnbaugh, Peter J.

    2013-01-01

    Long-term diet influences the structure and activity of the trillions of microorganisms residing in the human gut, but it remains unclear how rapidly and reproducibly the human gut microbiome responds to short-term macronutrient change. Here, we show that the short-term consumption of diets composed entirely of animal or plant products alters microbial community structure and overwhelms inter-individual differences in microbial gene expression. The animal-based diet increased the abundance of bile-tolerant microorganisms (Alistipes, Bilophila, and Bacteroides) and decreased the levels of Firmicutes that metabolize dietary plant polysaccharides (Roseburia, Eubacterium rectale, and Ruminococcus bromii). Microbial activity mirrored differences between herbivorous and carnivorous mammals, reflecting trade-offs between carbohydrate and protein fermentation. Foodborne microbes from both diets transiently colonized the gut, including bacteria, fungi, and even viruses. Finally, increases in the abundance and activity of Bilophila wadsworthia on the animal-based diet support a link between dietary fat, bile acids, and the outgrowth of microorganisms capable of triggering inflammatory bowel disease. In concert, these results demonstrate that the gut microbiome can rapidly respond to altered diet, potentially facilitating the diversity of human dietary lifestyles. PMID:24336217

  5. Periotest values: Its reproducibility, accuracy, and variability with hormonal influence

    PubMed Central

    Chakrapani, Swarna; Goutham, Madireddy; Krishnamohan, Thota; Anuparthy, Sujitha; Tadiboina, Nagarjuna; Rambha, Somasekhar

    2015-01-01

    Tooth mobility can be assessed by both subjective and objective means. The use of subjective measures may lead to bias, and hence it becomes imperative to use objective means to assess tooth mobility. It has also been observed that hormonal fluctuations may significantly influence tooth mobility. Aims: The study was undertaken to assess the reproducibility of the Periotest in the assessment of tooth mobility and to clarify the hormonal influence on tooth mobility. Materials and Methods: 100 subjects were included in the study and were divided equally into two groups based on their age: group I (11-14 years) and group II (16-22 years). Results: There was no statistically significant difference between the Periotest values (PTV) taken at two different time points separated by 20 minutes. Group I was found to have a significantly greater PTV than group II. Conclusion: The Periotest can reliably measure tooth mobility. Tooth mobility is greater during puberty compared with adolescence, and during adolescence mobility was slightly greater in males. PMID:25684904

  6. Virtual Raters for Reproducible and Objective Assessments in Radiology

    PubMed Central

    Kleesiek, Jens; Petersen, Jens; Döring, Markus; Maier-Hein, Klaus; Köthe, Ullrich; Wick, Wolfgang; Hamprecht, Fred A.; Bendszus, Martin; Biller, Armin

    2016-01-01

    Volumetric measurements in radiologic images are important for monitoring tumor growth and treatment response. To make these measurements more reproducible and objective we introduce the concept of virtual raters (VRs). A virtual rater is obtained by combining the knowledge of machine-learning algorithms trained with past annotations of multiple human raters with the instantaneous rating of one human expert; it is thus virtually guided by several experts. To evaluate the approach we perform experiments with multi-channel magnetic resonance imaging (MRI) data sets. In addition to gross tumor volume (GTV), we also investigate subcategories such as edema, contrast-enhancing and non-enhancing tumor. The first data set consists of N = 71 longitudinal follow-up scans of 15 patients suffering from glioblastoma (GB). The second data set comprises N = 30 scans of low- and high-grade gliomas. For comparison we computed the Pearson correlation, intra-class correlation coefficient (ICC) and Dice score. Virtual raters consistently improve inter- and intra-rater agreement. Comparing the 2D Response Assessment in Neuro-Oncology (RANO) measurements to the volumetric measurements of the virtual raters yields a deviating rating in one-third of the cases. Hence, we believe that our approach will have an impact on the evaluation of clinical studies as well as on routine imaging diagnostics. PMID:27118379

  7. Reproducing stone monument photosynthetic-based colonization under laboratory conditions.

    PubMed

    Miller, Ana Zélia; Laiz, Leonila; Gonzalez, Juan Miguel; Dionísio, Amélia; Macedo, Maria Filomena; Saiz-Jimenez, Cesareo

    2008-11-01

    In order to understand the biodeterioration processes occurring on stone monuments, we analyzed the microbial communities involved in these processes and studied their ability to colonize stone in controlled laboratory experiments. In this study, a natural green biofilm from a limestone monument was cultivated, inoculated on stone probes of the same lithotype and incubated in a laboratory chamber. This incubation system, which exposes the stone samples to intermittently sprinkling water, allowed the development of photosynthetic biofilms similar to those occurring on stone monuments. Denaturing gradient gel electrophoresis (DGGE) analysis was used to evaluate the major microbial components of the laboratory biofilms. Cyanobacteria, green microalgae, bacteria and fungi were identified by DNA-based molecular analysis targeting the 16S and 18S ribosomal RNA genes. The natural green biofilm was mainly composed of the Chlorophyta Chlorella, Stichococcus, and Trebouxia, and of Cyanobacteria belonging to the genera Leptolyngbya and Pleurocapsa. A number of bacteria belonging to Alphaproteobacteria, Bacteroidetes and Verrucomicrobia were identified, as well as fungi from the Ascomycota. The laboratory colonization experiment on stone probes showed a colonization pattern similar to that occurring on stone monuments. The methodology described in this paper allowed us to reproduce colonization equivalent to the natural biodeterioration process.

  8. Reproducible fabrication of stable small nano Pt with high activity for sensor applications.

    PubMed

    Ye, Pingping; Guo, Xiaoyu; Liu, Guiting; Chen, Huifen; Pan, Yuxia; Wen, Ying; Yang, Haifeng

    2013-07-26

    Pt nanoparticles with an average diameter of 2-3 nm were reproducibly synthesized by reduction of an H₂PtCl₆ solution containing inositol hexaphosphate (IP₆) as the stabilizing agent. Electron diffraction patterns revealed that the resulting cubic nanoparticles were single crystals with Pt(111) faces. The PtNPs-IP₆ nanoparticles were used to modify an electrode as a nonenzymatic sensor for H₂O₂ detection, exhibiting a fast response and high sensitivity. A low detection limit of 2.0 × 10⁻⁷ M (S/N = 3) was achieved, with two linear ranges between 2.4 × 10⁻⁷ and 1.3 × 10⁻³ M (R² = 0.9987) and between 1.3 × 10⁻³ and 1.3 × 10⁻² M (R² = 0.9980). The attractive electrochemical performance of PtNPs-IP₆ makes it a promising material for the development of Pt-based analytical systems and other applications.
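
    The figures of merit reported above (sensitivity, linear range with R², and an S/N = 3 detection limit) follow from an ordinary least-squares calibration. A minimal sketch on hypothetical calibration points; the concentrations, currents, and blank noise value are illustrative assumptions, not the published data.

      import numpy as np

      # Hypothetical calibration data for an amperometric H2O2 sensor:
      # concentrations (M) and background-subtracted steady-state currents (A).
      conc = np.array([2.4e-7, 1e-6, 1e-5, 1e-4, 1e-3])
      current = np.array([1.2e-9, 5.1e-9, 5.0e-8, 5.2e-7, 4.9e-6])

      # Least-squares calibration line and coefficient of determination R^2.
      slope, intercept = np.polyfit(conc, current, 1)
      pred = slope * conc + intercept
      r2 = 1 - np.sum((current - pred) ** 2) / np.sum((current - current.mean()) ** 2)

      # Detection limit from the S/N = 3 convention: LOD = 3 * sd(blank) / sensitivity.
      blank_sd = 2.0e-10  # assumed standard deviation of repeated blank measurements (A)
      lod = 3 * blank_sd / slope

      print(f"sensitivity = {slope:.3e} A/M, R^2 = {r2:.4f}, LOD = {lod:.2e} M")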

  9. Discrete restricted four-body problem: Existence of proof of equilibria and reproducibility of periodic orbits

    SciTech Connect

    Minesaki, Yukitaka

    2015-01-01

    We propose the discrete-time restricted four-body problem (d-R4BP), which approximates the orbits of the restricted four-body problem (R4BP). The d-R4BP is given as a special case of the discrete-time chain regularization of the general N-body problem published by Minesaki. Moreover, we analytically prove that the d-R4BP yields the correct orbits corresponding to the elliptic relative equilibrium solutions of the R4BP when the three primaries form an equilateral triangle at all times. Such orbits include the orbit of a relative equilibrium solution already discovered by Baltagiannis and Papadakis. Until the proof in this work, there had been no discrete analog that preserves the orbits of elliptic relative equilibrium solutions of the R4BP. Over long time intervals, the d-R4BP can precisely compute some stable periodic orbits in the Sun–Jupiter–Trojan asteroid–spacecraft system that cannot necessarily be reproduced by other generic integrators.

  10. Reproducibility study for the detection of Staphylococcal enterotoxins in dairy products between official Italian national laboratories.

    PubMed

    Bianchi, D M; Ingravalle, F; Adriano, D; Gallina, S; Gramaglia, M; Zuccon, F; Astegiano, S; Bellio, A; Macori, G; Ru, G; Decastelli, L

    2014-06-01

    Staphylococcal food poisoning is a common foodborne disease caused by the ingestion of staphylococcal enterotoxins (SEs) produced mainly by enterotoxigenic strains of Staphylococcus aureus. To date, 21 SEs and/or enterotoxin-like types have been identified, several of which represent a potential hazard for consumers. To protect consumer health and to reduce the amount of SE-contaminated food entering the market, European Union legislation regulating food safety requires testing for SEs. The Italian National Reference Laboratory organized a ring trial to test technical and analytical proficiency in the national network of official food laboratories. Twenty-four laboratories took part, and each received and analyzed 24 blind dairy samples. Reproducibility of the results from the laboratories was assessed by the Cohen k index, and accuracy (sensitivity and specificity) was evaluated according to the International Organization for Standardization definition (ISO 16140:2003). Trial results revealed partially satisfactory agreement: 254 of 276 possible paired participants (92%) reached a k value >0.60, which is conventionally recognized as satisfactory. Accuracy was deemed satisfactory; 100% sensitivity and 100% specificity were achieved by 22 and 18 of the 24 laboratories, respectively.
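
    Pairwise agreement between laboratories reporting qualitative results is typically scored with Cohen's kappa against a threshold such as k > 0.60, as above. A minimal sketch, assuming hypothetical positive/negative calls from two laboratories rather than the trial data:

      import numpy as np

      def cohen_kappa(labels_a, labels_b) -> float:
          """Cohen's kappa for two raters' categorical results on the same samples."""
          a = np.asarray(labels_a)
          b = np.asarray(labels_b)
          categories = np.union1d(a, b)
          observed = np.mean(a == b)
          # Expected agreement under independence, from each rater's marginal rates.
          expected = sum(np.mean(a == c) * np.mean(b == c) for c in categories)
          return (observed - expected) / (1 - expected)

      # Hypothetical positive/negative SE results from two laboratories on 24 samples.
      lab1 = ["pos"] * 10 + ["neg"] * 14
      lab2 = ["pos"] * 9 + ["neg"] * 1 + ["neg"] * 13 + ["pos"] * 1
      kappa = cohen_kappa(lab1, lab2)
      print(f"kappa = {kappa:.2f}  ({'satisfactory' if kappa > 0.60 else 'unsatisfactory'})")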

  11. Synthesis of Hapten-Protein Conjugate Vaccines with Reproducible Hapten Densities.

    PubMed

    Torres, Oscar B; Alving, Carl R; Matyas, Gary R

    2016-01-01

    The ability to prepare hapten-carrier conjugates reproducibly with consistent lot-to-lot hapten densities and protein yields is a critical component of hapten vaccine development. This entails the development of appropriate coupling chemistries that do not cause protein precipitation and the development of methods to quantify hapten density. Recently, extensive efforts have been devoted to design vaccines against drugs of abuse. We describe, herein, a method for conjugation of a morphine-like hapten (MorHap) to tetanus toxoid (TT), which involves conjugation of MorHap to the surface lysines of TT through the N-hydroxysuccinimide portion of a heterobifunctional linker and the subsequent attachment of the thiol on MorHap to the maleimide portion of the cross-linker. Methods are described for the analytical quantification of the hapten density of the conjugates using modified Ellman's test, trinitrobenzenesulfonic acid (TNBS) assay, and matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF MS). PMID:27076161

  12. Synthesis of Hapten-Protein Conjugate Vaccines with Reproducible Hapten Densities.

    PubMed

    Torres, Oscar B; Alving, Carl R; Matyas, Gary R

    2016-01-01

    The ability to prepare hapten-carrier conjugates reproducibly with consistent lot-to-lot hapten densities and protein yields is a critical component of hapten vaccine development. This entails the development of appropriate coupling chemistries that do not cause protein precipitation and the development of methods to quantify hapten density. Recently, extensive efforts have been devoted to design vaccines against drugs of abuse. We describe, herein, a method for conjugation of a morphine-like hapten (MorHap) to tetanus toxoid (TT), which involves conjugation of MorHap to the surface lysines of TT through the N-hydroxysuccinimide portion of a heterobifunctional linker and the subsequent attachment of the thiol on MorHap to the maleimide portion of the cross-linker. Methods are described for the analytical quantification of the hapten density of the conjugates using modified Ellman's test, trinitrobenzenesulfonic acid (TNBS) assay, and matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF MS).

  13. Reproducible Computing: a new Technology for Statistics Education and Educational Research

    NASA Astrophysics Data System (ADS)

    Wessa, Patrick

    2009-05-01

    This paper explains how the R Framework (http://www.wessa.net) and a newly developed Compendium Platform (http://www.freestatistics.org) allow us to create, use, and maintain documents that contain empirical research results which can be recomputed and reused in derived work. It is illustrated that this technological innovation can be used to create educational applications that can be shown to support effective learning of statistics and associated analytical skills. It is explained how a Compendium can be created by anyone, without the need to understand the technicalities of scientific word processing (LaTeX) or statistical computing (R code). The proposed Reproducible Computing system allows educational researchers to objectively measure key aspects of the actual learning process based on individual and constructivist activities such as peer review, collaboration in research, computational experimentation, etc. The system was implemented and tested in three statistics courses in which Compendia were used to create an interactive e-learning environment that simulated the real-world process of empirical scientific research.

  14. MERRA Analytic Services

    NASA Astrophysics Data System (ADS)

    Schnase, J. L.; Duffy, D. Q.; McInerney, M. A.; Tamkin, G. S.; Thompson, J. H.; Gill, R.; Grieg, C. M.

    2012-12-01

    MERRA Analytic Services (MERRA/AS) is a cyberinfrastructure resource for developing and evaluating a new generation of climate data analysis capabilities. MERRA/AS supports OBS4MIP activities by reducing the time spent preparing Modern Era Retrospective-Analysis for Research and Applications (MERRA) data used in data-model intercomparison. It also provides a testbed for experimental development of high-performance analytics. MERRA/AS is a cloud-based service built around the Virtual Climate Data Server (vCDS) technology that is currently used by the NASA Center for Climate Simulation (NCCS) to deliver Intergovernmental Panel on Climate Change (IPCC) data to the Earth System Grid Federation (ESGF). Crucial to its effectiveness, MERRA/AS's servers will use a workflow-generated realizable object capability to perform analyses over the MERRA data using the MapReduce approach to parallel storage-based computation. The results produced by these operations will be stored by the vCDS, which will also be able to host code sets for those who wish to explore the use of MapReduce for more advanced analytics. While the work described here focuses on the MERRA collection, these technologies can be used to publish other reanalysis, observational, and ancillary OBS4MIP data to ESGF and, importantly, offer an architectural approach to climate data services that can be generalized to applications and customers beyond the traditional climate research community. In this presentation, we describe our approach, experiences, lessons learned, and plans for the future. (Figure: (A) MERRA/AS software stack. (B) Example MERRA/AS interfaces.)

  15. Analytical chemistry of nickel.

    PubMed

    Stoeppler, M

    1984-01-01

    Analytical chemists are faced with nickel contents in environmental and biological materials ranging from the mg/kg down to the ng/kg level. Sampling and sample treatment have to be performed with great care at lower levels, and this also applies to enrichment and separation procedures. The classical determination methods formerly used have been replaced almost entirely by different forms of atomic absorption spectrometry. Electroanalytical methods are also of increasing importance and at present provide the most sensitive approach. Despite the powerful methods available, achieving reliable results is still a challenge for the analyst requiring proper quality control measures.

  16. Automation of analytical isotachophoresis

    NASA Technical Reports Server (NTRS)

    Thormann, Wolfgang

    1985-01-01

    The basic features of automation of analytical isotachophoresis (ITP) are reviewed. Experimental setups consisting of narrow bore tubes which are self-stabilized against thermal convection are considered. Sample detection in free solution is discussed, listing the detector systems presently used or expected to be of potential use in the near future. The combination of a universal detector measuring the evolution of ITP zone structures with detector systems specific to desired components is proposed as a concept of an automated chemical analyzer based on ITP. Possible miniaturization of such an instrument by means of microlithographic techniques is discussed.

  17. Acceptability of blood and blood substitutes.

    PubMed

    Ferguson, E; Prowse, C; Townsend, E; Spence, A; Hilten, J A van; Lowe, K

    2008-03-01

    Alternatives to donor blood have been developed in part to meet increasing demand. However, new biotechnologies are often associated with increased perceptions of risk and low acceptance. This paper reviews developments of alternatives and presents data, from a field-based experiment in the UK and Holland, on the risks and acceptance of donor blood and alternatives (chemical, genetically modified and bovine). UK groups perceived all substitutes as riskier than the Dutch. There is a negative association between perceived risk and acceptability. Solutions to increasing acceptance are discussed in terms of implicit attitudes, product naming and emotional responses.

  18. 7 CFR 932.32 - Acceptance.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE OLIVES GROWN IN CALIFORNIA Order Regulating Handling Olive Administrative Committee § 932.32 Acceptance. Any person selected by the...

  19. 7 CFR 932.32 - Acceptance.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE OLIVES GROWN IN CALIFORNIA Order Regulating Handling Olive Administrative Committee § 932.32 Acceptance. Any person selected by the...

  20. 7 CFR 932.32 - Acceptance.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE OLIVES GROWN IN CALIFORNIA Order Regulating Handling Olive Administrative Committee § 932.32 Acceptance. Any person selected by the...

  1. 7 CFR 932.32 - Acceptance.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... AND ORDERS; FRUITS, VEGETABLES, NUTS), DEPARTMENT OF AGRICULTURE OLIVES GROWN IN CALIFORNIA Order Regulating Handling Olive Administrative Committee § 932.32 Acceptance. Any person selected by the...

  2. 7 CFR 932.32 - Acceptance.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... AND ORDERS; FRUITS, VEGETABLES, NUTS), DEPARTMENT OF AGRICULTURE OLIVES GROWN IN CALIFORNIA Order Regulating Handling Olive Administrative Committee § 932.32 Acceptance. Any person selected by the...

  3. 7 CFR 1207.323 - Acceptance.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE POTATO RESEARCH AND PROMOTION PLAN Potato Research and Promotion Plan National Potato Promotion Board § 1207.323 Acceptance. Each...

  4. 7 CFR 1207.323 - Acceptance.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE POTATO RESEARCH AND PROMOTION PLAN Potato Research and Promotion Plan National Potato Promotion Board § 1207.323 Acceptance. Each...

  5. 7 CFR 1207.323 - Acceptance.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE POTATO RESEARCH AND PROMOTION PLAN Potato Research and Promotion Plan National Potato Promotion Board § 1207.323 Acceptance. Each...

  6. 7 CFR 1207.323 - Acceptance.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE POTATO RESEARCH AND PROMOTION PLAN Potato Research and Promotion Plan National Potato Promotion Board § 1207.323 Acceptance. Each...

  7. 7 CFR 1207.323 - Acceptance.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... AND ORDERS; MISCELLANEOUS COMMODITIES), DEPARTMENT OF AGRICULTURE POTATO RESEARCH AND PROMOTION PLAN Potato Research and Promotion Plan National Potato Promotion Board § 1207.323 Acceptance. Each...

  8. 2013 SYR Accepted Poster Abstracts.

    PubMed

    2013-01-01

    SYR 2013 Accepted Poster abstracts: 1. Benefits of Yoga as a Wellness Practice in a Veterans Affairs (VA) Health Care Setting: If You Build It, Will They Come? 2. Yoga-based Psychotherapy Group With Urban Youth Exposed to Trauma. 3. Embodied Health: The Effects of a Mind-Body Course for Medical Students. 4. Interoceptive Awareness and Vegetable Intake After a Yoga and Stress Management Intervention. 5. Yoga Reduces Performance Anxiety in Adolescent Musicians. 6. Designing and Implementing a Therapeutic Yoga Program for Older Women With Knee Osteoarthritis. 7. Yoga and Life Skills Eating Disorder Prevention Among 5th Grade Females: A Controlled Trial. 8. A Randomized, Controlled Trial Comparing the Impact of Yoga and Physical Education on the Emotional and Behavioral Functioning of Middle School Children. 9. Feasibility of a Multisite, Community based Randomized Study of Yoga and Wellness Education for Women With Breast Cancer Undergoing Chemotherapy. 10. A Delphi Study for the Development of Protocol Guidelines for Yoga Interventions in Mental Health. 11. Impact Investigation of Breathwalk Daily Practice: Canada-India Collaborative Study. 12. Yoga Improves Distress, Fatigue, and Insomnia in Older Veteran Cancer Survivors: Results of a Pilot Study. 13. Assessment of Kundalini Mantra and Meditation as an Adjunctive Treatment With Mental Health Consumers. 14. Kundalini Yoga Therapy Versus Cognitive Behavior Therapy for Generalized Anxiety Disorder and Co-Occurring Mood Disorder. 15. Baseline Differences in Women Versus Men Initiating Yoga Programs to Aid Smoking Cessation: Quitting in Balance Versus QuitStrong. 16. Pranayam Practice: Impact on Focus and Everyday Life of Work and Relationships. 17. Participation in a Tailored Yoga Program is Associated With Improved Physical Health in Persons With Arthritis. 18. Effects of Yoga on Blood Pressure: Systematic Review and Meta-analysis. 19. A Quasi-experimental Trial of a Yoga based Intervention to Reduce Stress and

  9. 2013 SYR Accepted Poster Abstracts.

    PubMed

    2013-01-01

    SYR 2013 Accepted Poster abstracts: 1. Benefits of Yoga as a Wellness Practice in a Veterans Affairs (VA) Health Care Setting: If You Build It, Will They Come? 2. Yoga-based Psychotherapy Group With Urban Youth Exposed to Trauma. 3. Embodied Health: The Effects of a Mind-Body Course for Medical Students. 4. Interoceptive Awareness and Vegetable Intake After a Yoga and Stress Management Intervention. 5. Yoga Reduces Performance Anxiety in Adolescent Musicians. 6. Designing and Implementing a Therapeutic Yoga Program for Older Women With Knee Osteoarthritis. 7. Yoga and Life Skills Eating Disorder Prevention Among 5th Grade Females: A Controlled Trial. 8. A Randomized, Controlled Trial Comparing the Impact of Yoga and Physical Education on the Emotional and Behavioral Functioning of Middle School Children. 9. Feasibility of a Multisite, Community based Randomized Study of Yoga and Wellness Education for Women With Breast Cancer Undergoing Chemotherapy. 10. A Delphi Study for the Development of Protocol Guidelines for Yoga Interventions in Mental Health. 11. Impact Investigation of Breathwalk Daily Practice: Canada-India Collaborative Study. 12. Yoga Improves Distress, Fatigue, and Insomnia in Older Veteran Cancer Survivors: Results of a Pilot Study. 13. Assessment of Kundalini Mantra and Meditation as an Adjunctive Treatment With Mental Health Consumers. 14. Kundalini Yoga Therapy Versus Cognitive Behavior Therapy for Generalized Anxiety Disorder and Co-Occurring Mood Disorder. 15. Baseline Differences in Women Versus Men Initiating Yoga Programs to Aid Smoking Cessation: Quitting in Balance Versus QuitStrong. 16. Pranayam Practice: Impact on Focus and Everyday Life of Work and Relationships. 17. Participation in a Tailored Yoga Program is Associated With Improved Physical Health in Persons With Arthritis. 18. Effects of Yoga on Blood Pressure: Systematic Review and Meta-analysis. 19. A Quasi-experimental Trial of a Yoga based Intervention to Reduce Stress and

  10. In acceptance we trust? Conceptualising acceptance as a viable approach to NGO security management.

    PubMed

    Fast, Larissa A; Freeman, C Faith; O'Neill, Michael; Rowley, Elizabeth

    2013-04-01

    This paper documents current understanding of acceptance as a security management approach and explores issues and challenges non-governmental organisations (NGOs) confront when implementing an acceptance approach to security management. It argues that the failure of organisations to systematise and clearly articulate acceptance as a distinct security management approach and a lack of organisational policies and procedures concerning acceptance hinder its efficacy as a security management approach. The paper identifies key and cross-cutting components of acceptance that are critical to its effective implementation in order to advance a comprehensive and systematic concept of acceptance. The key components of acceptance illustrate how organisational and staff functions affect positively or negatively an organisation's acceptance, and include: an organisation's principles and mission, communications, negotiation, programming, relationships and networks, stakeholder and context analysis, staffing, and image. The paper contends that acceptance is linked not only to good programming, but also to overall organisational management and structures. PMID:23278470

  11. Quality Indicators for Learning Analytics

    ERIC Educational Resources Information Center

    Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus

    2014-01-01

    This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a mean to capture evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…

  12. The analytic renormalization group

    NASA Astrophysics Data System (ADS)

    Ferrari, Frank

    2016-08-01

    Finite temperature Euclidean two-point functions in quantum mechanics or quantum field theory are characterized by a discrete set of Fourier coefficients Gk, k ∈ Z, associated with the Matsubara frequencies νk = 2πk/β. We show that analyticity implies that the coefficients Gk must satisfy an infinite number of model-independent linear equations that we write down explicitly. In particular, we construct "Analytic Renormalization Group" linear maps Aμ which, for any choice of cut-off μ, allow one to express the low energy Fourier coefficients for |νk| < μ (with the possible exception of the zero mode G0), together with the real-time correlators and spectral functions, in terms of the high energy Fourier coefficients for |νk| ≥ μ. Using a simple numerical algorithm, we show that the exact universal linear constraints on Gk can be used to systematically improve any random approximate data set obtained, for example, from Monte Carlo simulations. Our results are illustrated on several explicit examples.
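
    For reference, the Fourier coefficients and Matsubara frequencies referred to above follow the standard finite-temperature conventions, written here in LaTeX (a standard normalization is assumed; the paper's own conventions may differ by constant factors):

      G_k = \int_0^{\beta} \mathrm{d}\tau \, e^{i \nu_k \tau} \, G(\tau),
      \qquad
      G(\tau) = \frac{1}{\beta} \sum_{k \in \mathbb{Z}} e^{-i \nu_k \tau} \, G_k,
      \qquad
      \nu_k = \frac{2\pi k}{\beta}.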

  13. Quantifying reproducibility in computational biology: the case of the tuberculosis drugome.

    PubMed

    Garijo, Daniel; Kinnings, Sarah; Xie, Li; Xie, Lei; Zhang, Yinliang; Bourne, Philip E; Gil, Yolanda

    2013-01-01

    How easy is it to reproduce the results found in a typical computational biology paper? Either through experience or intuition the reader will already know that the answer is: with difficulty, or not at all. In this paper we attempt to quantify this difficulty by reproducing a previously published paper for different classes of users (ranging from users with little expertise to domain experts) and suggest ways in which the situation might be improved. Quantification is achieved by estimating the time required to reproduce each of the steps in the method described in the original paper and to make them part of an explicit workflow that reproduces the original results. Reproducing the method took several months of effort and required using new versions and new software that posed challenges to reconstructing and validating the results. The quantification leads to "reproducibility maps" that reveal that novice researchers would only be able to reproduce a few of the steps in the method, and that only expert researchers with advance knowledge of the domain would be able to reproduce the method in its entirety. The workflow itself is published as an online resource together with supporting software and data. The paper concludes with a brief discussion of the complexities of requiring reproducibility in terms of cost versus benefit, and a set of desiderata with our observations and guidelines for improving reproducibility. This has implications not only for reproducing the work of others from published papers, but also for reproducing work from one's own laboratory.

  14. Scan-rescan reproducibility of CT densitometric measures of emphysema

    NASA Astrophysics Data System (ADS)

    Chong, D.; van Rikxoort, E. M.; Kim, H. J.; Goldin, J. G.; Brown, M. S.

    2011-03-01

    This study investigated the reproducibility of HRCT densitometric measures of emphysema in patients scanned twice one week apart. 24 emphysema patients from a multicenter study were scanned at full inspiration (TLC) and expiration (RV), then again a week later for four scans total. Scans for each patient used the same scanner and protocol, except for tube current in three patients. Lung segmentation with gross airway removal was performed on the scans. Volume, weight, mean lung density (MLD), relative area under -950 HU (RA-950), and 15th percentile (PD-15) were calculated for TLC, and volume and an air-trapping mask (RA-air) between -950 and -850 HU for RV. For each measure, absolute differences were computed for each scan pair, and linear regression was performed against volume difference in a subgroup with volume difference <500 mL. Two TLC scan pairs were excluded due to segmentation failure. The mean lung volumes were 5802 +/- 1420 mL for TLC, 3878 +/- 1077 mL for RV. The mean absolute differences were 169 mL for TLC volume, 316 mL for RV volume, 14.5 g for weight, 5.0 HU for MLD, 0.66 p.p. for RA-950, 2.4 HU for PD-15, and 3.1 p.p. for RA-air. The <500 mL subgroup had 20 scan pairs for TLC and RV. The R² values were 0.8 for weight, 0.60 for MLD, 0.29 for RA-950, 0.31 for PD-15, and 0.64 for RA-air. Our results indicate that considerable variability exists in densitometric measures over one week that cannot be attributed to breathhold or physiology. This has implications for clinical trials relying on these measures to assess emphysema treatment efficacy.
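
    The densitometric indices compared above are simple functionals of the histogram of Hounsfield units within the segmented lung. A minimal sketch computing MLD, RA-950, PD-15, and the expiratory air-trapping fraction from an array of lung-voxel HU values; the synthetic voxel distribution is an illustrative assumption, not data from the study.

      import numpy as np

      def densitometry(hu_lung: np.ndarray) -> dict:
          """Standard CT emphysema/air-trapping indices from segmented-lung HU values."""
          hu = np.asarray(hu_lung, dtype=float)
          return {
              "MLD_HU": hu.mean(),                                       # mean lung density
              "RA950_pct": 100.0 * np.mean(hu < -950),                   # relative area under -950 HU
              "PD15_HU": np.percentile(hu, 15),                          # 15th percentile density
              "RAair_pct": 100.0 * np.mean((hu >= -950) & (hu < -850)),  # -950 to -850 HU air-trapping band
          }

      # Illustrative lung voxels drawn from a plausible HU distribution.
      rng = np.random.default_rng(1)
      voxels = rng.normal(loc=-860, scale=50, size=200_000)
      print(densitometry(voxels))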

  15. Reproducibility and intraindividual variability of the pattern electroretinogram.

    PubMed

    Jacobi, P C; Walter, P; Brunner, R; Krieglstein, G K

    1994-08-01

    The human pattern electroretinogram (PERG) is a contrast-specific potential presumably reflecting the functional integrity of ganglion cells. Many studies have devised criteria that enable PERG measurements to distinguish established glaucomatous (hypertonic) eyes from normal controls. As there are relatively few reports concerning the reproducibility and reliability of the PERG, we studied the intraindividual variability of the PERG in 20 healthy subjects. Both transient and steady-state responses were recorded using a high-contrast (98%), black-and-white, counterphasing checkerboard pattern (average luminance, 80 cd/m2) generated by a television monitor (subtending angle, 13.8 degrees x 10.8 degrees) using three different check sizes (15', 30', and 60'). Recordings were performed in both eyes simultaneously at a 7-day interval under test-retest conditions. Responses to the 30' check size were the most consistent, with a mean amplitude (+/- SD) of 2.18 +/- 0.95 microV (P50) and 4.00 +/- 1.69 microV (N95) for transient patterns and 1.84 +/- 1.25 microV for steady-state patterns. No statistically significant difference in PERG parameters was observed between right and left eyes, between test and retest conditions, or between 1st- and 7th-day recording sessions. In linear correlation analysis there was an adequate positive correlation between the right and left eyes (r = 0.78), a weak correlation between test and retest conditions (r = 0.58), and no correlation between measurements made at a 7-day interval. As a consequence, we conclude that the follow-up of patients (e.g., glaucoma, ocular hypertension) by means of the PERG is critical, especially when therapeutic consequences may be based on the physiological variability of a weak retinal signal.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:7804106

  16. A reproducible method to determine the meteoroid mass index

    NASA Astrophysics Data System (ADS)

    Pokorný, P.; Brown, P. G.

    2016-08-01

    Context. The determination of meteoroid mass indices is central to flux measurements and evolutionary studies of meteoroid populations. However, different authors use different approaches to fit observed data, making results difficult to reproduce and the resulting uncertainties difficult to justify. The real, physical uncertainties are usually an order of magnitude higher than the reported values. Aims: We aim to develop a fully automated method that will measure meteoroid mass indices and associated uncertainty. We validate our method on large radar and optical datasets and compare results to obtain a best estimate of the true meteoroid mass index. Methods: Using MultiNest, a Bayesian inference tool that calculates the evidence and explores the parameter space, we search for the best fit of cumulative number vs. mass distributions in a four-dimensional space of variables (a, b, X1, X2). We explore biases in meteor echo distributions using optical meteor data as a calibration dataset to establish the systematic offset in measured mass index values. Results: Our best estimate for the average de-biased mass index for the sporadic meteoroid complex, as measured by radar appropriate to the mass range 10⁻³ > m > 10⁻⁵ g, was s = -2.10 ± 0.08. Optical data in the 10⁻¹ > m > 10⁻³ g range, with the shower meteors removed, produced s = -2.08 ± 0.08. We find the mass index used by Grün et al. (1985) is substantially larger than we measure in the 10⁻⁴ < m < 10⁻¹ g range. Our own code with a simple manual and a sample dataset can be found here: ftp://aquarid.physics.uwo.ca/pub/peter/MassIndexCode/
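
    The paper's four-parameter MultiNest fit is not reproduced here, but a much simpler single power-law maximum-likelihood estimator (Hill/Clauset-style) conveys the core idea of measuring a mass index from a sample of meteoroid masses; the synthetic sample and the chosen lower mass cut-off are assumptions.

      import numpy as np

      def mass_index_mle(masses, m_min):
          """Maximum-likelihood estimate of the differential mass index s for masses >= m_min,
          assuming a single power law dN/dm ~ m**(-s) (a simplification of the paper's
          four-parameter MultiNest fit). Returns s and its approximate standard error."""
          m = np.asarray(masses, dtype=float)
          m = m[m >= m_min]
          n = m.size
          s = 1.0 + n / np.sum(np.log(m / m_min))   # Hill / Clauset-style estimator
          err = (s - 1.0) / np.sqrt(n)
          return s, err

      # Synthetic test: draw masses from dN/dm ~ m**(-2.1) above 1e-5 g via inverse-CDF sampling.
      rng = np.random.default_rng(42)
      u = rng.uniform(size=10000)
      masses = 1e-5 * (1.0 - u) ** (-1.0 / 1.1)
      print(mass_index_mle(masses, 1e-5))           # should recover s close to 2.1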

  17. Can atmospheric reanalysis datasets be used to reproduce flood characteristics?

    NASA Astrophysics Data System (ADS)

    Andreadis, K.; Schumann, G.; Stampoulis, D.

    2014-12-01

    Floods are one of the costliest natural disasters and the ability to understand their characteristics and their interactions with population, land cover and climate changes is of paramount importance. In order to accurately reproduce flood characteristics such as inundation extent and water heights in both river channels and floodplains, hydrodynamic models are required. Most of these models operate at very high resolutions and are computationally expensive, making their application over large areas difficult. However, a need exists for such models to be applied at regional to global scales so that the effects of climate change with regard to flood risk can be examined. We use the LISFLOOD-FP hydrodynamic model to simulate a 40-year history of flood characteristics at the continental scale, particularly over Australia. LISFLOOD-FP is a 2-D hydrodynamic model that solves the approximate Saint-Venant equations at large scales (on the order of 1 km) using a sub-grid representation of the river channel. This implementation is part of an effort towards a global 1-km flood modeling framework that will allow the reconstruction of a long-term flood climatology. The components of this framework include a hydrologic model (the widely-used Variable Infiltration Capacity model) and a meteorological dataset that forces it. In order to extend the simulated flood climatology to 50-100 years in a consistent manner, reanalysis datasets have to be used. The objective of this study is the evaluation of multiple atmospheric reanalysis datasets (ERA, NCEP, MERRA, JRA) as inputs to the VIC/LISFLOOD-FP model. Comparisons of the simulated flood characteristics are made with both satellite observations of inundation and a benchmark simulation of LISFLOOD-FP being forced by observed flows. Finally, the implications of the availability of a global flood modeling framework for producing flood hazard maps and disseminating disaster information are discussed.

  18. Development of a Consistent and Reproducible Porcine Scald Burn Model

    PubMed Central

    Andrews, Christine J; Kempf, Margit; Kimble, Roy; Cuttle, Leila

    2016-01-01

    There are very few porcine burn models that replicate scald injuries similar to those encountered by children. We have developed a robust porcine burn model capable of creating reproducible scald burns for a wide range of burn conditions. The study was conducted with juvenile Large White pigs, creating replicates of burn combinations: 50°C for 1, 2, 5 and 10 minutes and 60°C, 70°C, 80°C and 90°C for 5 seconds. Visual wound examination, biopsies and Laser Doppler Imaging were performed at 1 and 24 hours and at 3 and 7 days post-burn. A consistent water temperature was maintained within the scald device for long durations (49.8 ± 0.1°C when set at 50°C). The macroscopic and histologic appearance was consistent between replicates of burn conditions. For 50°C water, 10 minute duration burns showed significantly deeper tissue injury than all shorter durations at 24 hours post-burn (p ≤ 0.0001), with damage seen to increase until day 3 post-burn. For 5 second duration burns, by day 7 post-burn the 80°C and 90°C scalds had damage detected significantly deeper in the tissue than the 70°C scalds (p ≤ 0.001). A reliable and safe model of porcine scald burn injury has been successfully developed. The novel apparatus with continually refreshed water improves consistency of scald creation for long exposure times. This model allows the pathophysiology of scald burn wound creation and progression to be examined. PMID:27612153

  19. Crowdsourcing reproducible seizure forecasting in human and canine epilepsy.

    PubMed

    Brinkmann, Benjamin H; Wagenaar, Joost; Abbot, Drew; Adkins, Phillip; Bosshard, Simone C; Chen, Min; Tieng, Quang M; He, Jialune; Muñoz-Almaraz, F J; Botella-Rocamora, Paloma; Pardo, Juan; Zamora-Martinez, Francisco; Hills, Michael; Wu, Wei; Korshunova, Iryna; Cukierski, Will; Vite, Charles; Patterson, Edward E; Litt, Brian; Worrell, Gregory A

    2016-06-01

    See Mormann and Andrzejak (doi:10.1093/brain/aww091) for a scientific commentary on this article. Accurate forecasting of epileptic seizures has the potential to transform clinical epilepsy care. However, progress toward reliable seizure forecasting has been hampered by lack of open access to long duration recordings with an adequate number of seizures for investigators to rigorously compare algorithms and results. A seizure forecasting competition was conducted on kaggle.com using open access chronic ambulatory intracranial electroencephalography from five canines with naturally occurring epilepsy and two humans undergoing prolonged wide bandwidth intracranial electroencephalographic monitoring. Data were provided to participants as 10-min interictal and preictal clips, with approximately half of the 60 GB data bundle labelled (interictal/preictal) for algorithm training and half unlabelled for evaluation. The contestants developed custom algorithms and uploaded their classifications (interictal/preictal) for the unknown testing data, and a randomly selected 40% of data segments were scored and results broadcasted on a public leader board. The contest ran from August to November 2014, and 654 participants submitted 17 856 classifications of the unlabelled test data. The top performing entry scored 0.84 area under the classification curve. Following the contest, additional held-out unlabelled data clips were provided to the top 10 participants and they submitted classifications for the new unseen data. The resulting area under the classification curves were well above chance forecasting, but did show a mean 6.54 ± 2.45% (min, max: 0.30, 20.2) decline in performance. The kaggle.com model using open access data and algorithms generated reproducible research that advanced seizure forecasting. The overall performance from multiple contestants on unseen data was better than a random predictor, and demonstrates the feasibility of seizure forecasting in canine and human epilepsy.
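
    The contest metric, "area under the classification curve", corresponds to the ROC AUC of the submitted preictal scores against the hidden clip labels. A minimal sketch with scikit-learn on synthetic labels and scores (not the contest data):

      import numpy as np
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(0)
      # Synthetic stand-in for the hidden test set: 1 = preictal clip, 0 = interictal clip.
      y_true = rng.integers(0, 2, size=1000)
      # Hypothetical contestant scores: higher values mean "more likely preictal".
      y_score = y_true * 0.6 + rng.normal(0.0, 0.4, size=1000)

      # The contest's top entry scored 0.84 on the real held-back data.
      print("AUC =", round(roc_auc_score(y_true, y_score), 3))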

  20. Reproducing American Sign Language sentences: cognitive scaffolding in working memory.

    PubMed

    Supalla, Ted; Hauser, Peter C; Bavelier, Daphne

    2014-01-01

    The American Sign Language Sentence Reproduction Test (ASL-SRT) requires the precise reproduction of a series of ASL sentences increasing in complexity and length. Error analyses of such tasks provide insight into working memory and scaffolding processes. Data were collected from three groups expected to differ in fluency: deaf children, deaf adults and hearing adults, all users of ASL. Quantitative (correct/incorrect recall) and qualitative error analyses were performed. Percent correct on the reproduction task supports its sensitivity to fluency as test performance clearly differed across the three groups studied. A linguistic analysis of errors further documented differing strategies and bias across groups. Subjects' recall projected the affordances and constraints of deep linguistic representations to differing degrees, with subjects resorting to alternate processing strategies when they failed to recall the sentence correctly. A qualitative error analysis allows us to capture generalizations about the relationship between error pattern and the cognitive scaffolding, which governs the sentence reproduction process. Highly fluent signers and less-fluent signers share common chokepoints on particular words in sentences. However, they diverge in heuristic strategy. Fluent signers, when they make an error, tend to preserve semantic details while altering morpho-syntactic domains. They produce syntactically correct sentences with equivalent meaning to the to-be-reproduced one, but these are not verbatim reproductions of the original sentence. In contrast, less-fluent signers tend to use a more linear strategy, preserving lexical status and word ordering while omitting local inflections, and occasionally resorting to visuo-motoric imitation. Thus, whereas fluent signers readily use top-down scaffolding in their working memory, less fluent signers fail to do so. Implications for current models of working memory across spoken and signed modalities are considered. PMID

  1. Engineering preliminaries to obtain reproducible mixtures of atelocollagen and polysaccharides.

    PubMed

    Lefter, Cristina-Mihaela; Maier, Stelian Sergiu; Maier, Vasilica; Popa, Marcel; Desbrieres, Jacques

    2013-05-01

    The critical stage in producing blends of biomacromolecules lies in the mixing of component solutions to generate homogeneous diluted colloidal systems. Simple experimental investigations allow the establishment of the design rules of recipes and the procedures for preparing homogeneous and compositionally reproducible mixtures. Starting from purified solutions of atelocollagen, hyaluronan and native gellan, with as low an inorganic salt content as possible, initial binary and ternary mixtures can be prepared up to a total dry matter content of 0.150 g/dL, under non-co-precipitating conditions. Two ways of manipulating pH are feasible for homogeneous mixing: (i) unbuffered prior correction at pH 5.5, and (ii) "rigid" buffering at pH 9.0, using organic species. Co-precipitates including atelocollagen can be obtained in the presence of one or both polysaccharides, preferably in pH domains far from the isoelectric point of the scleroprotein. A critical behavior has been observed in mixtures containing gellan, due to its macromolecular dissimilarities compared with atelocollagen. In optimal binary mixtures, the coordinates of threshold points on the phase diagrams are 0.028% w/w atelocollagen/0.025% w/w hyaluronan, and 0.022% w/w atelocollagen/0.020% w/w gellan. Uni- or bi-phasic ternary systems having equilibrated ratios of co-precipitated components can be prepared starting from initial mixtures containing up to 0.032 g/dL atelocollagen, associated with, for example, 0.040 g/dL hyaluronan and 0.008 g/dL gellan, following the first pH manipulation route.

  2. Soft and hard classification by reproducing kernel Hilbert space methods.

    PubMed

    Wahba, Grace

    2002-12-24

    Reproducing kernel Hilbert space (RKHS) methods provide a unified context for solving a wide variety of statistical modelling and function estimation problems. We consider two such problems: We are given a training set {yi, ti, i = 1, ..., n}, where yi is the response for the ith subject, and ti is a vector of attributes for this subject. The value of yi is a label that indicates which category it came from. For the first problem, we wish to build a model from the training set that assigns to each t in an attribute domain of interest an estimate of the probability pj(t) that a (future) subject with attribute vector t is in category j. The second problem is in some sense less ambitious; it is to build a model that assigns to each t a label, which classifies a future subject with that t into one of the categories or possibly "none of the above." The approach to the first of these two problems discussed here is a special case of what is known as penalized likelihood estimation. The approach to the second problem is known as the support vector machine. We also note some alternate but closely related approaches to the second problem. These approaches are all obtained as solutions to optimization problems in RKHS. Many other problems, in particular the solution of ill-posed inverse problems, can be obtained as solutions to optimization problems in RKHS and are mentioned in passing. We caution the reader that although a large literature exists in all of these topics, in this inaugural article we are selectively highlighting work of the author, former students, and other collaborators.
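
    Both problems can be approximated with off-the-shelf RKHS-based estimators; the sketch below is a generic scikit-learn stand-in (not the author's software): a kernel feature map with ridge-penalized logistic regression for the "soft" probability estimates, and a support vector machine for the "hard" labels.

      from sklearn.datasets import make_classification
      from sklearn.kernel_approximation import Nystroem
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline
      from sklearn.svm import SVC

      # Toy two-class training set {yi, ti}: ti is the attribute vector, yi the category label.
      X, y = make_classification(n_samples=300, n_features=5, random_state=0)

      # "Soft" classification: penalized likelihood in an (approximate) RKHS -- a kernel
      # feature map followed by ridge-penalized logistic regression, yielding probabilities pj(t).
      soft = make_pipeline(Nystroem(kernel="rbf", n_components=100, random_state=0),
                           LogisticRegression(C=1.0))
      soft.fit(X, y)
      print("P(category 1 | t):", soft.predict_proba(X[:3])[:, 1])

      # "Hard" classification: the support vector machine, which assigns a label to each t.
      hard = SVC(kernel="rbf", C=1.0).fit(X, y)
      print("labels:", hard.predict(X[:3]))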

  3. Modelling soil erosion at European scale: towards harmonization and reproducibility

    NASA Astrophysics Data System (ADS)

    Bosco, C.; de Rigo, D.; Dewitte, O.; Poesen, J.; Panagos, P.

    2015-02-01

    Soil erosion by water is one of the most widespread forms of soil degradation. The loss of soil as a result of erosion can lead to decline in organic matter and nutrient contents, breakdown of soil structure and reduction of the water-holding capacity. Measuring soil loss across the whole landscape is impractical and thus research is needed to improve methods of estimating soil erosion with computational modelling, upon which integrated assessment and mitigation strategies may be based. Despite the efforts, the prediction value of existing models is still limited, especially at regional and continental scale, because a systematic knowledge of local climatological and soil parameters is often unavailable. A new approach for modelling soil erosion at regional scale is here proposed. It is based on the joint use of low-data-demanding models and innovative techniques for better estimating model inputs. The proposed modelling architecture has at its basis the semantic array programming paradigm and a strong effort towards computational reproducibility. An extended version of the Revised Universal Soil Loss Equation (RUSLE) has been implemented merging different empirical rainfall-erosivity equations within a climatic ensemble model and adding a new factor for a better consideration of soil stoniness within the model. Pan-European soil erosion rates by water have been estimated through the use of publicly available data sets and locally reliable empirical relationships. The accuracy of the results is corroborated by a visual plausibility check (63% of a random sample of grid cells are accurate, 83% at least moderately accurate, bootstrap p ≤ 0.05). A comparison with country-level statistics of pre-existing European soil erosion maps is also provided.
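
    The RUSLE core referred to above is a per-cell product of factors. A minimal sketch with illustrative factor values follows; the ensemble erosivity treatment and the additional stoniness factor described in the abstract are represented only by a placeholder multiplier.

      import numpy as np

      def rusle_soil_loss(R, K, LS, C, P, stoniness=1.0):
          """Annual soil loss A (t ha^-1 yr^-1) as the RUSLE product of factors:
          R rainfall erosivity, K soil erodibility, LS slope length/steepness,
          C cover management, P support practice. `stoniness` stands in for the
          extra correction factor described in the abstract (assumed value)."""
          return R * K * LS * C * P * stoniness

      # Illustrative grid-cell values (not from the paper's datasets):
      R = np.array([700.0, 1200.0])   # MJ mm ha^-1 h^-1 yr^-1
      K = np.array([0.030, 0.045])    # t ha h ha^-1 MJ^-1 mm^-1
      LS = np.array([1.2, 3.5])
      C = np.array([0.05, 0.20])
      P = 1.0
      print(rusle_soil_loss(R, K, LS, C, P))   # t ha^-1 yr^-1 per cell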

  4. Modelling soil erosion at European scale: towards harmonization and reproducibility

    NASA Astrophysics Data System (ADS)

    Bosco, C.; de Rigo, D.; Dewitte, O.; Poesen, J.; Panagos, P.

    2014-04-01

    Soil erosion by water is one of the most widespread forms of soil degradation. The loss of soil as a result of erosion can lead to decline in organic matter and nutrient contents, breakdown of soil structure and reduction of the water holding capacity. Measuring soil loss across the whole landscape is impractical and thus research is needed to improve methods of estimating soil erosion with computational modelling, upon which integrated assessment and mitigation strategies may be based. Despite the efforts, the prediction value of existing models is still limited, especially at regional and continental scale. A new approach for modelling soil erosion at large spatial scale is here proposed. It is based on the joint use of low data demanding models and innovative techniques for better estimating model inputs. The proposed modelling architecture has at its basis the semantic array programming paradigm and a strong effort towards computational reproducibility. An extended version of the Revised Universal Soil Loss Equation (RUSLE) has been implemented merging different empirical rainfall-erosivity equations within a climatic ensemble model and adding a new factor for a better consideration of soil stoniness within the model. Pan-European soil erosion rates by water have been estimated through the use of publicly available datasets and locally reliable empirical relationships. The accuracy of the results is corroborated by a visual plausibility check (63% of a random sample of grid cells are accurate, 83% at least moderately accurate, bootstrap p ≤ 0.05). A comparison with country level statistics of pre-existing European maps of soil erosion by water is also provided.

  5. Reproducibility of urinary phthalate metabolites in first morning urine samples.

    PubMed Central

    Hoppin, Jane A; Brock, John W; Davis, Barbara J; Baird, Donna D

    2002-01-01

    Phthalates are ubiquitous in our modern environment because of their use in plastics and cosmetic products. Phthalate monoesters--primarily monoethylhexyl phthalate and monobutyl phthalate--are reproductive and developmental toxicants in animals. Accurate measures of phthalate exposure are needed to assess their human health effects. Phthalate monoesters have a biologic half-life of approximately 12 hr, and little is known about the temporal variability and daily reproducibility of urinary measures in humans. To explore these aspects, we measured seven phthalate monoesters and creatinine concentration in two consecutive first-morning urine specimens from 46 African-American women, ages 35-49 years, residing in the Washington, DC, area in 1996-1997. We measured phthalate monoesters using high-pressure liquid chromatography followed by tandem mass spectrometry on a triple quadrupole instrument using atmospheric pressure chemical ionization. We detected four phthalate monoesters in all subjects, with median levels of 31 ng/mL for monobenzyl phthalate (mBzP), 53 ng/mL for monobutyl phthalate (mBP), 211 ng/mL for monoethyl phthalate (mEP), and 7.3 ng/mL for monoethylhexyl phthalate (mEHP). These were similar to concentrations reported for other populations using spot urine specimens. Phthalate levels did not differ between the two sampling days. The Pearson correlation coefficient between the concentrations on the 2 days was 0.8 for mBP, 0.7 for mEHP, 0.6 for mEP, and 0.5 for mBzP. These results suggest that even with the short half-lives of phthalates, women's patterns of exposure may be sufficiently stable to assign an exposure level based on a single first morning void urine measurement. PMID:12003755
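
    A minimal sketch of the between-day comparison described above: creatinine adjustment of a metabolite concentration followed by a Pearson correlation across the two consecutive mornings. Column names and values are invented for illustration.

      import pandas as pd
      from scipy.stats import pearsonr

      # Hypothetical paired metabolite measurements (ng/mL) on two consecutive mornings,
      # with urinary creatinine (mg/dL) for dilution adjustment.
      df = pd.DataFrame({
          "mbp_day1": [40, 55, 62, 38, 90, 47], "creat_day1": [95, 120, 80, 110, 150, 100],
          "mbp_day2": [44, 60, 58, 35, 82, 52], "creat_day2": [100, 115, 90, 105, 140, 95],
      })
      # Dilution-adjusted ratios (units ignored; correlation is scale-invariant).
      adj1 = df["mbp_day1"] / df["creat_day1"]
      adj2 = df["mbp_day2"] / df["creat_day2"]

      r_raw, _ = pearsonr(df["mbp_day1"], df["mbp_day2"])
      r_adj, _ = pearsonr(adj1, adj2)
      print(f"between-day r (raw) = {r_raw:.2f}, creatinine-adjusted r = {r_adj:.2f}")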

  6. Crowdsourcing reproducible seizure forecasting in human and canine epilepsy

    PubMed Central

    Wagenaar, Joost; Abbot, Drew; Adkins, Phillip; Bosshard, Simone C.; Chen, Min; Tieng, Quang M.; He, Jialune; Muñoz-Almaraz, F. J.; Botella-Rocamora, Paloma; Pardo, Juan; Zamora-Martinez, Francisco; Hills, Michael; Wu, Wei; Korshunova, Iryna; Cukierski, Will; Vite, Charles; Patterson, Edward E.; Litt, Brian; Worrell, Gregory A.

    2016-01-01

    See Mormann and Andrzejak (doi:10.1093/brain/aww091) for a scientific commentary on this article.   Accurate forecasting of epileptic seizures has the potential to transform clinical epilepsy care. However, progress toward reliable seizure forecasting has been hampered by lack of open access to long duration recordings with an adequate number of seizures for investigators to rigorously compare algorithms and results. A seizure forecasting competition was conducted on kaggle.com using open access chronic ambulatory intracranial electroencephalography from five canines with naturally occurring epilepsy and two humans undergoing prolonged wide bandwidth intracranial electroencephalographic monitoring. Data were provided to participants as 10-min interictal and preictal clips, with approximately half of the 60 GB data bundle labelled (interictal/preictal) for algorithm training and half unlabelled for evaluation. The contestants developed custom algorithms and uploaded their classifications (interictal/preictal) for the unknown testing data, and a randomly selected 40% of data segments were scored and results broadcasted on a public leader board. The contest ran from August to November 2014, and 654 participants submitted 17 856 classifications of the unlabelled test data. The top performing entry scored 0.84 area under the classification curve. Following the contest, additional held-out unlabelled data clips were provided to the top 10 participants and they submitted classifications for the new unseen data. The resulting area under the classification curves were well above chance forecasting, but did show a mean 6.54 ± 2.45% (min, max: 0.30, 20.2) decline in performance. The kaggle.com model using open access data and algorithms generated reproducible research that advanced seizure forecasting. The overall performance from multiple contestants on unseen data was better than a random predictor, and demonstrates the feasibility of seizure forecasting in canine and

  7. Color accuracy and reproducibility in whole slide imaging scanners

    PubMed Central

    Shrestha, Prarthana; Hulsken, Bas

    2014-01-01

    We propose a workflow for color reproduction in whole slide imaging (WSI) scanners, such that the colors in the scanned images match the actual slide color and the inter-scanner variation is minimal. We describe a new method of preparation and verification of the color phantom slide, consisting of a standard IT8-target transmissive film, which is used in color calibrating and profiling the WSI scanner. We explore several International Color Consortium (ICC) compliant techniques in color calibration/profiling and rendering intents for translating the scanner-specific colors to the standard display (sRGB) color space. Based on the quality of the color reproduction in histopathology slides, we propose the matrix-based calibration/profiling and absolute colorimetric rendering approach. The main advantage of the proposed workflow is that it is compliant with the ICC standard, applicable to color management systems in different platforms, and involves no external color measurement devices. We quantify color difference using the CIE-DeltaE2000 metric, where DeltaE values below 1 are considered imperceptible. Our evaluation on 14 phantom slides, manufactured according to the proposed method, shows an average inter-slide color difference below 1 DeltaE. The proposed workflow is implemented and evaluated in 35 WSI scanners developed at Philips, called the Ultra Fast Scanners (UFS). The color accuracy, measured as DeltaE between the scanner-reproduced colors and the reference colorimetric values of the phantom patches, improves on average from 10 DeltaE in uncalibrated scanners to 3.5 DeltaE in calibrated scanners. The average inter-scanner color difference is found to be 1.2 DeltaE. The improvement in color performance upon using the proposed method is apparent in the visual color quality of the tissue scans. PMID:26158041
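
    A minimal sketch of the DeltaE2000 comparison between reference patch colors and their scanner-reproduced values, using scikit-image's CIEDE2000 implementation; the patch RGB values are illustrative, not measurements from the paper.

      import numpy as np
      from skimage.color import rgb2lab, deltaE_ciede2000

      # Reference sRGB values of a few phantom patches and hypothetical scanner-reproduced values.
      reference_rgb = np.array([[[0.80, 0.10, 0.10], [0.10, 0.60, 0.20], [0.95, 0.95, 0.90]]])
      scanned_rgb   = np.array([[[0.78, 0.12, 0.11], [0.12, 0.58, 0.22], [0.93, 0.96, 0.91]]])

      # Convert to CIELAB and compute the per-patch CIEDE2000 color difference.
      delta_e = deltaE_ciede2000(rgb2lab(reference_rgb), rgb2lab(scanned_rgb))
      print("per-patch dE2000:", np.round(delta_e, 2), " mean:", round(float(delta_e.mean()), 2))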

  8. Development of a Consistent and Reproducible Porcine Scald Burn Model.

    PubMed

    Andrews, Christine J; Kempf, Margit; Kimble, Roy; Cuttle, Leila

    2016-01-01

    There are very few porcine burn models that replicate scald injuries similar to those encountered by children. We have developed a robust porcine burn model capable of creating reproducible scald burns for a wide range of burn conditions. The study was conducted with juvenile Large White pigs, creating replicates of burn combinations: 50°C for 1, 2, 5 and 10 minutes and 60°C, 70°C, 80°C and 90°C for 5 seconds. Visual wound examination, biopsies and Laser Doppler Imaging were performed at 1 and 24 hours and at 3 and 7 days post-burn. A consistent water temperature was maintained within the scald device for long durations (49.8 ± 0.1°C when set at 50°C). The macroscopic and histologic appearance was consistent between replicates of burn conditions. For 50°C water, 10 minute duration burns showed significantly deeper tissue injury than all shorter durations at 24 hours post-burn (p ≤ 0.0001), with damage seen to increase until day 3 post-burn. For 5 second duration burns, by day 7 post-burn the 80°C and 90°C scalds had damage detected significantly deeper in the tissue than the 70°C scalds (p ≤ 0.001). A reliable and safe model of porcine scald burn injury has been successfully developed. The novel apparatus with continually refreshed water improves consistency of scald creation for long exposure times. This model allows the pathophysiology of scald burn wound creation and progression to be examined. PMID:27612153

  9. Crowdsourcing reproducible seizure forecasting in human and canine epilepsy

    PubMed Central

    Wagenaar, Joost; Abbot, Drew; Adkins, Phillip; Bosshard, Simone C.; Chen, Min; Tieng, Quang M.; He, Jialune; Muñoz-Almaraz, F. J.; Botella-Rocamora, Paloma; Pardo, Juan; Zamora-Martinez, Francisco; Hills, Michael; Wu, Wei; Korshunova, Iryna; Cukierski, Will; Vite, Charles; Patterson, Edward E.; Litt, Brian; Worrell, Gregory A.

    2016-01-01

    See Mormann and Andrzejak (doi:10.1093/brain/aww091) for a scientific commentary on this article.   Accurate forecasting of epileptic seizures has the potential to transform clinical epilepsy care. However, progress toward reliable seizure forecasting has been hampered by lack of open access to long duration recordings with an adequate number of seizures for investigators to rigorously compare algorithms and results. A seizure forecasting competition was conducted on kaggle.com using open access chronic ambulatory intracranial electroencephalography from five canines with naturally occurring epilepsy and two humans undergoing prolonged wide bandwidth intracranial electroencephalographic monitoring. Data were provided to participants as 10-min interictal and preictal clips, with approximately half of the 60 GB data bundle labelled (interictal/preictal) for algorithm training and half unlabelled for evaluation. The contestants developed custom algorithms and uploaded their classifications (interictal/preictal) for the unknown testing data, and a randomly selected 40% of data segments were scored and results broadcasted on a public leader board. The contest ran from August to November 2014, and 654 participants submitted 17 856 classifications of the unlabelled test data. The top performing entry scored 0.84 area under the classification curve. Following the contest, additional held-out unlabelled data clips were provided to the top 10 participants and they submitted classifications for the new unseen data. The resulting area under the classification curves were well above chance forecasting, but did show a mean 6.54 ± 2.45% (min, max: 0.30, 20.2) decline in performance. The kaggle.com model using open access data and algorithms generated reproducible research that advanced seizure forecasting. The overall performance from multiple contestants on unseen data was better than a random predictor, and demonstrates the feasibility of seizure forecasting in canine and

  10. Enhancing reproducibility of ultrasonic measurements by new users

    NASA Astrophysics Data System (ADS)

    Pramanik, Manojit; Gupta, Madhumita; Krishnan, Kajoli Banerjee

    2013-03-01

    Operator perception influences ultrasound image acquisition and processing. Lower costs are attracting new users to medical ultrasound. Anticipating an increase in this trend, we conducted a study to quantify the variability in ultrasonic measurements made by novice users and identify methods to reduce it. We designed a protocol with four presets and trained four new users to scan and manually measure the head circumference of a fetal phantom with an ultrasound scanner. In the first phase, the users followed this protocol in seven distinct sessions. They then received feedback on the quality of the scans from an expert. In the second phase, two of the users repeated the entire protocol aided by visual cues provided to them during scanning. We performed off-line measurements on all the images using a fully automated algorithm capable of measuring the head circumference from fetal phantom images. The ground truth (198.1±1.6 mm) was based on sixteen scans and measurements made by an expert. Our analysis shows that: (1) the inter-observer variability of manual measurements was 5.5 mm, whereas the inter-observer variability of automated measurements was only 0.6 mm in the first phase; (2) consistency of image appearance improved and mean manual measurements were 4-5 mm closer to the ground truth in the second phase; and (3) automated measurements were more precise, accurate and less sensitive to different presets compared to manual measurements in both phases. Our results show that visual aids and automation can bring more reproducibility to ultrasonic measurements made by new users.
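
    A minimal sketch of the inter-observer comparison described above: the spread of per-observer mean head-circumference measurements for manual versus automated readings against the phantom ground truth. The measurement values are invented; only the 198.1 mm reference comes from the abstract.

      import numpy as np

      ground_truth = 198.1  # mm, expert-derived reference for the fetal phantom

      # Hypothetical head-circumference measurements (mm) by 4 novice users, 7 sessions each.
      manual = np.array([[203, 200, 205, 199, 206, 202, 204],
                         [193, 195, 192, 196, 194, 191, 195],
                         [201, 198, 202, 199, 203, 200, 198],
                         [197, 196, 199, 195, 198, 197, 196]], dtype=float)
      # Hypothetical automated measurements on the same images: small noise around ground truth.
      automated = ground_truth + np.random.default_rng(0).normal(0.0, 0.3, manual.shape)

      def inter_observer_variability(x):
          """Range of per-observer means -- one simple expression of inter-observer variability."""
          per_observer_mean = x.mean(axis=1)
          return per_observer_mean.max() - per_observer_mean.min()

      print("manual variability   :", round(inter_observer_variability(manual), 1), "mm")
      print("automated variability:", round(inter_observer_variability(automated), 1), "mm")
      print("manual bias vs ground truth:", round(manual.mean() - ground_truth, 1), "mm")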

  11. Is the Sciatic Functional Index always reliable and reproducible?

    PubMed

    Monte-Raso, Vanessa Vilela; Barbieri, Cláudio Henrique; Mazzer, Nilton; Yamasita, Alexandre Calura; Barbieri, Giuliano

    2008-05-30

    The Sciatic Functional Index (SFI) is a quite useful tool for the evaluation of functional recovery of the sciatic nerve of rats in a number of experimental injuries and treatments. Although it is an objective method, it depends on the examiner's ability to adequately recognize and mark the previously established footprint key points, which is an entirely subjective step, thus potentially interfering with the calculations according to the mathematical formulae proposed by different authors. Thus, an interpersonal evaluation of the reproducibility of an SFI computer-aided method was carried out here to study data variability. A severe crush injury was produced on a 5 mm-long segment of the right sciatic nerve of 20 Wistar rats (a 5000 g load directly applied for 10 min) and the SFI was measured by four different examiners (an experienced one and three newcomers) preoperatively and at weekly intervals from the 1st to the 8th postoperative week. Three measurements were made for each print and the average was calculated and used for statistical analysis. The results showed that interpersonal correlation was high (0.82) in the 3rd, 4th, 5th, 7th and 8th weeks, with an unexpected but significant (p<0.01) drop in the 6th week. There was virtually no interpersonal correlation (correlation index close to 0) on the 1st and 2nd weeks, a period during which the variability between animals and examiners (p=0.24 and 0.32, respectively) was similar, certainly due to a poor definition of the footprints. The authors conclude that the SFI method studied here is only reliable from the 3rd week on after a severe lesion of the sciatic nerve of rats.
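
    The abstract does not give the SFI formula used; the sketch below applies the commonly cited Bain-Mackinnon-Hunter formulation as an assumption, computed from experimental (E) and normal (N) footprint measurements.

      def sciatic_functional_index(epl, npl, ets, nts, eit, nit):
          """Sciatic Functional Index from footprint measurements, using the commonly cited
          Bain-Mackinnon-Hunter formulation (an assumption; the abstract does not state the formula).
          PL = print length, TS = toe spread (1st-5th), IT = intermediary toe spread (2nd-4th);
          E = experimental (injured) side, N = normal side. SFI ~ 0 is normal, ~ -100 is complete loss."""
          return (-38.3 * (epl - npl) / npl
                  + 109.5 * (ets - nts) / nts
                  + 13.3 * (eit - nit) / nit
                  - 8.8)

      # Illustrative footprint measurements (mm):
      print(round(sciatic_functional_index(epl=32, npl=25, ets=9, nts=18, eit=5, nit=9), 1))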

  12. ANALYTICAL STAR FORMATION RATE FROM GRAVOTURBULENT FRAGMENTATION

    SciTech Connect

    Hennebelle, Patrick; Chabrier, Gilles

    2011-12-20

    We present an analytical determination of the star formation rate (SFR) in molecular clouds, based on a time-dependent extension of our analytical theory of the stellar initial mass function. The theory yields SFRs in good agreement with observations, suggesting that turbulence is the dominant, initial process responsible for star formation. In contrast to previous SFR theories, the present one does not invoke an ad hoc density threshold for star formation; instead, the SFR continuously increases with gas density, naturally yielding two different characteristic regimes, thus two different slopes in the SFR versus gas density relationship, in agreement with observational determinations. Besides the complete SFR derivation, we also provide a simplified expression, which reproduces the complete calculations reasonably well and can easily be used for quick determinations of SFRs in cloud environments. A key property at the heart of both our complete and simplified theory is that the SFR involves a density-dependent dynamical time, characteristic of each collapsing (prestellar) overdense region in the cloud, instead of one single mean or critical freefall timescale. Unfortunately, the SFR also depends on some ill-determined parameters, such as the core-to-star mass conversion efficiency and the crossing timescale. Although we provide estimates for these parameters, their uncertainty hampers a precise quantitative determination of the SFR, within less than a factor of a few.

  13. Meal Replacement Mass Reduction and Integration Acceptability Study

    NASA Technical Reports Server (NTRS)

    Sirmons, T.; Barrett, A.; Richardson, M.; Arias, D.; Schneiderman, J.; Slack, K.; Williams, T.; Douglas, G.

    2017-01-01

    NASA, in planning for long-duration missions, has an imperative to provide a food system with the necessary nutrition, acceptability, and safety to ensure sustainment of crew health and performance. The Orion Multi-Purpose Crew Vehicle (MPCV) and future exploration missions are mass constrained; therefore the team is challenged to reduce the mass of the food system by 10% while maintaining product safety, nutrition, and acceptability. Commercially available products do not meet the nutritional requirements for a full meal replacement in the spaceflight food system, and it is currently unknown if daily meal replacements will impact crew food intake and psychosocial health over time. The purpose of this study was to develop a variety of nutritionally balanced breakfast replacement bars that meet spaceflight nutritional, microbiological, sensorial, and shelf-life requirements, while enabling a 10% savings in food mass. To date, six nutrient-dense meal replacement bars (approximately 700 calories per bar) have been developed, using traditional methods of compression as well as novel ultrasonic compression technologies developed by Creative Resonance Inc. (Phoenix, AZ). The four highest-rated bars were evaluated in the Human Exploration Research Analog (HERA) to assess the frequency with which meal replacement options might realistically be implemented, and specifically the overall impact of the bars on mood, satiety, digestive discomfort, and satisfaction with food. These factors are currently being analyzed to inform successful implementation strategies in which crew maintain adequate food intake. In addition, these bars are currently undergoing shelf-life testing to determine long-term sensory acceptability, nutritional stability, qualitative stability of analytical measurements (i.e. water activity and texture), and microbiological compliance over two years of storage at room temperature and potential temperature abuse conditions to predict long-term acceptability. It is expected that

  14. Has analytical flexibility increased in imaging studies of bipolar disorder and major depression?

    PubMed

    Munafò, M R; Kempton, M J

    2015-02-01

    There has been extensive discussion of problems of reproducibility of research. Analytical flexibility may contribute to this, by increasing the likelihood that a reported finding represents a chance result. We explored whether analytical flexibility has increased over time, using human imaging studies of bipolar disorder and major depression. Our results indicate that the number of measures collected per study has increased over time for studies of bipolar disorder, but not for studies of major depression.

  15. A rapid, reproducible, noninvasive predictor of liver graft survival

    PubMed Central

    Zarrinpar, Ali; Lee, Coney; Noguchi, Emily; Yersiz, Hasan; Agopian, Vatche G.; Kaldas, Fady M.; Farmer, Douglas G.; Busuttil, Ronald W.

    2016-01-01

    Background: Clinical and laboratory criteria are not reliable predictors of deceased donor liver graft quality. Intraoperative assessment of experienced surgeons is the gold standard. Standardizing and quantifying this assessment is especially needed now that regional sharing is the rule. We prospectively evaluated a novel, simple, rapid, noninvasive, quantitative measure of liver function performed before graft procurement. Materials and methods: Using a portable, finger-probe–based device, indocyanine green plasma disappearance rates (ICG-PDR) were measured in adult brain-dead donors in the local donor service area before organ procurement. Results were compared with graft function and outcomes. Both donor and recipient teams were blinded to ICG-PDR measurements. Results: Measurements were performed on 53 consecutive donors. Eleven liver grafts were declined by all centers because of quality; the other 42 grafts were transplanted. Logistic regression analysis showed ICG-PDR to be the only donor variable to be significantly associated with 7-d graft survival. Donor risk index, donor age, and transaminase levels at peak or procurement were not significantly associated with 7-d graft survival. Conclusions: We report the successful use of a portable quantitative means of measuring liver function and its association with graft survival. These data warrant further exploration in a variety of settings to evaluate acceptable values for donated liver grafts. PMID:25940156
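
    A minimal sketch of the kind of association reported (7-day graft survival regressed on donor ICG-PDR), using scikit-learn logistic regression on invented data; the values and the resulting odds ratio are illustrative only.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Invented donor ICG plasma disappearance rates (%/min) and 7-day graft survival (1 = survived).
      icg_pdr = np.array([8.5, 10.2, 12.0, 13.1, 14.4, 15.0, 16.2, 17.5, 18.3,
                          19.1, 20.4, 21.0, 22.7, 23.5, 25.1, 26.8, 28.0, 30.2])
      survival = np.array([0, 0, 0, 1, 0, 1, 1, 1, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1])

      model = LogisticRegression().fit(icg_pdr.reshape(-1, 1), survival)
      odds_ratio = float(np.exp(model.coef_[0, 0]))   # odds ratio per 1 %/min increase in ICG-PDR
      print(f"odds ratio for 7-day graft survival per unit ICG-PDR: {odds_ratio:.2f}")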

  16. 12 CFR 250.164 - Bankers' acceptances.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 3 2011-01-01 2011-01-01 false Bankers' acceptances. 250.164 Section 250.164 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM MISCELLANEOUS INTERPRETATIONS Interpretations § 250.164 Bankers' acceptances. (a) Section 207 of the Bank...

  17. 48 CFR 3011.103 - Market acceptance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 7 2011-10-01 2011-10-01 false Market acceptance. 3011.103 Section 3011.103 Federal Acquisition Regulations System DEPARTMENT OF HOMELAND SECURITY, HOMELAND... Developing Requirements Documents 3011.103 Market acceptance. (a) Contracting officers may act on behalf...

  18. 48 CFR 411.103 - Market acceptance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 4 2011-10-01 2011-10-01 false Market acceptance. 411.103... ACQUISITION PLANNING DESCRIBING AGENCY NEEDS Selecting and Developing Requirements Documents 411.103 Market... accordance with FAR 11.103(a), the market acceptability of their items to be offered. (b) The...

  19. 48 CFR 3011.103 - Market acceptance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Market acceptance. 3011.103 Section 3011.103 Federal Acquisition Regulations System DEPARTMENT OF HOMELAND SECURITY, HOMELAND... Developing Requirements Documents 3011.103 Market acceptance. (a) Contracting officers may act on behalf...

  20. 48 CFR 411.103 - Market acceptance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Market acceptance. 411.103... ACQUISITION PLANNING DESCRIBING AGENCY NEEDS Selecting and Developing Requirements Documents 411.103 Market... accordance with FAR 11.103(a), the market acceptability of their items to be offered. (b) The...

  1. 12 CFR 615.5550 - Bankers' acceptances.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Banks and Banking FARM CREDIT ADMINISTRATION FARM CREDIT SYSTEM FUNDING AND FISCAL AFFAIRS, LOAN POLICIES AND OPERATIONS, AND FUNDING OPERATIONS Bankers' Acceptances § 615.5550 Bankers' acceptances. Banks... cooperatives' board of directors, under established policies, may delegate this authority to management....

  2. Mindfulness, Acceptance and Catastrophizing in Chronic Pain

    PubMed Central

    de Boer, Maaike J.; Steinhagen, Hannemike E.; Versteegen, Gerbrig J.; Struys, Michel M. R. F.; Sanderman, Robbert

    2014-01-01

    Objectives: Catastrophizing is often the primary target of the cognitive-behavioral treatment of chronic pain. Recent literature on acceptance and commitment therapy (ACT) suggests an important role in the pain experience for the concepts of mindfulness and acceptance. The aim of this study is to examine the influence of mindfulness and general psychological acceptance on pain-related catastrophizing in patients with chronic pain. Methods: A cross-sectional survey was conducted, including 87 chronic pain patients from an academic outpatient pain center. Results: The results show that general psychological acceptance (measured with the AAQ-II) is a strong predictor of pain-related catastrophizing, independent of gender, age and pain intensity. Mindfulness (measured with the MAAS) did not predict levels of pain-related catastrophizing. Discussion: Acceptance of psychological experiences outside of pain itself is related to catastrophizing. Thus, acceptance seems to play a role in the pain experience and should be part of the treatment of chronic pain. The focus of the ACT treatment of chronic pain does not necessarily have to be on acceptance of pain per se, but may be aimed at acceptance of unwanted experiences in general. Mindfulness in the sense of “acting with awareness” is, however, not related to catastrophizing. Based on our research findings in comparison with those of other authors, we recommend a broader conceptualization of mindfulness and the use of a multifaceted questionnaire for mindfulness instead of the unidimensional MAAS. PMID:24489915

  3. Consumer acceptance of ginseng food products.

    PubMed

    Chung, Hee Sook; Lee, Young-Chul; Rhee, Young Kyung; Lee, Soo-Yeun

    2011-01-01

    Ginseng has been utilized less in food products than in dietary supplements in the United States. Sensory acceptance of ginseng food products by U.S. consumers has not been reported. The objectives of this study were to: (1) determine the sensory acceptance of commercial ginseng food products and (2) assess the influence of the addition of sweeteners to ginseng tea and ginseng extract to chocolate on consumer acceptance. A total of 126 consumers participated in 3 sessions for (1) 7 commercial red ginseng food products, (2) 10 ginseng teas varying in levels of sugar or honey, and (3) 10 ginseng milk or dark chocolates varying in levels of ginseng extract. Ginseng candy with vitamin C and ginseng crunchy white chocolate were the most highly accepted, while the sliced ginseng root product was the least accepted among the seven commercial products. Sensory acceptance increased in proportion to the content of sugar and honey in ginseng tea, whereas acceptance decreased with increasing content of ginseng extract in milk and dark chocolates. Findings demonstrate that ginseng food product types with which consumers are already familiar, such as candy and chocolate, will have potential for success in the U.S. market. Chocolate could be suggested as a food matrix into which ginseng can be incorporated, as it contains more bioactive compounds than ginseng tea at a similar acceptance level. Future research may include a descriptive analysis with ginseng-based products to identify the key drivers of liking and disliking for successful new product development.

  4. 36 CFR 251.62 - Acceptance.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 251.62 Parks, Forests, and Public Property FOREST SERVICE, DEPARTMENT OF AGRICULTURE LAND USES Special Uses § 251.62 Acceptance. Except for an easement, a special use authorization shall become effective... extended by the authorized officer. Refusal of an applicant to sign and accept a special use...

  5. 36 CFR 251.62 - Acceptance.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 251.62 Parks, Forests, and Public Property FOREST SERVICE, DEPARTMENT OF AGRICULTURE LAND USES Special Uses § 251.62 Acceptance. Except for an easement, a special use authorization shall become effective... extended by the authorized officer. Refusal of an applicant to sign and accept a special use...

  6. 36 CFR 251.62 - Acceptance.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 251.62 Parks, Forests, and Public Property FOREST SERVICE, DEPARTMENT OF AGRICULTURE LAND USES Special Uses § 251.62 Acceptance. Except for an easement, a special use authorization shall become effective... extended by the authorized officer. Refusal of an applicant to sign and accept a special use...

  7. 36 CFR 251.62 - Acceptance.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 251.62 Parks, Forests, and Public Property FOREST SERVICE, DEPARTMENT OF AGRICULTURE LAND USES Special Uses § 251.62 Acceptance. Except for an easement, a special use authorization shall become effective... extended by the authorized officer. Refusal of an applicant to sign and accept a special use...

  8. 36 CFR 251.62 - Acceptance.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 251.62 Parks, Forests, and Public Property FOREST SERVICE, DEPARTMENT OF AGRICULTURE LAND USES Special Uses § 251.62 Acceptance. Except for an easement, a special use authorization shall become effective... extended by the authorized officer. Refusal of an applicant to sign and accept a special use...

  9. Improving Acceptance of Automated Counseling Procedures.

    ERIC Educational Resources Information Center

    Johnson, James H.; And Others

    This paper discusses factors that may influence the acceptance of automated counseling procedures by the military. A consensual model of the change process is presented which structures organizational readiness, the change strategy, and acceptance as integrated variables to be considered in a successful installation. A basic introduction to the…

  10. 48 CFR 11.103 - Market acceptance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... supported by market research; (4) Include consideration of items supplied satisfactorily under recent or... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Market acceptance. 11.103... DESCRIBING AGENCY NEEDS Selecting and Developing Requirements Documents 11.103 Market acceptance. (a)...

  11. 49 CFR 193.2303 - Construction acceptance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 3 2010-10-01 2010-10-01 false Construction acceptance. 193.2303 Section 193.2303 Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY...: FEDERAL SAFETY STANDARDS Construction § 193.2303 Construction acceptance. No person may place in...

  12. 7 CFR 1205.326 - Acceptance.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 10 2012-01-01 2012-01-01 false Acceptance. 1205.326 Section 1205.326 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (MARKETING AGREEMENTS... Research and Promotion Order Cotton Board § 1205.326 Acceptance. Any person selected by the Secretary as...

  13. 7 CFR 1205.326 - Acceptance.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 10 2010-01-01 2010-01-01 false Acceptance. 1205.326 Section 1205.326 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (MARKETING AGREEMENTS... Research and Promotion Order Cotton Board § 1205.326 Acceptance. Any person selected by the Secretary as...

  14. 12 CFR 250.164 - Bankers' acceptances.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 4 2012-01-01 2012-01-01 false Bankers' acceptances. 250.164 Section 250.164... reserve requirements under section 7 of the International Banking Act of 1978 (12 U.S.C. 3105). The Board..., Form FR Y-7, are also to be used in the calculation of the acceptance limits applicable to...

  15. 16 CFR 1110.5 - Acceptable certificates.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 2 2010-01-01 2010-01-01 false Acceptable certificates. 1110.5 Section 1110.5 Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION CONSUMER PRODUCT SAFETY ACT REGULATIONS CERTIFICATES OF COMPLIANCE § 1110.5 Acceptable certificates. A certificate that is in hard copy or...

  16. Enzyme Reactions and Acceptability of Plant Foods.

    ERIC Educational Resources Information Center

    Palmer, James K.

    1984-01-01

    Provides an overview of enzyme reactions which contribute to the character and acceptability of plant foods. A detailed discussion of polyphenoloxidase is also provided as an example of an enzyme which can markedly affect the character and acceptability of such foods. (JN)

  17. Consumer acceptance of ginseng food products.

    PubMed

    Chung, Hee Sook; Lee, Young-Chul; Rhee, Young Kyung; Lee, Soo-Yeun

    2011-01-01

    Ginseng has been utilized less in food products than in dietary supplements in the United States. Sensory acceptance of ginseng food products by U.S. consumers has not been reported. The objectives of this study were to: (1) determine the sensory acceptance of commercial ginseng food products and (2) assess the influence of the addition of sweeteners to ginseng tea and ginseng extract to chocolate on consumer acceptance. A total of 126 consumers participated in 3 sessions for (1) 7 commercial red ginseng food products, (2) 10 ginseng teas varying in levels of sugar or honey, and (3) 10 ginseng milk or dark chocolates varying in levels of ginseng extract. Ginseng candy with vitamin C and ginseng crunchy white chocolate were the most highly accepted, while the sliced ginseng root product was the least accepted among the seven commercial products. Sensory acceptance increased in proportion to the content of sugar and honey in ginseng tea, whereas acceptance decreased with increasing content of ginseng extract in milk and dark chocolates. Findings demonstrate that ginseng food product types with which consumers are already familiar, such as candy and chocolate, will have potential for success in the U.S. market. Chocolate could be suggested as a food matrix into which ginseng can be incorporated, as it contains more bioactive compounds than ginseng tea at a similar acceptance level. Future research may include a descriptive analysis with ginseng-based products to identify the key drivers of liking and disliking for successful new product development. PMID:22416723

  18. Heavy Metal, Religiosity, and Suicide Acceptability.

    ERIC Educational Resources Information Center

    Stack, Steven

    1998-01-01

    Reports on data taken from the General Social Survey that found a link between "heavy metal" rock fanship and suicide acceptability. Finds that relationship becomes nonsignificant once level of religiosity is controlled. Heavy metal fans are low in religiosity, which contributes to greater suicide acceptability. (Author/JDM)

  19. Nevada Test Site Waste Acceptance Criteria (NTSWAC)

    SciTech Connect

    NNSA /NSO Waste Management Project

    2008-06-01

    This document establishes the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office, Nevada Test Site Waste Acceptance Criteria (NTSWAC). The NTSWAC provides the requirements, terms, and conditions under which the Nevada Test Site will accept low-level radioactive waste (LLW) and LLW Mixed Waste (MW) for disposal.

  20. Assessment of a climate model to reproduce rainfall variability and extremes over Southern Africa

    NASA Astrophysics Data System (ADS)

    Williams, C. J. R.; Kniveton, D. R.; Layberry, R.

    2010-01-01

    It is increasingly accepted that any possible climate change will not only have an influence on mean climate but may also significantly alter climatic variability. A change in the distribution and magnitude of extreme rainfall events (associated with changing variability), such as droughts or flooding, may have a far greater impact on human and natural systems than a changing mean. This issue is of particular importance for environmentally vulnerable regions such as southern Africa. The sub-continent is considered especially vulnerable to and ill-equipped (in terms of adaptation) for extreme events, due to a number of factors including extensive poverty, famine, disease and political instability. Rainfall variability and the identification of rainfall extremes are a function of scale, so high spatial and temporal resolution data are preferred to identify extreme events and accurately predict future variability. The majority of previous climate model verification studies have compared model output with observational data at monthly timescales. In this research, the assessment of the ability of a state-of-the-art climate model to simulate climate at daily timescales is carried out using satellite-derived rainfall data from the Microwave Infrared Rainfall Algorithm (MIRA). This dataset covers the period from 1993 to 2002 and the whole of southern Africa at a spatial resolution of 0.1° longitude/latitude. This paper concentrates primarily on the ability of the model to simulate the spatial and temporal patterns of present-day rainfall variability over southern Africa and is not intended to discuss possible future changes in climate as these have been documented elsewhere. Simulations of current climate from the UK Meteorological Office Hadley Centre's climate model, in both regional and global mode, are first compared to the MIRA dataset at daily timescales. Secondly, the ability of the model to reproduce daily rainfall extremes is assessed, again by a comparison with
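
    A minimal sketch of a daily-extreme comparison of the sort described above, computing two simple indices (the 95th percentile of wet-day rainfall and the mean annual 1-day maximum) for a model series versus an observed series; both series here are synthetic stand-ins, not MIRA or Hadley Centre output.

      import numpy as np

      rng = np.random.default_rng(0)
      # Synthetic stand-ins for ten years of daily rainfall (mm) at one grid cell.
      obs_daily   = rng.gamma(shape=0.4, scale=8.0, size=3650)   # "observed" (e.g. satellite-derived)
      model_daily = rng.gamma(shape=0.4, scale=7.0, size=3650)   # "model" simulation

      def extreme_indices(daily_mm, wet_threshold=1.0):
          """95th percentile of wet-day rainfall (an R95p-style threshold) and
          the mean annual 1-day maximum (an Rx1day-style index)."""
          wet = daily_mm[daily_mm >= wet_threshold]
          r95 = np.percentile(wet, 95)
          rx1day = daily_mm.reshape(10, 365).max(axis=1).mean()
          return r95, rx1day

      print("observed:", np.round(extreme_indices(obs_daily), 1))
      print("model   :", np.round(extreme_indices(model_daily), 1))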