The quest for improved reproducibility in MALDI mass spectrometry.
O'Rourke, Matthew B; Djordjevic, Steven P; Padula, Matthew P
2018-03-01
Reproducibility has been one of the biggest hurdles faced when attempting to develop quantitative protocols for MALDI mass spectrometry. The heterogeneous nature of sample recrystallization has made automated sample acquisition somewhat "hit and miss", with manual intervention needed to ensure that all sample spots have been analyzed. In this review, we explore the last 30 years of literature and anecdotal evidence that has attempted to address and improve reproducibility in MALDI MS. Though many methods have been attempted, we have discovered a significant publication history surrounding the use of nitrocellulose as a substrate to improve homogeneity of crystal formation and therefore reproducibility. We therefore propose that this is the most promising avenue of research for developing a comprehensive and universal preparation protocol for quantitative MALDI MS analysis. © 2016 Wiley Periodicals, Inc. Mass Spec Rev 37:217-228, 2018.
Reproducible and controllable induction voltage adder for scaled beam experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sakai, Yasuo; Nakajima, Mitsuo; Horioka, Kazuhiko
2016-08-15
A reproducible and controllable induction adder was developed using solid-state switching devices and Finemet cores for scaled beam compression experiments. A gate controlled MOSFET circuit was developed for the controllable voltage driver. The MOSFET circuit drove the induction adder at low magnetization levels of the cores which enabled us to form reproducible modulation voltages with jitter less than 0.3 ns. Preliminary beam compression experiments indicated that the induction adder can improve the reproducibility of modulation voltages and advance the beam physics experiments.
Borries, Carola; Sandel, Aaron A; Koenig, Andreas; Fernandez-Duque, Eduardo; Kamilar, Jason M; Amoroso, Caroline R; Barton, Robert A; Bray, Joel; Di Fiore, Anthony; Gilby, Ian C; Gordon, Adam D; Mundry, Roger; Port, Markus; Powell, Lauren E; Pusey, Anne E; Spriggs, Amanda; Nunn, Charles L
2016-09-01
Recent decades have seen rapid development of new analytical methods to investigate patterns of interspecific variation. Yet these cutting-edge statistical analyses often rely on data of questionable origin, varying accuracy, and weak comparability, which seem to have reduced the reproducibility of studies. It is time to improve the transparency of comparative data while also making these improved data more widely available. We, the authors, met to discuss how transparency, usability, and reproducibility of comparative data can best be achieved. We propose four guiding principles: 1) data identification with explicit operational definitions and complete descriptions of methods; 2) inclusion of metadata that capture key characteristics of the data, such as sample size, geographic coordinates, and nutrient availability (for example, captive versus wild animals); 3) documentation of the original reference for each datum; and 4) facilitation of effective interactions with the data via user-friendly and transparent interfaces. We urge reviewers, editors, publishers, database developers and users, funding agencies, researchers publishing their primary data, and those performing comparative analyses to embrace these standards to increase the transparency, usability, and reproducibility of comparative studies. © 2016 Wiley Periodicals, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steinmann, Vera; Chakraborty, Rupak; Rekemeyer, Paul
2016-11-21
As novel absorber materials are developed and screened for their photovoltaic (PV) properties, the challenge remains to rapidly test promising candidates in high-performing PV devices. There is a need to engineer new compatible device architectures, including the development of novel transparent conductive oxides and buffer layers. Here, we consider the two approaches of a substrate-style and a superstrate-style device architecture for novel thin-film solar cells. We use tin sulfide as a test absorber material. Upon device engineering, we demonstrate new approaches to improve device performance and performance reproducibility.
ERIC Educational Resources Information Center
State Fair Community Coll., Sedalia, MO.
Five objectives are reported for a project to develop and test effective procedures for designing, field testing, reproducing, and disseminating individualized mediated instructional materials: (1) improvement of teacher input, (2) development of individualized instruction modules, (3) development of methodology for evaluating the effectiveness of…
Where next for the reproducibility agenda in computational biology?
Lewis, Joanna; Breeze, Charles E; Charlesworth, Jane; Maclaren, Oliver J; Cooper, Jonathan
2016-07-15
The concept of reproducibility is a foundation of the scientific method. With the arrival of fast and powerful computers over the last few decades, there has been an explosion of results based on complex computational analyses and simulations. The reproducibility of these results has been addressed mainly in terms of exact replicability or numerical equivalence, ignoring the wider issue of the reproducibility of conclusions through equivalent, extended or alternative methods. We use case studies from our own research experience to illustrate how concepts of reproducibility might be applied in computational biology. Several fields have developed 'minimum information' checklists to support the full reporting of computational simulations, analyses and results, and standardised data formats and model description languages can facilitate the use of multiple systems to address the same research question. We note the importance of defining the key features of a result to be reproduced, and the expected agreement between original and subsequent results. Dynamic, updatable tools for publishing methods and results are becoming increasingly common, but sometimes come at the cost of clear communication. In general, the reproducibility of computational research is improving but would benefit from additional resources and incentives. We conclude with a series of linked recommendations for improving reproducibility in computational biology through communication, policy, education and research practice. More reproducible research will lead to higher quality conclusions, deeper understanding and more valuable knowledge.
Electron tubes for industrial applications
NASA Astrophysics Data System (ADS)
Gellert, Bernd
1994-05-01
This report reviews research and development efforts in recent years for vacuum electron tubes, in particular power grid tubes for industrial applications. Physical and chemical effects are discussed that determine the performance of today's devices. Due to the progress made in the fundamental understanding of materials and in newly developed processes, the reliability and reproducibility of power grid tubes have been improved considerably. Modern computer-controlled manufacturing methods ensure a high reproducibility of production, and continuous quality certification according to ISO 9001 guarantees future high quality standards. Some typical applications of these tubes are given as examples.
Sampling-based ensemble segmentation against inter-operator variability
NASA Astrophysics Data System (ADS)
Huo, Jing; Okada, Kazunori; Pope, Whitney; Brown, Matthew
2011-03-01
Inconsistency and a lack of reproducibility are commonly associated with semi-automated segmentation methods. In this study, we developed an ensemble approach to improve reproducibility and applied it to glioblastoma multiforme (GBM) brain tumor segmentation on T1-weighted contrast-enhanced MR volumes. The proposed approach combines sampling-based simulations and ensemble segmentation into a single framework; it generates a set of segmentations by perturbing user initialization and user-specified internal parameters, then fuses the set of segmentations into a single consensus result. Three combination algorithms were applied: majority voting, averaging and expectation-maximization (EM). The reproducibility of the proposed framework was evaluated by a controlled experiment on 16 tumor cases from a multicenter drug trial. The ensemble framework had significantly better reproducibility than the individual base Otsu thresholding method (p<.001).
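The fusion step described above can be sketched as follows. This is a minimal illustration of majority voting over binary masks, not the authors' implementation, and the mask data are hypothetical:

```python
# Majority-vote fusion of an ensemble of binary segmentation masks.
# Each mask is a 2D list of 0/1 labels produced by one perturbed run
# of the base segmentation method.

def majority_vote(masks):
    """Fuse binary masks: a pixel is foreground only if more than
    half of the ensemble members label it foreground."""
    n = len(masks)
    rows, cols = len(masks[0]), len(masks[0][0])
    fused = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            votes = sum(m[r][c] for m in masks)
            fused[r][c] = 1 if votes * 2 > n else 0
    return fused

# Three perturbed segmentations of a toy 2x3 image
ensemble = [
    [[1, 1, 0], [0, 1, 0]],
    [[1, 0, 0], [0, 1, 1]],
    [[1, 1, 0], [1, 1, 0]],
]
print(majority_vote(ensemble))  # [[1, 1, 0], [0, 1, 0]]
```

Averaging and EM-based fusion replace the vote count with a mean of soft labels or an iteratively estimated per-member reliability, respectively.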
Austin Community College Study Guide Series.
ERIC Educational Resources Information Center
Academic Therapy, 1989
1989-01-01
A series of study guides, which were developed to assist students in improving study and library skills, is reproduced for teachers' use. The guides cover test-taking skills; writing business letters; selecting a writing topic; plagiarism; developing search strategies; documentation; and locating dictionaries, periodical articles, books,…
Semiautomated digital analysis of knee joint space width using MR images.
Agnesi, Filippo; Amrami, Kimberly K; Frigo, Carlo A; Kaufman, Kenton R
2007-05-01
The goal of this study was to (a) develop a semiautomated computer algorithm to measure knee joint space width (JSW) from magnetic resonance (MR) images using standard imaging techniques and (b) evaluate the reproducibility of the algorithm. Using a standard clinical imaging protocol, bilateral knee MR images were obtained twice within a 2-week period from 17 asymptomatic research participants. Images were analyzed to determine the variability of the measurements performed by the program compared with the variability of manual measurements. Measurement variability of the computer algorithm was considerably smaller than the variability of manual measurements. The average difference between two measurements of the same slice performed with the computer algorithm by the same user was 0.004 +/- 0.07 mm for the tibiofemoral joint (TF) and 0.009 +/- 0.11 mm for the patellofemoral joint (PF), compared with an average of 0.12 +/- 0.22 mm TF and 0.13 +/- 0.29 mm PF, respectively, for the manual method. Interuser variability of the computer algorithm was also considerably smaller, with an average difference of 0.004 +/- 0.1 mm TF and 0.0006 +/- 0.1 mm PF compared with 0.38 +/- 0.59 mm TF and 0.31 +/- 0.66 mm PF obtained using a manual method. The between-day reproducibility was larger but still within acceptable limits at 0.09 +/- 0.39 mm TF and 0.09 +/- 0.51 mm PF. This technique has proven consistently reproducible on a same-slice basis, while the reproducibility across different acquisitions of the same subject was larger. Longitudinal reproducibility needs to be addressed through acquisition protocol improvements. A semiautomated method for measuring knee JSW from MR images has been successfully developed.
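The variability figures quoted above are means and standard deviations of paired differences between repeated measurements. A minimal sketch of that computation, using made-up JSW values rather than the study's data:

```python
# Test-retest variability reported as mean +/- SD of paired
# differences, the form in which the JSW results above are given.
from statistics import mean, stdev

def paired_difference_stats(first, second):
    """Return (mean, SD) of the differences between two repeated
    measurement series taken on the same slices."""
    diffs = [b - a for a, b in zip(first, second)]
    return mean(diffs), stdev(diffs)

# Hypothetical repeated JSW measurements (mm) on four slices
day1 = [4.10, 3.95, 4.22, 4.05]
day2 = [4.12, 3.96, 4.20, 4.08]
m, s = paired_difference_stats(day1, day2)
print(f"{m:+.3f} +/- {s:.3f} mm")
```

A small mean difference indicates little systematic bias between sessions, while the SD captures random measurement variability.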
ERIC Educational Resources Information Center
Romero, Frederico R.; Romero, Antonio W.; Filho, Thadeu Brenny; Kulysz, David; Oliveira, Fernando C., Jr.; Filho, Renato Tambara
2012-01-01
Objective: To help students, residents, and general practitioners to improve the technique, skills, and reproducibility of their prostate examination. Methods: We developed a comprehensive guideline outlining prostate anatomy, indications, patient preparation, positioning, technique, findings, and limitations of this ancient art of urological…
Shackney, Stanley; Emlet, David R; Pollice, Agnese; Smith, Charles; Brown, Kathryn; Kociban, Deborah
2006-01-01
Laser scanning cytometry (LSC) is a versatile technology that makes it possible to perform multiple measurements on individual cells and correlate them cell by cell with other cellular features. It would be highly desirable to be able to perform reproducible, quantitative, correlated cell-based immunofluorescence studies on individual cells from human solid tumors. However, such studies can be challenging because of the presence of large numbers of cell aggregates and other confounding factors. Techniques have been developed to deal with cell aggregates in data sets collected by LSC. Experience has also been gained in addressing other key technical and methodological issues that can affect the reproducibility of such cell-based immunofluorescence measurements. We describe practical aspects of cell sample collection, cell fixation and staining, protocols for performing multiparameter immunofluorescence measurements by LSC, use of controls and reference samples, and approaches to data analysis that we have found useful in improving the accuracy and reproducibility of LSC data obtained in human tumor samples. We provide examples of the potential advantages of LSC in examining quantitative aspects of cell-based analysis. Improvements in the quality of cell-based multiparameter immunofluorescence measurements make it possible to extract useful information from relatively small numbers of cells. This, in turn, permits the performance of multiple multicolor panels on each tumor sample. With links among the different panels that are provided by overlapping measurements, it is possible to develop increasingly more extensive profiles of intracellular expression of multiple proteins in clinical samples of human solid tumors. Examples of such linked panels of measurements are provided.
Advances in methodology can improve cell-based multiparameter immunofluorescence measurements on cell suspensions from human solid tumors by LSC for use in prognostic and predictive clinical applications. Copyright (c) 2005 Wiley-Liss, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robey, Robert W.
2016-06-27
The purpose of this presentation is to consider issues of reproducibility: specifically, whether bitwise-reproducible computation is possible, whether computational research in DOE can improve its publication process, and whether reproducible results can be achieved apart from the peer review process.
Is Schooling Good for the Development of Society?: The Case of South Africa
ERIC Educational Resources Information Center
Harber, Clive; Mncube, Vusi
2011-01-01
This paper is concerned with three possible theoretical relationships between education and social, economic and political development: that (a) education improves society, (b) education reproduces society as it is, and (c) education actually makes society worse. The paper then uses South Africa as a case study to critically analyse these…
Reproducibility of techniques using Archimedes' principle in measuring cancellous bone volume.
Zou, L; Bloebaum, R D; Bachus, K N
1997-01-01
Researchers have been interested in developing techniques to accurately and reproducibly measure the volume fraction of cancellous bone. Historically, bone researchers have used Archimedes' principle with water to measure the volume fraction of cancellous bone. Preliminary results in our lab suggested that the calibrated water technique did not provide reproducible results. Because of this difficulty, it was decided to compare the conventional water method with a water-with-surfactant method and a helium method using a micropycnometer. The water/surfactant and the helium methods were attempts to improve the fluid penetration into the small voids present in the cancellous bone structure. In order to compare the reproducibility of the new methods with the conventional water method, 16 cancellous bone specimens were obtained from femoral condyles of human and greyhound dog femora. The volume fraction measurements on each specimen were repeated three times with all three techniques. The results showed that the helium displacement method was more than an order of magnitude more reproducible than the two other water methods (p < 0.05). Statistical analysis also showed that the conventional water method produced the lowest reproducibility (p < 0.05). The data from this study indicate that the helium displacement technique is a very useful, rapid and reproducible tool for quantitatively characterizing anisotropic porous tissue structures such as cancellous bone.
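Whichever displacement medium is used, the volume fraction itself follows from two quantities: the material volume measured by displacement and the bulk envelope volume of the specimen. A minimal sketch with hypothetical numbers, not the study's measurements:

```python
# Volume fraction of cancellous bone from displacement measurements.
# material_volume: volume of bone tissue measured by fluid/gas
#   displacement (Archimedes' principle or helium pycnometry)
# bulk_volume: total envelope volume of the specimen (tissue + voids)

def volume_fraction(material_volume_cm3, bulk_volume_cm3):
    """Bone volume fraction (BV/TV): material volume over bulk volume."""
    if bulk_volume_cm3 <= 0:
        raise ValueError("bulk volume must be positive")
    return material_volume_cm3 / bulk_volume_cm3

# Hypothetical specimen: 0.35 cm^3 of bone tissue in a 1.40 cm^3 envelope
bv_tv = volume_fraction(0.35, 1.40)
print(f"BV/TV = {bv_tv:.2f}")  # BV/TV = 0.25
```

Poor fluid penetration into small voids inflates the apparent material volume, which is why the helium method improves reproducibility.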
OpenMS - A platform for reproducible analysis of mass spectrometry data.
Pfeuffer, Julianus; Sachsenberg, Timo; Alka, Oliver; Walzer, Mathias; Fillbrunn, Alexander; Nilse, Lars; Schilling, Oliver; Reinert, Knut; Kohlbacher, Oliver
2017-11-10
In recent years, several mass spectrometry-based omics technologies have emerged to investigate qualitative and quantitative changes within thousands of biologically active components such as proteins, lipids and metabolites. The research enabled through these methods potentially contributes to the diagnosis and pathophysiology of human diseases as well as to the clarification of structures and interactions between biomolecules. Simultaneously, technological advances in the field of mass spectrometry, leading to an ever-increasing amount of data, demand high standards of efficiency, accuracy and reproducibility in analysis software. This article presents the current state and ongoing developments in OpenMS, a versatile open-source framework aimed at enabling reproducible analyses of high-throughput mass spectrometry data. It provides implementations of frequently occurring processing operations on MS data through a clean application programming interface in C++ and Python. A collection of 185 tools and ready-made workflows for typical MS-based experiments enables convenient analyses for non-developers and facilitates reproducible research without losing flexibility. OpenMS will continue to increase its ease of use for developers as well as users with improved continuous integration/deployment strategies, regular trainings with updated training materials and multiple sources of support. The active developer community ensures the incorporation of new features to support state-of-the-art research. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Improved estimates of fixed reproducible tangible wealth, 1929-95
DOT National Transportation Integrated Search
1997-05-01
This article presents revised estimates of the value of fixed reproducible tangible wealth in the United States for 1929-95; these estimates incorporate the definitional and statistical improvements introduced in last year's comprehensive revis...
Ivkovic, Sinisa; Simonovic, Janko; Tijanic, Nebojsa; Davis-Dusenbery, Brandi; Kural, Deniz
2016-01-01
As biomedical data has become increasingly easy to generate in large quantities, the methods used to analyze it have proliferated rapidly. Reproducible and reusable methods are required to learn from large volumes of data reliably. To address this issue, numerous groups have developed workflow specifications or execution engines, which provide a framework with which to perform a sequence of analyses. One such specification is the Common Workflow Language, an emerging standard which provides a robust and flexible framework for describing data analysis tools and workflows. In addition, reproducibility can be furthered by executors or workflow engines which interpret the specification and enable additional features, such as error logging, file organization, optimizations to computation and job scheduling, and allow for easy computing on large volumes of data. To this end, we have developed the Rabix Executor, an open-source workflow engine for the purposes of improving reproducibility through reusability and interoperability of workflow descriptions. PMID:27896971
de Souza Lucas, Francisco Willian; Welch, Adam W.; Baranowski, Lauryn L.; ...
2016-08-01
CuSbS2 is a promising nontoxic and earth-abundant photovoltaic absorber that is chemically simpler than the widely studied Cu2ZnSnS4. However, CuSbS2 photovoltaic (PV) devices currently have relatively low efficiency and poor reproducibility, often due to suboptimal material quality and insufficient optoelectronic properties. To address these issues, here we develop a thermochemical treatment (TT) for CuSbS2 thin films, which consists of annealing in Sb2S3 vapor followed by a selective KOH surface chemical etch. The annealed CuSbS2 films show improved structural quality and optoelectronic properties, such as stronger band-edge photoluminescence and longer photoexcited carrier lifetime. These improvements also lead to more reproducible CuSbS2 PV devices, with performance currently limited by a large cliff-type interface band offset with the CdS contact. Altogether, these results point to potential avenues to further increase the performance of CuSbS2 thin-film solar cells, and the findings can be transferred to other thin-film photovoltaic technologies.
NASA Astrophysics Data System (ADS)
Dudchenko, Oleksandr Ye; Pyeshkova, Viktoriya M.; Soldatkin, Oleksandr O.; Akata, Burcu; Kasap, Berna O.; Soldatkin, Alexey P.; Dzyadevych, Sergei V.
2016-02-01
The application of silicalite for improvement of enzyme adsorption on new stainless steel electrodes is reported. Glucose oxidase (GOx) was immobilized by two methods: cross-linking by glutaraldehyde (GOx-GA) and cross-linking by glutaraldehyde along with GOx adsorption on a silicalite-modified electrode (SME) (GOx-SME-GA). The GOx-SME-GA biosensors were characterized by a four- to fivefold higher sensitivity than the GOx-GA biosensor. It was concluded that silicalite together with GA substantially enhances enzyme adhesion on stainless steel electrodes. The developed GOx-SME-GA biosensors were characterized by good reproducibility of biosensor preparation (relative standard deviation (RSD) of 18 %), improved signal reproducibility (RSD of glucose determination was 7 %), and good storage stability (29 % loss of activity after 18-day storage). A series of fruit juices and nectars was analyzed using the GOx-SME-GA biosensor for determination of glucose concentration. The obtained results showed good correlation with high-performance liquid chromatography (HPLC) data (R = 0.99).
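The reproducibility figures above are relative standard deviations. A minimal sketch of the computation, on hypothetical repeated biosensor readings rather than the study's data:

```python
# Relative standard deviation (RSD) of repeated biosensor responses,
# the metric used above to report preparation and signal
# reproducibility.
from statistics import mean, stdev

def rsd_percent(values):
    """RSD = sample standard deviation / mean, expressed in percent."""
    return stdev(values) / mean(values) * 100

# Hypothetical repeated responses (nA) to the same glucose concentration
responses = [102.0, 98.0, 101.0, 99.0]
print(f"RSD = {rsd_percent(responses):.1f} %")
```

A lower RSD across electrodes prepared the same way (here, 18 % for preparation) or across repeated determinations (7 % for signal) indicates better reproducibility.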
Recent Developments in the Treatment of Ankle and Subtalar Instability
Sugimoto, Kazuya
2017-01-01
It was nearly a century ago that severe ankle sprain was recognized as an injury of the ankle ligament(s). With recent technological advances and tools in imaging and surgical procedures, the management of ankle sprains - including subtalar injuries - has drastically improved. The repair or reconstruction of ankle ligaments is becoming more anatomical and less invasive than previously. More specifically, ligamentous reconstruction with a tendon graft has been the gold standard in the management of severely damaged ligaments; however, it does not reproduce the original ultrastructure of the ankle ligaments. The anatomical structure of a ligament comprises the ligament with an enthesis at both ends, and the structure should also exhibit proprioceptive function. To date, it remains impossible to reconstruct a functionally intact and anatomical ligament. Cooperation between regenerative medicine and surgical technology is expected to improve reconstruction of the ankle ligament; however, more time is needed to develop a technology that reproduces the ideal ligament complex. PMID:28979582
NASA Astrophysics Data System (ADS)
Yan, A.; West, J.
2016-12-01
The validity of geosciences research is of great significance to the general public and policy-makers. In an earlier study, we surveyed 136 faculty and graduate students in geosciences. The results indicated that nearly 80% of respondents who had ever tried to reproduce a published study had failed at least once, suggesting a general lack of research reproducibility in geosciences. Although there is much enthusiasm for the creation of technologies such as workflow systems, literate programming, and cloud-based systems to facilitate reproducibility, much less emphasis has been placed on the information services essential for meaningful use of these tools. Library and Information Science (LIS) has a rich tradition of providing customized services for research communities. LIS professionals such as academic librarians have made strong contributions to resource location, software recommendation, data curation, metadata guidance, project management, submission review, and author training. In particular, university libraries have been actively developing tools and offering guidelines, consultations, and trainings on the Data Management Plan (DMP) required by the National Science Foundation (NSF). Effective data management is a significant first step towards reproducible research. We therefore argue that LIS professionals may be well positioned to assist researchers in making their research reproducible. In this study, we aim to answer the question: how can LIS professionals assist geoscience researchers in making their research capable of being reproduced? We first synthesize different definitions of "reproducibility" and provide a conceptual framework of "reproducibility" in geosciences to resolve some of the misunderstandings around related terminology.
Using a case study approach, we then examine 1) university librarians' technical skills, domain knowledge, professional activities, together with their awareness of, readiness for, and attitudes towards research reproducibility and 2) geosciences researcher needs for assistance in making research reproducible and attitude towards LIS services. The results of our study provide empirical evidence for an extension of library services, as well as for a potential solution in facilitating research reproducibility.
Moore, Susan M; Thomas, Maribeth; Woo, Savio L-Y; Gabriel, Mary T; Kilger, Robert; Debski, Richard E
2006-01-01
The objective of this study was to develop a novel method to more accurately reproduce previously recorded 6-DOF kinematics of the tibia with respect to the femur using robotic technology. Furthermore, the effects of performing only a single registration versus multiple registrations, and of the robot joint configuration, were investigated. A single registration consisted of registering the tibia and femur with respect to the robot at full extension and reproducing all kinematics, while multiple registrations consisted of registering the bones at each flexion angle and reproducing only the kinematics of the corresponding flexion angle. Kinematics of the knee in response to an anterior (134 N) load and combined internal/external (+/-10 N m) and varus/valgus (+/-5 N m) loads were collected at 0, 15, 30, 60, and 90 degrees of flexion. A six-axis, serial-articulated robotic manipulator (PUMA Model 762) was calibrated and the working volume was reduced to improve the robot's accuracy. The effect of the robot joint configuration was determined by performing single and multiple registrations for three selected configurations. For each robot joint configuration, the accuracy in position of the reproduced kinematics improved after multiple registrations (0.7+/-0.3, 1.2+/-0.5, and 0.9+/-0.2 mm, respectively) when compared to only a single registration (1.3+/-0.9, 2.0+/-1.0, and 1.5+/-0.7 mm, respectively) (p<0.05). The accuracy in position of each robot joint configuration was unique, as significant differences were detected between each of the configurations. These data demonstrate that the number of registrations and the robot joint configuration both affect the accuracy of the reproduced kinematics.
Therefore, when using robotic technology to reproduce previously recorded kinematics, it may be necessary to perform these analyses for each individual robotic system and for each diarthrodial joint, as different joints will require the robot to be placed in different robot joint configurations.
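Registration in this context means estimating the rigid transform relating bone-fixed coordinates to the robot frame from corresponding point measurements. A standard least-squares (Kabsch/SVD) sketch of that step, with hypothetical marker positions, is shown below; it is not the authors' procedure:

```python
# Rigid registration of corresponding 3D points (Kabsch algorithm):
# estimate rotation R and translation t mapping points P onto Q in a
# least-squares sense. Registering bone-mounted markers to the robot
# frame is one instance of this problem.
import numpy as np

def rigid_register(P, Q):
    """Return (R, t) minimizing sum ||R @ P_i + t - Q_i||^2."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflection
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T                       # proper rotation (det = +1)
    t = cQ - R @ cP
    return R, t

# Hypothetical markers: Q is P rotated 90 deg about z, then shifted
P = [[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]]
Rz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
Q = [Rz @ p + np.array([5.0, 2.0, 1.0]) for p in P]
R, t = rigid_register(P, Q)
print(np.allclose(R, Rz), np.allclose(t, [5, 2, 1]))  # True True
```

Because each registration carries its own small error, re-registering at every flexion angle (multiple registrations) limits how far that error can propagate, consistent with the accuracy improvement reported above.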
Quantifying reproducibility in computational biology: the case of the tuberculosis drugome.
Garijo, Daniel; Kinnings, Sarah; Xie, Li; Xie, Lei; Zhang, Yinliang; Bourne, Philip E; Gil, Yolanda
2013-01-01
How easy is it to reproduce the results found in a typical computational biology paper? Either through experience or intuition, the reader will already know that the answer is "with difficulty" or "not at all". In this paper we attempt to quantify this difficulty by reproducing a previously published paper for different classes of users (ranging from users with little expertise to domain experts) and suggest ways in which the situation might be improved. Quantification is achieved by estimating the time required to reproduce each of the steps in the method described in the original paper and to make them part of an explicit workflow that reproduces the original results. Reproducing the method took several months of effort, and required using new versions and new software that posed challenges to reconstructing and validating the results. The quantification leads to "reproducibility maps" that reveal that novice researchers would only be able to reproduce a few of the steps in the method, and that only expert researchers with advanced knowledge of the domain would be able to reproduce the method in its entirety. The workflow itself is published as an online resource together with supporting software and data. The paper concludes with a brief discussion of the complexities of requiring reproducibility in terms of cost versus benefit, and desiderata with our observations and guidelines for improving reproducibility. This has implications not only for reproducing the work of others from published papers, but also for reproducing work from one's own laboratory.
Manninen, Tiina; Aćimović, Jugoslava; Havela, Riikka; Teppola, Heidi; Linne, Marja-Leena
2018-01-01
The possibility to replicate and reproduce published research results is one of the biggest challenges in all areas of science. In computational neuroscience, there are thousands of models available. However, it is rarely possible to reimplement the models based on the information in the original publication, let alone rerun the models, because the model implementations have not been made publicly available. We evaluate and discuss the comparability of a versatile choice of simulation tools: tools for biochemical reactions and spiking neuronal networks, and relatively new tools for growth in cell cultures. The replicability and reproducibility issues are considered for computational models that are equally diverse, including the models for intracellular signal transduction of neurons and glial cells, in addition to single glial cells, neuron-glia interactions, and selected examples of spiking neuronal networks. We also address the comparability of the simulation results with one another to comprehend if the studied models can be used to answer similar research questions. In addition to presenting the challenges in reproducibility and replicability of published results in computational neuroscience, we highlight the need for developing recommendations and good practices for publishing simulation tools and computational models. Model validation and flexible model description must be an integral part of the tool used to simulate and develop computational models. Constant improvement in experimental techniques and recording protocols leads to increasing knowledge about the biophysical mechanisms in neural systems. This poses new challenges for computational neuroscience: extended or completely new computational methods and models may be required. 
Careful evaluation and categorization of the existing models and tools provide a foundation for these future needs, for constructing multiscale models or extending the models to incorporate additional or more detailed biophysical mechanisms. Improving the quality of publications in computational neuroscience, enabling progressive building of advanced computational models and tools, can be achieved only through adopting publishing standards which underline replicability and reproducibility of research results. PMID:29765315
Bringing Management Reality into the Classroom--The Development of Interactive Learning.
ERIC Educational Resources Information Center
Nicholson, Alastair
1997-01-01
Effective learning in management education can be enhanced by reproducing the real-world need to solve problems under pressure of time, inadequate information, and group interaction. An interactive classroom communication system involving problems in decision making and continuous improvement is one method for bridging theory and practice. (SK)
Ryan, Justin R; Almefty, Kaith K; Nakaji, Peter; Frakes, David H
2016-04-01
Neurosurgery simulator development is growing as practitioners recognize the need for improved instructional and rehearsal platforms to improve procedural skills and patient care. In addition, changes in practice patterns have decreased the volume of specific cases, such as aneurysm clippings, which reduces the opportunity for operating room experience. The authors developed a hands-on, dimensionally accurate model for aneurysm clipping using patient-derived anatomic data and three-dimensional (3D) printing. Design of the model focused on reproducibility as well as adaptability to new patient geometry. A modular, reproducible, and patient-derived medical simulacrum was developed for medical learners to practice aneurysm clipping procedures. Various forms of 3D printing were used to develop a geometrically accurate cranium and vascular tree featuring 9 patient-derived aneurysms. 3D printing in conjunction with elastomeric casting was leveraged to achieve a patient-derived brain model with tactile properties not yet available from commercial 3D printing technology. An educational pilot study was performed to gauge simulation efficacy. Through the novel manufacturing process, a patient-derived simulacrum was developed for neurovascular surgical simulation. A follow-up qualitative study suggests the potential to enhance current educational programs; assessments support the efficacy of the simulacrum. The proposed aneurysm clipping simulator has the potential to improve learning experiences in the surgical environment. 3D printing and elastomeric casting can produce patient-derived models for a dynamic learning environment that add value to surgical training and preparation. Copyright © 2016 Elsevier Inc. All rights reserved.
Shape Perception and Navigation in Blind Adults
Gori, Monica; Cappagli, Giulia; Baud-Bovy, Gabriel; Finocchietti, Sara
2017-01-01
Different sensory systems interact to generate a representation of space and to navigate. Vision plays a critical role in the development of spatial representation. During navigation, vision is integrated with auditory and mobility cues. In blind individuals, visual experience is not available and navigation therefore lacks this important sensory signal. In blind individuals, compensatory mechanisms can be adopted to improve spatial and navigation skills. On the other hand, the limitations of these compensatory mechanisms are not completely clear. Both enhanced and impaired reliance on auditory cues in blind individuals have been reported. Here, we develop a new paradigm to test both auditory perception and navigation skills in blind and sighted individuals and to investigate the effect that visual experience has on the ability to reproduce simple and complex paths. During the navigation task, early blind, late blind and sighted individuals were required first to listen to an audio shape and then to recognize and reproduce it by walking. After each audio shape was presented, a static sound was played and the participants were asked to reach it. Movements were recorded with a motion tracking system. Our results show three main impairments specific to early blind individuals. The first is the tendency to compress the shapes reproduced during navigation. The second is difficulty in recognizing complex audio stimuli, and the third is difficulty in reproducing the desired shape: early blind participants occasionally reported perceiving a square but actually reproduced a circle during the navigation task. We discuss these results in terms of compromised spatial reference frames due to lack of visual input during the early period of development. PMID:28144226
Abbatiello, Susan E.; Schilling, Birgit; Mani, D. R.; Zimmerman, Lisa J.; Hall, Steven C.; MacLean, Brendan; Albertolle, Matthew; Allen, Simon; Burgess, Michael; Cusack, Michael P.; Gosh, Mousumi; Hedrick, Victoria; Held, Jason M.; Inerowicz, H. Dorota; Jackson, Angela; Keshishian, Hasmik; Kinsinger, Christopher R.; Lyssand, John; Makowski, Lee; Mesri, Mehdi; Rodriguez, Henry; Rudnick, Paul; Sadowski, Pawel; Sedransk, Nell; Shaddox, Kent; Skates, Stephen J.; Kuhn, Eric; Smith, Derek; Whiteaker, Jeffery R.; Whitwell, Corbin; Zhang, Shucha; Borchers, Christoph H.; Fisher, Susan J.; Gibson, Bradford W.; Liebler, Daniel C.; MacCoss, Michael J.; Neubert, Thomas A.; Paulovich, Amanda G.; Regnier, Fred E.; Tempst, Paul; Carr, Steven A.
2015-01-01
There is an increasing need in biology and clinical medicine to robustly and reliably measure tens to hundreds of peptides and proteins in clinical and biological samples with high sensitivity, specificity, reproducibility, and repeatability. Previously, we demonstrated that LC-MRM-MS with isotope dilution has suitable performance for quantitative measurements of small numbers of relatively abundant proteins in human plasma and that the resulting assays can be transferred across laboratories while maintaining high reproducibility and quantitative precision. Here, we significantly extend that earlier work, demonstrating that 11 laboratories using 14 LC-MS systems can develop, determine analytical figures of merit, and apply highly multiplexed MRM-MS assays targeting 125 peptides derived from 27 cancer-relevant proteins and seven control proteins to precisely and reproducibly measure the analytes in human plasma. To ensure consistent generation of high quality data, we incorporated a system suitability protocol (SSP) into our experimental design. The SSP enabled real-time monitoring of LC-MRM-MS performance during assay development and implementation, facilitating early detection and correction of chromatographic and instrumental problems. Low to subnanogram/ml sensitivity for proteins in plasma was achieved by one-step immunoaffinity depletion of 14 abundant plasma proteins prior to analysis. Median intra- and interlaboratory reproducibility was <20%, sufficient for most biological studies and candidate protein biomarker verification. Digestion recovery of peptides was assessed and quantitative accuracy improved using heavy-isotope-labeled versions of the proteins as internal standards. Using the highly multiplexed assay, participating laboratories were able to precisely and reproducibly determine the levels of a series of analytes in blinded samples used to simulate an interlaboratory clinical study of patient samples. 
Our study further establishes that LC-MRM-MS using stable isotope dilution, with appropriate attention to analytical validation and appropriate quality control measures, enables sensitive, specific, reproducible, and quantitative measurements of proteins and peptides in complex biological matrices such as plasma. PMID:25693799
Development and fabrication of improved Schottky power diodes, phases I and II
NASA Technical Reports Server (NTRS)
Cordes, L. F.; Garfinkle, M.; Taft, E. A.
1974-01-01
Reproducible methods for the fabrication of silicon Schottky diodes were developed for the metals tungsten, aluminum, conventional platinum silicide and low temperature platinum silicide. Barrier heights and barrier lowering were measured permitting the accurate prediction of ideal forward and reverse diode performance. Processing procedures were developed which permit the fabrication of large area (approximately 1 sqcm) mesa-geometry power Schottky diodes with forward and reverse characteristics that approach theoretical values.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Du, Kaifang; Reinhardt, Joseph M.; Christensen, Gary E.
2013-12-15
Purpose: Four-dimensional computed tomography (4DCT) can be used to make measurements of pulmonary function longitudinally. The sensitivity of such measurements to identify change depends on measurement uncertainty. Previously, intrasubject reproducibility of Jacobian-based measures of lung tissue expansion was studied in two repeat prior-RT 4DCT human acquisitions. Differences in respiratory effort, such as breathing amplitude and frequency, may affect longitudinal function assessment. In this study, the authors present normalization schemes that correct ventilation images for variations in respiratory effort and assess the reproducibility improvement after effort correction. Methods: Repeat 4DCT image data acquired within a short time interval from 24 patients prior to radiation therapy (RT) were used for this analysis. Using a tissue-volume-preserving deformable image registration algorithm, Jacobian ventilation maps from the two scanning sessions were computed and compared on the same coordinate system for reproducibility analysis. In addition to computing the ventilation maps from end expiration to end inspiration, the authors investigated effort normalization strategies using other intermediate inspiration phases based on the principles of equivalent tidal volume (ETV) and equivalent lung volume (ELV). Scatter plots and mean square error of the repeat ventilation maps and the Jacobian ratio map were generated for four conditions: no effort correction, global normalization, ETV, and ELV. In addition, a gamma pass rate was calculated from a modified gamma index evaluation between two ventilation maps, using acceptance criteria of 2 mm distance-to-agreement and 5% ventilation difference. Results: The pattern of regional pulmonary ventilation changes as lung volume changes. All effort correction strategies improved reproducibility when changes in respiratory effort were greater than 150 cc (p < 0.005 with regard to the gamma pass rate).
Improvement of reproducibility was correlated with respiratory effort difference (R = 0.744 for ELV in the cohort with tidal volume difference greater than 100 cc). In general, for all subjects, global normalization, ETV, and ELV significantly improved reproducibility compared to no effort correction (p = 0.009, 0.002, and 0.005, respectively). When tidal volume difference was small (less than 100 cc), none of the three effort correction strategies improved reproducibility significantly (p = 0.52, 0.46, and 0.46, respectively). For the cohort (N = 13) with tidal volume difference greater than 100 cc, the average gamma pass rate improved from 57.3% before correction to 66.3% after global normalization, and to 76.3% after ELV. ELV was found to be significantly better than global normalization (p = 0.04 for all subjects, and p = 0.003 for the cohort with tidal volume difference greater than 100 cc). Conclusions: All effort correction strategies improve the reproducibility of the authors' pulmonary ventilation measures, and the improvement of reproducibility is highly correlated with the changes in respiratory effort. ELV gives better results as the effort difference increases, followed by ETV, then global normalization. However, based on the spatial and temporal heterogeneity in the lung expansion rate, a single scaling factor (e.g., global normalization) appears to be less accurate for correcting the ventilation map when changes in respiratory effort are large.
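The modified gamma index used above combines a distance-to-agreement criterion (2 mm) with a ventilation-difference criterion (5%). A minimal brute-force sketch of such a pass-rate computation is given below. This is an illustration only, not the authors' implementation: the function name, the voxel search radius, and the treatment of the 5% criterion as an absolute Jacobian difference of 0.05 are all assumptions.

```python
import numpy as np

def gamma_pass_rate(v1, v2, spacing=1.0, dta=2.0, dvent=0.05, search=3):
    """Fraction of voxels whose gamma index between two 2D ventilation
    maps is <= 1. spacing: voxel size in mm; dta: distance-to-agreement
    criterion in mm; dvent: ventilation-difference criterion (absolute,
    assuming Jacobian values near 1); search: search radius in voxels."""
    ny, nx = v1.shape
    passed = 0
    for i in range(ny):
        for j in range(nx):
            best = np.inf
            # Search a neighborhood of the reference voxel for the
            # smallest combined distance/ventilation discrepancy.
            for di in range(-search, search + 1):
                for dj in range(-search, search + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < ny and 0 <= jj < nx:
                        dist = spacing * np.hypot(di, dj)
                        diff = v2[ii, jj] - v1[i, j]
                        g = np.sqrt((dist / dta) ** 2 + (diff / dvent) ** 2)
                        best = min(best, g)
            if best <= 1.0:
                passed += 1
    return passed / (ny * nx)
```

Identical maps pass everywhere; a uniform Jacobian offset well beyond the 5% criterion fails everywhere, mirroring how the pass rate tracks effort differences in the study.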
3D gut-liver chip with a PK model for prediction of first-pass metabolism.
Lee, Dong Wook; Ha, Sang Keun; Choi, Inwook; Sung, Jong Hwan
2017-11-07
Accurate prediction of first-pass metabolism is essential for improving the time and cost efficiency of the drug development process. Here, we have developed a microfluidic gut-liver co-culture chip that aims to reproduce the first-pass metabolism of oral drugs. This chip consists of two separate layers for gut (Caco-2) and liver (HepG2) cell lines, where cells can be co-cultured in both 2D and 3D forms. Both cell lines were maintained well in the chip, as verified by confocal microscopy and measurement of hepatic enzyme activity. We investigated the PK profile of paracetamol in the chip, and a corresponding PK model was constructed and used to predict PK profiles for different chip design parameters. Simulation results implied that a larger absorption surface area and a higher metabolic capacity are required to reproduce the in vivo PK profile of paracetamol more accurately. Our study suggests the possibility of reproducing the human PK profile on a chip, contributing to accurate prediction of the pharmacological effects of drugs.
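The abstract does not give the structure of its PK model; a minimal one-compartment sketch of oral kinetics with first-pass loss, of the general kind such a gut-liver chip aims to reproduce, could look as follows. The function name and all parameter values are hypothetical, and forward-Euler integration is used purely for simplicity.

```python
import numpy as np

def oral_pk_profile(dose, ka, ke, extraction, t_end=12.0, dt=0.01):
    """One-compartment oral PK with first-pass extraction:
    gut depot --ka--> (1 - extraction survives) --> central --ke--> out.
    Returns (times, central-compartment amount) arrays."""
    n = int(t_end / dt)
    gut, central = dose, 0.0
    times, amounts = [], []
    for i in range(n):
        absorbed = ka * gut * dt          # drug leaving the gut depot
        gut -= absorbed
        # Only the fraction surviving first-pass metabolism reaches
        # the central compartment; elimination is first-order.
        central += (1.0 - extraction) * absorbed - ke * central * dt
        times.append(i * dt)
        amounts.append(central)
    return np.array(times), np.array(amounts)
```

Because the model is linear, raising the first-pass extraction ratio scales the whole profile down, which is the qualitative behavior the chip's absorption area and metabolic capacity must match to reproduce an in vivo curve.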
Farley, Christopher; Burks, Geoffry; Siegert, Thomas; Juers, Douglas H
2014-08-01
In macromolecular cryocrystallography unit-cell parameters can have low reproducibility, limiting the effectiveness of combining data sets from multiple crystals and inhibiting the development of defined repeatable cooling protocols. Here, potential sources of unit-cell variation are investigated and crystal dehydration during loop-mounting is found to be an important factor. The amount of water lost by the unit cell depends on the crystal size, the loop size, the ambient relative humidity and the transfer distance to the cooling medium. To limit water loss during crystal mounting, a threefold strategy has been implemented. Firstly, crystal manipulations are performed in a humid environment similar to the humidity of the crystal-growth or soaking solution. Secondly, the looped crystal is transferred to a vial containing a small amount of the crystal soaking solution. Upon loop transfer, the vial is sealed, which allows transport of the crystal at its equilibrated humidity. Thirdly, the crystal loop is directly mounted from the vial into the cold gas stream. This strategy minimizes the exposure of the crystal to relatively low humidity ambient air, improves the reproducibility of low-temperature unit-cell parameters and offers some new approaches to crystal handling and cryoprotection. PMID:25084331
The Case for Laboratory Developed Procedures
Sabatini, Linda M.; Tsongalis, Gregory J.; Caliendo, Angela M.; Olsen, Randall J.; Ashwood, Edward R.; Bale, Sherri; Benirschke, Robert; Carlow, Dean; Funke, Birgit H.; Grody, Wayne W.; Hayden, Randall T.; Hegde, Madhuri; Lyon, Elaine; Pessin, Melissa; Press, Richard D.; Thomson, Richard B.
2017-01-01
An explosion of knowledge and technology is revolutionizing medicine and patient care. Novel testing must be brought to the clinic with safety and accuracy, but also in a timely and cost-effective manner, so that patients can benefit and laboratories can offer testing consistent with current guidelines. Under the oversight provided by the Clinical Laboratory Improvement Amendments, laboratories have been able to develop and optimize laboratory procedures for use in-house. Quality improvement programs, interlaboratory comparisons, and the ability of laboratories to adjust assays as needed to improve results, utilize new sample types, or incorporate new mutations, information, or technologies are positive aspects of Clinical Laboratory Improvement Amendments oversight of laboratory-developed procedures. Laboratories have a long history of successful service to patients operating under Clinical Laboratory Improvement Amendments. A series of detailed clinical examples illustrating the quality and positive impact of laboratory-developed procedures on patient care is provided. These examples also demonstrate how Clinical Laboratory Improvement Amendments oversight ensures accurate, reliable, and reproducible testing in clinical laboratories. PMID:28815200
RIPOSTE: a framework for improving the design and analysis of laboratory-based research.
Masca, Nicholas Gd; Hensor, Elizabeth Ma; Cornelius, Victoria R; Buffa, Francesca M; Marriott, Helen M; Eales, James M; Messenger, Michael P; Anderson, Amy E; Boot, Chris; Bunce, Catey; Goldin, Robert D; Harris, Jessica; Hinchliffe, Rod F; Junaid, Hiba; Kingston, Shaun; Martin-Ruiz, Carmen; Nelson, Christopher P; Peacock, Janet; Seed, Paul T; Shinkins, Bethany; Staples, Karl J; Toombs, Jamie; Wright, Adam Ka; Teare, M Dawn
2015-05-07
Lack of reproducibility is an ongoing problem in some areas of the biomedical sciences. Poor experimental design and a failure to engage with experienced statisticians at key stages in the design and analysis of experiments are two factors that contribute to this problem. The RIPOSTE (Reducing IrreProducibility in labOratory STudiEs) framework has been developed to support early and regular discussions between scientists and statisticians in order to improve the design, conduct and analysis of laboratory studies and, therefore, to reduce irreproducibility. This framework is intended for use during the early stages of a research project, when specific questions or hypotheses are proposed. The essential points within the framework are explained and illustrated using three examples (a medical equipment test, a macrophage study and a gene expression study). Sound study design minimises the possibility of bias being introduced into experiments and leads to higher quality research with more reproducible results.
Economics in History and the Social Sciences.
ERIC Educational Resources Information Center
Joint Council on Economic Education, New York, NY.
Papers presented by social scientists at a 1974 Joint Council seminar designed to assist authors and publishers in improving existing materials or developing new texts in social studies are reproduced in this volume. The seven papers focus on how to integrate economics into elementary and secondary social studies and history courses. The first…
An improved diffusion welding technique for TD-NiCr
NASA Technical Reports Server (NTRS)
Holko, K. H.
1973-01-01
An improved diffusion welding technique has been developed for TD-NiCr sheet. In its most preferred form, the improved technique consists of diffusion welding 320-grit-sanded plus chemically polished surfaces of unrecrystallized TD-NiCr at 760 C under 140 MN/m2 pressure for 1 hr, followed by postheating at 1180 C for 2 hr. Compared to previous work, this improved technique has the advantages of shorter welding time, lower welding temperature, lower welding pressure, and a simpler and more reproducible surface preparation procedure. Weldments were made that had parent-metal creep-rupture shear strength at 1100 C.
Systematic heterogenization for better reproducibility in animal experimentation.
Richter, S Helene
2017-08-31
The scientific literature is full of articles discussing poor reproducibility of findings from animal experiments as well as failures to translate results from preclinical animal studies to clinical trials in humans. Critics even go so far as to talk about a "reproducibility crisis" in the life sciences, a novel headword that increasingly finds its way into numerous high-impact journals. Viewed from a cynical perspective, Fett's law of the lab "Never replicate a successful experiment" has thus taken on a completely new meaning. So far, poor reproducibility and translational failures in animal experimentation have mostly been attributed to biased animal data, methodological pitfalls, current publication ethics and animal welfare constraints. More recently, the concept of standardization has also been identified as a potential source of these problems. By reducing within-experiment variation, rigorous standardization regimes limit the inference to the specific experimental conditions. In this way, however, individual phenotypic plasticity is largely neglected, resulting in statistically significant but possibly irrelevant findings that are not reproducible under slightly different conditions. By contrast, systematic heterogenization has been proposed as a concept to improve representativeness of study populations, contributing to improved external validity and hence improved reproducibility. While some first heterogenization studies are indeed very promising, it is still not clear how this approach can be transferred into practice in a logistically feasible and effective way. Thus, further research is needed to explore different heterogenization strategies as well as alternative routes toward better reproducibility in animal experimentation.
Shao, Shiying; Guo, Tiannan; Gross, Vera; Lazarev, Alexander; Koh, Ching Chiek; Gillessen, Silke; Joerger, Markus; Jochum, Wolfram; Aebersold, Ruedi
2016-06-03
The reproducible and efficient extraction of proteins from biopsy samples for quantitative analysis is a critical step in biomarker and translational research. Recently, we described a method consisting of pressure-cycling technology (PCT) and sequential windowed acquisition of all theoretical fragment ions-mass spectrometry (SWATH-MS) for the rapid quantification of thousands of proteins from biopsy-size tissue samples. As an improvement of the method, we have incorporated the PCT-MicroPestle into the PCT-SWATH workflow. The PCT-MicroPestle is a novel, miniaturized, disposable mechanical tissue homogenizer that fits directly into the microTube sample container. We optimized the pressure-cycling conditions for tissue lysis with the PCT-MicroPestle and benchmarked the performance of the system against the conventional PCT-MicroCap method using mouse liver, heart, brain, and human kidney tissues as test samples. The data indicate that the digestion of the PCT-MicroPestle-extracted proteins yielded 20-40% more MS-ready peptide mass from all tissues tested with a comparable reproducibility when compared to the conventional PCT method. Subsequent SWATH-MS analysis identified a higher number of biologically informative proteins from a given sample. In conclusion, we have developed a new device that can be seamlessly integrated into the PCT-SWATH workflow, leading to increased sample throughput and improved reproducibility at both the protein extraction and proteomic analysis levels when applied to the quantitative proteomic analysis of biopsy-level samples.
Language-Agnostic Reproducible Data Analysis Using Literate Programming.
Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa
2016-01-01
A modern biomedical research project can easily contain hundreds of analysis steps, and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in practice reproducibility cannot be easily achieved. Literate programming is an approach to presenting computer programs to human readers. The code is rearranged to follow the logic of the program, and that logic is explained in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators in the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid-transport-related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is open-source software available at github.com/borisvassilev/lir. PMID:27711123
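The core mechanism of literate programming, extracting the executable code from the literate source (often called "tangling"), can be sketched in a few lines. This is a generic illustration, not Lir's actual implementation; it assumes markdown-style code fences as chunk delimiters.

```python
import re

FENCE = "`" * 3  # markdown-style code fence delimiter

def tangle(literate_source):
    """Collect fenced code chunks from a literate document, in order,
    into the single program that the computer actually executes."""
    pattern = FENCE + r"\w*\n(.*?)" + FENCE
    return "\n".join(re.findall(pattern, literate_source, re.DOTALL))
```

Running the tangled output of a document whose prose and code are interleaved reproduces exactly the computation the document describes, which is what makes the formalism attractive for analysis pipelines.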
Richter, S. Helene; Garner, Joseph P.; Zipser, Benjamin; Lewejohann, Lars; Sachser, Norbert; Touma, Chadi; Schindler, Britta; Chourbaji, Sabine; Brandwein, Christiane; Gass, Peter; van Stipdonk, Niek; van der Harst, Johanneke; Spruijt, Berry; Võikar, Vootele; Wolfer, David P.; Würbel, Hanno
2011-01-01
In animal experiments, animals, husbandry and test procedures are traditionally standardized to maximize test sensitivity and minimize animal use, assuming that this will also guarantee reproducibility. However, by reducing within-experiment variation, standardization may limit inference to the specific experimental conditions. Indeed, we have recently shown in mice that standardization may generate spurious results in behavioral tests, accounting for poor reproducibility, and that this can be avoided by population heterogenization through systematic variation of experimental conditions. Here, we examined whether a simple form of heterogenization effectively improves reproducibility of test results in a multi-laboratory situation. Each of six laboratories independently ordered 64 female mice of two inbred strains (C57BL/6NCrl, DBA/2NCrl) and examined them for strain differences in five commonly used behavioral tests under two different experimental designs. In the standardized design, experimental conditions were standardized as much as possible in each laboratory, while they were systematically varied with respect to the animals' test age and cage enrichment in the heterogenized design. Although heterogenization tended to improve reproducibility by increasing within-experiment variation relative to between-experiment variation, the effect was too weak to account for the large variation between laboratories. However, our findings confirm the potential of systematic heterogenization for improving reproducibility of animal experiments and highlight the need for effective and practicable heterogenization strategies. PMID:21305027
NASA Astrophysics Data System (ADS)
Esquef, Paulo A. A.
The first reproducible recording of the human voice was made in 1877 on a tinfoil cylinder phonograph devised by Thomas A. Edison. Since then, much effort has been expended to find better ways to record and reproduce sound. By the mid-1920s, the first electrical recordings appeared and gradually took over from purely acoustic recordings. The development of electronic computers, in conjunction with the ability to record data onto magnetic or optical media, culminated in the standardization of the compact disc format in 1980. Nowadays, digital technology is applied to several audio applications, not only to improve the quality of modern and old recording/reproduction techniques, but also to trade off sound quality against storage space and transmission capacity requirements.
Examining the Reproducibility of 6 Published Studies in Public Health Services and Systems Research.
Harris, Jenine K; B Wondmeneh, Sarah; Zhao, Yiqiang; Leider, Jonathon P
2018-02-23
Research replication, or repeating a study de novo, is the scientific standard for building evidence and identifying spurious results. While replication is ideal, it is often expensive and time consuming. Reproducibility, or reanalysis of data to verify published findings, is one proposed minimum alternative standard. While a lack of research reproducibility has been identified as a serious and prevalent problem in biomedical research and a few other fields, little work has been done to examine the reproducibility of public health research. We examined reproducibility in 6 studies from the public health services and systems research subfield of public health research. Following the methods described in each of the 6 papers, we computed the descriptive and inferential statistics for each study. We compared our results with the original study results and examined the percentage differences in descriptive statistics and differences in effect size, significance, and precision of inferential statistics. All project work was completed in 2017. We found consistency between original and reproduced results for each paper in at least 1 of the 4 areas examined. However, we also found some inconsistency. We identified incorrect transcription of results and omitting detail about data management and analyses as the primary contributors to the inconsistencies. Increasing reproducibility, or reanalysis of data to verify published results, can improve the quality of science. Researchers, journals, employers, and funders can all play a role in improving the reproducibility of science through several strategies including publishing data and statistical code, using guidelines to write clear and complete methods sections, conducting reproducibility reviews, and incentivizing reproducible science.
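The reanalysis described above compares reproduced descriptive statistics with the originally published values via percentage differences. A minimal sketch of such a check follows; the function names and the 1% tolerance are illustrative assumptions, not taken from the study:

```python
def percent_difference(original, reproduced):
    # Absolute difference expressed as a percentage of the published value
    return abs(reproduced - original) / abs(original) * 100.0


def is_consistent(original, reproduced, tolerance_pct=1.0):
    # Flag a reproduced statistic that strays beyond an (arbitrary,
    # illustrative) tolerance from the originally published value
    return percent_difference(original, reproduced) <= tolerance_pct
```

For example, a published mean of 50.0 reproduced as 49.0 differs by 2%, which the illustrative 1% tolerance would flag as inconsistent.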
A Practical Guide for Improving Transparency and Reproducibility in Neuroimaging Research
Poldrack, Russell A.
2016-01-01
Recent years have seen an increase in alarming signals regarding the lack of replicability in neuroscience, psychology, and other related fields. To avoid a widespread crisis in neuroimaging research and consequent loss of credibility in the public eye, we need to improve how we do science. This article aims to be a practical guide for researchers at any stage of their careers that will help them make their research more reproducible and transparent while minimizing the additional effort that this might require. The guide covers three major topics in open science (data, code, and publications) and offers practical advice as well as highlighting advantages of adopting more open research practices that go beyond improved transparency and reproducibility. PMID:27389358
2007-09-01
practically have dropped the collaboration with BioTraces, as the company was not able to provide us with an improved version of their instrument... Although the claimed sensitivity was reproduced in studies conducted at BioTraces with recombinant PrP, the question was whether the same sensitivity
Panteva, Maria T; Giambaşu, George M; York, Darrin M
2015-05-15
The prevalence of Mg(2+) ions in biology and their essential role in nucleic acid structure and function has motivated the development of various Mg(2+) ion models for use in molecular simulations. Currently, the most widely used models in biomolecular simulations represent a nonbonded metal ion as an ion-centered point charge surrounded by a nonelectrostatic pairwise potential that takes into account dispersion interactions and exchange effects that give rise to the ion's excluded volume. One strategy toward developing improved models for biomolecular simulations is to first identify a Mg(2+) model that is consistent with the simulation force fields that closely reproduces a range of properties in aqueous solution, and then, in a second step, balance the ion-water and ion-solute interactions by tuning parameters in a pairwise fashion where necessary. The present work addresses the first step in which we compare 17 different nonbonded single-site Mg(2+) ion models with respect to their ability to simultaneously reproduce structural, thermodynamic, kinetic and mass transport properties in aqueous solution. None of the models based on a 12-6 nonelectrostatic nonbonded potential was able to reproduce the experimental radial distribution function, solvation free energy, exchange barrier and diffusion constant. The models based on a 12-6-4 potential offered improvement, and one model in particular, in conjunction with the SPC/E water model, performed exceptionally well for all properties. The results reported here establish useful benchmark calculations for Mg(2+) ion models that provide insight into the origin of the behavior in aqueous solution, and may aid in the development of next-generation models that target specific binding sites in biomolecules. © 2015 Wiley Periodicals, Inc.
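The 12-6 and 12-6-4 forms compared above differ by a single added r^-4 term that models the ion's charge-induced-dipole (polarization) interaction. A minimal sketch with purely illustrative coefficients, not any published Mg(2+) parameters:

```python
def u_12_6(r, a, b):
    # Plain 12-6 nonelectrostatic potential: r^-12 repulsion, r^-6 dispersion
    return a / r**12 - b / r**6


def u_12_6_4(r, a, b, c4):
    # 12-6-4 form: adds an attractive r^-4 charge-induced-dipole term,
    # which deepens the well around the ion relative to the 12-6 model
    return a / r**12 - b / r**6 - c4 / r**4
```

At the same separation the 12-6-4 potential is lower (more attractive) than the 12-6 potential, reflecting the extra polarization attraction that the abstract credits for the improved solution properties.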
Kang, Tianyu; Ding, Wei; Zhang, Luoyan; Ziemek, Daniel; Zarringhalam, Kourosh
2017-12-19
Stratification of patient subpopulations that respond favorably to treatment or experience an adverse reaction is an essential step toward the development of new personalized therapies and diagnostics. It is currently feasible to generate omic-scale biological measurements for all patients in a study, providing an opportunity for machine learning models to identify molecular markers for disease diagnosis and progression. However, the high variability of genetic background in human populations hampers the reproducibility of omic-scale markers. In this paper, we develop a biological-network-based regularized artificial neural network model for prediction of phenotype from transcriptomic measurements in clinical trials. To improve model sparsity and the overall reproducibility of the model, we incorporate regularization for simultaneous shrinkage of gene sets, based on active upstream regulatory mechanisms, into the model. We benchmark our method against various regression, support vector machine, and artificial neural network models and demonstrate its ability to predict clinical outcomes using clinical trial data on acute rejection in kidney transplantation and response to infliximab in ulcerative colitis. We show that integration of prior biological knowledge into the classification, as developed in this paper, significantly improves the robustness and generalizability of predictions to independent datasets. We provide Java code for our algorithm along with a parsed version of the STRING DB database. In summary, we present a method for prediction of clinical phenotypes using baseline genome-wide expression data that makes use of prior biological knowledge on gene-regulatory interactions to increase the robustness and reproducibility of omic-scale markers. The integrated group-wise regularization method increases the interpretability of biological signatures and gives stable performance estimates across independent test sets.
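The "simultaneous shrinkage of gene sets" described above is in the spirit of a group-lasso penalty, which drives whole groups of coefficients to zero together; the paper's exact regularizer may differ, so this is a hedged sketch of the general form:

```python
import math


def group_lasso_penalty(groups, lam):
    # Sum of size-weighted Euclidean norms, one term per gene set.
    # Each group's weights shrink jointly, giving group-level sparsity
    # rather than independent coefficient-by-coefficient shrinkage.
    return lam * sum(
        math.sqrt(len(g)) * math.sqrt(sum(w * w for w in g))
        for g in groups
    )
```

The penalty is zero exactly when every weight in every group is zero, and grows with each group's overall norm, so the optimizer prefers to switch off entire gene sets.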
Mizutani, Eiji; Wakayama, Sayaka; Wakayama, Teruhiko
2015-01-01
The successful production of cloned animals by somatic cell nuclear transfer (SCNT) is a promising technology with many potential applications in basic research, medicine, and agriculture. However, the low efficiency and the difficulty of cloning are major obstacles to the widespread use of this technology. Since the first mammal cloned from an adult donor cell was born, many attempts have been made to improve animal cloning techniques, and some approaches have successfully improved its efficiency. Nuclear transfer itself is still difficult because it requires an accomplished operator with a practiced technique. Thus, it is very important to find simple and reproducible methods for improving the success rate of SCNT. In this chapter, we will review our recent protocols, which seem to be the simplest and most reliable method to date to improve development of SCNT embryos.
Reproducibility of the vertical dimension of occlusion with an improved measuring gauge.
Morikawa, M; Kozono, Y; Noguchi, B S; Toyoda, S
1988-07-01
An improved gauge using an eyeglass frame, the TOM gauge, was devised. The reproducibility of the record of vertical dimension with this gauge was evaluated through repeated measurements on subjects having a definite centric stop with the natural dentition. Because of the stabilization provided by the frame and the reference point on the apex nasi, the TOM gauge showed excellent reproducibility of the record compared with the conventional gauges. The TOM gauge can be expected to significantly reduce the risk of errors in measuring the vertical dimension of occlusion especially in complete denture fabrication.
Stability enhanced, repeatability improved Parylene-C passivated on QCM sensor for aPTT measurement.
Yang, Yuchen; Zhang, Wei; Guo, Zhen; Zhang, Zhiqi; Zhu, Hongnan; Yan, Ruhong; Zhou, Lianqun
2017-12-15
Determination of blood clotting time is essential in monitoring therapeutic anticoagulants. In this work, a Parylene-C-passivated quartz crystal microbalance (P-QCM) was developed for activated partial thromboplastin time (aPTT) measurement. Compared with a typical QCM, the P-QCM possessed a hydrophobic surface and a sensitive frequency response to viscoelastic variations on the electrode surface. Fibrin could be adsorbed effectively owing to the hydrophobicity of the P-QCM surface. Compared with a typical QCM, the peak-to-peak value (PPV) of the P-QCM was increased by 1.94% ± 0.63%, indicating an enhanced signal-to-noise ratio. For the P-QCM, the coefficients of variation (CV) of the frequency decrease and the aPTT were 2.58% and 1.24%, respectively, demonstrating improved stability and reproducibility. Moreover, the coefficient of determination (R2) against the SYSMEX CS 2000i haematology analyzer was 0.983. In conclusion, the P-QCM shows potential for improving the stability, reproducibility, and linearity of piezoelectric sensors, and may be promising for point-of-care testing (POCT) applications. Copyright © 2017 Elsevier B.V. All rights reserved.
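The coefficient of variation (CV) figures quoted above are the sample standard deviation expressed as a percentage of the mean. A minimal sketch:

```python
import statistics


def coefficient_of_variation(values):
    # CV (%) = sample standard deviation / mean * 100
    return statistics.stdev(values) / statistics.mean(values) * 100.0
```

A CV of 2.58% for the frequency decrease thus means the run-to-run standard deviation was about 2.6% of the mean response.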
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ryan, K; Gil, M; Li, G
Purpose: To develop a novel approach to improve cervical spine (c-spine) curvature reproducibility for head and neck (HN) patients using optical surface imaging (OSI) with two regions of interest (ROIs). Methods: The OSI-guided, two-step setup procedure requires two ROIs: ROI-1 of the shoulders and ROI-2 of the face. The neck can be stretched or squeezed in the superior-inferior (SI) direction using a specially designed sliding head support. We hypothesize that when these two ROIs are aligned, the c-spine should fall into a naturally reproducible position under the same setup conditions. An anthropomorphous phantom test was performed to examine neck pitch angles compared with the calculated angles. Three volunteers participated in the experiments, which started with conventional HN setup using skin markers and room lasers. An OSI image and a lateral photograph were acquired as the references. In each of the three replicate tests, conventional setup was first applied after volunteers got on the couch. ROI-1 was aligned by moving the body, followed by ROI-2 alignment via adjusting head position and orientation under real-time OSI guidance. A final static OSI image and lateral picture were taken to evaluate both anterior and posterior surface alignments. Three degrees of freedom can be adjusted if an open-face mask is applied, including head SI shift using the sliding head support and pitch-and-roll rotations using a commercial couch extension. Surface alignment was analyzed in comparison with conventional setup. Results: The neck pitch angle measured by OSI is consistent with the calculated value (0.2±0.6°). The volunteer study showed improved c-spine setup reproducibility using OSI compared with conventional setup. ROI alignments with 2 mm/1° tolerance are achieved within 3 minutes. An identical knee support is important to achieve ROI-1 pitch alignment. Conclusion: The feasibility of this novel approach has been demonstrated for c-spine curvature setup reproducibility.
Further evaluation is necessary with bony alignment variation in patient studies. This study is in part supported by the NIH (U54CA137788).
Sánchez Pérez, A; Honrubia López, F M; Larrosa Poves, J M; Polo Llorens, V; Melcon Sánchez-Frieras, B
2001-09-01
To develop a lens planimetry technique for the optic disc using AutoCAD, and to determine the magnitude of variability of optic disc morphological measurements. We employed AutoCAD R.14.0 (Autodesk): image acquisition, contour delimitation by multiple-line fitting or ellipse adjustment, image sectorization, and quantification of measurements (optic disc and excavation, vertical diameters, optic disc area, excavation area, neuroretinal sector area, and beta atrophy area). Intraimage (operator) and interimage (total) reproducibility were studied via the coefficient of variability (CV) (n=10) in normal and myopic optic discs. This technique allows optic disc measurements to be obtained in 5 to 10 minutes. Total (interimage) variability of measurements introduced by one observer showed a CV range of 1.18-4.42. Operator (intraimage) measurement showed a CV range of 0.30-4.21. Optic disc contour delimitation by ellipse adjustment achieved better reproducibility than multiple-line adjustment for all measurements. Computer-assisted AutoCAD planimetry is an interactive method for analyzing the optic disc that is feasible to incorporate into clinical practice. Its reproducibility is comparable to that of other analyzers in quantifying optic disc morphology. Ellipse adjustment improves results in optic disc contour delimitation.
The Galaxy platform for accessible, reproducible and collaborative biomedical analyses: 2018 update.
Afgan, Enis; Baker, Dannon; Batut, Bérénice; van den Beek, Marius; Bouvier, Dave; Cech, Martin; Chilton, John; Clements, Dave; Coraor, Nate; Grüning, Björn A; Guerler, Aysam; Hillman-Jackson, Jennifer; Hiltemann, Saskia; Jalili, Vahid; Rasche, Helena; Soranzo, Nicola; Goecks, Jeremy; Taylor, James; Nekrutenko, Anton; Blankenberg, Daniel
2018-05-22
Galaxy (homepage: https://galaxyproject.org, main public server: https://usegalaxy.org) is a web-based scientific analysis platform used by tens of thousands of scientists across the world to analyze large biomedical datasets such as those found in genomics, proteomics, metabolomics and imaging. Started in 2005, Galaxy continues to focus on three key challenges of data-driven biomedical science: making analyses accessible to all researchers, ensuring analyses are completely reproducible, and making it simple to communicate analyses so that they can be reused and extended. During the last two years, the Galaxy team and the open-source community around Galaxy have made substantial improvements to Galaxy's core framework, user interface, tools, and training materials. Framework and user interface improvements now enable Galaxy to be used for analyzing tens of thousands of datasets, and >5500 tools are now available from the Galaxy ToolShed. The Galaxy community has led an effort to create numerous high-quality tutorials focused on common types of genomic analyses. The Galaxy developer and user communities continue to grow and be integral to Galaxy's development. The number of Galaxy public servers, developers contributing to the Galaxy framework and its tools, and users of the main Galaxy server have all increased substantially.
Robust tissue classification for reproducible wound assessment in telemedicine environments
NASA Astrophysics Data System (ADS)
Wannous, Hazem; Treuillet, Sylvie; Lucas, Yves
2010-04-01
In telemedicine environments, a standardized and reproducible assessment of wounds, using a simple hand-held digital camera, is an essential requirement. However, to ensure robust tissue classification, particular attention must be paid to the complete design of the color processing chain. We introduce the key steps, including color correction, merging of expert labelings, and segmentation-driven classification based on support vector machines. The tool thus developed ensures stability under lighting-condition, viewpoint, and camera changes, achieving accurate and robust classification of skin tissues. Clinical tests demonstrate that such an advanced tool, which forms part of a complete 3-D and color wound assessment system, significantly improves the monitoring of the healing process. It achieves an overlap score of 79.3% against 69.1% for a single expert, after mapping onto the medical reference developed from the image labeling by a college of experts.
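The overlap score above measures agreement between the machine labeling and the expert reference. The paper's exact definition is not reproduced here, so a common Jaccard-style overlap over labeled pixel sets is sketched as an assumption:

```python
def overlap_score(pred_pixels, truth_pixels):
    # Jaccard-style overlap (%): intersection over union of the two
    # sets of pixels assigned to a given tissue class
    pred, truth = set(pred_pixels), set(truth_pixels)
    if not pred and not truth:
        return 100.0  # trivially identical empty labelings
    return 100.0 * len(pred & truth) / len(pred | truth)
```

Identical labelings score 100%, and the score falls as the two pixel sets diverge, which is why a higher overlap against the pooled expert reference indicates a better classifier.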
Ortega, Nàdia; Macià, Alba; Romero, Maria-Paz; Trullols, Esther; Morello, Jose-Ramón; Anglès, Neus; Motilva, Maria-Jose
2009-08-26
An improved chromatographic method was developed using ultra-performance liquid chromatography-tandem mass spectrometry to identify and quantify phenolic compounds and alkaloids, theobromine and caffeine, in carob flour samples. The developed method has been validated in terms of speed, sensitivity, selectivity, peak efficiency, linearity, reproducibility, limits of detection, and limits of quantification. The chromatographic method allows the identification and quantification of 20 phenolic compounds, that is, phenolic acids, flavonoids, and their aglycone and glucoside forms, together with the determination of the alkaloids, caffeine and theobromine, at low concentration levels all in a short analysis time of less than 20 min.
Technical advances in proteomics: new developments in data-independent acquisition.
Hu, Alex; Noble, William S; Wolf-Yadlin, Alejandro
2016-01-01
The ultimate aim of proteomics is to fully identify and quantify the entire complement of proteins and post-translational modifications in biological samples of interest. For the last 15 years, liquid chromatography-tandem mass spectrometry (LC-MS/MS) in data-dependent acquisition (DDA) mode has been the standard for proteomics when sampling breadth and discovery were the main objectives; multiple reaction monitoring (MRM) LC-MS/MS has been the standard for targeted proteomics when precise quantification, reproducibility, and validation were the main objectives. Recently, improvements in mass spectrometer design and bioinformatics algorithms have resulted in the rediscovery and development of another sampling method: data-independent acquisition (DIA). DIA comprehensively and repeatedly samples every peptide in a protein digest, producing a complex set of mass spectra that is difficult to interpret without external spectral libraries. Currently, DIA approaches the identification breadth of DDA while achieving the reproducible quantification characteristic of MRM or its newest version, parallel reaction monitoring (PRM). In comparative de novo identification and quantification studies in human cell lysates, DIA identified up to 89% of the proteins detected in a comparable DDA experiment while providing reproducible quantification of over 85% of them. DIA analysis aided by spectral libraries derived from prior DIA experiments or auxiliary DDA data produces identification and quantification as reproducible and precise as that achieved by MRM/PRM, except on low‑abundance peptides that are obscured by stronger signals. DIA is still a work in progress toward the goal of sensitive, reproducible, and precise quantification without external spectral libraries. New software tools applied to DIA analysis have to deal with deconvolution of complex spectra as well as proper filtering of false positives and false negatives. 
However, the future outlook is positive, and various researchers are working on novel bioinformatics techniques to address these issues and increase the reproducibility, fidelity, and identification breadth of DIA.
Yokaribas, Volkan; Schneider, Daniel S.; Friebertshäuser, Philipp; Lemme, Max C.; Fritzen, Claus-Peter
2017-01-01
The two-dimensional material graphene promises a broad variety of sensing capabilities. Owing to its low weight and high versatility, the sensor density on a structure can be increased significantly, which can improve reliability and reduce fluctuation in damage detection strategies such as structural health monitoring (SHM). Moreover, it lays the groundwork for structure–sensor fusion towards self-sensing structures. Strain gauges are extensively used sensors in scientific and industrial applications. In this work, sensing in small strain fields (from −0.1% up to 0.1%) with regard to the structural dynamics of a mechanical structure is presented, with sensitivities comparable to bulk materials, by measuring the inherent piezoresistive effect of graphene grown by chemical vapor deposition (CVD) with a very high aspect ratio of approximately 4.86 × 10^8. It is demonstrated that the number of graphene layers plays a key role in reproducible CVD-graphene strain gauge application, since defects in individual layers may become less important in the current path. This may lead to a more stable response and thus lower scatter. Further results demonstrate the piezoresistive effect in a network of liquid-exfoliated graphene nanoplatelets (GNPs), which yields even higher strain sensitivity and reproducibility. A model-assisted approach provides the main parameters for finding an optimum of sensitivity and reproducibility of GNP films. The fabricated GNP strain gauges show minimal deviation in the piezoresistive effect, with a gauge factor (GF) of approximately 5.6, and predict linear electromechanical behaviour up to 1% strain. Spray deposition is used to develop a low-cost, scalable manufacturing process for GNP strain gauges. In this context, the challenge of reproducible and reliable manufacturing and operation must be overcome.
The developed sensors demonstrate the importance of reproducible sensor performance and open the path for graphene strain gauges toward potential use in science and industry. PMID:29258260
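The gauge factor (GF) reported for the GNP strain gauges follows the standard strain-gauge relation GF = (ΔR/R0)/ε, i.e. relative resistance change per unit strain. A minimal sketch with illustrative numbers:

```python
def gauge_factor(delta_r_over_r0, strain):
    # GF = (ΔR/R0) / ε : relative resistance change per unit mechanical strain
    return delta_r_over_r0 / strain
```

For example, a 0.56% relative resistance change measured at 0.1% strain corresponds to a gauge factor of 5.6, the value quoted in the abstract.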
Hicks, Michael B; Regalado, Erik L; Tan, Feng; Gong, Xiaoyi; Welch, Christopher J
2016-01-05
Supercritical fluid chromatography (SFC) has long been a preferred method for enantiopurity analysis in support of pharmaceutical discovery and development, but implementation of the technique in regulated GMP laboratories has been somewhat slow, owing to limitations in instrument sensitivity, reproducibility, accuracy and robustness. In recent years, commercialization of next generation analytical SFC instrumentation has addressed previous shortcomings, making the technique better suited for GMP analysis. In this study we investigate the use of modern SFC for enantiopurity analysis of several pharmaceutical intermediates and compare the results with the conventional HPLC approaches historically used for analysis in a GMP setting. The findings clearly illustrate that modern SFC now exhibits improved precision, reproducibility, accuracy and robustness; also providing superior resolution and peak capacity compared to HPLC. Based on these findings, the use of modern chiral SFC is recommended for GMP studies of stereochemistry in pharmaceutical development and manufacturing. Copyright © 2015 Elsevier B.V. All rights reserved.
Asymmetric Shock Wave Generation in a Microwave Rocket Using a Magnetic Field
NASA Astrophysics Data System (ADS)
Takahashi, Masayuki
2017-10-01
A plasma pattern is reproduced by coupling a particle-in-cell with Monte Carlo collisions (PIC-MCC) model and a finite-difference time-domain (FDTD) simulation of electromagnetic wave propagation when an external magnetic field is applied to the breakdown volume inside a microwave-rocket nozzle. The propagation speed and energy-absorption rate of the plasma are estimated from the breakdown simulation, and these are used to reproduce the shock wave propagation that provides impulsive thrust for the microwave rocket. The shock wave propagation is numerically reproduced by solving the compressible Euler equations with an energy source representing the microwave heating. The shock wave is asymmetrically generated inside the nozzle when the electron cyclotron resonance region has a lateral offset, which generates lateral and angular impulses for postural control of the vehicle. By controlling the spatial distribution of the external magnetic field, it is possible to develop an integrated device that maintains beaming flight of the microwave rocket, achieving both axial thrust improvement and postural control.
Hughey, Christine A; Wilcox, Bruce; Minardi, Carina S; Takehara, Chiyo W; Sundararaman, Meenakshi; Were, Lilian M
2008-05-30
A rapid negative-ion ESI high-performance capillary liquid chromatography-mass spectrometry method was developed to identify and quantify flavonoids (e.g., flavanols, flavonols, flavanones, and glycosides). Fifteen standards and two varieties of almond skin extract powder (Carmel and Nonpareil) were used to demonstrate the chromatographic separation, reproducibility, and accuracy of the method, which employed a 150 mm x 0.3 mm ChromXP 3C18-EP-120 column. All standards eluted in less than 10 min, providing a 9-12x reduction in analysis time compared with existing methods (90-120 min). However, isomers (e.g., catechin/epicatechin and galactosides/glucosides) were not resolved and were therefore identified and quantified collectively. RSDs for retention time and peak area reproducibility (mass spectrometry data) were <0.5% and <5.0%, respectively. Peak area reproducibility was greatly improved (from an RSD >10%) after the implementation of a low-flow metal needle in the ESI source. Quantitation by mass spectrometry also afforded an error of less than 5% for most compounds.
Blancett, Candace D; Fetterer, David P; Koistinen, Keith A; Morazzani, Elaine M; Monninger, Mitchell K; Piper, Ashley E; Kuehl, Kathleen A; Kearney, Brian J; Norris, Sarah L; Rossi, Cynthia A; Glass, Pamela J; Sun, Mei G
2017-10-01
A method for accurate quantitation of virus particles has long been sought, but a perfect method still eludes the scientific community. Electron Microscopy (EM) quantitation is a valuable technique because it provides direct morphology information and counts of all viral particles, whether or not they are infectious. In the past, EM negative stain quantitation methods have been cited as inaccurate, non-reproducible, and with detection limits that were too high to be useful. To improve accuracy and reproducibility, we have developed a method termed Scanning Transmission Electron Microscopy - Virus Quantitation (STEM-VQ), which simplifies sample preparation and uses a high throughput STEM detector in a Scanning Electron Microscope (SEM) coupled with commercially available software. In this paper, we demonstrate STEM-VQ with an alphavirus stock preparation to present the method's accuracy and reproducibility, including a comparison of STEM-VQ to viral plaque assay and the ViroCyt Virus Counter. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Development of a stain shade guide to aid the measurement of extrinsic dental stain.
Gadhia, K; Shah, R; Swaminathan, D; Wetton, S; Moran, J
2006-05-01
Accurate and reproducible assessment of extrinsic staining is pivotal to determining the efficacy of some tooth-whitening oral hygiene products. The aims of this study were: (1) to produce a stain shade guide to aid in vitro and in vivo stain assessment and (2) to assess intra- and inter-examiner reproducibility of stain assessment using the stain shade guide. Perspex and acrylic tooth specimens were stained using chlorhexidine and tea. The amount of staining on the perspex was measured with a spectrophotometer, and the values obtained were assigned to the stained acrylic teeth, which were made into a stain guide. Using clinical photographs and a group of 10 volunteers, stain area and intensity were assessed with the stain guide and the recognized Lobene stain index by two examiners. The degree of intra- and inter-examiner reproducibility for these measurements was assessed using Cohen's kappa statistic. For both the clinical examination and the use of photographs, intra-examiner reproducibility for stain intensity was improved when using the stain guide compared with the Lobene index. Similarly, when assessing inter-examiner reproducibility, stain intensity kappa values were greater using the stain guide (kappa = 0.82) than the Lobene index (kappa = 0.57). The findings of this study suggest that the stain guide could be of value in the assessment of extrinsic dental stain.
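Cohen's kappa, used above for intra- and inter-examiner agreement, corrects raw agreement for the agreement expected by chance. A minimal two-rater sketch (the example ratings below are illustrative, not the study's data):

```python
def cohens_kappa(rater_a, rater_b):
    # kappa = (observed agreement - chance agreement) / (1 - chance agreement)
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    p_observed = sum(x == y for x, y in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal label frequencies
    p_chance = sum(
        (rater_a.count(label) / n) * (rater_b.count(label) / n)
        for label in labels
    )
    if p_chance == 1.0:
        return 1.0  # both raters used a single identical label throughout
    return (p_observed - p_chance) / (1.0 - p_chance)
```

A kappa of 1.0 means perfect agreement beyond chance; values such as the study's 0.82 versus 0.57 indicate how much more consistently the two examiners scored with the stain guide than with the Lobene index.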
Running an open experiment: transparency and reproducibility in soil and ecosystem science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bond-Lamberty, Benjamin; Smith, Ashly P.; Bailey, Vanessa L.
Researchers in soil and ecosystem science, and almost every other field, are being pushed--by funders, journals, governments, and their peers--to increase transparency and reproducibility of their work. A key part of this effort is a move towards open data as a way to fight post-publication data loss, improve data and code quality, enable powerful meta- and cross-disciplinary analyses, and increase trust in, and the efficiency of, publicly-funded research. Many scientists, however, lack experience in, and may be unsure of the benefits of, making their data and fully-reproducible analyses publicly available. Here we describe a recent "open experiment", in which we documented every aspect of a soil incubation online, making all raw data, scripts, diagnostics, final analyses, and manuscripts available in real time. We found that using tools such as version control, issue tracking, and open-source statistical software improved data integrity, accelerated our team's communication and productivity, and ensured transparency. There are many avenues to improve scientific reproducibility and data availability, of which this is only one example, and it is not an approach suited for every experiment or situation. Nonetheless, we encourage the communities in our respective fields to consider its advantages, and to lead rather than follow with respect to scientific reproducibility, transparency, and data availability.
2010-10-01
reproducibility of genotype calls among the four batches by comparing the HapMap samples across batches. We also calculated identity-by-descent (IBD)...used to aid clinicians in personalizing dosage to improve the therapeutic index of radiotherapy treatment for prostate cancer.
Hypersonic Technology Developments with EU Co-Funded Projects
2010-09-01
metal or even high performance alloys. The hollow sphere technology allows high degrees of porosity, reproducible properties and fair process control...sandwich structure configuration will be investigated. Titanium alloys and Ti-aluminides exhibit excellent mechanical properties for applications where...cooling techniques, new alloys, improved thermodynamic cycles by increased pressure ratios and TIT, etc. As the Olympus 593 engine was based on the
Protein purification and analysis: next generation Western blotting techniques.
Mishra, Manish; Tiwari, Shuchita; Gomes, Aldrin V
2017-11-01
Western blotting is one of the most commonly used techniques in molecular biology and proteomics. Since western blotting is a multistep protocol, variations and errors can occur at any step, reducing the reliability and reproducibility of this technique. Recent reports suggest that a few key steps, such as the sample preparation method, the amount and source of primary antibody used, as well as the normalization method utilized, are critical for reproducible western blot results. Areas covered: In this review, improvements in different areas of western blotting, including protein transfer and antibody validation, are summarized. The review discusses the most advanced western blotting techniques available and highlights the relationship between next generation western blotting techniques and their clinical relevance. Expert commentary: Over the last decade significant improvements have been made in creating more sensitive, automated, and advanced techniques by optimizing various aspects of the western blot protocol. New methods such as single-cell-resolution western blot, capillary electrophoresis, DigiWest, automated microfluidic western blotting and microchip electrophoresis have all been developed to reduce potential problems associated with the western blotting technique. Innovative developments in instrumentation and increased sensitivity for western blots offer novel possibilities for increasing the clinical implications of western blotting.
An Open, Large-Scale, Collaborative Effort to Estimate the Reproducibility of Psychological Science.
2012-11-01
Reproducibility is a defining feature of science. However, because of strong incentives for innovation and weak incentives for confirmation, direct replication is rarely practiced or published. The Reproducibility Project is an open, large-scale, collaborative effort to systematically examine the rate and predictors of reproducibility in psychological science. So far, 72 volunteer researchers from 41 institutions have organized to openly and transparently replicate studies published in three prominent psychological journals in 2008. Multiple methods will be used to evaluate the findings, calculate an empirical rate of replication, and investigate factors that predict reproducibility. Whatever the result, a better understanding of reproducibility will ultimately improve confidence in scientific methodology and findings. © The Author(s) 2012.
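The "empirical rate of replication" the project aims to calculate is, at its simplest, a proportion with a confidence interval. A minimal sketch, assuming a Wilson score interval; the counts below are invented for illustration and are not the project's results:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score interval for a binomial proportion (95% by default)."""
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return centre - half, centre + half

replicated, attempted = 36, 97   # illustrative counts only
rate = replicated / attempted
lo, hi = wilson_ci(replicated, attempted)
assert lo < rate < hi
```

The Wilson interval is preferred over the naive normal approximation here because replication counts can be small and the rate far from 0.5.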
sdg Interacting boson hamiltonian in the seniority scheme
NASA Astrophysics Data System (ADS)
Yoshinaga, N.
1989-03-01
The sdg interacting boson hamiltonian is derived in the seniority scheme. We use the method of Otsuka, Arima and Iachello in order to derive the boson hamiltonian from the fermion hamiltonian. To examine how good the boson approximation is at zeroth order, we carry out exact shell model calculations in a single j-shell. It is found that almost all low-lying levels are reproduced quite well by diagonalizing the sdg interacting boson hamiltonian in the vibrational case. In the deformed case the introduction of g-bosons improves the reproduction of the spectra and of the binding energies obtained by diagonalizing the exact shell model hamiltonian. In particular the sdg interacting boson model reproduces well-developed rotational bands.
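Both the shell-model and boson calculations described above ultimately reduce to numerically diagonalizing a Hamiltonian matrix and comparing the resulting level spectra. A generic sketch, using an invented toy symmetric matrix rather than the actual sdg hamiltonian:

```python
import numpy as np

# Toy example: build a small real symmetric "hamiltonian" and extract its
# spectrum. The matrix entries are random and carry no physical meaning.
rng = np.random.default_rng(42)
a = rng.normal(size=(6, 6))
h = (a + a.T) / 2                # symmetrize to get a valid hamiltonian
levels = np.linalg.eigvalsh(h)   # eigenvalues, returned in ascending order
assert len(levels) == 6 and np.all(np.diff(levels) >= 0)
```

In the abstract's comparison, the low-lying entries of two such spectra (shell model vs. boson model) are matched level by level.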
RIPOSTE: a framework for improving the design and analysis of laboratory-based research
Masca, Nicholas GD; Hensor, Elizabeth MA; Cornelius, Victoria R; Buffa, Francesca M; Marriott, Helen M; Eales, James M; Messenger, Michael P; Anderson, Amy E; Boot, Chris; Bunce, Catey; Goldin, Robert D; Harris, Jessica; Hinchliffe, Rod F; Junaid, Hiba; Kingston, Shaun; Martin-Ruiz, Carmen; Nelson, Christopher P; Peacock, Janet; Seed, Paul T; Shinkins, Bethany; Staples, Karl J; Toombs, Jamie; Wright, Adam KA; Teare, M Dawn
2015-01-01
Lack of reproducibility is an ongoing problem in some areas of the biomedical sciences. Poor experimental design and a failure to engage with experienced statisticians at key stages in the design and analysis of experiments are two factors that contribute to this problem. The RIPOSTE (Reducing IrreProducibility in labOratory STudiEs) framework has been developed to support early and regular discussions between scientists and statisticians in order to improve the design, conduct and analysis of laboratory studies and, therefore, to reduce irreproducibility. This framework is intended for use during the early stages of a research project, when specific questions or hypotheses are proposed. The essential points within the framework are explained and illustrated using three examples (a medical equipment test, a macrophage study and a gene expression study). Sound study design minimises the possibility of bias being introduced into experiments and leads to higher quality research with more reproducible results. DOI: http://dx.doi.org/10.7554/eLife.05519.001 PMID:25951517
Huang, Shao-Hua; Ng, Lean-Teik
2011-07-22
An improved normal phase high performance liquid chromatographic (NP-HPLC) method was developed for the simultaneous quantification of eight vitamin E isomers (α-, β-, γ- and δ-tocopherols and α-, β-, γ- and δ-tocotrienols) and γ-oryzanol in rice. A complete separation of all compounds was achieved within 25 min using an Inertsil CN-3, SIL-100A 5 μm (4.6 mm × 250 mm) column and an isocratic elution system of hexane/isopropanol/ethyl acetate/acetic acid (97.6:0.8:0.8:0.8, v/v/v/v) at a flow rate varying from 0.7 to 1.5 mL min⁻¹. A linear correlation coefficient (r² > 0.99) and high reproducibility were obtained at concentrations ranging from 0.05 to 10 μg mL⁻¹ for vitamin E isomers and from 0.5 to 500 μg mL⁻¹ for γ-oryzanol. This method proved to be rapid, accurate and reproducible. Copyright © 2011 Elsevier B.V. All rights reserved.
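The reported linearity (r² > 0.99 over the calibration range) comes from an ordinary least-squares fit of detector response against standard concentrations. A minimal sketch; the peak areas below are invented illustrative numbers, not data from the paper:

```python
def linear_fit(x, y):
    """Ordinary least squares for y = a*x + b; returns (a, b, r_squared)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = my - a * mx
    ss_res = sum((yi - (a * xi + b)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1 - ss_res / ss_tot

conc = [0.05, 0.1, 0.5, 1.0, 5.0, 10.0]       # standard concentrations, ug/mL
area = [1.1, 2.0, 10.4, 20.1, 101.0, 199.5]   # detector response (invented)
slope, intercept, r2 = linear_fit(conc, area)
assert r2 > 0.99   # the acceptance criterion reported in the abstract
```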
Li, Qing-bo; Liu, Jie-qiang; Li, Xiang
2012-03-01
A small non-invasive measurement system for human blood glucose has been developed, which can achieve fast, real-time and non-invasive measurement of human blood glucose. The device is composed of four main parts: fixture, light system, data acquisition and processing systems, and spectrometer. A new light source driving scheme was proposed, which can meet the requirements of the light source under a variety of spectral acquisition conditions. An integrated fixture design was proposed, which not only simplifies the optical structure of the system but also improves the reproducibility of measurement conditions. The microcontrol system mainly handles control functions, data processing, data storage and so on. As the most important component, the DSP microprocessor (TMS320F2812) offers low power consumption, high processing speed and high computing ability. Wavelet denoising is used to pretreat the spectral data, which can decrease the loss of incident light and improve the signal-to-noise ratio. The kernel partial least squares method was adopted to build the mathematical model, which improves the precision of the system. In the calibration experiment, the standard values were measured with a One-Touch meter. The correlation coefficient between the standard blood glucose values and the system's measured values is 0.95, and the root mean square error of measurement is 0.6 mmol L⁻¹. The system has good reproducibility.
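The two reported figures of merit, the correlation coefficient against the reference meter and the root-mean-square error, can be computed as follows. The glucose values are invented for illustration:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

def rmse(x, y):
    """Root mean square error between paired measurements."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)) / len(x))

reference = [4.2, 5.1, 6.3, 7.8, 9.0]   # reference readings, mmol/L (invented)
predicted = [4.5, 4.9, 6.8, 7.4, 9.3]   # NIR model output, mmol/L (invented)
r = pearson_r(reference, predicted)
err = rmse(reference, predicted)
assert r > 0.9 and err < 0.6
```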
Improved capacitive stress transducers for high-field superconducting magnets
NASA Astrophysics Data System (ADS)
Benson, Christopher Pete; Holik, Eddie Frank, III; Jaisle, Andrew; McInturff, A.; McIntyre, P.
2012-06-01
High-field (12-18 Tesla) superconducting magnets are required to enable an increase in the energy of future colliders. Such field strength requires the use of Nb3Sn superconductor, which has limited tolerance for compressive and shear strain. A strategy for stress management has been developed at Texas A&M University and is being implemented in TAMU3, a short-model 14 Tesla stress-managed Nb3Sn block dipole. The strategy includes the use of laminar capacitive stress transducers to monitor the stresses within the coil package. We have developed fabrication techniques and fixtures, which improve the reproducibility of the transducer response both at room temperature and during cryogenic operation. This is a report of the status of transducer development.
Gray, Allan; Wright, Alex; Jackson, Pete; Hale, Mike; Treanor, Darren
2015-03-01
Histochemical staining of tissue is a fundamental technique in tissue diagnosis and research, but it suffers from significant variability. Efforts to address this include laboratory quality controls and quality assurance schemes, but these rely on subjective interpretation of stain quality, are laborious and have low reproducibility. We aimed (1) to develop a method for histochemical stain quantification using whole slide imaging and image analysis and (2) to demonstrate its usefulness in measuring staining variation. A method to quantify the individual stain components of histochemical stains on virtual slides was developed. It was evaluated for repeatability and reproducibility, then applied to control sections of an appendix to quantify H&E staining (H/E intensities and H:E ratio) between automated staining machines and to measure differences between six regional diagnostic laboratories. The method was validated with <0.5% variation in H:E ratio measurement when using the same scanner for a batch of slides (ie, it was repeatable) but was not highly reproducible between scanners or over time, where variation of 7% was found. Application of the method showed that H:E ratios between three staining machines varied from 0.69 to 0.93, and H:E ratio variation over time was also observed. Interlaboratory comparison demonstrated differences in H:E ratio between regional laboratories from 0.57 to 0.89. A simple method using whole slide imaging can be used to quantify and compare histochemical staining. This method could be deployed in routine quality assurance and quality control. Work is needed on whole slide imaging devices to improve reproducibility. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
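A simplified sketch of the slide-level metric: given stain-separated hematoxylin and eosin channel images (the colour-deconvolution step that produces them is assumed and not shown), the H:E ratio is just the ratio of mean stain intensities. The arrays below are invented random data:

```python
import numpy as np

# Invented per-pixel stain densities standing in for colour-deconvolved
# channels from a scanned virtual slide.
rng = np.random.default_rng(1)
hematoxylin = rng.uniform(0.3, 0.9, size=(512, 512))
eosin = rng.uniform(0.4, 1.0, size=(512, 512))

h_mean = float(hematoxylin.mean())
e_mean = float(eosin.mean())
ratio = h_mean / e_mean   # the slide-level H:E quality metric
assert 0 < ratio
```

Tracking this ratio per control slide over time, per stainer, or per laboratory gives exactly the kind of comparison the abstract reports (machine-to-machine ratios of 0.69 to 0.93, interlaboratory 0.57 to 0.89).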
Methods to increase reproducibility in differential gene expression via meta-analysis
Sweeney, Timothy E.; Haynes, Winston A.; Vallania, Francesco; Ioannidis, John P.; Khatri, Purvesh
2017-01-01
Findings from clinical and biological studies are often not reproducible when tested in independent cohorts. Due to the testing of a large number of hypotheses and relatively small sample sizes, results from whole-genome expression studies in particular are often not reproducible. Compared to single-study analysis, gene expression meta-analysis can improve reproducibility by integrating data from multiple studies. However, there are multiple choices in designing and carrying out a meta-analysis, and clear guidelines on best practices are scarce. Here, we hypothesized that studying subsets of very large meta-analyses would allow for systematic identification of best practices to improve reproducibility. We therefore constructed three very large gene expression meta-analyses from clinical samples, and then examined meta-analyses of subsets of the datasets (all combinations of datasets with up to N/2 samples and K/2 datasets) compared to a ‘silver standard’ of differentially expressed genes found in the entire cohort. We tested three random-effects meta-analysis models using this procedure. We showed relatively greater reproducibility with more stringent effect-size thresholds combined with relaxed significance thresholds; relatively lower reproducibility when imposing extraneous constraints on residual heterogeneity; and an underestimation of the actual false positive rate by Benjamini–Hochberg correction. In addition, multivariate regression showed that the accuracy of a meta-analysis increased significantly with more included datasets, even when controlling for sample size. PMID:27634930
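One of the procedures examined above, Benjamini–Hochberg correction, can be sketched in a few lines. The p-values below are illustrative, not data from the paper:

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """BH step-up procedure: reject the k smallest p-values, where k is the
    largest rank with p_(k) <= (k/m) * alpha. Returns a boolean list."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / m * alpha:
            k = rank
    rejected = [False] * m
    for rank, i in enumerate(order, start=1):
        rejected[i] = rank <= k
    return rejected

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.060, 0.074, 0.205]
flags = benjamini_hochberg(pvals)
assert flags == [True, True, False, False, False, False, False, False]
```

The paper's point is that the nominal false-discovery guarantee of this procedure can understate the actual false positive rate when the underlying meta-analytic model is misspecified.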
An Index to Objectively Score Supraglottic Abnormalities in Refractory Asthma
Good, James T.; Rollins, Donald R.; Curran-Everett, Douglas; Lommatzsch, Steven E.; Carolan, Brendan J.; Stubenrauch, Peter C.
2014-01-01
Background: Patients with refractory asthma frequently have elements of laryngopharyngeal reflux (LPR) with potential aspiration contributing to their poor control. We previously reported on a supraglottic index (SGI) scoring system that helps in the evaluation of LPR with potential aspiration. However, to further the usefulness of this SGI scoring system for bronchoscopists, a teaching system was developed that included both interobserver and intraobserver reproducibility. Methods: Five pulmonologists with expertise in fiber-optic bronchoscopy but novice to the SGI participated. A training system was developed that could be used via Internet interaction to make this learning technique widely available. Results: By the final testing, there was excellent interreader agreement (κ of at least 0.81), thus documenting reproducibility in scoring the SGI. For the measure of intrareader consistency, one reader was arbitrarily selected to rescore the final test 4 weeks later and had a κ value of 0.93, with a 95% CI of 0.79 to 1.00. Conclusions: In this study, we demonstrate that with an organized educational approach, bronchoscopists can develop skills to have highly reproducible assessment and scoring of supraglottic abnormalities. The SGI can be used to determine which patients need additional intervention to determine causes of LPR and gastroesophageal reflux. Identification of this problem in patients with refractory asthma allows for personal, individual directed therapy to improve asthma control. PMID:24202552
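The interreader agreement statistic used above, Cohen's κ, compares observed agreement with the agreement expected by chance. A minimal sketch with invented ratings (two raters grading six cases on a 3-point scale):

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' categorical scores on the same items."""
    n = len(r1)
    cats = set(r1) | set(r2)
    po = sum(a == b for a, b in zip(r1, r2)) / n          # observed agreement
    pe = sum((r1.count(c) / n) * (r2.count(c) / n)        # chance agreement
             for c in cats)
    return (po - pe) / (1 - pe)

rater1 = [1, 2, 2, 3, 1, 2]   # invented SGI-style grades
rater2 = [1, 2, 2, 3, 2, 2]
kappa = cohens_kappa(rater1, rater2)
assert kappa > 0.6   # "substantial" agreement on the usual Landis-Koch scale
```

Values of κ ≥ 0.81, as reported for the trained bronchoscopists, are conventionally read as "almost perfect" agreement.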
Macedo-Ojeda, Gabriela; Márquez-Sandoval, Fabiola; Fernández-Ballart, Joan; Vizmanos, Barbara
2016-01-01
The study of diet quality in a population provides information for the development of programs to improve nutritional status through better directed actions. The aim of this study was to assess the reproducibility and relative validity of a Mexican Diet Quality Index (ICDMx) for the assessment of the habitual diet of adults. The ICDMx was designed to assess the characteristics of a healthy diet using a validated semi-quantitative food frequency questionnaire (FFQ-Mx). Reproducibility was determined by comparing 2 ICDMx based on FFQs (one-year interval). Relative validity was assessed by comparing the ICDMx (2nd FFQ) with that estimated based on the intake averages from dietary records (nine days). The questionnaires were answered by 97 adults (mean age in years = 27.5, SD = 12.6). Pearson (r) and intraclass correlations (ICC) were calculated; Bland-Altman plots, Cohen’s κ coefficients and blood lipid determinations complemented the analysis. Additional analysis compared ICDMx scores with nutrients derived from dietary records, using a Pearson correlation. These nutrient intakes were transformed logarithmically to improve normality (log10) and adjusted according to energy, prior to analyses. The ICDMx obtained ICC reproducibility values ranged from 0.33 to 0.87 (23/24 items with significant correlations; mean = 0.63), while relative validity ranged from 0.26 to 0.79 (mean = 0.45). Bland-Altman plots showed a high level of agreement between methods. ICDMx scores were inversely correlated (p < 0.05) with total blood cholesterol (r = −0.33) and triglycerides (r = −0.22). ICDMx (as calculated from FFQs and DRs) obtained positive correlations with fiber, magnesium, potassium, retinol, thiamin, riboflavin, pyridoxine, and folate. The ICDMx obtained acceptable levels of reproducibility and relative validity in this population. It can be useful for population nutritional surveillance and to assess the changes resulting from the implementation of nutritional interventions. 
PMID:27563921
Progress toward openness, transparency, and reproducibility in cognitive neuroscience.
Gilmore, Rick O; Diaz, Michele T; Wyble, Brad A; Yarkoni, Tal
2017-05-01
Accumulating evidence suggests that many findings in psychological science and cognitive neuroscience may prove difficult to reproduce; statistical power in brain imaging studies is low and has not improved recently; software errors in analysis tools are common and can go undetected for many years; and, a few large-scale studies notwithstanding, open sharing of data, code, and materials remain the rare exception. At the same time, there is a renewed focus on reproducibility, transparency, and openness as essential core values in cognitive neuroscience. The emergence and rapid growth of data archives, meta-analytic tools, software pipelines, and research groups devoted to improved methodology reflect this new sensibility. We review evidence that the field has begun to embrace new open research practices and illustrate how these can begin to address problems of reproducibility, statistical power, and transparency in ways that will ultimately accelerate discovery. © 2017 New York Academy of Sciences.
Maikusa, Norihide; Yamashita, Fumio; Tanaka, Kenichiro; Abe, Osamu; Kawaguchi, Atsushi; Kabasawa, Hiroyuki; Chiba, Shoma; Kasahara, Akihiro; Kobayashi, Nobuhisa; Yuasa, Tetsuya; Sato, Noriko; Matsuda, Hiroshi; Iwatsubo, Takeshi
2013-06-01
Serial magnetic resonance imaging (MRI) images acquired from multisite and multivendor MRI scanners are widely used in measuring longitudinal structural changes in the brain. Precise and accurate measurements are important in understanding the natural progression of neurodegenerative disorders such as Alzheimer's disease. However, geometric distortions in MRI images decrease the accuracy and precision of volumetric or morphometric measurements. To solve this problem, the authors suggest a commercially available phantom-based distortion correction method that accommodates the variation in geometric distortion within MRI images obtained with multivendor MRI scanners. The authors' method is based on image warping using a polynomial function. The method detects fiducial points within a phantom image using phantom analysis software developed by the Mayo Clinic and calculates warping functions for distortion correction. To quantify the effectiveness of the authors' method, the authors corrected phantom images obtained from multivendor MRI scanners and calculated the root-mean-square (RMS) of fiducial errors and the circularity ratio as evaluation values. The authors also compared the performance of the authors' method with that of a distortion correction method based on a spherical harmonics description of the generic gradient design parameters. Moreover, the authors evaluated whether this correction improves the test-retest reproducibility of voxel-based morphometry in human studies. A Wilcoxon signed-rank test with uncorrected and corrected images was performed. The root-mean-square errors and circularity ratios for all slices significantly improved (p < 0.0001) after the authors' distortion correction. Additionally, the authors' method was significantly better than the spherical-harmonics-based distortion correction method in improving root-mean-square errors and circularity ratios (p < 0.001 and p = 0.0337, respectively).
Moreover, the authors' method reduced the RMS error arising from gradient nonlinearity more than gradwarp methods. In human studies, the coefficient of variation of voxel-based morphometry analysis of the whole brain improved significantly from 3.46% to 2.70% after distortion correction of the whole gray matter using the authors' method (Wilcoxon signed-rank test, p < 0.05). The authors proposed a phantom-based distortion correction method to improve reproducibility in longitudinal structural brain analysis using multivendor MRI. The authors evaluated the authors' method for phantom images in terms of two geometrical values and for human images in terms of test-retest reproducibility. The results showed that distortion was corrected significantly using the authors' method. In human studies, the reproducibility of voxel-based morphometry analysis for the whole gray matter significantly improved after distortion correction using the authors' method.
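The core of the correction, fitting a polynomial warp from detected fiducial points back to their known phantom positions, can be sketched as follows. The fiducial coordinates and distortion coefficients are invented; this is not the authors' software:

```python
import numpy as np

def design(x, y):
    """Second-order 2D polynomial design matrix: 1, x, y, xy, x^2, y^2."""
    return np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])

rng = np.random.default_rng(0)
true_x = rng.uniform(-100, 100, 60)   # known phantom fiducial positions, mm
true_y = rng.uniform(-100, 100, 60)
# Simulated smooth geometric distortion (invented coefficients).
obs_x = true_x + 1e-4 * true_x ** 2 - 2e-4 * true_x * true_y
obs_y = true_y - 1e-4 * true_y ** 2 + 1e-4 * true_x * true_y

# Least-squares fit of the warp that maps observed -> true coordinates.
A = design(obs_x, obs_y)
cx, *_ = np.linalg.lstsq(A, true_x, rcond=None)
cy, *_ = np.linalg.lstsq(A, true_y, rcond=None)

rms_before = np.sqrt(np.mean((obs_x - true_x) ** 2 + (obs_y - true_y) ** 2))
rms_after = np.sqrt(np.mean((A @ cx - true_x) ** 2 + (A @ cy - true_y) ** 2))
assert rms_after < rms_before
```

Applying the fitted warp to the whole image (not just the fiducials) is what corrects the scan; the RMS fiducial error before and after is the evaluation value the abstract reports.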
Image analysis and modeling in medical image computing. Recent developments and advances.
Handels, H; Deserno, T M; Meinzer, H-P; Tolxdorff, T
2012-01-01
Medical image computing is of growing importance in medical diagnostics and image-guided therapy. Nowadays, image analysis systems integrating advanced image computing methods are used in practice e.g. to extract quantitative image parameters or to support the surgeon during a navigated intervention. However, the grade of automation, accuracy, reproducibility and robustness of medical image computing methods has to be increased to meet the requirements in clinical routine. In the focus theme, recent developments and advances in the field of modeling and model-based image analysis are described. The introduction of models in the image analysis process enables improvements of image analysis algorithms in terms of automation, accuracy, reproducibility and robustness. Furthermore, model-based image computing techniques open up new perspectives for prediction of organ changes and risk analysis of patients. Selected contributions are assembled to present latest advances in the field. The authors were invited to present their recent work and results based on their outstanding contributions to the Conference on Medical Image Computing BVM 2011 held at the University of Lübeck, Germany. All manuscripts had to pass a comprehensive peer review. Modeling approaches and model-based image analysis methods showing new trends and perspectives in model-based medical image computing are described. Complex models are used in different medical applications and medical images like radiographic images, dual-energy CT images, MR images, diffusion tensor images as well as microscopic images are analyzed. The applications emphasize the high potential and the wide application range of these methods. The use of model-based image analysis methods can improve segmentation quality as well as the accuracy and reproducibility of quantitative image analysis. Furthermore, image-based models enable new insights and can lead to a deeper understanding of complex dynamic mechanisms in the human body. 
Hence, model-based image computing methods are important tools to improve medical diagnostics and patient treatment in future.
Minimally invasive surgical video analysis: a powerful tool for surgical training and navigation.
Sánchez-González, P; Oropesa, I; Gómez, E J
2013-01-01
Analysis of minimally invasive surgical videos is a powerful tool to drive new solutions for achieving reproducible training programs, objective and transparent assessment systems and navigation tools to assist surgeons and improve patient safety. This paper presents how video analysis contributes to the development of new cognitive and motor training and assessment programs as well as new paradigms for image-guided surgery.
The Harm Done to Reproducibility by the Culture of Null Hypothesis Significance Testing.
Lash, Timothy L
2017-09-15
In the last few years, stakeholders in the scientific community have raised alarms about a perceived lack of reproducibility of scientific results. In reaction, guidelines for journals have been promulgated and grant applicants have been asked to address the rigor and reproducibility of their proposed projects. Neither solution addresses a primary culprit, which is the culture of null hypothesis significance testing that dominates statistical analysis and inference. In an innovative research enterprise, selection of results for further evaluation based on null hypothesis significance testing is doomed to yield a low proportion of reproducible results and a high proportion of effects that are initially overestimated. In addition, the culture of null hypothesis significance testing discourages quantitative adjustments to account for systematic errors and quantitative incorporation of prior information. These strategies would otherwise improve reproducibility and have not been previously proposed in the widely cited literature on this topic. Without discarding the culture of null hypothesis significance testing and implementing these alternative methods for statistical analysis and inference, all other strategies for improving reproducibility will yield marginal gains at best. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Enhancing reproducibility: Failures from Reproducibility Initiatives underline core challenges.
Mullane, Kevin; Williams, Michael
2017-08-15
Efforts to address reproducibility concerns in biomedical research include: initiatives to improve journal publication standards and peer review; increased attention to publishing methodological details that enable experiments to be reconstructed; guidelines on standards for study design, implementation, analysis and execution; meta-analyses of multiple studies within a field to synthesize a common conclusion; and the formation of consortia to adopt uniform protocols and internally reproduce data. Another approach to addressing reproducibility is Reproducibility Initiatives (RIs), well-intended, high-profile, systematically peer-vetted initiatives that are intended to replace the traditional process of scientific self-correction. Outcomes from the RIs reported to date have questioned the usefulness of this approach, particularly when the RI outcome differs from other independent self-correction studies that have reproduced the original finding. As a failed RI attempt is a single outcome distinct from the original study, it cannot provide any definitive conclusions, necessitating additional studies that the RI approach has neither the ability nor the intent of conducting, making it a questionable replacement for self-correction. A failed RI attempt also has the potential to damage the reputation of the author of the original finding. Reproduction is frequently confused with replication, an issue that is more than semantic, with the former denoting "similarity" and the latter an "exact copy" - an impossible outcome in research because of known and unknown technical, environmental and motivational differences between the original and reproduction studies. To date, the RI framework has negatively impacted efforts to improve reproducibility, confounding attempts to determine whether a research finding is real. Copyright © 2017 Elsevier Inc. All rights reserved.
Multi-oriented windowed harmonic phase reconstruction for robust cardiac strain imaging.
Cordero-Grande, Lucilio; Royuela-del-Val, Javier; Sanz-Estébanez, Santiago; Martín-Fernández, Marcos; Alberola-López, Carlos
2016-04-01
The purpose of this paper is to develop a method for direct estimation of the cardiac strain tensor by extending the harmonic phase reconstruction on tagged magnetic resonance images to obtain more precise and robust measurements. The extension relies on the reconstruction of the local phase of the image by means of the windowed Fourier transform and the acquisition of an overdetermined set of stripe orientations in order to avoid the phase interferences from structures outside the myocardium and the instabilities arising from the application of a gradient operator. Results have shown that increasing the number of acquired orientations provides a significant improvement in the reproducibility of the strain measurements and that the acquisition of an extended set of orientations also improves the reproducibility when compared with acquiring repeated samples from a smaller set of orientations. Additionally, biases in local phase estimation when using the original harmonic phase formulation are greatly diminished by the one here proposed. The ideas here presented allow the design of new methods for motion sensitive magnetic resonance imaging, which could simultaneously improve the resolution, robustness and accuracy of motion estimates. Copyright © 2015 Elsevier B.V. All rights reserved.
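The local-phase idea can be illustrated in 1D: demodulate the tagged signal with a windowed complex exponential at the tag frequency and take the angle of the result. The signal and motion below are invented, and the paper's actual method is 2D and multi-oriented; this is only a sketch of the windowed-Fourier principle:

```python
import numpy as np

n = 256
x = np.arange(n)
k = 2 * np.pi / 16                        # tag frequency (16-pixel spacing)
motion = 2.0 * np.sin(2 * np.pi * x / n)  # invented smooth displacement field
signal = np.cos(k * (x - motion))         # displaced tag pattern

half = 16
win = np.hanning(2 * half + 1)
phase = np.empty(n)
for i in range(half, n - half):
    idx = np.arange(i - half, i + half + 1)
    # Single-frequency windowed Fourier transform centred at pixel i.
    coef = np.sum(signal[idx] * win * np.exp(-1j * k * (idx - i)))
    phase[i] = np.angle(coef)

# Away from the borders, the estimate matches the true harmonic phase
# k*(x - motion), modulo 2*pi.
true_phase = k * (x - motion)
err = np.angle(np.exp(1j * (phase[half:n - half] - true_phase[half:n - half])))
assert np.max(np.abs(err)) < 0.2
```

In the full method, displacement (and hence strain) follows from differences of such phase maps across tag orientations and time frames, which is why an overdetermined set of orientations improves robustness.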
Bambini, Deborah; Emery, Matthew; de Voest, Margaret; Meny, Lisa; Shoemaker, Michael J.
2016-01-01
There are significant limitations among the few prior studies that have examined the development and implementation of interprofessional education (IPE) experiences to accommodate a high volume of students from several disciplines and from different institutions. The present study addressed these gaps by seeking to determine the extent to which a single, large, inter-institutional, and IPE simulation event improves student perceptions of the importance and relevance of IPE and simulation as a learning modality, whether there is a difference in students’ perceptions among disciplines, and whether the results are reproducible. A total of 290 medical, nursing, pharmacy, and physical therapy students participated in one of two large, inter-institutional, IPE simulation events. Measurements included student perceptions about their simulation experience using the Attitude Towards Teamwork in Training Undergoing Designed Educational Simulation (ATTITUDES) Questionnaire and open-ended questions related to teamwork and communication. Results demonstrated a statistically significant improvement across all ATTITUDES subscales, while time management, role confusion, collaboration, and mutual support emerged as significant themes. Results of the present study indicate that a single IPE simulation event can reproducibly result in significant and educationally meaningful improvements in student perceptions towards teamwork, IPE, and simulation as a learning modality. PMID:28970407
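The pre/post subscale comparisons reported above are typically paired tests. A minimal paired t-statistic sketch; the Likert-scale scores are invented for illustration:

```python
import math

pre  = [3.2, 3.8, 4.0, 3.5, 3.9, 3.6, 4.1, 3.4]   # pre-simulation scores
post = [4.0, 4.1, 4.4, 3.9, 4.3, 4.0, 4.5, 3.8]   # post-simulation scores
diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)
mean_d = sum(diffs) / n
sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
t = mean_d / (sd_d / math.sqrt(n))
# Compare t against the two-sided critical value for df = n - 1
# (about 2.36 at p = 0.05 with df = 7).
assert t > 2.36
```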
NASA Astrophysics Data System (ADS)
Nagai, Haruyasu; Terada, Hiroaki; Tsuduki, Katsunori; Katata, Genki; Ota, Masakazu; Furuno, Akiko; Akari, Shusaku
2017-09-01
In order to assess the radiological dose to the public resulting from the Fukushima Daiichi Nuclear Power Station (FDNPS) accident in Japan, especially for the early phase of the accident when no measured data are available for that purpose, the spatial and temporal distribution of radioactive materials in the environment is reconstructed by computer simulations. In this study, the atmospheric dispersion simulation of radioactive materials is improved by refining the source term of the radioactive materials discharged into the atmosphere and by modifying the atmospheric transport, dispersion and deposition model (ATDM). A database of the spatiotemporal distribution of radioactive materials in the air and on the ground surface is then developed from the output of the simulation. This database is used in other studies for dose assessment by coupling it with the behavioral patterns of evacuees from the FDNPS accident. By improving the ATDM simulation to use a new meteorological model and a sophisticated deposition scheme, the simulations reproduced the 137Cs and 131I deposition patterns well. To further improve the reproducibility of the dispersion processes, the source term was refined by optimizing it against the improved ATDM simulation using new monitoring data.
Porcupine: A visual pipeline tool for neuroimaging analysis
Snoek, Lukas; Knapen, Tomas
2018-01-01
The field of neuroimaging is rapidly adopting a more reproducible approach to data acquisition and analysis. Data structures and formats are being standardised and data analyses are getting more automated. However, as data analysis becomes more complicated, researchers often have to write longer analysis scripts, spanning different tools across multiple programming languages. This makes it more difficult to share or recreate code, reducing the reproducibility of the analysis. We present a tool, Porcupine, that constructs one’s analysis visually and automatically produces analysis code. The graphical representation improves understanding of the performed analysis, while retaining the flexibility of modifying the produced code manually to custom needs. Not only does Porcupine produce the analysis code, it also creates a shareable environment for running the code in the form of a Docker image. Together, this forms a reproducible way of constructing, visualising and sharing one’s analysis. Currently, Porcupine links to Nipype functionalities, which in turn accesses most standard neuroimaging analysis tools. Our goal is to release researchers from the constraints of specific implementation details, thereby freeing them to think about novel and creative ways to solve a given problem. Porcupine improves the overview researchers have of their processing pipelines, and facilitates both the development and communication of their work. This will reduce the threshold at which less expert users can generate reusable pipelines. With Porcupine, we bridge the gap between a conceptual and an implementational level of analysis and make it easier for researchers to create reproducible and shareable science. We provide a wide range of examples and documentation, as well as installer files for all platforms on our website: https://timvanmourik.github.io/Porcupine. Porcupine is free, open source, and released under the GNU General Public License v3.0. PMID:29746461
Raguse, Marina; Fiebrandt, Marcel; Stapelmann, Katharina; Madela, Kazimierz; Laue, Michael; Lackmann, Jan-Wilm; Thwaite, Joanne E.; Setlow, Peter; Awakowicz, Peter
2016-01-01
Novel decontamination technologies, including cold low-pressure plasma and blue light (400 nm), are promising alternatives to conventional surface decontamination methods. However, the standardization of the assessment of such sterilization processes remains to be accomplished. Bacterial endospores of the genera Bacillus and Geobacillus are frequently used as biological indicators (BIs) of sterility. Ensuring standardized and reproducible BIs for reliable testing procedures is a significant problem in industrial settings. In this study, an electrically driven spray deposition device was developed, allowing fast, reproducible, and homogeneous preparation of Bacillus subtilis 168 spore monolayers on glass surfaces. A detailed description of the structural design as well as the operating principle of the spraying device is given. The reproducible formation of spore monolayers of up to 5 × 107 spores per sample was verified by scanning electron microscopy. Surface inactivation studies revealed that monolayered spores were inactivated by UV-C (254 nm), low-pressure argon plasma (500 W, 10 Pa, 100 standard cubic cm per min), and blue light (400 nm) significantly faster than multilayered spores were. We have thus succeeded in the uniform preparation of reproducible, highly concentrated spore monolayers with the potential to generate BIs for a variety of nonpenetrating surface decontamination techniques. PMID:26801572
A Caveat Note on Tuning in the Development of Coupled Climate Models
NASA Astrophysics Data System (ADS)
Dommenget, Dietmar; Rezny, Michael
2018-01-01
State-of-the-art coupled general circulation models (CGCMs) have substantial errors in their simulations of climate. In particular, these errors can lead to large uncertainties in the simulated climate response (both globally and regionally) to a doubling of CO2. Currently, tuning of the parameterization schemes in CGCMs is a significant part of the development process. It is not clear whether such tuning actually improves models. The tuning process is (in general) neither documented nor reproducible. Alternative methods such as flux correction are not used, nor is it clear whether such methods would perform better. In this study, ensembles of perturbed physics experiments are performed with the Globally Resolved Energy Balance (GREB) model to test the impact of tuning. The work illustrates that tuning has, on average, limited skill given the complexity of the system, the limited computing resources, and the limited observations available to optimize parameters. While tuning may improve model performance (such as reproducing observed past climate), it will not get closer to the "true" physics, nor will it significantly improve future climate change projections. Tuning will introduce artificial compensating error interactions between submodels that will hamper further model development. In turn, flux corrections do perform well in most, but not all, aspects. A main advantage of flux correction is that it is much cheaper, simpler, and more transparent, and it does not introduce artificial error interactions between submodels. These GREB model experiments should be considered a pilot study to motivate further CGCM studies that address the issues of model tuning.
Bad Behavior: Improving Reproducibility in Behavior Testing.
Andrews, Anne M; Cheng, Xinyi; Altieri, Stefanie C; Yang, Hongyan
2018-01-24
Systems neuroscience research is increasingly possible through the use of integrated molecular and circuit-level analyses. These studies depend on the use of animal models and, in many cases, on behavior testing associated with genetic, pharmacologic, epigenetic, and other types of environmental manipulations. We illustrate typical pitfalls resulting from poor validation of behavior tests. We describe experimental designs and enumerate controls needed to improve reproducibility in the investigation and reporting of behavioral phenotypes.
The operations manual: a mechanism for improving the research process.
Bowman, Ann; Wyman, Jean F; Peters, Jennifer
2002-01-01
The development and use of an operations manual has the potential to improve the capacity of nurse scientists to address the complex, multifaceted issues associated with conducting research in today's healthcare environment. An operations manual facilitates communication, standardizes training and evaluation, and enhances the development and standard implementation of clear policies, processes, and protocols. A 10-year review of methodology articles in relevant nursing journals revealed no attention to this topic. This article will discuss how an operations manual can improve the conduct of research methods and outcomes for both small-scale and large-scale research studies. It also describes the purpose and components of a prototype operations manual for use in quantitative research. The operations manual increases reliability and reproducibility of the research while improving the management of study processes. It can prevent costly and untimely delays or errors in the conduct of research.
Translational aspects of rectal evoked potentials: a comparative study in rats and humans
Nissen, Thomas Dahl; Graversen, Carina; Coen, Steven J.; Hultin, Leif; Aziz, Qasim; Lykkesfeldt, Jens; Drewes, Asbjørn Mohr
2013-01-01
Inconsistencies between species have stunted the progress of developing new analgesics. To increase the success of translating results between species, improved comparable models are required. Twelve rats received rectal balloon distensions on 2 different days separated by 24.3 (SD 24.6) days. Rectal balloon distensions were also performed in 18 humans (mean age: 34 yr; range: 21–56 yr; 12 men) on two separate occasions, separated by 9.3 (SD 5.5) days. In rats, cerebral evoked potentials (CEPs) were recorded by use of implanted skull electrodes at a distension pressure of 80 mmHg. In humans, surface electrodes and individualized pressure, corresponding to the pain detection threshold, were used. Comparison of morphology was assessed by wavelet analysis. Within- and between-day reproducibility was assessed in terms of latencies, amplitudes, and frequency content. In rats, CEPs showed a triphasic morphology. No differences in latencies, amplitudes, and power distribution were seen within or between days (all P ≥ 0.5). Peak-to-peak amplitude between the first positive and negative potential was the most reproducible characteristic within and between days (evaluated by intraclass correlation coefficients, ICC) (ICC = 0.99 and ICC = 0.98, respectively). In humans, CEPs showed a triphasic morphology. No differences in latencies, amplitudes, or power distribution were seen within or between days (all P ≥ 0.2). Latency to the second negative potential (ICC = 0.98) and the second positive potential (ICC = 0.95) was the most reproducible characteristic within and between days. A unique and reliable translational platform was established for assessing visceral sensitivity in rats and humans, which may improve the translational process of developing new drugs targeting visceral pain. PMID:23703652
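The test-retest agreement statistic relied on in this abstract, the intraclass correlation coefficient, can be computed from the mean squares of a two-way ANOVA. Below is a minimal sketch of the absolute-agreement form ICC(2,1) for a two-session design; it is a generic Shrout-and-Fleiss-style implementation, not the authors' analysis code:

```python
import numpy as np

def icc_2_1(session1, session2):
    """Two-way random-effects, absolute-agreement intraclass correlation
    ICC(2,1) for a two-session test-retest design. Generic sketch; the
    paper does not publish its analysis code."""
    data = np.column_stack([np.asarray(session1, float),
                            np.asarray(session2, float)])
    n, k = data.shape                      # subjects, sessions
    grand = data.mean()
    # mean squares for rows (subjects) and columns (sessions)
    ms_rows = k * np.sum((data.mean(axis=1) - grand) ** 2) / (n - 1)
    ms_cols = n * np.sum((data.mean(axis=0) - grand) ** 2) / (k - 1)
    # residual mean square from the sum-of-squares decomposition
    ss_err = (np.sum((data - grand) ** 2)
              - (n - 1) * ms_rows - (k - 1) * ms_cols)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)
```

For two identical sessions the function returns 1.0; a constant offset between sessions lowers the value, since ICC(2,1) penalizes absolute disagreement rather than just poor correlation.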
NASA Astrophysics Data System (ADS)
Katz, Daniel S.; Choi, Sou-Cheng T.; Wilkins-Diehr, Nancy; Chue Hong, Neil; Venters, Colin C.; Howison, James; Seinstra, Frank; Jones, Matthew; Cranston, Karen; Clune, Thomas L.; de Val-Borro, Miguel; Littauer, Richard
2016-02-01
This technical report records and discusses the Second Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE2). The report includes a description of the alternative, experimental submission and review process, two workshop keynote presentations, a series of lightning talks, a discussion on sustainability, and five discussions from the topic areas of exploring sustainability; software development experiences; credit & incentives; reproducibility & reuse & sharing; and code testing & code review. For each topic, the report includes a list of tangible actions that were proposed and that would lead to potential change. The workshop recognized that reliance on scientific software is pervasive in all areas of world-leading research today. The workshop participants then proceeded to explore different perspectives on the concept of sustainability. Key enablers and barriers of sustainable scientific software were identified from their experiences. In addition, recommendations with new requirements such as software credit files and software prize frameworks were outlined for improving practices in sustainable software engineering. There was also broad consensus that formal training in software development or engineering was rare among the practitioners. Significant strides need to be made in building a sense of community via training in software and technical practices, on increasing their size and scope, and on better integrating them directly into graduate education programs. Finally, journals can define and publish policies to improve reproducibility, whereas reviewers can insist that authors provide sufficient information and access to data and software to allow them to reproduce the results in the paper. Hence, a list of criteria is compiled for journals to provide to reviewers so as to make it easier to review software submitted for publication as a "Software Paper."
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wohlgemuth, J.; Bokria, J.; Gu, X.
Polymeric encapsulation materials may change size when processed at typical module lamination temperatures. The relief of residual strain, trapped during the manufacture of encapsulation sheet, can affect module performance and reliability. For example, displaced cells and interconnects can lead to cell fracture; broken interconnects (open circuits and ground faults); delamination at interfaces; and void formation. A standardized test for the characterization of change in linear dimensions of encapsulation sheet has been developed and verified. The IEC 62788-1-5 standard quantifies the maximum change in linear dimensions that may occur, to allow for process control of size change. Developments incorporated into the Committee Draft (CD) of the standard, as well as the assessment of the repeatability and reproducibility of the test method, are described here. No pass/fail criteria are given in the standard; rather, a repeatable protocol to quantify the change in dimension is provided to aid those working with encapsulation. The round-robin experiment described here identified that the repeatability and reproducibility of measurements is on the order of 1%. Recent refinements to the test procedure to improve repeatability and reproducibility include: the use of a convection oven to improve the thermal equilibration time constant and its uniformity; well-defined measurement locations to reduce the effects of sampling size and location relative to the specimen edges; a standardized sand substrate that may be readily obtained to reduce friction that would otherwise complicate the results; defined specimen sampling, so that material is examined at known sites across the width and length of rolls; and examination of the encapsulation at the manufacturer's recommended processing temperature, except when a cross-linking reaction may limit the size change. EVA, for example, should be examined at 100 °C, between its melt transition (occurring up to 80 °C) and the onset of cross-linking (often at 100 °C).
An Algorithm Using Twelve Properties of Antibiotics to Find the Recommended Antibiotics, as in CPGs.
Tsopra, R; Venot, A; Duclos, C
2014-01-01
Clinical Decision Support Systems (CDSS) incorporating justifications, updating and adjustable recommendations can considerably improve the quality of healthcare. We propose a new approach to the design of CDSS for empiric antibiotic prescription, based on implementation of the deeper medical reasoning used by experts in the development of clinical practice guidelines (CPGs), to deduce the recommended antibiotics. We investigated two methods ("exclusion" versus "scoring") for reproducing this reasoning based on antibiotic properties. The "exclusion" method reproduced expert reasoning the more accurately, retrieving the full list of recommended antibiotics for almost all clinical situations. This approach has several advantages: (i) it provides convincing explanations for physicians; (ii) updating could easily be incorporated into the CDSS; (iii) it can provide recommendations for clinical situations missing from CPGs.
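The "exclusion" method described here can be sketched as a simple filter over a property table: start from all candidate antibiotics and discard any whose properties conflict with the clinical situation. The antibiotic names and properties below are illustrative placeholders, not the twelve properties or CPG content used in the paper:

```python
# Hypothetical property table; the real CDSS uses twelve antibiotic
# properties extracted from clinical practice guidelines (CPGs).
ANTIBIOTICS = {
    "amoxicillin":   {"spectrum": "narrow", "oral": True,  "pregnancy_safe": True},
    "ciprofloxacin": {"spectrum": "broad",  "oral": True,  "pregnancy_safe": False},
    "gentamicin":    {"spectrum": "broad",  "oral": False, "pregnancy_safe": False},
}

def recommend_by_exclusion(antibiotics, constraints):
    """'Exclusion' method: drop every antibiotic whose properties violate
    a constraint of the clinical situation; whatever survives is the
    recommended list."""
    remaining = dict(antibiotics)
    for prop, required in constraints.items():
        remaining = {name: props for name, props in remaining.items()
                     if props.get(prop) == required}
    return sorted(remaining)

# An oral option that is safe in pregnancy:
print(recommend_by_exclusion(ANTIBIOTICS, {"oral": True, "pregnancy_safe": True}))
```

One advantage noted in the abstract follows directly from this structure: each discarded antibiotic can be reported together with the property it violated, which yields convincing explanations for physicians.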
GigaDB: promoting data dissemination and reproducibility
Sneddon, Tam P.; Si Zhe, Xiao; Edmunds, Scott C.; Li, Peter; Goodman, Laurie; Hunter, Christopher I.
2014-01-01
Often papers are published where the underlying data supporting the research are not made available because of the limitations of making such large data sets publicly and permanently accessible. Even if the raw data are deposited in public archives, the essential analysis intermediaries, scripts or software are frequently not made available, meaning the science is not reproducible. The GigaScience journal is attempting to address this issue with the associated data storage and dissemination portal, the GigaScience database (GigaDB). Here we present the current version of GigaDB and reveal plans for the next generation of improvements. However, most importantly, we are soliciting responses from you, the users, to ensure that future developments are focused on the data storage and dissemination issues that still need resolving. Database URL: http://www.gigadb.org PMID:24622612
Cyberinfrastructure to Support Collaborative and Reproducible Computational Hydrologic Modeling
NASA Astrophysics Data System (ADS)
Goodall, J. L.; Castronova, A. M.; Bandaragoda, C.; Morsy, M. M.; Sadler, J. M.; Essawy, B.; Tarboton, D. G.; Malik, T.; Nijssen, B.; Clark, M. P.; Liu, Y.; Wang, S. W.
2017-12-01
Creating cyberinfrastructure to support reproducibility of computational hydrologic models is an important research challenge. Addressing this challenge requires open and reusable code and data with machine and human readable metadata, organized in ways that allow others to replicate results and verify published findings. Specific digital objects that must be tracked for reproducible computational hydrologic modeling include (1) raw initial datasets, (2) data processing scripts used to clean and organize the data, (3) processed model inputs, (4) model results, and (5) the model code with an itemization of all software dependencies and computational requirements. HydroShare is a cyberinfrastructure under active development designed to help users store, share, and publish digital research products in order to improve reproducibility in computational hydrology, with an architecture supporting hydrologic-specific resource metadata. Researchers can upload data required for modeling, add hydrology-specific metadata to these resources, and use the data directly within HydroShare.org for collaborative modeling using tools like CyberGIS, Sciunit-CLI, and JupyterHub that have been integrated with HydroShare to run models using notebooks, Docker containers, and cloud resources. Current research aims to implement the Structure For Unifying Multiple Modeling Alternatives (SUMMA) hydrologic model within HydroShare to support hypothesis-driven hydrologic modeling while also taking advantage of the HydroShare cyberinfrastructure. The goal of this integration is to create the cyberinfrastructure that supports hypothesis-driven model experimentation, education, and training efforts by lowering barriers to entry, reducing the time spent on informatics technology and software development, and supporting collaborative research within and across research groups.
ITK: enabling reproducible research and open science
McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis
2014-01-01
Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46. PMID:24600387
2012-01-01
Background Baeyer-Villiger monooxygenases (BVMOs) represent a group of enzymes of considerable biotechnological relevance as illustrated by their growing use as biocatalyst in a variety of synthetic applications. However, due to their increased use the reproducible expression of BVMOs and other biotechnologically relevant enzymes has become a pressing matter while knowledge about the factors governing their reproducible expression is scattered. Results Here, we have used phenylacetone monooxygenase (PAMO) from Thermobifida fusca, a prototype Type I BVMO, as a model enzyme to develop a stepwise strategy to optimize the biotransformation performance of recombinant E. coli expressing PAMO in 96-well microtiter plates in a reproducible fashion. Using this system, the best expression conditions of PAMO were investigated first, including different host strains, temperature as well as time and induction period for PAMO expression. This optimized system was used next to improve biotransformation conditions, the PAMO-catalyzed conversion of phenylacetone, by evaluating the best electron donor, substrate concentration, and the temperature and length of biotransformation. Combining all optimized parameters resulted in a more than four-fold enhancement of the biocatalytic performance and, importantly, this was highly reproducible as indicated by the relative standard deviation of 1% for non-washed cells and 3% for washed cells. Furthermore, the optimized procedure was successfully adapted for activity-based mutant screening. Conclusions Our optimized procedure, which provides a comprehensive overview of the key factors influencing the reproducible expression and performance of a biocatalyst, is expected to form a rational basis for the optimization of miniaturized biotransformations and for the design of novel activity-based screening procedures suitable for BVMOs and other NAD(P)H-dependent enzymes as well. PMID:22720747
Estimation of contrast agent bolus arrival delays for improved reproducibility of liver DCE MRI
NASA Astrophysics Data System (ADS)
Chouhan, Manil D.; Bainbridge, Alan; Atkinson, David; Punwani, Shonit; Mookerjee, Rajeshwar P.; Lythgoe, Mark F.; Taylor, Stuart A.
2016-10-01
Delays between contrast agent (CA) arrival at the site of vascular input function (VIF) sampling and the tissue of interest affect dynamic contrast enhanced (DCE) MRI pharmacokinetic modelling. We investigate effects of altering VIF CA bolus arrival delays on liver DCE MRI perfusion parameters, propose an alternative approach to estimating delays and evaluate reproducibility. Thirteen healthy volunteers (28.7 ± 1.9 years, seven males) underwent liver DCE MRI using dual-input single compartment modelling, with reproducibility (n = 9) measured at 7 days. Effects of VIF CA bolus arrival delays were assessed for arterial and portal venous input functions. Delays were pre-estimated using linear regression, with restricted free modelling around the pre-estimated delay. Perfusion parameters and 7 days reproducibility were compared using this method, freely modelled delays and no delays using one-way ANOVA. Reproducibility was assessed using Bland-Altman analysis of agreement. Maximum percent change relative to parameters obtained using zero delays, were -31% for portal venous (PV) perfusion, +43% for total liver blood flow (TLBF), +3247% for hepatic arterial (HA) fraction, +150% for mean transit time and -10% for distribution volume. Differences were demonstrated between the 3 methods for PV perfusion (p = 0.0085) and HA fraction (p < 0.0001), but not other parameters. Improved mean differences and Bland-Altman 95% Limits-of-Agreement for reproducibility of PV perfusion (9.3 ml/min/100 g, ±506.1 ml/min/100 g) and TLBF (43.8 ml/min/100 g, ±586.7 ml/min/100 g) were demonstrated using pre-estimated delays with constrained free modelling. CA bolus arrival delays cause profound differences in liver DCE MRI quantification. Pre-estimation of delays with constrained free modelling improved 7 days reproducibility of perfusion parameters in volunteers.
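The study pre-estimates VIF-to-tissue delays before applying constrained free modelling. As a generic illustration of delay pre-estimation (a correlation-based grid search over lags, standing in for the authors' linear-regression approach, which is not published as code), one might write:

```python
import numpy as np

def estimate_bolus_delay(vif, tissue, dt, max_delay_s=10.0):
    """Pre-estimate the contrast agent bolus arrival delay between a
    vascular input function (VIF) and a tissue uptake curve by a grid
    search over integer-sample shifts, keeping the lag that maximizes
    the correlation. Illustrative only; parameter names are assumptions."""
    max_lag = int(max_delay_s / dt)
    best_lag, best_r = 0, -np.inf
    for lag in range(max_lag + 1):
        # shift the VIF later in time by `lag` samples and correlate
        shifted = vif[:len(vif) - lag] if lag else vif
        r = np.corrcoef(shifted, tissue[lag:])[0, 1]
        if r > best_r:
            best_lag, best_r = lag, r
    return best_lag * dt   # delay in seconds
```

The estimated lag can then seed restricted free modelling of the delay inside the pharmacokinetic fit, as the abstract describes.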
Hackley, Paul C.; Araujo, Carla Viviane; Borrego, Angeles G.; Bouzinos, Antonis; Cardott, Brian; Cook, Alan C.; Eble, Cortland; Flores, Deolinda; Gentzis, Thomas; Gonçalves, Paula Alexandra; Filho, João Graciano Mendonça; Hámor-Vidó, Mária; Jelonek, Iwona; Kommeren, Kees; Knowles, Wayne; Kus, Jolanta; Mastalerz, Maria; Menezes, Taíssa Rêgo; Newman, Jane; Pawlewicz, Mark; Pickel, Walter; Potter, Judith; Ranasinghe, Paddy; Read, Harold; Reyes, Julito; Rodriguez, Genaro De La Rosa; de Souza, Igor Viegas Alves Fernandes; Suarez-Ruiz, Isabel; Sýkorová, Ivana; Valentine, Brett J.
2015-01-01
Vitrinite reflectance generally is considered the most robust thermal maturity parameter available for application to hydrocarbon exploration and petroleum system evaluation. However, until 2011 there was no standardized methodology available to provide guidelines for vitrinite reflectance measurements in shale. Efforts to correct this deficiency resulted in publication of ASTM D7708: Standard test method for microscopical determination of the reflectance of vitrinite dispersed in sedimentary rocks. In 2012-2013, an interlaboratory exercise was conducted to establish precision limits for the D7708 measurement technique. Six samples, representing a wide variety of shale, were tested in duplicate by 28 analysts in 22 laboratories from 14 countries. Samples ranged from immature to overmature (0.31-1.53% Ro), from organic-lean to organic-rich (1-22 wt.% total organic carbon), and contained Type I (lacustrine), Type II (marine), and Type III (terrestrial) kerogens. Repeatability limits (maximum difference between valid repetitive results from same operator, same conditions) ranged from 0.03-0.11% absolute reflectance, whereas reproducibility limits (maximum difference between valid results obtained on same test material by different operators, different laboratories) ranged from 0.12-0.54% absolute reflectance. Repeatability and reproducibility limits degraded consistently with increasing maturity and decreasing organic content. However, samples with terrestrial kerogens (Type III) fell off this trend, showing improved levels of reproducibility due to higher vitrinite content and improved ease of identification. Operators did not consistently meet the reporting requirements of the test method, indicating that a common reporting template is required to improve data quality. 
The most difficult problem encountered was the petrographic distinction of solid bitumens and low-reflecting inert macerals from vitrinite when vitrinite occurred with reflectance ranges overlapping the other components. Discussion among participants suggested this problem could not be easily corrected via kerogen concentration or solvent extraction and is related to operator training and background. No statistical difference in mean reflectance was identified between participants reporting bitumen reflectance vs. vitrinite reflectance vs. a mixture of bitumen and vitrinite reflectance values, suggesting empirical conversion schemes should be treated with caution. Analysis of reproducibility limits obtained during this exercise in comparison to reproducibility limits from historical interlaboratory exercises suggests use of a common methodology (D7708) improves interlaboratory precision. Future work will investigate opportunities to improve reproducibility in high maturity, organic-lean shale varieties.
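The repeatability and reproducibility limits quoted for this interlaboratory exercise follow the usual precision-statistics convention (as in ISO 5725 / ASTM E691, with the 2.8 coverage factor). A minimal sketch for one material, assuming replicate determinations per laboratory:

```python
import numpy as np

def precision_limits(results):
    """Repeatability (r) and reproducibility (R) limits from replicate
    results per laboratory: r = 2.8 * s_r and R = 2.8 * s_R, where s_r
    is the pooled within-lab standard deviation and s_R also includes
    the between-lab variance component. `results` is a list of per-lab
    replicate tuples, e.g. [(v1, v2), ...]. Generic sketch, not the
    exercise's statistical code."""
    data = np.asarray(results, dtype=float)
    n = data.shape[1]                              # replicates per lab
    s_r2 = np.mean(data.var(axis=1, ddof=1))       # within-lab variance
    s_xbar2 = data.mean(axis=1).var(ddof=1)        # variance of lab means
    s_L2 = max(s_xbar2 - s_r2 / n, 0.0)            # between-lab component
    s_R2 = s_L2 + s_r2
    return 2.8 * np.sqrt(s_r2), 2.8 * np.sqrt(s_R2)
```

By construction R ≥ r, which is consistent with the exercise's reproducibility limits (0.12-0.54% absolute reflectance) exceeding its repeatability limits (0.03-0.11%).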
How reproducible are methods to measure the dynamic viscoelastic properties of poroelastic media?
NASA Astrophysics Data System (ADS)
Bonfiglio, Paolo; Pompoli, Francesco; Horoshenkov, Kirill V.; Rahim, Mahmud Iskandar B. Seth A.; Jaouen, Luc; Rodenas, Julia; Bécot, François-Xavier; Gourdon, Emmanuel; Jaeger, Dirk; Kursch, Volker; Tarello, Maurizio; Roozen, Nicolaas Bernardus; Glorieux, Christ; Ferrian, Fabrizio; Leroy, Pierre; Vangosa, Francesco Briatico; Dauchez, Nicolas; Foucart, Félix; Lei, Lei; Carillo, Kevin; Doutres, Olivier; Sgard, Franck; Panneton, Raymond; Verdiere, Kévin; Bertolini, Claudio; Bär, Rolf; Groby, Jean-Philippe; Geslain, Alan; Poulain, Nicolas; Rouleau, Lucie; Guinault, Alain; Ahmadi, Hamid; Forge, Charlie
2018-08-01
There is a considerable number of research publications on the acoustical properties of porous media with an elastic frame. A simple search through the Web of Science™ (last accessed 21 March 2018) suggests that there are at least 819 publications which deal with the acoustics of poroelastic media. The majority of these studies require accurate knowledge of the elastic properties over a broad frequency range. However, the accuracy of the measurement of the dynamic elastic properties of poroelastic media has been a contentious issue. The novelty of this paper is that it studies the reproducibility of some popular experimental methods which are used routinely to measure key elastic properties such as the dynamic Young's modulus, loss factor and Poisson ratio of poroelastic media. In this paper, fourteen independent sets of laboratory measurements were performed on specimens of the same porous materials. The results from these measurements suggest that the reproducibility of this type of experimental method is poor. This work can help to identify improvements that harmonize the way the elastic properties of poroelastic media are measured worldwide.
Massive and Reproducible Production of Liver Buds Entirely from Human Pluripotent Stem Cells.
Takebe, Takanori; Sekine, Keisuke; Kimura, Masaki; Yoshizawa, Emi; Ayano, Satoru; Koido, Masaru; Funayama, Shizuka; Nakanishi, Noriko; Hisai, Tomoko; Kobayashi, Tatsuya; Kasai, Toshiharu; Kitada, Rina; Mori, Akira; Ayabe, Hiroaki; Ejiri, Yoko; Amimoto, Naoki; Yamazaki, Yosuke; Ogawa, Shimpei; Ishikawa, Momotaro; Kiyota, Yasujiro; Sato, Yasuhiko; Nozawa, Kohei; Okamoto, Satoshi; Ueno, Yasuharu; Taniguchi, Hideki
2017-12-05
Organoid technology provides a revolutionary paradigm toward therapy but has yet to be applied in humans, mainly because of reproducibility and scalability challenges. Here, we overcome these limitations by evolving a scalable organ bud production platform entirely from human induced pluripotent stem cells (iPSC). By conducting massive "reverse" screen experiments, we identified three progenitor populations that can effectively generate liver buds in a highly reproducible manner: hepatic endoderm, endothelium, and septum mesenchyme. Furthermore, we achieved human scalability by developing an omni-well-array culture platform for mass producing homogeneous and miniaturized liver buds on a clinically relevant large scale (>10⁸). Vascularized and functional liver tissues generated entirely from iPSCs significantly improved subsequent hepatic functionalization potentiated by stage-matched developmental progenitor interactions, enabling functional rescue against acute liver failure via transplantation. Overall, our study provides a stringent manufacturing platform for multicellular organoid supply, thus facilitating clinical and pharmaceutical applications especially for the treatment of liver diseases through multi-industrial collaborations. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Development of a device to simulate tooth mobility.
Erdelt, Kurt-Jürgen; Lamper, Timea
2010-10-01
The testing of new materials under simulated oral conditions is essential in medicine. For the simulation of fracture strength, different simulation devices are used in test set-ups. The results of these in vitro tests differ because there is no standardization of tooth mobility in simulation devices. The aim of this study was to develop a simulation device that depicts the tooth mobility curve as accurately as possible and creates reproducible and scalable mobility curves. With the aid of published literature and with the help of dentists, average forms of tooth classes were generated. Based on these tooth data, different abutment tooth shapes and different simulation devices were designed with a CAD system and produced with a rapid prototyping system. Then, for all simulation devices, displacement curves were recorded with a universal testing machine and compared with the tooth mobility curve. With this new information, an improved, adapted simulation device was constructed. The result is a simulation device that simulates the mobility curve of natural teeth with high accuracy and whose mobility is reproducible and scalable.
Design and development of biomimetic quadruped robot for behavior studies of rats and mice.
Ishii, Hiroyuki; Masuda, Yuichi; Miyagishima, Syunsuke; Fumino, Shogo; Takanishi, Atsuo; Laschi, Cecilia; Mazzolai, Barbara; Mattoli, Virgilio; Dario, Paolo
2009-01-01
This paper presents the design and development of a novel biomimetic quadruped robot for behavior studies of rats and mice. Many studies using these animals have been performed to understand the human mind in psychology, pharmacology and brain science. In these fields, several experiments on social interactions have been performed using rats as basic studies of mental disorders or social learning. However, some researchers note that experiments on social interactions using animals are poorly reproducible. We therefore consider that the reproducibility of these experiments can be improved by using a robotic agent that interacts with an animal subject. Thus, we developed a small quadruped robot, WR-2 (Waseda Rat No. 2), that behaves like a real rat. The proportions and DOF arrangement of WR-2 are based on those of a mature rat. The robot has four 3-DOF legs, a 2-DOF waist and a 1-DOF neck. A microcontroller, a wireless communication module and a battery are mounted on board, so the robot can walk, rear up on its limbs and groom its body.
Mars, Godelief M J; van Eijk, Jacques Th M; Post, Marcel W M; Proot, Ireen M; Mesters, Ilse; Kempen, Gertrudis I J M
2014-08-01
To develop and test the Maastricht Personal Autonomy Questionnaire (MPAQ), an instrument measuring the personal autonomy of older adults with a chronic physical illness in accordance with their experience of autonomy. Achievement of personal autonomy is conceptualized as correspondence between the way people's lives are actually arranged and the way people want to arrange their lives. A field test was conducted in three waves (n = 412, n = 125 and n = 244) among a random sample of people older than 59 years with either chronic obstructive pulmonary disease or diabetes mellitus. Construct validity, reproducibility and responsiveness were evaluated. The MPAQ, comprising 16 items, consists of three scales: degree of (personal) autonomy, working on autonomy, and dilemmas. Construct validity was largely supported by confirmatory factor analysis and by correlations between the MPAQ and other instruments. Intraclass correlation coefficients ranged from 0.61 to 0.80 and smallest real differences (SRDgroup) from 0.10 to 0.13. Mean change (0.54) was larger than the SRDgroup (0.11) in patients who had deteriorated, but smaller (0.07) in patients who had improved. The MPAQ has good content and construct validity and moderate reproducibility. Responsiveness is weak, although better for deterioration than for improvement.
Macintyre, Lisa
2011-11-01
Accurate measurement of the pressure delivered by medical compression products is highly desirable both in monitoring treatment and in developing new pressure inducing garments or products. There are several complications in measuring pressure at the garment/body interface and at present no ideal pressure measurement tool exists for this purpose. This paper summarises a thorough evaluation of the accuracy and reproducibility of measurements taken following both of Tekscan Inc.'s recommended calibration procedures for I-scan sensors; and presents an improved method for calibrating and using I-scan pressure sensors. The proposed calibration method enables accurate (±2.1 mmHg) measurement of pressures delivered by pressure garments to body parts with a circumference ≥30 cm. This method is too cumbersome for routine clinical use but is very useful, accurate and reproducible for product development or clinical evaluation purposes. Copyright © 2011 Elsevier Ltd and ISBI. All rights reserved.
2016-01-01
The Cancer Target Discovery and Development (CTD2) Network was established to accelerate the transformation of “Big Data” into novel pharmacological targets, lead compounds, and biomarkers for rapid translation into improved patient outcomes. It rapidly became clear in this collaborative network that a key central issue was to define what constitutes sufficient computational or experimental evidence to support a biologically or clinically relevant finding. This manuscript represents a first attempt to delineate the challenges of supporting and confirming discoveries arising from the systematic analysis of large-scale data resources in a collaborative work environment and to provide a framework that would begin a community discussion to resolve these challenges. The Network implemented a multi-Tier framework designed to substantiate the biological and biomedical relevance as well as the reproducibility of data and insights resulting from its collaborative activities. The same approach can be used by the broad scientific community to drive development of novel therapeutic and biomarker strategies for cancer. PMID:27401613
Reproducibility of HTS-SQUID magnetocardiography in an unshielded clinical environment.
Leder, U; Schrey, F; Haueisen, J; Dörrer, L; Schreiber, J; Liehr, M; Schwarz, G; Solbrig, O; Figulla, H R; Seidel, P
2001-07-01
A new technology has been developed that measures the magnetic field of the human heart (the magnetocardiogram, MCG) using high-temperature superconducting (HTS) sensors. These sensors can be operated at the temperature of liquid nitrogen without electromagnetic shielding. We tested the reproducibility of HTS-MCG measurements in healthy volunteers. Unshielded HTS-MCG measurements were performed in 18 healthy volunteers in the left precordial position in two separate sessions in a clinical environment. Heart cycles from 10-min recordings were averaged and smoothed, the baselines were adjusted, and the data were standardized to the respective areas under the curves (AUC) of the absolute values of the QRST amplitudes. The QRS complexes and the ST-T intervals were used to assess the reproducibility of the two measurements. Ratios (R_QRS, R_STT) were calculated by dividing the AUC of the first measurement by that of the second. The linear correlation coefficients (CORR_QRS, CORR_STT) of the corresponding time intervals of the two measurements were also calculated. The HTS-MCG signal was completely concealed by the high noise level in the raw data; the averaging and smoothing algorithms unmasked the QRS complex and the ST segment. High reproducibility was found for the QRS complex (R_QRS = 1.2 ± 0.3, CORR_QRS = 0.96 ± 0.06). Similar to the ECG, it was characterized by three bends, the Q, R and S waves. In the ST-T interval, reproducibility was considerably lower (R_STT = 0.9 ± 0.2, CORR_STT = 0.66 ± 0.28). In contrast to the ECG, a baseline deflection after the T wave, possibly reflecting U-wave activity, was found in a number of volunteers. HTS-MCG devices can thus be operated in a clinical environment without shielding. Whereas reproducibility was high for the depolarization interval, it was considerably lower for the ST segment and the T wave. Therefore, before clinically applying HTS-MCG systems to the detection of repolarization abnormalities in acute coronary syndromes, further technical development of the systems is necessary to improve the signal-to-noise ratio.
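The AUC standardization and the session-to-session ratio/correlation measures described above can be sketched in a few lines. This is a minimal illustration with synthetic signals: the Gaussian "beat", the sample rate and the QRS window indices are all invented, not taken from the study.

```python
import numpy as np

def standardize(beat):
    """Scale an averaged beat so the discrete area under |beat| equals 1."""
    return beat / np.abs(beat).sum()

def interval_reproducibility(beat1, beat2, window):
    """AUC ratio and linear correlation of two sessions over a slice."""
    r = np.abs(beat1[window]).sum() / np.abs(beat2[window]).sum()
    corr = np.corrcoef(beat1[window], beat2[window])[0, 1]
    return r, corr

# Synthetic averaged heart cycles for two sessions (arbitrary units);
# session 2 is 5% larger in amplitude before standardization.
t = np.linspace(0.0, 1.0, 500)
s1 = standardize(np.exp(-((t - 0.3) / 0.02) ** 2))          # QRS-like peak
s2 = standardize(1.05 * np.exp(-((t - 0.3) / 0.02) ** 2))   # repeat session
qrs = slice(120, 180)  # assumed QRS window, in samples
r_qrs, corr_qrs = interval_reproducibility(s1, s2, qrs)
print(f"R_QRS = {r_qrs:.2f}, CORR_QRS = {corr_qrs:.2f}")
```

Because both sessions are standardized to unit AUC first, the pure amplitude difference cancels and the ratio returns to 1, which is the point of the standardization step.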
Reproducibility in science: improving the standard for basic and preclinical research.
Begley, C Glenn; Ioannidis, John P A
2015-01-02
Medical and scientific advances are predicated on new knowledge that is robust and reliable and that serves as a solid foundation on which further advances can be built. In biomedical research, we are in the midst of a revolution, with new data and scientific publications generated at an unprecedented rate. Unfortunately, there is compelling evidence that the majority of these discoveries will not stand the test of time. To a large extent, this reproducibility crisis in basic and preclinical research may be a result of failure to adhere to good scientific practice and the desperation to publish or perish. This is a multifaceted, multistakeholder problem. No single party is solely responsible, and no single solution will suffice. Here we review the reproducibility problems in basic and preclinical biomedical research, highlight some of the complexities, and discuss potential solutions that may help improve research quality and reproducibility. © 2015 American Heart Association, Inc.
Development of delineator testing standard.
DOT National Transportation Integrated Search
2015-02-01
The objective of this project was to develop a new test method for evaluating the impact performance : of delineators for given applications. The researchers focused on developing a test method that was : reproducible and attempted to reproduce failu...
Ducret, Maxime; Fabre, Hugo; Degoul, Olivier; Atzeni, Gianluigi; McGuckin, Colin; Forraz, Nico; Alliot-Licht, Brigitte; Mallein-Gerin, Frédéric; Perrier-Groult, Emeline; Farges, Jean-Christophe
2015-01-01
In recent years, mesenchymal cell-based products have been developed to improve surgical therapies aimed at repairing human tissues. In this context, the tooth has recently emerged as a valuable source of stem/progenitor cells for regenerating orofacial tissues, given the easy access to pulp tissue and the high differentiation potential of dental pulp mesenchymal cells. International guidelines now recommend the use of standardized procedures for cell isolation, storage and expansion in culture to ensure optimal reproducibility, efficacy and safety when cells are used for clinical application. However, most manufacturing procedures for dental pulp cell-based medicinal products may not be fully satisfactory, since they could alter the cells' biological properties and the quality of derived products. Cell isolation, enrichment and cryopreservation procedures, combined with long-term expansion in culture media containing xenogeneic and allogeneic components, are known to affect cell phenotype, viability, proliferation and differentiation capacity. This article focuses on current manufacturing strategies of dental pulp cell-based medicinal products and proposes a new protocol to improve the efficiency, reproducibility and safety of these strategies. PMID:26300779
Atomic torsional modal analysis for high-resolution proteins.
Tirion, Monique M; ben-Avraham, Daniel
2015-03-01
We introduce a formulation for normal mode analyses of globular proteins that significantly improves on an earlier one-parameter formulation [M. M. Tirion, Phys. Rev. Lett. 77, 1905 (1996)] that characterized the slow modes associated with Protein Data Bank structures. Here we develop an empirical potential function, minimized at the outset, that includes two features essential to reproduce the eigenspectra and associated density of states in the 0 to 300 cm⁻¹ frequency range, not merely the slow modes. First, the introduction of preferred dihedral-angle configurations via torsional stiffness constants eliminates anomalous dispersion characteristics due to insufficiently bound surface side chains and helps fix the thin-tail frequencies of the spectrum (100-300 cm⁻¹). Second, we take into account the atomic identities and separation distances of all pairwise interactions, improving the spectral distribution in the 20 to 300 cm⁻¹ range. With these modifications, not only does the spectrum reproduce that of full atomic potentials, but we obtain stable, reliable eigenmodes for the slow modes and over a wide range of frequencies.
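The one-parameter baseline this work improves on (identical springs between all atom pairs within a cutoff, modes from the Hessian eigenproblem) can be sketched as below. The coordinates, cutoff and spring constant are arbitrary toy values, and this sketches the earlier elastic-network idea, not the new torsional formulation.

```python
import numpy as np

def enm_hessian(coords, cutoff=8.0, k=1.0):
    """Hessian of a one-parameter elastic network model (Tirion-1996 style):
    identical Hookean springs between all atom pairs closer than `cutoff`."""
    n = len(coords)
    h = np.zeros((3 * n, 3 * n))
    for i in range(n):
        for j in range(i + 1, n):
            d = coords[j] - coords[i]
            r = np.linalg.norm(d)
            if r < cutoff:
                u = d / r                     # unit vector along the spring
                block = k * np.outer(u, u)    # 3x3 super-element
                for (a, b), sign in (((i, i), 1), ((j, j), 1),
                                     ((i, j), -1), ((j, i), -1)):
                    h[3*a:3*a+3, 3*b:3*b+3] += sign * block
    return h

# Toy 4-atom "molecule" (a tetrahedron); real use would read PDB coordinates
coords = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                   [0.5, 0.9, 0.0], [0.5, 0.3, 0.8]])
eigvals = np.linalg.eigvalsh(enm_hessian(coords, cutoff=2.0))
# 6 zero eigenvalues (rigid translations/rotations); the rest are mode stiffnesses
print(np.round(eigvals, 3))
```

Eigenvalues of the Hessian give the squared mode frequencies (up to mass weighting), which is the spectrum whose 0-300 cm⁻¹ shape the paper refines.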
Improved ultrasonic standard reference blocks
NASA Technical Reports Server (NTRS)
Eitzen, D. G.
1975-01-01
A program to improve the quality, reproducibility and reliability of nondestructive testing through the development of improved ASTM-type ultrasonic reference standards is described. Reference blocks of aluminum, steel, and titanium alloys were considered. Equipment representing the state-of-the-art in laboratory and field ultrasonic equipment was obtained and evaluated. Some RF and spectral data on ten sets of ultrasonic reference blocks were taken as part of a task to quantify the variability in response from nominally identical blocks. Techniques for residual stress, preferred orientation, and microstructural measurements were refined and are applied to a reference block rejected by the manufacturer during fabrication in order to evaluate the effect of metallurgical condition on block response.
Exploratory Development of Corrosion Inhibiting Primers
1977-05-01
far superior in reproducibility and uniformity. The developed C-5301 electroprimer is readily adaptable to automated processing methods and can provide uniform, reproducible films which are cost effective.
Research Reproducibility in Geosciences: Current Landscape, Practices and Perspectives
NASA Astrophysics Data System (ADS)
Yan, An
2016-04-01
Reproducibility of research can gauge the validity of its findings. Yet we currently lack an understanding of how serious a problem research reproducibility is in the geosciences. We developed an online survey of faculty and graduate students in the geosciences and received 136 responses from research institutions and universities in the Americas, Asia, Europe and other parts of the world. The survey examined (1) the current state of research reproducibility in the geosciences, by asking about researchers' experiences with unsuccessful replication work and the obstacles that led to their replication failures; (2) current reproducibility practices in the community, by asking what efforts researchers made to reproduce others' work and to make their own work reproducible, and what underlying factors contribute to irreproducibility; and (3) perspectives on reproducibility, by collecting researchers' thoughts and opinions on the issue. The survey results indicated that nearly 80% of respondents who had ever tried to reproduce a published study had failed at least once. Only one third of respondents received helpful feedback when they contacted the authors of a published study for data, code or other information. The primary factors leading to unsuccessful replication attempts were insufficiently detailed instructions in the published literature and inaccessibility of the data, code and tools needed in the study. Our findings suggest a remarkable lack of research reproducibility in the geosciences. Changing the incentive mechanisms in academia, as well as developing policies and tools that facilitate open data and code sharing, are promising ways for the geosciences community to alleviate this reproducibility problem.
Assessing Cognitive Performance in Badminton Players: A Reproducibility and Validity Study
van de Water, Tanja; Huijgen, Barbara; Faber, Irene; Elferink-Gemser, Marije
2017-01-01
Fast reaction and good inhibitory control are associated with elite sports performance. To evaluate the reproducibility and validity of a newly developed Badminton Reaction Inhibition Test (BRIT), fifteen elite (25 ± 4 years) and nine non-elite (24 ± 4 years) Dutch male badminton players participated in the study. The BRIT measured four components: domain-general reaction time, badminton-specific reaction time, domain-general inhibitory control and badminton-specific inhibitory control. Five participants were retested within three weeks on the badminton-specific components. Reproducibility was acceptable for badminton-specific reaction time (ICC = 0.626, CV = 6%) but not for badminton-specific inhibitory control (ICC = 0.317, CV = 13%). Good construct validity was shown for badminton-specific reaction time, discriminating between elite and non-elite players (F = 6.650, p < 0.05). Elite players did not outscore non-elite players on domain-general reaction time or on either component of inhibitory control (p > 0.05). Concurrent validity for domain-general reaction time was good, as it was associated with national ranking for both elite (ρ = 0.70, p < 0.01) and non-elite (ρ = 0.70, p < 0.05) players. No relationship was found between national ranking and badminton-specific reaction time, or either component of inhibitory control (p > 0.05). In conclusion, the reproducibility and validity of the inhibitory control assessment were not confirmed; however, the BRIT appears a reproducible and valid measure of reaction time in badminton players. Reaction time measured with the BRIT may provide input for training programs aiming to improve badminton players' performance. PMID:28210347
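The ICC and CV reported above are standard test-retest statistics. A minimal sketch of one common variant, ICC(3,1) (two-way mixed, consistency, single measure) plus a within-subject CV, on invented subjects-by-sessions data:

```python
import numpy as np

def icc_3_1(data):
    """ICC(3,1) for a subjects x sessions array (two-way mixed, consistency)."""
    x = np.asarray(data, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ss_total = ((x - grand) ** 2).sum()
    ss_subj = k * ((x.mean(axis=1) - grand) ** 2).sum()
    ss_sess = n * ((x.mean(axis=0) - grand) ** 2).sum()
    ms_subj = ss_subj / (n - 1)
    ms_err = (ss_total - ss_subj - ss_sess) / ((n - 1) * (k - 1))
    return (ms_subj - ms_err) / (ms_subj + (k - 1) * ms_err)

def cv_percent(data):
    """Within-subject coefficient of variation, as a percentage."""
    x = np.asarray(data, dtype=float)
    return 100.0 * np.sqrt(x.var(axis=1, ddof=1).mean()) / x.mean()

# Hypothetical reaction times (ms) for 5 players, test vs. retest
scores = [[412, 420], [388, 395], [451, 440], [402, 410], [430, 433]]
print(f"ICC(3,1) = {icc_3_1(scores):.3f}, CV = {cv_percent(scores):.1f}%")
```

Note the paper does not state which ICC form it used; ICC(3,1) is shown here only as a plausible, commonly reported choice.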
Microbiologic tests in epidemiologic studies: are they reproducible?
Aass, A M; Preus, H R; Zambon, J J; Gjermo, P
1994-12-01
Microbiologic assessments are often included in longitudinal studies to elucidate the significance of the association between certain Gram-negative bacteria and the development of periodontal diseases. In such studies, the reliability of methods is crucial. There are several methods to identify putative pathogens, some of them commercially available. The purpose of the present study was to compare the reproducibility of four different methods for detecting Actinobacillus actinomycetemcomitans, Porphyromonas gingivalis, and Prevotella intermedia, in order to evaluate their usefulness in epidemiologic studies. The test panel consisted of 10 young subjects and 10 adult periodontitis patients. Subgingival plaque was sampled from sites showing bone loss and from "healthy" control sites. The four methods for detecting the target bacteria were 1) cultivation, 2) Evalusite (a chair-side kit based on ELISA), 3) a DNA-probe assay (OmniGene, Inc.), and 4) indirect immunofluorescence (IIF). The test procedure was repeated after a 1-wk interval and was performed by one examiner. Sites reported positive for a microorganism by any of the four methods at one or both examinations were considered positive for that organism and included in the analysis. The reproducibility of the four methods was low. The IIF and cultivation methods showed somewhat higher reproducibility than the commercial systems. A second test was done for Evalusite, using three paper points for sampling instead of the one described in the manual. The reproducibility of the second test was improved, indicating that the detection level of the system may influence its reliability.
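Test-retest agreement for a binary detection method like those above is often quantified with Cohen's kappa, which corrects raw agreement for chance. A minimal sketch with invented detection results (site data and counts are hypothetical):

```python
def cohens_kappa(run1, run2):
    """Chance-corrected agreement between two binary test runs
    (1 = organism detected, 0 = not detected)."""
    n = len(run1)
    po = sum(a == b for a, b in zip(run1, run2)) / n      # observed agreement
    p1a, p1b = sum(run1) / n, sum(run2) / n               # positive rates
    pe = p1a * p1b + (1 - p1a) * (1 - p1b)                # chance agreement
    return 1.0 if pe == 1.0 else (po - pe) / (1 - pe)

# Hypothetical repeated detections of A. actinomycetemcomitans at 8 sites
run1 = [1, 1, 1, 0, 0, 0, 1, 0]
run2 = [1, 1, 0, 0, 0, 1, 1, 0]
print(f"kappa = {cohens_kappa(run1, run2):.2f}")  # prints "kappa = 0.50"
```

A kappa near 0 means the 1-week repeat agreed no better than chance, the situation the study reports for the commercial kits.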
Accuracy of femoral templating in reproducing anatomical femoral offset in total hip replacement.
Davies, H; Foote, J; Spencer, R F
2007-01-01
Restoration of hip biomechanics is a crucial component of successful total hip replacement. Preoperative templating is recommended to ensure that the size and orientation of implants is optimised. We studied how closely natural femoral offset could be reproduced using the manufacturers' templates for 10 femoral stems in common use in the UK. A series of 23 consecutive preoperative radiographs from patients who had undergone unilateral total hip replacement for unilateral osteoarthritis of the hip was employed. The change in offset between the templated position of the best-fitting template and the anatomical centre of the hip was measured. The templates were then ranked according to their ability to reproduce the normal anatomical offset. The most accurate was the CPS-Plus (Root Mean Square Error 2.0 mm) followed in rank order by: C stem (2.16), CPT (2.40), Exeter (3.23), Stanmore (3.28), Charnley (3.65), Corail (3.72), ABG II (4.30), Furlong HAC (5.08) and Furlong modular (7.14). A similar pattern of results was achieved when the standard error of variability of offset was analysed. We observed a wide variation in the ability of the femoral prosthesis templates to reproduce normal femoral offset. This variation was independent of the seniority of the observer. The templates of modern polished tapered stems with high modularity were best able to reproduce femoral offset. The current move towards digitisation of X-rays may offer manufacturers an opportunity to improve template designs in certain instances, and to develop appropriate computer software.
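The root-mean-square-error ranking used above is straightforward to reproduce. A minimal sketch on invented offset errors (the stem labels and numbers below are hypothetical, not the study's data):

```python
import math

def rmse(errors_mm):
    """Root mean square of offset errors (templated minus anatomical, mm)."""
    return math.sqrt(sum(e * e for e in errors_mm) / len(errors_mm))

# Hypothetical offset errors (mm) for two template designs over a radiograph series
stem_a = [1.0, -2.0, 0.5, 3.0, -1.5]
stem_b = [4.0, -5.0, 6.0, -3.5, 2.0]
ranking = sorted({"stem A": rmse(stem_a), "stem B": rmse(stem_b)}.items(),
                 key=lambda kv: kv[1])
print(ranking)  # lower RMSE = closer reproduction of femoral offset
```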
An Algorithm Using Twelve Properties of Antibiotics to Find the Recommended Antibiotics, as in CPGs
Tsopra, R.; Venot, A.; Duclos, C.
2014-01-01
Background: Clinical Decision Support Systems (CDSS) incorporating justifications, updating and adjustable recommendations can considerably improve the quality of healthcare. We propose a new approach to the design of CDSS for empiric antibiotic prescription, based on implementing the deeper medical reasoning used by experts in the development of clinical practice guidelines (CPGs) to deduce the recommended antibiotics. Methods: We investigated two methods ("exclusion" versus "scoring") for reproducing this reasoning based on antibiotic properties. Results: The "exclusion" method reproduced expert reasoning more accurately, retrieving the full list of recommended antibiotics for almost all clinical situations. Discussion: This approach has several advantages: (i) it provides convincing explanations for physicians; (ii) updating can easily be incorporated into the CDSS; (iii) it can provide recommendations for clinical situations missing from CPGs. PMID:25954422
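The "exclusion" strategy described above amounts to starting from all candidate antibiotics and discarding any that fail a property required by the clinical situation. A toy sketch; the property names, drug data and clinical criteria here are entirely hypothetical, not the twelve properties of the actual system:

```python
# Hypothetical antibiotic property table (illustrative only)
ANTIBIOTICS = {
    "amoxicillin":   {"covers_pathogen": True,  "child_safe": True,  "narrow_spectrum": True},
    "ciprofloxacin": {"covers_pathogen": True,  "child_safe": False, "narrow_spectrum": False},
    "cefixime":      {"covers_pathogen": True,  "child_safe": True,  "narrow_spectrum": False},
    "clindamycin":   {"covers_pathogen": False, "child_safe": True,  "narrow_spectrum": True},
}

def recommend(required):
    """Exclusion method: keep only antibiotics satisfying every required property."""
    return sorted(name for name, props in ANTIBIOTICS.items()
                  if all(props[p] for p in required))

print(recommend(["covers_pathogen", "child_safe"]))  # ['amoxicillin', 'cefixime']
```

Each excluded drug carries the property that eliminated it, which is what makes this style of reasoning easy to explain to physicians and easy to update when a property table changes.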
NASA Technical Reports Server (NTRS)
Hemsch, Michael J.
1996-01-01
As part of a continuing effort to re-engineer the wind tunnel testing process, a comprehensive data quality assurance program is being established at NASA Langley Research Center (LaRC). The ultimate goal of the program is the routine provision of tunnel-to-tunnel reproducibility with total uncertainty levels acceptable for the test and evaluation of civilian transports. The operational elements for reaching such levels of reproducibility are: (1) statistical control, which provides long-term measurement-uncertainty predictability and a base for continuous improvement; (2) measurement uncertainty prediction, which provides test designs that can meet data quality expectations given the system's predictable variation; and (3) national standards, which provide a means for resolving tunnel-to-tunnel differences. The paper presents the LaRC design for the program and discusses the process of implementation.
An improved protocol for harvesting Bacillus subtilis colony biofilms.
Fuchs, Felix Matthias; Driks, Adam; Setlow, Peter; Moeller, Ralf
2017-03-01
Bacterial biofilms cause severe problems in medicine and industry due to the high resistance to disinfectants and environmental stress of organisms within biofilms. Addressing challenges caused by biofilms requires full understanding of the underlying mechanisms for bacterial resistance and survival in biofilms. However, such work is hampered by a relative lack of systems for biofilm cultivation that are practical and reproducible. To address this problem, we developed a readily applicable method to culture Bacillus subtilis biofilms on a membrane filter. The method results in biofilms with highly reproducible characteristics, and which can be readily analyzed by a variety of methods with little further manipulation. This biofilm preparation method simplifies routine generation of B. subtilis biofilms for molecular and cellular analysis, and could be applicable to other microbial systems. Copyright © 2017 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sherlock, M.; Brodrick, J. P.; Ridgers, C. P.
Here, we compare the reduced non-local electron transport model developed to Vlasov-Fokker-Planck simulations. Two new test cases are considered: the propagation of a heat wave through a high-density region into a lower-density gas, and a one-dimensional hohlraum ablation problem. We find that the reduced model reproduces the peak heat flux well in the ablation region but significantly over-predicts the coronal preheat. The suitability of the reduced model for computing non-local transport effects other than thermal conductivity is assessed by comparing the computed distribution function to the Vlasov-Fokker-Planck distribution function. It is shown that even when the reduced model reproduces the correct heat flux, its distribution function differs significantly from the Vlasov-Fokker-Planck prediction. Two simple modifications are considered which improve agreement between the models in the coronal region.
Improving the Asynchronous Video Learning Model
ERIC Educational Resources Information Center
Griffiths, Michael E.
2010-01-01
Online education is popular from a consumer perspective, but there are elements of face-to-face instruction and assessment that are difficult to reproduce online (Bassoppo-Moyo 2006). The difficulty of reproducing valued elements of a face-to-face setting leads to concerns regarding the overall quality of the online learning experience.…
Van Deun, Jan; Hendrix, An
2017-01-01
The EV-TRACK knowledgebase is developed to cope with the need for transparency and rigour to increase reproducibility and facilitate standardization of extracellular vesicle (EV) research. The knowledgebase includes a checklist for authors and editors intended to improve the transparency of methodological aspects of EV experiments, allows queries and meta-analysis of EV experiments and keeps track of the current state of the art. Widespread implementation by the EV research community is key to its success.
Investigation and development of a production operation for rapid carbonitriding of automobile parts
NASA Astrophysics Data System (ADS)
Bodyako, B. M.; Shipko, A. A.; Gurchenko, P. S.
1986-08-01
The use of carbonitriding with induction heating in the vapors of liquid media makes it possible to eliminate local overheating of the treated surface, to improve the reproducibility of the process, to solve the problem of rational delivery and removal of reaction products from the impregnated surface, to regulate the case parameters accurately, and to increase the economy of treatment and raise production standards.
Reproducibility of ECG-gated ultrasound diameter assessment of small abdominal aortic aneurysms.
Bredahl, K; Eldrup, N; Meyer, C; Eiberg, J E; Sillesen, H
2013-03-01
No standardised ultrasound procedure to obtain reliable growth estimates for abdominal aortic aneurysms (AAA) is currently available. We investigated the feasibility and reproducibility of a novel approach controlling for a combination of vessel wall delineation and cardiac cycle variation. Prospective comparative study. Consecutive patients (N = 27) with an AAA, attending their 6-month control as part of a medical treatment trial, were scanned twice by two ultrasound operators. Then, all ultrasound recordings were transferred to a core facility and analysed by a third person. The AAA diameter was determined in four different ways: from the leading edge of adventitia on the anterior wall to either the leading edge of the adventitia (method A) or leading edge of the intima (method B) on the posterior wall, with both measurements performed in systole and diastole. Inter-operator reproducibility was ± 3 mm for all methods applied. There was no difference in outcome between methods A and B; likewise, end-diastolic measurement did not improve reproducibility in preference to peak-systolic measurement. The use of a standardised ultrasound protocol including ECG-gating and subsequent off-line reading with minute calliper placement reduces variability. This may be of use in developing protocols to better detect even small AAA growth rates during clinical trials. Copyright © 2012 European Society for Vascular Surgery. Published by Elsevier Ltd. All rights reserved.
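Inter-operator reproducibility figures like the ±3 mm above are conventionally derived from a Bland-Altman analysis of paired measurements. A minimal sketch with invented diameters (the operator data below are hypothetical):

```python
import numpy as np

def limits_of_agreement(op1, op2):
    """Bland-Altman bias and 95% limits of agreement for paired
    diameter measurements by two operators (mm)."""
    d = np.asarray(op1, dtype=float) - np.asarray(op2, dtype=float)
    bias = d.mean()
    half_width = 1.96 * d.std(ddof=1)
    return bias, (bias - half_width, bias + half_width)

# Hypothetical AAA diameters (mm) measured by two ultrasound operators
op1 = [41.0, 44.5, 39.0, 47.0, 43.5]
op2 = [42.0, 44.0, 40.0, 46.0, 44.0]
bias, (lo, hi) = limits_of_agreement(op1, op2)
print(f"bias = {bias:.2f} mm, 95% LoA = ({lo:.2f}, {hi:.2f}) mm")
```

Narrower limits of agreement, such as those achieved with ECG gating and a standardised protocol, make small AAA growth rates easier to distinguish from measurement noise.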
NASA Astrophysics Data System (ADS)
Yu, Shanshan; Murakami, Yuri; Obi, Takashi; Yamaguchi, Masahiro; Ohyama, Nagaaki
2006-09-01
The article proposes a multispectral image compression scheme using nonlinear spectral transform for better colorimetric and spectral reproducibility. In the method, we show the reduction of colorimetric error under a defined viewing illuminant and also that spectral accuracy can be improved simultaneously using a nonlinear spectral transform called Labplus, which takes into account the nonlinearity of human color vision. Moreover, we show that the addition of diagonal matrices to Labplus can further preserve the spectral accuracy and has a generalized effect of improving the colorimetric accuracy under other viewing illuminants than the defined one. Finally, we discuss the usage of the first-order Markov model to form the analysis vectors for the higher order channels in Labplus to reduce the computational complexity. We implement a multispectral image compression system that integrates Labplus with JPEG2000 for high colorimetric and spectral reproducibility. Experimental results for a 16-band multispectral image show the effectiveness of the proposed scheme.
Scientific Utopia: An agenda for improving scientific communication (Invited)
NASA Astrophysics Data System (ADS)
Nosek, B.
2013-12-01
The scientist's primary incentive is publication. In the present culture, open practices do not increase chances of publication, and they often require additional work. Practicing the abstract scientific values of openness and reproducibility thus requires behaviors in addition to those relevant for the primary, concrete rewards. When in conflict, concrete rewards are likely to dominate over abstract ones. As a consequence, the reward structure for scientists does not encourage openness and reproducibility. This can be changed by nudging incentives to align scientific practices with scientific values. Science will benefit by creating and connecting technologies that nudge incentives while supporting and improving the scientific workflow. For example, it should be as easy to search the research literature for my topic as it is to search the Internet to find hilarious videos of cats falling off of furniture. I will introduce the Center for Open Science (http://centerforopenscience.org/) and efforts to improve openness and reproducibility such as http://openscienceframework.org/. There will be no cats.
Improving and Accelerating Drug Development for Nervous System Disorders
Pankevich, Diana E.; Altevogt, Bruce M.; Dunlop, John; Gage, Fred H.; Hyman, Steve E.
2014-01-01
Advances in the neurosciences have placed the field in the position where it is poised to significantly reduce the burden of nervous system disorders. However, drug discovery, development and translation for nervous system disorders still pose many unique challenges. The key scientific challenges can be summarized as follows: mechanisms of disease, target identification and validation, predictive models, biomarkers for patient stratification and as endpoints for clinical trials, clear regulatory pathways, reliability and reproducibility of published data, and data sharing and collaboration. To accelerate nervous system drug development the Institute of Medicine’s Forum on Neuroscience and Nervous System Disorders has hosted a series of public workshops that brought together representatives of industry, government (including both research funding and regulatory agencies), academia, and patient groups to discuss these challenges and offer potential strategies to improve translational neuroscience. PMID:25442933
Olorisade, Babatunde Kazeem; Brereton, Pearl; Andras, Peter
2017-09-01
Independent validation of published scientific results through study replication is a pre-condition for accepting the validity of such results. In computational research, full replication is often unrealistic for independent results validation; therefore, study reproduction has been justified as the minimum acceptable standard to evaluate the validity of scientific claims. The application of text mining techniques to citation screening in the context of systematic literature reviews is a relatively young and growing computational field with high relevance for software engineering, medical research and other fields. However, there is little work so far on reproduction studies in the field. In this paper, we investigate the reproducibility of studies in this area based on information contained in published articles and we propose reporting guidelines that could improve reproducibility. The study was approached in two ways. Initially we attempted to reproduce results from six studies, which were based on the same raw dataset. Then, based on this experience, we identified steps considered essential to successful reproduction of text mining experiments and characterized them to measure how reproducible a study is, given the information provided on these steps. Thirty-three articles were systematically assessed for reproducibility using this approach. Our work revealed that it is currently difficult if not impossible to independently reproduce the results published in any of the studies investigated. The lack of information about the datasets used limits reproducibility of about 80% of the studies assessed. Also, information about the machine learning algorithms is inadequate in about 27% of the papers. On the plus side, the third party software tools used are mostly free and available.
The reproducibility potential of most of the studies can be significantly improved if more attention is paid to information provided on the datasets used, how they were partitioned and utilized, and how any randomization was controlled. We introduce a checklist of information that needs to be provided in order to ensure that a published study can be reproduced. Copyright © 2017 Elsevier Inc. All rights reserved.
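The checklist-based assessment described above can be sketched as a simple scoring function. The item names below are hypothetical stand-ins for the reporting steps the study identifies (dataset description, partitioning, randomization control, algorithm details, software tools), not the authors' actual checklist.

```python
# Hypothetical reporting-checklist items; the paper's checklist covers
# datasets, partitioning, randomization, algorithms and tools, but these
# exact field names are illustrative stand-ins.
CHECKLIST = (
    "dataset_source",
    "dataset_partitioning",
    "randomization_control",
    "ml_algorithm",
    "algorithm_parameters",
    "software_tools",
)

def reproducibility_score(report):
    """Fraction of checklist items for which a study provides information.

    `report` maps a checklist item name to True when the published
    article documents that step.
    """
    provided = sum(bool(report.get(item, False)) for item in CHECKLIST)
    return provided / len(CHECKLIST)
```

Under this sketch, a study documenting only its dataset source, algorithm, and software tools would score 0.5, making the gaps that block reproduction explicit and comparable across papers.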
Chen, Hsiao-Ping; Yeh, Chun-Yi; Hung, Pei-Chin; Wang, Shau-Chun
2014-02-01
In this study, induced electroosmotic vortex flows were generated using an AC electric field applied by one pair of external electrodes to rapidly mix luminescence reagents in a 30 μL micromixer and enhance the reproducibility of chemiluminescence (CL) assays. A solution containing the catalyst reagent ferricyanide ions (4 μL) was pipetted into a reservoir containing luminol to produce CL in the presence of hydrogen peroxide. When the added ferricyanide aliquot contacted the reservoir solution, the CL began flashing, but rapidly diminished as the ferricyanide was consumed. In such a short illumination period, the solutes could not mix homogeneously. Therefore, the reproducibility of CL intensities collected using a CCD and multiple aliquot additions was determined to be inadequate. By contrast, when the solutes were efficiently mixed after adding a ferricyanide aliquot to a micromixer, the intensity reproducibility was significantly improved. When the CL temporal profile was analyzed using a PMT, a consistent improvement in reproducibility was observed for both the CL intensity and the estimated CL reaction rate. Replicating the proposed device would yield a multiple-well plate containing a micromixer in each reservoir; this system is compatible with conventional CL instrumentation and requires no CL enhancer to slow the reaction. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Dysplastic naevus: histological criteria and their inter-observer reproducibility.
Hastrup, N; Clemmensen, O J; Spaun, E; Søndergaard, K
1994-06-01
Forty melanocytic lesions were examined in a pilot study, which was followed by a final series of 100 consecutive melanocytic lesions, in order to evaluate the inter-observer reproducibility of the histological criteria proposed for the dysplastic naevus. The specimens were examined in a blind fashion by four observers. Analysis by kappa statistics showed poor reproducibility of nuclear features, while reproducibility of architectural features was acceptable, improving in the final series. Consequently, we cannot apply the combined criteria of cytological and architectural features with any confidence in the diagnosis of dysplastic naevus, and, until further studies have documented that architectural criteria alone will suffice in the diagnosis of dysplastic naevus, we, as pathologists, shall avoid this term.
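The inter-observer agreement analysis above relies on kappa statistics. As a minimal sketch (not the authors' code), Cohen's kappa for a pair of observers can be computed as follows:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement
    and p_e is the agreement expected by chance from each rater's
    marginal category frequencies.
    """
    if len(rater_a) != len(rater_b):
        raise ValueError("both raters must score the same items")
    n = len(rater_a)
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    p_expected = sum(count_a[c] * count_b[c] for c in categories) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)
```

Kappa is 1 for perfect agreement and near 0 when agreement is no better than chance, which is the sense in which reproducibility of the nuclear features was judged "poor".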
Plasma-sprayed self-lubricating coatings
NASA Technical Reports Server (NTRS)
Nakamura, H. H.; Logan, W. R.; Harada, Y.
1982-01-01
One of the most important criteria for acceptable commercial application of a multiple-phase composition is uniformity and reproducibility. This means that the performance characteristics of the coating - e.g., its lubricating properties, bond strength to the substrate, and thermal properties - can be readily predicted to give a desired performance. The improvement of the uniformity and reproducibility of the coatings, the oxidation behavior in three temperature ranges, the effect of the bond coat, and the effect of preheat treatment as measured by adhesive strength tests, coating examination procedures, and physical property measurements were studied. The following modifications improved uniformity and reproducibility: (1) changes and closer control in the particle size range of the raw materials used, (2) increasing the binder content from 3.2% to 4.1% (dried weight), and (3) analytical processing procedures using step-by-step checking to assure consistency.
NASA Astrophysics Data System (ADS)
Couvidat, Florian; Bessagnet, Bertrand; Garcia-Vivanco, Marta; Real, Elsa; Menut, Laurent; Colette, Augustin
2018-01-01
A new aerosol module was developed and integrated in the air quality model CHIMERE. Developments include the use of the Model of Emissions and Gases and Aerosols from Nature (MEGAN) 2.1 for biogenic emissions, the implementation of the inorganic thermodynamic model ISORROPIA 2.1, revision of wet deposition processes and of the algorithms of condensation/evaporation and coagulation and the implementation of the secondary organic aerosol (SOA) mechanism H2O and the thermodynamic model SOAP. Concentrations of particles over Europe were simulated by the model for the year 2013. Model concentrations were compared to the European Monitoring and Evaluation Programme (EMEP) observations and other observations available in the EBAS database to evaluate the performance of the model. Performance was determined for several components of particles (sea salt, sulfate, ammonium, nitrate, organic aerosol) with a seasonal and regional analysis of results. The model gives satisfactory performance in general. For sea salt, the model succeeds in reproducing the seasonal evolution of concentrations for western and central Europe. For sulfate, except for an overestimation of sulfate in northern Europe, modeled concentrations are close to observations and the model succeeds in reproducing the seasonal evolution of concentrations. For organic aerosol, the model satisfactorily reproduces concentrations at stations with strong modeled biogenic SOA concentrations. However, the model strongly overestimates ammonium nitrate concentrations during late autumn (possibly due to problems in the temporal evolution of emissions) and strongly underestimates summer organic aerosol concentrations over most of the stations (especially in the northern half of Europe). This underestimation could be due to a lack of anthropogenic SOA or biogenic emissions in northern Europe. A list of recommended tests and developments to improve the model is also given.
Piriou, P; Ouenzerfi, G; Migaud, H; Renault, E; Massi, F; Serrault, M
2016-06-01
Modern ceramic-on-ceramic (CoC) bearings for total hip arthroplasty (THA) have been used in younger patients who expect improved survivorship. However, audible squeaking produced by the implant is an annoying complication. Previous numerical simulations were not able to accurately reproduce in vitro and in vivo observations. Therefore, we developed a finite element model to: (1) reproduce in vitro squeaking and validate the model by comparing it with in vivo recordings, (2) determine why there are differences between in vivo and in vitro squeaking frequencies, (3) identify the stem's role in this squeaking, (4) predict which designs and materials are more likely to produce squeaking. A CoC THA numerical model can be developed that reproduces the squeaking frequencies observed in vivo. Numerical methods (finite element analysis [ANSYS]) and experimental methods (using a non-lubricated simulated hip with a cementless 32mm CoC THA) were developed to reproduce squeaking. Numerical analysis was performed to identify the frequencies that cause vibrations perceived as an acoustic emission. The finite element analysis (FEA) model was enhanced by adjusting periprosthetic bone and soft tissue elements in order to reproduce the squeaking frequencies recorded in vivo. A numerical method (complex eigenvalue analysis) was used to find the acoustic frequencies of the squeaking noise. The frequencies obtained from the model and the hip simulator were compared to those recorded in vivo. The numerical results were validated by experiments with the laboratory hip simulator. The frequencies obtained (mean 2790Hz with FEA, 2755Hz with simulator, decreasing to 1759Hz when bone and soft tissue were included in the FEA) were consistent with those of squeaking hips recorded in vivo (1521Hz). The cup and ceramic insert were the source of the vibration, but had little influence on the diffusion of the noise required to make the squeaking audible to the human ear.
The FEA showed that diffusion of squeaking was due to an unstable vibration of the stem during frictional contact. The FEA predicted a higher rate of squeaking (at a lower coefficient of friction) when TZMF™ alloy is used instead of Ti6Al4V and when an anatomic press-fit stem is used instead of straight self-locking designs. The current FEA model is reliable; it can be used to assess various stem designs and alloys to predict the different rates of squeaking that certain stems will likely produce. Level IV in vitro study. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
Spectroscopic investigation of nitrogen-functionalized carbon materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, Kevin N.; Christensen, Steven T.; Nordlund, Dennis
2016-04-07
Carbon materials are used in a diverse set of applications ranging from pharmaceuticals to catalysis. Nitrogen modification of carbon powders has been shown to be an effective method for enhancing both surface and bulk properties of as-received material for a number of applications. Unfortunately, control of the nitrogen modification process is challenging and can limit the effectiveness and reproducibility of N-doped materials. Additionally, the assignment of functional groups to specific moieties on the surface of nitrogen-modified carbon materials is not straightforward. Herein, we complete an in-depth analysis of functional groups present at the surface of ion-implanted Vulcan and Graphitic Vulcan through the use of X-ray photoelectron spectroscopy (XPS) and near edge X-ray absorption fine structure spectroscopy (NEXAFS). Our results show that regardless of the initial starting materials used, nitrogen ion implantation conditions can be tuned to increase the amount of nitrogen incorporation and to obtain both similar and reproducible final distributions of nitrogen functional groups. The development of a well-controlled, reproducible nitrogen implantation pathway opens the door for carbon-supported catalyst architectures with improved numbers of nucleation sites, decreased particle size, and enhanced catalyst-support interactions.
2006-01-01
representation of RAND intellectual property is provided for non-commercial use only. Permission is required from RAND to reproduce, or reuse in another...Law 97-219) created the Small Business Innovation Research (SBIR) program by mandating that all federal research, development, test, and evaluation (RDT...contractors and the small, technology-oriented business community. 6. Expand intellectual capital in the United States. POLICY OPTIONS FOR THE DOD SBIR PROGRAM
The Cancer Target Discovery and Development (CTD^2) Network was established to accelerate the transformation of "Big Data" into novel pharmacological targets, lead compounds, and biomarkers for rapid translation into improved patient outcomes. It rapidly became clear in this collaborative network that a key central issue was to define what constitutes sufficient computational or experimental evidence to support a biologically or clinically relevant finding.
Van Deun, Jan; Hendrix, An
2017-01-01
The EV-TRACK knowledgebase was developed to cope with the need for transparency and rigour to increase reproducibility and facilitate standardization of extracellular vesicle (EV) research. The knowledgebase includes a checklist for authors and editors intended to improve the transparency of methodological aspects of EV experiments, allows queries and meta-analysis of EV experiments and keeps track of the current state of the art. Widespread implementation by the EV research community is key to its success. PMID:29184624
Training model for control of an internal carotid artery injury during transsphenoidal surgery.
Muto, Jun; Carrau, Ricardo L; Oyama, Kenichi; Otto, Brad A; Prevedello, Daniel M
2017-01-01
As the adoption of endoscopic endonasal approaches (EEA) continues to proliferate, increasing numbers of internal carotid artery (ICA) injuries are reported. The objective of this study was to develop a synthetic ICA injury-training model that could mimic this clinical scenario and be portable, repeatable, reproducible, and without risk of biological contamination. Based on computed tomography of a human head, we constructed a synthetic model using selective laser sintering with polyamide nylon and glass beads. Subsequently, the model was connected to a pulsatile pump using 6-mm silicon tubing. The pump maintains a pulsatile flow of an artificial blood-like fluid at a variable pressure to simulate heart beats. Volunteer surgeons with different levels of training and experience were provided simulation training sessions with the models. Pre- and posttraining questionnaires were completed by each of the participants. Pre- and posttraining questionnaires suggest that repeated simulation sessions improve the surgical skills and self-confidence of trainees. This ICA injury model is portable; reproducible; and avoids ethical, biohazard, religious, and legal problems associated with cadaveric models. A synthetic ICA injury model for EEA allows recurring training that may improve the surgeon's ability to maintain endoscopic visualization, control catastrophic bleeding, decrease psychomotor stress, and develop effective team strategies to achieve hemostasis. Laryngoscope, 127:38-43, 2017. © 2016 The American Laryngological, Rhinological and Otological Society, Inc.
Xu, Tingting; Chi, Bo; Gao, Jian; Chu, Meilin; Fan, Wenlu; Yi, Meihui; Xu, Hong; Mao, Chun
2017-07-18
A simple and accurate immunosensor for quantitative detection of α-fetoprotein (AFP) was developed based on the immobilization of antigen on the surface of Hep-PGA-PPy nanoparticle-modified glassy carbon electrodes (GCE). The obtained Hep-PGA-PPy nanoparticles were characterized by Fourier transform infrared (FT-IR) spectroscopy and transmission electron microscopy (TEM). The blood compatibility of the Hep-PGA-PPy nanoparticles was investigated by in vitro coagulation tests, hemolysis assays and whole blood adhesion tests. Combining the conductive property of polypyrrole (PPy) and the biocompatibility of heparin (Hep), the Hep-PGA-PPy nanoparticles improved not only the anti-biofouling behavior of the electrode but also the electrochemical properties of the immunosensor. Under optimal conditions, the proposed immunosensor could detect AFP in a linear range from 0.1 to 100 ng mL-1 with a detection limit of 0.099 ng mL-1 at a signal-to-noise ratio of 3, and it also possessed good reproducibility and storage stability. Furthermore, the detection of AFP in five human blood samples also showed satisfactory accuracy with low relative errors. Thus, the developed immunosensor, which showed acceptable reproducibility, selectivity, stability and accuracy, could potentially be used for the direct detection of whole blood samples. Copyright © 2017. Published by Elsevier B.V.
Development of MMC Gamma Detectors for Precise Characterization of Uranium Isotopes
NASA Astrophysics Data System (ADS)
Kim, G. B.; Flynn, C. C.; Kempf, S.; Gastaldo, L.; Fleischmann, A.; Enss, C.; Friedrich, S.
2018-06-01
Precise nuclear data from radioactive decays are important for the accurate non-destructive assay of fissile materials in nuclear safeguards. We are developing high energy resolution gamma detectors based on metallic magnetic calorimeters (MMCs) to accurately measure gamma-ray energies and branching ratios of uranium isotopes. Our MMC gamma detectors exhibit good linearity, reproducibility and a consistent response function for low energy gamma-rays. We illustrate the capabilities of MMCs to improve literature values of nuclear data with an analysis of gamma spectra of U-233. In this context, we also improve the value of the energy for the single gamma-ray of the U-233 daughter Ra-225 by over an order of magnitude from 40.09 ± 0.05 to 40.0932 ± 0.0007 keV.
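The order-of-magnitude improvement quoted above (40.09 ± 0.05 keV refined to 40.0932 ± 0.0007 keV) follows the usual pattern of a high-precision measurement dominating a weighted combination. As an illustration of standard inverse-variance weighting (not the authors' actual analysis), combining two independent measurements can be sketched as:

```python
def weighted_mean(measurements):
    """Inverse-variance weighted mean of independent measurements.

    Each measurement is a (value, one_sigma) pair. The combined
    uncertainty is 1 / sqrt(sum of weights), so a much more precise
    measurement dominates the result.
    """
    weights = [1.0 / sigma ** 2 for _, sigma in measurements]
    value = sum(w * v for w, (v, _) in zip(weights, measurements)) / sum(weights)
    sigma = (1.0 / sum(weights)) ** 0.5
    return value, sigma
```

Combining a literature value of 40.09 ± 0.05 keV with a hypothetical new measurement of 40.0932 ± 0.0007 keV gives a result essentially equal to the new measurement, since its weight is roughly 5000 times larger.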
Neumann, Cedric; Ramotowski, Robert; Genessay, Thibault
2011-05-13
Forensic examinations of ink have been performed since the beginning of the 20th century. Since the 1960s, the International Ink Library, maintained by the United States Secret Service, has supported those analyses. Until 2009, the search and identification of inks were essentially performed manually. This paper describes the results of a project designed to improve ink samples' analytical and search processes. The project focused on the development of improved standardization procedures to ensure the best possible reproducibility between analyses run on different HPTLC plates. The successful implementation of this new calibration method enabled the development of mathematical algorithms and of a software package to complement the existing ink library. Copyright © 2010 Elsevier B.V. All rights reserved.
Cagliani, Alberto; Østerberg, Frederik W; Hansen, Ole; Shiv, Lior; Nielsen, Peter F; Petersen, Dirch H
2017-09-01
We present a breakthrough in micro-four-point probe (M4PP) metrology to substantially improve precision of transmission line (transfer length) type measurements by application of advanced electrode position correction. In particular, we demonstrate this methodology for the M4PP current-in-plane tunneling (CIPT) technique. The CIPT method has been a crucial tool in the development of magnetic tunnel junction (MTJ) stacks suitable for magnetic random-access memories for more than a decade. On two MTJ stacks, the measurement precision of resistance-area product and tunneling magnetoresistance was improved by up to a factor of 3.5 and the measurement reproducibility by up to a factor of 17, thanks to our improved position correction technique.
Reproducible Growth of High-Quality Cubic-SiC Layers
NASA Technical Reports Server (NTRS)
Neudeck, Philip G.; Powell, J. Anthony
2004-01-01
Semiconductor electronic devices and circuits based on silicon carbide (SiC) are being developed for use in high-temperature, high-power, and/or high-radiation conditions under which devices made from conventional semiconductors cannot adequately perform. The ability of SiC-based devices to function under such extreme conditions is expected to enable significant improvements in a variety of applications and systems. These include greatly improved high-voltage switching for saving energy in public electric power distribution and electric motor drives; more powerful microwave electronic circuits for radar and communications; and sensors and controls for cleaner-burning, more fuel-efficient jet aircraft and automobile engines.
Surrogate biochemical markers: precise measurement for strategic drug and biologics development.
Lee, J W; Hulse, J D; Colburn, W A
1995-05-01
More efficient drug and biologics development is necessary for future success of pharmaceutical and biotechnology companies. One way to achieve this objective is to use rationally selected surrogate markers to improve the early decision-making process. Using typical clinical chemistry methods to measure biochemical markers may not ensure adequate precision and reproducibility. In contrast, using analytical methods that meet good laboratory practices along with rational selection and validation of biochemical markers can give those who use them a competitive advantage over those who do not by providing meaningful data for earlier decision making.
Sherlock, M.; Brodrick, J. P.; Ridgers, C. P.
2017-08-08
Here, we compare a reduced non-local electron transport model to Vlasov-Fokker-Planck simulations. Two new test cases are considered: the propagation of a heat wave through a high density region into a lower density gas, and a one-dimensional hohlraum ablation problem. We find that the reduced model reproduces the peak heat flux well in the ablation region but significantly over-predicts the coronal preheat. The suitability of the reduced model for computing non-local transport effects other than thermal conductivity is considered by comparing the computed distribution function to the Vlasov-Fokker-Planck distribution function. It is shown that even when the reduced model reproduces the correct heat flux, the distribution function is significantly different to the Vlasov-Fokker-Planck prediction. Two simple modifications are considered which improve agreement between models in the coronal region.
Aghayev, Kamran; Vrionis, Frank D
2013-09-01
The main aim of this paper was to report a reproducible method of lumbar spine access via a lateral retroperitoneal route. The authors conducted a retrospective analysis of the technical aspects and clinical outcomes of six patients who underwent lateral multilevel retroperitoneal interbody fusion with a psoas muscle retraction technique. The main goal was to develop a simple and reproducible technique to avoid injury to the lumbar plexus. Six patients were operated on at 15 levels using the psoas muscle retraction technique. All patients reported improvement in back pain and radiculopathy after the surgery. The only procedure-related transient complication was weakness and pain on hip flexion that resolved by the first follow-up visit. The psoas retraction technique is a reliable technique for lateral access to the lumbar spine and may avoid some of the complications related to the traditional minimally invasive transpsoas approach.
Felder, Martijn; van de Bovenkamp, Hester; de Bont, Antoinette
2018-01-01
In Dutch healthcare, new market mechanisms have been introduced on an experimental basis in an attempt to contain costs and improve quality. Informed by a constructivist approach, we demonstrate that such experiments are not neutral testing grounds. Drawing from semi-structured interviews and policy texts, we reconstruct an experiment on free pricing in dental care that turned into a critical example of market failure, influencing developments in other sectors. Our analysis, however, shows that (1) different market logics and (2) different experimental logics were reproduced simultaneously during the course of the experiment. We furthermore reveal how (3) evaluation and political life influenced which logics were reproduced and became taken as the lessons learned. We use these insights to discuss the role of evaluation in learning from policy experimentation and close with four questions that evaluators could ask to better understand what is learned from policy experiments, how, and why. PMID:29568225
A method for the geometric and densitometric standardization of intraoral radiographs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duckworth, J.E.; Judy, P.F.; Goodson, J.M.
1983-07-01
The interpretation of dental radiographs for the diagnosis of periodontal disease conditions poses several difficulties. These include the inability to adequately reproduce the projection geometry and optical density of the exposures. In order to improve the ability to extract accurate quantitative information from a radiographic survey of periodontal status, a method was developed which provided for consistent reproduction of both geometric and densitometric exposure parameters. This technique employed vertical bitewing projections in holders customized to individual segments of the dentition. A copper stepwedge was designed to provide densitometric standardization, and wire markers were included to permit measurement of angular variation. In a series of 53 paired radiographs, measurement of alveolar crest heights was found to be reproducible within approximately 0.1 mm. This method provided a full mouth radiographic survey using seven films, each complete with internal standards suitable for computer-based image processing.
Communication: Role of explicit water models in the helix folding/unfolding processes
NASA Astrophysics Data System (ADS)
Palazzesi, Ferruccio; Salvalaglio, Matteo; Barducci, Alessandro; Parrinello, Michele
2016-09-01
In recent years, it has become evident that computer simulations can assume a relevant role in modelling protein dynamical motions, owing to their ability to provide a full atomistic picture of the processes under investigation. The ability of current protein force-fields to reproduce the correct thermodynamic and kinetic behaviour of systems is thus an essential ingredient for improving our understanding of many relevant biological functionalities. In this work, employing the latest developments of the metadynamics framework, we compare the ability of state-of-the-art all-atom empirical functions and water models to consistently reproduce the folding and unfolding of a helix turn motif in a model peptide. This theoretical study shows that the choice of water model can influence the thermodynamics and the kinetics of the system under investigation, and for this reason cannot be considered trivial.
Development of New Sensing Materials Using Combinatorial and High-Throughput Experimentation
NASA Astrophysics Data System (ADS)
Potyrailo, Radislav A.; Mirsky, Vladimir M.
New sensors with improved performance characteristics are needed for applications as diverse as bedside continuous monitoring, tracking of environmental pollutants, monitoring of food and water quality, monitoring of chemical processes, and safety in industrial, consumer, and automotive settings. Typical requirements in sensor improvement are selectivity, long-term stability, sensitivity, response time, reversibility, and reproducibility. Design of new sensing materials is the important cornerstone in the effort to develop new sensors. Often, sensing materials are too complex to predict their performance quantitatively in the design stage. Thus, combinatorial and high-throughput experimentation methodologies provide an opportunity to generate new required data to discover new sensing materials and/or to optimize existing material compositions. The goal of this chapter is to provide an overview of the key concepts of experimental development of sensing materials using combinatorial and high-throughput experimentation tools, and to promote additional fruitful interactions between computational scientists and experimentalists.
Semiautomated Segmentation of Polycystic Kidneys in T2-Weighted MR Images.
Kline, Timothy L; Edwards, Marie E; Korfiatis, Panagiotis; Akkus, Zeynettin; Torres, Vicente E; Erickson, Bradley J
2016-09-01
The objective of the present study is to develop and validate a fast, accurate, and reproducible method that will increase and improve institutional measurement of total kidney volume and thereby avoid the higher costs, increased operator processing time, and inherent subjectivity associated with manual contour tracing. We developed a semiautomated segmentation approach, known as the minimal interaction rapid organ segmentation (MIROS) method, which results in human interaction during measurement of total kidney volume on MR images being reduced to a few minutes. This software tool automatically steps through slices and requires rough definition of kidney boundaries supplied by the user. The approach was verified on T2-weighted MR images of 40 patients with autosomal dominant polycystic kidney disease of varying degrees of severity. The MIROS approach required less than 5 minutes of user interaction in all cases. When compared with the ground-truth reference standard, MIROS showed no significant bias and had low variability (mean ± 2 SD, 0.19% ± 6.96%). The MIROS method will greatly facilitate future research studies in which accurate and reproducible measurements of cystic organ volumes are needed.
Considerable concern has been raised regarding research reproducibility both within and outside the scientific community. Several factors possibly contribute to a lack of reproducibility, including a failure to adequately employ statistical considerations during study design, bia...
Gregori, Josep; Villarreal, Laura; Sánchez, Alex; Baselga, José; Villanueva, Josep
2013-12-16
The microarray community has shown that the low reproducibility observed in gene expression-based biomarker discovery studies is partially due to relying solely on p-values to get the lists of differentially expressed genes. Their conclusions recommended complementing the p-value cutoff with the use of effect-size criteria. The aim of this work was to evaluate the influence of such an effect-size filter on spectral counting-based comparative proteomic analysis. The results proved that the filter increased the number of true positives and decreased the number of false positives and the false discovery rate of the dataset. These results were confirmed by simulation experiments in which the effect-size filter was evaluated systematically over varying fractions of differentially expressed proteins. Our results suggest that relaxing the p-value cutoff followed by a post-test filter based on effect size and signal level thresholds can increase the reproducibility of statistical results obtained in comparative proteomic analysis. Based on our work, we recommend using a filter consisting of a minimum absolute log2 fold change of 0.8 and a minimum signal of 2-4 SpC on the most abundant condition for the general practice of comparative proteomics. The implementation of feature filtering approaches could improve proteomic biomarker discovery initiatives by increasing the reproducibility of the results obtained among independent laboratories and MS platforms. Quality control analysis of microarray-based gene expression studies pointed out that the low reproducibility observed in the lists of differentially expressed genes could be partially attributed to the fact that these lists are generated relying solely on p-values. Our study has established that the implementation of an effect-size post-test filter improves the statistical results of spectral count-based quantitative proteomics.
The results proved that the filter increased the number of true positives while decreasing the false positives and the false discovery rate of the datasets. The results presented here prove that a post-test filter applying reasonable effect-size and signal-level thresholds helps to increase the reproducibility of statistical results in comparative proteomic analysis. Furthermore, the implementation of feature filtering approaches could improve proteomic biomarker discovery initiatives by increasing the reproducibility of results obtained among independent laboratories and MS platforms. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 Elsevier B.V. All rights reserved.
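The recommended post-test filter (relaxed p-value cutoff, minimum absolute log2 fold change of 0.8, and a minimum spectral count in the more abundant condition) can be sketched as a simple vectorized filtering step. This is an illustrative sketch, not code from the study; the function and argument names are assumptions.

```python
import numpy as np

def effect_size_filter(log2_fc, mean_spc_a, mean_spc_b, p_values,
                       p_cutoff=0.05, min_abs_log2_fc=0.8, min_spc=2.0):
    """Keep proteins passing a relaxed p-value cutoff plus an effect-size
    filter: |log2 fold change| >= min_abs_log2_fc and at least `min_spc`
    spectral counts in the more abundant condition."""
    log2_fc = np.asarray(log2_fc, dtype=float)
    max_spc = np.maximum(np.asarray(mean_spc_a, dtype=float),
                         np.asarray(mean_spc_b, dtype=float))
    p_values = np.asarray(p_values, dtype=float)
    return ((p_values <= p_cutoff)
            & (np.abs(log2_fc) >= min_abs_log2_fc)
            & (max_spc >= min_spc))

# Toy example with three proteins: the second fails the fold-change filter
keep = effect_size_filter(log2_fc=[1.2, 0.3, -1.5],
                          mean_spc_a=[10, 8, 1],
                          mean_spc_b=[4, 7, 5],
                          p_values=[0.01, 0.02, 0.04])
print(keep)  # [ True False  True]
```

The filter is deliberately applied after the statistical test, so relaxing `p_cutoff` trades raw sensitivity for reproducibility across laboratories.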
Goh, V; Halligan, S; Gartner, L; Bassett, P; Bartram, C I
2006-07-01
The purpose of this study was to determine if greater z-axis tumour coverage improves the reproducibility of quantitative colorectal cancer perfusion measurements using CT. A 65 s perfusion study was acquired following intravenous contrast administration in 10 patients with proven colorectal cancer using a four-detector row scanner. This was repeated within 48 h using identical technical parameters to allow reproducibility assessment. Quantitative tumour blood volume, blood flow, mean transit time and permeability measurements were determined using commercially available software (Perfusion 3.0; GE Healthcare, Waukesha, WI) for data obtained from a 5 mm z-axis tumour coverage, and from a 20 mm z-axis tumour coverage. Measurement reproducibility was assessed using Bland-Altman statistics, for a 5 mm z-axis tumour coverage, and 20 mm z-axis tumour coverage, respectively. The mean difference (95% limits of agreement) for blood volume, blood flow, mean transit time and permeability were 0.04 (-2.50 to +2.43) ml/100 g tissue; +8.80 (-50.5 to +68.0) ml/100 g tissue/min; -0.99 (-8.19 to +6.20) seconds; and +1.20 (-5.42 to +7.83) ml/100 g tissue/min, respectively, for a 5 mm coverage, and -0.04 (-2.61 to +2.53) ml/100 g tissue; +7.40 (-50.3 to +65.0) ml/100 g tissue/min; -2.46 (-12.61 to +7.69) seconds; and -0.23 (-8.31 to +7.85) ml/100 g tissue/min, respectively, for a 20 mm coverage, indicating similar levels of agreement. In conclusion, increasing z-axis coverage does not improve reproducibility of quantitative colorectal cancer perfusion measurements.
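The agreement statistics reported above (mean difference with 95% limits of agreement, i.e. mean ± 1.96 SD of the paired differences) follow the standard Bland-Altman analysis, which can be sketched as follows. The values and variable names are illustrative, not the study's data.

```python
import numpy as np

def bland_altman(scan1, scan2):
    """Mean difference and 95% limits of agreement between two
    repeated measurements (Bland-Altman analysis)."""
    d = np.asarray(scan1, dtype=float) - np.asarray(scan2, dtype=float)
    mean_diff = d.mean()
    half_width = 1.96 * d.std(ddof=1)   # 1.96 x sample SD of differences
    return mean_diff, mean_diff - half_width, mean_diff + half_width

# Toy repeated blood-flow measurements (ml/100 g tissue/min)
md, lower, upper = bland_altman([60, 72, 55, 80], [58, 75, 50, 78])
print(round(md, 2), round(lower, 2), round(upper, 2))  # 1.5 -5.0 8.0
```

Narrower limits of agreement indicate better test-retest reproducibility, which is why the 5 mm and 20 mm coverages could be compared directly on this scale.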
Copie, X; Blankoff, I; Hnatkova, K; Fei, L; Camm, A J; Malik, M
1996-06-01
The authors studied the possibility of improving the reproducibility of the signal-averaged ECG by increasing the number of averaged QRS complexes. One hundred patients were included in the study. In each case, 400 QRS complexes were recorded twice, consecutively, under strictly identical conditions. During each recording, the total duration of the amplified and averaged QRS complex (tQRS), the duration of the terminal signal below 40 microV (LAS) and the root mean square of the amplitude of the last 40 ms (RMS) were determined for 100, 200, 300 and 400 recorded QRS complexes. The presence of late potentials was defined as the positivity of two of the following criteria: tQRS > 114 ms, LAS > 38 ms, RMS < 20 microV. The number of contradictory diagnostic conclusions between two successive recordings of the same duration decreased progressively with the number of averaged QRS complexes: 10 for 100 QRS, 10 for 200 QRS, 9 for 300 QRS and 6 for 400 QRS complexes, but this improvement was not statistically significant. The absolute differences of tQRS and RMS between two successive recordings of the same duration were statistically different for the four durations of recording (p = 0.05), and there was a tendency towards statistical significance for LAS (p = 0.09). The best quantitative reproducibility of the three parameters was obtained with the recording of 300 QRS complexes. In conclusion, the reproducibility of the signal-averaged ECG is improved when the number of averaged QRS complexes is increased. The authors' results suggest that reproducibility is optimal with the amplification and averaging of 300 QRS complexes.
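The rationale for averaging more beats is that uncorrelated noise falls roughly as 1/sqrt(N) while the repeating QRS waveform does not. A minimal numerical sketch with synthetic data (a sine wave standing in for a QRS template, not real ECG):

```python
import numpy as np

rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 2 * np.pi, 200))  # stand-in for a QRS template

def residual_noise(n_beats):
    """RMS residual noise left after averaging n_beats noisy copies
    of the same template (noise SD 0.5 per beat)."""
    beats = signal + rng.normal(0.0, 0.5, size=(n_beats, signal.size))
    return float(np.sqrt(np.mean((beats.mean(axis=0) - signal) ** 2)))

for n in (100, 200, 300, 400):
    print(n, round(residual_noise(n), 4))  # noise shrinks roughly as 0.5/sqrt(n)
```

This explains the diminishing returns observed in the study: going from 300 to 400 averaged complexes reduces residual noise by only about 13%.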
Improvement and Extension of Shape Evaluation Criteria in Multi-Scale Image Segmentation
NASA Astrophysics Data System (ADS)
Sakamoto, M.; Honda, Y.; Kondo, A.
2016-06-01
Over the last decade, multi-scale image segmentation has attracted particular interest and is practically used for object-based image analysis. In this study, we have addressed issues in multi-scale image segmentation, especially in improving the validity of merging and the variety of derived region shapes. Firstly, we introduced constraints on the application of the spectral criterion which suppress excessive merging between dissimilar regions. Secondly, we extended the evaluation of the smoothness criterion by modifying the definition of the extent of the object, introduced to control shape diversity. Thirdly, we developed a new shape criterion, called aspect ratio. This criterion helps to improve the reproducibility of object shapes so that they match the actual objects of interest. It constrains the aspect ratio in the bounding box of an object while keeping the properties controlled by conventional shape criteria. These improvements and extensions lead to more accurate, flexible, and diverse segmentation results according to the shape characteristics of the target of interest. Furthermore, we also investigated a technique for quantitative and automatic parameterization in multi-scale image segmentation. This approach compares the segmentation result with a training area specified in advance, either maximizing the average area of the derived objects or satisfying the evaluation index called the F-measure. Thus, it has been possible to automate parameterization suited to the objectives, especially from the viewpoint of shape reproducibility.
Multi-scale hippocampal parcellation improves atlas-based segmentation accuracy
NASA Astrophysics Data System (ADS)
Plassard, Andrew J.; McHugo, Maureen; Heckers, Stephan; Landman, Bennett A.
2017-02-01
Known for its distinct role in memory, the hippocampus is one of the most studied regions of the brain. Recent advances in magnetic resonance imaging have allowed for high-contrast, reproducible imaging of the hippocampus. Typically, a trained rater takes 45 minutes to manually trace the hippocampus and delineate the anterior from the posterior segment at millimeter resolution. As a result, there has been a significant desire for automated and robust segmentation of the hippocampus. In this work we use a population of 195 atlases based on T1-weighted MR images with the left and right hippocampus delineated into the head and body. We initialize the multi-atlas segmentation to a region directly around each lateralized hippocampus to both speed up and improve the accuracy of registration. This initialization allows for incorporation of nearly 200 atlases, an accomplishment which would typically involve hundreds of hours of computation per target image. The proposed segmentation results in a Dice similarity coefficient over 0.9 for the full hippocampus. This result outperforms a multi-atlas segmentation using the BrainCOLOR atlases (Dice 0.85) and FreeSurfer (Dice 0.75). Furthermore, the head and body delineation resulted in a Dice coefficient over 0.87 for both structures. The head and body volume measurements also show high reproducibility on the Kirby 21 reproducibility population (R2 greater than 0.95, p < 0.05 for all structures). This work signifies the first result in an ongoing work to develop a robust tool for measurement of the hippocampus and other temporal lobe structures.
Reproducibility of preclinical animal research improves with heterogeneity of study samples
Vogt, Lucile; Sena, Emily S.; Würbel, Hanno
2018-01-01
Single-laboratory studies conducted under highly standardized conditions are the gold standard in preclinical animal research. Using simulations based on 440 preclinical studies across 13 different interventions in animal models of stroke, myocardial infarction, and breast cancer, we compared the accuracy of effect size estimates between single-laboratory and multi-laboratory study designs. Single-laboratory studies generally failed to predict effect size accurately, and larger sample sizes rendered effect size estimates even less accurate. By contrast, multi-laboratory designs including as few as 2 to 4 laboratories increased coverage probability by up to 42 percentage points without a need for larger sample sizes. These findings demonstrate that within-study standardization is a major cause of poor reproducibility. More representative study samples are required to improve the external validity and reproducibility of preclinical animal research and to prevent wasting animals and resources for inconclusive research. PMID:29470495
Liakhovetskiĭ, V A; Bobrova, E V; Skopin, G N
2012-01-01
Transposition errors during the reproduction of a hand movement sequence make it possible to obtain important information on the internal representation of this sequence in motor working memory. Analysis of such errors showed that learning to reproduce sequences of left-hand movements improves the system of positional coding (coding of positions), while learning of right-hand movements improves the system of vector coding (coding of movements). Learning of right-hand movements after left-hand performance involved the system of positional coding "imposed" by the left hand. Learning of left-hand movements after right-hand performance activated the system of vector coding. Transposition errors during learning to reproduce movement sequences can be explained by a neural network using either vector coding or both vector and positional coding.
O'Keeffe, S T; Lye, M; Donnellan, C; Carmichael, D N
1998-10-01
To examine the reproducibility and responsiveness to change of a six minute walk test and a quality of life measure in elderly patients with heart failure. Longitudinal within patient study. 60 patients with heart failure (mean age 82 years) attending a geriatric outpatient clinic, 45 of whom underwent a repeat assessment three to eight weeks later. Subjects underwent a standardised six minute walk test and completed the chronic heart failure questionnaire (CHQ), a heart failure specific quality of life questionnaire. Intraclass correlation coefficients (ICC) were calculated using a random effects one way analysis of variance as a measure of reproducibility. Guyatt's responsiveness coefficient and effect sizes were calculated as measures of responsiveness to change. 24 patients reported no major change in cardiac status, while seven had deteriorated and 14 had improved between the two clinic visits. Reproducibility was satisfactory (ICC > 0.75) for the six minute walk test, for the total CHQ score, and for the dyspnoea, fatigue, and emotion domains of the CHQ. Effect sizes for all measures were large (> 0.8), and responsiveness coefficients were very satisfactory (> 0.7). Effect sizes for detecting deterioration were greater than those for detecting improvement. Quality of life assessment and a six minute walk test are reproducible and responsive measures of cardiac status in frail, very elderly patients with heart failure.
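The reproducibility measure used above, an intraclass correlation coefficient from a one-way random-effects ANOVA, is ICC(1,1) = (MSB - MSW) / (MSB + (k-1)·MSW), where MSB and MSW are the between- and within-subject mean squares over k repeated measurements. A minimal sketch on toy data (not the study's measurements):

```python
import numpy as np

def icc_oneway(ratings):
    """ICC(1,1) from a one-way random-effects ANOVA.
    `ratings` is an (n_subjects, k_measurements) array."""
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    subj_means = x.mean(axis=1)
    msb = k * np.sum((subj_means - grand) ** 2) / (n - 1)         # between subjects
    msw = np.sum((x - subj_means[:, None]) ** 2) / (n * (k - 1))  # within subjects
    return float((msb - msw) / (msb + (k - 1) * msw))

# Toy test-retest six minute walk distances (m) for four patients
icc = icc_oneway([[300, 310], [250, 245], [400, 390], [320, 330]])
print(round(icc, 3))  # 0.989
```

Values above the study's 0.75 threshold indicate that most of the variance comes from true between-patient differences rather than measurement noise.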
de Beer, Jessica L.; Kremer, Kristin; Ködmön, Csaba; Supply, Philip
2012-01-01
Although variable-number tandem-repeat (VNTR) typing has gained recognition as the new standard for the DNA fingerprinting of Mycobacterium tuberculosis complex (MTBC) isolates, external quality control programs have not yet been developed. Therefore, we organized the first multicenter proficiency study on 24-locus VNTR typing. Sets of 30 DNAs of MTBC strains, including 10 duplicate DNA samples, were distributed among 37 participating laboratories in 30 different countries worldwide. Twenty-four laboratories used an in-house-adapted method with fragment sizing by gel electrophoresis or an automated DNA analyzer, nine laboratories used a commercially available kit, and four laboratories used other methods. The intra- and interlaboratory reproducibilities of VNTR typing varied from 0% to 100%, with averages of 72% and 60%, respectively. Twenty of the 37 laboratories failed to amplify particular VNTR loci; if these missing results were ignored, the number of laboratories with 100% interlaboratory reproducibility increased from 1 to 5. The average interlaboratory reproducibility of VNTR typing using a commercial kit was better (88%) than that of in-house-adapted methods using a DNA analyzer (70%) or gel electrophoresis (50%). Eleven laboratories using in-house-adapted manual typing or automated typing scored inter- and intralaboratory reproducibilities of 80% or higher, which suggests that these approaches can be used in a reliable way. In conclusion, this first multicenter study has documented the worldwide quality of VNTR typing of MTBC strains and highlights the importance of international quality control to improve genotyping in the future. PMID:22170917
Arabidopsis phenotyping through Geometric Morphometrics.
Manacorda, Carlos A; Asurmendi, Sebastian
2018-06-18
Recently, much technical progress has been achieved in the field of plant phenotyping. High-throughput platforms and the development of improved algorithms for rosette image segmentation now make it possible to extract shape and size parameters for genetic, physiological and environmental studies on a large scale. The development of low-cost phenotyping platforms and freeware resources makes it possible to widely expand phenotypic analysis tools for Arabidopsis. However, objective descriptors of shape parameters that could be used independently of the platform and segmentation software are still lacking, and shape descriptions still rely on ad hoc or even sometimes contradictory descriptors, which could make comparisons difficult and perhaps inaccurate. Modern geometric morphometrics is a family of methods in quantitative biology proposed to be the main source of data and analytical tools in the emerging field of phenomics. Based on the location of landmarks (corresponding points) over imaged specimens and by combining geometry, multivariate analysis and powerful statistical techniques, these tools offer the possibility to reproducibly and accurately account for shape variations amongst groups and measure them in shape distance units. Here, a particular scheme of landmark placement on Arabidopsis rosette images is proposed to study shape variation during viral infection processes. Shape differences between controls and infected plants are quantified throughout the infectious process and visualized. Quantitative comparisons between two unrelated ssRNA+ viruses are shown and reproducibility issues are assessed. Combined with the newest automated platforms and plant segmentation procedures, geometric morphometric tools could boost phenotypic feature extraction and processing in an objective, reproducible manner.
Dragly, Svenn-Arne; Hobbi Mobarhan, Milad; Lepperød, Mikkel E.; Tennøe, Simen; Fyhn, Marianne; Hafting, Torkel; Malthe-Sørenssen, Anders
2018-01-01
Natural sciences generate an increasing amount of data in a wide range of formats developed by different research groups and commercial companies. At the same time there is a growing desire to share data along with publications in order to enable reproducible research. Open formats have publicly available specifications which facilitate data sharing and reproducible research. Hierarchical Data Format 5 (HDF5) is a popular open format widely used in neuroscience, often as a foundation for other, more specialized formats. However, drawbacks related to HDF5's complex specification have initiated a discussion for an improved replacement. We propose a novel alternative, the Experimental Directory Structure (Exdir), an open specification for data storage in experimental pipelines which amends drawbacks associated with HDF5 while retaining its advantages. HDF5 stores data and metadata in a hierarchy within a complex binary file which, among other things, is not human-readable, not optimal for version control systems, and lacks support for easy access to raw data from external applications. Exdir, on the other hand, uses file system directories to represent the hierarchy, with metadata stored in human-readable YAML files, datasets stored in binary NumPy files, and raw data stored directly in subdirectories. Furthermore, storing data in multiple files makes it easier to track for version control systems. Exdir is not a file format in itself, but a specification for organizing files in a directory structure. Exdir uses the same abstractions as HDF5 and is compatible with the HDF5 Abstract Data Model. Several research groups are already using data stored in a directory hierarchy as an alternative to HDF5, but no common standard exists. This complicates and limits the opportunity for data sharing and development of common tools for reading, writing, and analyzing data. 
Exdir facilitates improved data storage, data sharing, reproducible research, and novel insight from interdisciplinary collaboration. With the publication of Exdir, we invite the scientific community to join the development to create an open specification that will serve as many needs as possible and as a foundation for open access to and exchange of data. PMID:29706879
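The Exdir idea, representing the hierarchy as file system directories, metadata as human-readable YAML, and datasets as binary NumPy files, can be illustrated with the standard library and NumPy alone, without the Exdir package itself. The paths, group names, and attribute keys below are illustrative assumptions, not part of the Exdir specification.

```python
import numpy as np
from pathlib import Path

root = Path("session1.exdir")            # the top-level "file" is a directory
group = root / "ephys" / "channel_0"     # groups are nested subdirectories
group.mkdir(parents=True, exist_ok=True)

# Metadata lives in plain YAML text: human-readable and version-control friendly
(group / "attributes.yaml").write_text(
    "sampling_rate_hz: 30000\nunit: microvolt\n"
)

# A dataset is a plain binary NumPy file
np.save(group / "data.npy", np.zeros(1000))

# Raw vendor files would go directly into a subdirectory, e.g. group / "raw"
print(sorted(p.name for p in group.iterdir()))  # ['attributes.yaml', 'data.npy']
```

Because every object is an ordinary file or directory, external tools can read raw data and metadata without any special library, which is the drawback of HDF5 that Exdir is designed to remove.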
NASA Astrophysics Data System (ADS)
Chouhan, Manil D.; Bainbridge, Alan; Atkinson, David; Punwani, Shonit; Mookerjee, Rajeshwar P.; Lythgoe, Mark F.; Taylor, Stuart A.
2017-02-01
Liver dynamic contrast enhanced (DCE) MRI pharmacokinetic modelling could be useful in the assessment of diffuse liver disease and focal liver lesions, but is compromised by errors in arterial input function (AIF) sampling. In this study, we apply cardiac output correction to arterial input functions (AIFs) for liver DCE MRI and investigate the effect on dual-input single compartment hepatic perfusion parameter estimation and reproducibility. Thirteen healthy volunteers (28.7 ± 1.94 years, seven males) underwent liver DCE MRI and cardiac output measurement using aortic root phase contrast MRI (PCMRI), with reproducibility (n = 9) measured at 7 d. Cardiac output AIF correction was undertaken by constraining the first pass AIF enhancement curve using the indicator-dilution principle. Hepatic perfusion parameters with and without cardiac output AIF correction were compared and 7 d reproducibility assessed. Differences between cardiac output corrected and uncorrected liver DCE MRI portal venous (PV) perfusion (p = 0.066), total liver blood flow (TLBF) (p = 0.101), hepatic arterial (HA) fraction (p = 0.895), mean transit time (MTT) (p = 0.646), and distribution volume (DV) (p = 0.890) were not significantly different. Seven day corrected HA fraction reproducibility was improved (mean difference 0.3%, Bland-Altman 95% limits-of-agreement (BA95%LoA) ±27.9%, coefficient of variation (CoV) 61.4% versus 9.3%, ±35.5%, 81.7% respectively without correction). By contrast, seven day PV perfusion reproducibility was better without correction (mean difference 9.3 ml min-1/100 g, BA95%LoA ±506.1 ml min-1/100 g, CoV 64.1% uncorrected versus 0.9 ml min-1/100 g, ±562.8 ml min-1/100 g, 65.1% with correction), as was TLBF reproducibility (mean difference 43.8 ml min-1/100 g, BA95%LoA ±586.7 ml min-1/100 g, CoV 58.3% uncorrected versus 13.3 ml min-1/100 g, ±661.5 ml min-1/100 g, 60.9% with correction).
Reproducibility of MTT was similar with and without correction (mean difference 2.4 s, BA95%LoA ±26.7 s, CoV 60.8% uncorrected versus 3.7 s, ±27.8 s, 62.0% with correction), as was DV (mean difference 14.1%, BA95%LoA ±48.2%, CoV 24.7% uncorrected versus 10.3%, ±46.0%, 23.9% with correction). Cardiac output AIF correction does not significantly affect the estimation of hepatic perfusion parameters, but demonstrates improvement in normal volunteer 7 d HA fraction reproducibility and deterioration in PV perfusion and TLBF reproducibility. Improved HA fraction reproducibility may be important, as arterialisation of liver perfusion is increased in chronic liver disease and within malignant liver lesions.
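The indicator-dilution constraint behind the AIF correction states that injected dose = cardiac output × area under the first-pass concentration curve; constraining the sampled first-pass AIF therefore amounts to rescaling it so its area matches dose / cardiac output. The sketch below illustrates that rescaling step only, under that assumption, with entirely synthetic values; it is not the study's implementation.

```python
import numpy as np

def trapz(y, x):
    """Trapezoidal integration (written out to avoid NumPy version differences)."""
    y, x = np.asarray(y, dtype=float), np.asarray(x, dtype=float)
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def correct_aif(t, aif, dose_mmol, cardiac_output_l_per_s):
    """Rescale a first-pass AIF so the indicator-dilution principle holds:
    dose = cardiac output * integral of concentration over time."""
    scale = dose_mmol / (cardiac_output_l_per_s * trapz(aif, t))
    return np.asarray(aif, dtype=float) * scale

t = np.linspace(0.0, 30.0, 301)                   # time, s
aif = 5.0 * np.exp(-((t - 12.0) ** 2) / 18.0)     # synthetic first-pass bolus, mmol/L
corrected = correct_aif(t, aif, dose_mmol=7.5, cardiac_output_l_per_s=0.1)

print(round(trapz(corrected, t), 2))  # area now equals dose / CO = 75.0
```

Scaling the AIF this way removes the dependence of its amplitude on local sampling errors, which is how the correction can alter perfusion parameter reproducibility without changing the model itself.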
1992-05-22
Evaluation and Control of Compound Semiconductor Materials and Technologies (EXMATEC) at Ecole Centrale de Lyon (Ecully, France, 19th to 22nd May...). ...semiconductor technologies to manufacture advanced devices with improved reproducibility, better reliability and lower cost. ...Device structures... concepts are required for expert evaluation and control of still-developing technologies. In this context, the EXMATEC series will constitute a major...
Duan, Dongsheng; Rafael-Fortney, Jill A; Blain, Alison; Kass, David A; McNally, Elizabeth M; Metzger, Joseph M; Spurney, Christopher F; Kinnett, Kathi
2016-02-01
A recent working group meeting focused on contemporary cardiac issues in Duchenne muscular dystrophy (DMD) was hosted by the National Heart, Lung, and Blood Institute in collaboration with the Parent Project Muscular Dystrophy. An outcome of this meeting was to provide freely available detailed protocols for preclinical animal studies. The goal of these protocols is to improve the quality and reproducibility of cardiac preclinical studies aimed at developing new therapeutics for the prevention and treatment of DMD cardiomyopathy.
NASA Astrophysics Data System (ADS)
Zharinov, I. O.; Zharinov, O. O.
2017-12-01
This study is concerned with quantitative analysis of the influence of technological variation in screen color profile parameters on the chromaticity coordinates of the displayed image. Mathematical expressions are proposed that approximate the two-dimensional distribution of the chromaticity coordinates of an image displayed on a screen with a three-component color formation principle. The proposed expressions point the way to correction techniques that improve the reproducibility of the colorimetric characteristics of displays.
Kent, Katherine; Charlton, Karen
2017-01-01
There is a large burden on researchers and participants when attempting to accurately measure dietary flavonoid intake using dietary assessment. Minimizing participant and researcher burden when collecting dietary data may improve the validity of the results, especially in older adults with cognitive impairment. A short 14-item food frequency questionnaire (FFQ) to measure flavonoid intake and flavonoid subclasses (anthocyanins, flavan-3-ols, flavones, flavonols, and flavanones) was developed and assessed for validity and reproducibility against a 24-hour recall. Older adults with mild-moderate dementia (n = 49) attended two interviews 12 weeks apart. With the assistance of a family carer, a 24-h recall was collected at the first interview, and the flavonoid FFQ was interviewer-administered at both time-points. Validity and reproducibility were assessed using the Wilcoxon signed-rank sum test, Spearman's correlation coefficient, Bland-Altman plots, and Cohen's kappa. Mean flavonoid intake was determined (FFQ1 = 795 ± 492.7 mg/day, 24-h recall = 515.6 ± 384.3 mg/day). Tests of validity indicated the FFQ was better at estimating total flavonoid intake than individual flavonoid subclasses compared with the 24-h recall. There was a significant difference in total flavonoid intake estimates between the FFQ and the 24-h recall (Wilcoxon signed-rank sum p < 0.001; Bland-Altman plots indicated large bias and wide limits of agreement), but they were well correlated (Spearman's correlation coefficient r = 0.74, p < 0.001; Cohen's kappa κ = 0.292, p < 0.001). The FFQ showed good reproducibility, with a small mean percentage difference (12.6%). The Wilcoxon signed-rank sum test showed no significant difference, Spearman's correlation coefficient indicated excellent reliability (r = 0.75, p < 0.001), Bland-Altman plots visually showed small, nonsignificant bias and wide limits of agreement, and Cohen's kappa indicated fair agreement (κ = 0.429, p < 0.001).
A 14-item FFQ developed to easily measure flavonoid intake in older adults with dementia demonstrates fair validity against a 24-h recall and good reproducibility.
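Cohen's kappa, used above for agreement between the two methods' intake categories, is (p_o - p_e) / (1 - p_e), where p_o is observed agreement and p_e is agreement expected by chance from the marginal frequencies. A minimal sketch on toy category labels (the data below are invented for illustration):

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa for two raters' categorical labels of the same items."""
    a, b = np.asarray(a), np.asarray(b)
    categories = np.union1d(a, b)
    p_o = float(np.mean(a == b))                         # observed agreement
    p_e = float(sum(np.mean(a == c) * np.mean(b == c)    # chance agreement from
                    for c in categories))                # marginal frequencies
    return (p_o - p_e) / (1.0 - p_e)

# Toy tertiles of flavonoid intake assigned by two methods for eight subjects
k = cohens_kappa([0, 1, 2, 1, 0, 2, 1, 0], [0, 1, 2, 2, 0, 2, 0, 0])
print(round(k, 3))  # 0.628
```

Kappa corrects raw agreement for chance, which is why the study's κ = 0.429 is read as only "fair" agreement despite the methods matching on most subjects.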
How measurement science can improve confidence in research results.
Plant, Anne L; Becker, Chandler A; Hanisch, Robert J; Boisvert, Ronald F; Possolo, Antonio M; Elliott, John T
2018-04-01
The current push for rigor and reproducibility is driven by a desire for confidence in research results. Here, we suggest a framework for a systematic process, based on consensus principles of measurement science, to guide researchers and reviewers in assessing, documenting, and mitigating the sources of uncertainty in a study. All study results have associated ambiguities that are not always clarified by simply establishing reproducibility. By explicitly considering sources of uncertainty, noting aspects of the experimental system that are difficult to characterize quantitatively, and proposing alternative interpretations, the researcher provides information that enhances comparability and reproducibility.
Collaborative study for the validation of an improved HPLC assay for recombinant IFN-alfa-2.
Jönsson, K H; Daas, A; Buchheit, K H; Terao, E
2016-01-01
The current European Pharmacopoeia (Ph. Eur.) texts for Interferon (IFN)-alfa-2 include a nonspecific photometric protein assay using albumin as calibrator and a highly variable cell-based assay for the potency determination of the protective effects. A request was expressed by the Official Medicines Control Laboratories (OMCLs) for improved methods for the batch control of recombinant interferon alfa-2 bulk and market surveillance testing of finished products, including those formulated with Human Serum Albumin (HSA). A HPLC method was developed at the Medical Products Agency (MPA, Sweden) for the testing of IFN-alfa-2 products. An initial collaborative study run under the Biological Standardisation Programme (BSP; study code BSP039) revealed the need for minor changes to improve linearity of the calibration curves, assay reproducibility and robustness. The goal of the collaborative study, coded BSP071, was to transfer and further validate this improved HPLC method. Ten laboratories participated in the study. Four marketed IFN-alfa-2 preparations (one containing HSA) together with the Ph. Eur. Chemical Reference Substance (CRS) for IFN-alfa-2a and IFN-alfa-2b, and in-house reference standards from two manufacturers were used for the quantitative assay. The modified method was successfully transferred to all laboratories despite local variation in equipment. The resolution between the main and the oxidised forms of IFN-alfa-2 was improved compared to the results from the BSP039 study. The improved method even allowed partial resolution of an extra peak after the principal peak. Symmetry of the main IFN peak was acceptable for all samples in all laboratories. Calibration curves established with the Ph. Eur. IFN-alfa-2a and IFN-alfa-2b CRSs showed excellent linearity with intercepts close to the origin and coefficients of determination greater than 0.9995. 
Assay repeatability, intermediate precision and reproducibility varied with the tested sample within acceptable ranges. Test accuracy estimated by comparing the values obtained by the participants to the declared contents determined by the manufacturers was good despite the absence of a common reference preparation. In conclusion, the present study showed that the new method is suitable, reproducible and transferable. Proposals for the revision of Ph. Eur. texts are presented.
Viddeleer, Alain R; Sijens, Paul E; van Ooijen, Peter M A; Kuypers, Paul D L; Hovius, Steven E R; Oudkerk, Matthijs
2009-08-01
Nerve regeneration could be monitored by comparing MRI image intensities in time, as denervated muscles display increased signal intensity in STIR sequences. In this study long-term reproducibility of STIR image intensity was assessed under clinical conditions and the required image intensity nonuniformity correction was improved by using phantom scans obtained at multiple positions. Three-dimensional image intensity nonuniformity was investigated in phantom scans. Next, over a three-year period, 190 clinical STIR hand scans were obtained using a standardized acquisition protocol, and corrected for intensity nonuniformity by using the results of phantom scanning. The results of correction with 1, 3, and 11 phantom scans were compared. The image intensities in calibration tubes close to the hands were measured every time to determine the reproducibility of our method. With calibration, the reproducibility of STIR image intensity improved from 7.8 to 6.4%. Image intensity nonuniformity correction with 11 phantom scans gave significantly better results than correction with 1 or 3 scans. The image intensities in clinical STIR images acquired at different times can be compared directly, provided that the acquisition protocol is standardized and that nonuniformity correction is applied. Nonuniformity correction is preferably based on multiple phantom scans.
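One common way to apply a phantom-based nonuniformity correction is to divide each clinical image by a gain map formed by averaging and normalizing the phantom scans. A simplified one-dimensional sketch with hypothetical intensities (real data would be 2-D or 3-D arrays, and this is an assumed form of the correction, not the paper's exact algorithm):

```python
def gain_map(phantom_scans):
    """Average several phantom scans voxel-wise and normalize to mean 1.0."""
    n = len(phantom_scans)
    mean = [sum(vals) / n for vals in zip(*phantom_scans)]
    overall = sum(mean) / len(mean)
    return [v / overall for v in mean]

def correct(image, gain):
    """Divide out the spatial gain to flatten the intensity profile."""
    return [px / g for px, g in zip(image, gain)]

# Three hypothetical phantom scans of the same four positions:
phantoms = [
    [0.90, 1.00, 1.10, 1.00],
    [0.88, 1.02, 1.12, 0.98],
    [0.92, 0.98, 1.08, 1.02],
]
gain = gain_map(phantoms)
raw = [90.0, 100.0, 110.0, 100.0]   # uncorrected clinical intensities
flat = correct(raw, gain)
print(flat)
```

Averaging more phantom scans reduces noise in the gain map, which is consistent with the study's finding that 11 phantom scans outperform 1 or 3.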
Development and fabrication of improved Schottky power diodes
NASA Technical Reports Server (NTRS)
Cordes, L. F.; Garfinkel, M.; Taft, E. A.
1975-01-01
Reproducible methods for the fabrication of silicon Schottky diodes have been developed for tungsten, aluminum, conventional platinum silicide, and low temperature platinum silicide. Barrier heights and barrier lowering under reverse bias have been measured, permitting the accurate prediction of forward and reverse diode characteristics. Processing procedures have been developed that permit the fabrication of large area (about 1 sq cm) mesa-geometry power Schottky diodes with forward and reverse characteristics that approach theoretical values. A theoretical analysis of the operation of bridge rectifier circuits has been performed, which indicates the ranges of frequency and voltage for which Schottky rectifiers are preferred to p-n junctions. Power Schottky rectifiers have been fabricated and tested for voltage ratings up to 140 volts.
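Predicting forward and reverse characteristics from a measured barrier height is conventionally done with the thermionic-emission diode equation. A textbook-style sketch (the Richardson constant, temperature and barrier height below are illustrative assumptions, not values from the report):

```python
import math

def schottky_current(v, phi_b, area_cm2=1.0, temp=300.0, a_star=112.0, n=1.0):
    """Thermionic-emission model: I = A * A** * T^2 * exp(-phi_B/kT) * (exp(V/nkT) - 1),
    with phi_B and V in volts and kT expressed in eV.  a_star = 112 A/(cm^2 K^2)
    is the value usually quoted for n-type silicon (an assumption here)."""
    kt = 8.617e-5 * temp                               # thermal energy in eV
    j_s = a_star * temp ** 2 * math.exp(-phi_b / kt)   # saturation current density
    return area_cm2 * j_s * math.expm1(v / (n * kt))

i_fwd = schottky_current(0.3, 0.67)    # forward bias
i_rev = schottky_current(-0.3, 0.67)   # reverse bias
print(i_fwd, i_rev)
```

The model shows why a small change in barrier height shifts the whole I-V curve exponentially, which is why reproducible barrier formation matters for predictable rectifiers.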
On the temporal development of erythrocyte sedimentation rate using sealed vacuum tubes.
Kallner, A
1991-07-01
The temporal development of the erythrocyte sedimentation rate (ESR) was studied in wide, short vacuum tubes. It was found that in about 3% of the specimens arriving in the laboratory the ESR developed in three different phases during 60 min, whereas the others showed only two. The specimens with three phases behaved similarly in the Westergren method. It was shown that the Westergren ESR can be estimated with acceptable accuracy from measurements obtained after only 30 min. Reproducibility and precision were improved by using a special instrument. Several advantages of this procedure were recognized, e.g., quicker results and the identification of several rapid ESRs that would otherwise be missed. Accurate timing of the readings further improves accuracy and precision, and permits estimation of ESR (Westergren) up to 100 mm. In view of the distinct phases in the development of the ESR, it is proposed that the abbreviation be interpreted as erythrocyte sedimentation reaction and that the quantity, being a length, be expressed in mm.
Improved ultrasonic standard reference blocks
NASA Technical Reports Server (NTRS)
Eitzen, D. G.; Sushinsky, G. F.; Chwirut, D. J.; Bechtoldt, C. J.; Ruff, A. W.
1976-01-01
A program to improve the quality, reproducibility and reliability of nondestructive testing through the development of improved ASTM-type ultrasonic reference standards is described. Reference blocks of aluminum, steel, and titanium alloys are to be considered. Equipment representing the state-of-the-art in laboratory and field ultrasonic equipment was obtained and evaluated. RF and spectral data on ten sets of ultrasonic reference blocks have been taken as part of a task to quantify the variability in response from nominally identical blocks. Techniques for residual stress, preferred orientation, and microstructural measurements were refined and applied to a reference block rejected by the manufacturer during fabrication in order to evaluate the effect of metallurgical condition on block response. New fabrication techniques for reference blocks are discussed and ASTM activities are summarized.
Four simple recommendations to encourage best practices in research software
Jiménez, Rafael C.; Kuzak, Mateusz; Alhamdoosh, Monther; Barker, Michelle; Batut, Bérénice; Borg, Mikael; Capella-Gutierrez, Salvador; Chue Hong, Neil; Cook, Martin; Corpas, Manuel; Flannery, Madison; Garcia, Leyla; Gelpí, Josep Ll.; Gladman, Simon; Goble, Carole; González Ferreiro, Montserrat; Gonzalez-Beltran, Alejandra; Griffin, Philippa C.; Grüning, Björn; Hagberg, Jonas; Holub, Petr; Hooft, Rob; Ison, Jon; Katz, Daniel S.; Leskošek, Brane; López Gómez, Federico; Oliveira, Luis J.; Mellor, David; Mosbergen, Rowland; Mulder, Nicola; Perez-Riverol, Yasset; Pergl, Robert; Pichler, Horst; Pope, Bernard; Sanz, Ferran; Schneider, Maria V.; Stodden, Victoria; Suchecki, Radosław; Svobodová Vařeková, Radka; Talvik, Harry-Anton; Todorov, Ilian; Treloar, Andrew; Tyagi, Sonika; van Gompel, Maarten; Vaughan, Daniel; Via, Allegra; Wang, Xiaochuan; Watson-Haigh, Nathan S.; Crouch, Steve
2017-01-01
Scientific research relies on computer software, yet software is not always developed following practices that ensure its quality and sustainability. This manuscript does not aim to propose new software development best practices, but rather to provide simple recommendations that encourage the adoption of existing best practices. Software development best practices promote better quality software, and better quality software improves the reproducibility and reusability of research. These recommendations are designed around Open Source values, and provide practical suggestions that contribute to making research software and its source code more discoverable, reusable and transparent. This manuscript is aimed at developers, but also at organisations, projects, journals and funders that can increase the quality and sustainability of research software by encouraging the adoption of these recommendations. PMID:28751965
Stripline split-ring resonator with integrated optogalvanic sample cell
NASA Astrophysics Data System (ADS)
Persson, Anders; Berglund, Martin; Thornell, Greger; Possnert, Göran; Salehpour, Mehran
2014-04-01
Intracavity optogalvanic spectroscopy (ICOGS) has been proposed as a method for unambiguous detection of rare isotopes. Of particular interest is ¹⁴C, where detection of extremely low concentrations in the 1:10¹⁵ range (¹⁴C:¹²C) is relevant to, e.g., radiocarbon dating and pharmaceutical sciences. However, recent reports show that ICOGS suffers from substantial problems with reproducibility. To qualify ICOGS as an analytical method, more stable and reliable plasma generation and signal detection are needed. In our proposed setup, critical parameters have been improved. We have utilized a stripline split-ring resonator microwave-induced microplasma source to excite and sustain the plasma. Such a microplasma source offers several advantages over conventional ICOGS plasma sources. For example, the stripline split-ring resonator concept employs separated plasma generation and signal detection, which enables sensitive detection at stable plasma conditions. The concept also permits in situ observation of the discharge conditions, which was found to improve reproducibility. Unique to the stripline split-ring resonator microplasma source in this study is that the optogalvanic sample cell has been embedded in the device itself. This integration enables improved temperature control and more stable and accurate signal detection. Significant improvements are demonstrated, including reproducibility, signal-to-noise ratio, and precision.
Kumar, Manoj; Singh, Rajendra; Meena, Anil; Patidar, Bhagwan S; Prasad, Rajendra; Chhabra, Sunil K; Bansal, Surendra K
2017-01-01
The 2-dimensional gel electrophoresis (2-DE) technique is widely used for the analysis of complex protein mixtures extracted from biological samples. It is one of the most commonly used analytical techniques in proteomics to study qualitative and quantitative protein changes between different states of a cell or an organism (eg, healthy and diseased), conditionally expressed proteins, posttranslational modifications, and so on. The 2-DE technique is used for its unparalleled ability to separate thousands of proteins simultaneously. The resolution of the proteins by 2-DE largely depends on the quality of the sample prepared during protein extraction, which improves reproducibility and minimizes protein modifications that may result in artifactual spots on 2-DE gels. The buffer used for the extraction and solubilization of proteins influences the quality and reproducibility of the resolution of proteins on 2-DE gels. Purification with a cleanup kit is another powerful step to prevent the horizontal streaking that occurs during isoelectric focusing due to the presence of contaminants such as salts, lipids, nucleic acids, and detergents. Erythrocyte membrane proteins serve as prototypes for multifunctional proteins in various erythroid and nonerythroid cells. In this study, we therefore optimized the selected major conditions of 2-DE for resolving various proteins of the human erythrocyte membrane. The modifications included the optimization of conditions for sample preparation, cleanup of the protein sample, isoelectric focusing, equilibration, and storage of immobilized pH gradient strips, which were further carefully examined to achieve optimum conditions for improving the quality of protein spots on 2-DE gels. The present improved 2-DE analysis method enabled better detection of protein spots with higher quality and reproducibility.
Therefore, the conditions established in this study may be used for the 2-DE analysis of erythrocyte membrane proteins for different diseases, which may help to identify the proteins that may serve as markers for diagnostics as well as targets for development of new therapeutic potential. PMID:28469466
2016-01-01
Background: A high-quality search strategy is considered an essential component of systematic reviews but many do not contain reproducible search strategies. It is unclear if low reproducibility spans medical disciplines, is affected by librarian/search specialist involvement or has improved with increased awareness of reporting guidelines. Objectives: To examine the reporting of search strategies in systematic reviews published in Pediatrics, Surgery or Cardiology journals in 2012 and determine rates and predictors of including a reproducible search strategy. Methods: We identified all systematic reviews published in 2012 in the ten highest impact factor journals in Pediatrics, Surgery and Cardiology. Each search strategy was coded to indicate what elements were reported and whether the overall search was reproducible. Reporting and reproducibility rates were compared across disciplines and we measured the influence of librarian/search specialist involvement, discipline or endorsement of a reporting guideline on search reproducibility. Results: 272 articles from 25 journals were included. Reporting of search elements ranged widely, from 91% of articles naming search terms to 33% providing a full search strategy and 22% indicating the date the search was executed. Only 22% of articles provided at least one reproducible search strategy and 13% provided a reproducible strategy for all databases searched in the article. Librarians or search specialists were reported as involved in 17% of articles. There were strong disciplinary differences in the reporting of search elements. In the multivariable analysis, only discipline (Pediatrics) was a significant predictor of the inclusion of a reproducible search strategy. Conclusions: Despite recommendations to report full, reproducible search strategies, many articles still do not. In addition, authors often report a single strategy as covering all databases searched, further decreasing reproducibility.
Further research is needed to determine how disciplinary culture may encourage reproducibility and the role that journal editors and peer reviewers could play. PMID:27669416
36 CFR 903.12 - Fees for furnishing and reproducing records.
Code of Federal Regulations, 2010 CFR
2010-07-01
36 CFR Parks, Forests, and Public Property — PENNSYLVANIA AVENUE DEVELOPMENT CORPORATION, PRIVACY ACT, § 903.12 Fees for furnishing and reproducing records. (a) Individuals will not be...
Shah, Rajal B; Leandro, Gioacchino; Romerocaces, Gloria; Bentley, James; Yoon, Jiyoon; Mendrinos, Savvas; Tadros, Yousef; Tian, Wei; Lash, Richard
2016-10-01
One of the major goals of an anatomic pathology laboratory quality program is to minimize unwarranted diagnostic variability and equivocal reporting. This study evaluated the utility of Miraca Life Sciences' "Disease-Focused Diagnostic Review" (DFDR) quality program in improving interobserver diagnostic reproducibility associated with classification of "atypical glands suspicious for adenocarcinoma" (ATYP) in prostate biopsies. Seventy-one selected prostate biopsies with a focus of ATYP were reviewed by 8 pathologists. Participants were blinded to the original diagnosis and were first asked to classify the ATYP as benign, atypical, or limited adenocarcinoma. DFDR comprised a "theoretical consensus" (in which pathologists first reached consensus on the morphological features they considered relevant for the diagnosis of limited prostatic adenocarcinoma), a didactic review including relevant literature, and "practical consensus" (pathologists performed joint microscopic sessions, reconciling each other's observations and positions evaluating a separate unique slide set). Participants were finally asked to reclassify the original 71 ATYP cases based on knowledge gleaned from DFDR. Pre- and post-DFDR interobserver reproducibility of overall diagnostic agreement was assessed. Interobserver reproducibility measured by Fleiss κ values of pre- and post-DFDR was 0.36 and 0.59, respectively (P=.006). Post-DFDR, there were significant improvement for "100% concordance" (P=.011) and reduction for "no consensus" (P=.0004) categories. Despite a lower pre-DFDR reproducibility for non-uropathology fellowship-trained (n=3, κ=0.38) versus uropathology fellowship-trained (n=5, κ=0.43) pathologists, both groups achieved similarly high post-DFDR κ levels (κ=0.58 and 0.56, respectively). DFDR represents an effective tool to formally achieve diagnostic consensus and reduce variability associated with critical diagnoses in an anatomic pathology practice. Copyright © 2016 Elsevier Inc. 
All rights reserved.
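Fleiss' κ for multiple fixed raters, as used above, can be computed directly from the per-case category counts. A minimal sketch with hypothetical data (5 cases, 8 raters, 3 diagnostic categories; not the study's cases):

```python
def fleiss_kappa(ratings):
    """ratings: per-case category counts, e.g. [3, 4, 1] means 3 raters
    called the case benign, 4 atypical, 1 limited adenocarcinoma."""
    N = len(ratings)                    # number of cases
    n = sum(ratings[0])                 # raters per case (constant by design)
    k = len(ratings[0])                 # number of categories
    # Marginal proportion of each category across all ratings:
    p_j = [sum(case[j] for case in ratings) / (N * n) for j in range(k)]
    # Per-case agreement among the n raters:
    P_i = [(sum(c * c for c in case) - n) / (n * (n - 1)) for case in ratings]
    P_bar = sum(P_i) / N
    P_e = sum(p * p for p in p_j)       # chance agreement
    return (P_bar - P_e) / (1 - P_e)

# Hypothetical counts: 5 cases rated by 8 pathologists into 3 categories.
cases = [
    [8, 0, 0],
    [0, 8, 0],
    [4, 4, 0],
    [0, 2, 6],
    [1, 6, 1],
]
kappa = fleiss_kappa(cases)
print(round(kappa, 3))
```

Pre- and post-intervention κ values like the study's 0.36 and 0.59 would be obtained by running this on the same cases rated before and after the review program.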
Reproducibility of the anti-Factor Xa and anti-Factor IIa assays applied to enoxaparin solution.
Martinez, Céline; Savadogo, Adama; Agut, Christophe; Anger, Pascal
2013-01-01
Enoxaparin is a widely used subcutaneously administered antithrombotic agent comprising a complex mixture of glycosaminoglycan chains. Owing to this complexity, its antithrombotic potency cannot be defined by physicochemical methods and is therefore evaluated using an enzymatic assay of anti-Xa and anti-IIa activity. Maintaining consistent anti-Xa activity in the final medicinal product allows physicians to ensure administration of the appropriate dosage to their patients. Bioassays are usually complex and display poorer reproducibility than physicochemical tests such as HPLC assays. Here, we describe the implementation of a common robotic platform and standard release potency testing procedures for enoxaparin sodium injection (Lovenox, Sanofi, Paris, France) products at seven quality control sites within Sanofi. Qualification and analytical procedures, as well as data handling, were optimized and harmonized to improve assay reproducibility. An inter-laboratory study was performed in routine-release conditions. The coefficients of variation for repeatability and reproducibility in assessments of anti-Xa activity were 1.0% and 1.2%, respectively. The tolerance interval in reproducibility precision conditions, expressed as percentage potency, was 96.8-103.2% of the drug product target of 10,000 IU/ml, comparing favorably with the United States of America Pharmacopeia specification (90-110%). The maximum difference between assays in two different laboratories is expected to be 4.1%. The reproducibility characteristics of anti-IIa activity assessments were found to be similar. These results demonstrate the effectiveness of the standardization process established and allow for further improvements to quality control in Lovenox manufacture. This process guarantees closeness between actual and target potencies, as exemplified by the results of release assays obtained during a three-year period. Copyright © 2013 Elsevier B.V. All rights reserved.
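Repeatability and reproducibility coefficients of variation of the kind reported above are commonly derived from a one-way ANOVA across laboratories (ISO 5725-style variance components). A sketch with hypothetical potency results (three labs, three replicates each; not the study's data):

```python
def precision_cvs(labs):
    """labs: one list of replicate results per laboratory (balanced design).
    Returns (repeatability CV %, reproducibility CV %) from one-way ANOVA
    variance components, in the spirit of ISO 5725."""
    p, n = len(labs), len(labs[0])
    grand = sum(sum(lab) for lab in labs) / (p * n)
    means = [sum(lab) / n for lab in labs]
    ms_within = sum(sum((x - m) ** 2 for x in lab)
                    for lab, m in zip(labs, means)) / (p * (n - 1))
    ms_between = n * sum((m - grand) ** 2 for m in means) / (p - 1)
    s_r2 = ms_within                                  # repeatability variance
    s_l2 = max(0.0, (ms_between - ms_within) / n)     # between-lab component
    s_R2 = s_r2 + s_l2                                # reproducibility variance
    return 100 * s_r2 ** 0.5 / grand, 100 * s_R2 ** 0.5 / grand

# Hypothetical anti-Xa potencies (IU/ml) from three labs, three replicates each:
labs = [[9990.0, 10010.0, 10000.0],
        [10050.0, 10070.0, 10060.0],
        [9940.0, 9960.0, 9950.0]]
cv_r, cv_R = precision_cvs(labs)
print(cv_r, cv_R)
```

As in the study, the reproducibility CV is at least as large as the repeatability CV because it adds the between-laboratory variance component.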
Finding the right wheel when you don't want to reinvent it
NASA Astrophysics Data System (ADS)
Hucka, Michael
2017-01-01
The increasing amount of software being developed in all areas of science brings new capabilities as well as new challenges. Two of these challenges are finding potentially-relevant software, and being able to reuse it. The notion that "surely someone must have written a tool to do XYZ" often runs into the reality of thousands of Google hits with little detail about capabilities and status of different options. Software directories such as ASCL can add tremendous value by helping to improve the signal-to-noise ratio when searching for software; in addition, developers themselves can also act to make their work more easily found and understood. In this context, it can be useful to know what people do in practice when they look for software, and some of the factors that help or hinder their ability to reuse the software they do find. The results point to some simple steps that developers can take. Improved findability and reusability of software has broad potential impact, ranging from improved reproducibility of research results to better return on investment by funding agencies.
Developing a multidisciplinary robotic surgery quality assessment program.
Gonsenhauser, Iahn; Abaza, Ronney; Mekhjian, Hagop; Moffatt-Bruce, Susan D
2012-01-01
The objective of this study was to test the feasibility of a novel quality-improvement (QI) program designed to incorporate multiple robotic surgical sub-specialties in one health care system. A robotic surgery quality assessment program was developed by The Ohio State University College of Medicine (OSUMC) in conjunction with The Ohio State University Medical Center Quality Improvement and Operations Department. A retrospective review of cases was performed using data interrogated from the OSUMC Information Warehouse from January 2007 through August 2009. Robotic surgery cases (n=2200) were assessed for operative times, length of stay (LOS), conversions, returns to surgery, readmissions and cancellations as potential quality indicators. An actionable and reproducible framework for the quality measurement and assessment of a multidisciplinary and interdepartmental robotic surgery program was successfully completed demonstrating areas for improvement opportunities. This report supports that standard quality indicators can be applied to multiple specialties within a health care system to develop a useful quality tracking and assessment tool in the highly specialized area of robotic surgery. © 2012 National Association for Healthcare Quality.
Coronary Artery Calcium Scoring: Is It Time for a Change in Methodology?
Blaha, Michael J; Mortensen, Martin Bødtker; Kianoush, Sina; Tota-Maharaj, Rajesh; Cainzos-Achirica, Miguel
2017-08-01
Quantification of coronary artery calcium (CAC) has been shown to be reliable, reproducible, and predictive of cardiovascular risk. Formal CAC scoring was introduced in 1990, with early scoring algorithms notable for their simplicity and elegance. Yet, with little evidence available on how to best build a score, and without a conceptual model guiding score development, these scores were, to a large degree, arbitrary. In this review, we describe the traditional approaches for clinical CAC scoring, noting their strengths, weaknesses, and limitations. We then discuss a conceptual model for developing an improved CAC score, reviewing the evidence supporting approaches most likely to lead to meaningful score improvement (for example, accounting for CAC density and regional distribution). After discussing the potential implementation of an improved score in clinical practice, we follow with a discussion of the future of CAC scoring, asking the central question: do we really need a new CAC score? Copyright © 2017 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
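For context, the classic 1990 scoring algorithm referred to above (the Agatston score) sums lesion area times a stepped density weight. A minimal sketch with hypothetical lesions (the thresholds follow the conventional 130/200/300/400 HU steps; slice thickness and acquisition details are omitted):

```python
def density_weight(max_hu):
    """Agatston-style density factor from a lesion's peak attenuation (HU)."""
    if max_hu >= 400:
        return 4
    if max_hu >= 300:
        return 3
    if max_hu >= 200:
        return 2
    if max_hu >= 130:
        return 1
    return 0

def agatston(lesions):
    """lesions: (area_mm2, max_hu) pairs; lesions below the 130-HU
    threshold do not count toward the score."""
    return sum(area * density_weight(hu) for area, hu in lesions if hu >= 130)

# Three hypothetical lesions; the 95-HU focus is below threshold and ignored.
score = agatston([(4.0, 150), (2.5, 420), (1.2, 95)])
print(score)
```

The stepped weight illustrates one of the review's criticisms: a small change in peak attenuation across a threshold changes the score discontinuously, whereas proposed improvements treat density continuously.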
An interlaboratory study of TEX86 and BIT analysis of sediments, extracts, and standard mixtures
NASA Astrophysics Data System (ADS)
Schouten, Stefan; Hopmans, Ellen C.; Rosell-Melé, Antoni; Pearson, Ann; Adam, Pierre; Bauersachs, Thorsten; Bard, Edouard; Bernasconi, Stefano M.; Bianchi, Thomas S.; Brocks, Jochen J.; Carlson, Laura Truxal; Castañeda, Isla S.; Derenne, Sylvie; Selver, Ayça. Doǧrul; Dutta, Koushik; Eglinton, Timothy; Fosse, Celine; Galy, Valier; Grice, Kliti; Hinrichs, Kai-Uwe; Huang, Yongsong; Huguet, Arnaud; Huguet, Carme; Hurley, Sarah; Ingalls, Anitra; Jia, Guodong; Keely, Brendan; Knappy, Chris; Kondo, Miyuki; Krishnan, Srinath; Lincoln, Sara; Lipp, Julius; Mangelsdorf, Kai; Martínez-García, Alfredo; Ménot, Guillemette; Mets, Anchelique; Mollenhauer, Gesine; Ohkouchi, Naohiko; Ossebaar, Jort; Pagani, Mark; Pancost, Richard D.; Pearson, Emma J.; Peterse, Francien; Reichart, Gert-Jan; Schaeffer, Philippe; Schmitt, Gaby; Schwark, Lorenz; Shah, Sunita R.; Smith, Richard W.; Smittenberg, Rienk H.; Summons, Roger E.; Takano, Yoshinori; Talbot, Helen M.; Taylor, Kyle W. R.; Tarozo, Rafael; Uchida, Masao; van Dongen, Bart E.; Van Mooy, Benjamin A. S.; Wang, Jinxiang; Warren, Courtney; Weijers, Johan W. H.; Werne, Josef P.; Woltering, Martijn; Xie, Shucheng; Yamamoto, Masanobu; Yang, Huan; Zhang, Chuanlun L.; Zhang, Yige; Zhao, Meixun; Damsté, Jaap S. Sinninghe
2013-12-01
Two commonly used proxies based on the distribution of glycerol dialkyl glycerol tetraethers (GDGTs) are the TEX86 (TetraEther indeX of 86 carbon atoms) paleothermometer for sea surface temperature reconstructions and the BIT (Branched Isoprenoid Tetraether) index for reconstructing soil organic matter input to the ocean. An initial round-robin study of two sediment extracts, in which 15 laboratories participated, showed relatively consistent TEX86 values (reproducibility ±3-4°C when translated to temperature) but a large spread in BIT measurements (reproducibility ±0.41 on a scale of 0-1). Here we report results of a second round-robin study with 35 laboratories in which three sediments, one sediment extract, and two mixtures of pure, isolated GDGTs were analyzed. The results for TEX86 and BIT index showed improvement compared to the previous round-robin study. The reproducibility, indicating interlaboratory variation, of TEX86 values ranged from 1.3 to 3.0°C when translated to temperature. These results are similar to those of other temperature proxies used in paleoceanography. Comparison of the results obtained from one of the three sediments showed that TEX86 and BIT indices are not significantly affected by interlaboratory differences in sediment extraction techniques. BIT values of the sediments and extracts were at the extremes of the index with values close to 0 or 1, and showed good reproducibility (ranging from 0.013 to 0.042). However, the measured BIT values for the two GDGT mixtures, with known molar ratios of crenarchaeol and branched GDGTs, had intermediate BIT values and showed poor reproducibility and a large overestimation of the "true" (i.e., molar-based) BIT index. The latter is likely due to, among other factors, the higher mass spectrometric response of branched GDGTs compared to crenarchaeol, which also varies among mass spectrometers. 
Correction for this different mass spectrometric response showed a considerable improvement in the reproducibility of BIT index measurements among laboratories, as well as a substantially improved estimation of molar-based BIT values. This suggests that standard mixtures should be used in order to obtain consistent, and molar-based, BIT values.
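The response-factor correction discussed above can be sketched as scaling the branched-GDGT peak areas before forming the BIT ratio. The numbers below are hypothetical, including the assumed 2:1 response of branched GDGTs relative to crenarchaeol:

```python
def bit_index(br_i, br_ii, br_iii, cren, branched_rf=1.0):
    """BIT index as a ratio of branched GDGTs to branched + crenarchaeol.
    branched_rf is the assumed relative MS response of branched GDGTs
    versus crenarchaeol; dividing the branched areas by it approximates
    molar amounts (rf = 1.0 leaves peak areas uncorrected)."""
    branched = (br_i + br_ii + br_iii) / branched_rf
    return branched / (branched + cren)

# Hypothetical peak areas for a 1:1 molar branched:crenarchaeol mixture in
# which branched GDGTs respond twice as strongly in the mass spectrometer:
raw = bit_index(60.0, 80.0, 60.0, 100.0)                       # overestimates
corrected = bit_index(60.0, 80.0, 60.0, 100.0, branched_rf=2.0)
print(raw, corrected)
```

Because the response factor varies among instruments, each laboratory would calibrate branched_rf against a standard mixture of known molar composition, as the study proposes.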
Faber, Irene R; Nijhuis-Van Der Sanden, Maria W G; Elferink-Gemser, Marije T; Oosterveld, Frits G J
2015-01-01
A motor skills assessment could be helpful in talent development by estimating essential perceptuo-motor skills of young players, which are considered requisite to develop excellent technical and tactical qualities. The Netherlands Table Tennis Association uses a motor skills assessment in their talent development programme consisting of eight items measuring perceptuo-motor skills specific to table tennis under varying conditions. This study aimed to investigate this assessment regarding its reproducibility, internal consistency, underlying dimensions and concurrent validity in 113 young table tennis players (6-10 years). Intraclass correlation coefficients of six test items met the criteria of 0.7 with coefficients of variation between 3% and 8%. Cronbach's alpha valued 0.853 for internal consistency. The principal components analysis distinguished two conceptually meaningful factors: "ball control" and "gross motor function." Concurrent validity analyses demonstrated moderate associations between the motor skills assessment's results and national ranking; boys r = -0.53 (P < 0.001) and girls r = -0.45 (P = 0.015). In conclusion, this evaluation demonstrated six test items with acceptable reproducibility, good internal consistency and good prospects for validity. Two test items need revision to upgrade reproducibility. Since the motor skills assessment seems to be a reproducible, objective part of a talent development programme, more longitudinal studies are required to investigate its predictive validity.
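Cronbach's alpha for the internal consistency of a multi-item assessment, as reported above, is straightforward to compute from per-item scores. A sketch with hypothetical scores (three items, five players; not the study's data):

```python
def cronbach_alpha(items):
    """items: one list of scores per test item, all over the same players
    in the same order."""
    k = len(items)

    def var(xs):                      # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(scores) for scores in zip(*items)]
    return k / (k - 1) * (1 - sum(var(item) for item in items) / var(totals))

# Hypothetical item scores for five players on three test items:
items = [[10, 12, 14, 16, 18],
         [11, 12, 15, 15, 19],
         [9, 13, 14, 17, 17]]
alpha = cronbach_alpha(items)
print(round(alpha, 3))
```

Values around 0.85, as in the study, indicate that the items consistently measure a common underlying construct.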
Elsen, A.; Lens, K.; Nguyet, D. T. M.; Broos, S.; Stoffelen, R.; De Waele, D.
2001-01-01
Radopholus similis is one of the most damaging nematodes in bananas. Chemical control is currently the most-used method, but nematode control through genetic improvement is widely encouraged. The objective of this study was to establish an aseptic culture system for R. similis and determine whether R. similis can infect and reproduce on in vitro banana plantlets and in vitro Arabidopsis thaliana. In the study's first part, a suitable aseptic culture system was developed using alfalfa callus. Radopholus similis could penetrate and reproduce in the callus. Six weeks after inoculation with 25 females, the reproduction ratio was 26.3 and all vermiform stages were present. The reproduction ratio increased to 223.2 after 12 weeks. Results of a greenhouse test showed that R. similis did not lose its pathogenicity after culturing on alfalfa callus. In the study's second part, the infection and reproduction of the nematodes cultured on the callus were studied on both in vitro banana plantlets and A. thaliana. Radopholus similis infected and reproduced on both banana and A. thaliana. Furthermore, nematode damage was observed in the root systems of both hosts. These successful infections open new perspectives for rapid in vitro screening for resistance in banana cultivars and anti-nematode proteins expressed in A. thaliana. PMID:19266012
Evolvix BEST Names for semantic reproducibility across code2brain interfaces
Scheuer, Katherine S.; Keel, Seth A.; Vyas, Vaibhav; Liblit, Ben; Hanlon, Bret; Ferris, Michael C.; Yin, John; Dutra, Inês; Pietsch, Anthony; Javid, Christine G.; Moog, Cecilia L.; Meyer, Jocelyn; Dresel, Jerdon; McLoone, Brian; Loberger, Sonya; Movaghar, Arezoo; Gilchrist‐Scott, Morgaine; Sabri, Yazeed; Sescleifer, Dave; Pereda‐Zorrilla, Ivan; Zietlow, Andrew; Smith, Rodrigo; Pietenpol, Samantha; Goldfinger, Jacob; Atzen, Sarah L.; Freiberg, Erika; Waters, Noah P.; Nusbaum, Claire; Nolan, Erik; Hotz, Alyssa; Kliman, Richard M.; Mentewab, Ayalew; Fregien, Nathan; Loewe, Martha
2016-01-01
Names in programming are vital for understanding the meaning of code and big data. We define code2brain (C2B) interfaces as maps in compilers and brains between meaning and naming syntax, which help to understand executable code. While working toward an Evolvix syntax for general‐purpose programming that makes accurate modeling easy for biologists, we observed how names affect C2B quality. To protect learning and coding investments, C2B interfaces require long‐term backward compatibility and semantic reproducibility (accurate reproduction of computational meaning from coder‐brains to reader‐brains by code alone). Semantic reproducibility is often assumed until confusing synonyms degrade modeling in biology to deciphering exercises. We highlight empirical naming priorities from diverse individuals and roles of names in different modes of computing to show how naming easily becomes impossibly difficult. We present the Evolvix BEST (Brief, Explicit, Summarizing, Technical) Names concept for reducing naming priority conflicts, test it on a real challenge by naming subfolders for the Project Organization Stabilizing Tool system, and provide naming questionnaires designed to facilitate C2B debugging by improving names used as keywords in a stabilizing programming language. Our experiences inspired us to develop Evolvix using a flipped programming language design approach with some unexpected features and BEST Names at its core. PMID:27918836
Spatial and temporal variability in response to hybrid electro-optical stimulation
NASA Astrophysics Data System (ADS)
Duke, Austin R.; Lu, Hui; Jenkins, Michael W.; Chiel, Hillel J.; Jansen, E. Duco
2012-06-01
Hybrid electro-optical neural stimulation is a novel paradigm combining the advantages of optical and electrical stimulation techniques while reducing their respective limitations. However, in order to fulfill its promise, this technique requires reduced variability and improved reproducibility. Here we used a comparative physiological approach to aid the further development of this technique by identifying the spatial and temporal factors characteristic of hybrid stimulation that may contribute to experimental variability and/or a lack of reproducibility. Using transient pulses of infrared light delivered simultaneously with a bipolar electrical stimulus in either the marine mollusk Aplysia californica buccal nerve or the rat sciatic nerve, we determined the existence of a finite region of excitability with size altered by the strength of the optical stimulus and recruitment dictated by the polarity of the electrical stimulus. Hybrid stimulation radiant exposures yielding 50% probability of firing (RE50) were shown to be negatively correlated with the underlying changes in electrical stimulation threshold over time. In Aplysia, but not in the rat sciatic nerve, increasing optical radiant exposures (J cm⁻²) beyond the RE50 ultimately resulted in inhibition of evoked potentials. Accounting for the sources of variability identified in this study increased the reproducibility of stimulation from 35% to 93% in Aplysia and 23% to 76% in the rat with reduced variability.
NASA Astrophysics Data System (ADS)
Biallas, Martin; Trajkovic, Ivo; Hagmann, Cornelia; Scholkmann, Felix; Jenny, Carmen; Holper, Lisa; Beck, Andreas; Wolf, Martin
2012-08-01
In this study 14 healthy term newborns (postnatal mean age 2.1 days) underwent photic stimulation during sleep on two different days. Near-infrared spectroscopy (NIRS) and electroencephalography (EEG) were acquired simultaneously. The aims of the study were to determine (i) the sensitivity and (ii) the repeatability of NIRS to detect the hemodynamic response, (iii) the sensitivity and (iv) the repeatability of EEG to detect a visual evoked potential (VEP), (v) to analyze optical data for the optical neuronal signal, and (vi) to test whether inadequate stimulation could be the reason for absent hemodynamic responses. The results of the study were as follows. (i) Sensitivity of NIRS was 61.5% to detect hemodynamic responses; (ii) their reproducibility was 41.7%. A VEP was detected (iii) in 96.3% of all subjects with (iv) a reproducibility of 92.3%. (v) In two measurements data met the criteria for an optical neuronal signal. The noise level was 9.6×10⁻⁵% change in optical density. (vi) Insufficient stimulation was excluded as the reason for absent hemodynamic responses. We conclude that NIRS is a promising tool to study cognitive activation and development of the brain. For clinical application, however, the sensitivity and reproducibility on an individual level need to be improved.
Ujiie, Hideki; Kato, Tatsuya; Hu, Hsin-Pei; Bauer, Patrycja; Patel, Priya; Wada, Hironobu; Lee, Daiyoon; Fujino, Kosuke; Schieman, Colin; Pierre, Andrew; Waddell, Thomas K; Keshavjee, Shaf; Darling, Gail E; Yasufuku, Kazuhiro
2017-06-01
Surgical trainees are required to develop competency in a variety of laparoscopic operations. Developing laparoscopic technical skills can be difficult as there has been a decrease in the number of procedures performed. This study aims to develop an inexpensive and anatomically relevant model for training in laparoscopic foregut procedures. An ex vivo, anatomic model of the human upper abdomen was developed using intact porcine esophagus, stomach, diaphragm and spleen. The Toronto lap-Nissen simulator was contained in a laparoscopic box-trainer and included an arch system to simulate the normal radial shape and tension of the diaphragm. We integrated the use of this training model as a part of our laparoscopic skills laboratory-training curriculum. Afterwards, we surveyed trainees to evaluate the observed benefit of the learning session. Twenty-five trainees and five faculty members completed a survey regarding the use of this model. Among the trainees, only 4 (16%) had experience with laparoscopic Heller myotomy and Nissen fundoplication. They reported that practicing with the model was a valuable use of their limited time, repeating the exercise would be of additional benefit, and that the exercise improved their ability to perform or assist in an actual case in the operating room. Significant improvements were found in the following subjective measures comparing pre- vs. post-training: (I) knowledge level (5.6 vs. 8.0, P<0.001); (II) comfort level in assisting (6.3 vs. 7.6, P<0.001); and (III) comfort level in performing as the primary surgeon (4.9 vs. 7.1, P<0.001). The trainees and faculty members agreed that this model was of adequate fidelity and was a representative simulation of actual human anatomy. We developed an easily reproducible training model for laparoscopic procedures. This simulator reproduces human anatomy and increases the trainees' comfort level in performing and assisting with myotomy and fundoplication.
NASA Astrophysics Data System (ADS)
Bandaragoda, C.; Castronova, A. M.; Phuong, J.; Istanbulluoglu, E.; Strauch, R. L.; Nudurupati, S. S.; Tarboton, D. G.; Wang, S. W.; Yin, D.; Barnhart, K. R.; Tucker, G. E.; Hutton, E.; Hobley, D. E. J.; Gasparini, N. M.; Adams, J. M.
2017-12-01
The ability to test hypotheses about hydrology, geomorphology and atmospheric processes is invaluable to research in the era of big data. Although community resources are available, there remain significant educational, logistical and time investment barriers to their use. Knowledge infrastructure is an emerging intellectual framework to understand how people are creating, sharing and distributing knowledge, a process that has been dramatically transformed by Internet technologies. In addition to the technical and social components of a cyberinfrastructure system, knowledge infrastructure considers the educational, institutional, and open-source governance components required to advance knowledge. We are designing an infrastructure environment that lowers common barriers to reproducing modeling experiments for earth surface investigation. Landlab is an open-source modeling toolkit for building, coupling, and exploring two-dimensional numerical models. HydroShare is an online collaborative environment for sharing hydrologic data and models. CyberGIS-Jupyter is an innovative cyberGIS framework for achieving data-intensive, reproducible, and scalable geospatial analytics using the Jupyter Notebook, based on ROGER, the first cyberGIS supercomputer, so that models can be elastically reproduced through cloud computing approaches. Our team of geomorphologists, hydrologists, and computer geoscientists has created a new infrastructure environment that combines these three pieces of software to enable knowledge discovery. Through this novel integration, any user can interactively execute and explore their shared data and model resources. Landlab on HydroShare with CyberGIS-Jupyter supports the modeling continuum from fully developed modeling applications to prototyping new science tools, hands-on research demonstrations for training workshops, and classroom applications.
Computational geospatial models based on big data and high performance computing can now be more efficiently developed, improved, scaled, and seamlessly reproduced among multidisciplinary users, thereby expanding the active learning curriculum and research opportunities for students in earth surface modeling and informatics.
Synthetic Biology Open Language (SBOL) Version 2.1.0.
Beal, Jacob; Cox, Robert Sidney; Grünberg, Raik; McLaughlin, James; Nguyen, Tramy; Bartley, Bryan; Bissell, Michael; Choi, Kiri; Clancy, Kevin; Macklin, Chris; Madsen, Curtis; Misirli, Goksel; Oberortner, Ernst; Pocock, Matthew; Roehner, Nicholas; Samineni, Meher; Zhang, Michael; Zhang, Zhen; Zundel, Zach; Gennari, John H; Myers, Chris; Sauro, Herbert; Wipat, Anil
2016-09-01
Synthetic biology builds upon the techniques and successes of genetics, molecular biology, and metabolic engineering by applying engineering principles to the design of biological systems. The field still faces substantial challenges, including long development times, high rates of failure, and poor reproducibility. One method to ameliorate these problems would be to improve the exchange of information about designed systems between laboratories. The Synthetic Biology Open Language (SBOL) has been developed as a standard to support the specification and exchange of biological design information in synthetic biology, filling a need not satisfied by other pre-existing standards. This document details version 2.1 of SBOL that builds upon version 2.0 published in last year's JIB special issue. In particular, SBOL 2.1 includes improved rules for what constitutes a valid SBOL document, new role fields to simplify the expression of sequence features and how components are used in context, and new best practices descriptions to improve the exchange of basic sequence topology information and the description of genetic design provenance, as well as miscellaneous other minor improvements.
Model improvements to simulate charging in SEM
NASA Astrophysics Data System (ADS)
Arat, K. T.; Klimpel, T.; Hagen, C. W.
2018-03-01
Charging of insulators is a complex phenomenon to simulate since the accuracy of the simulations is very sensitive to the interaction of electrons with matter and electric fields. In this study, we report model improvements for a previously developed Monte-Carlo simulator to more accurately simulate samples that charge. The improvements include both the modelling of low energy electron scattering and the charging of insulators. The new first-principles scattering models provide a more realistic charge distribution cloud in the material, and a better match between non-charging simulations and experimental results. Improvements to the charging models mainly focus on the redistribution of charge carriers in the material via electron-beam-induced conductivity (EBIC) and a breakdown model, leading to a smoother distribution of the charges. Combined with a more accurate tracing of low energy electrons in the electric field, we managed to reproduce the dynamically changing charging contrast due to an induced positive surface potential.
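As a rough illustration of the Monte-Carlo style of electron-matter simulation described above, the sketch below traces electrons through a solid using exponential free paths, isotropic rescattering, and a fixed fractional energy loss per collision. Every parameter and modeling choice here is a toy assumption for illustration; the simulator in the abstract uses physical scattering cross sections, charge transport, and field tracing that this sketch omits.

```python
import math
import random

def simulate_penetration(n_electrons=2000, e0=1.0, mfp=1.0, loss=0.1, seed=1):
    """Toy 2-D Monte-Carlo of electron penetration in a solid: exponential
    free paths, isotropic rescattering, and a fixed fractional energy loss
    per collision. An electron stops when it falls below 5% of the landing
    energy or leaves the surface (z < 0, i.e. is backscattered)."""
    rng = random.Random(seed)
    depths = []
    for _ in range(n_electrons):
        x = z = max_z = 0.0
        theta, e = 0.0, e0          # theta = 0 points into the sample
        while e > 0.05 * e0 and z >= 0.0:
            step = -mfp * math.log(1.0 - rng.random())  # exponential free path
            x += step * math.sin(theta)
            z += step * math.cos(theta)
            max_z = max(max_z, z)
            theta = rng.uniform(-math.pi, math.pi)      # isotropic rescatter
            e *= 1.0 - loss                             # constant fractional loss
        depths.append(max_z)
    return sum(depths) / len(depths)

mean_depth = simulate_penetration()  # mean maximum depth, in units of mfp
```

Varying `loss` or `mfp` changes the simulated interaction volume, which is the kind of sensitivity a charging simulator must capture faithfully.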
Pan, Qing; Yao, Jialiang; Wang, Ruofan; Cao, Ping; Ning, Gangmin; Fang, Luping
2017-08-01
The vessels in the microcirculation keep adjusting their structure to meet the functional requirements of the different tissues. A previously developed theoretical model can reproduce the process of vascular structural adaptation to help the study of microcirculatory physiology. However, until now, such a model has lacked appropriate methods for setting its parameters, which has limited its further application. This study proposed an improved quantum-behaved particle swarm optimization (QPSO) algorithm for setting the parameter values in this model. The optimization was performed on a real mesenteric microvascular network of the rat. The results showed that the improved QPSO was superior to the standard particle swarm optimization, the standard QPSO and the previously reported Downhill algorithm. We conclude that the improved QPSO leads to a better agreement between mathematical simulation and animal experiment, rendering the model more reliable in future physiological studies.
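For readers unfamiliar with swarm-based parameter fitting, the sketch below implements plain particle swarm optimization on a toy objective. It is the standard PSO, not the improved quantum-behaved (QPSO) variant the study proposes, and the quadratic objective with its `target` parameters is invented purely for illustration.

```python
import random

def pso_fit(objective, bounds, n_particles=20, n_iters=100,
            w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize `objective` over the box `bounds` with standard PSO."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # each particle's best position
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy stand-in for a model-fitting objective: recover parameters that best
# match a hypothetical observed response (names and values are illustrative).
target = (1.5, -0.5)
best, err = pso_fit(lambda p: (p[0] - target[0])**2 + (p[1] - target[1])**2,
                    bounds=[(-2.0, 2.0), (-2.0, 2.0)])
```

In the study's setting, `objective` would score the mismatch between the adaptation model's simulated network and the measured mesenteric network rather than a simple quadratic.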
High-density arrays of x-ray microcalorimeters for Constellation-X
NASA Astrophysics Data System (ADS)
Kilbourne, Caroline A.; Bandler, Simon R.; Brown, Ari D.; Chervenak, James A.; Figueroa-Feliciano, Enectali; Finkbeiner, Fred M.; Iyomoto, Naoko; Kelley, Richard L.; Porter, F. Scott; Saab, Tarek; Sadleir, John; White, Jennifer
2006-06-01
We have been developing x-ray microcalorimeters for the Constellation-X mission. Devices based on superconducting transition-edge sensors (TES) have demonstrated the potential to meet the Constellation-X requirements for spectral resolution, speed, and array scale (> 1000 pixels) in a close-packed geometry. In our part of the GSFC/NIST collaboration on this technology development, we have been concentrating on the fabrication of arrays of pixels suitable for the Constellation-X reference configuration. We have fabricated 8x8 arrays with 0.25-mm pixels arranged with 92% fill factor. The pixels are based on Mo/Au TES and Bi/Cu or Au/Bi absorbers. We have achieved a resolution of 4.0 eV FWHM at 6 keV in such devices, which meets the Constellation-X resolution requirement at 6 keV. Studies of the thermal transport in our Bi/Cu absorbers have shown that, while there is room for improvement, for 0.25-mm pixels the standard absorber design is adequate to avoid unacceptable line-broadening from position dependence caused by thermal diffusion. In order to improve reproducibility and to push closer to the 2-eV goal at 6 keV, however, we are refining the design of the TES and the interface to the absorber. Recent efforts to introduce a barrier layer between the Bi and the Mo/Au to avoid variable interface chemistry and thus improve the reproducibility of device characteristics have thus far yielded unsatisfactory results. However, we have developed a new set of absorber designs with contacts to the TES engineered to allow contact only in regions that do not serve as the active thermometer. We have further constrained the design so that a low-resistance absorber will not electrically short the TES. It is with such a design that we have achieved 4.0 eV resolution at 6 keV.
Hackley, Paul C.
2014-01-01
Vitrinite reflectance generally is considered the most robust thermal maturity parameter available for application to hydrocarbon exploration and petroleum system evaluation. However, until 2011 there was no standardized methodology available to provide guidelines for vitrinite reflectance measurements in shale. Efforts to correct this deficiency resulted in publication of ASTM D7708-11: Standard test method for microscopical determination of the reflectance of vitrinite dispersed in sedimentary rocks. In 2012-2013, an interlaboratory exercise was conducted to establish precision limits for the measurement technique. Six samples, representing a wide variety of shale, were tested in duplicate by 28 analysts in 22 laboratories from 14 countries. Samples ranged from immature to overmature (Ro 0.31-1.53%), from organic-rich to organic-lean (1-22 wt.% total organic carbon), and contained Type I (lacustrine), Type II (marine), and Type III (terrestrial) kerogens. Repeatability values (difference between repetitive results from same operator, same conditions) ranged from 0.03-0.11% absolute reflectance, whereas reproducibility values (difference between results obtained on same test material by different operators, different laboratories) ranged from 0.12-0.54% absolute reflectance. Repeatability and reproducibility degraded consistently with increasing maturity and decreasing organic content. However, samples with terrestrial kerogens (Type III) fell off this trend, showing improved levels of reproducibility due to higher vitrinite content and improved ease of identification. Operators did not consistently meet the reporting requirements of the test method, indicating that a common reporting template is required to improve data quality. The most difficult problem encountered was the petrographic distinction of solid bitumens and low-reflecting inert macerals from vitrinite when vitrinite occurred with reflectance ranges overlapping the other components. 
Discussion among participants suggested this problem could not be corrected via kerogen concentration or solvent extraction and is related to operator training and background. Poor reproducibility (0.54% absolute reflectance, related to increased anisotropy?) in the highest maturity sample (Ro 1.53%) suggests that vitrinite reflectance is not a highly reliable parameter in such rocks. Future work will investigate opportunities to improve reproducibility in similar high maturity, organic-lean shale varieties.
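The repeatability and reproducibility values quoted above follow the usual interlaboratory-study definitions. As a hedged sketch, the code below computes ISO 5725-style repeatability and reproducibility limits (2.8 times the respective standard deviations) from duplicate reflectance readings; all numbers are hypothetical and are not the study's data.

```python
from statistics import mean, pstdev

def repeatability_limit(duplicates):
    """Repeatability limit r = 2.8 * s_r, with the repeatability standard
    deviation pooled from duplicate pairs: s_r**2 = mean(d**2) / 2 where
    d = x1 - x2 (simplified ISO 5725-style estimate)."""
    s_r2 = mean((a - b) ** 2 for a, b in duplicates) / 2.0
    return 2.8 * s_r2 ** 0.5

def reproducibility_limit(lab_means, s_r2):
    """Reproducibility limit R = 2.8 * s_R with s_R**2 = s_L**2 + s_r**2,
    where s_L is the between-laboratory standard deviation."""
    s_L2 = pstdev(lab_means) ** 2
    return 2.8 * (s_L2 + s_r2) ** 0.5

# Hypothetical duplicate vitrinite reflectance readings (%Ro) from one analyst:
dups = [(0.95, 0.99), (1.02, 0.98), (0.97, 1.01)]
s_r2 = mean((a - b) ** 2 for a, b in dups) / 2.0
r = repeatability_limit(dups)                        # within-lab limit
R = reproducibility_limit([0.95, 1.05, 1.00], s_r2)  # across three labs
```

Because R folds in the between-laboratory variance on top of the within-laboratory variance, R is always at least as large as r, matching the pattern in the reported 0.03-0.11% versus 0.12-0.54% ranges.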
Ontology for the asexual development and anatomy of the colonial chordate Botryllus schlosseri.
Manni, Lucia; Gasparini, Fabio; Hotta, Kohji; Ishizuka, Katherine J; Ricci, Lorenzo; Tiozzo, Stefano; Voskoboynik, Ayelet; Dauga, Delphine
2014-01-01
Ontologies provide an important resource to integrate information. For developmental biology and comparative anatomy studies, ontologies of a species are used to formalize and annotate data that are related to anatomical structures, their lineage and timing of development. Here, we have constructed the first ontology for anatomy and asexual development (blastogenesis) of a bilaterian, the colonial tunicate Botryllus schlosseri. Tunicates, like Botryllus schlosseri, are non-vertebrates and the only chordate taxon species that reproduce both sexually and asexually. Their tadpole larval stage possesses structures characteristic of all chordates, i.e. a notochord, a dorsal neural tube, and gill slits. Larvae settle and metamorphose into individuals that are either solitary or colonial. The latter reproduce both sexually and asexually and these two reproductive modes lead to essentially the same adult body plan. The Botryllus schlosseri Ontology of Development and Anatomy (BODA) will facilitate the comparison between both types of development. BODA uses the rules defined by the Open Biomedical Ontologies Foundry. It is based on studies that investigate the anatomy, blastogenesis and regeneration of this organism. BODA features allow the users to easily search and identify anatomical structures in the colony, to define the developmental stage, and to follow the morphogenetic events of a tissue and/or organ of interest throughout asexual development. We invite the scientific community to use this resource as a reference for the anatomy and developmental ontology of B. schlosseri and encourage recommendations for updates and improvements.
Nano-immunoassay with improved performance for detection of cancer biomarkers
Krasnoslobodtsev, Alexey V.; Torres, Maria P.; Kaur, Sukhwinder; ...
2015-01-01
Nano-immunoassay utilizing the surface-enhanced Raman scattering (SERS) effect is a promising analytical technique for the early detection of cancer. In its current standing the assay is capable of discriminating samples of healthy individuals from samples of pancreatic cancer patients. Further improvements in sensitivity and reproducibility will extend practical applications of SERS-based detection platforms to a wider range of problems. In this report, we discuss several strategies designed to improve performance of the SERS-based detection system. We demonstrate that reproducibility of the platform is enhanced by using an atomically smooth mica surface as a template for preparation of the capture surface in the SERS sandwich immunoassay. Furthermore, the assay's stability and sensitivity can be further improved by using either a polymer or a graphene monolayer as a thin protective layer applied on top of the assay. The protective layer renders the signal more stable against photo-induced damage and carbonaceous contamination.
Sakaguchi, Hitoshi; Ryan, Cindy; Ovigne, Jean-Marc; Schroeder, Klaus R; Ashikaga, Takao
2010-09-01
Regulatory policies in Europe prohibited the testing of cosmetic ingredients in animals for a number of toxicological endpoints. Currently no validated non-animal test methods exist for skin sensitization. Evaluation of changes in cell surface marker expression in dendritic cell (DC)-surrogate cell lines represents one non-animal approach. The human Cell Line Activation Test (h-CLAT) examines the level of CD86 and CD54 expression on the surface of THP-1 cells, a human monocytic leukemia cell line, following 24 h of chemical exposure. To examine protocol transferability, between-lab reproducibility, and predictive capacity, the h-CLAT has been evaluated by five independent laboratories in several ring trials (RTs) coordinated by the European Cosmetics Association (COLIPA). The results of the first and second RTs demonstrated that the protocol was transferable and generally had good between-lab reproducibility and predictivity, although there were some false-negative results. To improve performance, the protocol and prediction model were modified. Using the modified prediction model on the first and second RT data, accuracy was improved. However, about 15% of the outcomes were still not correctly identified, which exposes some of the limitations of the assay. For the chemicals evaluated, the limitation may be due to the chemical being a weak allergen or having low solubility (e.g., alpha-hexylcinnamaldehyde). The third RT evaluated the modified prediction model and satisfactory results were obtained. From the RT data, the feasibility of utilizing cell lines as surrogate DCs in the development of in vitro skin sensitization methods shows promise. The data also support initiating formal pre-validation of the h-CLAT in order to fully understand the capabilities and limitations of the assay. Copyright 2010 Elsevier Ltd. All rights reserved.
Calès, P; Zarski, J P; Chapplain, J Marc; Bertrais, S; Sturm, N; Michelet, C; Babany, G; Chaigneau, J; Eddine Charaf, M
2012-02-01
We evaluated whether quantitative measurements of liver fibrosis with recently developed diagnostics outperform histological staging in detecting natural or interferon-induced changes. We compared Metavir staging, morphometry (area and fractal dimension) and six blood tests in 157 patients with chronic hepatitis C from two trials testing maintenance interferon for 96 weeks. Paired liver biopsies and blood tests were available for 101 patients, and there was a significant improvement in Metavir activity and a significant increase in blood tests reflecting fibrosis quantity in patients treated with interferon when compared with controls - all per cent changes in histological fibrosis measures were significantly increased in F1 vs F2-4 stages only in the interferon group. For the whole population studied between weeks 0 and 96, there was significant progression only in the area of fibrosis (AOF) (P = 0.026), FibroMeter (P = 0.020) and CirrhoMeter (P = 0.003). With regards to dynamic reproducibility, agreement was good (r(ic) ≥ 0.72) only for Metavir fibrosis score, FibroMeter and CirrhoMeter. The per cent change in AOF was significantly higher than that of fractal dimension (P = 0.003) or Metavir fibrosis score (P = 0.015). CirrhoMeter was the only blood test with a change significantly higher than that of AOF (P = 0.039). AOF and two blood tests, reflecting fibrosis quantity, have high sensitivity and/or reproducibility permitting the detection of a small progression in liver fibrosis over two years. A blood test reflecting fibrosis quantity is more sensitive and reproducible than morphometry. The study also shows that maintenance interferon does not improve fibrosis, whatever its stage. © 2011 Blackwell Publishing Ltd.
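The dynamic-reproducibility agreement reported above, r(ic), is an intraclass correlation. As a hedged sketch, the code below implements the textbook one-way random-effects ICC(1,1) for paired measurements; the exact estimator used in the study may differ, and the fibrosis scores are hypothetical.

```python
from statistics import mean

def icc_1_1(pairs):
    """One-way random-effects intraclass correlation ICC(1,1) for paired
    measurements (k = 2 observations per subject), a standard index of
    test-retest agreement: (MSB - MSW) / (MSB + (k-1) * MSW)."""
    n, k = len(pairs), 2
    grand = mean(v for pair in pairs for v in pair)
    subject_means = [mean(pair) for pair in pairs]
    # Between-subject mean square.
    msb = k * sum((m - grand) ** 2 for m in subject_means) / (n - 1)
    # Within-subject mean square.
    msw = sum((v - m) ** 2 for pair, m in zip(pairs, subject_means)
              for v in pair) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical repeat fibrosis measurements for 5 patients at weeks 0 and 96:
scores = [(1.0, 1.1), (2.0, 2.1), (3.0, 2.9), (4.0, 4.2), (2.5, 2.4)]
r_ic = icc_1_1(scores)
```

By the convention used in the abstract, agreement would be considered good when this coefficient reaches at least 0.72.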
Got, Jeanne; Cortés, María Paz; Maass, Alejandro
2018-01-01
Genome-scale metabolic models have become the tool of choice for the global analysis of microorganism metabolism, and their reconstruction has attained high standards of quality and reliability. Improvements in this area have been accompanied by the development of some major platforms and databases, and an explosion of individual bioinformatics methods. Consequently, many recent models result from “à la carte” pipelines, combining the use of platforms, individual tools and biological expertise to enhance the quality of the reconstruction. Although very useful, introducing heterogeneous tools, that hardly interact with each other, causes loss of traceability and reproducibility in the reconstruction process. This represents a real obstacle, especially when considering less studied species whose metabolic reconstruction can greatly benefit from the comparison to good quality models of related organisms. This work proposes an adaptable workspace, AuReMe, for sustainable reconstructions or improvements of genome-scale metabolic models involving personalized pipelines. At each step, relevant information related to the modifications brought to the model by a method is stored. This ensures that the process is reproducible and documented regardless of the combination of tools used. Additionally, the workspace establishes a way to browse metabolic models and their metadata through the automatic generation of ad-hoc local wikis dedicated to monitoring and facilitating the process of reconstruction. AuReMe supports exploration and semantic query based on RDF databases. We illustrate how this workspace allowed handling, in an integrated way, the metabolic reconstructions of non-model organisms such as an extremophile bacterium or eukaryote algae. Among relevant applications, the latter reconstruction led to putative evolutionary insights of a metabolic pathway. PMID:29791443
Lambron, Julien; Rakotonjanahary, Josué; Loisel, Didier; Frampas, Eric; De Carli, Emilie; Delion, Matthieu; Rialland, Xavier; Toulgoat, Frédérique
2016-02-01
Magnetic resonance (MR) images from children with optic pathway glioma (OPG) are complex. We initiated this study to evaluate the accuracy of MR imaging (MRI) interpretation and to propose a simple and reproducible imaging classification for MRI. We randomly selected 140 MRIs from among 510 MRIs performed on 104 children diagnosed with OPG in France from 1990 to 2004. These images were reviewed independently by three radiologists (F.T., 15 years of experience in neuroradiology; D.L., 25 years of experience in pediatric radiology; and J.L., 3 years of experience in radiology) using a classification derived from the Dodge and modified Dodge classifications. Intra- and interobserver reliabilities were assessed using the Bland-Altman method and the kappa coefficient. These reviews allowed the definition of reliable criteria for MRI interpretation. The reviews showed intraobserver variability and large discrepancies among the three radiologists (kappa coefficient varying from 0.11 to 1). These variabilities were too large for the interpretation to be considered reproducible over time or among observers. A consensual analysis, taking into account all observed variabilities, allowed the development of a definitive interpretation protocol. Using this revised protocol, we observed consistent intra- and interobserver results (kappa coefficient varying from 0.56 to 1). The mean interobserver difference for the solid portion of the tumor with contrast enhancement was 0.8 cm³ (limits of agreement = -16 to 17). We propose simple and precise rules for improving the accuracy and reliability of MRI interpretation for children with OPG. Further studies will be necessary to investigate the possible prognostic value of this approach.
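Interobserver reliability in the study above was quantified with the kappa coefficient. A minimal sketch of Cohen's kappa for two raters follows; the ratings are hypothetical, and the study itself compared three radiologists (pairwise kappa is the two-rater case).

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters over the same cases:
    (p_observed - p_expected) / (1 - p_expected)."""
    n = len(ratings_a)
    # Observed agreement: fraction of cases where the raters coincide.
    po = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement from each rater's marginal category frequencies.
    ca, cb = Counter(ratings_a), Counter(ratings_b)
    pe = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / (n * n)
    return (po - pe) / (1 - pe)

# Two hypothetical readers classifying 10 MRIs into modified-Dodge stages:
a = ["1", "2", "2", "3", "1", "2", "3", "3", "1", "2"]
b = ["1", "2", "3", "3", "1", "2", "3", "2", "1", "2"]
kappa = cohens_kappa(a, b)
```

Kappa near 0 means agreement no better than chance, while values approaching 1 reflect the consistency the revised protocol achieved (0.56 to 1 in the study).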
Chen, Chih-Hao; Hsu, Chueh-Lin; Huang, Shih-Hao; Chen, Shih-Yuan; Hung, Yi-Lin; Chen, Hsiao-Rong; Wu, Yu-Chung
2015-01-01
Although genome-wide expression analysis has become a routine tool for gaining insight into molecular mechanisms, extraction of information remains a major challenge. It has been unclear why standard statistical methods, such as the t-test and ANOVA, often lead to low levels of reproducibility, how likely applying fold-change cutoffs to enhance reproducibility is to miss key signals, and how adversely using such methods has affected data interpretations. We broadly examined expression data to investigate the reproducibility problem and discovered that molecular heterogeneity, a biological property of genetically different samples, has been improperly handled by the statistical methods. Here we give a mathematical description of the discovery and report the development of a statistical method, named HTA, for better handling molecular heterogeneity. We broadly demonstrate the improved sensitivity and specificity of HTA over the conventional methods and show that using fold-change cutoffs discards much information. We illustrate the particular usefulness of HTA for heterogeneous diseases by applying it to existing data sets of schizophrenia, bipolar disorder and Parkinson’s disease, and show it can abundantly and reproducibly uncover disease signatures not previously detectable. Based on 156 biological data sets, we estimate that the methodological issue has affected over 96% of expression studies and that HTA can profoundly correct 86% of the affected data interpretations. The methodological advancement can better facilitate systems understandings of biological processes, render biological inferences that are more reliable than they have hitherto been, and engender translational medical applications, such as identifying diagnostic biomarkers and drug prediction, that are more robust. PMID:25793610
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakamura, Katsumasa; Shioyama, Yoshiyuki; Nomoto, Satoru
2007-05-01
Purpose: The voluntary breath-hold (BH) technique is a simple method to control the respiration-related motion of a tumor during irradiation. However, the abdominal and chest wall position may not be accurately reproduced using the BH technique. The purpose of this study was to examine whether visual feedback can reduce the fluctuation in wall motion during BH using a new respiratory monitoring device. Methods and Materials: We developed a laser-based BH monitoring and visual feedback system. For this study, five healthy volunteers were enrolled. The volunteers, practicing abdominal breathing, performed shallow end-expiration BH (SEBH), shallow end-inspiration BH (SIBH), and deep end-inspiration BH (DIBH) with or without visual feedback. The abdominal and chest wall positions were measured at 80-ms intervals during BHs. Results: The fluctuation in the chest wall position was smaller than that of the abdominal wall position. The reproducibility of the wall position was improved by visual feedback. With a monitoring device, visual feedback reduced the mean deviation of the abdominal wall from 2.1 ± 1.3 mm to 1.5 ± 0.5 mm, 2.5 ± 1.9 mm to 1.1 ± 0.4 mm, and 6.6 ± 2.4 mm to 2.6 ± 1.4 mm in SEBH, SIBH, and DIBH, respectively. Conclusions: Volunteers can perform the BH maneuver in a highly reproducible fashion when informed about the position of the wall, although in the case of DIBH, the deviation in the wall position remained substantial.
Development of a High Intensity Focused Ultrasound (HIFU) Hydrophone System
NASA Astrophysics Data System (ADS)
Schafer, Mark E.; Gessert, James
2009-04-01
The growing clinical use of High Intensity Focused Ultrasound (HIFU) has driven a need for reliable, reproducible measurements of HIFU acoustic fields. We have previously presented data on a reflective scatterer approach, incorporating several novel features for improved bandwidth, reliability, and reproducibility [Proc. 2005 IEEE Ultrasonics Symposium, 1739-1742]. We now report on several design improvements that have increased the signal-to-noise ratio of the system and potentially reduced the cost of implementation. For the scattering element, we now use an artificial sapphire material to provide a more uniform radiating surface. The receiver is a segmented, truncated spherical structure with a 10 cm radius; the scattering element is positioned at the center of the sphere. The receiver is made from 25 micron thick, biaxially stretched PVDF, with a Pt-Au electrode on the front surface. In the new design, a specialized backing material provides the stiffness required to maintain structural stability, while at the same time providing both electrical shielding and ultrasonic absorption. Compared with the previous version, the new receiver design has improved the noise performance by 8-12 dB; the new scattering sphere has reduced the scattering loss by another 14 dB, producing an effective sensitivity of -298 dB re 1 microVolt/Pa. The design trade-off still involves receiver sensitivity versus effective spot size and signal distortion from the scatterer structure. However, the reduced cost and improved repeatability of the new scatterer approach make the overall design more robust for routine waveform measurements of HIFU systems.
Fried, Peter J.; Jannati, Ali; Davila-Pérez, Paula; Pascual-Leone, Alvaro
2017-01-01
Background: Transcranial magnetic stimulation (TMS) can be used to assess neurophysiology and the mechanisms of cortical brain plasticity in humans in vivo. As the use of these measures in specific populations (e.g., Alzheimer’s disease; AD) increases, it is critical to understand their reproducibility (i.e., test–retest reliability) in the populations of interest. Objective: Reproducibility of TMS measures was evaluated in older adults, including healthy, AD, and Type-2 diabetes mellitus (T2DM) groups. Methods: Participants received two identical neurophysiological assessments within a year, including motor thresholds, baseline motor evoked potentials (MEPs), short- and long-interval intracortical inhibition (SICI, LICI) and intracortical facilitation (ICF), and MEP changes following intermittent theta-burst stimulation (iTBS). Cronbach’s α coefficients were calculated to assess reproducibility. Multiple linear regression analyses were used to investigate factors related to intraindividual variability. Results: Reproducibility was highest for motor thresholds, followed by baseline MEPs, SICI and LICI, and was lowest for ICF and iTBS aftereffects. The AD group tended to show higher reproducibility than the T2DM or control groups. Intraindividual variability of baseline MEPs was related to age and variability of the resting motor threshold (RMT), while the intraindividual variability in post-iTBS measures was related to baseline MEP variability, intervisit duration, and brain-derived neurotrophic factor (BDNF) polymorphism. Conclusion: Increased reproducibility in AD may reflect pathophysiological declines in the efficacy of neuroplastic mechanisms. Reproducibility of iTBS aftereffects can be improved by keeping baseline MEPs consistent, controlling for BDNF genotype, and waiting at least a week between visits. Significance: These findings provide the first direct assessment of reproducibility of TMS measures in older clinical populations.
Reproducibility coefficients may be used to adjust effect- and sample size calculations for future studies. PMID:28871222
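The Cronbach's α coefficient used above to quantify test-retest reproducibility can be computed from a participants × sessions score matrix; a minimal generic sketch, not the study's pipeline:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha. `scores` is participants x measurement occasions."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                         # number of occasions (items)
    item_vars = scores.var(axis=0, ddof=1)      # variance of each occasion
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of summed scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)
```

Alpha approaches 1 when repeated sessions rank participants identically, and drops as within-participant variability grows relative to between-participant variability.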
Who or What? Self-Replication and Function-Reproduction in the Origin of Life
NASA Technical Reports Server (NTRS)
New, Michael H.; Stassinopoulos, Dimitris; Monaco, Regina; Pohorille, Andrew; DeVincenzi, Donald (Technical Monitor)
2002-01-01
In this presentation, we will present results on the fundamental properties of two classes of replicating systems: autocatalytic replicators that reproduce exact copies of a template molecule, and function-reproducers that maintain a set of essential functions without replicating the identities of the functional moieties. We will describe the stability and large-scale behavior of autocatalytic replicators. Most importantly, we have found no sharp distinction between an autocatalytic and a non-autocatalytic domain. We will also present a new derivation of von Kiedrowski's square-root rate law. Function-reproducers are proposed as an important component of protocells, and we will present theoretical results on a simple model system that incorporates known peptide biophysics. For a wide range of parameters, we have shown that this type of system can improve its overall performance, even in the absence of any method for information storage. This type of system improvement is defined to be non-genomic evolution.
An open science peer review oath.
Aleksic, Jelena; Alexa, Adrian; Attwood, Teresa K; Chue Hong, Neil; Dahlö, Martin; Davey, Robert; Dinkel, Holger; Förstner, Konrad U; Grigorov, Ivo; Hériché, Jean-Karim; Lahti, Leo; MacLean, Dan; Markie, Michael L; Molloy, Jenny; Schneider, Maria Victoria; Scott, Camille; Smith-Unna, Richard; Vieira, Bruno Miguel
2014-01-01
One of the foundations of the scientific method is to be able to reproduce experiments and corroborate the results of research that has been done before. However, with the increasing complexities of new technologies and techniques, coupled with the specialisation of experiments, reproducing research findings has become a growing challenge. Clearly, scientific methods must be conveyed succinctly, and with clarity and rigour, in order for research to be reproducible. Here, we propose steps to help increase the transparency of the scientific method and the reproducibility of research results: specifically, we introduce a peer-review oath and accompanying manifesto. These have been designed to offer guidelines to enable reviewers (with the minimum friction or bias) to follow and apply open science principles, and support the ideas of transparency, reproducibility and ultimately greater societal impact. Introducing the oath and manifesto at the stage of peer review will help to check that the research being published includes everything that other researchers would need to successfully repeat the work. Peer review is the lynchpin of the publishing system: encouraging the community to consciously (and conscientiously) uphold these principles should help to improve published papers, increase confidence in the reproducibility of the work and, ultimately, provide strategic benefits to authors and their institutions.
Janero, David R
2016-09-01
Drug discovery depends critically upon published results from the academy. The reproducibility of preclinical research findings reported by academia in the peer-reviewed literature has been called into question, seriously jeopardizing the value of academic science for inventing therapeutics. The corrosive effects of the reproducibility issue on drug discovery are considered. Purported correctives imposed upon academia from the outside deal mainly with expunging fraudulent literature and imposing punitive sanctions on the responsible authors. The salutary influence of such post facto actions on the reproducibility of discovery-relevant preclinical research data from academia appears limited. Rather, intentional doctoral-scientist education focused on data replicability and translationally-meaningful science and active participation of university entities charged with research innovation and asset commercialization toward ensuring data quality are advocated as key academic initiatives for addressing the reproducibility issue. A mindset shift on the part of both senior university faculty and the academy to take responsibility for the data reproducibility crisis and commit proactively to positive educational, incentivization, and risk- and reward-sharing practices will be fundamental for improving the value of published preclinical academic research to drug discovery.
StimDuino: an Arduino-based electrophysiological stimulus isolator.
Sheinin, Anton; Lavi, Ayal; Michaelevski, Izhak
2015-03-30
The electrical stimulus isolator is a widely used device in electrophysiology. The timing of stimulus application is usually automated and controlled by an external device or acquisition software; however, the intensity of the stimulus is adjusted manually. Inaccuracy, poor reproducibility, and the lack of automation of the experimental protocol are disadvantages of manual adjustment. To overcome these shortcomings, we developed StimDuino, an inexpensive Arduino-controlled stimulus isolator allowing highly accurate, reproducible, automated setting of the stimulation current. The intensity of the stimulation current delivered by StimDuino is controlled by Arduino, an open-source microcontroller development platform. The automatic stimulation patterns are software-controlled and the parameters are set from a simple, intuitive, and user-friendly Matlab-coded graphical user interface. The software also allows remote control of the device over the network. Electrical current measurements showed that StimDuino produces the requested current output with high accuracy. In both hippocampal slice and in vivo recordings, the fEPSP measurements obtained with StimDuino and with commercial stimulus isolators showed high correlation. Commercial stimulus isolators are managed manually, whereas StimDuino generates automatic stimulation patterns with increasing current intensity. The pattern is utilized for input-output relationship analysis, necessary for the assessment of excitability. In contrast to StimDuino, not all commercial devices are capable of remote control of the parameters and stimulation process. StimDuino-generated automation of the input-output relationship assessment eliminates the need to manually adjust the current intensity, improves stimulation reproducibility and accuracy, and allows on-site and remote control of the stimulation parameters. Copyright © 2015 Elsevier B.V. All rights reserved.
Dropwise additive manufacturing of pharmaceutical products for melt-based dosage forms.
Içten, Elçin; Giridhar, Arun; Taylor, Lynne S; Nagy, Zoltan K; Reklaitis, Gintaras V
2015-05-01
The US Food and Drug Administration introduced the quality by design approach and process analytical technology guidance to encourage innovation and efficiency in pharmaceutical development, manufacturing, and quality assurance. As part of this renewed emphasis on the improvement of manufacturing, the pharmaceutical industry has begun to develop more efficient production processes with more intensive use of online measurement and sensing, real-time quality control, and process control tools. Here, we present dropwise additive manufacturing of pharmaceutical products (DAMPP) as an alternative to conventional pharmaceutical manufacturing methods. This mini-manufacturing process for the production of pharmaceuticals utilizes drop-on-demand printing technology for automated and controlled deposition of melt-based formulations onto edible substrates. The advantages of drop-on-demand technology, including reproducible production of small droplets, adjustable drop sizing, high placement accuracy, and flexible use of different formulations, enable production of individualized dosing even for low-dose and high-potency drugs. In this work, DAMPP is used to produce solid oral dosage forms from hot melts of an active pharmaceutical ingredient and a polymer. The dosage forms are analyzed to show the reproducibility of dosing and the dissolution behavior of different formulations. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
Arthroscopic Anatomic Reconstruction of the Lateral Ligaments of the Ankle With Gracilis Autograft
Guillo, Stéphane; Archbold, Pooler; Perera, Anthony; Bauer, Thomas; Sonnery-Cottet, Bertrand
2014-01-01
Lateral ankle sprains are common; if conservative treatment fails and chronic instability develops, stabilization surgery is indicated. Numerous surgical procedures have been described, but those that most closely reproduce normal ankle lateral ligament anatomy and kinematics have been shown to have the best outcomes. Arthroscopy is a common adjunct to open ligament surgery, but it is traditionally only used to improve the diagnosis and the management of any associated intra-articular lesions. The stabilization itself is performed open because standard anterior ankle arthroscopy provides only partial visualization of the anterior talofibular ligament from above and the calcaneofibular ligament attachments cannot be seen at all. However, lateral ankle endoscopy can provide a view of this area that is superior to open surgery. We have developed a technique of ankle endoscopy that enables anatomic positioning of the repair or fixation of the graft. In this article we describe a safe and reproducible arthroscopic anatomic reconstruction of the lateral ligaments of the ankle using a gracilis autograft. The aim of this procedure is to obtain a more physiological reconstruction while maintaining all the advantages of an arthroscopic approach. PMID:25473613
Yang, Pan; Peng, Yulan; Zhao, Haina; Luo, Honghao; Jin, Ya; He, Yushuang
2015-01-01
Static shear wave elastography (SWE) is used to detect breast lesions, but slice and plane selections result in discrepancies. The aim of this study was to evaluate the intraobserver reproducibility of continuous SWE and to determine whether quantitative elasticities measured in orthogonal planes perform better in the differential diagnosis of breast lesions. One hundred and twenty-two breast lesions scheduled for ultrasound-guided biopsy were recruited. Continuous SWE scans were conducted in orthogonal planes separately. Quantitative elasticities and histopathology results were collected. Reproducibility in the same plane and diagnostic performance in different planes were evaluated. The maximum and mean elasticities of the hardest portion, and the standard deviation of the whole lesion, had high intraclass correlation coefficients (0.87 to 0.95) and large areas under the receiver operating characteristic curve (0.887 to 0.899). Without loss of accuracy, sensitivities increased in orthogonal planes compared with a single plane (from 73.17% to at most 82.93%). The mean elasticity of the whole lesion and the lesion-to-parenchyma ratio were significantly less reproducible and less accurate. Continuous SWE is highly reproducible for the same observer. The maximum and mean elasticities of the hardest portion and the standard deviation of the whole lesion are the most reliable parameters. Furthermore, the sensitivities of the three parameters are improved in orthogonal planes without loss of accuracy.
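The correlation-based reliability coefficients reported above are commonly computed as intraclass correlations; a minimal sketch of the one-way random-effects ICC(1,1), with the caveat that the exact ICC variant used in the study is an assumption:

```python
import numpy as np

def icc_oneway(x):
    """One-way random-effects ICC(1,1). `x` is subjects x sessions."""
    x = np.asarray(x, dtype=float)
    n, k = x.shape
    grand = x.mean()
    # between-subject and within-subject mean squares from one-way ANOVA
    msb = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    msw = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)
```

ICC is 1 when repeated sessions agree exactly and falls toward (or below) zero as session-to-session scatter approaches the between-lesion variability.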
Advancements in RNASeqGUI towards a Reproducible Analysis of RNA-Seq Experiments
Russo, Francesco; Righelli, Dario
2016-01-01
We present the advancements and novelties recently introduced in RNASeqGUI, a graphical user interface that helps biologists to handle and analyse large data collected in RNA-Seq experiments. This work focuses on the concept of reproducible research and shows how it has been incorporated in RNASeqGUI to provide reproducible (computational) results. The novel version of RNASeqGUI combines graphical interfaces with tools for reproducible research, such as literate statistical programming, human readable report, parallel executions, caching, and interactive and web-explorable tables of results. These features allow the user to analyse big datasets in a fast, efficient, and reproducible way. Moreover, this paper represents a proof of concept, showing a simple way to develop computational tools for Life Science in the spirit of reproducible research. PMID:26977414
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, D; Pollock, S; Keall, P
Purpose: Audiovisual biofeedback breath-hold (AVBH) was employed to reproduce tumor position on inhale and exhale breath-holds for 4D tumor information. We hypothesize that lung tumor position will be more consistent using AVBH compared with conventional breath-hold (CBH). Methods: Lung tumor positions were determined for seven lung cancer patients (age: 25-74) during two separate 3T MRI sessions. A breath-hold training session was performed prior to the MRI sessions to allow patients to become comfortable with AVBH and their exhale and inhale target positions. CBH and AVBH 4D image datasets were obtained in the first MRI session (pre-treatment) and the second MRI session (mid-treatment) within six weeks of the first session. Audio instruction (MRI: Siemens Skyra) was used for CBH and verbal instruction (radiographer) for AVBH. A radiation oncologist contoured the lung tumor using Eclipse (Varian Medical Systems); tumor position was quantified as the centroid of the contoured tumor after rigid registration based on vertebral anatomy across the two MRI sessions. CBH and AVBH were compared in terms of reproducibility, assessed via (1) the difference between the two exhale positions and between the two inhale positions across the two sessions, and (2) the difference in amplitude (exhale to inhale) between the two sessions. Results: Compared to CBH, AVBH improved the reproducibility of the two exhale (or inhale) lung tumor positions relative to each other by 33%, from 6.4±5.3 mm to 4.3±3.0 mm (p=0.005). Compared to CBH, AVBH improved the reproducibility of exhale-to-inhale amplitude by 66%, from 5.6±5.9 mm to 1.9±1.4 mm (p=0.005). Conclusions: This study demonstrated that audiovisual biofeedback can be utilized to improve the reproducibility of breath-hold lung tumor position. These results are advantageous towards achieving more accurate emerging radiation treatment planning methods, in addition to imaging and treatment modalities utilizing breath-hold procedures.
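The tumor-position metric described above, the centroid of a contoured tumor, can be sketched as follows; a generic illustration on binary voxel masks (the mask inputs and voxel spacing are assumptions, and the rigid-registration step is omitted):

```python
import numpy as np

def centroid(mask, spacing=(1.0, 1.0, 1.0)):
    """Centroid (in mm) of a binary tumor mask, given voxel spacing in mm."""
    idx = np.argwhere(mask)                  # voxel coordinates inside the tumor
    return idx.mean(axis=0) * np.asarray(spacing)

def position_difference(mask_a, mask_b, spacing=(1.0, 1.0, 1.0)):
    """Euclidean distance between tumor centroids from two sessions."""
    return float(np.linalg.norm(centroid(mask_a, spacing)
                                - centroid(mask_b, spacing)))
```

Applied to session-1 and session-2 contours after registration, this distance plays the role of the exhale (or inhale) position difference reported in millimetres above.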
Tip-enhanced Raman scattering microscopy: Recent advance in tip production
NASA Astrophysics Data System (ADS)
Fujita, Yasuhiko; Walke, Peter; De Feyter, Steven; Uji-i, Hiroshi
2016-08-01
Tip-enhanced Raman scattering (TERS) microscopy is a technique that combines the chemical sensitivity of Raman spectroscopy with the resolving power of scanning probe microscopy. The key component of any TERS setup is a plasmonically-active noble metal tip, which serves to couple far-field incident radiation with the near-field. Thus, the design and implementation of reproducible probes are crucial for the continued development of TERS as a tool for nanoscopic analysis. Here we discuss conventional methods for the fabrication of TERS-ready tips, highlighting the problems therein, as well as detailing more recent developments to improve reproducibility. In addition, the concept of remote-excitation TERS is discussed, whereby TERS sensitivity is further improved by using propagating surface plasmons to separate the incident radiation from the tip apex, as well as how this can be incorporated into the fabrication process.
Integrative prescreening in analysis of multiple cancer genomic studies
2012-01-01
Background In high throughput cancer genomic studies, results from the analysis of single datasets often suffer from a lack of reproducibility because of small sample sizes. Integrative analysis can effectively pool and analyze multiple datasets and provides a cost effective way to improve reproducibility. In integrative analysis, simultaneously analyzing all genes profiled may incur high computational cost. A computationally affordable remedy is prescreening, which fits marginal models, can be conducted in a parallel manner, and has low computational cost. Results An integrative prescreening approach is developed for the analysis of multiple cancer genomic datasets. Simulation shows that the proposed integrative prescreening has better performance than alternatives, particularly including prescreening with individual datasets, an intensity approach and meta-analysis. We also analyze multiple microarray gene profiling studies on liver and pancreatic cancers using the proposed approach. Conclusions The proposed integrative prescreening provides an effective way to reduce the dimensionality in cancer genomic studies. It can be coupled with existing analysis methods to identify cancer markers. PMID:22799431
Joint groupwise registration and ADC estimation in the liver using a B-value weighted metric.
Sanz-Estébanez, Santiago; Rabanillo-Viloria, Iñaki; Royuela-Del-Val, Javier; Aja-Fernández, Santiago; Alberola-López, Carlos
2018-02-01
The purpose of this work is to develop a groupwise elastic multimodal registration algorithm for robust ADC estimation in the liver on multiple breath-hold diffusion-weighted images. We introduce a joint formulation to simultaneously solve both the registration and the estimation problems. In order to avoid non-reliable transformations and undesirable noise amplification, we have included appropriate smoothness constraints for both problems. Our metric incorporates the ADC estimation residuals, which are inversely weighted according to the signal content in each diffusion-weighted image. Results show that the joint formulation provides a statistically significant improvement in the accuracy of the ADC estimates. Reproducibility has also been measured on real data in terms of the distribution of ADC differences obtained from different b-value subsets. The proposed algorithm is able to effectively deal with both the presence of motion and the geometric distortions, increasing accuracy and reproducibility in diffusion parameter estimation. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
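The underlying ADC estimation can be sketched, for a single voxel, as a weighted log-linear least-squares fit of S(b) = S0 * exp(-b * ADC); a simplified illustration in which signal-proportional weights stand in for the paper's residual-weighting scheme (an assumption), and the registration step is omitted:

```python
import numpy as np

def fit_adc(bvals, signals):
    """Weighted log-linear fit of S(b) = S0 * exp(-b * ADC).
    Weights proportional to the signal compensate for the noise
    amplification the log transform causes at high b-values."""
    bvals = np.asarray(bvals, dtype=float)
    signals = np.asarray(signals, dtype=float)
    y = np.log(signals)
    w = signals                                   # weight ~ signal level
    A = np.vstack([np.ones_like(bvals), -bvals]).T
    W = np.diag(w)
    coef, *_ = np.linalg.lstsq(W @ A, W @ y, rcond=None)
    ln_s0, adc = coef
    return np.exp(ln_s0), adc
```

Repeating such a fit on different b-value subsets and comparing the resulting ADC maps is one way to obtain the reproducibility distributions mentioned above.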
A new force field including charge directionality for TMAO in aqueous solution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Usui, Kota; Nagata, Yuki, E-mail: sulpizi@uni-mainz.de, E-mail: nagata@mpip-mainz.mpg.de; Hunger, Johannes
We propose a new force field for trimethylamine N-oxide (TMAO), which is designed to reproduce the long-lived and highly directional hydrogen bond between the TMAO oxygen (O_TMAO) atom and surrounding water molecules. Based on the data obtained by ab initio molecular dynamics simulations, we introduce three dummy sites around O_TMAO to mimic the O_TMAO lone pairs and we migrate the negative charge on the O_TMAO to the dummy sites. The force field model developed here improves both structural and dynamical properties of aqueous TMAO solutions. Moreover, it reproduces the experimentally observed dependence of viscosity upon increasing TMAO concentration quantitatively. The simple procedure of the force field construction makes it easy to implement in molecular dynamics simulation packages and makes it compatible with the existing biomolecular force fields. This paves the path for further investigation of protein-TMAO interaction in aqueous solutions.
A comparison of non-local electron transport models relevant to inertial confinement fusion
NASA Astrophysics Data System (ADS)
Sherlock, Mark; Brodrick, Jonathan; Ridgers, Christopher
2017-10-01
We compare the reduced non-local electron transport model developed by Schurtz et al. to Vlasov-Fokker-Planck simulations. Two new test cases are considered: the propagation of a heat wave through a high density region into a lower density gas, and a 1-dimensional hohlraum ablation problem. We find the reduced model reproduces the peak heat flux well in the ablation region but significantly over-predicts the coronal preheat. The suitability of the reduced model for computing non-local transport effects other than thermal conductivity is considered by comparing the computed distribution function to the Vlasov-Fokker-Planck distribution function. It is shown that even when the reduced model reproduces the correct heat flux, the distribution function is significantly different to the Vlasov-Fokker-Planck prediction. Two simple modifications are considered which improve agreement between models in the coronal region. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
Reproducibility Between Brain Uptake Ratio Using Anatomic Standardization and Patlak-Plot Methods.
Shibutani, Takayuki; Onoguchi, Masahisa; Noguchi, Atsushi; Yamada, Tomoki; Tsuchihashi, Hiroko; Nakajima, Tadashi; Kinuya, Seigo
2015-12-01
The Patlak-plot and conventional methods of determining brain uptake ratio (BUR) have some problems with reproducibility. We formulated a method of determining BUR using anatomic standardization (BUR-AS) in a statistical parametric mapping algorithm to improve reproducibility. The objective of this study was to demonstrate the inter- and intraoperator reproducibility of mean cerebral blood flow as determined using BUR-AS in comparison to the conventional-BUR (BUR-C) and Patlak-plot methods. The images of 30 patients who underwent brain perfusion SPECT were retrospectively used in this study. The images were reconstructed using ordered-subset expectation maximization and processed using an automatic quantitative analysis for cerebral blood flow of ECD tool. The mean SPECT count was calculated from axial basal ganglia slices of the normal side (slices 31-40) drawn using a 3-dimensional stereotactic region-of-interest template after anatomic standardization. The mean cerebral blood flow was calculated from the mean SPECT count. Reproducibility was evaluated using coefficient of variation and Bland-Altman plotting. For both inter- and intraoperator reproducibility, the BUR-AS method had the lowest coefficient of variation and smallest error range about the Bland-Altman plot. Mean CBF obtained using the BUR-AS method had the highest reproducibility. Compared with the Patlak-plot and BUR-C methods, the BUR-AS method provides greater inter- and intraoperator reproducibility of cerebral blood flow measurement. © 2015 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
Effect of Initial Conditions on Reproducibility of Scientific Research
Djulbegovic, Benjamin; Hozo, Iztok
2014-01-01
Background: It is estimated that about half of currently published research cannot be reproduced. Many reasons have been offered as explanations for failure to reproduce scientific research findings, from fraud to issues related to the design, conduct, analysis, or publishing of scientific research. We also postulate a sensitive dependency on initial conditions, by which small changes can result in large differences in the research findings when attempts are made to reproduce them at later times. Methods: We employed a simple logistic regression equation to model the effect of covariates on the initial study findings. We then fed the input from the logistic equation into a logistic map function to model the stability of the results in repeated experiments over time. We illustrate the approach by modeling the effects of different factors on the choice of correct treatment. Results: We found that reproducibility of the study findings depended both on the initial values of all independent variables and on the rate of change in the baseline conditions, the latter being more important. When the changes in the baseline conditions vary by about 3.5 to about 4 between experiments, no research findings could be reproduced. However, when the rate of change between the experiments is ≤2.5, the results become highly predictable between the experiments. Conclusions: Many results cannot be reproduced because of changes in the initial conditions between the experiments. Better control of the baseline conditions between the experiments may help improve reproducibility of scientific findings. PMID:25132705
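The logistic-map dynamics behind these results can be sketched directly; a minimal illustration of the stable (rate ≤ 2.5) versus chaotic (rate about 3.5 to 4) regimes described above, with the rate values taken from the abstract and everything else generic:

```python
import numpy as np

def replicate(x0, rate, n_iter=200):
    """Iterate the logistic map x <- rate * x * (1 - x) from x0."""
    x, traj = x0, []
    for _ in range(n_iter):
        x = rate * x * (1 - x)
        traj.append(x)
    return np.array(traj)

# Stable regime (rate <= 2.5): two slightly different "initial findings"
# converge to the same fixed point, i.e. the result is reproducible.
a = replicate(0.30, 2.5)
b = replicate(0.31, 2.5)

# Chaotic regime (rate ~ 3.9): the same small perturbation is amplified
# and the trajectories diverge, i.e. the result cannot be reproduced.
c = replicate(0.30, 3.9)
d = replicate(0.31, 3.9)
```

At rate 2.5 both trajectories settle at the fixed point 1 - 1/2.5 = 0.6 regardless of the small difference in starting conditions, whereas at rate 3.9 the trajectories separate widely, mirroring the reproducibility thresholds reported in the Results.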
Fraysse, Marion; Pinazo, Christel; Faure, Vincent Martin; Fuchs, Rosalie; Lazzari, Paolo; Raimbault, Patrick; Pairaud, Ivane
2013-01-01
Terrestrial inputs (natural and anthropogenic) from rivers, the atmosphere and physical processes strongly impact the functioning of coastal pelagic ecosystems. The objective of this study was to develop a tool for the examination of these impacts on the Marseille coastal area, which experiences inputs from the Rhone River and high rates of atmospheric deposition. Therefore, a new 3D coupled physical/biogeochemical model was developed. Two versions of the biogeochemical model were tested, one model considering only the carbon (C) and nitrogen (N) cycles and a second model that also considers the phosphorus (P) cycle. Realistic simulations were performed for a period of 5 years (2007-2011). The model accuracy assessment showed that both versions of the model were capable of capturing the seasonal changes and spatial characteristics of the ecosystem. The model also reproduced upwelling events and the intrusion of Rhone River water into the Bay of Marseille well. Those processes appeared to greatly impact this coastal oligotrophic area because they induced strong increases in chlorophyll-a concentrations in the surface layer. The model with the C, N and P cycles better reproduced the chlorophyll-a concentrations at the surface than did the model without the P cycle, especially for the Rhone River water. Nevertheless, the chlorophyll-a concentrations at depth were better represented by the model without the P cycle. Therefore, the complexity of the biogeochemical model introduced errors into the model results, but it also improved model results during specific events. Finally, this study suggested that in coastal oligotrophic areas, improvements in the description and quantification of the hydrodynamics and the terrestrial inputs should be preferred over increasing the complexity of the biogeochemical model.
Guyader, Jean-Marie; Bernardin, Livia; Douglas, Naomi H M; Poot, Dirk H J; Niessen, Wiro J; Klein, Stefan
2015-08-01
To evaluate the influence of image registration on apparent diffusion coefficient (ADC) images obtained from abdominal free-breathing diffusion-weighted MR images (DW-MRIs), a comprehensive pipeline based on automatic three-dimensional nonrigid image registration is developed to compensate for misalignments in DW-MRI datasets obtained from five healthy subjects scanned twice. Motion is corrected both within each image and between images in a time series. ADC distributions are compared with and without registration in two abdominal volumes of interest (VOIs). The effects of interpolation and Gaussian blurring as alternative strategies to reduce motion artifacts are also investigated. Among the four considered scenarios (no processing, interpolation, blurring and registration), registration yields the best alignment scores. Median ADCs vary according to the chosen scenario: for the considered datasets, ADCs obtained without processing are 30% higher than with registration. Registration improves voxelwise reproducibility by at least a factor of 2 and decreases uncertainty (Fréchet-Cramér-Rao lower bound). Registration provides similar improvements in reproducibility and uncertainty as acquiring four times more data. Patient motion during image acquisition leads to misaligned DW-MRIs and inaccurate ADCs, which can be addressed using automatic registration. © 2014 Wiley Periodicals, Inc.
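For context, the ADC values discussed above come from a mono-exponential decay S(b) = S0·exp(-b·ADC); a minimal two-point estimate can be sketched as follows (the signal values are invented placeholders, and this is not the authors' fitting pipeline):

```python
import math

def adc_two_point(s0, sb, b0=0.0, b=800.0):
    """Mono-exponential ADC (mm^2/s) from signals at b-values b0 and b (s/mm^2)."""
    return math.log(s0 / sb) / (b - b0)

# If motion misaligns the b=0 and b=800 images, s0 and sb are sampled from
# different tissue and the ADC estimate is biased -- hence the registration
# step evaluated in the study.
adc = adc_two_point(s0=1000.0, sb=400.0)  # roughly 1.15e-3 mm^2/s
```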
Bardoxolone: augmenting the Yin in chronic kidney disease.
Thomas, Merlin C
2011-10-01
Nrf-2 (NF-E2-related factor 2) is a regulator of anti-oxidant, anti-inflammatory and detoxification pathways. Coordinated augmentation of these key defence pathways via Nrf-2 signalling is being investigated for the treatment of chronic diseases, including diabetes and its complications. The first to reach commercial development is the triterpenoid, bardoxolone methyl. In a recent clinical trial, bardoxolone rapidly improved kidney function on average by 5-10 ml/min within 4 weeks of therapy. Importantly, this improvement was sustained during one year of active treatment. This suggests that, rather than overworking a failing system, bardoxolone appeared to safely augment renal function, at least out to one year. If similar improvements in kidney function can be reproduced in the upcoming BEACON trial, it will represent a major advance on conventional therapy and a new way to bring balance to the failing kidney.
Animal models for clinical and gestational diabetes: maternal and fetal outcomes.
Kiss, Ana Ci; Lima, Paula Ho; Sinzato, Yuri K; Takaku, Mariana; Takeno, Marisa A; Rudge, Marilza Vc; Damasceno, Débora C
2009-10-19
Diabetes in pregnant women is associated with an increased risk of maternal and neonatal morbidity and remains a significant medical challenge. Diabetes during pregnancy may be divided into clinical diabetes and gestational diabetes. Experimental models are developed with the purpose of enhancing understanding of the pathophysiological mechanisms of diseases that affect humans. With regard to diabetes in pregnancy, experimental findings from models will lead to the development of treatment strategies to maintain a normal metabolic intrauterine milieu, improving perinatal development by preventing fetal growth restriction or macrosomia. Based on animal models of diabetes during pregnancy previously reported in the medical literature, the present study aimed to compare the impact of streptozotocin-induced severe (glycemia >300 mg/dl) and mild diabetes (glycemia between 120 and 300 mg/dl) on glycemia and maternal reproductive and fetal outcomes of Wistar rats to evaluate whether the animal model reproduces the maternal and perinatal results of clinical and gestational diabetes in humans. On day 5 of life, 96 female Wistar rats were assigned to three experimental groups: control (n = 16), severe (n = 50) and mild diabetes (n = 30). At day 90 of life, rats were mated. On day 21 of pregnancy, rats were killed and their uterine horns were exposed to count implantation and fetus numbers to determine pre- and post-implantation loss rates. The fetuses were classified according to their birth weight. Severe and mild diabetic dams showed different glycemic responses during pregnancy, impairing fetal glycemia and weight, confirming that maternal glycemia is directly associated with fetal development. Newborns from severely diabetic mothers presented growth restriction, but mild diabetes was not associated with an increased rate of macrosomic fetuses.
Experimental models of severe diabetes during pregnancy reproduced maternal and fetal outcomes of pregnant women presenting uncontrolled clinical diabetes. On the other hand, the mild diabetes model caused mild hyperglycemia during pregnancy, although it was not enough to reproduce the increased rate of macrosomic fetuses seen in women with gestational diabetes.
Assessment of Surgical Skills and Competency.
Bhatti, Nasir I
2017-10-01
Evaluation of surgical skills and competency is an important aspect of the medical education process. Measurable and reproducible methods of assessment with objective feedback are essential components of surgical training. Objective Structured Assessment of Technical Skills (OSATS) is widely used across the medical specialties, and otolaryngology-specific tools have been developed and validated for sinus and mastoid surgery. Although assessment of surgical skills can be time-consuming and requires human and financial resources, new evaluation methods and emerging technology may alleviate these barriers while also improving data collection practices. Copyright © 2017 Elsevier Inc. All rights reserved.
Maessen, J G; Phelps, B; Dekker, A L A J; Dijkman, B
2004-05-01
To optimize resynchronization in biventricular pacing with epicardial leads, mapping to determine the best pacing site is a prerequisite. A port-access surgical mapping technique was developed that allowed multiple pacing site selection and reproducible lead evaluation and implantation. Pressure-volume loop analysis was used for real-time guidance in targeting epicardial lead placement. Even the smallest changes in lead position revealed significantly different functional results. Optimizing the pacing site with this technique allowed functional improvement of up to 40% versus random pacing site selection.
A one-dimensional interactive soil-atmosphere model for testing formulations of surface hydrology
NASA Technical Reports Server (NTRS)
Koster, Randal D.; Eagleson, Peter S.
1990-01-01
A model representing a soil-atmosphere column in a GCM is developed for off-line testing of GCM soil hydrology parameterizations. Repeating three representative GCM sensitivity experiments with this one-dimensional model demonstrates that, to first order, the model reproduces a GCM's sensitivity to imposed changes in parameterization and therefore captures the essential physics of the GCM. The experiments also show that by allowing feedback between the soil and atmosphere, the model improves on off-line tests that rely on prescribed precipitation, radiation, and other surface forcing.
Decision analysis in clinical cardiology: When is coronary angiography required in aortic stenosis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Georgeson, S.; Meyer, K.B.; Pauker, S.G.
1990-03-15
Decision analysis offers a reproducible, explicit approach to complex clinical decisions. It consists of developing a model, typically a decision tree, that separates choices from chances and that specifies and assigns relative values to outcomes. Sensitivity analysis allows exploration of alternative assumptions. Cost-effectiveness analysis shows the relation between dollars spent and improved health outcomes achieved. In a tutorial format, this approach is applied to the decision whether to perform coronary angiography in a patient who requires aortic valve replacement for critical aortic stenosis.
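The decision-tree arithmetic described here can be illustrated with a minimal sketch (the probabilities and utilities below are invented placeholders, not values from the article):

```python
# Chance nodes average outcome utilities weighted by probability; choice
# nodes pick the branch with the best expected utility.
def chance(branches):
    """branches: iterable of (probability, utility) pairs."""
    return sum(p * u for p, u in branches)

def choice(options):
    """options: dict mapping strategy name -> expected utility."""
    return max(options, key=options.get)

# Hypothetical two-strategy tree for the angiography decision:
ev_angio = chance([(0.7, 0.90), (0.3, 0.60)])     # expected utility 0.81
ev_no_angio = chance([(0.8, 0.85), (0.2, 0.50)])  # expected utility 0.78
best = choice({"angiography": ev_angio, "no angiography": ev_no_angio})
# Sensitivity analysis amounts to recomputing `best` while varying inputs.
```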
Middle-ear microsurgery simulation to improve new robotic procedures.
Kazmitcheff, Guillaume; Nguyen, Yann; Miroir, Mathieu; Péan, Fabien; Ferrary, Evelyne; Cotin, Stéphane; Sterkers, Olivier; Duriez, Christian
2014-01-01
Otological microsurgery is delicate and requires high dexterity under poor ergonomic conditions. To assist surgeons in these indications, a teleoperated system, called RobOtol, has been developed. This robot enhances gesture accuracy and handiness and allows exploration of new procedures for middle ear surgery. To plan new procedures that exploit the capacities given by the robot, a surgical simulator was developed. The simulation reproduces with high fidelity the behavior of the anatomical structures and can also be used as a training tool to help surgeons learn to control the robot more easily. In this paper, we introduce the middle ear surgical simulation and then virtually perform two challenging procedures with the robot. We show how interactive simulation can assist in analyzing the benefits of robotics in the case of complex manipulations or ergonomics studies and allow the development of innovative surgical procedures. New robot-based microsurgical procedures are investigated. The improvement offered by RobOtol is also evaluated and discussed.
NASA Astrophysics Data System (ADS)
McLean, N. M.; Condon, D. J.; Bowring, S. A.; Schoene, B.; Dutton, A.; Rubin, K. H.
2015-12-01
The last two decades have seen a grassroots effort by the international geochronology community to "calibrate Earth history through teamwork and cooperation," both as part of the EARTHTIME initiative and through several daughter projects with similar goals. Its mission originally challenged laboratories "to produce temporal constraints with uncertainties approaching 0.1% of the radioisotopic ages," but EARTHTIME has since exceeded its charge in many ways. Both the U-Pb and Ar-Ar chronometers first considered for high-precision timescale calibration now regularly produce dates at the sub-per mil level thanks to instrumentation, laboratory, and software advances. At the same time, new isotope systems, including U-Th dating of carbonates, have developed comparable precision. But the larger, inter-related scientific challenges envisioned at EARTHTIME's inception remain: for instance, precisely calibrating the global geologic timescale, estimating rates of change around major climatic perturbations, and understanding evolutionary rates through time. These increasingly require that data from multiple geochronometers be combined. To solve these problems, the next two decades of uranium-daughter geochronology will require further advances in accuracy, precision, and reproducibility. The U-Th system has much in common with U-Pb, in that both parent and daughter isotopes are solids that can easily be weighed and dissolved in acid, and both have well-characterized reference materials certified for isotopic composition and/or purity. For U-Pb, improving lab-to-lab reproducibility has entailed dissolving precisely weighed U and Pb metals of known purity and isotopic composition together to make gravimetric solutions, then using these to calibrate widely distributed tracers composed of artificial U and Pb isotopes.
To mimic laboratory measurements, naturally occurring U and Pb isotopes were also mixed in proportions to mimic samples of three different ages, to be run as internal standards and as measures of inter-laboratory reproducibility. The U-Th community is undertaking many of the same protocols, and has recently created publicly available gravimetric solutions, and large volumes of three age solutions for widespread distribution and inter-laboratory comparison.
Eichler, Marko; Römer, Robert; Grodrian, Andreas; Lemke, Karen; Nagel, Krees; Klages, Claus‐Peter; Gastrock, Gunter
2017-01-01
Although the great potential of droplet-based microfluidic technologies for routine applications in industry and academia has been successfully demonstrated over the past years, their inherent potential has not yet been fully exploited. This is especially true of droplet generation reproducibility and stability, two pivotally important parameters for successful applications, where there is still a need for improvement. It is even more critical when droplets are created to investigate tissue fragments or cell cultures (e.g. suspended cells or 3D cell cultures) over days or even weeks. In this study we present microfluidic chips composed of a plasma-coated polymer, which allow surfactant-free, highly reproducible and stable droplet generation from fluids like cell culture media. We demonstrate how different microfluidic designs and different flow rates (and flow rate ratios) affect the reproducibility of the droplet generation process and display the applicability for a wide variety of bio(techno)logically relevant media. PMID:29399017
Guo, Qi; Shen, Shu-Ting
2016-04-29
There are two major classes of cardiac tissue models: the ionic model and the FitzHugh-Nagumo model. During computer simulation, each model entails solving a system of complex ordinary differential equations and a partial differential equation with no-flux boundary conditions. The reproducing kernel method has significant applications in solving partial differential equations. The derivative of the reproducing kernel function is a wavelet function, which has local properties and sensitivity to singularities; study of the application of reproducing kernels is therefore advantageous. The objective was to apply new mathematical theory to the numerical solution of the ventricular muscle model so as to improve its precision in comparison with existing methods. A two-dimensional reproducing kernel function in space was constructed and applied to computing the solution of a two-dimensional cardiac tissue model, using the difference method in time and the reproducing kernel method in space. Compared with other methods, this method holds several advantages, such as high accuracy in computing solutions, insensitivity to different time steps and a slow propagation speed of error. It is suitable for disorderly scattered node systems without meshing, and can arbitrarily change the location and density of the solution on different time layers. The reproducing kernel method has higher solution accuracy and stability in the solutions of the two-dimensional cardiac tissue model.
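As a reference point for the FitzHugh-Nagumo class mentioned above, its local kinetics can be integrated with a simple forward-Euler step (textbook parameter values, not those of the study, and omitting the 2D diffusion term that the paper solves with the reproducing kernel method):

```python
def fhn_step(v, w, dt, a=0.7, b=0.8, eps=0.08, stim=0.5):
    """One forward-Euler step of the FitzHugh-Nagumo ODEs."""
    dv = v - v**3 / 3.0 - w + stim   # fast (excitation) variable
    dw = eps * (v + a - b * w)       # slow (recovery) variable
    return v + dt * dv, w + dt * dw

# Integrate one cell to t = 50 with dt = 0.05 (values chosen for
# illustration only):
v, w = -1.0, 1.0
for _ in range(1000):
    v, w = fhn_step(v, w, 0.05)
```

In the full tissue model, a diffusion term couples neighboring points of the 2D domain, and that spatial part is where the reproducing kernel discretization replaces a conventional mesh-based scheme.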
Opinion: Is science really facing a reproducibility crisis, and do we need it to?
Fanelli, Daniele
2018-01-01
Efforts to improve the reproducibility and integrity of science are typically justified by a narrative of crisis, according to which most published results are unreliable due to growing problems with research and publication practices. This article provides an overview of recent evidence suggesting that this narrative is mistaken, and argues that a narrative of epochal changes and empowerment of scientists would be more accurate, inspiring, and compelling. PMID:29531051
Long-term reproducibility of relative sensitivity factors obtained with CAMECA Wf
NASA Astrophysics Data System (ADS)
Gui, D.; Xing, Z. X.; Huang, Y. H.; Mo, Z. Q.; Hua, Y. N.; Zhao, S. P.; Cha, L. Z.
2008-12-01
As wafer sizes continue to increase and the feature size of integrated circuits (IC) continues to shrink, process control of IC manufacturing becomes ever more important to reduce the cost of failures caused by the drift of processes or equipment. Characterization tools with high precision and reproducibility are required to capture any abnormality of the process. Although secondary ion mass spectrometry (SIMS) has been widely used in dopant profile control, it has been reported that magnetic sector SIMS, compared to quadrupole SIMS, has lower short-term repeatability and long-term reproducibility due to the high extraction field applied between the sample and the extraction lens. In this paper, we demonstrate that the CAMECA Wf can deliver high long-term reproducibility because of its high level of automation and improved immersion lens design. The relative standard deviations (R.S.D.) of the relative sensitivity factors (RSF) of three typical elements, i.e., boron (B), phosphorus (P) and nitrogen (N), over 3 years are 3.7%, 5.5% and 4.1%, respectively. These high reproducibility results have the practical implication that deviation can be estimated without testing the standards.
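The reproducibility metric quoted above is the relative standard deviation of repeated RSF determinations; it can be sketched as follows (the RSF values are invented placeholders, not the paper's data):

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation (R.S.D.) in percent."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical boron RSF determinations collected over several years:
boron_rsf = [5.1e22, 5.3e22, 4.9e22, 5.2e22, 5.0e22]
print(f"B RSF R.S.D. = {rsd_percent(boron_rsf):.1f}%")
```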
Singularity: Scientific containers for mobility of compute.
Kurtzer, Gregory M; Sochat, Vanessa; Bauer, Michael W
2017-01-01
Here we present Singularity, software developed to bring containers and reproducibility to scientific computing. Using Singularity containers, developers can work in reproducible environments of their choosing and design, and these complete environments can easily be copied and executed on other platforms. Singularity is an open source initiative that harnesses the expertise of system and software engineers and researchers alike, and integrates seamlessly into common workflows for both of these groups. As its primary use case, Singularity brings mobility of computing to both users and HPC centers, providing a secure means to capture and distribute software and compute environments. This ability to create and deploy reproducible environments across these centers, a previously unmet need, makes Singularity a game changing development for computational science.
Equivalent parameter model of 1-3 piezocomposite with a sandwich polymer
NASA Astrophysics Data System (ADS)
Zhang, Yanjun; Wang, Likun; Qin, Lei
2018-06-01
A theoretical model was developed to investigate the performance of 1-3 piezoelectric composites with a sandwich polymer. Effective parameters, such as the electromechanical coupling factor, longitudinal velocity, and characteristic acoustic impedance of the piezocomposite, were predicted using the developed model. The influences of volume fractions and components of the polymer phase on the effective parameters of the piezoelectric composite were studied. The theoretical model was verified experimentally. The proposed model can reproduce the effective parameters of 1-3 piezoelectric composites with a sandwich polymer in the thickness mode. The measured electromechanical coupling factor was improved by more than 9.8% over the PZT/resin 1-3 piezoelectric composite.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fried, David V.; Graduate School of Biomedical Sciences, The University of Texas Health Science Center at Houston, Houston, Texas; Tucker, Susan L.
2014-11-15
Purpose: To determine whether pretreatment CT texture features can improve patient risk stratification beyond conventional prognostic factors (CPFs) in stage III non-small cell lung cancer (NSCLC). Methods and Materials: We retrospectively reviewed 91 cases with stage III NSCLC treated with definitive chemoradiation therapy. All patients underwent pretreatment diagnostic contrast-enhanced computed tomography (CE-CT) followed by 4-dimensional CT (4D-CT) for treatment simulation. We used the average-CT and expiratory (T50-CT) images from the 4D-CT along with the CE-CT for texture extraction. Histogram, gradient, co-occurrence, gray tone difference, and filtration-based techniques were used for texture feature extraction. Penalized Cox regression implementing cross-validation was used for covariate selection and modeling. Models incorporating texture features from the 33 image types and CPFs were compared to models incorporating CPFs alone for overall survival (OS), local-regional control (LRC), and freedom from distant metastases (FFDM). Predictive Kaplan-Meier curves were generated using leave-one-out cross-validation. Patients were stratified based on whether their predicted outcome was above or below the median. Reproducibility of texture features was evaluated using test-retest scans from independent patients and quantified using concordance correlation coefficients (CCC). We compared models incorporating the reproducibility seen on test-retest scans to our original models and determined the classification reproducibility. Results: Models incorporating both texture features and CPFs demonstrated a significant improvement in risk stratification compared to models using CPFs alone for OS (P=.046), LRC (P=.01), and FFDM (P=.005). The average CCCs were 0.89, 0.91, and 0.67 for texture features extracted from the average-CT, T50-CT, and CE-CT, respectively.
Incorporating reproducibility within our models yielded 80.4% (±3.7% SD), 78.3% (±4.0% SD), and 78.8% (±3.9% SD) classification reproducibility in terms of OS, LRC, and FFDM, respectively. Conclusions: Pretreatment tumor texture may provide prognostic information beyond that obtained from CPFs. Models incorporating feature reproducibility achieved classification rates of ∼80%. External validation would be required to establish texture as a prognostic factor.
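The test-retest agreement metric used above, Lin's concordance correlation coefficient, can be computed as follows (the feature values below are invented for illustration):

```python
def ccc(x, y):
    """Lin's concordance correlation coefficient between paired measurements."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = sum((a - mx) ** 2 for a in x) / n           # population variances
    sy = sum((b - my) ** 2 for b in y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    # Penalizes both poor correlation and shifts in location/scale:
    return 2.0 * sxy / (sx + sy + (mx - my) ** 2)

# Hypothetical test-retest values of one texture feature in four patients:
test = [1.0, 2.0, 3.0, 4.0]
retest = [1.1, 1.9, 3.2, 3.9]
print(ccc(test, retest))  # close to 1 -> a highly reproducible feature
```

Unlike Pearson correlation, CCC drops below 1 when the retest values are systematically shifted or rescaled, which is why it is preferred for agreement studies.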
Qi, H.; Coplen, T.B.; Wassenaar, L.I.
2011-01-01
It is well known that N2 in the ion source of a mass spectrometer interferes with the CO background during the δ18O measurement of carbon monoxide. A similar problem arises with the high-temperature conversion (HTC) analysis of nitrogenous O-bearing samples (e.g. nitrates and keratins) to CO for δ18O measurement, where the sample introduces a significant N2 peak before the CO peak, making determination of accurate oxygen isotope ratios difficult. Although using a gas chromatography (GC) column longer than the 0.6 m commonly provided by manufacturers can improve the separation of CO and N2, and using a valve to divert nitrogen and prevent it from entering the ion source improved measurement results, biased δ18O values could still be obtained. A careful evaluation of the performance of the GC separation column was therefore carried out. With optimal GC columns, the δ18O reproducibility of human hair keratins and other keratin materials was better than ±0.15 ‰ (n = 5; internal analytical reproducibility) and better than ±0.10 ‰ (n = 4; external analytical reproducibility).
NASA Astrophysics Data System (ADS)
Mountris, K. A.; Bert, J.; Noailly, J.; Rodriguez Aguilera, A.; Valeri, A.; Pradier, O.; Schick, U.; Promayon, E.; Gonzalez Ballester, M. A.; Troccaz, J.; Visvikis, D.
2017-03-01
Prostate volume changes due to edema occurrence during transperineal permanent brachytherapy should be taken under consideration to ensure optimal dose delivery. Available edema models, based on prostate volume observations, face several limitations. Therefore, patient-specific models need to be developed to accurately account for the impact of edema. In this study we present a biomechanical model developed to reproduce edema resolution patterns documented in the literature. Using the biphasic mixture theory and finite element analysis, the proposed model takes into consideration the mechanical properties of the pubic area tissues in the evolution of prostate edema. The model’s computed deformations are incorporated in a Monte Carlo simulation to investigate their effect on post-operative dosimetry. The comparison of Day1 and Day30 dosimetry results demonstrates the capability of the proposed model for patient-specific dosimetry improvements, considering the edema dynamics. The proposed model shows excellent ability to reproduce previously described edema resolution patterns and was validated based on previous findings. According to our results, for a prostate volume increase of 10-20% the Day30 urethra D10 dose metric is higher by 4.2%-10.5% compared to the Day1 value. The introduction of the edema dynamics in Day30 dosimetry shows a significant global dose overestimation identified on the conventional static Day30 dosimetry. In conclusion, the proposed edema biomechanical model can improve the treatment planning of transperineal permanent brachytherapy accounting for post-implant dose alterations during the planning procedure.
Computerized nailfold video capillaroscopy--a new tool for assessment of Raynaud's phenomenon.
Anderson, Marina E; Allen, P Danny; Moore, Tonia; Hillier, Val; Taylor, Christopher J; Herrick, Ariane L
2005-05-01
To develop a computer based nailfold video capillaroscopy system with enhanced image quality and to assess its disease-subgroup resolving power in patients with primary and secondary Raynaud's phenomenon (RP). Using frame registration software, digitized video images from the microscope were combined to form a panoramic mosaic of the nailfold. Capillary dimensions (apex, arterial, venous, and total width) and density were measured onscreen. Significantly, the new system could guarantee analysis of the same set of capillaries by 2 observers. Forty-eight healthy control subjects, 21 patients with primary RP, 40 patients with limited cutaneous systemic sclerosis (lcSSc), and 11 patients with diffuse cutaneous SSc (dcSSc) were studied. Intra- and interobserver variability were calculated in a subset of 30 subjects. The number of loops/mm was significantly lower, and all 4 capillary dimensions significantly greater, in SSc patients versus controls plus primary RP patients (p < 0.001 for all measures). When comparing control (+ primary RP) patients with SSc patients (lcSSc + dcSSc) the most powerful discriminator was found to be the number of loops/mm. Results for intra- and interobserver reproducibility showed that the limits of agreement were closer when both observers measured the same capillaries. The key feature of the newly developed system is that it improves reproducibility of nailfold capillary measurements by allowing reidentification of the same capillaries by different observers. By allowing access to previous measurements, the new system should improve reliability in longitudinal studies, and therefore has the potential of being a valuable outcome measure of microvessel disease/involvement in clinical trials of scleroderma spectrum disorders.
Reproducibility of radiomics for deciphering tumor phenotype with imaging
NASA Astrophysics Data System (ADS)
Zhao, Binsheng; Tan, Yongqiang; Tsai, Wei-Yann; Qi, Jing; Xie, Chuanmiao; Lu, Lin; Schwartz, Lawrence H.
2016-03-01
Radiomics (radiogenomics) characterizes tumor phenotypes based on quantitative image features derived from routine radiologic imaging to improve cancer diagnosis, prognosis, prediction and response to therapy. Although radiomic features must be reproducible to qualify as biomarkers for clinical care, little is known about how routine imaging acquisition techniques/parameters affect reproducibility. To begin to fill this knowledge gap, we assessed the reproducibility of a comprehensive, commonly-used set of radiomic features using a unique, same-day repeat computed tomography data set from lung cancer patients. Each scan was reconstructed at 6 imaging settings, varying slice thicknesses (1.25 mm, 2.5 mm and 5 mm) and reconstruction algorithms (sharp, smooth). Reproducibility was assessed using the repeat scans reconstructed at identical imaging setting (6 settings in total). In separate analyses, we explored differences in radiomic features due to different imaging parameters by assessing the agreement of these radiomic features extracted from the repeat scans reconstructed at the same slice thickness but different algorithms (3 settings in total). Our data suggest that radiomic features are reproducible over a wide range of imaging settings. However, smooth and sharp reconstruction algorithms should not be used interchangeably. These findings will raise awareness of the importance of properly setting imaging acquisition parameters in radiomics/radiogenomics research.
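Test-retest reproducibility of a radiomic feature between repeat scans is often quantified with Lin's concordance correlation coefficient; the abstract does not name its exact statistic, so this is an illustrative sketch with hypothetical feature values.

```python
import numpy as np

def concordance_ccc(x, y):
    """Lin's concordance correlation coefficient between paired measurements."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()          # population variances
    cov = ((x - mx) * (y - my)).mean()
    return 2 * cov / (vx + vy + (mx - my) ** 2)

# Hypothetical radiomic feature measured on same-day repeat scans of 5 patients
scan1 = [0.82, 1.10, 0.95, 1.30, 0.70]
scan2 = [0.80, 1.12, 0.97, 1.28, 0.69]
print(round(concordance_ccc(scan1, scan2), 3))  # → 0.996
```

A CCC close to 1 indicates the feature is reproducible under that acquisition setting; values computed across reconstruction algorithms would reveal the smooth-versus-sharp discrepancy the study reports.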
An Open Science Peer Review Oath
Aleksic, Jelena; Alexa, Adrian; Attwood, Teresa K; Chue Hong, Neil; Dahlö, Martin; Davey, Robert; Dinkel, Holger; Förstner, Konrad U; Grigorov, Ivo; Hériché, Jean-Karim; Lahti, Leo; MacLean, Dan; Markie, Michael L; Molloy, Jenny; Schneider, Maria Victoria; Scott, Camille; Smith-Unna, Richard; Vieira, Bruno Miguel
2015-01-01
One of the foundations of the scientific method is to be able to reproduce experiments and corroborate the results of research that has been done before. However, with the increasing complexities of new technologies and techniques, coupled with the specialisation of experiments, reproducing research findings has become a growing challenge. Clearly, scientific methods must be conveyed succinctly, and with clarity and rigour, in order for research to be reproducible. Here, we propose steps to help increase the transparency of the scientific method and the reproducibility of research results: specifically, we introduce a peer-review oath and accompanying manifesto. These have been designed to offer guidelines to enable reviewers (with the minimum friction or bias) to follow and apply open science principles, and support the ideas of transparency, reproducibility and ultimately greater societal impact. Introducing the oath and manifesto at the stage of peer review will help to check that the research being published includes everything that other researchers would need to successfully repeat the work. Peer review is the lynchpin of the publishing system: encouraging the community to consciously (and conscientiously) uphold these principles should help to improve published papers, increase confidence in the reproducibility of the work and, ultimately, provide strategic benefits to authors and their institutions. PMID:25653839
Reproducibility of the spectral components of the electroencephalogram during driver fatigue.
Lal, Saroj K L; Craig, Ashley
2005-02-01
To date, no study has tested the reproducibility of EEG changes that occur during driver fatigue. For the EEG changes to be useful in the development of a fatigue countermeasure device the EEG response during each onset period of fatigue in individuals needs to be reproducible. It should be noted that fatigue during driving is not a continuous process but consists of successive episodes of 'microsleeps' where the subject may go in and out of a fatigue state. The aim of the present study was to investigate the reproducibility of fatigue during driving in both professional and non-professional drivers. Thirty five non-professional drivers and twenty professional drivers were tested during two separate sessions of a driver simulator task. EEG, EOG and behavioural measurements of fatigue were obtained during the driving task. The results showed high reproducibility for the delta and theta bands (r>0.95) in both groups of drivers. The results are discussed in light of implications for future studies and for the development of an EEG based fatigue countermeasure device.
Cherk, Martin H; Ky, Jason; Yap, Kenneth S K; Campbell, Patrina; McGrath, Catherine; Bailey, Michael; Kalff, Victor
2012-08-01
To evaluate the reproducibility of serial re-acquisitions of gated Tl-201 and Tc-99m sestamibi left ventricular ejection fraction (LVEF) measurements obtained on a new generation solid-state cardiac camera system during myocardial perfusion imaging and the importance of manual operator optimization of left ventricular wall tracking. Resting blinded automated (auto) and manual operator-optimized (opt) LVEF measurements were obtained using ECT toolbox (ECT) and Cedars-Sinai QGS software in two separate cohorts of 55 Tc-99m sestamibi (MIBI) and 50 thallium (Tl-201) myocardial perfusion studies (MPS) acquired in both supine and prone positions on a cadmium zinc telluride (CZT) solid-state camera system. Resting supine and prone automated LVEF measurements were similarly obtained in a further separate cohort of 52 gated cardiac blood pool scans (GCBPS) for validation of methodology and comparison. Bland-Altman, chi-squared, and Levene's equality-of-variance tests were used to analyse the resultant data comparisons. For all radiotracer and software combinations, manual checking and optimization of valve planes (+/- centre radius with ECT software) resulted in significant improvement in MPS LVEF reproducibility that approached that of planar GCBPS. No difference was demonstrated between optimized MIBI/Tl-201 QGS and planar GCBPS LVEF reproducibility (P = .17 and P = .48, respectively). ECT required significantly more manual optimization than QGS software in both supine and prone positions, independent of the radiotracer used (P < .02). Reproducibility of gated sestamibi and Tl-201 LVEF measurements obtained during myocardial perfusion imaging with the ECT toolbox or QGS software packages using a new generation solid-state cardiac camera with improved image quality approaches that of planar GCBPS; however, it requires visual quality control and operator optimization of left ventricular wall tracking for best results. Using this superior cardiac technology, Tl-201 reproducibility also appears at least equivalent to sestamibi for measuring LVEF.
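The Bland-Altman analysis named in the abstract reduces to a bias and 95% limits of agreement over paired measurements; a minimal sketch with hypothetical paired LVEF readings (values are illustrative, not from the study).

```python
import numpy as np

def bland_altman_limits(a, b):
    """Bland-Altman bias and 95% limits of agreement for paired measurements."""
    d = np.asarray(a, float) - np.asarray(b, float)
    bias = d.mean()
    sd = d.std(ddof=1)                 # sample SD of the paired differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired LVEF (%) readings: automated vs operator-optimized
auto = [55, 62, 48, 70, 58, 64]
opt  = [57, 61, 50, 69, 60, 63]
bias, lo, hi = bland_altman_limits(auto, opt)
```

Narrower limits of agreement between repeat acquisitions correspond to the improved reproducibility the study reports after operator optimization.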
Key parameters in testing biodegradation of bio-based materials in soil.
Briassoulis, D; Mistriotis, A
2018-09-01
Biodegradation of plastics in soil is currently tested by international standard testing methods (e.g. ISO 17556-12 or ASTM D5988-12). Although these testing methods have been developed for plastics, it has been shown in project KBBPPS that they can be extended to lubricants with small modifications. Reproducibility is a critical issue for biodegradation tests in the laboratory. Among the main testing variables are the soil types and the nutrients available (mainly nitrogen). For this reason, the effect of the soil type on the biodegradation rates of various bio-based materials (cellulose and lubricants) was tested for five different natural soil types (loam, loamy sand, clay, clay-loam, and silt-loam organic). It was shown that use of samples containing 1 g of C in a substrate of 300 g of soil, with the addition of 0.1 g of N as nutrient, strongly improves the reproducibility of the test, making the results practically independent of the soil type, with the exception of the organic soil. The sandy soil was found to need the addition of a higher amount of nutrients to exhibit biodegradation rates similar to those achieved with the other soil types. Therefore, natural soils can be used for standard biodegradation tests of bio-based materials, yielding reproducible results with the addition of appropriate nutrients. Copyright © 2018 Elsevier Ltd. All rights reserved.
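The reported recipe (1 g C plus 0.1 g N per 300 g soil) corresponds to a C:N ratio of roughly 10:1. A small illustrative helper for scaling the nitrogen addition to a sample's carbon content; the function name and example values are hypothetical, not from the standard.

```python
def nitrogen_needed(sample_mass_g, carbon_fraction, c_to_n=10.0):
    """Grams of N to add so the test substrate keeps the ~10:1 C:N ratio
    implied by the study's recipe (1 g C : 0.1 g N per 300 g soil)."""
    carbon_g = sample_mass_g * carbon_fraction
    return carbon_g / c_to_n

# e.g. 2.4 g of cellulose, which is ~44.4% carbon by mass
print(round(nitrogen_needed(2.4, 0.444), 3))  # → 0.107
```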
Göbl, Rüdiger; Navab, Nassir; Hennersperger, Christoph
2018-06-01
Research in ultrasound imaging is limited in reproducibility by two factors: first, many existing ultrasound pipelines are protected by intellectual property, rendering exchange of code difficult; second, most pipelines are implemented in special hardware, resulting in limited flexibility of the processing steps implemented on such platforms. With SUPRA, we propose an open-source pipeline for fully software-defined ultrasound processing for real-time applications to alleviate these problems. Covering all steps from beamforming to output of B-mode images, SUPRA can help improve the reproducibility of results and make modifications to the image acquisition mode accessible to the research community. We evaluate the pipeline qualitatively, quantitatively, and regarding its run time. The pipeline shows image quality comparable to a clinical system and, as backed by point-spread-function measurements, comparable resolution. Including all processing stages of a usual ultrasound pipeline, the run-time analysis shows that it can be executed in 2D and 3D on consumer GPUs in real time. Our software ultrasound pipeline opens up research in image acquisition. Given access to ultrasound data from early stages (raw channel data, radiofrequency data), it simplifies development in imaging. Furthermore, it tackles the reproducibility of research results, as code can be shared easily and even be executed without dedicated ultrasound hardware.
Towards robust and repeatable sampling methods in eDNA based studies.
Dickie, Ian A; Boyer, Stephane; Buckley, Hannah; Duncan, Richard P; Gardner, Paul; Hogg, Ian D; Holdaway, Robert J; Lear, Gavin; Makiola, Andreas; Morales, Sergio E; Powell, Jeff R; Weaver, Louise
2018-05-26
DNA based techniques are increasingly used for measuring the biodiversity (species presence, identity, abundance and community composition) of terrestrial and aquatic ecosystems. While there are numerous reviews of molecular methods and bioinformatic steps, there has been little consideration of the methods used to collect samples upon which these later steps are based. This represents a critical knowledge gap, as methodologically sound field sampling is the foundation for subsequent analyses. We reviewed field sampling methods used for metabarcoding studies of both terrestrial and freshwater ecosystem biodiversity over a nearly three-year period (n = 75). We found that 95% (n = 71) of these studies used subjective sampling methods, inappropriate field methods, and/or failed to provide critical methodological information. It would be possible for researchers to replicate only 5% of the metabarcoding studies in our sample, a poorer level of reproducibility than for ecological studies in general. Our findings suggest greater attention to field sampling methods and reporting is necessary in eDNA-based studies of biodiversity to ensure robust outcomes and future reproducibility. Methods must be fully and accurately reported, and protocols developed that minimise subjectivity. Standardisation of sampling protocols would be one way to help to improve reproducibility, and have additional benefits in allowing compilation and comparison of data from across studies. This article is protected by copyright. All rights reserved.
NASA Astrophysics Data System (ADS)
Lemieux, Louis
2001-07-01
A new fully automatic algorithm for the segmentation of the brain and cerebro-spinal fluid (CSF) from T1-weighted volume MRI scans of the head was specifically developed in the context of serial intra-cranial volumetry. The method is an extension of a previously published brain extraction algorithm. The brain mask is used as a basis for CSF segmentation based on morphological operations, automatic histogram analysis and thresholding. Brain segmentation is then obtained by iterative tracking of the brain-CSF interface. Grey matter (GM), white matter (WM) and CSF volumes are calculated based on a model of intensity probability distribution that includes partial volume effects. Accuracy was assessed using a digital phantom scan. Reproducibility was assessed by segmenting pairs of scans from 20 normal subjects scanned 8 months apart and 11 patients with epilepsy scanned 3.5 years apart. Segmentation accuracy as measured by overlap was 98% for the brain and 96% for the intra-cranial tissues. The volume errors were: total brain (TBV): -1.0%, intra-cranial (ICV): 0.1%, CSF: +4.8%. For repeated scans, matching resulted in improved reproducibility. In the controls, the coefficient of reliability (CR) was 1.5% for the TBV and 1.0% for the ICV. In the patients, the CR for the ICV was 1.2%.
NASA Astrophysics Data System (ADS)
Turner, M. A.
2015-12-01
Because of a lack of centralized planning and no widely-adopted standards among hydrological modeling research groups, research communities, and the data management teams meant to support research, there is chaos when it comes to data formats, spatio-temporal resolutions, ontologies, and data availability. All this makes true scientific reproducibility and collaborative integrated modeling impossible without some glue to piece it all together. Our Virtual Watershed Integrated Modeling System provides the tools and modeling framework hydrologists need to accelerate and fortify new scientific investigations by tracking provenance and providing adaptors for integrated, collaborative hydrologic modeling and data management. Under global warming trends where water resources are under increasing stress, reproducible hydrological modeling will be increasingly important to improve transparency and understanding of the scientific facts revealed through modeling. The Virtual Watershed Data Engine is capable of ingesting a wide variety of heterogeneous model inputs, outputs, model configurations, and metadata. We will demonstrate one example, starting from real-time raw weather station data packaged with station metadata. Our integrated modeling system will then create gridded input data via geostatistical methods along with error and uncertainty estimates. These gridded data are then used as input to hydrological models, all of which are available as web services wherever feasible. Models may be integrated in a data-centric way where the outputs too are tracked and used as inputs to "downstream" models. This work is part of an ongoing collaborative Tri-state (New Mexico, Nevada, Idaho) NSF EPSCoR Project, WC-WAVE, comprised of researchers from multiple universities in each of the three states. The tools produced and presented here have been developed collaboratively alongside watershed scientists to address specific modeling problems with an eye on the bigger picture of scientific reproducibility and transparency, and data publication and reuse.
Cloud-based Jupyter Notebooks for Water Data Analysis
NASA Astrophysics Data System (ADS)
Castronova, A. M.; Brazil, L.; Seul, M.
2017-12-01
The development and adoption of technologies by the water science community to improve our ability to openly collaborate and share workflows will have a transformative impact on how we address the challenges associated with collaborative and reproducible scientific research. Jupyter notebooks offer one solution by providing an open-source platform for creating metadata-rich toolchains for modeling and data analysis applications. Adoption of this technology within the water sciences, coupled with publicly available datasets from agencies such as USGS, NASA, and EPA, enables researchers to easily prototype and execute data intensive toolchains. Moreover, implementing this software stack in a cloud-based environment extends its native functionality to provide researchers a mechanism to build and execute toolchains that are too large or computationally demanding for typical desktop computers. Additionally, this cloud-based solution enables scientists to disseminate data processing routines alongside journal publications in an effort to support reproducibility. For example, these data collection and analysis toolchains can be shared, archived, and published using the HydroShare platform or downloaded and executed locally to reproduce scientific analysis. This work presents the design and implementation of a cloud-based Jupyter environment and its application for collecting, aggregating, and munging various datasets in a transparent, sharable, and self-documented manner. The goals of this work are to establish a free and open source platform for domain scientists to (1) conduct data intensive and computationally intensive collaborative research, (2) utilize high performance libraries, models, and routines within a pre-configured cloud environment, and (3) enable dissemination of research products. This presentation will discuss recent efforts towards achieving these goals, and describe the architectural design of the notebook server in an effort to support collaborative and reproducible science.
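As a minimal illustration of the kind of data-munging cell such a notebook might contain, the sketch below aggregates a sub-daily streamflow series to daily means with pandas; the series values and names are hypothetical, not drawn from any agency service.

```python
import pandas as pd

# Hypothetical 6-hourly streamflow record of the kind a notebook
# toolchain might pull from an agency web service
idx = pd.date_range("2017-06-01", periods=8, freq="6h")
flow = pd.Series([3.2, 3.4, 3.1, 3.3, 4.0, 4.2, 4.1, 3.9], index=idx,
                 name="discharge_cms")

daily = flow.resample("D").mean()     # aggregate to daily mean discharge
print(daily)
```

Keeping such transformations in a shared, executable notebook (rather than a local spreadsheet) is precisely the reproducibility mechanism the abstract describes.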
Two-Finger Tightness: What Is It? Measuring Torque and Reproducibility in a Simulated Model.
Acker, William B; Tai, Bruce L; Belmont, Barry; Shih, Albert J; Irwin, Todd A; Holmes, James R
2016-05-01
Residents in training are often directed to insert screws using "two-finger tightness" to impart adequate torque but minimize the chance of a screw stripping in bone. This study seeks to quantify and describe two-finger tightness and to assess the variability of its application by residents in training. Cortical bone was simulated using a polyurethane foam block (30-pcf density) that was prepared with predrilled holes for tightening 3.5 × 14-mm long cortical screws and mounted to a custom-built apparatus on a load cell to capture torque data. Thirty-three residents in training, ranging from the first through fifth years of residency, along with 8 staff members, were directed to tighten 6 screws to two-finger tightness in the test block, and peak torque values were recorded. The participants were blinded to their torque values. Stripping torque (2.73 ± 0.56 N·m) was determined from 36 trials and served as a threshold for failed screw placement. The average torques varied substantially with regard to absolute torque values, thus poorly defining two-finger tightness. Junior residents reproduced torque less consistently than the other groups (0.29 and 0.32, respectively). These data quantify absolute values of two-finger tightness but demonstrate considerable variability in absolute torque values, percentage of stripping torque, and ability to consistently reproduce given torque levels. Increased years in training are weakly correlated with reproducibility, but experience does not seem to affect absolute torque levels. These results question the usefulness of two-finger tightness as a teaching tool and highlight the need for improvement in resident motor skill training and development within a teaching curriculum. Torque-measuring devices may be useful simulation tools for this purpose.
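A hedged sketch of the kind of per-participant summary the study describes: mean applied torque, its percentage of the measured stripping torque (2.73 N·m), and a coefficient of variation as a generic stand-in for the study's reproducibility index, which the abstract does not fully specify. The trial values are hypothetical.

```python
import numpy as np

STRIP_TORQUE = 2.73                    # mean stripping torque from the study, N*m

def summarize_trials(torques):
    """Mean torque, % of stripping torque, and coefficient of variation
    (CV used here as a generic reproducibility index)."""
    t = np.asarray(torques, float)
    m = t.mean()
    return m, 100 * m / STRIP_TORQUE, t.std(ddof=1) / m

# Hypothetical six screw-tightening trials by one participant (N*m)
mean, pct_strip, cv = summarize_trials([1.1, 1.3, 0.9, 1.2, 1.0, 1.4])
```

A lower CV across a participant's six screws would indicate more consistent reproduction of "two-finger tightness"; the percentage of stripping torque shows the safety margin before screw failure.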
Shrestha, Archana; Koju, Rajendra Prasad; Beresford, Shirley A A; Chan, Kwun Chuen Gary; Connell, Frederik A; Karmacharya, Biraj Man; Shrestha, Pramita; Fitzpatrick, Annette L
2017-08-01
We developed a food frequency questionnaire (FFQ) designed to measure the dietary practices of adult Nepalese. The present study examined the validity and reproducibility of the FFQ. To evaluate the reproducibility of the FFQ, 116 subjects completed the 115-item FFQ twice across a four-month interval. Six 24-h dietary recalls were collected (1 each month) to assess the validity of the FFQ. Seven major food groups and 23 subgroups were clustered from the FFQ based on macronutrient composition. Spearman correlation coefficients evaluating reproducibility were greater than 0.5 for all food groups, with the exception of oil. The correlations varied from 0.41 (oil) to 0.81 (vegetables). All crude Spearman coefficients for validity were greater than 0.5 except for dairy products, pizzas/pastas and sausage/burgers. The FFQ was found to be reliable and valid for ranking the intake of food groups in the Nepalese diet.
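For untied data, the Spearman coefficients reported here follow the classic rank-difference formula; a self-contained sketch with hypothetical intake scores (not the study's data).

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman rank correlation (no ties) via the classic d^2 formula."""
    x, y = np.asarray(x), np.asarray(y)
    n = len(x)
    rx = np.argsort(np.argsort(x)) + 1   # ranks 1..n
    ry = np.argsort(np.argsort(y)) + 1
    d2 = ((rx - ry) ** 2).sum()
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical vegetable-intake scores (servings/week) from the same
# subjects on two FFQ administrations four months apart
ffq1 = [14, 21, 7, 28, 10, 18, 25, 12]
ffq2 = [15, 19, 8, 26, 12, 17, 27, 11]
print(round(spearman_rho(ffq1, ffq2), 3))  # → 0.952
```

Values above 0.5, as in the study, are conventionally taken to indicate acceptable test-retest reproducibility for ranking subjects by intake.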
Characterization of Residues from the Detonation of Insensitive Munitions
Unfortunately, many energetic compounds are toxic or harmful to the environment and human health. The US Army Cold Regions Research and Engineering Laboratory and Defence Research and Development Canada Valcartier have developed methods through SERDP and ESTCP programs that enable a reproducible method for energetics residues characterization research. SERDP Project ER-2219 is focused on three areas: determining mass deposition and ...
Kramer, Christian; Gedeck, Peter; Meuwly, Markus
2013-03-12
Distributed atomic multipole (MTP) moments promise significant improvements over point charges (PCs) in molecular force fields, as they (a) more realistically reproduce the ab initio electrostatic potential (ESP) and (b) can capture anisotropic atomic properties such as lone pairs, conjugated systems, and σ holes. The present work focuses on the question of whether multipolar electrostatics instead of PCs in standard force fields leads to quantitative improvements in reproducing intermolecular interactions. To this end, the interaction energies of two model systems, benzonitrile (BZN) and formamide (FAM) homodimers, are characterized over a wide range of dimer conformations. It is found that although MTPs capture the monomer ab initio ESP about an order of magnitude better than PCs, this does not directly translate into a better description of ab initio interaction energies. Neither ESP-fitted MTPs nor refitted Lennard-Jones (LJ) parameters alone demonstrate a clear superiority of atomic MTPs. We show that only if both electrostatic and LJ parameters are jointly optimized in standard, nonpolarizable force fields are atomic MTPs clearly beneficial for reproducing ab initio dimerization energies. After an exhaustive exponent scan, we find that for both BZN and FAM, atomic MTPs and a 9-6 LJ potential can reproduce ab initio interaction energies with ∼30% less error (RMSD 0.13 vs 0.18 kcal/mol) than PCs and a 12-6 LJ potential. We also find that the improvement due to using MTPs is considerably more pronounced with a 9-6 LJ potential than with a 12-6 LJ potential (≈10%; RMSD 0.19 versus 0.21 kcal/mol).
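For reference, the two LJ forms compared in the abstract can be written in the well-depth/minimum-position parameterization; this is one common convention, and the paper's exact functional form and parameters may differ.

```python
def lj_12_6(r, eps, r0):
    """12-6 Lennard-Jones, parameterized by well depth eps and minimum at r0:
    V(r) = eps * [(r0/r)^12 - 2*(r0/r)^6]."""
    s = r0 / r
    return eps * (s ** 12 - 2 * s ** 6)

def lj_9_6(r, eps, r0):
    """9-6 Lennard-Jones with the same well depth and minimum position:
    V(r) = eps * [2*(r0/r)^9 - 3*(r0/r)^6] (softer repulsive wall)."""
    s = r0 / r
    return eps * (2 * s ** 9 - 3 * s ** 6)

# Both forms reach -eps at r = r0; the 9-6 wall rises more gently
print(round(lj_9_6(1.0, 0.5, 1.0), 3))  # → -0.5
```

The softer r^-9 repulsion is one plausible reason the 9-6 form pairs better with MTP electrostatics in the exponent scan the authors describe.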
The role of numerical simulation for the development of an advanced HIFU system
NASA Astrophysics Data System (ADS)
Okita, Kohei; Narumi, Ryuta; Azuma, Takashi; Takagi, Shu; Matumoto, Yoichiro
2014-10-01
High-intensity focused ultrasound (HIFU) has been used clinically and is under clinical trials to treat various diseases. An advanced HIFU system employs ultrasound techniques for guidance during HIFU treatment instead of the magnetic resonance imaging used in current HIFU systems. HIFU beam imaging for monitoring the HIFU beam and localized motion imaging for treatment validation of tissue are introduced briefly as the real-time ultrasound monitoring techniques. Numerical simulations have a great impact on the development of real-time ultrasound monitoring as well as on improving the safety and efficacy of treatment in advanced HIFU systems. A HIFU simulator was developed to reproduce ultrasound propagation through the body in consideration of the elasticity of tissue, and was validated by comparison with in vitro experiments in which ultrasound emitted from a phased-array transducer propagates through an acrylic plate acting as a bone phantom. As a result, the defocus and distortion of the ultrasound propagating through the acrylic plate in the simulation quantitatively agree with the experimental results. Therefore, the HIFU simulator accurately reproduces ultrasound propagation through a medium whose shape and physical properties are well known. In addition, it is experimentally confirmed that simulation-assisted focus control of the phased-array transducer enables efficient assignment of the focus to the target. Simulation-assisted focus control can thus contribute to the design of transducers and to treatment planning.
Rupp, Rüdiger; Kreilinger, Alex; Rohm, Martin; Kaiser, Vera; Müller-Putz, Gernot R
2012-01-01
Over the last decade, the restoration of missing hand function by application of neuroprostheses, in particular the implantable Freehand system, has been successfully demonstrated in individuals with high spinal cord injury. The clinically proven advantages of the Freehand system are its ease of use, the reproducible generation of two distinct functional grasp patterns, and an analog control scheme based on movements of the contralateral shoulder. However, the Freehand system has not been commercially available for more than ten years, and alternative grasp neuroprostheses with comparable functionality are still missing. Therefore, the aim of this study was to develop a non-invasive neuroprosthesis and to show that a degree of functional restoration can be provided to end users comparable to implanted devices. By introduction of an easy-to-handle forearm electrode sleeve, the reproducible generation of two grasp patterns has been achieved. Generated grasp forces of the palmar grasp are in the range of the implanted system. Though the pinch force of the lateral grasp is significantly lower, it can be used effectively by a tetraplegic subject to perform functional tasks. The non-invasive grasp neuroprosthesis developed in this work may serve as an easy-to-apply and inexpensive way to restore missing hand and finger function at any time after spinal cord injury.
NASA Astrophysics Data System (ADS)
Wang, I.-Ting; Chang, Chih-Cheng; Chiu, Li-Wen; Chou, Teyuh; Hou, Tuo-Hung
2016-09-01
The implementation of highly anticipated hardware neural networks (HNNs) hinges largely on the successful development of a low-power, high-density, and reliable analog electronic synaptic array. In this study, we demonstrate a two-layer Ta/TaOx/TiO2/Ti cross-point synaptic array that emulates the high-density three-dimensional network architecture of human brains. Excellent uniformity and reproducibility among intralayer and interlayer cells were realized. Moreover, at least 50 analog synaptic weight states could be precisely controlled with minimal drifting during a cycling endurance test of 5000 training pulses at an operating voltage of 3 V. We also propose a new state-independent bipolar-pulse-training scheme to improve the linearity of weight updates. The improved linearity considerably enhances the fault tolerance of HNNs, thus improving the training accuracy.
Di Giuseppe, Antonella M A; Giarretta, Nicola; Lippert, Martina; Severino, Valeria; Di Maro, Antimo
2015-02-15
In 2013, following the scandal of undeclared horse meat found in various processed beef products across Europe, several research efforts were undertaken to protect consumer health. In this framework, an improved UPLC separation method has been developed to detect the presence of horse myoglobin in raw meat samples. The separation of both horse and beef myoglobins was achieved in only seven minutes. The methodology was refined by preparing mixtures with different composition percentages of horse and beef meat. By using myoglobin as a marker, low amounts (0.50 mg/0.50 g, w/w; ∼0.1%) of horse meat can be detected and quantified in minced raw meat samples with high reproducibility and sensitivity, thus offering a valid alternative to conventional PCR techniques. Copyright © 2014 Elsevier Ltd. All rights reserved.
Feasibility of spirometry in primary care to screen for COPD: a pilot study
Giraud, Violaine; Beauchet, Alain; Gomis, Thierry; Chinet, Thierry
2016-01-01
Background COPD is a frequent but underdiagnosed disease whose diagnosis relies on the spirometric demonstration of bronchial obstruction. Spirometry use by general practitioners could represent the first line in COPD diagnosis. Objective Because the duration of spirometry is slowing its adoption in primary care, we decided to measure the time it requires in the primary-care context in France. Methods Ten volunteer general practitioners were trained during two 3-hour theoretical and practical continuing education sessions. Then, from October 2013 to May 2014, they included patients without any known respiratory disease but at risk of developing COPD (age: ≥40 years, smoker: ≥20 pack-years). The duration of spirometry and its quality were evaluated according to the following acceptability criteria: 1) expiration ≥6 seconds or reaching a plateau; 2) good start with an early peak flow, curve peaked on top and not flat; 3) no artifacts; and 4) reproducibility criteria, i.e., forced expiratory volume in 1 second and forced vital capacity differences between the two best spirometry curves ≤0.15 L. Quality of the spirograms was defined as optimal when all the criteria were met and acceptable when all the criteria were satisfied except the reproducibility criterion; otherwise, it was unacceptable. Results For the 152 patients included, the 142 assessable spirometries lasted 15.2±5.9 minutes. Acceptability criteria 1–3, respectively, were satisfied for 90.1%, 89.4%, and 91.5% of patients, and reproducibility criterion 4 for 56.3%. Quality was considered optimal for 58.5% of the curves and acceptable for 30.2%. Conclusion The duration of spirometry renders it poorly compatible with current primary-care practice in France other than for dedicated consultations. Moreover, the quality of spirometry needs to be improved. PMID:26929617
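The four-criteria grading described above maps naturally onto a small decision routine; this is a simplified sketch, and the function name, boolean inputs, and collapsed plateau check are assumptions rather than the study's actual scoring tool.

```python
def grade_spirometry(expiration_s, good_start, no_artifacts, dfev1_l, dfvc_l):
    """Grade a spirometry session against the study's four criteria:
    1) expiration >= 6 s (plateau check simplified away), 2) good start,
    3) no artifacts, 4) FEV1 and FVC differences between the two best
    curves <= 0.15 L (reproducibility)."""
    acceptable = expiration_s >= 6 and good_start and no_artifacts
    reproducible = dfev1_l <= 0.15 and dfvc_l <= 0.15
    if acceptable and reproducible:
        return "optimal"
    if acceptable:
        return "acceptable"
    return "unacceptable"

print(grade_spirometry(7.0, True, True, 0.10, 0.12))  # → optimal
```

With this rule, only failures of criterion 4 separate "optimal" from "acceptable", which matches the study's finding that reproducibility was the limiting criterion (56.3% satisfied).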
Kosa, Gergely; Vuoristo, Kiira S; Horn, Svein Jarle; Zimmermann, Boris; Afseth, Nils Kristian; Kohler, Achim; Shapaval, Volha
2018-06-01
Recent developments in molecular biology and metabolic engineering have resulted in a large increase in the number of strains that need to be tested, positioning high-throughput screening of microorganisms as an important step in bioprocess development. Scalability is crucial for performing reliable screening of microorganisms. Most scalability studies from microplate screening systems to controlled stirred-tank bioreactors have so far been performed with unicellular microorganisms. We compared cultivation of industrially relevant oleaginous filamentous fungi and a microalga in a Duetz-microtiter plate system to benchtop and pre-pilot bioreactors. Maximal glucose consumption rate, biomass concentration, lipid content of the biomass, and biomass and lipid yield values showed good scalability for the filamentous fungi Mucor circinelloides (less than 20% differences) and Mortierella alpina (less than 30% differences). Maximal glucose consumption and biomass production rates were identical for Crypthecodinium cohnii in the microtiter plate and the benchtop bioreactor. Most likely due to the shear stress sensitivity of this microalga in the stirred bioreactor, biomass concentration and lipid content of biomass were significantly higher in the microtiter plate system than in the benchtop bioreactor. Still, fermentation results obtained in the Duetz-microtiter plate system for Crypthecodinium cohnii are encouraging compared with what has been reported in the literature. Good reproducibility (coefficient of variation less than 15% for biomass growth, glucose consumption, lipid content, and pH) was achieved in the Duetz-microtiter plate system for Mucor circinelloides and Crypthecodinium cohnii. Mortierella alpina cultivation reproducibility might be improved with inoculation optimization. In conclusion, we have demonstrated the suitability of the Duetz-microtiter plate system for the reproducible, scalable, and cost-efficient high-throughput screening of oleaginous microorganisms.
Kohno, Ryosuke; Hotta, Kenji; Matsuura, Taeko; Matsubara, Kana; Nishioka, Shie; Nishio, Teiji; Kawashima, Mitsuhiko; Ogino, Takashi
2011-04-04
We experimentally evaluated the proton beam dose reproducibility, sensitivity, angular dependence and depth-dose relationships for a new Metal Oxide Semiconductor Field Effect Transistor (MOSFET) detector. The detector was fabricated with a thinner oxide layer and was operated at high-bias voltages. In order to accurately measure dose distributions, we developed a practical method for correcting the MOSFET response to proton beams. The detector was tested by examining lateral dose profiles formed by protons passing through an L-shaped bolus. The dose reproducibility, angular dependence and depth-dose response were evaluated using a 190 MeV proton beam. Depth-output curves produced using the MOSFET detectors were compared with results obtained using an ionization chamber (IC). Since accurate measurements of proton dose distribution require correction for LET effects, we developed a simple dose-weighted correction method. The correction factors were determined as a function of proton penetration depth, or residual range. The residual proton range at each measurement point was calculated using the pencil beam algorithm. Lateral measurements in a phantom were obtained for pristine and SOBP beams. The reproducibility of the MOSFET detector was within 2%, and the angular dependence was less than 9%. The detector exhibited a good response at the Bragg peak (0.74 relative to the IC detector). For dose distributions resulting from protons passing through an L-shaped bolus, the corrected MOSFET dose agreed well with the IC results. Absolute proton dosimetry can be performed using MOSFET detectors to a precision of about 3% (1 sigma). A thinner oxide layer thickness improved the LET in proton dosimetry. By employing correction methods for LET dependence, it is possible to measure absolute proton dose using MOSFET detectors.
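The dose-weighted correction described above amounts to scaling each MOSFET reading by a correction factor looked up at that point's residual proton range. A minimal sketch of the mechanics follows; the calibration table is invented for illustration and is not the study's measured factors:

```python
import numpy as np

# Illustrative calibration only: correction factor vs. residual proton
# range (cm). In the study the factors were derived by comparison with an
# ionization chamber; these numbers just show the mechanics.
RESIDUAL_RANGE_CM = np.array([0.5, 2.0, 5.0, 10.0, 20.0])
CORRECTION_FACTOR = np.array([1.35, 1.15, 1.08, 1.03, 1.00])

def corrected_dose(mosfet_reading, residual_range_cm):
    """Scale a raw MOSFET reading by the LET correction factor
    interpolated at the measurement point's residual range."""
    factor = np.interp(residual_range_cm, RESIDUAL_RANGE_CM, CORRECTION_FACTOR)
    return mosfet_reading * factor
```

The factor rises toward the end of range, where LET increases and the detector under-responds (cf. the 0.74 relative response at the Bragg peak).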
Sochat, Vanessa
2018-05-01
Here, we present the Scientific Filesystem (SCIF), an organizational format that supports exposure of executables and metadata for discoverability of scientific applications. The format includes a known filesystem structure, a definition for a set of environment variables describing it, and functions for generation of the variables and interaction with the libraries, metadata, and executables located within. SCIF makes it easy to expose metadata, multiple environments, installation steps, files, and entry points to render scientific applications consistent, modular, and discoverable. A SCIF can be installed on a traditional host or in a container technology such as Docker or Singularity. We start by reviewing the background and rationale for the SCIF, followed by an overview of the specification and the different levels of internal modules ("apps") that the organizational format affords. Finally, we demonstrate that SCIF is useful by implementing and discussing several use cases that improve user interaction and understanding of scientific applications. SCIF is released along with a client and integration in the Singularity 2.4 software to quickly install and interact with SCIF. When used inside of a reproducible container, a SCIF is a recipe for reproducibility and introspection of the functions and users that it serves. We use SCIF to evaluate container software, provide metrics, serve scientific workflows, and execute a primary function under different contexts. To encourage collaboration and sharing of applications, we developed tools along with an open source, version-controlled, tested, and programmatically accessible web infrastructure. SCIF and associated resources are available at https://sci-f.github.io. The ease of using SCIF, especially in the context of containers, offers promise for scientists' work to be self-documenting and programmatically parseable for maximum reproducibility.
SCIF abstracts away the underlying programming languages and packaging logic of scientific applications, opening new opportunities for scientific software development.
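As a rough illustration of the "known filesystem structure plus derived environment variables" idea, the sketch below lists app directories under a SCIF base and derives one variable per app. The variable-name pattern here is a placeholder; the authoritative names and layout are defined by the specification at https://sci-f.github.io:

```python
import os

def discover_scif_apps(base="/scif/apps"):
    """Return {ENV_NAME: path} for each app directory under a SCIF base.

    Assumes the conventional SCIF layout (one subdirectory per app under
    /scif/apps); the environment-variable naming shown is illustrative,
    not the exact names mandated by the SCIF specification.
    """
    if not os.path.isdir(base):
        return {}
    env = {}
    for app in sorted(os.listdir(base)):
        root = os.path.join(base, app)
        if os.path.isdir(root):
            env["SCIF_APPROOT_" + app.upper()] = root
    return env
```

Such discovery is what lets a client enumerate an image's apps without knowing anything about how each one was packaged.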
NASA Astrophysics Data System (ADS)
He, Pengbo; Li, Qiang; Zhao, Ting; Liu, Xinguo; Dai, Zhongying; Ma, Yuanyuan
2016-12-01
A synchrotron-based heavy-ion accelerator operates in pulse mode at a low repetition rate that is comparable to a patient’s breathing rate. To overcome inefficiencies and interplay effects between the residual motion of the target and the scanned heavy-ion beam delivery process for conventional free breathing (FB)-based gating therapy, a novel respiratory guidance method was developed to help patients synchronize their breathing patterns with the synchrotron excitation patterns by performing short breath holds with the aid of a personalized audio-visual biofeedback (BFB) system. The purpose of this study was to evaluate the treatment precision, efficiency and reproducibility of the respiratory guidance method in scanned heavy-ion beam delivery mode. Using 96 breathing traces from eight healthy volunteers who were asked to breathe freely and guided to perform short breath holds with the aid of BFB, a series of dedicated four-dimensional dose calculations (4DDC) was performed on a geometric model that was developed assuming a linear relationship between external surrogate and internal tumor motions. The outcome of the 4DDCs was quantified in terms of the treatment time, dose-volume histograms (DVH) and dose homogeneity index. Our results show that with the respiratory guidance method the treatment efficiency increased by a factor of 2.23-3.94 compared with FB gating, depending on the duty cycle settings. The magnitude of dose inhomogeneity for the respiratory guidance methods was 7.5 times less than that of the non-gated irradiation, and good reproducibility of breathing guidance among different fractions was achieved. Thus, our study indicates that the respiratory guidance method not only improved the overall treatment efficiency of respiratory-gated scanned heavy-ion beam delivery, but also had the advantages of lower dose uncertainty and better reproducibility among fractions.
The MIMIC Code Repository: enabling reproducibility in critical care research.
Johnson, Alistair Ew; Stone, David J; Celi, Leo A; Pollard, Tom J
2018-01-01
Lack of reproducibility in medical studies is a barrier to the generation of a robust knowledge base to support clinical decision-making. In this paper we outline the Medical Information Mart for Intensive Care (MIMIC) Code Repository, a centralized code base for generating reproducible studies on an openly available critical care dataset. Code is provided to load the data into a relational structure, create extractions of the data, and reproduce entire analysis plans including research studies. Concepts extracted include severity of illness scores, comorbid status, administrative definitions of sepsis, physiologic criteria for sepsis, organ failure scores, treatment administration, and more. Executable documents are used for tutorials and reproduce published studies end-to-end, providing a template for future researchers to replicate. The repository's issue tracker enables community discussion about the data and concepts, allowing users to collaboratively improve the resource. The centralized repository provides a platform for users of the data to interact directly with the data generators, facilitating greater understanding of the data. It also provides a location for the community to collaborate on necessary concepts for research progress and share them with a larger audience. Consistent application of the same code for underlying concepts is a key step in ensuring that research studies on the MIMIC database are comparable and reproducible. By providing open source code alongside the freely accessible MIMIC-III database, we enable end-to-end reproducible analysis of electronic health records. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.
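The core idea above, defining each clinical concept once in shared code so that every downstream study reuses identical logic, can be sketched with an in-memory SQL view. The table, columns, and "tachycardia" concept below are toy examples, not the actual MIMIC-III schema or repository code:

```python
import sqlite3

# Toy schema and concept; the real tables and concept definitions live in
# the MIMIC Code Repository alongside the MIMIC-III database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE admissions (hadm_id INTEGER, heart_rate INTEGER)")
conn.executemany("INSERT INTO admissions VALUES (?, ?)", [(1, 135), (2, 80)])
# Define the concept once as a view so every study that queries
# 'tachycardia' applies exactly the same criterion.
conn.execute("CREATE VIEW tachycardia AS "
             "SELECT hadm_id FROM admissions WHERE heart_rate > 100")
rows = conn.execute("SELECT hadm_id FROM tachycardia").fetchall()
```

Centralizing the view (rather than letting each study re-implement the threshold) is what makes results on the same data comparable across papers.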
NASA Astrophysics Data System (ADS)
Wigren, Roger; Erlandsson, Ragnar
1996-01-01
We present a method based on pre- and postscanning a piezoelectric tube scanner used in a force probe that improves the reproducibility of the scan lengths. Instead of prescanning in the same direction as when acquiring data (the z direction), which could destroy a sensitive surface, we perform lateral (x/y direction) prescans. As lateral motions of the tube scanner involve out of phase elongations and compressions of the tube in the z direction, these kinds of prescans will have a stabilizing effect on the z motion as well. By adding an additional postscan in the ±z directions, we reduce the piezoelectric creep following the data acquisition scan. When comparing the lengths of z scans with and without the pre/postscan procedure, preceded by a z voltage step 60 s before data acquisition, the deviation between four consecutive scans improved from 12% to 1.4%.
Koncz, Rebecca; Wolfenden, Fiona; Hassed, Craig; Chambers, Richard; Cohen, Julia; Glozier, Nicholas
2016-10-01
The aim of this study was to evaluate the effectiveness of a 6-week mindfulness-based stress release program (SRP) on stress and work engagement in full-time university employees. Perceived stress, workplace wellbeing, and engagement were measured at baseline and within 1 week of the SRP completion, and contemporaneously 6 weeks apart for a waitlist control group. A second program was implemented to examine reproducibility of results. Fifty participants undertook the SRPs, and 29 participants were waitlisted. A significant improvement in distress, workplace wellbeing, and vigor was observed within the first SRP group, when compared with the control group. The improvement in distress and wellbeing was reproduced in the second SRP group. This study adds to the growing body of research suggesting that mindfulness may be an effective method for reducing workplace stress, improving employee wellbeing, and enhancing work engagement.
Roff, E J; Hosking, S L; Barnes, D A
2001-05-01
The recommended contour line (CL) location with the Heidelberg Retina Tomograph (HRT) is on the inner edge of Elschnig's scleral ring. This study investigated HRT parameter reproducibility when: (i) the CL size is altered relative to Elschnig's ring; (ii) the CL is either redrawn or imported between images. Using the HRT, seven 10 degree images were acquired for 10 normal volunteers and 10 primary open angle glaucoma (POAG) subjects. A CL was drawn on one image for each subject using Elschnig's scleral ring for reference and imported into subsequent images. The CL diameter was then (a) increased by 50 microns; (b) increased by 100 microns; and (c) decreased by 50 microns. To investigate the effect of the method of contour line transfer between images, a CL was: (1) defined for one image and imported to 6 subsequent images; (2) drawn separately for each image. Parameter variability improved for the normal group as the size of the CL increased relative to Elschnig's ring, but was unchanged in the POAG group. The export/import function (method 1) resulted in better parameter reproducibility than the redrawing method for both groups and should therefore be used for transferring CLs across images for the same subject. Increasing the overall CL size relative to Elschnig's scleral ring improved the reproducibility of the measured parameters in the normal group; no significant difference in parameter variability was observed for the POAG group. This suggests that the reproducibility of HRT images is affected more by the variation in topography between images than by changes in CL definition.
Kang, Koung Mi; Choi, Seung Hong; Kim, Dong Eun; Yun, Tae Jin; Kim, Ji-Hoon; Sohn, Chul-Ho; Park, Sun-Won
2017-07-10
To prospectively evaluate whether cardiac gating can improve the reproducibility of intravoxel incoherent motion (IVIM) parameters in the head and neck, we performed IVIM diffusion-weighted imaging (DWI) using 4 b values (4b), 4 b values with cardiac gating (4b gating) and 17 b values (17b). We performed IVIM DWI twice per person on nine healthy volunteers using 4b, 4b gating and 17b and five patients with head and neck masses using 4b gating and 17b. The ADC, perfusion fraction (f), diffusion coefficient (D) and perfusion-related diffusion coefficient (D*) were calculated in the brain, masticator muscle, parotid gland, submandibular gland, tonsil and masses. Intraclass correlation coefficient (ICC), Bland-Altman analysis (BAA) and coefficient of variation (CV) were used to assess short-term test-retest reproducibility. Kruskal-Wallis and Mann-Whitney tests were used to investigate whether 4b, 4b gating or 17b had significant influences on the parameters. For normal tissues and masses, ICC was excellent for all maps except the D* map. All parameters showed the lowest CV with 4b gating. BAA also revealed the narrowest 95% limits of agreement using 4b gating for all parameters. In the subgroup analysis, almost all parameters in brain, muscle, parotid gland and submandibular gland showed the best reproducibility using 4b gating. In the muscle, parotid gland and submandibular gland, the values of ADC, f and D were not significantly different among the three methods. 4b gating was more reproducible with respect to measurements of IVIM parameters in comparison with 4b or 17b.
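The test-retest metrics named above (bias, Bland-Altman 95% limits of agreement, and a coefficient of variation) can be computed as sketched below. The within-subject CV estimator shown is one common variant derived from paired differences and may differ from the authors' exact definition:

```python
import statistics

def reproducibility_metrics(scan1, scan2):
    """Bias, Bland-Altman 95% limits of agreement, and a within-subject
    coefficient of variation (%) for paired test-retest measurements
    (e.g., per-region ADC or D values from two scans)."""
    diffs = [a - b for a, b in zip(scan1, scan2)]
    means = [(a + b) / 2 for a, b in zip(scan1, scan2)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    limits = (bias - 1.96 * sd, bias + 1.96 * sd)
    # One common within-subject CV estimator from paired differences.
    wcv = 100 * (statistics.mean([(d / m) ** 2
                                  for d, m in zip(diffs, means)]) / 2) ** 0.5
    return bias, limits, wcv
```

Narrower limits of agreement and a lower CV, as reported for 4b gating, both indicate better short-term reproducibility.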
The impact of preclinical irreproducibility on drug development.
Freedman, L P; Gibson, M C
2015-01-01
The development of novel therapeutics depends and builds upon the validity and reproducibility of previously published data and findings. Yet irreproducibility is pervasive in preclinical life science research and can be traced to cumulative errors or flaws in several areas, including reference materials, study design, laboratory protocols, and data collection and analysis. The expanded development and use of consensus-based standards and well-documented best practices is needed to both enhance reproducibility and drive therapeutic innovations. © 2014 ASCPT.
Hoggan, Rita E.; Zuck, Larry D.; Cannon, W. Roger; ...
2016-05-26
A study of improved methods of processing fuel pellets was undertaken using ceria and zirconia/yttria/alumina as surrogates. Through proper granulation and vertical vibration (tapping) of the parts bag prior to dry bag isostatic pressing (DBIP), reproducibility of diameter profiles among multiple pellets of ceria was improved by almost an order of magnitude. Reproducibility of sintered pellets was sufficiently good to possibly avoid grinding. Deviation from the mean diameter along the length of multiple pellets, as well as deviation from roundness, decreased after sintering. This is not generally observed with dry-pressed pellets. Thus it is possible to machine to tolerance before sintering if grinding is necessary.
Noise suppressed partial volume correction for cardiac SPECT/CT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chan, Chung; Liu, Chi, E-mail: chi.liu@yale.edu
Purpose: Partial volume correction (PVC) methods typically improve quantification at the expense of increased image noise and reduced reproducibility. In this study, the authors developed a novel voxel-based PVC method that incorporates anatomical knowledge to improve quantification while suppressing noise for cardiac SPECT/CT imaging. Methods: In the proposed method, the SPECT images were first reconstructed using anatomical-based maximum a posteriori (AMAP) with Bowsher’s prior to penalize noise while preserving boundaries. A sequential voxel-by-voxel PVC approach (Yang’s method) was then applied on the AMAP reconstruction using a template response. This template response was obtained by forward projecting a template derived from a contrast-enhanced CT image, and then reconstructed using AMAP to model the partial volume effects (PVEs) introduced by both the system resolution and the smoothing applied during reconstruction. To evaluate the proposed noise suppressed PVC (NS-PVC), the authors first simulated two types of cardiac SPECT studies: a 99mTc-tetrofosmin myocardial perfusion scan and a 99mTc-labeled red blood cell (RBC) scan on a dedicated cardiac multiple pinhole SPECT/CT at both high and low count levels. The authors then applied the proposed method on a canine equilibrium blood pool study following injection with 99mTc-RBCs at different count levels by rebinning the list-mode data into shorter acquisitions. The proposed method was compared to MLEM reconstruction without PVC, two conventional PVC methods, including Yang’s method and multitarget correction (MTC) applied on the MLEM reconstruction, and AMAP reconstruction without PVC. Results: The results showed that Yang’s method improved quantification but yielded increased noise and reduced reproducibility in the regions with higher activity.
MTC corrected for PVEs on high-count data but amplified noise, and yielded the worst performance among all the methods tested on low-count data. AMAP effectively suppressed noise and reduced the spill-in effect in the low activity regions. However, it was unable to reduce the spill-out effect in high activity regions. NS-PVC yielded superior performance in terms of both quantitative assessment and visual image quality while improving reproducibility. Conclusions: The results suggest that NS-PVC may be a promising PVC algorithm for application in low-dose protocols, and in gated and dynamic cardiac studies with low counts.
Jaiswal, Alok; Peddinti, Gopal; Akimov, Yevhen; Wennerberg, Krister; Kuznetsov, Sergey; Tang, Jing; Aittokallio, Tero
2017-06-01
Genome-wide loss-of-function profiling is widely used for systematic identification of genetic dependencies in cancer cells; however, the poor reproducibility of RNA interference (RNAi) screens has been a major concern due to frequent off-target effects. Currently, a detailed understanding of the key factors contributing to the sub-optimal consistency is still lacking, especially regarding how to improve the reliability of future RNAi screens by controlling for factors that determine their off-target propensity. We performed a systematic, quantitative analysis of the consistency between two genome-wide shRNA screens conducted on a compendium of cancer cell lines, and also compared several gene summarization methods for inferring gene essentiality from shRNA level data. We then devised novel concepts of seed essentiality and shRNA family, based on seed region sequences of shRNAs, to study in-depth the contribution of seed-mediated off-target effects to the consistency of the two screens. We further investigated two seed-sequence properties, seed pairing stability and target abundance, in terms of their capability to minimize the off-target effects in post-screening data analysis. Finally, we applied this novel methodology to identify genetic interactions and synthetic lethal partners of cancer drivers, and confirmed differential essentiality phenotypes by detailed CRISPR/Cas9 experiments. Using the novel concepts of seed essentiality and shRNA family, we demonstrate how genome-wide loss-of-function profiling of a common set of cancer cell lines can actually be made fairly reproducible when considering seed-mediated off-target effects. Importantly, by excluding shRNAs having a higher propensity for off-target effects, based on their seed-sequence properties, one can remove noise from genome-wide shRNA datasets.
As a translational application case, we demonstrate enhanced reproducibility of genetic interaction partners of common cancer drivers, as well as identify novel synthetic lethal partners of a major oncogenic driver, PIK3CA, supported by a complementary CRISPR/Cas9 experiment. We provide practical guidelines for improved design and analysis of genome-wide loss-of-function profiling and demonstrate how this novel strategy can be applied towards improved mapping of genetic dependencies of cancer cells to aid development of targeted anticancer treatments.
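The seed-based grouping into "shRNA families" can be sketched as follows. The seed is taken here as guide-strand positions 2–8, a common definition that may not match the study's exact choice, and all shRNA names and sequences below are invented:

```python
from collections import defaultdict

def seed_sequence(guide):
    """Seed region of a guide strand, taken here as positions 2-8
    (0-based slice [1:8]); the study's exact definition may differ."""
    return guide[1:8]

def shrna_families(shrnas):
    """Group shRNAs sharing an identical seed into 'shRNA families';
    members of a family are expected to share seed-mediated off-target
    effects, so consistent family-level phenotypes flag off-targeting."""
    families = defaultdict(list)
    for name, guide in shrnas.items():
        families[seed_sequence(guide)].append(name)
    return dict(families)
```

Filtering out shRNAs whose families show seed-driven phenotypes is one way to apply the noise-removal strategy described above.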
Wenger, Nikolaus; Moraud, Eduardo Martin; Gandar, Jerome; Musienko, Pavel; Capogrosso, Marco; Baud, Laetitia; Le Goff, Camille G.; Barraud, Quentin; Pavlova, Natalia; Dominici, Nadia; Minev, Ivan R.; Asboth, Leonie; Hirsch, Arthur; Duis, Simone; Kreider, Julie; Mortera, Andrea; Haverbeck, Oliver; Kraus, Silvio; Schmitz, Felix; DiGiovanna, Jack; van den Brand, Rubia; Bloch, Jocelyne; Detemple, Peter; Lacour, Stéphanie P.; Bézard, Erwan; Micera, Silvestro; Courtine, Grégoire
2016-01-01
Electrical neuromodulation of lumbar segments improves motor control after spinal cord injury in animal models and humans. However, the physiological principles underlying the effect of this intervention remain poorly understood, which has limited this therapeutic approach to continuous stimulation applied to restricted spinal cord locations. Here, we developed novel stimulation protocols that reproduce the natural dynamics of motoneuron activation during locomotion. For this, we computed the spatiotemporal activation pattern of muscle synergies during locomotion in healthy rats. Computer simulations identified optimal electrode locations to target each synergy through the recruitment of proprioceptive feedback circuits. This framework steered the design of spatially selective spinal implants and real-time control software that modulate extensor versus flexor synergies with precise temporal resolution. Spatiotemporal neuromodulation therapies improved gait quality, weight-bearing capacities, endurance and skilled locomotion in multiple rodent models of spinal cord injury. These new concepts are directly translatable to strategies to improve motor control in humans. PMID:26779815
Bonavina, Luigi; Laface, Letizia; Picozzi, Stefano; Nencioni, Marco; Siboni, Stefano; Bona, Davide; Sironi, Andrea; Sorba, Francesca; Clemente, Claudio
2010-09-01
With the development of tissue banking, a need for homogeneous methods of collection, processing, and storage of tissue has emerged. We describe the implementation of a biological bank in a high-volume, tertiary care University referral center for esophageal cancer surgery. We also propose an original punch biopsy technique of the surgical specimen. The method proved to be simple, reproducible, and not expensive. Unified standards for specimen collection are necessary to improve results of specimen-based diagnostic testing and research in surgical oncology.
Laroucau, K; Colaneri, C; Jaÿ, M; Corde, Y; Drapeau, A; Durand, B; Zientara, S; Beck, C
2016-06-18
To evaluate the routine complement fixation test (CFT) used to detect Burkholderia mallei antibodies in equine sera, an interlaboratory proficiency test was held with 24 European laboratories, including 22 National Reference Laboratories for glanders. The panels sent to participants were composed of sera with or without B mallei antibodies. This study confirmed the reliability of CFT and highlighted its intralaboratory reproducibility. However, the sensitivity of glanders serodiagnosis and laboratory proficiency may be improved by standardising critical reagents, including antigens, and by developing a standard B mallei serum. British Veterinary Association.
Stereotactic radiotherapy for choroidal melanoma: analysis of eye movement during treatment
NASA Astrophysics Data System (ADS)
Souza, F. M. L.; Gonçalves, O. D.; Batista, D. V. S.; Cardoso, S. C.
2018-03-01
This study aims to analyse the eye’s movement during radiotherapy treatment for choroidal melanoma, as well as the methodology used to reposition the patient between treatment sessions. For this purpose, the procedures used by the hospital staff were analysed on site and videos were recorded during the treatments. The methodology used to fix the eye achieves its objective; however, the reproducibility of patient repositioning needs improvement. We recommend investigating fixation via the healthy eye and assessing the feasibility of developing software to assist with patient repositioning.
Moreno-Tetlacuilo, Luz María Ángela; Quezada-Yamamoto, Harumi; Guevara-Ruiseñor, Elsa Susana; Ibarra-Araujo, Nora; Martínez-Gatica, Nora Liliana; Pedraza-Moreno, Roberto
The purpose of this review is to describe and analyze the status of gender violence in medical schools around the world and its consequences for undergraduate students' health and academic development, particularly for female students. The different modalities reported in the literature are presented: gender discrimination, sexism, and sexual harassment, among others. The increase in the number of women in medical schools has not fundamentally improved their condition in these institutions, where androcentrism and gender regimes that favor gender violence are reproduced. This type of violence is a public health, human rights, and academic problem.
[Upper gastrointestinal bleeding: usefulness of prognostic scores].
Badel, S; Dorta, G; Carron, P-N
2011-08-24
Upper gastrointestinal bleeding is a potentially serious event, usually requiring urgent endoscopic treatment. Better stratification of the risk of complication or death could optimize management and improve patient outcomes, while ensuring adequate resource allocation. Several prognostic scores have been developed, in order to identify high risk patients, who require immediate treatment, and patients at low risk for whom endoscopy may be delayed. An ideal prognostic score should be accurate, simple, reproducible, and prospectively validated in different populations. Published scores meet these requirements only partially, and thus can only be used as part of an integrative diagnostic and therapeutic process.
Use of fibroblast growth factor 2 for expansion of chondrocytes and tissue engineering
NASA Technical Reports Server (NTRS)
Vunjak-Novakovic, Gordana (Inventor); Martin, Ivan (Inventor); Freed, Lisa E. (Inventor); Langer, Robert (Inventor)
2003-01-01
The present invention provides an improved method for expanding cells for use in tissue engineering. In particular, the method provides specific biochemical factors to supplement cell culture medium during the expansion process in order to reproduce events occurring during embryonic development, with the goal of regenerating tissue equivalents that resemble natural tissues both structurally and functionally. These specific biochemical factors improve proliferation of the cells and are capable of de-differentiating mature cells isolated from tissue so that the differentiation potential of the cells is preserved. The bioactive molecules also maintain the responsiveness of the cells to other bioactive molecules. Specifically, the invention provides methods for expanding chondrocytes in the presence of fibroblast growth factor 2 for use in regeneration of cartilage tissue.
Köster, Andreas; Spura, Thomas; Rutkai, Gábor; Kessler, Jan; Wiebeler, Hendrik; Vrabec, Jadran; Kühne, Thomas D
2016-07-15
The accuracy of water models derived from ab initio molecular dynamics simulations by means of an improved force-matching scheme is assessed for various thermodynamic, transport, and structural properties. It is found that although the resulting force-matched water models are typically less accurate than fully empirical force fields in predicting thermodynamic properties, they are nevertheless much more accurate than generally appreciated in reproducing the structure of liquid water, in fact surpassing most of the commonly used empirical water models. This development demonstrates the feasibility of routinely parametrizing computationally efficient yet predictive potential energy functions based on accurate ab initio molecular dynamics simulations for a large variety of different systems. © 2016 Wiley Periodicals, Inc.
Studying Tidal Effects In Planetary Systems With Posidonius. A N-Body Simulator Written In Rust.
NASA Astrophysics Data System (ADS)
Blanco-Cuaresma, Sergi; Bolmont, Emeline
2017-10-01
Planetary systems with several planets in compact orbital configurations, such as TRAPPIST-1, are surely affected by tidal effects, and studying these effects provides important insight into their evolution. We developed a second-generation N-body code based on the tidal model used in Mercury-T, re-implementing and improving its functionality using Rust as the programming language (including a Python interface for ease of use) and the WHFAST integrator. The new open source code ensures memory safety and reproducibility of numerical N-body experiments, improves the spin integration compared to Mercury-T, and makes it possible to take into account a new prescription for the dissipation of tidal inertial waves in the convective envelope of stars. Posidonius is also suitable for binary system simulations with evolving stars.
NASA Astrophysics Data System (ADS)
André, M. P.; Galperin, M.; Berry, A.; Ojeda-Fournier, H.; O'Boyle, M.; Olson, L.; Comstock, C.; Taylor, A.; Ledgerwood, M.
Our computer-aided diagnostic (CADx) tool uses advanced image processing and artificial intelligence to analyze findings on breast sonography images. The goal is to standardize reporting of such findings using well-defined descriptors and to improve accuracy and reproducibility of interpretation of breast ultrasound by radiologists. This study examined several factors that may impact accuracy and reproducibility of the CADx software, which proved to be highly accurate and stable over several operating conditions.
A highly sensitive and versatile virus titration assay in the 96-well microplate format.
Borisevich, V; Nistler, R; Hudman, D; Yamshchikov, G; Seregin, A; Yamshchikov, V
2008-02-01
This report describes a fast, reproducible, inexpensive and convenient assay system for virus titration in the 96-well format. The micromethod substantially increases assay throughput and improves the data reproducibility. A highly simplified variant of virus quantification is based on immunohistochemical detection of virus amplification foci obtained without use of agarose or semisolid overlays. It can be incorporated into several types of routine virological assays, successfully replacing the laborious and time-consuming conventional methods based on plaque formation under semisolid overlays. The method does not depend on the development of CPE and can be accommodated to assay viruses with substantial differences in growth properties. The use of enhanced immunohistochemical detection enabled a five- to six-fold reduction of the total assay time. The micromethod was specifically developed to take advantage of multichannel pipettor use to simplify handling of a large number of samples. The method performs well with an inexpensive low-power binocular, thus offering a routine assay system usable outside of a specialized laboratory setting, such as for testing of clinical or field samples. When used in focus reduction-neutralization tests (FRNT), the method accommodates very small volumes of immune serum, which is often a decisive factor in experiments involving small rodent models.
Lee, Eun Jin; Hyun, Jiye; Choi, Yong-Ho; Hurh, Byung-Serk; Choi, Sang-Ho; Lee, Inhyung
2018-06-01
Doenjang (Korean fermented soybean paste) with an improved flavor and safety was prepared by the simultaneous fermentation of autochthonous mixed starters at the pilot plant scale. First, whole soybean meju was fermented by coculturing the safety-verified starters Aspergillus oryzae MJS14 and Bacillus amyloliquefaciens zip6 or Bacillus subtilis D119C. These fermented whole soybean meju were aged in a brine solution after the additional inoculation of Tetragenococcus halophilus 7BDE22 and Zygosaccharomyces rouxii SMY045 to yield doenjang. Four doenjang batches prepared using a combination of mold, bacilli, lactic acid bacteria, and yeast starters were free of safety issues and had the general properties of traditional doenjang, including a rich flavor and taste. All doenjang batches received a high consumer acceptability score, especially the ABsT and ABsTZ batches. This study suggests that flavor-rich doenjang similar to traditional doenjang can be manufactured safely and reproducibly in industry by mimicking the simultaneous fermentation of autochthonous mixed starters found in traditional doenjang fermentation, and the pilot plant process developed here will facilitate such manufacture. © 2018 Institute of Food Technologists®.
Land surface albedo and vegetation feedbacks enhanced the millennium drought in south-east Australia
NASA Astrophysics Data System (ADS)
Evans, Jason P.; Meng, Xianhong; McCabe, Matthew F.
2017-01-01
In this study, we have examined the ability of a regional climate model (RCM) to simulate the extended drought that occurred throughout the period of 2002 through 2007 in south-east Australia. In particular, the ability to reproduce the two drought peaks in 2002 and 2006 was investigated. Overall, the RCM was found to reproduce both the temporal and the spatial structure of the drought-related precipitation anomalies quite well, despite using climatological seasonal surface characteristics such as vegetation fraction and albedo. This result concurs with previous studies that found that about two-thirds of the precipitation decline can be attributed to the El Niño-Southern Oscillation (ENSO). Simulation experiments that allowed the vegetation fraction and albedo to vary as observed illustrated that the intensity of the drought was underestimated by about 10% when using climatological surface characteristics. These results suggest that in terms of drought development, capturing the feedbacks related to vegetation and albedo changes may be as important as capturing the soil moisture-precipitation feedback. In order to improve our modelling of multi-year droughts, the challenge is to capture all these related surface changes simultaneously, and provide a comprehensive description of land surface-precipitation feedback during the drought's development.
van 't Klooster, Ronald; de Koning, Patrick J H; Dehnavi, Reza Alizadeh; Tamsma, Jouke T; de Roos, Albert; Reiber, Johan H C; van der Geest, Rob J
2012-01-01
To develop and validate an automated segmentation technique for the detection of the lumen and outer wall boundaries in MR vessel wall studies of the common carotid artery. A new segmentation method was developed using a three-dimensional (3D) deformable vessel model requiring only a single user interaction, by combining 3D MR angiography (MRA) and 2D vessel wall images. This vessel model is a 3D cylindrical Non-Uniform Rational B-Spline (NURBS) surface which can be deformed to fit the underlying image data. Image data of 45 subjects was used to validate the method by comparing manual and automatic segmentations. Vessel wall thickness and volume measurements obtained by both methods were compared. Substantial agreement was observed between manual and automatic segmentation; over 85% of the vessel wall contours were segmented successfully. The intraclass correlation was 0.690 for the vessel wall thickness and 0.793 for the vessel wall volume. Compared with manual image analysis, the automated method demonstrated improved interobserver agreement and inter-scan reproducibility. Additionally, the proposed automated image analysis approach was substantially faster. This new automated method can reduce analysis time and enhance reproducibility of the quantification of vessel wall dimensions in clinical studies. Copyright © 2011 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Eid, Sameh; Saleh, Noureldin; Zalewski, Adam; Vedani, Angelo
2014-12-01
Carbohydrates play a key role in a variety of physiological and pathological processes and, hence, represent a rich source for the development of novel therapeutic agents. Being able to predict binding mode and binding affinity is an essential, yet lacking, aspect of the structure-based design of carbohydrate-based ligands. We assembled a diverse data set comprising 273 carbohydrate-protein crystal structures with known binding affinity and evaluated the prediction accuracy of a large collection of well-established scoring and free-energy functions, as well as combinations thereof. Unfortunately, the tested functions were not capable of reproducing binding affinities in the studied complexes. To simplify the complex free-energy surface of carbohydrate-protein systems, we classified the studied proteins according to the topology and solvent exposure of the carbohydrate-binding site into five distinct categories. A free-energy model based on the proposed classification scheme reproduced binding affinities in the carbohydrate data set with an r² of 0.71 and a root-mean-square error of 1.25 kcal/mol (N = 236). The improvement in model performance underlines the significance of the differences in the local micro-environments of carbohydrate-binding sites and demonstrates the usefulness of calibrating free-energy functions individually according to binding-site topology and solvent exposure.
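The two error metrics quoted above (r² and RMSE against measured affinities) are straightforward to compute for any predicted-vs-measured set; a minimal sketch follows, in which the affinity values are invented for illustration and are not the paper's data set:

```python
import math

def rmse(pred, obs):
    """Root-mean-square error between predicted and measured affinities."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

def r_squared(pred, obs):
    """Squared Pearson correlation between predictions and measurements."""
    n = len(obs)
    mp, mo = sum(pred) / n, sum(obs) / n
    cov = sum((p - mp) * (o - mo) for p, o in zip(pred, obs))
    vp = sum((p - mp) ** 2 for p in pred)
    vo = sum((o - mo) ** 2 for o in obs)
    return cov ** 2 / (vp * vo)

# Hypothetical binding free energies in kcal/mol.
obs = [-5.2, -7.1, -4.3, -8.0, -6.5]
pred = [-4.8, -6.2, -4.9, -7.4, -6.9]
print(round(r_squared(pred, obs), 2), round(rmse(pred, obs), 2))
```

Note that r² measures correlation only, while the RMSE in kcal/mol is also sensitive to systematic offsets, which is why both figures are reported.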
Reproducibility, Controllability, and Optimization of LENR Experiments
NASA Astrophysics Data System (ADS)
Nagel, David J.
2006-02-01
Low-energy nuclear reaction (LENR) measurements are significantly and increasingly reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments.
Reproducibility of a four-point clinical severity score for glabellar frown lines.
Honeck, P; Weiss, C; Sterry, W; Rzany, B
2003-08-01
Focal injections of botulinum toxin A are used successfully for the treatment of hyperkinetic facial wrinkles. Efficacy can be measured by several methods. However, so far none has been investigated for its reproducibility. Objectives: To investigate the reproducibility of a clinical 0-3 score for glabellar frown lines. In the first part of the study, a standardized photographic documentation of glabellar frown lines was produced. Based on the results of this phase, a consensus atlas of glabellar frown lines was developed and participants were trained using this atlas. In the main study, 50 standardized photographs were shown on two consecutive days to 28 dermatologists. The reproducibility of the score was investigated by conventional kappa statistics. In the main study, we found an unweighted kappa according to Fleiss of 0.62 for interobserver reproducibility. Intraobserver reproducibility showed an unweighted kappa according to Cohen of between 0.57 and 0.91 for each observer, and a weighted kappa according to Cicchetti and Allison of between 0.68 and 0.94. The clinical 0-3 score for glabellar frown lines shows good inter- and intraobserver reproducibility.
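The unweighted kappa reported here for intraobserver agreement follows the standard Cohen formulation (observed agreement corrected for chance agreement from the raters' marginals); a minimal sketch with invented 0-3 scores, not the study's photographs:

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa for two raters scoring the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    categories = sorted(set(rater_a) | set(rater_b))
    # Observed agreement: fraction of items where the raters match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement, from each rater's marginal category frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in categories) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Example: the same 10 photographs scored 0-3 on two consecutive days.
day1 = [0, 1, 1, 2, 3, 2, 1, 0, 3, 2]
day2 = [0, 1, 2, 2, 3, 2, 1, 1, 3, 2]
print(round(cohen_kappa(day1, day2), 3))
```

Values in the 0.61-0.80 and 0.81-1.00 ranges, as found in the study, correspond to "substantial" and "almost perfect" agreement on the common Landis-Koch scale.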
caCORRECT2: Improving the accuracy and reliability of microarray data in the presence of artifacts
2011-01-01
Background: In previous work, we reported the development of caCORRECT, a novel microarray quality control system built to identify and correct spatial artifacts commonly found on Affymetrix arrays. We have made recent improvements to caCORRECT, including the development of a model-based data-replacement strategy and integration with typical microarray workflows via caCORRECT's web portal and caBIG grid services. In this report, we demonstrate that caCORRECT improves the reproducibility and reliability of experimental results across several common Affymetrix microarray platforms. caCORRECT represents an advance over state-of-the-art quality control methods such as Harshlighting, and acts to improve gene expression calculation techniques such as PLIER, RMA and MAS5.0, because it incorporates spatial information into outlier detection as well as outlier information into probe normalization. The ability of caCORRECT to recover accurate gene expressions from low quality probe intensity data is assessed using a combination of real and synthetic artifacts with PCR follow-up confirmation and the affycomp spike-in data. The caCORRECT tool can be accessed at the website: http://cacorrect.bme.gatech.edu. Results: We demonstrate that (1) caCORRECT's artifact-aware normalization avoids the undesirable global data warping that happens when any damaged chips are processed without caCORRECT; (2) when used upstream of RMA, PLIER, or MAS5.0, the data imputation of caCORRECT generally improves the accuracy of microarray gene expression in the presence of artifacts more than using Harshlighting or not using any quality control; (3) biomarkers selected from artifactual microarray data which have undergone the quality control procedures of caCORRECT are more likely to be reliable, as shown by both spike-in and PCR validation experiments.
Finally, we present a case study of the use of caCORRECT to reliably identify biomarkers for renal cell carcinoma, yielding two diagnostic biomarkers with potential clinical utility, PRKAB1 and NNMT. Conclusions: caCORRECT is shown to improve the accuracy of gene expression, and the reproducibility of experimental results in clinical application. This study suggests that caCORRECT will be useful to clean up possible artifacts in new as well as archived microarray data. PMID:21957981
NASA Astrophysics Data System (ADS)
Madhulatha, A.; Rajeevan, M.; Bhowmik, S. K. Roy; Das, A. K.
2018-01-01
The primary goal of the present study is to investigate the impact of assimilation of conventional and satellite radiance observations in simulating the mesoscale convective system (MCS) formed over southeast India. An assimilation methodology based on Weather Research and Forecasting model three-dimensional variational data assimilation is considered. A few numerical experiments are carried out to examine the individual and combined impact of conventional and non-conventional (satellite radiance) observations. After the successful inclusion of additional observations, strong analysis increments of temperature and moisture fields are noticed and contribute to a significant improvement in the model's initial fields. The resulting model simulations are able to successfully reproduce the prominent synoptic features responsible for the initiation of the MCS. Among all the experiments, the final experiment, in which both conventional and satellite radiance observations are assimilated, shows a considerable impact on the prediction of the MCS. The location, genesis, intensity, propagation and development of rain bands associated with the MCS are simulated reasonably well. The biases of simulated temperature, moisture and wind fields at the surface and different pressure levels are reduced. The thermodynamic, dynamic and vertical structure of convective cells associated with the passage of the MCS are well captured. The spatial distribution of rainfall is fairly reproduced and comparable to TRMM observations. It is demonstrated that incorporation of conventional and satellite radiance observations improved the local and synoptic representation of temperature and moisture fields from the surface to different levels of the atmosphere. This study highlights the importance of assimilating conventional observations and satellite radiances in improving the model's initial conditions and the simulation of the MCS.
Understanding reproducibility of human IVF traits to predict next IVF cycle outcome.
Wu, Bin; Shi, Juanzi; Zhao, Wanqiu; Lu, Suzhen; Silva, Marta; Gelety, Timothy J
2014-10-01
Evaluating the failed IVF cycle often provides useful prognostic information. Before undergoing another attempt, patients experiencing an unsuccessful IVF cycle frequently request information about the probability of future success. Here, we introduced the concept of reproducibility and formulae to predict the next IVF cycle outcome. The experimental design was based on the retrospective review of IVF cycle data from 2006 to 2013 in two different IVF centers and statistical analysis. The reproducibility coefficients (r) of IVF traits including number of oocytes retrieved, oocyte maturity, fertilization, embryo quality and pregnancy were estimated using the intraclass correlation coefficient between the repeated IVF cycle measurements for the same patient by variance component analysis. The formulae were designed to predict the next IVF cycle outcome. The number of oocytes retrieved from patients and their fertilization rate had the highest reproducibility coefficients (r = 0.81 ~ 0.84), which indicated a very close correlation between the first retrieval cycle and subsequent IVF cycles. Oocyte maturity and number of top quality embryos had intermediate reproducibility (r = 0.38 ~ 0.76), and pregnancy rate had a relatively lower reproducibility (r = 0.23 ~ 0.27). Based on these parameters, the next outcome for these IVF traits might be accurately predicted by the designed formulae. The introduction of the concept of reproducibility to our human IVF program allows us to predict future IVF cycle outcomes. The traits of oocyte numbers retrieved, oocyte maturity, fertilization, and top quality embryos had higher or intermediate reproducibility, which provides a basis for accurate prediction of future IVF outcomes. Based on this prediction, physicians may counsel their patients or change patients' stimulation plans, and laboratory embryologists may improve their IVF techniques accordingly.
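A reproducibility coefficient of the kind described, estimated from between- and within-patient variance components, is the one-way random-effects intraclass correlation. The sketch below is illustrative only: the `icc_oneway` helper and the cycle numbers are assumptions, not the authors' code or data.

```python
def icc_oneway(groups):
    """ICC(1,1): one-way random-effects intraclass correlation.
    `groups` is a list of per-patient lists of repeated measurements,
    each patient having the same number of repeats k."""
    n = len(groups)
    k = len(groups[0])
    grand = sum(sum(g) for g in groups) / (n * k)
    means = [sum(g) / k for g in groups]
    # Between-patient and within-patient mean squares (variance components).
    ms_between = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    ms_within = sum((x - m) ** 2
                    for g, m in zip(groups, means) for x in g) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Invented example: oocytes retrieved in two consecutive cycles, five patients.
cycles = [[12, 11], [5, 6], [20, 18], [9, 10], [15, 14]]
print(round(icc_oneway(cycles), 2))
```

With r close to 1, as reported for oocyte yield and fertilization rate, a patient's first cycle is highly predictive of the next.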
Accessing the reproducibility and specificity of pepsin and other aspartic proteases.
Ahn, Joomi; Cao, Min-Jie; Yu, Ying Qing; Engen, John R
2013-06-01
The aspartic protease pepsin is less specific than other endoproteinases. Because aspartic proteases like pepsin are active at low pH, they are utilized in hydrogen deuterium exchange mass spectrometry (HDX MS) experiments for digestion under hydrogen exchange quench conditions. We investigated the reproducibility, both qualitatively and quantitatively, of online and offline pepsin digestion to understand the complement of reproducible pepsin fragments that can be expected during a typical pepsin digestion. The collection of reproducible peptides was identified from >30 replicate digestions of the same protein, and it was found that the number of reproducible peptides produced during pepsin digestion becomes constant above 5-6 replicate digestions. We also investigated a new aspartic protease from the stomach of the rice field eel (Monopterus albus Zuiew) and compared digestion efficiency and specificity to porcine pepsin and aspergillopepsin. Unique cleavage specificity was found for rice field eel pepsin at arginine, asparagine, and glycine. Different peptides produced by the various proteases can enhance protein sequence coverage and improve the spatial resolution of HDX MS data. This article is part of a Special Issue entitled: Mass spectrometry in structural biology. Copyright © 2012 Elsevier B.V. All rights reserved.
An empirical analysis of journal policy effectiveness for computational reproducibility.
Stodden, Victoria; Seiler, Jennifer; Ma, Zhaokun
2018-03-13
A key component of scientific communication is sufficient information for other researchers in the field to reproduce published findings. For computational and data-enabled research, this has often been interpreted to mean making available the raw data from which results were generated, the computer code that generated the findings, and any additional information needed such as workflows and input parameters. Many journals are revising author guidelines to include data and code availability. This work evaluates the effectiveness of journal policy that requires the data and code necessary for reproducibility be made available postpublication by the authors upon request. We assess the effectiveness of such a policy by (i) requesting data and code from authors and (ii) attempting replication of the published findings. We chose a random sample of 204 scientific papers published in the journal Science after the implementation of their policy in February 2011. We found that we were able to obtain artifacts from 44% of our sample and were able to reproduce the findings for 26%. We find this policy (author remission of data and code postpublication upon request) an improvement over no policy, but currently insufficient for reproducibility.
Undefined cellulase formulations hinder scientific reproducibility
Himmel, Michael E.; Abbas, Charles A.; Baker, John O.; ...
2017-11-28
In the shadow of a burgeoning biomass-to-fuels industry, biological conversion of lignocellulose to fermentable sugars in a cost-effective manner is key to the success of second-generation and advanced biofuel production. For the effective comparison of one cellulase preparation to another, cellulase assays are typically carried out with one or more engineered cellulase formulations or natural exoproteomes of known performance serving as positive controls. When these formulations have unknown composition, as is the case with several widely used commercial products, it becomes impossible to compare or reproduce work done today to work done in the future, where, for example, such preparations may not be available. Therefore, experimental reproducibility, a critical tenet of science publishing, is endangered by the continued use of these undisclosed products. We propose the introduction of standard procedures and materials to produce specific and reproducible cellulase formulations. These formulations are to serve as yardsticks to measure improvements and performance of new cellulase formulations.
Nucleosynthesis in Hot Bubbles of SNe-Origin of EMP Stars: HNe or SNe?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Izutani, Natsuko; Umeda, Hideyuki; Yoshida, Takashi
2010-08-12
The observational trends of extremely metal-poor (EMP) stars reflect SN nucleosynthesis of Population III, or almost metal-free, stars. The observations of EMP stars can be reproduced by HNe, not by normal SNe. However, if the innermost neutron-rich or proton-rich matter is ejected, the abundance patterns of the ejected matter are changed, and there is a possibility that normal SNe can also reproduce the observations of EMP stars. In this paper, we calculate nucleosynthesis with various Y_e and entropy values, taking into account neutrino processes. We investigate whether normal SNe with this innermost matter can reproduce the observations of EMP stars. We find that neutron-rich (Y_e = 0.45-0.50) and proton-rich (Y_e = 0.51-0.55) matter can improve Zn and Co, but tends to overproduce other Fe-peak elements. On the other hand, HNe can naturally reproduce the observations of EMP stars.
Planar heterojunction perovskite solar cells with superior reproducibility
Jeon, Ye-Jin; Lee, Sehyun; Kang, Rira; Kim, Jueng-Eun; Yeo, Jun-Seok; Lee, Seung-Hoon; Kim, Seok-Soon; Yun, Jin-Mun; Kim, Dong-Yu
2014-01-01
Perovskite solar cells (PeSCs) have been considered one of the competitive next generation power sources. To date, light-to-electric conversion efficiencies have rapidly increased to over 10%, and further improvements are expected. However, the poor device reproducibility of PeSCs ascribed to their inhomogeneously covered film morphology has hindered their practical application. Here, we demonstrate high-performance PeSCs with superior reproducibility by introducing small amounts of N-cyclohexyl-2-pyrrolidone (CHP) as a morphology controller into N,N-dimethylformamide (DMF). As a result, highly homogeneous film morphology, similar to that achieved by vacuum-deposition methods, as well as a high PCE of 10% and an extremely small performance deviation within 0.14% were achieved. This study represents a method for realizing efficient and reproducible planar heterojunction (PHJ) PeSCs through morphology control, taking a major step forward in the low-cost and rapid production of PeSCs by solving one of the biggest problems of PHJ perovskite photovoltaic technology through a facile method. PMID:25377945
Validation of the French version of the Hospital Survey on Patient Safety Culture questionnaire.
Occelli, P; Quenon, J-L; Kret, M; Domecq, S; Delaperche, F; Claverie, O; Castets-Fontaine, B; Amalberti, R; Auroy, Y; Parneix, P; Michel, P
2013-09-01
To assess the psychometric properties of the French version of the Hospital Survey on Patient Safety Culture questionnaire (HSOPSC) and study the hierarchical structure of the measured dimensions. Cross-sectional survey of the safety culture. 18 acute care units of seven hospitals in South-western France. Full- and part-time healthcare providers who worked in the units. None. Item responses measured with 5-point agreement or frequency scales. Data analyses: A principal component analysis was used to identify the emerging components. Two structural equation modeling methods [LInear Structural RELations (LISREL) and Partial Least Squares (PLS)] were used to verify the model and to study the relative importance of the dimensions. Internal consistency of the retained dimensions was studied. A test-retest was performed to assess reproducibility of the items. Overall response rate was 77% (n = 401). A structure of 40 items grouped in 10 dimensions was proposed. The LISREL approach showed acceptable data fit of the proposed structure. The PLS approach indicated that three dimensions had the most impact on the safety culture: 'Supervisor/manager expectations & actions promoting safety', 'Organizational learning-continuous improvement' and 'Overall perceptions of safety'. Internal consistency was above 0.70 for six dimensions. Reproducibility was considered good for four items. The French HSOPSC questionnaire showed acceptable psychometric properties. Classification of the dimensions should guide the future development of action plans for improving safety culture.
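Internal consistency of the kind reported (above 0.70 for six dimensions) is conventionally measured with Cronbach's alpha, the ratio of item variances to total-score variance scaled by the number of items. A self-contained sketch with invented 5-point item scores, not the survey data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for one dimension; `items` is a list of per-item
    score lists, all over the same respondents."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        # Sample variance (n - 1 denominator), used consistently throughout.
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Invented 5-point agreement scores: 3 items, 6 respondents.
items = [[4, 5, 3, 4, 2, 5],
         [4, 4, 3, 5, 2, 4],
         [5, 4, 2, 4, 3, 5]]
print(round(cronbach_alpha(items), 2))
```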
Functional connectivity density mapping: comparing multiband and conventional EPI protocols.
Cohen, Alexander D; Tomasi, Dardo; Shokri-Kojori, Ehsan; Nencka, Andrew S; Wang, Yang
2018-06-01
Functional connectivity density mapping (FCDM) is a newly developed data-driven technique that quantifies the number of local and global functional connections for each voxel in the brain. In this study, we evaluated the reproducibility, sensitivity, and specificity of both local functional connectivity density (lFCD) and global functional connectivity density (gFCD). We compared these metrics using the Human Connectome Project (HCP)-compatible high-resolution (2 mm isotropic, TR = 0.8 s) multiband (MB) and more typical lower-resolution (3.5 mm isotropic, TR = 2.0 s) single-band (SB) resting-state functional MRI (rs-fMRI) acquisitions. Furthermore, in order to be more clinically feasible, only rs-fMRI scans that lasted seven minutes were tested. Subjects were scanned twice within a two-week span. We found that sensitivity and specificity increased, and reproducibility either increased or did not change, for the MB compared to the SB acquisitions. The MB scans also showed improved gray matter/white matter contrast compared to the SB scans. The lFCD and gFCD patterns were similar across MB and SB scans and confined predominantly to gray matter. We also observed a strong spatial correlation of FCD between MB and SB scans, indicating the two acquisitions provide similar information. These findings indicate that high-resolution MB acquisitions improve the quality of FCD data, and that a seven-minute rs-fMRI scan can provide robust FCD measurements.
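At its core, global FCD is a per-voxel count of suprathreshold correlations between time series. A toy sketch on synthetic data follows; the 0.6 threshold, the data, and the omission of preprocessing and of the local/global split are simplifications, not the study's pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy "rs-fMRI" data: 120 timepoints for 6 voxels.
ts = rng.standard_normal((120, 6))
ts[:, 1] += 1.5 * ts[:, 0]          # couple voxels 0 and 1 strongly

r = np.corrcoef(ts, rowvar=False)   # 6 x 6 voxel-by-voxel correlation matrix
np.fill_diagonal(r, 0.0)            # ignore self-connections
gfcd = (r > 0.6).sum(axis=1)        # connections above threshold, per voxel
print(gfcd)
```

Only the coupled voxel pair exceeds the threshold here; in real data the same counting is done over every gray-matter voxel, with lFCD restricted to spatially contiguous neighbors.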
Selective PEGylation of Parylene-C/SiO2 Substrates for Improved Astrocyte Cell Patterning.
Raos, B J; Doyle, C S; Simpson, M C; Graham, E S; Unsworth, C P
2018-02-09
Controlling the spatial distribution of glia and neurons in in vitro culture offers the opportunity to study how cellular interactions contribute to large-scale network behaviour. A recently developed approach to cell patterning uses differential adsorption of animal-serum protein on parylene-C and SiO2 surfaces to enable patterning of neurons and glia. Serum, however, is typically poorly defined and generates reproducibility challenges. Alternative activation methods are highly desirable to enable patterning without relying on animal serum. We take advantage of the innate contrasting surface chemistries of parylene-C and SiO2 to enable selective bonding of polyethylene glycol to SiO2 surfaces, i.e. PEGylation, rendering them almost completely repulsive to cell adhesion. As the reagents used in the PEGylation protocol are chemically defined, the reproducibility and batch-to-batch variability complications associated with the use of animal serum are avoided. We report that PEGylated parylene-C/SiO2 substrates achieve a contrast in astrocyte density of 65:1, whereas the standard serum-immersion protocol results in a contrast of 5.6:1. Furthermore, single-cell isolation was significantly improved on PEGylated substrates when astrocytes were grown on close-proximity parylene-C nodes, whereas isolation was limited on serum-activated substrates due to tolerance for cell adhesion on serum-adsorbed SiO2 surfaces.
Fu, Wei; Zhu, Pengyu; Wang, Chenguang; Huang, Kunlun; Du, Zhixin; Tian, Wenying; Wang, Qin; Wang, Huiyu; Xu, Wentao; Zhu, Shuifang
2015-01-01
Digital PCR has developed rapidly since it was first reported in the 1990s. It was recently reported that an improved method facilitated the detection of genetically modified organisms (GMOs). However, that method requires sample pretreatment, which can introduce inaccuracy into the results. In our study, we explored a pretreatment-free digital PCR method for GMO screening. We chose the CaMV35s promoter and the NOS terminator as the templates in our assay. To determine the specificity of our method, 9 GMO events were collected: MON810, MON863, TC1507, MIR604, MIR162, GA21, T25, NK603 and Bt176. Moreover, the sensitivity and the intra-laboratory and inter-laboratory reproducibility of our detection method were assessed. The results showed that the limit of detection of our method was 0.1%, which is lower than the EU labeling threshold. Specificity and stability were consistent across the 9 events. Both intra-laboratory and inter-laboratory reproducibility were good. Finally, the perfect fit for the detection of eight double-blind samples indicated the good practicability of our method. In conclusion, the method in our study allows more sensitive, specific and stable screening of the GMO content of internationally traded products. PMID:26239916
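The abstract does not give the quantification details, but digital PCR in general converts the fraction of negative partitions into a copy-number estimate through a Poisson correction, since a positive partition may hold more than one template copy. A brief sketch of that standard calculation, with hypothetical partition counts:

```python
import math

def dpcr_copies(total_partitions, positive_partitions):
    """Poisson-corrected mean copies per partition (lambda) and total copies.

    A partition is negative with probability exp(-lambda), so
    lambda = -ln(negatives / total).
    """
    negatives = total_partitions - positive_partitions
    lam = -math.log(negatives / total_partitions)
    return lam, lam * total_partitions

# Hypothetical chip: 10,000 partitions, 952 positive for the CaMV35s target
lam, copies = dpcr_copies(10_000, 952)
print(round(lam, 4), round(copies))
```

A GM percentage would then be expressed as the ratio of the target lambda to that of a taxon-specific reference gene measured on the same sample, which is how a screening level such as the 0.1% quoted above can be reported.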
Optical Verification Laboratory Demonstration System for High Security Identification Cards
NASA Technical Reports Server (NTRS)
Javidi, Bahram
1997-01-01
Document fraud, including unauthorized duplication of identification cards and credit cards, is a serious problem facing the government, banks, businesses, and consumers. In addition, counterfeit products such as computer chips and compact discs are arriving on our shores in great numbers. With the rapid advances in computers, CCD technology, image processing hardware and software, printers, scanners, and copiers, it is becoming increasingly easy to reproduce pictures, logos, symbols, paper currency, or patterns. These problems have stimulated an interest in research, development and publications in security technology. Some ID cards, credit cards and passports currently use holograms as a security measure to thwart copying. The holograms are inspected by the human eye. In theory, the hologram cannot be reproduced by an unauthorized person using commercially available optical components; in practice, however, technology has advanced to the point where the holographic image can be acquired from a credit card (photographed or captured by a CCD camera) and a new hologram synthesized using commercially available optical components or hologram-producing equipment. Therefore, a pattern that can be read by a conventional light source and a CCD camera can be reproduced. An optical security and anti-copying device that provides significant security improvements over existing security technology was demonstrated. The system can be applied for security verification of credit cards, passports, and other IDs so that they cannot easily be reproduced. We have used a new scheme of complex phase/amplitude patterns that cannot be seen and cannot be copied by an intensity-sensitive detector such as a CCD camera. A random phase mask is bonded to a primary identification pattern, which could also be phase encoded. The pattern could be a fingerprint, a picture of a face, or a signature.
The proposed optical processing device is designed to identify both the random phase mask and the primary pattern [1-3]. We have demonstrated experimentally an optical processor for security verification of objects, products, and persons. This demonstration is very important to encourage industries to consider the proposed system for research and development.
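The core idea, that a phase-only mask carries no intensity information for a CCD to copy yet is verifiable by a matched filter, can be illustrated numerically. A toy 1-D sketch (the actual system uses 2-D optical correlators; the mask length and match threshold here are arbitrary choices of mine):

```python
import cmath
import random

def random_phase_mask(n, seed):
    """Phase-only code: every element has unit magnitude."""
    rng = random.Random(seed)
    return [cmath.exp(2j * cmath.pi * rng.random()) for _ in range(n)]

def verify(pattern, stored_mask):
    """Normalized matched-filter score: ~1 for the authentic mask, small otherwise."""
    n = len(pattern)
    return abs(sum(p * m.conjugate() for p, m in zip(pattern, stored_mask))) / n

N = 256
authentic = random_phase_mask(N, seed=1)
counterfeit = random_phase_mask(N, seed=2)

# An intensity (CCD) detector sees only |amplitude|^2 = 1 everywhere,
# so authentic and counterfeit masks look identical to it.
assert all(abs(abs(z) - 1.0) < 1e-12 for z in authentic + counterfeit)

print(verify(authentic, authentic))    # ~1.0
print(verify(counterfeit, authentic))  # ~1/sqrt(N), i.e. small
```

The wrong-mask score shrinks as the mask grows (a random-walk sum of N unit phasors has magnitude of order sqrt(N)), which is why a longer phase code gives a sharper accept/reject margin.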
Modeling charge transport in organic photovoltaic materials.
Nelson, Jenny; Kwiatkowski, Joe J; Kirkpatrick, James; Frost, Jarvist M
2009-11-17
The performance of an organic photovoltaic cell depends critically on the mobility of charge carriers within the constituent molecular semiconductor materials. However, a complex combination of phenomena spanning a range of length and time scales controls charge transport in disordered organic semiconductors. As a result, it is difficult to rationalize charge transport properties in terms of material parameters. Until now, efforts to improve charge mobilities in molecular semiconductors have proceeded largely by trial and error rather than through systematic design. However, recent developments have enabled the first predictive simulation studies of charge transport in disordered organic semiconductors. This Account describes a set of computational methods that combines molecular modeling of molecular packing, quantum chemical calculations of charge transfer rates, and Monte Carlo simulations of charge transport. Using case studies, we show how this combination of methods can reproduce experimental mobilities with few or no fitting parameters. Although currently applied to material systems of high symmetry or well-defined structure, further developments of this approach could address more complex systems such as anisotropic or multicomponent solids and conjugated polymers. Even with an approximate treatment of packing disorder, these computational methods simulate experimental mobilities within an order of magnitude at high electric fields. We can both reproduce the relative values of electron and hole mobility in a conjugated small molecule and rationalize those values based on the symmetry of frontier orbitals. Using fully atomistic molecular dynamics simulations of molecular packing, we can quantitatively replicate vertical charge transport along stacks of discotic liquid crystals which vary only in the structure of their side chains.
We can reproduce the trends in mobility with molecular weight for self-organizing polymers using a cheap, coarse-grained structural simulation method. Finally, we quantitatively reproduce the field-effect mobility in disordered C60 films. On the basis of these results, we conclude that all of the necessary building blocks are in place for the predictive simulation of charge transport in macromolecular electronic materials and that such methods can be used as a tool toward the future rational design of functional organic electronic materials.
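The charge-transfer-rate step in such pipelines is commonly based on Marcus theory. The Account does not fix the exact rate expression used, so the following is a generic sketch with illustrative parameters (a transfer integral J of 10 meV, reorganization energy of 0.2 eV, room temperature):

```python
import math

HBAR = 6.582119569e-16  # reduced Planck constant, eV*s
KB = 8.617333e-5        # Boltzmann constant, eV/K

def marcus_rate(j, lam, dg=0.0, temp=300.0):
    """Marcus charge-transfer rate (1/s) between two molecular sites.

    j: electronic transfer integral (eV), lam: reorganization energy (eV),
    dg: free-energy difference between the sites (eV).
    """
    kt = KB * temp
    prefactor = (2 * math.pi / HBAR) * j ** 2 / math.sqrt(4 * math.pi * lam * kt)
    return prefactor * math.exp(-((dg + lam) ** 2) / (4 * lam * kt))

k = marcus_rate(j=0.010, lam=0.2)
print(f"{k:.3e} 1/s")  # order 1e11-1e12 for these parameters
```

In a kinetic Monte Carlo transport simulation, rates like these weight hops between neighbouring molecules, and the mobility follows from the simulated drift velocity; because the rate scales as J squared, packing disorder (which modulates J) dominates the resulting mobility.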
Arulandhu, Alfred J.; Staats, Martijn; Hagelaar, Rico; Voorhuijzen, Marleen M.; Prins, Theo W.; Scholtens, Ingrid; Costessi, Adalberto; Duijsings, Danny; Rechenmann, François; Gaspar, Frédéric B.; Barreto Crespo, Maria Teresa; Holst-Jensen, Arne; Birck, Matthew; Burns, Malcolm; Haynes, Edward; Hochegger, Rupert; Klingl, Alexander; Lundberg, Lisa; Natale, Chiara; Niekamp, Hauke; Perri, Elena; Barbante, Alessandra; Rosec, Jean-Philippe; Seyfarth, Ralf; Sovová, Tereza; Van Moorleghem, Christoff; van Ruth, Saskia; Peelen, Tamara
2017-01-01
DNA metabarcoding provides great potential for species identification in complex samples such as food supplements and traditional medicines. Such a method would aid Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES) enforcement officers to combat wildlife crime by preventing illegal trade of endangered plant and animal species. The objective of this research was to develop a multi-locus DNA metabarcoding method for forensic wildlife species identification and to evaluate the applicability and reproducibility of this approach across different laboratories. A DNA metabarcoding method was developed that makes use of 12 DNA barcode markers that have demonstrated universal applicability across a wide range of plant and animal taxa and that facilitate the identification of species in samples containing degraded DNA. The DNA metabarcoding method was developed based on Illumina MiSeq amplicon sequencing of well-defined experimental mixtures, for which a bioinformatics pipeline with user-friendly web-interface was developed. The performance of the DNA metabarcoding method was assessed in an international validation trial by 16 laboratories, in which the method was found to be highly reproducible and sensitive enough to identify species present in a mixture at 1% dry weight content. The advanced multi-locus DNA metabarcoding method assessed in this study provides reliable and detailed data on the composition of complex food products, including information on the presence of CITES-listed species. The method can provide improved resolution for species identification, while verifying species with multiple DNA barcodes contributes to an enhanced quality assurance. PMID:29020743
Improvement of psoriatic arthritis in a patient treated with bromocriptine for hyperprolactinemia.
Buskila, D; Sukenik, S; Holcberg, G; Horowitz, J
1991-04-01
We describe a woman with psoriatic arthritis who experienced a remarkable improvement of her skin and joint disease while taking only bromocriptine. She was treated with bromocriptine for primary infertility due to hyperprolactinemia. The improvement in joint symptoms appeared to parallel that of the skin. More studies are required to determine the reproducibility of this observation.
2012-06-13
…plethysmography. Overall, the presented results show that the animal aerosol system was stable and highly reproducible between different studies and over… develop and deliver low doses of B. anthracis spores via inhalation in a reproducible manner. The pilot feasibility study (see Table 1 for results) enabled… results presented in Figure 3 show that exposures produced by the aerosol system were stable and reproducible from day to day. In all testing, the…
Mathematical study on robust tissue pattern formation in growing epididymal tubule.
Hirashima, Tsuyoshi
2016-10-21
Tissue pattern formation during development is a reproducible morphogenetic process organized by a series of kinetic cellular activities, leading to the building of functional and stable organs. Recent studies focusing on mechanical aspects have revealed physical mechanisms by which cellular activities contribute to the formation of reproducible tissue patterns; however, our understanding of what factors achieve the reproducibility of such patterning, and how, is far from complete. Here, I focus on tube pattern formation during murine epididymal development and show, using a mathematical model based on experimental data, that two factors influencing the physical design of the patterning, the proliferative zone within the tubule and the viscosity of the tissues surrounding the tubule, control the reproducibility of the epididymal tubule pattern. Extensive numerical simulation of the simple mathematical model revealed that a spatially localized proliferative zone within the tubule, as observed in experiments, results in a more reproducible tubule pattern. Moreover, I found that the viscosity of the tissues surrounding the tubule imposes a trade-off between pattern reproducibility and the spatial accuracy of the region where the tubule pattern forms. This indicates the existence of an optimum in the material properties of tissues for robust patterning of the epididymal tubule. The results obtained by numerical analysis based on experimental observations provide general insight into how physical design realizes robust tissue pattern formation. Copyright © 2016 Elsevier Ltd. All rights reserved.
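The qualitative effect reported here, that a localized proliferative zone yields a more reproducible pattern, can be caricatured with a toy growth model. This is an illustration I am adding, not the paper's model: new segments are added either only at the tip or at uniformly random positions along the tubule, and we track how variable a marked segment's final position is across independently seeded runs.

```python
import random
from statistics import pvariance

def grow(n_steps, localized, seed, n_init=5, marked=2):
    """Return the final index of the marked segment after growth.

    localized=True: new segments are added only at the tip, so the
    existing segment order is preserved. localized=False: the insertion
    position is uniformly random along the tubule.
    """
    rng = random.Random(seed)
    tubule = list(range(n_init))          # segment ids along the tubule
    for step in range(n_steps):
        if localized:
            tubule.append(n_init + step)  # proliferative zone at the tip
        else:
            tubule.insert(rng.randrange(len(tubule) + 1), n_init + step)
    return tubule.index(marked)

runs = range(30)
var_tip = pvariance([grow(20, localized=True, seed=s) for s in runs])
var_uniform = pvariance([grow(20, localized=False, seed=s) for s in runs])
print(var_tip, var_uniform)  # 0.0 vs a positive value
```

Run-to-run variance of the marked segment's position is the toy analogue of pattern reproducibility: zero under tip-localized growth, positive under spatially uniform proliferation.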
Aerosol modelling and validation during ESCOMPTE 2001
NASA Astrophysics Data System (ADS)
Cousin, F.; Liousse, C.; Cachier, H.; Bessagnet, B.; Guillaume, B.; Rosset, R.
The ESCOMPTE 2001 programme (Atmospheric Research. 69(3-4) (2004) 241) has resulted in an exhaustive set of dynamical, radiative, gas and aerosol observations (surface and aircraft measurements). A previous paper (Atmospheric Research. (2004) in press) dealt with dynamics and gas-phase chemistry. The present paper is an extension to aerosol formation, transport and evolution. To account for important loadings of primary and secondary aerosols and their transformation processes in the ESCOMPTE domain, the ORganic and Inorganic Spectral Aerosol Module (ORISAM) (Atmospheric Environment. 35 (2001) 4751) was implemented on-line in the air-quality Meso-NH-C model. Additional developments were introduced in ORISAM to improve the comparison between simulations and experimental surface and aircraft field data. This paper discusses this comparison for a simulation performed on one selected day, 24 June 2001, during the Intensive Observation Period IOP2b. Our work relies on BC and OCp emission inventories specifically developed for ESCOMPTE. This study confirms the need for a fine-resolution aerosol inventory with spectral chemical speciation. BC levels are satisfactorily reproduced, thus validating our emission inventory and its processing through Meso-NH-C. However, comparisons for reactive species generally denote an underestimation of concentrations. Organic aerosol levels are rather well simulated, though with a trend to underestimation in the afternoon. Inorganic aerosol species are underestimated for several reasons, some of which have been identified. For sulphates, primary emissions were introduced. Improvement was also obtained for modelled nitrate and ammonium levels after introducing heterogeneous chemistry. However, the absence of terrigenous particles in the model is probably a major cause of the nitrate and ammonium underestimations. Particle numbers and size distributions are well reproduced, but only in the submicrometer range.
Our work points to the need to introduce coarse dust particles to further improve the simulation of PM-10 concentrations, and to more accurate modelling of gas-particle interactions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, J.; Zhang, J. T.; Ping, Q.
2013-09-11
The temperature primary standard over the range from the melting point of gallium to the freezing point of silver at the National Institute of Metrology (NIM), China, was established in the early 1990s. The performance of all fixed-point furnaces has degraded after many years of use, and the standard needs to be updated. Nowadays, satisfactory fixed-point materials are available thanks to the development of modern purification techniques. NIM plans to use a group of three cells for each defining fixed-point temperature. In this way any eventual drift of individual cells can be evidenced by periodic intercomparison, and this will increase the reliability of disseminating the ITS-90 in China. This article describes the recent improvements in the realization of the ITS-90 over the temperature range from the melting point of gallium to the freezing point of silver at NIM. Taking advantage of technological advances in the design and manufacture of furnaces, new three-zone furnaces and open-type fixed points were developed from the freezing point of indium to the freezing point of silver, and a furnace with three-zone semiconductor cooling was designed to realize the melting point of gallium automatically. The reproducibility of the new melting point of gallium and the new open-type freezing points of In, Sn, Zn, Al and Ag is improved, especially the freezing points of Al and Ag, with reproducibilities of 0.2 mK and 0.5 mK respectively. The expanded uncertainties in the realization of these defining fixed-point temperatures are 0.34 mK, 0.44 mK, 0.54 mK, 0.60 mK, 1.30 mK and 1.88 mK respectively.
Improvements in the EQ-10 electrodeless Z-pinch EUV source for metrology applications
NASA Astrophysics Data System (ADS)
Horne, Stephen F.; Gustafson, Deborah; Partlow, Matthew J.; Besen, Matthew M.; Smith, Donald K.; Blackborow, Paul A.
2011-04-01
Now that EUV lithography systems are beginning to ship into the fabs for next-generation chips, it is increasingly critical that EUV infrastructure developments keep pace. Energetiq Technology has been shipping the EQ-10 Electrodeless Z-pinch™ light source since 2005. The source is currently being used for metrology, mask inspection, and resist development. These applications require especially stable performance in both power and source size. Over the last 5 years Energetiq has made many source modifications, including better thermal management and high-pulse-rate operation. Recently we have further increased the system power handling and electrical pulse reproducibility. The impact of these modifications on source performance will be reported.
Smart insole sensors for sports and rehabilitation
NASA Astrophysics Data System (ADS)
Tamm, Tarmo; Pärlin, Karel; Tiimus, Tõnis; Leemets, Kaur; Terasmaa, Tõnis; Must, Indrek
2014-04-01
A light-weight, soft, robust and low-cost sensory system integrated into the inner soles of footwear is being developed that channels information to a mobile device, allowing assessment of the ergonomics of the technique applied and improved performance in several fields of sport, the development of orthopedic footwear, and the monitoring of elevated plantar pressures for several fields of medicine, including early detection of diabetic foot ulceration. The advantages and disadvantages of several types of sensory material were considered in the present work, focusing on signal reproducibility for periodic pressure measurements, response frequency and long-term stability, especially after extended load periods. Promising results were obtained for both capacitive and resistive sensory materials, utilizing virtually the same electronics platform for both types.
Reproducibility measurements of three methods for calculating in vivo MR-based knee kinematics.
Lansdown, Drew A; Zaid, Musa; Pedoia, Valentina; Subburaj, Karupppasamy; Souza, Richard; Benjamin, C; Li, Xiaojuan
2015-08-01
To describe three quantification methods for magnetic resonance imaging (MRI)-based knee kinematic evaluation and to report on the reproducibility of these algorithms. T2-weighted, fast-spin-echo images were obtained of the bilateral knees in six healthy volunteers. Scans were repeated for each knee after repositioning to evaluate protocol reproducibility. Semiautomatic segmentation defined regions of interest for the tibia and femur. The posterior femoral condyles and diaphyseal axes were defined using the previously segmented tibia and femur. All segmentation was performed twice to evaluate segmentation reliability. Anterior tibial translation (ATT) and internal tibial rotation (ITR) were calculated using three methods: a tibial-based registration system, a combined tibiofemoral-based registration method with all-manual segmentation, and a combined tibiofemoral-based registration method with automatic definition of condyles and axes. Intraclass correlation coefficients and standard deviations across multiple measures were determined. Reproducibility of segmentation was excellent (ATT = 0.98; ITR = 0.99) for both combined methods. ATT and ITR measurements were also reproducible across multiple scans in the combined registration measurements with manual (ATT = 0.94; ITR = 0.94) or automatic (ATT = 0.95; ITR = 0.94) condyles and axes. The combined tibiofemoral registration with automatic definition of the posterior femoral condyles and diaphyseal axes allows for improved knee kinematics quantification with excellent in vivo reproducibility. © 2014 Wiley Periodicals, Inc.
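The intraclass correlation coefficients quoted above measure scan-rescan agreement. As a sketch, a two-way ICC for absolute agreement (Shrout and Fleiss ICC(2,1)) can be computed from an n-subjects-by-k-scans table; the measurement values below are made up for illustration:

```python
def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single measure.

    data: list of n rows (subjects), each with k measurements (scans).
    """
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(row[j] for row in data) / n for j in range(k)]
    msr = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msc = n * sum((m - grand) ** 2 for m in col_means) / (k - 1)
    mse = sum(
        (data[i][j] - row_means[i] - col_means[j] + grand) ** 2
        for i in range(n) for j in range(k)
    ) / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical scan-rescan ATT measurements (mm) for six subjects
scans = [[2.1, 2.0], [3.4, 3.5], [1.2, 1.3], [4.0, 3.8], [2.8, 2.9], [3.1, 3.0]]
print(round(icc_2_1(scans), 3))
```

An ICC near 1 means between-subject variation dominates scan-rescan and segmentation noise, which is the sense in which the 0.94-0.99 values in the abstract indicate excellent reproducibility.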
Karamitsos, Theodoros D; Hudsmith, Lucy E; Selvanayagam, Joseph B; Neubauer, Stefan; Francis, Jane M
2007-01-01
Accurate and reproducible measurement of left ventricular (LV) mass and function is a significant strength of Cardiovascular Magnetic Resonance (CMR). The reproducibility and accuracy of these measurements are usually reported between experienced operators. However, an increasing number of inexperienced operators are now training in CMR and are involved in post-processing analysis. The aim of the study was to assess the interobserver variability of manual planimetry of LV contours amongst two experienced and six inexperienced operators before and after a two-month training period. Ten healthy volunteers (5 men, mean age 34 +/- 14 years) comprised the study population. LV volumes, mass, and ejection fraction were manually evaluated using Argus software (Siemens Medical Solutions, Erlangen, Germany) for each subject, once by the two experienced and twice by the six inexperienced operators. The mean values of the experienced operators were considered the reference values. Agreement between operators was evaluated by means of Bland-Altman analysis. Training involved standardized data acquisition, simulated off-line analysis and mentoring. The trainee operators demonstrated improvement in the measurement of all parameters compared with the experienced operators. The mean ejection fraction variability improved from 7.2% before training to 3.7% after training (p = 0.03). The parameter in which the trainees showed the least improvement was LV mass (from 7.7% to 6.7% after training). Basal slice selection and contour definition were the main sources of error. An intensive two-month training period significantly improved the accuracy of LV functional measurements. Adequate training of new CMR operators is of paramount importance in maintaining the accuracy and high reproducibility of CMR in LV function analysis.
Automated acoustic matrix deposition for MALDI sample preparation.
Aerni, Hans-Rudolf; Cornett, Dale S; Caprioli, Richard M
2006-02-01
Novel high-throughput sample preparation strategies for MALDI imaging mass spectrometry (IMS) and profiling are presented. An acoustic reagent multispotter was developed to provide improved reproducibility for depositing matrix onto a sample surface, such as a tissue section. The unique design of the acoustic droplet ejector and its optimization for depositing matrix solution are discussed. Since it does not contain a capillary or nozzle for fluid ejection, clogging of these orifices is avoided. Automated matrix deposition provides better control of the conditions affecting protein extraction and matrix crystallization, with the ability to deposit matrix accurately onto small surface features. For tissue sections, matrix spots of 180-200 µm in diameter were obtained, and a procedure is described for generating coordinate files readable by a mass spectrometer to permit automated profile acquisition. Mass spectral quality and reproducibility were found to be better than those obtained with manual pipet spotting. The instrument can also deposit matrix spots in a dense array pattern so that, after analysis in a mass spectrometer, two-dimensional ion images may be constructed. Example ion images from a mouse brain are presented.
A murine model of targeted infusion for intracranial tumors.
Kim, Minhyung; Barone, Tara A; Fedtsova, Natalia; Gleiberman, Anatoli; Wilfong, Chandler D; Alosi, Julie A; Plunkett, Robert J; Gudkov, Andrei; Skitzki, Joseph J
2016-01-01
Historically, intra-arterial (IA) drug administration for malignant brain tumors, including glioblastoma multiforme (GBM), was performed as an attempt to improve drug delivery. With the advent of percutaneous neurovascular techniques and modern microcatheters, intracranial drug delivery is readily feasible; however, the question remains whether IA administration is safe and more effective than other delivery modalities such as intravenous (IV) or oral administration. Preclinical large-animal models allow comparisons between treatment routes and testing of novel agents, but they are expensive and make it difficult to generate large numbers and rapid results. Accordingly, we developed a murine model of IA drug delivery for GBM that is reproducible, with clear readouts of tumor response and neurotoxicities. Herein, we describe a novel mouse model of IA drug delivery accessing the internal carotid artery to treat ipsilateral implanted GBM tumors that is consistent and reproducible with minimal experience. The intent of establishing this unique platform is to efficiently interrogate targeted anti-tumor agents that may be designed to take advantage of a directed, regional therapy approach for brain tumors.
Mischak, Harald; Vlahou, Antonia; Ioannidis, John P A
2013-04-01
Mass spectrometry platforms have attracted a lot of interest in the last 2 decades as profiling tools for native peptides and proteins with clinical potential. However, limitations associated with reproducibility and analytical robustness, especially pronounced with the initial SELDI systems, hindered the application of such platforms in biomarker qualification and clinical implementation. The scope of this article is to give a short overview on data available on performance and on analytical robustness of the different platforms for peptide profiling. Using the CE-MS platform as a paradigm, data on analytical performance are described including reproducibility (short-term and intermediate repeatability), stability, interference, quantification capabilities (limits of detection), and inter-laboratory variability. We discuss these issues by using as an example our experience with the development of a 273-peptide marker for chronic kidney disease. Finally, we discuss pros and cons and means for improvement and emphasize the need to test in terms of comparative clinical performance and impact, different platforms that pass reasonably well analytical validation tests. Copyright © 2012 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Nichols, B. Nolan; Pohl, Kilian M.
2017-01-01
Accelerating insight into the relation between brain and behavior entails conducting small and large-scale research endeavors that lead to reproducible results. Consensus is emerging between funding agencies, publishers, and the research community that data sharing is a fundamental requirement to ensure all such endeavors foster data reuse and fuel reproducible discoveries. Funding agency and publisher mandates to share data are bolstered by a growing number of data sharing efforts that demonstrate how information technologies can enable meaningful data reuse. Neuroinformatics evaluates scientific needs and develops solutions to facilitate the use of data across the cognitive and neurosciences. For example, electronic data capture and management tools designed to facilitate human neurocognitive research can decrease the setup time of studies, improve quality control, and streamline the process of harmonizing, curating, and sharing data across data repositories. In this article we outline the advantages and disadvantages of adopting software applications that support these features by reviewing the tools available and then presenting two contrasting neuroimaging study scenarios in the context of conducting a cross-sectional and a multisite longitudinal study. PMID:26267019
Facile one-step synthesis of Ag@Fe3O4 core-shell nanospheres for reproducible SERS substrates
NASA Astrophysics Data System (ADS)
Sun, Lijuan; He, Jiang; An, Songsong; Zhang, Junwei; Ren, Dong
2013-08-01
A facile approach has been developed to synthesize Ag@Fe3O4 core-shell nanospheres, in which the Ag nanoparticle core was well wrapped by a permeable Fe3O4 shell. An in situ reduction of AgNO3 and Fe(NO3)3 was the basis of this one-step method with ethylene glycol as the reducing agent. The as-obtained Ag@Fe3O4 nanospheres were a highly efficient surface-enhanced Raman scattering (SERS) substrate; high reproducibility, stability, and reusability were obtained by employing 4-aminothiophenol (4-ATP) and rhodamine 6G (R6G) as the Raman probe molecules. It was revealed that the SERS signals of 4-ATP and R6G on the Ag@Fe3O4 nanospheres were much stronger than those on the pure Ag nanoparticles, demonstrating that the magnetic enrichment procedures can improve SERS detection sensitivity efficiently. A highly efficient and recyclable SERS substrate was produced by the new model system that has potential applications in chemical and biomolecular assays.
Temperature stress and plant sexual reproduction: uncovering the weakest links.
Zinn, Kelly E; Tunc-Ozdemir, Meral; Harper, Jeffrey F
2010-04-01
The reproductive (gametophytic) phase in flowering plants is often highly sensitive to hot or cold temperature stresses, with even a single hot day or cold night sometimes being fatal to reproductive success. This review describes studies of temperature stress on several crop plants, which suggest that pollen development and fertilization may often be the most sensitive reproductive stage. Transcriptome and proteomic studies on several plant species are beginning to identify stress response pathways that function during pollen development. An example is provided here of genotypic differences in the reproductive stress tolerance between two ecotypes of Arabidopsis thaliana Columbia (Col) and Hilversum (Hi-0), when reproducing under conditions of hot days and cold nights. Hi-0 exhibited a more severe reduction in seed set, correlated with a reduction in pollen tube growth potential and tropism defects. Hi-0 thus provides an Arabidopsis model to investigate strategies for improved stress tolerance in pollen. Understanding how different plants cope with stress during reproductive development offers the potential to identify genetic traits that could be manipulated to improve temperature tolerance in selected crop species being cultivated in marginal climates.
Tellurium notebooks-An environment for reproducible dynamical modeling in systems biology.
Medley, J Kyle; Choi, Kiri; König, Matthias; Smith, Lucian; Gu, Stanley; Hellerstein, Joseph; Sealfon, Stuart C; Sauro, Herbert M
2018-06-01
The considerable difficulty encountered in reproducing the results of published dynamical models limits validation, exploration and reuse of this increasingly large biomedical research resource. To address this problem, we have developed Tellurium Notebook, a software system for model authoring, simulation, and teaching that facilitates building reproducible dynamical models and reusing models by 1) providing a notebook environment which allows models, Python code, and narrative to be intermixed, 2) supporting the COMBINE archive format during model development for capturing model information in an exchangeable format and 3) enabling users to easily simulate and edit public COMBINE-compliant models from public repositories to facilitate studying model dynamics, variants and test cases. Tellurium Notebook, a Python-based Jupyter-like environment, is designed to seamlessly inter-operate with these community standards by automating conversion between COMBINE standards formulations and corresponding in-line, human-readable representations. Thus, Tellurium brings to systems biology the strategy used by other literate notebook systems such as Mathematica. These capabilities allow users to edit every aspect of the standards-compliant models and simulations, run the simulations in-line, and re-export to standard formats. We provide several use cases illustrating the advantages of our approach and how it allows development and reuse of models without requiring technical knowledge of standards. Adoption of Tellurium should accelerate model development, reproducibility and reuse.
Pathway engineering to improve ethanol production by thermophilic bacteria
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lynd, L.R.
1998-12-31
Continuation of a research project jointly funded by the NSF and DOE is proposed. The primary project goal is to develop and characterize strains of C. thermocellum and C. thermosaccharolyticum having ethanol selectivity similar to more convenient ethanol-producing organisms. An additional goal is to document the maximum concentration of ethanol that can be produced by thermophiles. These goals build on results from the previous project, including development of most of the genetic tools required for pathway engineering in the target organisms. We also demonstrated that the tolerance of C. thermosaccharolyticum to added ethanol is sufficiently high to allow practical utilization should similar tolerance to produced ethanol be demonstrated, and that inhibition by neutralizing agents may explain the limited concentrations of ethanol produced in studies to date. Task 1 involves optimization of electrotransformation, using either modified conditions or alternative plasmids to improve upon the low but reproducible transformation frequencies we have obtained thus far.
Acquisition plan for Digital Document Storage (DDS) prototype system
NASA Technical Reports Server (NTRS)
1990-01-01
NASA Headquarters maintains a continuing interest in and commitment to exploring the use of new technology to support productivity improvements in meeting service requirements tasked to the NASA Scientific and Technical Information (STI) Facility, and to support cost effective approaches to the development and delivery of enhanced levels of service provided by the STI Facility. The DDS project has been pursued with this interest and commitment in mind. It is believed that DDS will provide improved archival blowback quality and service for ad hoc requests for paper copies of documents archived and serviced centrally at the STI Facility. It will also develop an operating capability to scan, digitize, store, and reproduce paper copies of 5000 NASA technical reports archived annually at the STI Facility and serviced to the user community. Additionally, it will provide NASA Headquarters and field installations with on-demand, remote, electronic retrieval of digitized, bilevel, bit mapped report images along with branched, nonsequential retrieval of report subparts.
NASA Astrophysics Data System (ADS)
Pawcenis, Dominika; Koperska, Monika A.; Milczarek, Jakub M.; Łojewski, Tomasz; Łojewska, Joanna
2014-02-01
A direct goal of this paper was to improve the methods of sample preparation and separation for analyses of fibroin polypeptide with the use of size exclusion chromatography (SEC). The motivation for the study arises from our interest in natural polymers included in historic textile and paper artifacts, and is a logical response to the urgent need for developing rationale-based methods for materials conservation. The first step is to develop a reliable analytical tool which would give insight into fibroin structure and its changes caused by both natural and artificial ageing. To investigate the influence of preparation conditions, two sets of artificially aged samples were prepared (with and without NaCl in the sample solution) and measured by means of SEC with a multi-angle laser light scattering detector. It was shown that dialysis of fibroin dissolved in LiBr solution allows removal of the salt, which otherwise damages chromatographic columns and prevents reproducible analyses. Salt-rich (NaCl) aqueous solutions of fibroin improved the quality of the chromatograms.
Light emitting fabric technologies for photodynamic therapy.
Mordon, Serge; Cochrane, Cédric; Tylcz, Jean Baptiste; Betrouni, Nacim; Mortier, Laurent; Koncar, Vladan
2015-03-01
Photodynamic therapy (PDT) is considered to be a promising method for treating various types of cancer. A homogeneous and reproducible illumination during clinical PDT plays a determinant role in preventing under- or over-treatment. The development of flexible light sources would considerably improve the homogeneity of light delivery. The integration of optical fiber into flexible structures could offer an interesting alternative. This paper aims to describe different methods proposed to develop Side Emitting Optical Fibers (SEOF), and how these SEOF can be integrated in a flexible structure to improve light illumination of the skin during PDT. Four main techniques can be described: (i) light blanket integrating side-glowing optical fibers, (ii) light emitting panel composed of SEOF obtained by micro-perforations of the cladding, (iii) embroidery-based light emitting fabric, and (iv) woven-based light emitting fabric. Woven-based light emitting fabrics give the best performances: higher fluence rate, best homogeneity of light delivery, good flexibility. Copyright © 2014 Elsevier B.V. All rights reserved.
A novel fully-humanised 3D skin equivalent to model early melanoma invasion
Hill, David S; Robinson, Neil D P; Caley, Matthew P; Chen, Mei; O’Toole, Edel A; Armstrong, Jane L; Przyborski, Stefan; Lovat, Penny E
2015-01-01
Metastatic melanoma remains incurable, emphasising the acute need for improved research models to investigate the underlying biological mechanisms mediating tumour invasion and metastasis, and to develop more effective targeted therapies to improve clinical outcome. Available animal models of melanoma do not accurately reflect human disease and current in vitro human skin equivalent models incorporating melanoma cells are not fully representative of the human skin microenvironment. We have developed a robust and reproducible, fully-humanised 3D skin equivalent comprising a stratified, terminally differentiated epidermis and a dermal compartment consisting of fibroblast-generated extracellular matrix. Melanoma cells incorporated into the epidermis were able to invade through the basement membrane and into the dermis, mirroring early tumour invasion in vivo. Comparison of our novel 3D melanoma skin equivalent with melanoma in situ and metastatic melanoma indicates this model accurately recreates features of disease pathology, making it a physiologically representative model of early radial and vertical growth phase melanoma invasion. PMID:26330548
Mapping population-based structural connectomes.
Zhang, Zhengwu; Descoteaux, Maxime; Zhang, Jingwen; Girard, Gabriel; Chamberland, Maxime; Dunson, David; Srivastava, Anuj; Zhu, Hongtu
2018-05-15
Advances in understanding the structural connectomes of human brain require improved approaches for the construction, comparison and integration of high-dimensional whole-brain tractography data from a large number of individuals. This article develops a population-based structural connectome (PSC) mapping framework to address these challenges. PSC simultaneously characterizes a large number of white matter bundles within and across different subjects by registering different subjects' brains based on coarse cortical parcellations, compressing the bundles of each connection, and extracting novel connection weights. A robust tractography algorithm and streamline post-processing techniques, including dilation of gray matter regions, streamline cutting, and outlier streamline removal are applied to improve the robustness of the extracted structural connectomes. The developed PSC framework can be used to reproducibly extract binary networks, weighted networks and streamline-based brain connectomes. We apply the PSC to Human Connectome Project data to illustrate its application in characterizing normal variations and heritability of structural connectomes in healthy subjects. Copyright © 2018 Elsevier Inc. All rights reserved.
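The PSC framework's final step, extracting binary and weighted networks from tractography, reduces to counting streamlines between parcel pairs. The sketch below is an illustration of that reduction only (not the PSC code; parcel labels and counts are synthetic): each streamline is represented by the pair of cortical parcels it connects, streamline counts give the weighted connectome, and thresholding at zero gives the binary one.

```python
# Illustrative sketch (not the PSC implementation): given each
# tractography streamline reduced to the pair of cortical parcels it
# connects, build a weighted connectome (streamline counts per parcel
# pair) and binarize it. Stdlib only; data are synthetic.
from collections import Counter

def build_connectomes(endpoint_pairs, n_parcels):
    # Sort each pair so (i, j) and (j, i) count as the same connection.
    counts = Counter(tuple(sorted(p)) for p in endpoint_pairs)
    weighted = [[0] * n_parcels for _ in range(n_parcels)]
    for (i, j), c in counts.items():
        weighted[i][j] = weighted[j][i] = c
    binary = [[1 if w > 0 else 0 for w in row] for row in weighted]
    return weighted, binary

# Four streamlines among four parcels: three connect 0-1, one connects 2-3.
streams = [(0, 1), (1, 0), (0, 1), (2, 3)]
w, b = build_connectomes(streams, 4)
print(w[0][1], b[0][1], b[0][3])  # 3 1 0
```

The streamline post-processing steps the abstract lists (cutting, outlier removal, gray-matter dilation) would all happen before this counting stage, which is why they directly change the extracted weights.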
The Road to Reproducibility in Animal Research.
Jilka, Robert L
2016-07-01
Reproducibility of research findings is the hallmark of scientific advance. However, the recently noted lack of reproducibility and transparency of published research using animal models of human biology and disease has alarmed funders, scientists, and the public. Improved reporting of methodology and better use of statistical tools are needed to enhance the quality and utility of published research. Reporting guidelines like Animal Research: Reporting In Vivo Experiments (ARRIVE) have been devised to achieve these goals, but most biomedical research journals, including the JBMR, have not been able to obtain high compliance. Cooperative efforts among authors, reviewers, and editors, empowered by increased awareness of their responsibilities and enabled by user-friendly guidelines, are needed to solve this problem. © 2016 American Society for Bone and Mineral Research.
Do neural nets learn statistical laws behind natural language?
Takahashi, Shuntaro; Tanaka-Ishii, Kumiko
2017-01-01
The performance of deep learning in natural language processing has been spectacular, but the reasons for this success remain unclear because of the inherent complexity of deep learning. This paper provides empirical evidence of its effectiveness and of a limitation of neural networks for language engineering. Precisely, we demonstrate that a neural language model based on long short-term memory (LSTM) effectively reproduces Zipf's law and Heaps' law, two representative statistical properties underlying natural language. We discuss the quality of reproducibility and the emergence of Zipf's law and Heaps' law as training progresses. We also point out that the neural language model has a limitation in reproducing long-range correlation, another statistical property of natural language. This understanding could provide a direction for improving the architectures of neural networks.
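The two statistical properties the paper tests can be measured directly on any token sequence: Zipf's law concerns the rank-frequency distribution of words, and Heaps' law concerns vocabulary growth with corpus length. A minimal stdlib sketch of both measurements, on a toy corpus:

```python
# Measuring the two laws discussed above on a toy corpus. Zipf's law:
# word frequency falls off with frequency rank. Heaps' law: vocabulary
# size grows sublinearly with corpus length. Stdlib only.
from collections import Counter

def zipf_ranks(tokens):
    """Return (rank, frequency) pairs, most frequent word first."""
    freqs = sorted(Counter(tokens).values(), reverse=True)
    return list(enumerate(freqs, start=1))

def heaps_curve(tokens):
    """Vocabulary size after each successive token: the curve V(n)."""
    seen, curve = set(), []
    for tok in tokens:
        seen.add(tok)
        curve.append(len(seen))
    return curve

corpus = "the cat sat on the mat the cat ran".split()
print(zipf_ranks(corpus)[0])    # rank 1 word occurs 3 times -> (1, 3)
print(heaps_curve(corpus)[-1])  # final vocabulary size -> 6
```

Applying the same two measurements to text sampled from a trained language model, and comparing the resulting curves against those of the training corpus, is exactly the style of evaluation the abstract describes.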
Gabay, Yafit; Karni, Avi; Banai, Karen
2017-01-01
Speech perception can improve substantially with practice (perceptual learning) even in adults. Here we compared the effects of four training protocols that differed in whether and how task difficulty was changed during a training session, in terms of the gains attained and the ability to apply (transfer) these gains to previously un-encountered items (tokens) and to different talkers. Participants trained in judging the semantic plausibility of sentences presented as time-compressed speech and were tested on their ability to reproduce, in writing, the target sentences; trial-by-trial feedback was afforded in all training conditions. In two conditions task difficulty (low or high compression) was kept constant throughout the training session, whereas in the other two conditions task difficulty was changed in an adaptive manner (incrementally from easy to difficult, or using a staircase procedure). Compared to a control group (no training), all four protocols resulted in significant post-training improvement in the ability to reproduce the trained sentences accurately. However, training in the constant-high-compression protocol elicited the smallest gains in deciphering and reproducing trained items and in reproducing novel, untrained, items after training. Overall, these results suggest that training procedures that start off with relatively little signal distortion (“easy” items, not far removed from standard speech) may be advantageous compared to conditions wherein severe distortions are presented to participants from the very beginning of the training session. PMID:28545039
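A staircase procedure of the kind mentioned above can be sketched concretely: a common variant is a 2-down/1-up rule, which raises difficulty (here, time compression) after two consecutive correct responses and lowers it after any error. The parameters and step size below are illustrative, not those used in the study.

```python
# Sketch of an adaptive staircase difficulty schedule like the one the
# study contrasts with constant-difficulty training: a 2-down/1-up rule
# makes the task harder (more compression) after two consecutive correct
# trials and easier after an error. Parameters are illustrative.

def staircase(responses, start=0.3, step=0.05, lo=0.1, hi=0.9):
    """Track the compression level across a sequence of trial outcomes."""
    level, correct_streak, track = start, 0, [start]
    for correct in responses:
        if correct:
            correct_streak += 1
            if correct_streak == 2:          # two correct: harder
                level = min(hi, round(level + step, 10))
                correct_streak = 0
        else:                                 # one error: easier
            level = max(lo, round(level - step, 10))
            correct_streak = 0
        track.append(level)
    return track

track = staircase([True, True, True, False, True, True])
print(track)  # [0.3, 0.3, 0.35, 0.35, 0.3, 0.3, 0.35]
```

A 2-down/1-up rule converges near the difficulty at which the listener is correct about 71% of the time, which is why staircases keep trainees working close to, but not beyond, their current ability.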
Tracking maize pollen development by the Leaf Collar Method.
Begcy, Kevin; Dresselhaus, Thomas
2017-12-01
An easy and highly reproducible nondestructive method named the Leaf Collar Method is described to identify and characterize the different stages of pollen development in maize. In plants, many cellular events such as meiosis, asymmetric cell division, cell cycle regulation, cell fate determination, nucleus movement, vacuole formation, chromatin condensation and epigenetic modifications take place during pollen development. In maize, pollen development occurs in tassels that are confined within the internal stalk of the plant. Hence, identification of the different pollen developmental stages as a tool to investigate the above biological processes is impossible without dissecting the entire plant. Therefore, an efficient and reproducible method is necessary to isolate homogeneous cell populations at individual stages throughout pollen development without destroying the plant. Here, we describe a method to identify the various stages of pollen development in maize. Using the Leaf Collar Method in the maize inbred line B73, we have determined the duration of each stage from pollen mother cells before meiosis to mature tricellular pollen. Anther and tassel size as well as percentage of pollen stages were correlated with vegetative stages, which are easily recognized. The identification of stage-specific genes indicates the reproducibility of the method. In summary, we present an easy and highly reproducible nondestructive method to identify and characterize the different stages of pollen development in maize. This method now opens the way for many subsequent physiological, morphological and molecular analyses to study, for instance, transcriptomics, metabolomics, DNA methylation and chromatin patterns during normal and stressful conditions throughout pollen development in one of the economically most important grass species.
NASA Astrophysics Data System (ADS)
Erickson, T.
2016-12-01
Deriving actionable information from Earth observation data obtained from sensors or models can be quite complicated, and sharing those insights with others in a form that they can understand, reproduce, and improve upon is equally difficult. Journal articles, even if digital, commonly present just a summary of an analysis that cannot be understood in depth or reproduced without major effort on the part of the reader. Here we show a method of improving scientific literacy by pairing a recently developed scientific presentation technology (Jupyter Notebooks) with a petabyte-scale platform for accessing and analyzing Earth observation and model data (Google Earth Engine). Jupyter Notebooks are interactive web documents that mix live code with annotations such as rich-text markup, equations, images, videos, hyperlinks and dynamic output. Notebooks were first introduced as part of the IPython project in 2011, and have since gained wide acceptance in the scientific programming community, initially among Python programmers but later by a wide range of scientific programming languages. While Jupyter Notebooks have been widely adopted for general data analysis, data visualization, and machine learning, to date there have been relatively few examples of using Jupyter Notebooks to analyze geospatial datasets. Google Earth Engine is cloud-based platform for analyzing geospatial data, such as satellite remote sensing imagery and/or Earth system model output. Through its Python API, Earth Engine makes petabytes of Earth observation data accessible, and provides hundreds of algorithmic building blocks that can be chained together to produce high-level algorithms and outputs in real-time. We anticipate that this technology pairing will facilitate a better way of creating, documenting, and sharing complex analyses that derive information on our Earth that can be used to promote broader understanding of the complex issues that it faces. 
http://jupyter.org https://earthengine.google.com
Veta, Mitko; van Diest, Paul J.; Jiwa, Mehdi; Al-Janabi, Shaimaa; Pluim, Josien P. W.
2016-01-01
Background: Tumor proliferation speed, most commonly assessed by counting of mitotic figures in histological slide preparations, is an important biomarker for breast cancer. Although mitosis counting is routinely performed by pathologists, it is a tedious and subjective task with poor reproducibility, particularly among non-experts. Inter- and intraobserver reproducibility of mitosis counting can be improved when a strict protocol is defined and followed. Previous studies have examined only the agreement in terms of the mitotic count or the mitotic activity score. Studies of the observer agreement at the level of individual objects, which can provide more insight into the procedure, have not been performed thus far. Methods: The development of automatic mitosis detection methods has received large interest in recent years. Automatic image analysis is viewed as a solution for the problem of subjectivity of mitosis counting by pathologists. In this paper we describe the results from an interobserver agreement study between three human observers and an automatic method, and make two unique contributions. For the first time, we present an analysis of the object-level interobserver agreement on mitosis counting. Furthermore, we train an automatic mitosis detection method that is robust with respect to staining appearance variability and compare it with the performance of expert observers on an “external” dataset, i.e. on histopathology images that originate from pathology labs other than the pathology lab that provided the training data for the automatic method. Results: The object-level interobserver study revealed that pathologists often do not agree on individual objects, even if this is not reflected in the mitotic count. The disagreement is larger for objects of smaller size, which suggests that adding a size constraint in the mitosis counting protocol can improve reproducibility.
The automatic mitosis detection method can perform mitosis counting in an unbiased way, with substantial agreement with human experts. PMID:27529701
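Object-level agreement between two observers who labelled the same candidate objects can be summarised with Cohen's kappa, a standard chance-corrected agreement statistic (the labels below are synthetic and the statistic is a generic illustration, not the analysis reported in the study):

```python
# Object-level interobserver agreement: two observers label the same
# candidate objects as mitosis (1) or not (0). Cohen's kappa corrects
# their raw agreement rate for the agreement expected by chance.

def cohens_kappa(a, b):
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    # Chance agreement from each observer's marginal positive rate.
    pa1, pb1 = sum(a) / n, sum(b) / n
    expected = pa1 * pb1 + (1 - pa1) * (1 - pb1)
    return (observed - expected) / (1 - expected)

obs1 = [1, 1, 0, 0, 1, 0, 0, 0]   # synthetic labels, observer 1
obs2 = [1, 0, 0, 0, 1, 0, 0, 1]   # synthetic labels, observer 2
print(round(cohens_kappa(obs1, obs2), 3))  # 0.467
```

Note that `obs1` and `obs2` here produce the same mitotic count (3 each) while disagreeing on two individual objects, which is precisely the phenomenon the object-level analysis exposes and a count-level comparison hides.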
Finak, Greg; Frelinger, Jacob; Jiang, Wenxin; Newell, Evan W.; Ramey, John; Davis, Mark M.; Kalams, Spyros A.; De Rosa, Stephen C.; Gottardo, Raphael
2014-01-01
Flow cytometry is used increasingly in clinical research for cancer, immunology and vaccines. Technological advances in cytometry instrumentation are increasing the size and dimensionality of data sets, posing a challenge for traditional data management and analysis. Automated analysis methods, despite a general consensus of their importance to the future of the field, have been slow to gain widespread adoption. Here we present OpenCyto, a new BioConductor infrastructure and data analysis framework designed to lower the barrier of entry to automated flow data analysis algorithms by addressing key areas that we believe have held back wider adoption of automated approaches. OpenCyto supports end-to-end data analysis that is robust and reproducible while generating results that are easy to interpret. We have improved the existing, widely used core BioConductor flow cytometry infrastructure by allowing analysis to scale in a memory efficient manner to the large flow data sets that arise in clinical trials, and integrating domain-specific knowledge as part of the pipeline through the hierarchical relationships among cell populations. Pipelines are defined through a text-based csv file, limiting the need to write data-specific code, and are data agnostic to simplify repetitive analysis for core facilities. We demonstrate how to analyze two large cytometry data sets: an intracellular cytokine staining (ICS) data set from a published HIV vaccine trial focused on detecting rare, antigen-specific T-cell populations, where we identify a new subset of CD8 T-cells with a vaccine-regimen specific response that could not be identified through manual analysis, and a CyTOF T-cell phenotyping data set where a large staining panel and many cell populations are a challenge for traditional analysis. The substantial improvements to the core BioConductor flow cytometry packages give OpenCyto the potential for wide adoption. 
It can rapidly leverage new developments in computational cytometry and facilitate reproducible analysis in a unified environment. PMID:25167361
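OpenCyto's pipelines are declared in a csv gating template rather than in analysis-specific code. OpenCyto itself is R/BioConductor and its real template columns differ; the stdlib Python sketch below only illustrates the underlying idea of a text-defined hierarchical gating scheme, with hypothetical column names and threshold gates applied to synthetic events:

```python
# Illustration of a csv-defined hierarchical gating pipeline in the
# spirit of OpenCyto's gating templates (OpenCyto is R/BioConductor and
# uses different columns; names here are hypothetical). Each row names
# a population, its parent, a channel, and a threshold; rows run in order.
import csv
import io

TEMPLATE = """population,parent,channel,threshold
CD3+,root,CD3,100
CD8+,CD3+,CD8,150
"""

def run_pipeline(template_text, events):
    populations = {"root": events}
    for row in csv.DictReader(io.StringIO(template_text)):
        parent = populations[row["parent"]]          # hierarchy: gate on parent
        thr = float(row["threshold"])
        populations[row["population"]] = [
            e for e in parent if e[row["channel"]] > thr
        ]
    return populations

events = [
    {"CD3": 200, "CD8": 300},   # CD3+ CD8+
    {"CD3": 150, "CD8": 50},    # CD3+ CD8-
    {"CD3": 20,  "CD8": 400},   # CD3-
]
pops = run_pipeline(TEMPLATE, events)
print(len(pops["CD3+"]), len(pops["CD8+"]))  # 2 1
```

Keeping the gating scheme in a plain-text template is what makes the pipeline data-agnostic: the same template reruns unchanged on every new batch, which is the reproducibility property the abstract emphasizes for core facilities.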
Data standards can boost metabolomics research, and if there is a will, there is a way.
Rocca-Serra, Philippe; Salek, Reza M; Arita, Masanori; Correa, Elon; Dayalan, Saravanan; Gonzalez-Beltran, Alejandra; Ebbels, Tim; Goodacre, Royston; Hastings, Janna; Haug, Kenneth; Koulman, Albert; Nikolski, Macha; Oresic, Matej; Sansone, Susanna-Assunta; Schober, Daniel; Smith, James; Steinbeck, Christoph; Viant, Mark R; Neumann, Steffen
2016-01-01
Thousands of articles using metabolomics approaches are published every year. With the increasing amounts of data being produced, mere description of investigations as text in manuscripts is not sufficient to enable re-use anymore: the underlying data needs to be published together with the findings in the literature to maximise the benefit from public and private expenditure and to take advantage of an enormous opportunity to improve scientific reproducibility in metabolomics and cognate disciplines. Reporting recommendations in metabolomics started to emerge about a decade ago and were mostly concerned with inventories of the information that had to be reported in the literature for consistency. In recent years, metabolomics data standards have developed extensively, to include the primary research data, derived results and the experimental description and importantly the metadata in a machine-readable way. This includes vendor independent data standards such as mzML for mass spectrometry and nmrML for NMR raw data that have both enabled the development of advanced data processing algorithms by the scientific community. Standards such as ISA-Tab cover essential metadata, including the experimental design, the applied protocols, association between samples, data files and the experimental factors for further statistical analysis. Altogether, they pave the way for both reproducible research and data reuse, including meta-analyses. Further incentives to prepare standards compliant data sets include new opportunities to publish data sets, but also require a little "arm twisting" in the author guidelines of scientific journals to submit the data sets to public repositories such as the NIH Metabolomics Workbench or MetaboLights at EMBL-EBI. In the present article, we look at standards for data sharing, investigate their impact in metabolomics and give suggestions to improve their adoption.
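The machine-readable study metadata the article advocates can be as small as a tab-separated sample table. The sketch below writes and re-reads an ISA-Tab-style table with the stdlib; the header names echo ISA-Tab conventions ("Source Name", "Sample Name", "Factor Value[...]"), but this is a simplified illustration, not a validator-compliant ISA-Tab file.

```python
# Simplified illustration of machine-readable study metadata: an
# ISA-Tab-style, tab-separated sample table. Headers echo ISA-Tab
# conventions but this is not a validator-compliant ISA-Tab file.
import csv
import io

rows = [
    {"Source Name": "patient1", "Sample Name": "plasma1",
     "Factor Value[treatment]": "drug"},
    {"Source Name": "patient2", "Sample Name": "plasma2",
     "Factor Value[treatment]": "placebo"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(rows[0]), delimiter="\t")
writer.writeheader()
writer.writerows(rows)
study_table = buf.getvalue()

# Any downstream tool can now recover the experimental factors without
# re-reading the manuscript text.
parsed = list(csv.DictReader(io.StringIO(study_table), delimiter="\t"))
print(parsed[1]["Factor Value[treatment]"])  # placebo
```

Encoding the experimental design this way, rather than as prose, is what enables the meta-analyses and automated reuse the abstract argues for.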
Fortney, Kristen; Griesman, Joshua; Kotlyar, Max; Pastrello, Chiara; Angeli, Marc; Tsao, Ming-Sound; Jurisica, Igor
2015-01-01
Repurposing FDA-approved drugs with the aid of gene signatures of disease can accelerate the development of new therapeutics. A major challenge to developing reliable drug predictions is heterogeneity. Different gene signatures of the same disease or drug treatment often show poor overlap across studies, as a consequence of both biological and technical variability, and this can affect the quality and reproducibility of computational drug predictions. Existing algorithms for signature-based drug repurposing use only individual signatures as input. But for many diseases, there are dozens of signatures in the public domain. Methods that exploit all available transcriptional knowledge on a disease should produce improved drug predictions. Here, we adapt an established meta-analysis framework to address the problem of drug repurposing using an ensemble of disease signatures. Our computational pipeline takes as input a collection of disease signatures, and outputs a list of drugs predicted to consistently reverse pathological gene changes. We apply our method to conduct the largest and most systematic repurposing study on lung cancer transcriptomes, using 21 signatures. We show that scaling up transcriptional knowledge significantly increases the reproducibility of top drug hits, from 44% to 78%. We extensively characterize drug hits in silico, demonstrating that they slow growth significantly in nine lung cancer cell lines from the NCI-60 collection, and identify CALM1 and PLA2G4A as promising drug targets for lung cancer. Our meta-analysis pipeline is general, and applicable to any disease context; it can be applied to improve the results of signature-based drug repurposing by leveraging the large number of disease signatures in the public domain. PMID:25786242
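The reversal principle behind the pipeline can be sketched numerically: a drug scores well when genes up-regulated in disease go down after treatment and vice versa, and averaging the score over an ensemble of disease signatures tames signature-to-signature variability. The scoring scheme below is an illustration of that idea, not the authors' statistic; gene names and signs are synthetic.

```python
# Sketch of signature-reversal scoring over an ensemble of disease
# signatures (illustrative scheme, not the paper's statistic).
# Signatures map genes to direction of regulation: +1 up, -1 down.

def reversal_score(disease_sig, drug_sig):
    """Mean of +1 per reversed shared gene, -1 per concordant one."""
    shared = [g for g in disease_sig if g in drug_sig]
    if not shared:
        return 0.0
    score = sum(
        1 if disease_sig[g] * drug_sig[g] < 0 else -1 for g in shared
    )
    return score / len(shared)

def ensemble_score(disease_sigs, drug_sig):
    """Average the reversal score over all available disease signatures."""
    return sum(reversal_score(s, drug_sig) for s in disease_sigs) / len(disease_sigs)

# Two noisy signatures of the same disease and one drug signature.
sig_a = {"EGFR": 1, "TP53": -1, "MYC": 1}
sig_b = {"EGFR": 1, "TP53": -1, "KRAS": 1}
drug = {"EGFR": -1, "TP53": 1, "MYC": -1, "KRAS": 1}
print(ensemble_score([sig_a, sig_b], drug))
```

Because `sig_a` and `sig_b` overlap only partially, either one alone would rank the drug differently; averaging over the ensemble is the meta-analysis step that the paper shows improves the reproducibility of top drug hits.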
NASA Astrophysics Data System (ADS)
Lee, Sungkyu
2001-08-01
Quartz tuning fork blanks with improved impact-resistant characteristics for use in Qualcomm mobile station modem (MSM)-3000 central processing unit (CPU) chips for code division multiple access (CDMA), personal communication system (PCS), and global system for mobile communication (GSM) systems were designed using finite element method (FEM) analysis and suitable processing conditions were determined for the reproducible precision etching of a Z-cut quartz wafer into an array of tuning forks. Negative photoresist photolithography for the additive process was used in preference to positive photoresist photolithography for the subtractive process to etch the array of quartz tuning forks. The tuning fork pattern was transferred via a conventional photolithographical chromium/quartz glass template using a standard single-sided aligner and subsequent negative photoresist development. A tightly adhering and pinhole-free 600/2000 Å chromium/gold mask was coated over the developed photoresist pattern which was subsequently stripped in acetone. This procedure was repeated on the back surface of the wafer. With the protective metallization area of the tuning fork geometry thus formed, etching through the quartz wafer was performed at 80°C in a ± 1.5°C controlled bath containing a concentrated solution of ammonium bifluoride to remove the unwanted areas of the quartz wafer. The quality of the quartz wafer surface finish after quartz etching depended primarily on the surface finish of the quartz wafer prior to etching and the quality of quartz crystals used. Selective etching of a 100 μm quartz wafer could be achieved within 90 min at 80°C. A selective etching procedure with reproducible precision has thus been established and enables the photolithographic mass production of miniature tuning fork resonators.
CAMECA IMS 1300-HR3: The New Generation Ion Microprobe
NASA Astrophysics Data System (ADS)
Peres, P.; Choi, S. Y.; Renaud, L.; Saliot, P.; Larson, D. J.
2016-12-01
The success of secondary ion mass spectrometry (SIMS) in Geo- and Cosmo-chemistry relies on its performance in terms of: 1) very high sensitivity (mandatory for high precision measurements or to achieve low detection limits); 2) a broad mass range of elemental and isotopic species, from low mass (H) to high mass (U and above); 3) in-situ analysis of any solid flat polished surface; and 4) high spatial resolution from tens of microns down to sub-micron scale. The IMS 1300-HR3 (High Reproducibility, High spatial Resolution, High mass Resolution) is the latest generation of CAMECA's large geometry magnetic sector SIMS (or ion microprobe), successor to the internationally recognized IMS 1280-HR. The 1300-HR3 delivers unmatched analytical performance for a wide range of applications (stable isotopes, geochronology, trace elements, nuclear safeguards and environmental studies…) due to: • High brightness RF-plasma oxygen ion source with enhanced beam density and current stability, dramatically improving spatial resolution, data reproducibility, and throughput • Automated sample loading system with motorized sample height (Z) adjustment, significantly increasing analysis precision, ease-of-use, and productivity • UV-light microscope for enhanced optical image resolution, together with dedicated software for easy sample navigation (developed by University of Wisconsin, USA) • Low noise 10^12 Ω resistor Faraday cup preamplifier boards for measuring low signal intensities In addition, improvements in electronics and software have been integrated into the new instrument. In order to meet a growing demand from geochronologists, CAMECA also introduces the KLEORA, which is a fully optimized ion microprobe for advanced mineral dating derived from the IMS 1300-HR3. Instrumental developments as well as data obtained for stable isotope and U-Pb dating applications will be presented in detail.
Andrade, E L; Bento, A F; Cavalli, J; Oliveira, S K; Freitas, C S; Marcon, R; Schwanke, R C; Siqueira, J M; Calixto, J B
2016-10-24
This review presents a historical overview of drug discovery and the non-clinical stages of the drug development process, from initial target identification and validation, through in silico assays and high throughput screening (HTS), identification of lead molecules and their optimization, to the selection of a candidate substance for clinical development and the use of animal models during early proof-of-concept (or proof-of-principle) studies. This report also discusses the relevance of selecting validated and predictive animal models, as well as the correct use of animal tests with respect to experimental design, execution and interpretation, all of which affect the reproducibility, quality and reliability of the non-clinical studies needed to translate to and support clinical studies. Collectively, improving these aspects will certainly contribute to the robustness of both scientific publications and the translation of new substances to clinical development.
Kwakwa, Kristin A; Vanderburgh, Joseph P; Guelcher, Scott A; Sterling, Julie A
2017-08-01
Bone is a structurally unique microenvironment that presents many challenges for the development of 3D models for studying bone physiology and diseases, including cancer. As researchers continue to investigate the interactions within the bone microenvironment, the development of 3D models of bone has become critical. 3D models have been developed that replicate some properties of bone, but have not fully reproduced the complex structural and cellular composition of the bone microenvironment. This review will discuss 3D models including polyurethane, silk, and collagen scaffolds that have been developed to study tumor-induced bone disease. In addition, we discuss 3D printing techniques used to better replicate the structure of bone. 3D models that better replicate the bone microenvironment will help researchers better understand the dynamic interactions between tumors and the bone microenvironment, ultimately leading to better models for testing therapeutics and predicting patient outcomes.
Animal models for clinical and gestational diabetes: maternal and fetal outcomes
Kiss, Ana CI; Lima, Paula HO; Sinzato, Yuri K; Takaku, Mariana; Takeno, Marisa A; Rudge, Marilza VC; Damasceno, Débora C
2009-01-01
Background Diabetes in pregnant women is associated with an increased risk of maternal and neonatal morbidity and remains a significant medical challenge. Diabetes during pregnancy may be divided into clinical diabetes and gestational diabetes. Experimental models are developed with the purpose of enhancing understanding of the pathophysiological mechanisms of diseases that affect humans. With regard to diabetes in pregnancy, experimental findings from models will lead to the development of treatment strategies to maintain a normal metabolic intrauterine milieu, improving perinatal development by preventing fetal growth restriction or macrosomia. Based on animal models of diabetes during pregnancy previously reported in the medical literature, the present study aimed to compare the impact of streptozotocin-induced severe (glycemia >300 mg/dl) and mild diabetes (glycemia between 120 and 300 mg/dl) on glycemia and maternal reproductive and fetal outcomes of Wistar rats to evaluate whether the animal model reproduces the maternal and perinatal results of clinical and gestational diabetes in humans. Methods On day 5 of life, 96 female Wistar rats were assigned to three experimental groups: control (n = 16), severe (n = 50) and mild diabetes (n = 30). At day 90 of life, rats were mated. On day 21 of pregnancy, rats were killed and their uterine horns were exposed to count implantation and fetus numbers to determine pre- and post-implantation loss rates. The fetuses were classified according to their birth weight. Results Severe and mild diabetic dams showed different glycemic responses during pregnancy, impairing fetal glycemia and weight, confirming that maternal glycemia is directly associated with fetal development. Newborns from severe diabetic mothers presented growth restriction, but mild diabetic mothers were not associated with an increased rate of macrosomic fetuses. 
Conclusion Experimental models of severe diabetes during pregnancy reproduced maternal and fetal outcomes of pregnant women presenting uncontrolled clinical diabetes. On the other hand, the mild diabetes model caused mild hyperglycemia during pregnancy, although it was not enough to reproduce the increased rate of macrosomic fetuses seen in women with gestational diabetes. PMID:19840387
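The grouping criterion stated above (severe: glycemia >300 mg/dl; mild: 120-300 mg/dl) can be encoded directly; the function below is ours, for illustration, and is not part of the study's methods.

```python
# Illustrative helper (not from the paper): classify a rat's diabetes-model
# group from its glycemia, using the thresholds stated in the abstract.

def classify_glycemia(glycemia_mg_dl):
    """Severe: >300 mg/dl; mild: 120-300 mg/dl; otherwise normoglycemic."""
    if glycemia_mg_dl > 300:
        return "severe"
    if 120 <= glycemia_mg_dl <= 300:
        return "mild"
    return "normoglycemic"

print(classify_glycemia(350))  # severe
print(classify_glycemia(180))  # mild
print(classify_glycemia(95))   # normoglycemic
```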
Schophuizen, Carolien M S; De Napoli, Ilaria E; Jansen, Jitske; Teixeira, Sandra; Wilmer, Martijn J; Hoenderop, Joost G J; Van den Heuvel, Lambert P W; Masereeuw, Rosalinde; Stamatialis, Dimitrios
2015-03-01
The need for improved renal replacement therapies has stimulated innovative research for the development of a cell-based renal assist device. A key requirement for such a device is the formation of a "living membrane", consisting of a tight kidney cell monolayer with preserved functional organic ion transporters on a suitable artificial membrane surface. In this work, we applied a unique conditionally immortalized proximal tubule epithelial cell (ciPTEC) line with an optimized coating strategy on polyethersulfone (PES) membranes to develop a living membrane with a functional proximal tubule epithelial cell layer. PES membranes were coated with combinations of 3,4-dihydroxy-l-phenylalanine and human collagen IV (Coll IV). The optimal coating time and concentrations were determined to achieve retention of vital blood components while preserving high water transport and optimal ciPTEC adhesion. The ciPTEC monolayers obtained were examined through immunocytochemistry to detect zona occludens 1 tight junction proteins. Reproducible monolayers were formed when using a combination of 2 mg ml(-1) 3,4-dihydroxy-l-phenylalanine (4 min coating, 1h dissolution) and 25 μg ml(-1) Coll IV (4 min coating). The successful transport of (14)C-creatinine through the developed living membrane system was used as an indication for organic cation transporter functionality. The addition of metformin or cimetidine significantly reduced the creatinine transepithelial flux, indicating active creatinine uptake in ciPTECs, most likely mediated by the organic cation transporter, OCT2 (SLC22A2). In conclusion, this study shows the successful development of a living membrane consisting of a reproducible ciPTEC monolayer on PES membranes, an important step towards the development of a bioartificial kidney. Copyright © 2014 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
Cassel, Jean-Christophe; Mathis, Chantal; Majchrzak, Monique; Moreau, Pierre-Henri; Dalrymple-Alford, John C
2008-01-01
One century after Alzheimer's initial report, a variety of animal models of Alzheimer's disease (AD) are being used to mimic one or more pathological signs viewed as critical for the evolution of cognitive decline in dementia. Among the most common are, (a) traditional lesion models aimed at reproducing the degeneration of one of two key brain regions affected in AD, namely the cholinergic basal forebrain (CBF) and the transentorhinal region, and (b) transgenic mouse models aimed at reproducing AD histopathological hallmarks, namely amyloid plaques and neurofibrillary tangles. These models have provided valuable insights into the development and consequences of the pathology, but they have not consistently reproduced the severity of memory deficits exhibited in AD. The reasons for this lack of correspondence with the severity of expected deficits may include the limited replication of multiple neuropathology in potentially key brain regions. A recent lesion model in the rat found that severe memory impairment was obtained only when the two traditional lesions were combined together (i.e. conjoint CBF and entorhinal cortex lesions), indicative of a dramatic impact on cognitive function when there is coexisting, rather than isolated, damage in these two brain regions. It is proposed that combining AD transgenic mouse models with additional experimental damage to both the CBF and entorhinal regions might provide a unique opportunity to further understand the evolution of the disease and improve treatments of severe cognitive dysfunction in neurodegenerative dementias. (c) 2008 S. Karger AG, Basel
Solenoid Driven Pressure Valve System: Toward Versatile Fluidic Control in Paper Microfluidics.
Kim, Taehoon H; Hahn, Young Ki; Lee, Jungmin; van Noort, Danny; Kim, Minseok S
2018-02-20
Although paper-based diagnostics is increasingly driven by advanced microfluidic technology, much research effort is still devoted to developing reliable and versatile fluidic control devices, in addition to improving sensitivity and reproducibility. In this work, we introduce a novel and robust paper fluidic control system enabling versatile fluidic control. The system comprises a linear push-pull solenoid and an Arduino Uno microcontroller. The precisely controlled pressure exerted on the paper stops the flow. We first determined the stroke distance of the solenoid to obtain a constant pressure while examining the fluidic time delay as a function of the pressure. Results showed that strips of grade 1 chromatography paper had superior reproducibility in fluid transport. Next, we characterized the reproducibility of the fluidic velocity, which depends on the type and grade of paper used. As such, we were able to control the flow velocity on the paper and also achieve a complete stop of flow with a pressure over 2.0 MPa. Notably, after the actuation of the pressure-driven valve (PDV), the previously pressed area regained its original flow properties. This means that multiple valve operations can be successfully conducted even on a previously pressed area. To the best of our knowledge, this is the first demonstration of an active and repetitive valve operation in paper microfluidics. As a proof of concept, we performed multistep detection in the form of an enzyme-linked immunosorbent assay with mouse IgG as the target analyte.
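The valve's on/off behaviour can be sketched as follows. The real system drives a solenoid from an Arduino Uno; this Python sketch only mirrors the control logic, with the 2.0 MPa stop threshold taken from the abstract and the function names invented.

```python
# Conceptual sketch of the pressure-driven valve (PDV) logic described above.
# The hardware is a solenoid driven by an Arduino Uno; here the behaviour is
# simulated, and 2.0 MPa is the flow-stop threshold reported in the abstract.

STOP_PRESSURE_MPA = 2.0  # pressure at which flow through the paper stops

def flow_state(applied_pressure_mpa):
    """Return 'stopped' when the solenoid presses the paper hard enough."""
    return "stopped" if applied_pressure_mpa >= STOP_PRESSURE_MPA else "flowing"

# Repeated valve actuations on the same spot: the paper recovers each time,
# so flow resumes whenever the pressure is released.
for pressure in [0.0, 2.5, 0.0, 2.5, 0.0]:
    print(flow_state(pressure))  # alternates flowing / stopped
```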
Evolvix BEST Names for semantic reproducibility across code2brain interfaces.
Loewe, Laurence; Scheuer, Katherine S; Keel, Seth A; Vyas, Vaibhav; Liblit, Ben; Hanlon, Bret; Ferris, Michael C; Yin, John; Dutra, Inês; Pietsch, Anthony; Javid, Christine G; Moog, Cecilia L; Meyer, Jocelyn; Dresel, Jerdon; McLoone, Brian; Loberger, Sonya; Movaghar, Arezoo; Gilchrist-Scott, Morgaine; Sabri, Yazeed; Sescleifer, Dave; Pereda-Zorrilla, Ivan; Zietlow, Andrew; Smith, Rodrigo; Pietenpol, Samantha; Goldfinger, Jacob; Atzen, Sarah L; Freiberg, Erika; Waters, Noah P; Nusbaum, Claire; Nolan, Erik; Hotz, Alyssa; Kliman, Richard M; Mentewab, Ayalew; Fregien, Nathan; Loewe, Martha
2017-01-01
Names in programming are vital for understanding the meaning of code and big data. We define code2brain (C2B) interfaces as maps in compilers and brains between meaning and naming syntax, which help to understand executable code. While working toward an Evolvix syntax for general-purpose programming that makes accurate modeling easy for biologists, we observed how names affect C2B quality. To protect learning and coding investments, C2B interfaces require long-term backward compatibility and semantic reproducibility (accurate reproduction of computational meaning from coder-brains to reader-brains by code alone). Semantic reproducibility is often assumed until confusing synonyms degrade modeling in biology to deciphering exercises. We highlight empirical naming priorities from diverse individuals and roles of names in different modes of computing to show how naming easily becomes impossibly difficult. We present the Evolvix BEST (Brief, Explicit, Summarizing, Technical) Names concept for reducing naming priority conflicts, test it on a real challenge by naming subfolders for the Project Organization Stabilizing Tool system, and provide naming questionnaires designed to facilitate C2B debugging by improving names used as keywords in a stabilizing programming language. Our experiences inspired us to develop Evolvix using a flipped programming language design approach with some unexpected features and BEST Names at its core. © 2016 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.
Using Docker Containers to Extend Reproducibility Architecture for the NASA Earth Exchange (NEX)
NASA Technical Reports Server (NTRS)
Votava, Petr; Michaelis, Andrew; Spaulding, Ryan; Becker, Jeffrey C.
2016-01-01
NASA Earth Exchange (NEX) is a data, supercomputing and knowledge collaboratory that houses NASA satellite, climate and ancillary data where a focused community can come together to address large-scale challenges in Earth sciences. As NEX has been growing into a petabyte-size platform for analysis, experiments and data production, it has become increasingly important to enable users to easily retrace their steps, identify what datasets were produced by which process chains, and give them the ability to readily reproduce their results. This can be a tedious and difficult task even for a small project, and is almost impossible on large processing pipelines. We have developed an initial reproducibility and knowledge capture solution for NEX; however, if users want to move the code to another system, whether it is their home institution's cluster, a laptop or the cloud, they have to find, build and install all the dependencies required to run their code. This can be a very tedious and tricky process and is a big impediment to moving code to data and to reproducibility outside the original system. The NEX team has tried to assist users who wanted to move their code into OpenNEX on the Amazon cloud by creating custom virtual machines with all the software and dependencies installed, but this, while solving some of the issues, creates a new bottleneck that requires the NEX team to be involved with any new request, updates to virtual machines and general maintenance support. In this presentation, we will describe a solution that integrates NEX and Docker to bridge the gap in code-to-data migration. The core of the solution is semi-automatic conversion of science codes, tools and services that are already tracked and described in the NEX provenance system to Docker, an open-source Linux container software. Docker is available on most computer platforms, easy to install and capable of seamlessly creating and/or executing any application packaged in the appropriate format.
We believe this is an important step towards seamless process deployment in heterogeneous environments that will enhance community access to NASA data and tools in a scalable way, promote software reuse, and improve reproducibility of scientific results.
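A container recipe of the kind such a conversion might emit can be sketched as follows; the base image, file names and entry point here are invented for illustration and are not taken from NEX.

```dockerfile
# Hypothetical example of packaging a science code as a Docker image so it
# runs identically on a cluster, a laptop, or the cloud. All names invented.
FROM python:3.10-slim

# Pin the analysis code's dependencies so any host reproduces the same stack
COPY requirements.txt /app/requirements.txt
RUN pip install --no-cache-dir -r /app/requirements.txt

# Copy the provenance-tracked science code and declare its entry point
COPY process_scene.py /app/process_scene.py
WORKDIR /app
ENTRYPOINT ["python", "process_scene.py"]
```

Because the image bundles the code with its exact dependency stack, `docker build` and `docker run` replace the manual find-build-install cycle described above.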
Bankova, Andriyana; Andres, Yvonne; Horn, Michael P.; Alberio, Lorenzo
2017-01-01
Background Immunoassays are crucial in the work-up of patients with suspected heparin-induced thrombocytopenia (HIT), and rapid tests have recently been developed. However, comparative data on the diagnostic accuracy, reproducibility, and analytical costs of different immunoassays in clinical practice are limited. Methods Samples of 179 consecutive patients evaluated for suspected HIT in clinical practice using a polyspecific enzyme-linked immunosorbent assay (GTI Diagnostics; ELISA) and a rapid particle gel immunoassay (PaGIA) were additionally analysed with an IgG-specific chemiluminescent immunoassay (AcuStar HIT-IgG). Presence of HIT was defined as a positive functional heparin-induced platelet aggregation test. Diagnostic accuracy was determined for low, intermediate and high thresholds as previously established (ELISA: optical density 0.4, 1.3, and 2.0, respectively; PaGIA: positive/negative, titre of 4, titre of 32; AcuStar HIT-IgG: 1.0 U/ml, 2.8, 9.4) and reproducibility was assessed by repeated measurements. Costs of test determination were calculated taking into account reagents, controls, and working time of technicians according to the Swiss health care system. Results Data on PaGIA results were available for 171 patients (95.5%), ELISA for 144 patients (80.4%), and AcuStar HIT-IgG for 179 patients (100%). Sensitivity was above 95% for all assays at low and intermediate thresholds. Specificity increased with higher thresholds and was above 90% for all assays at intermediate and high thresholds. Specificity of AcuStar HIT-IgG (92.8%; 95% CI 87.7, 96.2) was significantly higher than that of PaGIA (83.0%; 95% CI 76.3, 88.5) and higher than that of ELISA (81.8%; 95% CI 74.2, 88.0) at the low threshold (p<0.05). Reproducibility was adequate for all assays. Total costs per test were CHF 51.02 for ELISA, CHF 117.70 for AcuStar HIT-IgG, and CHF 83.13 for PaGIA. Conclusions We observed favourable diagnostic accuracy measures and high reproducibility for PaGIA and AcuStar HIT-IgG.
Implementation into a 24-hour service might improve patient care, but the results must be confirmed in other settings and larger populations. PMID:28594835
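The accuracy measures reported above follow the standard confusion-matrix definitions, which can be sketched as follows; the counts used here are invented for illustration and are not the study's data.

```python
# Generic diagnostic-accuracy helpers (not the study's code): sensitivity
# and specificity of an immunoassay at a given positivity threshold.

def sensitivity(tp, fn):
    """Fraction of truly positive (functional-test-positive) patients detected."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of truly negative patients correctly called negative."""
    return tn / (tn + fp)

# Invented counts: raising the threshold trades a little sensitivity for
# specificity, as described in the abstract.
print(round(sensitivity(19, 1), 3))    # 0.95
print(round(specificity(140, 19), 3))  # 0.881
```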
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laugeman, E; Weiss, E; Chen, S
2014-06-01
Purpose: Evaluate and compare the cycle-to-cycle consistency of breathing patterns and their reproducibility over the course of treatment, for supine and prone positioning. Methods: Respiratory traces from 25 patients were recorded for sequential supine/prone 4DCT scans acquired prior to treatment, and during the course of the treatment (weekly or bi-weekly). For each breathing cycle, the average (AVE), end-of-exhale (EoE) and end-of-inhale (EoI) locations were identified using in-house developed software. In addition, the mean values and variations for the above quantities were computed for each breathing trace. F-tests were used to compare the cycle-to-cycle consistency of all pairs of sequential supine and prone scans. Analysis of variances was also performed using population means for AVE, EoE and EoI to quantify differences between the reproducibility of prone and supine respiration traces over the treatment course. Results: Consistency: Cycle-to-cycle variations are less in prone than supine in the pre-treatment and during-treatment scans for AVE, EoE and EoI points, for the majority of patients (differences significant at p<0.05). The few cases where the respiratory pattern had more variability in prone appeared to be random events. Reproducibility: The reproducibility of breathing patterns (supine and prone) improved as treatment progressed, perhaps due to patients becoming more comfortable with the procedure. However, variability in supine position continued to remain significantly larger than in prone (p<0.05), as indicated by the variance analysis of population means for the pre-treatment and subsequent during-treatment scans. Conclusions: Prone positioning stabilizes breathing patterns in most subjects investigated in this study.
Importantly, a parallel analysis of the same group of patients revealed a tendency towards increasing motion amplitude of tumor targets in prone position regardless of their size or location; thus, the choice of body positioning during radiation therapy will have to weigh the clinical relevance of the two opposing trends: breathing consistency and motion amplitude.
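A simplified version of the cycle-point extraction can be sketched in Python. This stands in for the unpublished in-house software: EoI and EoE points are taken as interior local maxima and minima of the trace, and their spread quantifies cycle-to-cycle consistency. The trace values are synthetic.

```python
# Illustrative sketch (not the study's software): find end-of-inhale (EoI)
# and end-of-exhale (EoE) points in a sampled breathing trace, then measure
# cycle-to-cycle variability as the spread of the EoI amplitudes.
from statistics import pstdev

def extrema(trace):
    """Interior local maxima (EoI) and minima (EoE) of a sampled trace."""
    eoi = [v for a, v, b in zip(trace, trace[1:], trace[2:]) if v > a and v > b]
    eoe = [v for a, v, b in zip(trace, trace[1:], trace[2:]) if v < a and v < b]
    return eoi, eoe

# Two synthetic breathing cycles (amplitude vs. sample index)
trace = [0.0, 0.8, 1.5, 0.7, 0.1, 0.9, 1.4, 0.6, 0.0]
eoi, eoe = extrema(trace)
print(eoi)          # [1.5, 1.4] -> the two inhale end-points
print(pstdev(eoi))  # ≈ 0.05, their cycle-to-cycle spread
```

A lower spread for prone traces than supine traces is the kind of difference the study's F-tests formalize.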
Improving the Impact and Return of Investment of Game-Based Learning
ERIC Educational Resources Information Center
Loh, Christian Sebastian
2013-01-01
Today's economic situation demands that learning organizations become more diligent in their business dealings to reduce cost and increase bottom line for survival. While there are many champions and proponents claiming that game-based learning (GBL) is sure to improve learning, researchers have, thus far, been unable to (re)produce concrete,…
ERIC Educational Resources Information Center
Goswick, Anna E.; Mullet, Hillary G.; Marsh, Elizabeth J.
2013-01-01
Children's memories improve throughout childhood, and this improvement is often accompanied by a reduction in suggestibility. In this context, it is surprising that older children learn and reproduce more factual errors from stories than do younger children (Fazio & Marsh, 2008). The present study examined whether this developmental…
Relaxing decision criteria does not improve recognition memory in amnesic patients.
Reber, P J; Squire, L R
1999-05-01
An important question about the organization of memory is whether information available in non-declarative memory can contribute to performance on tasks of declarative memory. Dorfman, Kihlstrom, Cork, and Misiaszek (1995) described a circumstance in which the phenomenon of priming might benefit recognition memory performance. They reported that patients receiving electroconvulsive therapy improved their recognition performance when they were encouraged to relax their criteria for endorsing test items as familiar. It was suggested that priming improved recognition by making information available about the familiarity of test items. In three experiments, we sought unsuccessfully to reproduce this phenomenon in amnesic patients. In Experiment 3, we reproduced the methods and procedure used by Dorfman et al. but still found no evidence for improved recognition memory following the manipulation of decision criteria. Although negative findings have their own limitations, our findings suggest that the phenomenon reported by Dorfman et al. does not generalize well. Our results agree with several recent findings that suggest that priming is independent of recognition memory and does not contribute to recognition memory scores.
Huillet, Céline; Adrait, Annie; Lebert, Dorothée; Picard, Guillaume; Trauchessec, Mathieu; Louwagie, Mathilde; Dupuis, Alain; Hittinger, Luc; Ghaleh, Bijan; Le Corvoisier, Philippe; Jaquinod, Michel; Garin, Jérôme; Bruley, Christophe; Brun, Virginie
2012-01-01
Development of new biomarkers needs to be significantly accelerated to improve diagnostic, prognostic, and toxicity monitoring as well as therapeutic follow-up. Biomarker evaluation is the main bottleneck in this development process. Selected Reaction Monitoring (SRM) combined with stable isotope dilution has emerged as a promising option to speed up this step, particularly because of its multiplexing capacity. However, analytical variability caused by upstream sample handling or incomplete trypsin digestion still needs to be resolved. In 2007, we developed the PSAQ™ method (Protein Standard Absolute Quantification), which uses full-length isotope-labeled protein standards to quantify target proteins. In the present study we used clinically validated cardiovascular biomarkers (LDH-B, CKMB, myoglobin, and troponin I) to demonstrate that the combination of PSAQ and SRM (PSAQ-SRM) allows highly accurate biomarker quantification in serum samples. A multiplex PSAQ-SRM assay was used to quantify these biomarkers in clinical samples from myocardial infarction patients. Good correlation between PSAQ-SRM and ELISA assay results was found, demonstrating the consistency between these analytical approaches. Thus, PSAQ-SRM has the capacity to improve both accuracy and reproducibility in protein analysis. This will be a major contribution to efficient biomarker development strategies. PMID:22080464
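The quantification principle behind isotope dilution with a protein standard can be sketched as a generic textbook calculation (this is not the authors' code, and the peak areas and spike amount are invented): the endogenous amount follows from the measured light/heavy SRM peak-area ratio and the known spiked amount of labeled standard.

```python
# Minimal sketch of absolute quantification with an isotope-labeled protein
# standard (the PSAQ idea). Numbers are invented for illustration.

def psaq_amount(area_light, area_heavy, spiked_fmol):
    """Endogenous amount = (light/heavy area ratio) x spiked standard amount."""
    return (area_light / area_heavy) * spiked_fmol

# 50 fmol of labeled standard spiked into the serum digest; the endogenous
# (light) peak is twice as intense as the labeled (heavy) peak.
print(psaq_amount(area_light=8.0e5, area_heavy=4.0e5, spiked_fmol=50.0))  # 100.0
```

Because the standard is a full-length protein digested alongside the sample, losses during handling and digestion affect both forms equally and cancel in the ratio, which is the method's robustness argument.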
Togni, P; Rijnen, Z; Numan, W C M; Verhaart, R F; Bakker, J F; van Rhoon, G C; Paulides, M M
2013-09-07
Accumulating evidence shows that hyperthermia improves head-and-neck cancer treatment. Over the last decade, we introduced a radiofrequency applicator, named HYPERcollar, which enables local heating of deep locations in this region as well. Based on clinical experience, we redesigned the HYPERcollar for improved comfort, reproducibility and operator handling. In the current study, we analyze the redesign from an electromagnetic point of view. We show that a higher number of antennas and their repositioning allow for a substantially improved treatment quality. Combined with the much better reproducibility of the water bolus, this will substantially minimize the risk of underexposure. All improvements combined enable a reduction of hot-spot prominence (hot-spot to target SAR quotient) by 32% at an average power of 981 W, which drastically reduces the probability that system power becomes a treatment-limiting factor. Moreover, the power deposited selectively in the target can be increased by more than twofold. Hence, we expect that the HYPERcollar redesign currently under construction will allow us to double the clinically applied power to the target while reducing the hot-spots, resulting in higher temperatures and, consequently, better clinical outcome.
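One plausible reading of the hot-spot prominence metric can be sketched with invented SAR samples; the precise definition below (peak normal-tissue SAR over mean target SAR) is our assumption for illustration, not taken from the paper.

```python
# Hedged sketch of a hot-spot-to-target SAR quotient. All SAR values are
# invented, and the exact definition is an assumption for illustration.

def hotspot_quotient(healthy_sar, target_sar):
    """Peak SAR in healthy tissue divided by mean SAR in the target."""
    return max(healthy_sar) / (sum(target_sar) / len(target_sar))

healthy = [12.0, 30.0, 18.0]   # W/kg, normal-tissue voxels
target = [40.0, 60.0, 50.0]    # W/kg, target voxels
print(hotspot_quotient(healthy, target))  # 0.6 (lower = better selectivity)
```

Lowering this quotient is what lets total power be raised without the hot-spots, rather than the target dose, becoming the limiting factor.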
An SU-8 liquid cell for surface acoustic wave biosensors
NASA Astrophysics Data System (ADS)
Francis, Laurent A.; Friedt, Jean-Michel; Bartic, Carmen; Campitelli, Andrew
2004-08-01
One significant challenge facing biosensor development is packaging. For surface acoustic wave based biosensors, packaging influences the general sensing performance. The acoustic wave is generated and received thanks to interdigital transducers and the separation between the transducers defines the sensing area. Liquids used in biosensing experiments lead to an attenuation of the acoustic signal while in contact with the transducers. We have developed a liquid cell based on photodefinable epoxy SU-8 that prevents the presence of liquid on the transducers, has a small disturbance effect on the propagation of the acoustic wave, does not interfere with the biochemical sensing event, and leads to an integrated sensor system with reproducible properties. The liquid cell is achieved in two steps. In a first step, the SU-8 is precisely patterned around the transducers to define 120 μm thick walls. In a second step and after the dicing of the sensors, a glass capping is placed manually and glued on top of the SU-8 walls. This design approach is an improvement compared to the more classical solution consisting of a pre-molded cell that must be pressed against the device in order to avoid leaks, with negative consequences on the reproducibility of the experimental results. We demonstrate the effectiveness of our approach by protein adsorption monitoring. The packaging materials do not interfere with the biomolecules and have a high chemical resistance. For future developments, wafer level bonding of the quartz capping onto the SU-8 walls is envisioned.
Vahle-Hinz, K; Rybczynski, A; Jakstat, H; Ahlers, M O
2009-01-01
Condylar position analysis facilitates a quantitative comparison of the condylar position with and without a bite record, between different records, and under changed influencing factors. Handling by the examiner when positioning the model is a significant factor with regard to the accuracy of the examination. Measurement accuracy could be improved when positioning the models by using special working bites; hence, the objective of the experiments described in this study was to examine the extent to which the measuring results are influenced by different examiners and by the use of working bites. In the first trial, one examiner performed ten measurements without and with an interposed working bite for five model pairs in each case. In the second trial, nine examiners (three specialized dentists, three dental assistants, three students) performed ten measurements in each case without and with an interposed working bite. The three-dimensional position was read digitally with the E-CPM (Gamma Dental, Klosterneuburg/Vienna, Austria), recorded by means of spreadsheet software (Microsoft Excel) and diagnostic software (CMDfact, CMD3D module, dentaConcept, Hamburg), and evaluated with graphing software (SigmaPlot, Systat Software, USA). In the first trial, it was shown that the reproducibility of mounting was improved markedly (p<0.01) by using bite records in the form of working bites. In the second trial, it was shown that the mean error increased significantly (p<0.01) when several examiners performed the measurements compared with the results of one examiner alone. No significantly different results occurred (p<0.01) in the comparison of the different groups of examiners with different educational and training backgrounds. This applied for the mounting methods without and with working bite. On the other hand, the reproducibility of mounting improved distinctly (p<0.01) in every group of examiners when working bites were used.
Reproducibility of condylar position analysis was improved significantly by mounting the models with special working bites. This applied for operators of different professional background (dentists, dental assistants and dental students), while there were no significant differences between results of the three groups.
Usmani, Muhammad Nauman; Takegawa, Hideki; Takashina, Masaaki; Numasaki, Hodaka; Suga, Masaki; Anetai, Yusuke; Kurosu, Keita; Koizumi, Masahiko; Teshima, Teruki
2014-11-01
Technical developments in radiotherapy (RT) have created a need for systematic quality assurance (QA) to ensure that clinical institutions deliver prescribed radiation doses consistent with the requirements of clinical protocols. For QA, an ideal dose verification system should be independent of the treatment-planning system (TPS). This paper describes the development and reproducibility evaluation of a Monte Carlo (MC)-based standard LINAC model as a preliminary requirement for independent verification of dose distributions. The BEAMnrc MC code is used for characterization of the 6-, 10- and 15-MV photon beams for a wide range of field sizes. The modeling of the LINAC head components is based on the specifications provided by the manufacturer. MC dose distributions are tuned to match Varian Golden Beam Data (GBD). For reproducibility evaluation, calculated beam data are compared with beam data measured at individual institutions. For all energies and field sizes, the MC and GBD agreed to within 1.0% for percentage depth doses (PDDs), 1.5% for beam profiles and 1.2% for total scatter factors (Scps). Reproducibility evaluation showed that the maximum average local differences were 1.3% and 2.5% for PDDs and beam profiles, respectively. MC and institutions' mean Scps agreed to within 2.0%. An MC-based standard LINAC model developed to independently verify dose distributions for QA of multi-institutional clinical trials and routine clinical practice has proven to be highly accurate and reproducible and can thus help ensure that prescribed doses delivered are consistent with the requirements of clinical protocols. © The Author 2014. Published by Oxford University Press on behalf of The Japan Radiation Research Society and Japanese Society for Radiation Oncology.
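The reproducibility evaluation described above compares calculated and institutional beam data point by point. A minimal sketch of such a local-percent-difference check follows; the function name and the exact metric are assumptions for illustration, not taken from the paper:

```python
def max_local_diff_percent(calc, meas):
    # Maximum point-by-point local percentage difference between
    # calculated (e.g., Monte Carlo) and measured depth-dose values,
    # each difference normalized to the measured value at that point.
    return max(abs(c - m) / m * 100.0 for c, m in zip(calc, meas))

# Hypothetical PDD samples: calculated vs. measured dose at three depths
calc = [100.0, 80.4, 59.7]
meas = [100.0, 80.0, 60.0]
print(max_local_diff_percent(calc, meas))  # 0.5
```

A tolerance such as the paper's 1.0% for PDDs would then be a simple threshold on this value.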
Reproducible detection of disease-associated markers from gene expression data.
Omae, Katsuhiro; Komori, Osamu; Eguchi, Shinto
2016-08-18
Detection of disease-associated markers plays a crucial role in gene screening for biological studies. Two-sample test statistics, such as the t-statistic, are widely used to rank genes based on gene expression data. However, the resultant gene ranking is often not reproducible among different data sets. Such irreproducibility may be caused by disease heterogeneity. When we divided data into two subsets, we found that the signs of the two t-statistics were often reversed. Focusing on such instability, we proposed a sign-sum statistic that counts the signs of the t-statistics for all possible subsets. The proposed method excludes genes affected by heterogeneity, thereby improving the reproducibility of gene ranking. We compared the sign-sum statistic with the t-statistic by a theoretical evaluation of the upper confidence limit. Through simulations and applications to real data sets, we show that the sign-sum statistic exhibits superior performance. We derive the sign-sum statistic to obtain a robust gene ranking, and it gives a more reproducible ranking than the t-statistic. Using simulated data sets, we show that the sign-sum statistic effectively excludes hetero-type genes; for the real data sets as well, it performs well in terms of ranking reproducibility.
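The abstract does not spell out how the subsets are enumerated, so the sketch below illustrates the sign-sum idea with random half-subsets rather than all possible subsets; it also exploits the fact that the sign of a two-sample t-statistic equals the sign of the difference of the subset means. All names are illustrative assumptions:

```python
import numpy as np

def sign_sum(case, control, n_splits=200, seed=0):
    # Toy sign-sum score for one gene: average the sign of the two-sample
    # t-statistic over many random half-subsets of the samples. A score
    # near +/-1 means the direction of differential expression is stable
    # across subsets; hetero-type genes flip sign and score near 0.
    rng = np.random.default_rng(seed)
    signs = []
    for _ in range(n_splits):
        ci = rng.choice(len(case), size=max(2, len(case) // 2), replace=False)
        ki = rng.choice(len(control), size=max(2, len(control) // 2), replace=False)
        # sign(t) == sign(mean difference), so the t denominator is not needed
        signs.append(np.sign(case[ci].mean() - control[ki].mean()))
    return float(np.mean(signs))

# A gene consistently up in cases scores 1.0: every subset agrees in sign
gene_up = sign_sum(np.array([5.0, 5.1, 5.2, 4.9, 5.0, 5.3]),
                   np.array([1.0, 1.1, 0.9, 1.2, 1.0, 0.8]))
print(gene_up)  # 1.0
```

Ranking genes by the absolute value of this score then favors genes whose direction of change is reproducible across sample subsets.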
Improving 3D Genome Reconstructions Using Orthologous and Functional Constraints
Diament, Alon; Tuller, Tamir
2015-01-01
The study of the 3D architecture of chromosomes has been advancing rapidly in recent years. While a number of methods for 3D reconstruction of genomic models based on Hi-C data have been proposed, most of the analyses in the field have been performed on different 3D representation forms (such as graphs). Here, we reproduce most of the previous results on the 3D genomic organization of the eukaryote Saccharomyces cerevisiae using analysis of 3D reconstructions. We show that many of these results can be reproduced in sparse reconstructions, generated from a small fraction of the experimental data (5% of the data), and study the properties of such models. Finally, we propose a novel approach for improving the accuracy of 3D reconstructions by introducing additional predicted physical interactions to the model, based on orthologous interactions in an evolutionarily related organism and on predicted functional interactions between genes. We demonstrate that this approach indeed leads to the reconstruction of improved models. PMID:26000633
Reproducibility in Computational Neuroscience Models and Simulations
McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.
2016-01-01
Objective: Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods: Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results: Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: 1. standardized neural simulators; 2. shared computational resources; 3. declarative model descriptors, ontologies and standardized annotations; 4. model sharing repositories and sharing standards. Conclusion: A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance: Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845
Reproducibility Issues: Avoiding Pitfalls in Animal Inflammation Models.
Laman, Jon D; Kooistra, Susanne M; Clausen, Björn E
2017-01-01
In light of an enhanced awareness of ethical questions and ever-increasing costs when working with animals in biomedical research, there is a dedicated and sometimes fierce debate concerning the (lack of) reproducibility of animal models and their relevance for human inflammatory diseases. Despite evident advancements in searching for alternatives, that is, replacing, reducing, and refining animal experiments (the three R's of Russell and Burch, 1959), understanding the complex interactions of the cells of the immune system, the nervous system and the affected tissue/organ during inflammation critically relies on in vivo models. Consequently, scientific advancement and ultimately novel therapeutic interventions depend on improving the reproducibility of animal inflammation models. As a prelude to the remaining hands-on protocols described in this volume, here, we summarize potential pitfalls of preclinical animal research and provide resources and background reading on how to avoid them.
Hadron rapidity spectra within a hybrid model
NASA Astrophysics Data System (ADS)
Khvorostukhin, A. S.; Toneev, V. D.
2017-01-01
A 2-stage hybrid model is proposed that joins the fast initial stage of interaction, described by the hadron string dynamics (HSD) model, to the subsequent evolution of the expanding system at the second stage, treated within ideal hydrodynamics. The developed hybrid model is intended to describe heavy-ion collisions in the energy range of the NICA collider under construction in Dubna. Generally, the model is in reasonable agreement with the available data on proton rapidity spectra. However, while reproducing proton rapidity spectra, our hybrid model cannot describe the rapidity distributions of pions. The model should be improved by taking viscosity effects into consideration at the hydrodynamic stage of system evolution.
Renal Tumor Anatomic Complexity: Clinical Implications for Urologists.
Joshi, Shreyas S; Uzzo, Robert G
2017-05-01
Anatomic tumor complexity can be objectively measured and reported using nephrometry. Various scoring systems have been developed in an attempt to correlate tumor complexity with intraoperative and postoperative outcomes. Nephrometry may also predict tumor biology in a noninvasive, reproducible manner. Other scoring systems can help predict surgical complexity and the likelihood of complications, independent of tumor characteristics. The accumulated data in this new field provide provocative evidence that objectifying anatomic complexity can consolidate reporting mechanisms and improve metrics of comparisons. Further prospective validation is needed to understand the full descriptive and predictive ability of the various nephrometry scores. Copyright © 2017 Elsevier Inc. All rights reserved.
Determining a one-tailed upper limit for future sample relative reproducibility standard deviations.
McClure, Foster D; Lee, Jung K
2006-01-01
A formula was developed to determine a one-tailed 100p% upper limit for future sample percent relative reproducibility standard deviations (RSD(R)% = 100 s(R)/y-bar), where s(R) is the sample reproducibility standard deviation, i.e., the square root of the sum of the sample repeatability variance s(r)^2 and the sample laboratory-to-laboratory variance s(L)^2, so that s(R) = sqrt(s(r)^2 + s(L)^2), and y-bar is the sample mean. The future RSD(R)% is expected to arise from a population of potential RSD(R)% values whose true mean is zeta(R)% = 100 sigma(R)/mu, where sigma(R) and mu are the population reproducibility standard deviation and mean, respectively.
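In code, the sample quantities defined above combine as follows; this is a minimal sketch of the stated definitions only, not of the one-tailed upper-limit formula itself, which the abstract does not give:

```python
import math

def rsd_r_percent(s_r, s_L, y_mean):
    # Sample reproducibility SD combines the within-laboratory
    # (repeatability) and laboratory-to-laboratory components:
    #   s_R = sqrt(s_r^2 + s_L^2)
    # and the percent relative reproducibility SD is 100 * s_R / y_mean.
    s_R = math.sqrt(s_r ** 2 + s_L ** 2)
    return 100.0 * s_R / y_mean

# e.g. s_r = 2.0, s_L = 1.5, mean = 50.0 -> s_R = 2.5, RSD_R% = 5.0
print(rsd_r_percent(2.0, 1.5, 50.0))  # 5.0
```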
NASA Technical Reports Server (NTRS)
Palmieri, Frank L.; Belcher, Marcus A.; Wohl, Christopher J.; Blohowiak, Kay Y.; Connell, John W.
2013-01-01
Surface preparation is widely recognized as a key step to producing robust and predictable bonds in a precise and reproducible manner. Standard surface preparation techniques, including grit blasting, manual abrasion, and peel ply, can lack precision and reproducibility, which can lead to variation in surface properties and subsequent bonding performance. The use of a laser to ablate composite surface resin can provide an efficient, precise, and reproducible means of preparing composite surfaces for adhesive bonding. Advantages include elimination of physical waste (i.e., grit media and sacrificial peel ply layers that ultimately require disposal), reduction in process variability due to increased precision (e.g., increased reproducibility), and automation of surface preparation, all of which improve reliability and process control. This paper describes an Nd:YAG laser surface preparation technique for composite substrates and the mechanical performance and failure modes of bonded laminates thus prepared. Additionally, bonded specimens were aged in a hot, wet environment for approximately one year and subsequently mechanically tested; the results of this one-year hygrothermal aging study are also presented.
Nanomaterial-Based Electrochemical Immunosensors for Clinically Significant Biomarkers
Ronkainen, Niina J.; Okon, Stanley L.
2014-01-01
Nanotechnology has played a crucial role in the development of biosensors over the past decade. The development, testing, optimization, and validation of new biosensors has become a highly interdisciplinary effort involving experts in chemistry, biology, physics, engineering, and medicine. The sensitivity, the specificity and the reproducibility of biosensors have improved tremendously as a result of incorporating nanomaterials in their design. In general, nanomaterial-based electrochemical immunosensors amplify the sensitivity by facilitating greater loading of the larger sensing surface with biorecognition molecules as well as improving the electrochemical properties of the transducer. The most common types of nanomaterials and their properties will be described. In addition, the utilization of nanomaterials in immunosensors for biomarker detection will be discussed since these biosensors have enormous potential for a myriad of clinical uses. Electrochemical immunosensors provide a specific and simple analytical alternative as evidenced by their brief analysis times, inexpensive instrumentation, lower assay cost as well as good portability and amenability to miniaturization. The role nanomaterials play in biosensors, their ability to improve detection capabilities in low-concentration analytes yielding clinically useful data and their impact on other biosensor performance properties will be discussed. Finally, the most common types of electroanalytical detection methods will be briefly touched upon. PMID:28788700
Mechanical-Kinetic Modeling of a Molecular Walker from a Modular Design Principle
NASA Astrophysics Data System (ADS)
Hou, Ruizheng; Loh, Iong Ying; Li, Hongrong; Wang, Zhisong
2017-02-01
Artificial molecular walkers beyond burnt-bridge designs are complex nanomachines that potentially replicate biological walkers in mechanisms and functionalities. Bringing man-made walkers up to the performance required for widespread applications remains difficult, largely because their biomimetic design principles involve entangled kinetic and mechanical effects that complicate the link between a walker's construction and ultimate performance. Here, a synergic mechanical-kinetic model is developed for a recently reported DNA bipedal walker, which is based on a modular design principle, potentially enabling many directional walkers driven by a length-switching engine. The model reproduces the experimental data of the walker, and identifies its performance-limiting factors. The model also captures features common to the underlying design principle, including counterintuitive performance-construction relations that are explained by detailed balance, entropy production, and bias cancellation. While indicating a low directional fidelity for the present walker, the model suggests the possibility of improving the fidelity above 90% by a more powerful engine, which may be an improved version of the present engine or an entirely new engine motif, thanks to the flexible design principle. The model is readily adaptable to aid these experimental developments towards high-performance molecular walkers.
Dowling, Geraldine; Malone, Edward; Harbison, Tom; Martin, Sheila
2010-07-01
A sensitive and selective method for the determination of six non-steroidal anti-inflammatory drugs (NSAIDs) in bovine plasma was developed, together with an improved method for the determination of authorized and non-authorized residues of 10 NSAIDs in milk. Analytes were separated and acquired by high performance liquid chromatography coupled with an electrospray ionisation tandem mass spectrometer (ESI-MS/MS). Target compounds were acidified in plasma, and plasma and milk samples were extracted with acetonitrile; both extracts were purified using an improved solid-phase extraction procedure with Evolute ABN cartridges. The accuracy of the methods for milk and plasma was between 73 and 109%. The precision of the method for authorized and non-authorized NSAIDs in milk and plasma, expressed as % RSD for the within-laboratory reproducibility, was less than 16%. The % RSD for authorized NSAIDs at their associated MRL(s) in milk was less than 10% for meloxicam, flunixin and tolfenamic acid and less than 25% for hydroxy flunixin. The methods were validated according to Commission Decision 2002/657/EC.
Engelberg, Jesse A; Retallack, Hanna; Balassanian, Ronald; Dowsett, Mitchell; Zabaglo, Lila; Ram, Arishneel A; Apple, Sophia K; Bishop, John W; Borowsky, Alexander D; Carpenter, Philip M; Chen, Yunn-Yi; Datnow, Brian; Elson, Sarah; Hasteh, Farnaz; Lin, Fritz; Moatamed, Neda A; Zhang, Yanhong; Cardiff, Robert D
2015-11-01
Hormone receptor status is an integral component of decision-making in breast cancer management. IHC4 score is an algorithm that combines hormone receptor, HER2, and Ki-67 status to provide a semiquantitative prognostic score for breast cancer. High accuracy and low interobserver variance are important to ensure the score is accurately calculated; however, few previous efforts have been made to measure or decrease interobserver variance. We developed a Web-based training tool, called "Score the Core" (STC) using tissue microarrays to train pathologists to visually score estrogen receptor (using the 300-point H score), progesterone receptor (percent positive), and Ki-67 (percent positive). STC used a reference score calculated from a reproducible manual counting method. Pathologists in the Athena Breast Health Network and pathology residents at associated institutions completed the exercise. By using STC, pathologists improved their estrogen receptor H score and progesterone receptor and Ki-67 proportion assessment and demonstrated a good correlation between pathologist and reference scores. In addition, we collected information about pathologist performance that allowed us to compare individual pathologists and measures of agreement. Pathologists' assessment of the proportion of positive cells was closer to the reference than their assessment of the relative intensity of positive cells. Careful training and assessment should be used to ensure the accuracy of breast biomarkers. This is particularly important as breast cancer diagnostics become increasingly quantitative and reproducible. Our training tool is a novel approach for pathologist training that can serve as an important component of ongoing quality assessment and can improve the accuracy of breast cancer prognostic biomarkers. Copyright © 2015 Elsevier Inc. All rights reserved.
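The 300-point H score referenced in the study above is conventionally computed by weighting the percentage of cells at each staining intensity level; a minimal sketch of that convention (the function name is illustrative):

```python
def h_score(pct_weak, pct_moderate, pct_strong):
    # 300-point H score: percentage of cells staining weakly (weight 1),
    # moderately (weight 2), and strongly (weight 3); remaining cells
    # are unstained and contribute 0. Range: 0 (no staining) to 300
    # (100% of cells staining strongly).
    assert 0 <= pct_weak + pct_moderate + pct_strong <= 100
    return 1 * pct_weak + 2 * pct_moderate + 3 * pct_strong

# 20% weak + 30% moderate + 10% strong -> 20 + 60 + 30 = 110
print(h_score(20, 30, 10))  # 110
```

Because each term depends on a visually estimated percentage, small per-pathologist differences in those estimates propagate directly into the score, which is why the training tool targets proportion and intensity assessment.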
Nanomedical innovation: the SEON-concept for an improved cancer therapy with magnetic nanoparticles.
Lyer, Stefan; Tietze, Rainer; Unterweger, Harald; Zaloga, Jan; Singh, Raminder; Matuszak, Jasmin; Poettler, Marina; Friedrich, Ralf P; Duerr, Stephan; Cicha, Iwona; Janko, Christina; Alexiou, Christoph
2015-01-01
Nanomedicine offers tremendous opportunities for the development of novel therapeutic and diagnostic tools. During the last decades, extensive knowledge was gained about the stabilization and coating of nanoparticles, their functionalization for drug binding and drug release, and possible strategies for therapies and diagnostics of different diseases. Most recently, more and more emphasis has been placed on nanotoxicology and nanosafety aspects. The Section of Experimental Oncology and Nanomedicine (SEON) developed a concept for translating this knowledge into clinical application of magnetic drug targeting for the treatment of cancer and other diseases using superparamagnetic iron oxide nanoparticles. This approach includes reproducible synthesis, detailed characterization, nanotoxicological testing, evaluation in ex vivo models, preclinical animal studies and production of superparamagnetic iron oxide nanoparticles according to good manufacturing practice regulations.
Social forces for team coordination in ball possession game
NASA Astrophysics Data System (ADS)
Yokoyama, Keiko; Shima, Hiroyuki; Fujii, Keisuke; Tabuchi, Noriyuki; Yamamoto, Yuji
2018-02-01
Team coordination is a basic human behavioral trait observed in many real-life communities. To promote teamwork, it is important to cultivate social skills that elicit team coordination. In the present work, we consider which social skills are indispensable for individuals performing a ball possession game in soccer. We develop a simple social force model that describes the synchronized motion of offensive players. Comparing the simulation results with experimental observations, we found that the cooperative social force, a measure of perception skill, plays the most important role in reproducing the harmonized collective motion of experienced players in the task. We further developed an experimental tool that facilitates real players' perception of interpersonal distance, revealing that the tool improves novice players' motions as if the cooperative social force were imposed.
Calibration of X-Ray diffractometer by the experimental comparison method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dudka, A. P., E-mail: dudka@ns.crys.ras.ru
2015-07-15
A software package for calibrating an X-ray diffractometer with an area detector has been developed. It is proposed to search for detector and goniometer calibration models whose parameters are reproduced in a series of measurements on a reference crystal. Reference (standard) crystals are prepared during the investigation; they should provide the agreement of structural models in repeated analyses. The technique developed has been used to calibrate Xcalibur Sapphire and Eos, Gemini Ruby (Agilent) and Apex x8 and Apex Duo (Bruker) diffractometers. The main conclusions are as follows: the calibration maps are stable for several years and can be used to improve structural results, verified CCD detectors exhibit significant inhomogeneity of the efficiency (response) function, and a Bruker goniometer introduces smaller distortions than an Agilent goniometer.
Yuan, Guangdi; Wan, Yanran; Li, Xiaoyu; He, Bingqing; Zhang, Youjun; Xu, Baoyun; Wang, Shaoli; Xie, Wen; Zhou, Xuguo; Wu, Qingjun
2017-01-01
Although near-isogenic lines (NILs) can standardize genetic backgrounds among individuals, it has never been applied in parthenogenetically reproduced animals. Here, through multiple rounds of backcrossing and spinosad screening, we generated spinosad resistant NILs in the western flower thrips, Frankliniella occidentalis (Pergande) (Thysanoptera: Thripidae), with a haplo-diploid reproduction system. The resultant F. occidentalis NIL-R strain maintained a resistance ratio over 30,000-fold, which was comparable to its parental resistant strain, Spin-R. More importantly, F. occidentalis NIL-R shared 98.90% genetic similarity with its susceptible parental strain Ivf03. By developing this toolset, we are able to segregate individual resistance and facilitate the mechanistic study of insecticide resistances in phloem-feeding arthropods, a group of devastating pest species reproducing sexually as well as asexually. PMID:28348528
Controlled electrostatic methodology for imaging indentations in documents.
Yaraskavitch, Luke; Graydon, Matthew; Tanaka, Tobin; Ng, Lay-Keow
2008-05-20
The electrostatic process for imaging indentations on documents using the ESDA device is investigated under controlled experimental settings. An in-house modified commercial xerographic developer housing is used to control the uniformity and volume of toner deposition, allowing for reproducible image development. Along with this novel development tool, an electrostatic voltmeter and fixed environmental conditions facilitate an optimization process. Sample documents are preconditioned in a humidity cabinet with microprocessor control, and the significant benefit of humidification above 70% RH on image quality is verified. Improving on the subjective methods of previous studies, image quality analysis is carried out in an objective and reproducible manner using the PIAS-II. For the seven commercial paper types tested, the optimum ESDA operating point is found to be at an electric potential near -400V at the Mylar surface; however, for most paper types, the optimum operating regime is found to be quite broad, spanning relatively small electric potentials between -200 and -550V. At -400V, the film right above an indented area generally carries a voltage which is 30-50V less negative than the non-indented background. In contrast with Seward's findings [G.H. Seward, Model for electrostatic imaging of forensic evidence via discharge through Mylar-paper path, J. Appl. Phys. 83 (3) (1998) 1450-1456; G.H. Seward, Practical implications of the charge transport model for electrostatic detection apparatus (ESDA), J. Forensic Sci. 44 (4) (1999) 832-836], a period of charge decay before image development is not required when operating in this optimal regime. A brief investigation of the role played by paper-to-paper friction during the indentation process is conducted using our optimized development method.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Singleton, A.H.
1995-06-28
The goal of this project is the development of a commercially-viable, cobalt-based Fischer-Tropsch (F-T) catalyst for use in a slurry bubble column reactor. The major objectives of this work are (1) to develop a cobalt-based F-T catalyst with low (< 5%) methane selectivity, (2) to develop a cobalt-based F-T catalyst with water-gas shift activity, and (3) to combine both these improvements into one catalyst. The project consists of five major tasks: catalyst development; catalyst testing; catalyst reproducibility tests; catalyst aging tests; and preliminary design and cost estimate for a demonstration-scale catalyst production facility. Technical accomplishments during this reporting period include the following. It appears that the higher activity obtained for the catalysts prepared using an organic solution and reduced directly without prior calcination was the result of higher dispersions obtained under such pretreatment. A Ru-promoted Co catalyst on alumina with 30% Co loading exhibited a 4-fold increase in dispersion and a 2-fold increase in activity in the fixed-bed reactor relative to the non-promoted catalyst. Several reactor runs have again focused on pushing conversion to higher levels. The maximum conversion obtained has been 49.7% with 26 g catalyst. Further investigations of the effect of reaction temperature on the performance of Co catalysts during F-T synthesis were started using a low activity catalyst and one of the most active catalysts. The three 1 kg catalyst batches prepared by Calsicat for the reproducibility and aging studies were tested in both the fixed-bed and slurry bubble column reactors under the standard reaction conditions. The effects of adding various promoters to some cobalt catalysts have also been addressed. Results are presented and discussed.
Warren, Jamie M; Pawliszyn, Janusz
2011-12-16
For air/headspace analysis, needle trap devices (NTDs) are applicable for sampling a wide range of volatiles such as benzene and alkanes, and semi-volatile particulate-bound compounds such as pyrene. This paper describes a new NTD that is simpler to produce and improves performance relative to previous NTD designs. An NTD utilizing a side-hole needle used a modified tip, which removed the need to use epoxy glue to hold sorbent particles inside the NTD. This design also improved the seal between the NTD and narrow neck liner of the GC injector, thereby improving the desorption efficiency. A new packing method has been developed and evaluated using solvent to pack the device, and is compared to NTDs prepared using the previous vacuum aspiration method. The slurry packing method reduced preparation time and improved reproducibility between NTDs. To evaluate the NTDs, automated headspace extraction was completed using benzene, toluene, ethylbenzene, p-xylene (BTEX), anthracene, and pyrene (PAH). NTD geometries evaluated include: blunt tip with side-hole needle, tapered tip with side-hole needle, slider tip with side-hole, dome tapered tip with side-hole and blunt with no side-hole needle (expanded desorptive flow). Results demonstrate that the tapered and slider tip NTDs performed with improved desorption efficiency. Copyright © 2011 Elsevier B.V. All rights reserved.
George, Barbara Jane; Sobus, Jon R; Phelps, Lara P; Rashleigh, Brenda; Simmons, Jane Ellen; Hines, Ronald N
2015-05-01
Considerable concern has been raised regarding research reproducibility both within and outside the scientific community. Several factors possibly contribute to a lack of reproducibility, including a failure to adequately employ statistical considerations during study design, bias in sample selection or subject recruitment, errors in developing data inclusion/exclusion criteria, and flawed statistical analysis. To address some of these issues, several publishers have developed checklists that authors must complete. Others have either enhanced statistical expertise on existing editorial boards, or formed distinct statistics editorial boards. Although the U.S. Environmental Protection Agency, Office of Research and Development, already has a strong Quality Assurance Program, an initiative was undertaken to further strengthen statistics consideration and other factors in study design and also to ensure these same factors are evaluated during the review and approval of study protocols. To raise awareness of the importance of statistical issues and provide a forum for robust discussion, a Community of Practice for Statistics was formed in January 2014. In addition, three working groups were established to develop a series of questions or criteria that should be considered when designing or reviewing experimental, observational, or modeling focused research. This article describes the process used to develop these study design guidance documents, their contents, how they are being employed by the Agency's research enterprise, and expected benefits to Agency science. The process and guidance documents presented here may be of utility for any research enterprise interested in enhancing the reproducibility of its science. © The Author 2015. Published by Oxford University Press on behalf of the Society of Toxicology.
Faster, More Reproducible DESI-MS for Biological Tissue Imaging
NASA Astrophysics Data System (ADS)
Tillner, Jocelyn; Wu, Vincen; Jones, Emrys A.; Pringle, Steven D.; Karancsi, Tamas; Dannhorn, Andreas; Veselkov, Kirill; McKenzie, James S.; Takats, Zoltan
2017-10-01
A new, more robust sprayer for desorption electrospray ionization (DESI) mass spectrometry imaging is presented. The main source of variability in DESI is thought to be the uncontrolled variability of various geometric parameters of the sprayer, primarily the position of the solvent capillary, or more specifically, its positioning within the gas capillary or nozzle. If the solvent capillary is off-center, the sprayer becomes asymmetrical, making the geometry difficult to control and compromising reproducibility. If the stiffness, tip quality, and positioning of the capillary are improved, sprayer reproducibility can be improved by an order of magnitude. The quality of the improved sprayer and its potential for high spatial resolution imaging are demonstrated on human colorectal tissue samples by acquisition of images at pixel sizes of 100, 50, and 20 μm, which corresponds to a lateral resolution of 40-60 μm, similar to the best values published in the literature. The high sensitivity of the sprayer also allows combination with a fast scanning quadrupole time-of-flight mass spectrometer. This provides up to 30 times faster DESI acquisition, reducing the overall acquisition time for a 10 mm × 10 mm rat brain sample to approximately 1 h. Although some spectral information is lost with increasing analysis speed, the resulting data can still be used to classify tissue types on the basis of a previously constructed model. This is particularly interesting for clinical applications, where fast, reliable diagnosis is required.
Labots, Mariette; van der Mijn, Johannes C; Beekhof, Robin; Piersma, Sander R; de Goeij-de Haas, Richard R; Pham, Thang V; Knol, Jaco C; Dekker, Henk; van Grieken, Nicole C T; Verheul, Henk M W; Jiménez, Connie R
2017-06-06
Mass spectrometry-based phosphoproteomics of cancer cell and tissue lysates provides insight into aberrantly activated signaling pathways and potential drug targets. For improved understanding of an individual patient's tumor biology and to allow selection of tyrosine kinase inhibitors in individual patients, phosphoproteomics of small clinical samples should be feasible and reproducible. We aimed to scale down a pTyr-phosphopeptide enrichment protocol to biopsy-level protein input and assess reproducibility and applicability to tumor needle biopsies. To this end, phosphopeptide immunoprecipitation using anti-phosphotyrosine beads was performed using 10, 5, and 1 mg of protein input from lysates of the colorectal cancer (CRC) cell line HCT116. Multiple needle biopsies from 7 human CRC resection specimens were analyzed at the 1 mg level. The total number of phosphopeptides captured and detected by LC-MS/MS ranged from 681 at 10 mg input to 471 at 1 mg of HCT116 protein. ID-reproducibility ranged from 60.5% at 10 mg to 43.9% at 1 mg. Per 1 mg-level biopsy sample, >200 phosphopeptides were identified with 57% ID-reproducibility between paired tumor biopsies. Unsupervised analysis clustered biopsies from individual patients together and revealed known and potential therapeutic targets. This study demonstrates the feasibility of label-free pTyr-phosphoproteomics at the tumor biopsy level based on reproducible analyses using 1 mg of protein input. The considerable number of identified phosphopeptides at this level is attributed to an effective down-scaled immuno-affinity protocol as well as to the application of ID propagation in the data processing and analysis steps.
Such studies will improve our understanding of individual tumor biology and may enable future pTyr-phosphoproteomics-based personalized medicine. Copyright © 2017. Published by Elsevier B.V.
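The ID-reproducibility percentages above express the overlap between the phosphopeptide identification lists of two paired runs. A minimal sketch of one common way to compute such a figure (the abstract does not state the exact formula used, and the peptide IDs below are hypothetical):

```python
def id_reproducibility(ids_a, ids_b):
    """Fraction of phosphopeptide IDs shared between two paired runs,
    relative to the mean number identified per run (one common definition;
    the paper's exact formula is not given in the abstract)."""
    a, b = set(ids_a), set(ids_b)
    shared = len(a & b)
    mean_ids = (len(a) + len(b)) / 2
    return shared / mean_ids

# Hypothetical paired-biopsy identification lists
run1 = {f"pep{i}" for i in range(220)}
run2 = {f"pep{i}" for i in range(100, 320)}  # 120 IDs shared with run1
print(round(id_reproducibility(run1, run2), 2))  # → 0.55
```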
Worth, Leon J; Brett, Judy; Bull, Ann L; McBryde, Emma S; Russo, Philip L; Richards, Michael J
2009-10-01
Effective and comparable surveillance for central venous catheter-related bloodstream infections (CLABSIs) in the intensive care unit requires a reproducible case definition that can be readily applied by infection control professionals. Using a questionnaire containing clinical cases, reproducibility of the National Nosocomial Infection Surveillance System (NNIS) surveillance definition for CLABSI was assessed in an Australian cohort of infection control professionals participating in the Victorian Hospital Acquired Infection Surveillance System (VICNISS). The same questionnaire was then used to evaluate the reproducibility of the National Healthcare Safety Network (NHSN) surveillance definition for CLABSI. Target hospitals were defined as large metropolitan (1A) or other large hospitals (non-1A), according to the Victorian Department of Human Services. Questionnaire responses of Centers for Disease Control and Prevention NHSN surveillance experts were used as gold standard comparator. Eighteen of 21 eligible VICNISS centers participated in the survey. Overall concordance with the gold standard was 57.1%, and agreement was highest for 1A hospitals (60.6%). The proportion of congruently classified cases varied according to NNIS criteria: criterion 1 (recognized pathogen), 52.8%; criterion 2a (skin contaminant in 2 or more blood cultures), 83.3%; criterion 2b (skin contaminant in 1 blood culture and appropriate antimicrobial therapy instituted), 58.3%; non-CLABSI cases, 51.4%. When survey questions regarding identification of cases of CLABSI criterion 2b were removed (consistent with the current NHSN definition), overall percentage concordance increased to 62.5% (72.2% for 1A centers). Further educational interventions are required to improve the discrimination of primary and secondary causes of bloodstream infection in Victorian intensive care units. 
Although reproducibility of the CLABSI case definition is relatively poor, adoption of the revised NHSN definition for CLABSI is likely to improve the concordance of Victorian data with international centers.
NASA Astrophysics Data System (ADS)
Kanki, R.; Uchiyama, Y.; Miyazaki, D.; Takano, A.; Miyazawa, Y.; Yamazaki, H.
2014-12-01
Mesoscale oceanic structure and variability are required to be reproduced as accurately as possible in realistic regional ocean modeling. Uchiyama et al. (2012) demonstrated with a submesoscale eddy-resolving JCOPE2-ROMS downscaling oceanic modeling system that the mesoscale reproducibility of the Kuroshio meandering along Japan is significantly improved by introducing a simple restoration to data which we call "T-S nudging" (a.k.a. robust diagnosis), where the prognostic temperature and salinity fields are weakly nudged four-dimensionally towards the assimilative JCOPE2 reanalysis (Miyazawa et al., 2009). However, there is not always a reliable reanalysis for oceanic downscaling in an arbitrary region and at an arbitrary time, and therefore an alternative dataset should be prepared. Takano et al. (2009) proposed an empirical method to estimate the mesoscale 3-D thermal structure from the near real-time AVISO altimetry data along with the ARGO float data, based on the two-layer model of Goni et al. (1996). In the present study, we consider the TS data derived from this method as a candidate. We thus conduct a synoptic forward modeling of the Kuroshio using the JCOPE2-ROMS downscaling system to explore the potential utility of this empirical TS dataset (hereinafter TUM-TS) by carrying out two runs with the T-S nudging towards 1) the JCOPE2-TS and 2) the TUM-TS fields. An example of the comparison between the two ROMS test runs is shown in the attached figure, which shows the annually averaged surface EKE. Both TUM-TS and JCOPE2-TS are found to help reproduce the mesoscale variance of the Kuroshio and its extension, as well as its mean paths, surface KE and EKE, reasonably well. Therefore, the AVISO-ARGO derived empirical 3-D TS estimation is potentially exploitable as the dataset for the T-S nudging to reproduce mesoscale oceanic structure.
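The T-S nudging described above adds a weak linear relaxation term, dX/dt = -(X - X_ref)/τ, to the prognostic temperature and salinity equations. A minimal one-step sketch of that term; the 30-day relaxation timescale and the values below are illustrative assumptions, not the configuration used in the study:

```python
import numpy as np

def nudge(field, reference, dt, tau):
    """One forward-Euler step of a nudging (robust-diagnosis) term:
    dX/dt = -(X - X_ref) / tau.  A long tau gives the 'weak' relaxation
    described for the JCOPE2-ROMS T-S nudging."""
    return field - dt * (field - reference) / tau

T = np.array([20.0, 15.0, 10.0])      # model temperature (deg C), illustrative
T_ref = np.array([19.0, 15.5, 11.0])  # reference (reanalysis) temperature
dt = 3600.0                           # 1 h time step (s)
tau = 30 * 86400.0                    # assumed 30-day relaxation timescale (s)
T_new = nudge(T, T_ref, dt, tau)
print(np.round(T_new, 5))             # each value moves slightly toward T_ref
```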
Guadalupe, Zenaida; Soldevilla, Alberto; Sáenz-Navajas, María-Pilar; Ayestarán, Belén
2006-04-21
A multiple-step analytical method was developed to improve the analysis of polymeric phenolics in red wines. With a common initial step based on the fractionation of wine phenolics by gel permeation chromatography (GPC), different analytical techniques were used: high-performance liquid chromatography-diode array detection (HPLC-DAD), HPLC-mass spectrometry (MS), capillary zone electrophoresis (CZE) and spectrophotometry. This method proved to be valid for analyzing different families of phenolic compounds, such as monomeric phenolics and their derivatives, polymeric pigments and proanthocyanidins. The analytical characteristics of fractionation by GPC were studied and the method was fully validated, yielding satisfactory statistical results. GPC fractionation substantially improved the analysis of polymeric pigments by CZE, in terms of response, repeatability and reproducibility. It also represented an improvement in the traditional vanillin assay used for proanthocyanidin (PA) quantification. Astringent proanthocyanidins were also analyzed using a simple combined method that allowed these compounds, for which only general indexes were available, to be quantified.
Sağlam, Arzu; Usubütün, Alp; Dolgun, Anıl; Mutter, George L; Salman, M Coşkun; Kurtulan, Olcay; Akyol, Aytekin; Özkan, Eylem Akar; Baykara, Sema; Bülbül, Dilek; Calay, Zerrin; Eren, Funda; Gümürdülü, Derya; Haberal, Nihan; Ilvan, Şennur; Karaveli, Şeyda; Koyuncuoğlu, Meral; Müezzinoğlu, Bahar; Müftüoğlu, Kamil Hakan; Özen, Özlem; Özdemir, Necmettin; Peştereli, Elif; Ulukuş, Çağnur; Zekioğlu, Osman
2017-01-01
Inter-observer differences in the diagnosis of HPV-related cervical lesions are problematic, and the response of gynecologists to these diagnostic entities is non-standardized. This study evaluated the diagnostic reproducibility of "cervical intraepithelial neoplasia" (CIN) and "squamous intraepithelial lesion" (SIL) diagnoses. Nineteen pathologists evaluated 66 cases once using H&E slides and once with immunohistochemical studies (p16, Ki-67 and Pro-ExC). Management response to diagnoses was evaluated amongst 12 gynecologists. Pathologists and gynecologists were also given a questionnaire about how additional information, such as smear results and age, modifies diagnosis and management. We show moderate interobserver diagnostic reproducibility amongst pathologists. The overall kappa value was 0.50 using the CIN classification and 0.59 using the SIL classification. The impact of immunohistochemical evaluation on the interpretation of cases differed, and the addition of immunohistochemistry produced no statistically significant improvement in interobserver diagnostic reproducibility. We saw that the choice of treatment methods amongst gynecologists varied and overall concordance was only fair to moderate. The CIN2 diagnostic category was seen to have the lowest percentage agreement amongst both pathologists and gynecologists. We showed that pathologists had diagnostic "styles" and gynecologists had management "styles". In summary, each pathologist had different diagnostic tendencies, which were affected not only by histopathology and marker studies, but also by the patient management tendencies of the gynecologist that the pathologist worked with. The two-tiered modified Bethesda system improved diagnostic agreement. We concluded that immunohistochemistry should be used only to resolve problems in select cases and not for every case.
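The kappa values reported above (0.50 for CIN, 0.59 for SIL) are chance-corrected agreement statistics. A minimal two-rater sketch of the idea; the study pooled 19 pathologists, so a multi-rater statistic such as Fleiss' kappa would be used there, and the grades below are hypothetical:

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: observed agreement corrected for the agreement
    expected by chance from each rater's marginal category frequencies."""
    n = len(ratings_a)
    p_obs = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    ca, cb = Counter(ratings_a), Counter(ratings_b)
    p_exp = sum((ca[c] / n) * (cb[c] / n) for c in set(ca) | set(cb))
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical CIN grades assigned by two pathologists to six cases
a = ["CIN1", "CIN1", "CIN2", "CIN3", "CIN2", "CIN1"]
b = ["CIN1", "CIN2", "CIN2", "CIN3", "CIN3", "CIN1"]
print(round(cohens_kappa(a, b), 3))  # → 0.5 (moderate agreement)
```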
Richardson, Peter M.; Jackson, Scott; Parrott, Andrew J.; Nordon, Alison; Duckett, Simon B.
2018-01-01
Signal amplification by reversible exchange (SABRE) is a hyperpolarisation technique that catalytically transfers nuclear polarisation from parahydrogen, the singlet nuclear isomer of H2, to a substrate in solution. The SABRE exchange reaction is carried out in a polarisation transfer field (PTF) of tens of gauss before transfer to a stronger magnetic field for nuclear magnetic resonance (NMR) detection. In the simplest implementation, polarisation transfer is achieved by shaking the sample in the stray field of a superconducting NMR magnet. Although convenient, this method suffers from limited reproducibility and cannot be used with NMR spectrometers that do not have appreciable stray fields, such as benchtop instruments. Here, we use a simple hand‐held permanent magnet array to provide the necessary PTF during sample shaking. We find that the use of this array provides a 25% increase in SABRE enhancement over the stray field approach, while also providing improved reproducibility. Arrays with a range of PTFs were tested, and the PTF‐dependent SABRE enhancements were found to be in excellent agreement with comparable experiments carried out using an automated flow system where an electromagnet is used to generate the PTF. We anticipate that this approach will improve the efficiency and reproducibility of SABRE experiments carried out using manual shaking and will be particularly useful for benchtop NMR, where a suitable stray field is not readily accessible. The ability to construct arrays with a range of PTFs will also enable the rapid optimisation of SABRE enhancement as function of PTF for new substrate and catalyst systems. PMID:29193324
Du, Xue-Fei; Xiao, Meng; Liang, Hong-Yan; Sun, Zhe; Jiang, Yue-Hong; Chen, Guo-Yu; Meng, Xiao-Yu; Zou, Gui-Ling; Zhang, Li; Liu, Ya-Li; Zhang, Hui; Sun, Hong-Li; Jiang, Xiao-Feng; Xu, Ying-Chun
2014-01-01
Methicillin-resistant Staphylococcus aureus (MRSA) has become an important nosocomial pathogen, causing considerable morbidity and mortality. During the last 20 years, a variety of genotyping methods have been introduced for screening the prevalence of MRSA. In this study, we developed and evaluated an improved approach, capillary gel electrophoresis-based multilocus variable-number tandem-repeat fingerprinting (CGE/MLVF), for rapid MRSA typing. A total of 42 well-characterized strains and 116 non-repetitive clinical MRSA isolates collected from six hospitals in northeast China between 2009 and 2010 were tested. The results obtained by CGE/MLVF against clinical isolates were compared with traditional MLVF, spa typing, multilocus sequence typing/staphylococcal cassette chromosome mec (MLST/SCCmec) typing and pulsed-field gel electrophoresis (PFGE). The discriminatory power estimated by Simpson's index of diversity was 0.855 (28 types), 0.855 (28 patterns), 0.623 (11 types), 0.517 (8 types) and 0.854 (28 patterns) for CGE/MLVF, traditional MLVF, spa typing, MLST/SCCmec and PFGE, respectively. All methods tested showed satisfactory concordance at the clonal complex level, as calculated by the adjusted Rand coefficient. CGE/MLVF showed better reproducibility and accuracy than the traditional MLVF and PFGE methods. In addition, CGE/MLVF has the potential to produce portable results. In conclusion, CGE/MLVF is a rapid and easy-to-use MRSA typing method with lower cost, good reproducibility and high discriminatory power for monitoring the outbreak and clonal spread of MRSA isolates. PMID:24406728
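The discriminatory-power figures above use Simpson's index of diversity, D = 1 − Σ nⱼ(nⱼ−1) / (N(N−1)), where nⱼ is the number of isolates assigned to type j. A minimal sketch with a hypothetical type distribution (the study's per-type counts are not given in the abstract):

```python
def simpsons_diversity(type_counts):
    """Simpson's index of diversity for a typing method:
    D = 1 - sum(n_j * (n_j - 1)) / (N * (N - 1)).
    Higher D means two randomly picked isolates are more likely to
    belong to different types, i.e. greater discriminatory power."""
    n = sum(type_counts)
    return 1 - sum(c * (c - 1) for c in type_counts) / (n * (n - 1))

# Hypothetical distribution of 20 isolates over 5 types
counts = [8, 5, 4, 2, 1]
print(round(simpsons_diversity(counts), 3))  # → 0.763
```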
NASA Astrophysics Data System (ADS)
Klein, R.; Adler, A.; Beanlands, R. S.; de Kemp, R. A.
2007-02-01
A rubidium-82 (82Rb) elution system is described for use with positron emission tomography. Due to the short half-life of 82Rb (76 s), the system physics must be modelled precisely to account for transport delay and the associated activity decay and dispersion. Saline flow is switched between a 82Sr/82Rb generator and a bypass line to achieve a constant-activity elution of 82Rb. Pulse width modulation (PWM) of a solenoid valve is compared to simple threshold control as a means to simulate a proportional valve. A predictive-corrective control (PCC) algorithm is developed which produces a constant-activity elution within the constraints of long feedback delay and short elution time. The system model parameters are adjusted through a self-tuning algorithm to minimize error versus the requested time-activity profile. The system is self-calibrating with 2.5% repeatability, independent of generator activity and elution flow rate. Accurate 30 s constant-activity elutions of 10-70% of the total generator activity are achieved using both control methods. The combined PWM-PCC method provides significant improvement in precision and accuracy of the requested elution profiles. The 82Rb elution system produces accurate and reproducible constant-activity elution profiles of 82Rb activity, independent of parent 82Sr activity in the generator. More reproducible elution profiles may improve the quality of clinical and research PET perfusion studies using 82Rb.
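With a 76 s half-life, a meaningful fraction of the 82Rb activity decays during the transport delay between generator and subject, which is why the control model must predict ahead. A minimal sketch of the decay-compensation arithmetic only; the line delay is an assumed value, and the real system also models dispersion:

```python
import math

HALF_LIFE = 76.0                   # 82Rb half-life (s)
LAMBDA = math.log(2) / HALF_LIFE   # decay constant (1/s)

def delivered_activity(a_generator, delay_s):
    """Activity reaching the subject after a transport delay,
    ignoring dispersion."""
    return a_generator * math.exp(-LAMBDA * delay_s)

def required_generator_activity(a_target, delay_s):
    """Activity that must leave the generator so that a_target arrives:
    the predictive half of a PCC-style controller (illustrative only)."""
    return a_target * math.exp(LAMBDA * delay_s)

# With an assumed 10 s line delay, close to 9% of the activity decays
# in transit: 100 MBq leaving the generator arrives as ~91.3 MBq.
print(round(delivered_activity(100.0, 10.0), 1))
```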
NASA Astrophysics Data System (ADS)
Li, Puxi; Zhou, Tianjun; Zou, Liwei
2016-04-01
The authors evaluated the performance of Meteorological Research Institute (MRI) AGCM3.2 models in simulating the climatology and interannual variability of the Spring Persistent Rains (SPR) over southeastern China. The possible impacts of different horizontal resolutions were also investigated based on experiments with three different horizontal resolutions (i.e., 120, 60, and 20 km). The model could reasonably reproduce the main rainfall center over southeastern China in boreal spring under the three different resolutions. In comparison with the 120 km simulation, the 60 km and 20 km simulations better capture the rainfall centers anchored by the Nanling-Wuyi Mountains, but overestimate rainfall intensity. Water vapor budget diagnosis showed that the 60 km and 20 km simulations tended to overestimate the water vapor convergence over southeastern China, leading to wet biases. Regarding the interannual variability of SPR, the model could reasonably reproduce the anomalous lower-tropospheric anticyclone in the western North Pacific (WNPAC) and the positive precipitation anomalies over southeastern China in El Niño decaying spring. Compared with the 120 km resolution, the large positive biases are substantially reduced in the mid- and high-resolution models, which evidently improves the simulation of horizontal moisture advection in El Niño decaying spring. We highlight the importance of developing high-resolution climate models, as they could potentially improve the simulation of the climatology and interannual variability of SPR.
NASA Astrophysics Data System (ADS)
Huang, He; Chen, Yiding; Liu, Libo; Le, Huijun; Wan, Weixing
2015-05-01
It is an urgent task to improve the ability of ionospheric empirical models to more precisely reproduce the plasma density variations in the topside ionosphere. Based on the Republic of China Satellite 1 (ROCSAT-1) observations, we developed a new empirical model of topside plasma density around 600 km under relatively quiet geomagnetic conditions. The model reproduces the ROCSAT-1 plasma density observations with a root-mean-square error of 0.125 in units of lg(Ni(cm-3)) and reasonably describes the temporal and spatial variations of plasma density at altitudes in the range from 550 to 660 km. The model results are also in good agreement with observations from Hinotori, the Coupled Ion-Neutral Dynamics Investigations/Communications/Navigation Outage Forecasting System satellites and the incoherent scatter radar at Arecibo. Further, we combined ROCSAT-1 and Hinotori data to improve the ROCSAT-1 model and built a new model (the R&H model) after the consistency between the two data sets had been confirmed with the original ROCSAT-1 model. In particular, we used the R&H model to study the solar activity dependence of topside plasma density at a fixed altitude and found that it differs slightly from the case in which the evolution of the orbit altitude is ignored. In addition, the R&H model shows the merging of the two crests of the equatorial ionization anomaly above the F2 peak, while the IRI_Nq topside option always produces two separate crests in this range of altitudes.
Enhancing Reuse of Data and Biological Material in Medical Research: From FAIR to FAIR-Health
Holub, Petr; Kohlmayer, Florian; Prasser, Fabian; Mayrhofer, Michaela Th.; Schlünder, Irene; Martin, Gillian M.; Casati, Sara; Koumakis, Lefteris; Wutte, Andrea; Kozera, Łukasz; Strapagiel, Dominik; Anton, Gabriele; Zanetti, Gianluigi; Sezerman, Osman Ugur; Mendy, Maimuna; Valík, Dalibor; Lavitrano, Marialuisa; Dagher, Georges; Zatloukal, Kurt; van Ommen, GertJan B.; Litton, Jan-Eric
2018-01-01
The known challenge of underutilization of data and biological material from biorepositories as potential resources for medical research has been the focus of discussion for over a decade. Recently developed guidelines for improved data availability and reusability—entitled FAIR Principles (Findability, Accessibility, Interoperability, and Reusability)—are likely to address only parts of the problem. In this article, we argue that biological material and data should be viewed as a unified resource. This approach would facilitate access to complete provenance information, which is a prerequisite for reproducibility and meaningful integration of the data. A unified view also allows for optimization of long-term storage strategies, as demonstrated in the case of biobanks. We propose an extension of the FAIR Principles to include the following additional components: (1) quality aspects related to research reproducibility and meaningful reuse of the data, (2) incentives to stimulate effective enrichment of data sets and biological material collections and its reuse on all levels, and (3) privacy-respecting approaches for working with the human material and data. These FAIR-Health principles should then be applied to both the biological material and data. We also propose the development of common guidelines for cloud architectures, due to the unprecedented growth of volume and breadth of medical data generation, as well as the associated need to process the data efficiently. PMID:29359962
Rizvi, Imran; Moon, Sangjun; Hasan, Tayyaba; Demirci, Utkan
2013-01-01
In vitro 3D cancer models that provide a more accurate representation of disease in vivo are urgently needed to improve our understanding of cancer pathology and to develop better cancer therapies. However, development of 3D models based on manual ejection of cells from micropipettes suffers from inherent limitations such as poor control over cell density, limited repeatability, low throughput, and, in the case of coculture models, lack of reproducible control over the spatial distance between cell types (e.g., cancer and stromal cells). In this study, we build on a recently introduced 3D model in which human ovarian cancer (OVCAR-5) cells overlaid on Matrigel™ spontaneously form multicellular acini. We introduce a high-throughput automated cell printing system to bioprint a 3D coculture model using cancer cells and normal fibroblasts micropatterned on Matrigel™. Two cell types were patterned within a spatially controlled microenvironment (e.g., cell density, cell-cell distance) in a high-throughput and reproducible manner; both cell types remained viable during printing and continued to proliferate following patterning. This approach enables the miniaturization of an established macro-scale 3D culture model and would allow systematic investigation into the multiple unknown regulatory feedback mechanisms between tumor and stromal cells and provide a tool for high-throughput drug screening. PMID:21298805
Taverna, Constanza Giselle; Mazza, Mariana; Bueno, Nadia Soledad; Alvarez, Christian; Amigot, Susana; Andreani, Mariana; Azula, Natalia; Barrios, Rubén; Fernández, Norma; Fox, Barbara; Guelfand, Liliana; Maldonado, Ivana; Murisengo, Omar Alejandro; Relloso, Silvia; Vivot, Matias; Davel, Graciela
2018-05-11
Matrix-assisted laser desorption/ionization-time of flight mass spectrometry (MALDI-TOF MS) has revolutionized the identification of microorganisms in clinical laboratories because it is rapid, relatively simple to use, accurate, and applicable to a wide number of microorganisms. Several studies have demonstrated the utility of this technique in the identification of yeasts; however, its performance is usually improved by the extension of the database. Here we developed an in-house database of 143 strains belonging to 42 yeast species in the MALDI Biotyper platform, and we validated the extended database with 388 regional strains and 15 reference strains belonging to 55 yeast species. We also performed an intra- and interlaboratory study to assess reproducibility and analyzed the use of the cutoff values of 1.700 and 2.000 for correct identification at the species level. The extension of the manufacturer's database with the in-house database was successful, given that no incorrect identifications were introduced. The best performance was observed using the extended database and a cutoff value of 1.700, with a sensitivity of 0.94 and a specificity of 0.96. The reproducibility study proved useful for detecting deviations and could serve as an external quality control. The extended database was able to differentiate closely related species and shows potential for distinguishing the molecular genotypes of Cryptococcus neoformans and Cryptococcus gattii.
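The cutoff evaluation above amounts to treating each Biotyper score above a threshold as an accepted species-level identification and tallying sensitivity and specificity against reference results. A minimal sketch with hypothetical scores and labels (not the study's data):

```python
def sens_spec(scores, correct_species, cutoff):
    """Treat a score >= cutoff as an accepted species-level ID.
    Sensitivity: fraction of correct IDs that are accepted.
    Specificity: fraction of incorrect IDs that are rejected."""
    tp = sum(s >= cutoff and ok for s, ok in zip(scores, correct_species))
    fn = sum(s < cutoff and ok for s, ok in zip(scores, correct_species))
    tn = sum(s < cutoff and not ok for s, ok in zip(scores, correct_species))
    fp = sum(s >= cutoff and not ok for s, ok in zip(scores, correct_species))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical scores; True = top hit matches the reference species
scores  = [2.31, 1.95, 1.72, 1.64, 2.10, 1.55, 1.81, 1.40]
correct = [True, True, True, True, True, False, True, False]
sens, spec = sens_spec(scores, correct, cutoff=1.700)
print(round(sens, 2), round(spec, 2))  # → 0.83 1.0
```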
Wagner, Stephen; Skripchenko, Andrey; Thompson-Montgomery, Dedeene
2002-09-01
Limited photoinactivation kinetics, the use of low-volume 30 percent Hct RBCs, and hemolysis have restricted the practical use of dimethylmethylene blue (DMMB) and light for RBC decontamination. A flow-cell system was developed to rapidly treat larger volumes of oxygenated 45 percent Hct RBCs with high-intensity red light. CPD-whole blood was WBC reduced, RBCs were diluted in additive solutions (either Adsol or Erythrosol), and suspensions were subsequently oxygenated by gas overlay. Intracellular or extracellular VSV and DMMB were sequentially added. VSV-infected RBC suspensions (45% Hct) were passed through 1-mm-thick flow cells and illuminated. Samples were titered for VSV, stored for up to 42 days, and assayed for Hb, supernatant potassium, ATP, and MCV. The use of oxygenated RBCs resulted in rapid and reproducible photoinactivation of > or = 6.6 log extracellular and approximately 4.0 log intracellular VSV, independent of additive solution. Phototreated Adsol RBCs exhibited more than 10 times greater hemolysis and 30 percent greater MCV during storage than identically treated Erythrosol RBCs. Phototreatment caused potassium leakage from RBCs in both additive solutions. ATP levels were better preserved in Erythrosol than in Adsol RBCs. A rapid, reproducible, and robust method for photoinactivating a model virus in RBC suspensions was developed. Despite improved hemolysis and ATP levels in Erythrosol-phototreated RBCs, storage properties were not maintained for 42 days.
Sykes, J R; Lindsay, R; Dean, C J; Brettle, D S; Magee, D R; Thwaites, D I
2008-10-07
For image-guided radiotherapy (IGRT) systems based on cone beam CT (CBCT) integrated into a linear accelerator, the reproducible alignment of the imager to the x-ray source is critical both to the registration of the x-ray volumetric image with the megavoltage (MV) beam isocentre and to image sharpness. An enhanced method of determining the CBCT to MV isocentre alignment using the QUASAR Penta-Guide phantom was developed which improved both precision and accuracy. This was benchmarked against our existing method, which used software and a ball-bearing (BB) phantom provided by Elekta. Additionally, a method of measuring an image sharpness metric (MTF(50)) from the edge response function of a spherical air cavity within the Penta-Guide phantom was developed and its sensitivity was tested by simulating misalignments of the kV imager. Reproducibility testing of the enhanced Penta-Guide method demonstrated a systematic error of <0.2 mm when compared to the BB method, with near-equivalent random error (s=0.15 mm). The mean MTF(50) for five measurements was 0.278+/-0.004 lp mm(-1) with no applied misalignment. Simulated misalignments exhibited a clear peak in the MTF(50), enabling misalignments greater than 0.4 mm to be detected. The Penta-Guide phantom can be used to precisely measure CBCT-MV coincidence and image sharpness on CBCT-IGRT systems.
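Deriving MTF(50) from an edge response function follows a standard chain: differentiate the edge spread function (ESF) to get the line spread function, Fourier-transform it, normalise to DC, and locate the frequency where the MTF crosses 0.5. A minimal sketch on a synthetic Gaussian-blurred edge; the blur width and 0.5 mm sampling are illustrative assumptions, not the phantom's real geometry:

```python
import numpy as np
from math import erf, sqrt

def mtf50(esf, pixel_mm):
    """MTF50 from an edge spread function: differentiate to the line
    spread function, Fourier-transform, normalise to DC, and find the
    spatial frequency where the MTF falls to 0.5 (linear interpolation)."""
    lsf = np.diff(esf)
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_mm)  # cycles/mm (= lp/mm)
    idx = int(np.argmax(mtf < 0.5))                # first bin below 0.5
    f0, f1 = freqs[idx - 1], freqs[idx]
    m0, m1 = mtf[idx - 1], mtf[idx]
    return f0 + (m0 - 0.5) * (f1 - f0) / (m0 - m1)

# Synthetic Gaussian-blurred edge (sigma = 2 samples), 0.5 mm sampling
x = np.arange(-32, 32)
esf = np.array([0.5 * (1 + erf(xi / (2 * sqrt(2)))) for xi in x])
print(round(mtf50(esf, pixel_mm=0.5), 3))  # ~0.19 lp/mm for this blur
```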
Rupiani, Sebastiano; Guidotti, Laura; Manerba, Marcella; Di Ianni, Lorenza; Giacomini, Elisa; Falchi, Federico; Di Stefano, Giuseppina; Roberti, Marinella; Recanatini, Maurizio
2016-11-22
Glycolysis is the main route for energy production in tumors. LDH-A is a key enzyme of this process, and its inhibition represents an attractive strategy to hamper cancer cell metabolism. Galloflavin is a reliable LDH-A inhibitor, as we previously identified; however, its poor physicochemical properties and limited chemical tractability render it unsuitable for further development. Therefore, a rational design was undertaken with the aim of reproducing the pharmacophore of galloflavin on simpler, potentially more soluble and synthetically accessible scaffolds. Following a process of structural simplification, the natural urolithin M6 (UM6), an ellagitannin metabolite produced by gut microbiota, was identified as a putative galloflavin mimetic. In the present study, the synthesis of UM6 is described for the first time. An efficient synthetic pathway has been developed, involving five steps from readily accessible starting materials. The key reaction steps, a Suzuki coupling and an intramolecular C-H oxygenation, have been optimized to improve synthetic feasibility and provide the best conditions in terms of reaction time and yield. Moreover, this route would be suitable for obtaining other analogs for SAR studies. Preliminary biological tests revealed that UM6 was able to reproduce the behavior of galloflavin, confirming that our approach was successful in providing a new and accessible structure in the search for new LDH-A inhibitors.
Pemp, Berthold; Kardon, Randy H; Kircher, Karl; Pernicka, Elisabeth; Schmidt-Erfurth, Ursula; Reitner, Andreas
2013-07-01
Automated detection of subtle changes in peripapillary retinal nerve fibre layer thickness (RNFLT) over time using optical coherence tomography (OCT) is limited by the inherent image quality before layer segmentation, stabilization of the scan on the peripapillary retina, and its precise placement on repeated scans. The present study evaluates the image quality and reproducibility of spectral-domain (SD) OCT, comparing different frame-averaging rates with automatic real-time tracking (ART). Peripapillary RNFLT was measured in 40 healthy eyes on six different days using SD-OCT with an eye-tracking system. Image brightness of OCT with unaveraged single-frame B-scans was compared to images acquired using ART with 16 and 100 averaged frames. Short-term and day-to-day reproducibility was evaluated by calculation of intraindividual coefficients of variation (CV) and intraclass correlation coefficients (ICC) for single measurements as well as for seven repeated measurements per study day. Image brightness, short-term reproducibility, and day-to-day reproducibility were significantly improved using ART with 100 frames compared to one and 16 frames. Short-term CV was reduced from 0.94 ± 0.31 % and 0.91 ± 0.54 % in scans of one and 16 frames to 0.56 ± 0.42 % in scans of 100 averaged frames (P ≤ 0.003 each). Day-to-day CV was reduced from 0.98 ± 0.86 % and 0.78 ± 0.56 % to 0.53 ± 0.43 % (P ≤ 0.022 each). The range of ICC was 0.94 to 0.99. Sample size calculations for detecting changes of RNFLT over time in the range of 2 to 5 μm were performed based on intraindividual variability. Image quality and reproducibility of mean peripapillary RNFLT measurements using SD-OCT are improved by averaging OCT images with eye-tracking compared to unaveraged single-frame images. Further improvement is achieved by increasing the number of frames per measurement, and by averaging values of repeated measurements per session.
These strategies may allow a more accurate evaluation of RNFLT reduction in clinical trials observing optic nerve degeneration.
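The intraindividual coefficient of variation used above to summarize short-term and day-to-day reproducibility has a simple definition. A minimal sketch follows; the RNFLT values are hypothetical, not taken from the study:

```python
from statistics import mean, stdev

def intraindividual_cv(measurements):
    """Coefficient of variation (%) of one eye's repeated RNFLT values:
    sample standard deviation divided by the mean, times 100."""
    return stdev(measurements) / mean(measurements) * 100.0

# Hypothetical: seven repeated RNFLT measurements (micrometres) in one session
rnflt = [98.2, 97.9, 98.6, 98.1, 97.7, 98.4, 98.0]
cv = intraindividual_cv(rnflt)
```

Averaging more frames reduces this spread, which is what drives the reported drop from roughly 0.9% to 0.5%.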
Reproducible research in palaeomagnetism
NASA Astrophysics Data System (ADS)
Lurcock, Pontus; Florindo, Fabio
2015-04-01
The reproducibility of research findings is attracting increasing attention across all scientific disciplines. In palaeomagnetism as elsewhere, computer-based analysis techniques are becoming more commonplace, complex, and diverse. Analyses can often be difficult to reproduce from scratch, both for the original researchers and for others seeking to build on the work. We present a palaeomagnetic plotting and analysis program designed to make reproducibility easier. Part of the problem is the divide between interactive and scripted (batch) analysis programs. An interactive desktop program with a graphical interface is a powerful tool for exploring data and iteratively refining analyses, but usually cannot operate without human interaction. This makes it impossible to re-run an analysis automatically, or to integrate it into a larger automated scientific workflow - for example, a script to generate figures and tables for a paper. In some cases the parameters of the analysis process itself are not saved explicitly, making it hard to repeat or improve the analysis even with human interaction. Conversely, non-interactive batch tools can be controlled by pre-written scripts and configuration files, allowing an analysis to be 'replayed' automatically from the raw data. However, this advantage comes at the expense of exploratory capability: iteratively improving an analysis entails a time-consuming cycle of editing scripts, running them, and viewing the output. Batch tools also tend to require more computer expertise from their users. PuffinPlot is a palaeomagnetic plotting and analysis program which aims to bridge this gap. First released in 2012, it offers both an interactive, user-friendly desktop interface and a batch scripting interface, both making use of the same core library of palaeomagnetic functions. 
We present new improvements to the program that help to integrate the interactive and batch approaches, allowing an analysis to be interactively explored and refined, then saved as a self-contained configuration which can be re-run without human interaction. PuffinPlot can thus be used as a component of a larger scientific workflow, integrated with workflow management tools such as Kepler, without compromising its capabilities as an exploratory tool. Since both PuffinPlot and the platform it runs on (Java) are Free/Open Source software, even the most fundamental components of an analysis can be verified and reproduced.
Piette, Elizabeth R; Moore, Jason H
2018-01-01
Machine learning methods and conventions are increasingly employed for the analysis of large, complex biomedical data sets, including genome-wide association studies (GWAS). Reproducibility of machine learning analyses of GWAS can be hampered by biological and statistical factors, particularly so for the investigation of non-additive genetic interactions. Application of traditional cross validation to a GWAS data set may result in poor consistency between the training and testing data set splits due to an imbalance of the interaction genotypes relative to the data as a whole. We propose a new cross validation method, proportional instance cross validation (PICV), that preserves the original distribution of an independent variable when splitting the data set into training and testing partitions. We apply PICV to simulated GWAS data with epistatic interactions of varying minor allele frequencies and prevalences and compare performance to that of a traditional cross validation procedure in which individuals are randomly allocated to training and testing partitions. Sensitivity and positive predictive value are significantly improved across all tested scenarios for PICV compared to traditional cross validation. We also apply PICV to GWAS data from a study of primary open-angle glaucoma to investigate a previously-reported interaction, which fails to significantly replicate; PICV however improves the consistency of testing and training results. Application of traditional machine learning procedures to biomedical data may require modifications to better suit intrinsic characteristics of the data, such as the potential for highly imbalanced genotype distributions in the case of epistasis detection. The reproducibility of genetic interaction findings can be improved by considering this variable imbalance in cross validation implementation, such as with PICV. This approach may be extended to problems in other domains in which imbalanced variable distributions are a concern.
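The core idea behind PICV, splitting so that each genotype class keeps its original proportion in both partitions, can be sketched as a generic stratified split. This is a sketch under that assumption; the paper's exact PICV procedure may differ in its details:

```python
import random
from collections import defaultdict

def proportional_split(genotypes, test_frac=0.2, seed=0):
    """Split sample indices into train/test partitions so each genotype
    class keeps (approximately) its original proportion in both.
    Generic stratified-split sketch, not the paper's exact algorithm."""
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for i, g in enumerate(genotypes):
        by_class[g].append(i)
    train, test = [], []
    for idx in by_class.values():
        rng.shuffle(idx)
        # Guarantee rare classes appear in the test partition at least once
        n_test = max(1, round(len(idx) * test_frac))
        test.extend(idx[:n_test])
        train.extend(idx[n_test:])
    return sorted(train), sorted(test)

# Hypothetical imbalanced genotype distribution for one epistatic SNP pair
genotypes = ["AA"] * 80 + ["Aa"] * 18 + ["aa"] * 2
train, test = proportional_split(genotypes)
```

A purely random split could easily place both "aa" individuals in one partition, which is exactly the train/test inconsistency the abstract describes.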
Donner, Daniel G; Kiriazis, Helen; Du, Xiao-Jun; Marwick, Thomas H; McMullen, Julie R
2018-04-20
Informal training in preclinical research may be a contributor to the poor reproducibility of preclinical cardiology research and low rates of translation into clinical research and practice. Mouse echocardiography is a widely used technique to assess cardiac structure and function in drug intervention studies using disease models. The inter-observer variability (IOV) of clinical echocardiographic measurements has been shown to improve with formalized training, but preclinical echocardiography lacks similarly critical standardization of training. The aims of this investigation were to assess the IOV of echocardiographic measurements from studies in mice, and address any technical impediments to reproducibility by implementing standardized guidelines through formalized training. In this prospective, single-site, observational cohort study, 13 scientists performing preclinical echocardiographic image analysis were assessed for measurement of short-axis M-mode-derived dimensions and calculated left ventricular mass (LVMass). Ten M-mode images of mouse hearts acquired and analyzed by an expert researcher with a spectrum of LVMass were selected for assessment, and validated by autopsy weight. Following the initial observation, a structured formal training program was introduced, and accuracy and reproducibility were re-evaluated. Mean absolute percentage error (MAPE) for expert-calculated LVMass was 6 ± 4% compared to autopsy LVMass, and 25 ± 21% for participants before training. Standardized formal training improved participant MAPE by approximately 30% relative to expert-calculated LVMass (p < 0.001). Participants initially categorized with high-range error (25-45%) improved to low-moderate error ranges (<15-25%). This report reveals an example of technical skill training insufficiency likely endemic to preclinical research and provides validated guidelines for echocardiographic measurement for adaptation to formalized in-training programs.
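The MAPE statistic reported above has a standard definition. A minimal sketch follows; the LVMass values are hypothetical, chosen only to illustrate the calculation:

```python
def mape(estimates, reference):
    """Mean absolute percentage error of estimates against reference values,
    e.g. calculated LVMass vs. autopsy LVMass."""
    errors = [abs(e - r) / r * 100.0 for e, r in zip(estimates, reference)]
    return sum(errors) / len(errors)

# Hypothetical: calculated LVMass (mg) vs. autopsy LVMass for three hearts
calculated = [105.0, 92.0, 130.0]
autopsy = [100.0, 100.0, 125.0]
error_pct = mape(calculated, autopsy)
```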
NASA Astrophysics Data System (ADS)
Minnett, R.; Koppers, A. A. P.; Jarboe, N.; Jonestrask, L.; Tauxe, L.; Constable, C.
2016-12-01
The Magnetics Information Consortium (https://earthref.org/MagIC/) develops and maintains a database and web application for supporting the paleo-, geo-, and rock magnetic scientific community. Historically, this objective has been met with an Oracle database and a Perl web application at the San Diego Supercomputer Center (SDSC). The Oracle Enterprise Cluster at SDSC, however, was decommissioned in July of 2016 and the cost for MagIC to continue using Oracle became prohibitive. This provided MagIC with a unique opportunity to reexamine the entire technology stack and data model. MagIC has developed an open-source web application using the Meteor (http://meteor.com) framework and a MongoDB database. The simplicity of the open-source full-stack framework that Meteor provides has improved MagIC's development pace and the increased flexibility of the data schema in MongoDB encouraged the reorganization of the MagIC Data Model. As a result of incorporating actively developed open-source projects into the technology stack, MagIC has benefited from their vibrant software development communities. This has translated into a more modern web application that has significantly improved the user experience for the paleo-, geo-, and rock magnetic scientific community.
[A method for reproducing amnesia in mice by the complex extremal exposure].
Iasnetsov, V V; Provornova, N A
2003-01-01
It is suggested to reproduce retrograde amnesia in mice by means of a complex extremal action: an exhausting swim in cold water with simultaneous wheel rotation. It was found that nootropic drugs such as piracetam, mexidol, semax, nooglutil, acephen, and noopept fully or partially prevent the development of amnesia.
Comment on "Habitat split and the global decline of amphibians".
Cannatella, David C
2008-05-16
Becker et al. (Reports, 14 December 2007, p. 1775) reported that forest amphibians with terrestrial development are less susceptible to the effects of habitat degradation than those with aquatic larvae. However, analysis with more appropriate statistical methods suggests there is no evidence for a difference between aquatic-reproducing and terrestrial-reproducing species.
Reproducibility of Tactile Assessments for Children with Unilateral Cerebral Palsy
ERIC Educational Resources Information Center
Auld, Megan Louise; Ware, Robert S.; Boyd, Roslyn Nancy; Moseley, G. Lorimer; Johnston, Leanne Marie
2012-01-01
A systematic review identified tactile assessments used in children with cerebral palsy (CP), but their reproducibility is unknown. Sixteen children with unilateral CP and 31 typically developing children (TDC) were assessed 2-4 weeks apart. Test-retest percent agreements within one point for children with unilateral CP (and TDC) were…
Brown, Christopher A.; Brown, Kevin S.
2010-01-01
Correlated amino acid substitution algorithms attempt to discover groups of residues that co-fluctuate due to either structural or functional constraints. Although these algorithms could inform both ab initio protein folding calculations and evolutionary studies, their utility for these purposes has been hindered by a lack of confidence in their predictions due to hard to control sources of error. To complicate matters further, naive users are confronted with a multitude of methods to choose from, in addition to the mechanics of assembling and pruning a dataset. We first introduce a new pair scoring method, called ZNMI (Z-scored-product Normalized Mutual Information), which drastically improves the performance of mutual information for co-fluctuating residue prediction. Second and more important, we recast the process of finding coevolving residues in proteins as a data-processing pipeline inspired by the medical imaging literature. We construct an ensemble of alignment partitions that can be used in a cross-validation scheme to assess the effects of choices made during the procedure on the resulting predictions. This pipeline sensitivity study gives a measure of reproducibility (how similar are the predictions given perturbations to the pipeline?) and accuracy (are residue pairs with large couplings on average close in tertiary structure?). We choose a handful of published methods, along with ZNMI, and compare their reproducibility and accuracy on three diverse protein families. We find that (i) of the algorithms tested, while none appear to be both highly reproducible and accurate, ZNMI is one of the most accurate by far and (ii) while users should be wary of predictions drawn from a single alignment, considering an ensemble of sub-alignments can help to determine both highly accurate and reproducible couplings. 
Our cross-validation approach should be of interest both to developers and end users of algorithms that try to detect correlated amino acid substitutions. PMID:20531955
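The mutual-information core that a score like ZNMI builds on can be sketched for two alignment columns. This shows only joint-entropy-normalized mutual information; the Z-scored-product normalization that defines ZNMI itself is specified in the paper and not reproduced here:

```python
import math
from collections import Counter

def column_nmi(col_a, col_b):
    """Mutual information between two alignment columns, normalized by
    joint entropy (NMI in [0, 1]). ZNMI additionally Z-scores products of
    such pairwise values; this sketch covers only the NMI core."""
    n = len(col_a)
    pa, pb = Counter(col_a), Counter(col_b)
    pab = Counter(zip(col_a, col_b))
    mi = sum((c / n) * math.log2((c / n) / ((pa[a] / n) * (pb[b] / n)))
             for (a, b), c in pab.items())
    h_joint = -sum((c / n) * math.log2(c / n) for c in pab.values())
    return mi / h_joint if h_joint > 0 else 0.0

# Perfectly covarying columns (A<->T, G<->C) give NMI = 1
nmi_high = column_nmi("AAAGGG", "TTTCCC")
```

Fully conserved columns carry zero entropy and hence zero mutual information, one reason raw MI needs normalization before residue pairs can be ranked.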
Loy, S L; Marhazlina, M; Nor, Azwany Y; Hamid, Jan J M
2011-04-01
This study aimed to develop and examine the validity and reproducibility of a semi-quantitative food frequency questionnaire (FFQ) among Malay pregnant women in Kelantan, Malaysia. A total of 177 Malay pregnant women participated in the validation study while 85 of them participated in the reproducibility study which was carried out in the antenatal clinic of Universiti Sains Malaysia Hospital. The newly developed FFQ was validated against two 24-hour dietary recalls (DR). The FFQ was repeated 20 to 28 days apart. Results showed that the FFQ moderately over-estimated the nutrient and food intakes compared to the DR. Spearman correlation coefficients for nutrients ranged from 0.24 (fat) to 0.61 (calcium) and for foods, ranged from 0.13 (organ meats, onion and garlic) to 0.57 (malt drink). For nutrients, 72 to 85% of women were classified into the correct quartiles from the FFQ and the DR while for foods, 67 to 85% of women were classified correctly. Bland-Altman plot showed relatively good agreement between these two dietary methods. The intra-class correlation (ICC) was used to estimate reproducibility. It ranged from 0.75 (vitamin C) to 0.94 (phosphorus) for nutrients while it ranged from 0.73 (confectionary) to 0.96 (coffee) for foods. On average, at least 90% of pregnant women were correctly classified into the quartiles for nutrients and foods from the two sets of the FFQ. The FFQ presented acceptable reproducibility and appears to be a valid tool for categorising pregnant women according to dietary intake.
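The quartile cross-classification used above to compare the FFQ against the dietary recalls can be sketched as follows. This is a minimal version with hypothetical intake values; quartile-cut and tie-handling conventions vary between studies:

```python
def quartile(values, x):
    """Quartile index (0-3) of x within the distribution of values."""
    cuts = sorted(values)
    n = len(cuts)
    thresholds = [cuts[n // 4], cuts[n // 2], cuts[3 * n // 4]]
    return sum(x >= t for t in thresholds)

def pct_same_quartile(ffq, recall):
    """Percent of subjects classified into the same intake quartile
    by both dietary assessment methods."""
    same = sum(quartile(ffq, a) == quartile(recall, b)
               for a, b in zip(ffq, recall))
    return same / len(ffq) * 100.0

# Hypothetical calcium intakes from the FFQ for eight women
ffq_intakes = [55.0, 62.0, 48.0, 70.0, 66.0, 51.0, 59.0, 73.0]
agreement = pct_same_quartile(ffq_intakes, ffq_intakes)
```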
NASA Astrophysics Data System (ADS)
Jones, A. A.; Holt, R. M.
2017-12-01
Image capturing in flow experiments has been used for fluid mechanics research since the early 1970s. Interactions of fluid flow between the vadose zone and permanent water table are of great interest because this zone is responsible for all recharge waters, pollutant transport and irrigation efficiency for agriculture. Griffith et al. (2011) developed an approach in which reproducible, "geologically realistic" sand configurations are deposited in sand-filled experimental chambers for light-transmitted flow visualization experiments. This method creates reproducible, reverse-graded, layered (stratified) thin-slab sand chambers for point-source experiments visualizing multiphase flow through porous media. Reverse-graded stratification of sand chambers mimics many naturally occurring sedimentary deposits. Sand-filled chambers use light as a nonintrusive tool for measuring water saturation in two dimensions (2-D). Homogeneous and heterogeneous sand configurations can be produced to visualize the complex physics of the unsaturated zone. The experimental procedure developed by Griffith et al. (2011) was designed using now outdated and obsolete equipment. We have modernized this approach with a new Parker Deadel linear actuator and programmed projects/code for multiple configurations. We have also updated the Roper CCD software and image processing software with the latest in industry standards. Modernization of the transmitted-light source, robotic equipment, redesigned experimental chambers, and newly developed analytical procedures has greatly reduced time and cost per experiment. We have verified the ability of the new equipment to generate reproducible heterogeneous sand-filled chambers and demonstrated the functionality of the new equipment and procedures by reproducing several gravity-driven fingering experiments conducted by Griffith (2008).
Improving data collection, documentation, and workflow in a dementia screening study.
Read, Kevin B; LaPolla, Fred Willie Zametkin; Tolea, Magdalena I; Galvin, James E; Surkis, Alisa
2017-04-01
A clinical study team performing three multicultural dementia screening studies identified the need to improve data management practices and facilitate data sharing. A collaboration was initiated with librarians as part of the National Library of Medicine (NLM) informationist supplement program. The librarians identified areas for improvement in the studies' data collection, entry, and processing workflows. The librarians' role in this project was to meet needs expressed by the study team around improving data collection and processing workflows to increase study efficiency and ensure data quality. The librarians addressed the data collection, entry, and processing weaknesses through standardizing and renaming variables, creating an electronic data capture system using REDCap, and developing well-documented, reproducible data processing workflows. NLM informationist supplements provide librarians with valuable experience in collaborating with study teams to address their data needs. For this project, the librarians gained skills in project management, REDCap, and understanding of the challenges and specifics of a clinical research study. However, the time and effort required to provide targeted and intensive support for one study team was not scalable to the library's broader user community.
Improving reliability of a residency interview process.
Peeters, Michael J; Serres, Michelle L; Gundrum, Todd E
2013-10-14
To improve the reliability and discrimination of a pharmacy resident interview evaluation form, and thereby improve the reliability of the interview process. In phase 1 of the study, the authors used a Many-Facet Rasch Measurement model to optimize an existing evaluation form for reliability and discrimination. In phase 2, interviewer pairs used the modified evaluation form within 4 separate interview stations. In phase 3, 8 interviewers individually evaluated each candidate in one-on-one interviews. In phase 1, the evaluation form had a reliability of 0.98 with a person separation of 6.56; the form reproducibly separated applicants into 6 distinct groups. Using that form in phases 2 and 3, our largest variation source was candidates, while content specificity was the next largest variation source. The phase 2 g-coefficient was 0.787, while the confirmatory phase 3 g-coefficient was 0.922. Process reliability improved with more stations despite fewer interviewers per station; the impact of content specificity was greatly reduced with more interview stations. A more reliable, discriminating evaluation form was developed to evaluate candidates during resident interviews, and a process was designed that reduced the impact of content specificity.
High resolution fate map of the zebrafish diencephalon.
Russek-Blum, Niva; Nabel-Rosen, Helit; Levkowitz, Gil
2009-07-01
The diencephalon acts as an interactive site between the sensory, central, and endocrine systems and is one of the most elaborate structures in the vertebrate brain. To better understand the embryonic development and morphogenesis of the diencephalon, we developed an improved photoactivation (uncaging)-based lineage tracing strategy. To determine the exact position of a given diencephalic progenitor domain, we used a transgenic line driving green fluorescent protein (GFP) in cells expressing the proneural protein, Neurogenin1 (Neurog1), which was used as a visible neural plate landmark. This approach facilitated precise labeling of defined groups of cells in the prospective diencephalon of the zebrafish neural plate. In this manner, we labeled multiple overlapping areas of the diencephalon, thereby ensuring both accuracy and reproducibility of our lineage tracing regardless of the dynamic changes of the developing neural plate. We present a fate map of the zebrafish diencephalon at a higher spatial resolution than previously described. (c) 2009 Wiley-Liss, Inc.
LEC GaAs for integrated circuit applications
NASA Technical Reports Server (NTRS)
Kirkpatrick, C. G.; Chen, R. T.; Homes, D. E.; Asbeck, P. M.; Elliott, K. R.; Fairman, R. D.; Oliver, J. D.
1984-01-01
Recent developments in liquid encapsulated Czochralski techniques for the growth of semiinsulating GaAs for integrated circuit applications have resulted in significant improvements in the quality and quantity of GaAs material suitable for device processing. The emergence of high performance GaAs integrated circuit technologies has accelerated the demand for high quality, large diameter semiinsulating GaAs substrates. The new device technologies, including digital integrated circuits, monolithic microwave integrated circuits and charge coupled devices have largely adopted direct ion implantation for the formation of doped layers. Ion implantation lends itself to good uniformity and reproducibility, high yield and low cost; however, this technique also places stringent demands on the quality of the semiinsulating GaAs substrates. Although significant progress was made in developing a viable planar ion implantation technology, the variability and poor quality of GaAs substrates have hindered progress in process development.
Price, L H; Li, Y; Patel, A; Gyawali, C Prakash
2014-05-01
Multiple rapid swallows (MRS) during esophageal high resolution manometry (HRM) assess esophageal neuromuscular integrity by evaluating postdeglutitive inhibition and rebound contraction, but most reports performed only a single MRS sequence. We assessed patterns of MRS reproducibility during clinical HRM in comparison to a normal cohort. Consecutive clinical HRM studies were included if two separate MRS sequences (four to six rapid swallows ≤4 s apart) were successfully performed. Chicago Classification diagnoses were identified; contraction wave abnormalities were additionally recorded. MRS-induced inhibition (contraction ≤3 cm during inhibition phase) and rebound contraction was assessed, and findings compared to 18 controls (28.0 ± 0.7 year, 50.0% female). Reproducibility consisted of similar inhibition and contraction responses with both sequences; discordance was segregated into inhibition and contraction phases. Multiple rapid swallows were successfully performed in 89.3% patients and all controls; 225 subjects (56.2 ± 0.9 year, 62.7% female) met study inclusion criteria. Multiple rapid swallows were reproducible in 76.9% patients and 94.4% controls (inhibition phase: 88.0% vs 94.4%, contraction phase 86.7% vs 100%, respectively, p = ns). A gradient of reproducibility was noted, highest in well-developed motor disorders (achalasia spectrum, hypermotility disorders, and aperistalsis, 91.7-100%, p = ns compared to controls); and lower in lesser motor disorders (contraction wave abnormalities, esophageal body hypomotility) or normal studies (62.2-70.8%, p < 0.0001 compared to well-developed motor disorders). Inhibition phase was most discordant in contraction wave abnormalities, while contraction phase was most discordant when studies were designated normal. Multiple rapid swallows are highly reproducible, especially in well-developed motor disorders, and complement the standard wet swallow manometry protocol. © 2014 John Wiley & Sons Ltd.
Development and Assessment of a New Empirical Model for Predicting Full Creep Curves
Gray, Veronica; Whittaker, Mark
2015-01-01
This paper details the development and assessment of a new empirical creep model that belongs to the limited ranks of models reproducing full creep curves. The important features of the model are that it is fully standardised and universally applicable. By standardising, the user no longer chooses functions but rather fits only one set of constants. Testing it on 7 contrasting materials and reproducing 181 creep curves, we demonstrate its universality. The new model and Theta Projection curves are compared to one another using an assessment tool developed within this paper. PMID:28793458
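The Theta Projection model used above as the comparison baseline has a well-known closed form: a decaying primary term plus an accelerating tertiary term. A sketch follows with illustrative constants only; real theta values are fitted per material and test condition:

```python
import math

def theta_projection(t, th1, th2, th3, th4):
    """Theta Projection creep strain at time t:
    strain = th1*(1 - exp(-th2*t)) + th3*(exp(th4*t) - 1).
    First term: decelerating primary creep; second: accelerating tertiary creep."""
    return th1 * (1 - math.exp(-th2 * t)) + th3 * (math.exp(th4 * t) - 1)

# Illustrative (not fitted) constants; strain sampled every 10 time units
curve = [theta_projection(t, 0.02, 0.5, 0.001, 0.05) for t in range(0, 101, 10)]
```

Because both terms vanish at t = 0 and both derivatives are positive, the generated curve starts at zero strain and rises monotonically, as a full creep curve should.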
Improved Audio Reproduction System
NASA Technical Reports Server (NTRS)
Chang, C. S.
1972-01-01
Circuitry utilizing electrical feedback of instantaneous speaker coil velocity compensates for loudspeaker resonance, transient peaks and frequency drop-off so that sounds of widely varying frequencies and amplitudes can be reproduced accurately from high fidelity recordings of any variety.
Performance of European chemistry transport models as function of horizontal resolution
NASA Astrophysics Data System (ADS)
Schaap, M.; Cuvelier, C.; Hendriks, C.; Bessagnet, B.; Baldasano, J. M.; Colette, A.; Thunis, P.; Karam, D.; Fagerli, H.; Graff, A.; Kranenburg, R.; Nyiri, A.; Pay, M. T.; Rouïl, L.; Schulz, M.; Simpson, D.; Stern, R.; Terrenoire, E.; Wind, P.
2015-07-01
Air pollution causes adverse effects on human health as well as ecosystems and crop yield and also has an impact on climate change through short-lived climate forcers. To design mitigation strategies for air pollution, 3D Chemistry Transport Models (CTMs) have been developed to support the decision process. Increases in model resolution may provide more accurate and detailed information, but will cubically increase computational costs and pose additional challenges concerning high resolution input data. The motivation for the present study was therefore to explore the impact of using finer horizontal grid resolution for policy support applications of the European Monitoring and Evaluation Programme (EMEP) model within the Long Range Transboundary Air Pollution (LRTAP) convention. The goal was to determine the "optimum resolution" at which additional computational efforts do not provide increased model performance using presently available input data. Five regional CTMs performed four runs for 2009 over Europe at different horizontal resolutions. The models' responses to an increase in resolution are broadly consistent for all models. The largest response was found for NO2, followed by PM10 and O3. Model resolution does not impact model performance for rural background conditions. However, increasing model resolution improves the model performance at stations in and near large conglomerations. The statistical evaluation showed that the increased resolution better reproduces the spatial gradients in pollution regimes, but does not significantly improve the model performance for reproducing observed temporal variability. This study clearly shows that increasing model resolution is advantageous, and that leaving a resolution of 50 km in favour of a resolution between 10 and 20 km is practical and worthwhile.
As about 70% of the model response to grid resolution is determined by the difference in the spatial emission distribution, improved emission allocation procedures at high spatial and temporal resolution are a crucial factor for further model resolution improvements.
Transforming BIM to BEM: Generation of Building Geometry for the NASA Ames Sustainability Base BIM
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Donnell, James T.; Maile, Tobias; Rose, Cody
Typical processes of whole Building Energy simulation Model (BEM) generation are subjective, labor intensive, time intensive and error prone. Essentially, these typical processes reproduce already existing data, i.e. building models already created by the architect. Accordingly, Lawrence Berkeley National Laboratory (LBNL) developed a semi-automated process that enables reproducible conversions of Building Information Model (BIM) representations of building geometry into a format required by building energy modeling (BEM) tools. This is a generic process that may be applied to all building energy modeling tools but to date has only been used for EnergyPlus. This report describes and demonstrates each stage in the semi-automated process for building geometry using the recently constructed NASA Ames Sustainability Base throughout. This example uses ArchiCAD (Graphisoft, 2012) as the originating CAD tool and EnergyPlus as the concluding whole building energy simulation tool. It is important to note that the process is also applicable for professionals that use other CAD tools such as Revit ("Revit Architecture," 2012) and DProfiler (Beck Technology, 2012) and can be extended to provide geometry definitions for BEM tools other than EnergyPlus. The Geometry Simplification Tool (GST) was used during the NASA Ames project and was the enabling software that facilitated semi-automated data transformations. GST has now been superseded by Space Boundary Tool (SBT-1) and will be referred to as SBT-1 throughout this report.
The benefits of this semi-automated process are fourfold: 1) reduce the amount of time and cost required to develop a whole building energy simulation model, 2) enable rapid generation of design alternatives, 3) improve the accuracy of BEMs and 4) result in significantly better performing buildings with significantly lower energy consumption than those created using the traditional design process, especially if the simulation model was used as a predictive benchmark during operation. Developing BIM based criteria to support the semi-automated process should result in significant reliable improvements and time savings in the development of BEMs. In order to define successful BIMs, CAD export of IFC based BIMs for BEM must adhere to a standard Model View Definition (MVD) for simulation as provided by the concept design BIM MVD (buildingSMART, 2011). In order to ensure wide scale adoption, companies would also need to develop their own material libraries to support automated activities and undertake a pilot project to improve understanding of modeling conventions and design tool features and limitations.
van der Leij, Christiaan; Lavini, Cristina; van de Sande, Marleen G H; de Hair, Marjolein J H; Wijffels, Christophe; Maas, Mario
2015-12-01
To compare the between-session reproducibility of dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) combined with time-intensity curve (TIC)-shape analysis in arthritis patients, within one scanner and between two different scanners, and to compare this method with qualitative analysis and pharmacokinetic modeling (PKM). Fifteen knee joint arthritis patients were included and scanned twice on a closed-bore 1.5T scanner (n = 9, group 1), or on a closed-bore 1.5T and on an open-bore 1.0T scanner (n = 6, group 2). DCE-MRI data were postprocessed using in-house developed software ("Dynamo"). Disease activity was assessed. Disease activity was comparable between the two visits. In group 1 qualitative analysis showed the highest reproducibility with intraclass correlation coefficients (ICCs) between 0.78 and 0.98 and root mean square-coefficients of variation (RMS-CoV) of 8.0%-14.9%. TIC-shape analysis showed a slightly lower reproducibility with similar ICCs (0.78-0.97) but higher RMS-CoV (18.3%-42.9%). The PKM analysis showed the lowest reproducibility with ICCs between 0.39 and 0.64 (RMS-CoV 21.5%-51.9%). In group 2 TIC-shape analysis of the two most important TIC-shape types showed the highest reproducibility with ICCs of 0.78 and 0.71 (RMS-CoV 29.8% and 59.4%) and outperformed the reproducibility of the most important qualitative parameter (ICC 0.31, RMS-CoV 45.1%) and the within-scanner reproducibility of PKM analysis. TIC-shape analysis is a robust postprocessing method within one scanner, almost as reproducible as the qualitative analysis. Between scanners, the reproducibility of the most important TIC-shapes outperform that of the most important qualitative parameter and the within-scanner reproducibility of PKM analysis. © 2015 Wiley Periodicals, Inc.
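The RMS-CoV summary used above to report between-session reproducibility can be sketched as the root mean square of per-subject coefficients of variation. This is a common convention; the paper's exact formula may differ, and the measurement values below are hypothetical:

```python
import math
from statistics import mean, stdev

def rms_cov(subject_measurements):
    """Root-mean-square coefficient of variation (%) across subjects,
    each subject contributing repeated measurements (e.g. two scan visits).
    Common reproducibility summary; exact study formula may differ."""
    cvs = [stdev(m) / mean(m) * 100.0 for m in subject_measurements]
    return math.sqrt(mean(cv ** 2 for cv in cvs))

# Hypothetical DCE-MRI parameter values for three patients across two visits
data = [[1.10, 1.00], [0.95, 1.05], [2.00, 2.20]]
result = rms_cov(data)
```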
Droplet microfluidics for synthetic biology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gach, Philip Charles; Iwai, Kosuke; Kim, Peter Wonhee
Here, synthetic biology is an interdisciplinary field that aims to engineer biological systems for useful purposes. Organism engineering often requires the optimization of individual genes and/or entire biological pathways (consisting of multiple genes). Advances in DNA sequencing and synthesis have recently begun to enable the possibility of evaluating thousands of gene variants and hundreds of thousands of gene combinations. However, such large-scale optimization experiments remain cost-prohibitive to researchers following traditional molecular biology practices, which are frequently labor-intensive and suffer from poor reproducibility. Liquid handling robotics may reduce labor and improve reproducibility, but are themselves expensive and thus inaccessible to most researchers. Microfluidic platforms offer a lower entry price point alternative to robotics, and maintain high throughput and reproducibility while further reducing operating costs through diminished reagent volume requirements. Droplet microfluidics have shown exceptional promise for synthetic biology experiments, including DNA assembly, transformation/transfection, culturing, cell sorting, phenotypic assays, artificial cells and genetic circuits.
Droplet microfluidics for synthetic biology
Gach, Philip Charles; Iwai, Kosuke; Kim, Peter Wonhee; ...
2017-08-10
Here, synthetic biology is an interdisciplinary field that aims to engineer biological systems for useful purposes. Organism engineering often requires the optimization of individual genes and/or entire biological pathways (consisting of multiple genes). Advances in DNA sequencing and synthesis have recently begun to enable the possibility of evaluating thousands of gene variants and hundreds of thousands of gene combinations. However, such large-scale optimization experiments remain cost-prohibitive to researchers following traditional molecular biology practices, which are frequently labor-intensive and suffer from poor reproducibility. Liquid handling robotics may reduce labor and improve reproducibility, but are themselves expensive and thus inaccessible to mostmore » researchers. Microfluidic platforms offer a lower entry price point alternative to robotics, and maintain high throughput and reproducibility while further reducing operating costs through diminished reagent volume requirements. Droplet microfluidics have shown exceptional promise for synthetic biology experiments, including DNA assembly, transformation/transfection, culturing, cell sorting, phenotypic assays, artificial cells and genetic circuits.« less
Enhancing Results of Microarray Hybridizations Through Microagitation
Toegl, Andreas; Kirchner, Roland; Gauer, Christoph; Wixforth, Achim
2003-01-01
Protein and DNA microarrays have become a standard tool in proteomics/genomics research. In order to guarantee fast and reproducible hybridization results, the diffusion limit must be overcome. Surface acoustic wave (SAW) micro-agitation chips efficiently agitate the smallest sample volumes (down to 10 μL and below) without introducing any dead volume. The advantages are reduced reaction time, increased signal-to-noise ratio, improved homogeneity across the microarray, and better slide-to-slide reproducibility. The SAW micromixer chips are the heart of the Advalytix ArrayBooster, which is compatible with all microarrays based on the microscope slide format. PMID:13678150
Accounting for reciprocal host-microbiome interactions in experimental science.
Stappenbeck, Thaddeus S; Virgin, Herbert W
2016-06-09
Mammals are defined by their metagenome, a combination of host and microbiome genes. This knowledge presents opportunities to further basic biology with translation to human diseases. However, the now-documented influence of the metagenome on experimental results and the reproducibility of in vivo mammalian models present new challenges. Here we provide the scientific basis for calling on all investigators, editors and funding agencies to embrace changes that will enhance reproducible and interpretable experiments by accounting for metagenomic effects. Implementation of new reporting and experimental design principles will improve experimental work, speed discovery and translation, and properly use substantial investments in biomedical research.
Note: Automated electrochemical etching and polishing of silver scanning tunneling microscope tips.
Sasaki, Stephen S; Perdue, Shawn M; Rodriguez Perez, Alejandro; Tallarida, Nicholas; Majors, Julia H; Apkarian, V Ara; Lee, Joonhee
2013-09-01
Fabrication of sharp and smooth Ag tips is crucial in optical scanning probe microscope experiments. To ensure reproducible tip profiles, the polishing process is fully automated using a closed-loop laminar flow system to deliver the electrolytic solution to moving electrodes mounted on a motorized translational stage. The repetitive translational motion is controlled precisely on the μm scale with a stepper motor and screw-thread mechanism. The automated setup allows reproducible control over the tip profile and improves smoothness and sharpness of tips (radius 27 ± 18 nm), as measured by ultrafast field emission.
Bae, Yong Jin; Park, Kyung Man; Ahn, Sung Hee; Moon, Jeong Hee; Kim, Myung Soo
2014-08-01
Previously, we reported that MALDI spectra of peptides became reproducible when temperature was kept constant. Linear calibration curves derived from such spectral data could be used for quantification. Homogeneity of samples was one of the requirements. Among the three popular matrices used in peptide MALDI [i.e., α-cyano-4-hydroxycinnamic acid (CHCA), 2,5-dihydroxybenzoic acid (DHB), and sinapinic acid (SA)], homogeneous samples could be prepared by conventional means only for CHCA. In this work, we showed that sample preparation by micro-spotting improved the homogeneity for all three cases.
A gene network model accounting for development and evolution of mammalian teeth
Salazar-Ciudad, Isaac; Jernvall, Jukka
2002-01-01
Generation of morphological diversity remains a challenge for evolutionary biologists because it is unclear how an ultimately finite number of genes involved in initial pattern formation integrates with morphogenesis. Ideally, models used to search for the simplest developmental principles on how genes produce form should account for both developmental process and evolutionary change. Here we present a model reproducing the morphology of mammalian teeth by integrating experimental data on gene interactions and growth into a morphodynamic mechanism in which developing morphology has a causal role in patterning. The model predicts the course of tooth-shape development in different mammalian species and also reproduces key transitions in evolution. Furthermore, we reproduce the known expression patterns of several genes involved in tooth development and their dynamics over developmental time. Large morphological effects frequently can be achieved by small changes, according to this model, and similar morphologies can be produced by different changes. This finding may help explain why predicting the morphological outcomes of molecular experiments is challenging. Nevertheless, models incorporating morphology and gene activity show promise for linking genotypes to phenotypes. PMID:12048258
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Silva Indrasekara, Agampodi S.; Johnson, Sean F.; Odion, Ren A.
2018-02-22
Among plasmonic nanoparticles, surfactant-free branched gold nanoparticles have exhibited exceptional properties as a nanoplatform for a wide variety of applications ranging from surface-enhanced Raman scattering sensing and imaging to photothermal treatment and photoimmunotherapy for cancer. The effectiveness and reliability of branched gold nanoparticles in biomedical applications strongly rely on the consistency and reproducibility of the physical, chemical, optical, and therapeutic properties of the nanoparticles, which are mainly governed by their morphological features. Herein, we present an optimized bottom-up synthesis that improves the reproducibility and homogeneity of branched gold nanoparticles with the desired morphological features and optical properties. We identified that the order of reagent addition is crucial for improved homogeneity of the branched nature of the nanoparticles, which enables high batch-to-batch reproducibility and reliability. In addition, different combinations of the synthesis parameters, in particular additive halides and the concentration ratios of reactive Au to Ag and Au to Au seeds, were found to yield branched nanoparticles with similar localized surface plasmon resonances but distinguishable changes in the dimensions of the branches. Overall, our study introduces design parameters for the purpose-tailored manufacturing of surfactant-free gold nanostars in a reliable manner.
Investigating the Eddy Diffusivity Concept in the Coastal Ocean
NASA Astrophysics Data System (ADS)
Rypina, I.; Kirincich, A.; Lentz, S. J.; Sundermeyer, M. A.
2016-12-01
We test the validity, utility, and limitations of the lateral eddy diffusivity concept in a coastal environment through analyzing data from coupled drifter and dye releases within the footprint of a high-resolution (800 m) high-frequency radar south of Martha's Vineyard, Massachusetts. Specifically, we investigate how well a combination of radar-based velocities and drifter-derived diffusivities can reproduce observed dye spreading over an 8-h time interval. A drifter-based estimate of an anisotropic diffusivity tensor is used to parameterize small-scale motions that are unresolved and under-resolved by the radar system. This leads to a significant improvement in the ability of the radar to reproduce the observed dye spreading. Our drifter-derived diffusivity estimates are O(10 m2/s), are consistent with the diffusivity inferred from aerial images of the dye taken using the quadcopter-mounted digital camera during the dye release, and are roughly an order of magnitude larger than diffusivity estimates of Okubo (O(1 m2/s)) for similar spatial scales ( 1 km). Despite the fact that the drifter-based diffusivity approach was successful in improving the ability of the radar to reproduce the observed dye spreading, the dispersion of drifters was, for the most part, not consistent with the diffusive spreading regime.
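The drifter-derived diffusivity above rests on the diffusive-spreading relation κ = ½ d(variance)/dt. A minimal sketch of that estimate, assuming purely diffusive spreading in one lateral dimension; the drifter tracks and numbers here are illustrative, not the Martha's Vineyard data:

```python
from statistics import pvariance

def lateral_diffusivity(tracks, dt):
    """Estimate eddy diffusivity kappa = 0.5 * d(variance)/dt from drifter positions.
    tracks: list of per-drifter position series (m), all sampled every dt seconds."""
    nsteps = len(tracks[0])
    # spread of the drifter cluster about its mean at each sample time
    var = [pvariance([trk[i] for trk in tracks]) for i in range(nsteps)]
    # least-squares slope of variance versus time
    t = [i * dt for i in range(nsteps)]
    tbar, vbar = sum(t) / nsteps, sum(var) / nsteps
    slope = sum((ti - tbar) * (vi - vbar) for ti, vi in zip(t, var)) \
        / sum((ti - tbar) ** 2 for ti in t)
    return 0.5 * slope
```

The abstract's caveat applies directly to this sketch: when the observed dispersion is not in a diffusive regime, the fitted slope (and hence κ) loses its physical interpretation.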
Yaniv, Ziv; Lowekamp, Bradley C; Johnson, Hans J; Beare, Richard
2018-06-01
Modern scientific endeavors increasingly require team collaborations to construct and interpret complex computational workflows. This work describes an image-analysis environment that supports the use of computational tools that facilitate reproducible research and support scientists with varying levels of software development skills. The Jupyter notebook web application is the basis of an environment that enables flexible, well-documented, and reproducible workflows via literate programming. Image-analysis software development is made accessible to scientists with varying levels of programming experience via the use of the SimpleITK toolkit, a simplified interface to the Insight Segmentation and Registration Toolkit. Additional features of the development environment include user friendly data sharing using online data repositories and a testing framework that facilitates code maintenance. SimpleITK provides a large number of examples illustrating educational and research-oriented image analysis workflows for free download from GitHub under an Apache 2.0 license: github.com/InsightSoftwareConsortium/SimpleITK-Notebooks .
Weis, Jared A.; Flint, Katelyn M.; Sanchez, Violeta; Yankeelov, Thomas E.; Miga, Michael I.
2015-01-01
Cancer progression has been linked to mechanics. Therefore, there has been recent interest in developing noninvasive imaging tools for cancer assessment that are sensitive to changes in tissue mechanical properties. We have developed one such method, modality independent elastography (MIE), that estimates the relative elastic properties of tissue by fitting anatomical image volumes acquired before and after the application of compression to biomechanical models. The aim of this study was to assess the accuracy and reproducibility of the method using phantoms and a murine breast cancer model. Magnetic resonance imaging data were acquired, and the MIE method was used to estimate relative volumetric stiffness. Accuracy was assessed using phantom data by comparing to gold-standard mechanical testing of elasticity ratios. Validation error was <12%. Reproducibility analysis was performed on animal data, and within-subject coefficients of variation ranged from 2 to 13% at the bulk level and 32% at the voxel level. To our knowledge, this is the first study to assess the reproducibility of an elasticity imaging metric in a preclinical cancer model. Our results suggest that the MIE method can reproducibly generate accurate estimates of the relative mechanical stiffness and provide guidance on the degree of change needed in order to declare biological changes rather than experimental error in future therapeutic studies. PMID:26158120
He, Xiao-Mei; Ding, Jun; Yu, Lei; Hussain, Dilshad; Feng, Yu-Qi
2016-09-01
Quantitative analysis of small molecules by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) has been a challenging task due to matrix-derived interferences in low m/z region and poor reproducibility of MS signal response. In this study, we developed an approach by applying black phosphorus (BP) as a matrix-assisted laser desorption ionization (MALDI) matrix for the quantitative analysis of small molecules for the first time. Black phosphorus-assisted laser desorption/ionization mass spectrometry (BP/ALDI-MS) showed clear background and exhibited superior detection sensitivity toward quaternary ammonium compounds compared to carbon-based materials. By combining stable isotope labeling (SIL) strategy with BP/ALDI-MS (SIL-BP/ALDI-MS), a variety of analytes labeled with quaternary ammonium group were sensitively detected. Moreover, the isotope-labeled forms of analytes also served as internal standards, which broadened the analyte coverage of BP/ALDI-MS and improved the reproducibility of MS signals. Based on these advantages, a reliable method for quantitative analysis of aldehydes from complex biological samples (saliva, urine, and serum) was successfully established. Good linearities were obtained for five aldehydes in the range of 0.1-20.0 μM with correlation coefficients (R (2)) larger than 0.9928. The LODs were found to be 20 to 100 nM. Reproducibility of the method was obtained with intra-day and inter-day relative standard deviations (RSDs) less than 10.4 %, and the recoveries in saliva samples ranged from 91.4 to 117.1 %. Taken together, the proposed SIL-BP/ALDI-MS strategy has proved to be a reliable tool for quantitative analysis of aldehydes from complex samples. Graphical Abstract An approach for the determination of small molecules was developed by using black phosphorus (BP) as a matrix-assisted laser desorption ionization (MALDI) matrix.
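The quantification step described above reduces to a linear calibration: fit the analyte/internal-standard intensity ratio against known concentrations, then invert the line for unknowns. A minimal least-squares sketch with illustrative numbers (not data from the paper):

```python
def fit_calibration(concs, ratios):
    """Least-squares line ratio = a*conc + b for an internal-standard calibration."""
    n = len(concs)
    cbar, rbar = sum(concs) / n, sum(ratios) / n
    a = sum((c - cbar) * (r - rbar) for c, r in zip(concs, ratios)) \
        / sum((c - cbar) ** 2 for c in concs)
    b = rbar - a * cbar
    return a, b

def quantify(ratio, a, b):
    """Invert the fitted calibration to recover concentration from a measured ratio."""
    return (ratio - b) / a
```

Using the ratio to an isotope-labeled internal standard, rather than the raw intensity, is what cancels the shot-to-shot signal variability that the abstract identifies as the main obstacle to quantitative MALDI.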
Sochat, Vanessa
2018-01-01
Background: Here, we present the Scientific Filesystem (SCIF), an organizational format that supports exposure of executables and metadata for discoverability of scientific applications. The format includes a known filesystem structure, a definition for a set of environment variables describing it, and functions for generation of the variables and interaction with the libraries, metadata, and executables located within. SCIF makes it easy to expose metadata, multiple environments, installation steps, files, and entry points to render scientific applications consistent, modular, and discoverable. A SCIF can be installed on a traditional host or in a container technology such as Docker or Singularity. We start by reviewing the background and rationale for the SCIF, followed by an overview of the specification and the different levels of internal modules (“apps”) that the organizational format affords. Finally, we demonstrate that SCIF is useful by implementing and discussing several use cases that improve user interaction and understanding of scientific applications. SCIF is released along with a client and integration in the Singularity 2.4 software to quickly install and interact with SCIF. When used inside of a reproducible container, a SCIF is a recipe for reproducibility and introspection of the functions and users that it serves. Results: We use SCIF to evaluate container software, provide metrics, serve scientific workflows, and execute a primary function under different contexts. To encourage collaboration and sharing of applications, we developed tools along with an open source, version-controlled, tested, and programmatically accessible web infrastructure. SCIF and associated resources are available at https://sci-f.github.io. The ease of using SCIF, especially in the context of containers, offers promise for scientists’ work to be self-documenting and programmatically parseable for maximum reproducibility.
SCIF opens up an abstraction from underlying programming languages and packaging logic to work with scientific applications, opening up new opportunities for scientific software development. PMID:29718213
DOE Office of Scientific and Technical Information (OSTI.GOV)
González-Lavado, Eloisa; Corchado, Jose C.; Espinosa-Garcia, Joaquin, E-mail: joaquin@unex.es
2014-02-14
Based exclusively on high-level ab initio calculations, a new full-dimensional analytical potential energy surface (PES-2014) for the gas-phase reaction of hydrogen abstraction from methane by an oxygen atom is developed. The ab initio information employed in the fit includes properties (equilibrium geometries, relative energies, and vibrational frequencies) of the reactants, products, saddle point, points on the reaction path, and points on the reaction swath, taking special care with the location and characterization of the intermediate complexes in the entrance and exit channels. By comparing with the reference results we show that the resulting PES-2014 reproduces reasonably well the whole set of ab initio data used in the fitting, obtained at the CCSD(T) = FULL/aug-cc-pVQZ//CCSD(T) = FC/cc-pVTZ single point level, which represents a severe test of the new surface. As a first application, on this analytical surface we perform an extensive dynamics study using quasi-classical trajectory calculations, comparing the results with recent experimental and theoretical data. The excitation function increases with energy (concave-up), reproducing experimental and theoretical information, although our values are somewhat larger. The OH rovibrational distribution is cold, in agreement with experiment. Finally, our results reproduce the experimental backward scattering distribution, associated with a rebound mechanism. These results lend confidence to the accuracy of the new surface, which substantially improves on the results obtained with our previous surface (PES-2000) for the same system.
Variability in sublingual microvessel density and flow measurements in healthy volunteers.
Hubble, Sheena M A; Kyte, Hayley L; Gooding, Kim; Shore, Angela C
2009-02-01
As sublingual microvascular indices are increasingly heralded as new resuscitation end-points, better population data are required to power clinical studies. This paper describes improved methods to quantify sublingual microvessel flow and density in images obtained by sidestream dark field (SDF) technology in healthy volunteers, including vessels under 10 microm in diameter. Measurements of sublingual capillary density and flow were obtained by recording three 15-second images in 20 healthy volunteers over three days. Two independent observers quantified capillary density by using two methods: total vessel length (mm/mm2) and counting (number/mm). Both within-subject variability (intraoral and temporal) and observer reproducibility were determined by using coefficients of variability and reproducibility indices. For small (1-10 microm), medium (11-20 microm), and large (21-50 microm) diameters, mean vessel density with standard deviations (SDs) in volunteers was 21.3 (+/- 4.9), 5.2 (+/- 1.2), and 2.7 (+/- 0.9) mm/mm2, respectively. Also, 94.0 +/- 1.4% of small vessels, 94.5 +/- 1.4% of medium vessels, and 94.5 +/- 4.0% of large vessels had continuous perfusion. Within subjects, the means of all measurements over three days varied by less than 13, 22, and 35% in small, medium, and large vessels, respectively. Interobserver reproducibility was good, especially for capillary (1-10 microm) density and flow measurements. Our methods of microvessel flow and density quantification have low observer variability and confirm the stability of microcirculatory measurements over time. These results facilitate the development of SDF-acquired sublingual microvascular indices as feasible microperfusion markers in shock resuscitation.
Integrated signal probe based aptasensor for dual-analyte detection.
Xiang, Juan; Pi, Xiaomei; Chen, Xiaoqing; Xiang, Lei; Yang, Minghui; Ren, Hao; Shen, Xiaojuan; Qi, Ning; Deng, Chunyan
2017-10-15
For multi-analyte detection, although the sensitivity has commonly met practical requirements, the reliability, reproducibility, and stability need to be further improved. In this work, two different aptamer probes labeled with redox tags were used as signal probe 1 (sP1) and signal probe 2 (sP2), which were integrated into one unified DNA architecture to develop the integrated signal probe (ISP). Compared with conventional independent signal probes for simultaneous multi-analyte detection, the proposed ISP was more reproducible and accurate. This is because integrating both probes into one DNA structure ensures identical modification conditions and an equal stoichiometric ratio between sP1 and sP2, and furthermore the cross-interference between sP1 and sP2 can be prevented by regulating their complementary positions. The ISP-based assay system represents a substantial advance for dual-analyte detection. Combined with gold nanoparticle (AuNP) signal amplification, an ISP/AuNP-based aptasensor for sensitive dual-analyte detection was explored. Based on DNA structural switching induced by targets binding to their aptamers, simultaneous dual-analyte detection was achieved simply by monitoring the electrochemical responses of methylene blue (MB) and ferrocene (Fc). The proposed detection system possesses such advantages as simplicity in design, easy operation, good reproducibility and accuracy, and high sensitivity and selectivity, which indicates the excellent potential of this aptasensor in clinical diagnosis and other molecular sensing applications. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Wang, Chenghai; Yang, Kai
2018-04-01
Land surface models (LSMs) have developed significantly over the past few decades, with the result that most LSMs can generally reproduce the characteristics of the land surface. However, LSMs fail to reproduce some details of soil water and heat transport during seasonal transition periods because they neglect the effects of interactions between water movement and heat transfer in the soil. Such effects are critical for a complete understanding of water-heat transport within a soil thermohydraulic regime. In this study, a fully coupled water-heat transport scheme (FCS), which is more complete in theory, is incorporated into the Community Land Model (version 4.5) to replace its original isothermal scheme. Observational data from five sites are used to validate the performance of the FCS. The simulation results at both single-point and global scale show that the FCS improved the simulation of soil moisture and temperature. The FCS better reproduced the characteristics of drier and colder surface layers in arid regions by considering the diffusion of soil water vapor, which is a nonnegligible process in soil, especially in surface layers, while its effects in cold regions are generally the inverse. It also accounted for the sensible heat fluxes caused by liquid water flow, which can contribute to heat transfer in both surface and deep layers. The FCS affects the estimation of surface sensible heat (SH) and latent heat (LH) fluxes and provides details of soil heat and water transport, which helps in understanding the physical processes of soil water-heat migration.
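The FCS couples water movement and heat transfer; as a much simpler illustration of the kind of column update an LSM soil scheme performs, here is one explicit finite-difference step of 1-D heat conduction alone. This is a generic sketch, not the CLM4.5 or FCS code, and it omits the water-vapor and liquid-flow terms the abstract is about:

```python
def heat_step(T, dz, dt, alpha):
    """One explicit step of 1-D heat conduction dT/dt = alpha * d2T/dz2 on a
    uniform grid. Boundary nodes are held fixed (Dirichlet); the scheme is
    stable only if alpha * dt / dz**2 <= 0.5."""
    r = alpha * dt / dz ** 2
    interior = [T[i] + r * (T[i + 1] - 2.0 * T[i] + T[i - 1])
                for i in range(1, len(T) - 1)]
    return [T[0]] + interior + [T[-1]]
```

A linear temperature profile has zero curvature and so is a steady state of this update, while a warm anomaly diffuses into its neighbors; the fully coupled scheme adds source terms for vapor diffusion and heat carried by liquid water flow.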
Tomizawa, Ryoko; Yamano, Mayumi; Osako, Mitue; Hirabayashi, Naotugu; Oshima, Nobuo; Sigeta, Masahiro; Reeves, Scott
2017-12-01
Few scales currently exist to assess the quality of interprofessional teamwork through team members' perceptions of working together in mental health settings. The purpose of this study was to revise and validate an interprofessional scale to assess the quality of teamwork in inpatient psychiatric units and to use it multi-nationally. A literature review was undertaken to identify evaluative teamwork tools and develop an additional 12 items to ensure a broad global focus. Focus group discussions considered adaptation to different care systems using subjective judgements from 11 participants in a pre-test of items. Data quality, construct validity, reproducibility, and internal consistency were investigated in the survey using an international comparative design. Exploratory factor analysis yielded five factors with 21 items: 'patient/community centred care', 'collaborative communication', 'interprofessional conflict', 'role clarification', and 'environment'. High overall internal consistency, reproducibility, adequate face validity, and reasonable construct validity were shown in the USA and Japan. The revised Collaborative Practice Assessment Tool (CPAT) is a valid measure to assess the quality of interprofessional teamwork in psychiatry and identifies the best strategies to improve team performance. Furthermore, the revised scale will generate more rigorous evidence for collaborative practice in psychiatry internationally.
García-Sarrió, María Jesús; Sanz, María Luz; Sanz, Jesús; González-Coloma, Azucena; Cristina Soria, Ana
2018-04-14
A new microwave-assisted extraction (MAE) method using ethanol as solvent has been optimized by means of a Box-Behnken experimental design for the enhanced extraction of bioactive terpenoids from Mentha rotundifolia leaves; 100°C, 5 min, 1.125 g dry sample: 10 mL solvent and a single extraction cycle were selected as optimal conditions. Improved performance of MAE method in terms of extraction yield and/or reproducibility over conventional solid-liquid extraction and ultrasound assisted extraction was also previously assessed. A comprehensive characterization of MAE extracts was carried out by GC-MS. A total of 46 compounds, mostly terpenoids, were identified; piperitenone oxide and piperitenone were the major compounds determined. Several neophytadiene isomers were also detected for the first time in MAE extracts. Different procedures (solid-phase extraction and activated charcoal (AC) treatment) were also evaluated for clean-up of MAE extracts, with AC providing the highest enrichment in bioactive terpenoids. Finally, the MAE method here developed is shown as a green, fast, efficient and reproducible liquid extraction methodology to obtain M. rotundifolia bioactive extracts for further application, among others, as food preservatives. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
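The Box-Behnken design used above places factor pairs at their coded extremes while the remaining factor sits at its midpoint, plus replicated center runs. A generic generator in coded (-1, 0, +1) units; the mapping of the three factors to temperature, time, and sample:solvent ratio is taken from the abstract, but the generator itself is a standard sketch, not the authors' design matrix:

```python
from itertools import combinations, product

def box_behnken(n_factors, n_center=3):
    """Coded-level (-1, 0, +1) Box-Behnken design for n_factors >= 3 factors."""
    runs = []
    # every pair of factors visits all four (+/-1, +/-1) corners, others at 0
    for pair in combinations(range(n_factors), 2):
        for levels in product((-1, 1), repeat=2):
            run = [0] * n_factors
            run[pair[0]], run[pair[1]] = levels
            runs.append(run)
    # replicated center points for a pure-error estimate
    runs += [[0] * n_factors for _ in range(n_center)]
    return runs
```

For three factors this yields 12 edge runs plus the center replicates (15 runs here), which is why Box-Behnken designs are a popular economical choice for fitting response surfaces in extraction optimization.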
Nutrient cycle benchmarks for earth system land model
NASA Astrophysics Data System (ADS)
Zhu, Q.; Riley, W. J.; Tang, J.; Zhao, L.
2017-12-01
Projecting future biosphere-climate feedbacks using Earth system models (ESMs) relies heavily on robust modeling of land surface carbon dynamics. More importantly, soil nutrient (particularly, nitrogen (N) and phosphorus (P)) dynamics strongly modulate carbon dynamics, such as plant sequestration of atmospheric CO2. Prevailing ESM land models all consider nitrogen as a potentially limiting nutrient, and several consider phosphorus. However, including nutrient cycle processes in ESM land models potentially introduces large uncertainties that could be identified and addressed by improved observational constraints. We describe the development of two nutrient cycle benchmarks for ESM land models: (1) nutrient partitioning between plants and soil microbes inferred from 15N and 33P tracers studies and (2) nutrient limitation effects on carbon cycle informed by long-term fertilization experiments. We used these benchmarks to evaluate critical hypotheses regarding nutrient cycling and their representation in ESMs. We found that a mechanistic representation of plant-microbe nutrient competition based on relevant functional traits best reproduced observed plant-microbe nutrient partitioning. We also found that for multiple-nutrient models (i.e., N and P), application of Liebig's law of the minimum is often inaccurate. Rather, the Multiple Nutrient Limitation (MNL) concept better reproduces observed carbon-nutrient interactions.
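The contrast drawn above between Liebig's law of the minimum and Multiple Nutrient Limitation can be stated in two lines. This is a schematic of the two limitation rules applied to dimensionless N and P availability factors, not the benchmark code itself:

```python
def liebig(f_n, f_p):
    """Liebig's law of the minimum: growth is set only by the scarcest nutrient."""
    return min(f_n, f_p)

def multiplicative(f_n, f_p):
    """Multiple Nutrient Limitation (co-limitation): both scarcities act at once."""
    return f_n * f_p
```

Whenever both factors are below 1, the multiplicative rule predicts stronger limitation than Liebig's minimum, which is one way a benchmark against fertilization experiments can discriminate between the two formulations.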
Tutschek, B; Braun, T; Chantraine, F; Henrich, W
2011-01-01
Intrapartum translabial ultrasound (ITU) has the potential to objectively and quantitatively assess the progress of labour. The relationships between the different ITU parameters and their development during normal term labour have not been studied. Observational study. University teaching hospital. Labouring women with normal term fetuses in cephalic presentation. Intrapartum translabial ultrasound measurements for 'head station', 'head direction', and 'angle of descent' (AoD) were taken in 50 labouring women, compared, studied for repeatability, and correlated with the progress of labour. Reproducibility and correlation of ITU parameters and their pattern of changes during labour. All three ITU parameters were clinically well reproducible. AoD and head station were interchangeable, and could be calculated from each other. Head station and head direction changed in a typical pattern along the birth canal. Time to delivery correlated with ITU head station. Intrapartum translabial ultrasound is a simple technique that improves the understanding of normal and abnormal labour, enables the objective measurement of birth progress and provides a more scientific basis for assessing labour. © 2010 The Authors Journal compilation © RCOG 2010 BJOG An International Journal of Obstetrics and Gynaecology.
Stormwater Runoff and Water Quality Modeling in Urban Maryland
NASA Astrophysics Data System (ADS)
Wang, J.; Forman, B. A.; Natarajan, P.; Davis, A.
2015-12-01
Urbanization significantly affects storm water runoff through the creation of new impervious surfaces such as highways, parking lots, and rooftops. Such changes can adversely impact the downstream receiving water bodies in terms of physical, chemical, and biological conditions. In order to mitigate the effects of urbanization on downstream water bodies, stormwater control measures (SCMs) have been widely used (e.g., infiltration basins, bioswales). A suite of observations from an infiltration basin installed adjacent to a highway in urban Maryland was used to evaluate stormwater runoff attenuation and pollutant removal rates at the well-instrumented SCM study site. In this study, the Storm Water Management Model (SWMM) was used to simulate the performance of the SCM. An automatic, split-sample calibration framework was developed to improve SWMM performance efficiency. The results indicate SWMM can accurately reproduce the hydraulic response of the SCM (in terms of reproducing measured inflow and outflow) during synoptic scale storm events lasting more than one day, but is less accurate during storm events lasting only a few hours. Similar results were found for a suite of modeled (and observed) water quality constituents, including suspended sediment, metals, N, P, and chloride.
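The abstract does not give the details of its automatic split-sample calibration framework, so the following is only a hedged sketch of the two generic ingredients such a framework needs: a goodness-of-fit score (Nash-Sutcliffe efficiency is a common choice for hydrographs, assumed here, not stated in the paper) and a split of storm events into calibration and validation sets:

```python
def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; <= 0 is no better than
    simply predicting the mean of the observations."""
    obar = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - obar) ** 2 for o in observed)
    return 1.0 - num / den

def split_sample(events):
    """Split a chronological event list into calibration/validation halves
    by simple alternation (one illustrative splitting rule among many)."""
    return events[0::2], events[1::2]
```

Calibrating parameters on one half and scoring on the held-out half guards against overfitting the SWMM parameters to a particular set of storms, mirroring the paper's finding that performance differs between multi-day and few-hour events.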
NASA Astrophysics Data System (ADS)
Liao, Zhenyu; Zhang, Ying; Su, Lin; Chang, Jin; Wang, Hanjie
2017-02-01
Ochratoxin A (OTA), the most harmful and abundant ochratoxin, is chemically stable and commonly found in foodstuffs. In this work, an upconversion luminescent-magnetic microbead (UCLMM)-based cytometric bead array for OTA detection with low reagent consumption and high sensitivity was established and optimized. In the UCLMMs, the upconversion nanocrystals (UCNs) used for optical encoding present weak background noise and no spectral cross talk between the encoding signals and target labels under the two excitation conditions, improving detection sensitivity, while the superparamagnetic Fe3O4 nanoparticles (Fe3O4 NPs) enable rapid analysis. The results show that the developed method achieves a sensitivity of 9.553 ppt, below that of HPLC, with a 50-μL sample and can be completed in <2 h with good accuracy and high reproducibility. With further improvement, multicolor UCLMMs should therefore become a promising assay platform for multiple mycotoxins.
Hobbs, Marcia M.; Sparling, P. Frederick; Cohen, Myron S.; Shafer, William M.; Deal, Carolyn D.; Jerse, Ann E.
2011-01-01
Experimental infection of male volunteers with Neisseria gonorrhoeae is safe and reproduces the clinical features of naturally acquired gonococcal urethritis. Human inoculation studies have helped define the natural history of experimental infection with two well-characterized strains of N. gonorrhoeae, FA1090 and MS11mkC. The human model has proved useful for testing the importance of putative gonococcal virulence factors for urethral infection in men. Studies with isogenic mutants have improved our understanding of the requirements for gonococcal LOS structures, pili, opacity proteins, IgA1 protease, and the ability of infecting organisms to obtain iron from human transferrin and lactoferrin during uncomplicated urethritis. The model also presents opportunities to examine innate host immune responses that may be exploited or improved in development and testing of gonococcal vaccines. Here we review results to date with human experimental gonorrhea. PMID:21734909
[A case of loxoprofen-induced pulmonary eosinophilia].
Ii, T; Doutsu, Y; Ashitani, J; Taniguchi, H; Mizuta, M; Toshimori, H; Matsukura, S
1992-05-01
A 65-year-old female suffering from lumbago, headache, and hypertension had been treated with nonsteroidal anti-inflammatory drugs (NSAIDs) and antihypertensive drugs. On June 13, 1990, 2 weeks after the commencement of loxoprofen administration, she developed cough and low grade fever. She was treated with antibiotics and NSAIDs without improvement. Laboratory data showed marked eosinophilia (2200/mm3), elevation of IgE (3090 IU/ml), and liver dysfunction. Her chest X-ray revealed no active lesion, but the percentage of eosinophils in BALF was elevated (38%). Because drug-induced eosinophilic pneumonia was suspected, all drugs were discontinued. Her symptoms improved and the abnormalities of laboratory data normalized. The lymphocyte stimulation test was weakly positive with three NSAIDs (loxoprofen, pranoprofen, and alminoprofen). The challenge test by loxoprofen reproduced eosinophilia and liver dysfunction, suggesting that she had loxoprofen-induced eosinophilic pneumonia. To our knowledge, this is the first reported case of loxoprofen-induced lung injury.
Access to patents as sources to musical acoustics inventions
NASA Astrophysics Data System (ADS)
Brock-Nannestad, George
2005-09-01
Patents are important sources for the development of any technology. The paper addresses modern methods of access to patent publications relating to musical acoustics, in particular the constructions of instruments and components for instruments, methods for tuning, methods for teaching, and measuring equipment. The patent publications available are, among others, from the U.S., England, France, Germany, Japan, Russia, and the date range is from ca. 1880 to the present day. The two main searchable websites use different classification systems in their approach, and by suitable combination of the information it is possible to target the search efficiently. The paper will demonstrate the recent transfer of inventions relating to physical instruments to electronic simulations, and the fact that most recent inventions were made by independent inventors. A specific example is given by discussing the proposals for improved pipe organ and violin constructions invented in Denmark in the 1930s by Jarnak based on patented improvements for telephone reproducers.
Faddeev-chiral unitary approach to the K-d scattering length
NASA Astrophysics Data System (ADS)
Mizutani, T.; Fayard, C.; Saghai, B.; Tsushima, K.
2013-03-01
Our earlier Faddeev three-body study of the K⁻-deuteron scattering length, A(K⁻d), is revisited here in light of recent developments on two fronts: (i) the improved chiral unitary approach to the theoretical description of the coupled K̄N-related channels at low energies, and (ii) the new and improved measurement from the SIDDHARTA Collaboration of the strong-interaction energy shift and width in the lowest K⁻-hydrogen atomic level. These two, in combination, have allowed us to produce a reliable two-body input to the three-body calculation. All available low-energy K⁻p observables are well reproduced, and predictions for the K̄N scattering lengths and amplitudes, (πΣ)⁰ invariant-mass spectra, as well as for A(K⁻d), are put forward and compared with results from other sources. The findings of the present work are expected to be useful in interpreting the forthcoming data from the CLAS, HADES, LEPS, and SIDDHARTA Collaborations.
Digital image analysis: improving accuracy and reproducibility of radiographic measurement.
Bould, M; Barnard, S; Learmonth, I D; Cunningham, J L; Hardy, J R
1999-07-01
To assess the accuracy and reproducibility of a digital image analyser and the human eye in measuring radiographic dimensions. We experimentally compared radiographic measurement using either an image analyser system or the human eye with a digital caliper. The assessment of total hip arthroplasty wear from radiographs relies on both the accuracy of radiographic images and the accuracy of radiographic measurement. Radiographs were taken of a slip gauge (30+/-0.00036 mm) and of a slip gauge with a femoral stem. The projected dimensions of the radiographic images were calculated by trigonometry. The radiographic dimensions were then measured by blinded observers using both techniques. For a single radiograph, the human eye was accurate to 0.26 mm and reproducible to +/-0.1 mm. In comparison, the digital image analyser system was accurate to 0.01 mm with a reproducibility of +/-0.08 mm. In an arthroplasty model, where the dimensions of an object were corrected for magnification by the known dimensions of a femoral head, the human eye was accurate to 0.19 mm, whereas the image analyser system was accurate to 0.04 mm. The digital image analysis system is up to 20 times more accurate than the human eye, and in an arthroplasty model the accuracy of measurement increases four-fold. We believe such image analysis may allow more accurate and reproducible measurement of wear from standard follow-up radiographs.
Lleó-Pérez, A; Ortuño-Soto, A; Rahhal, M S; Martínez-Soriano, F; Sanchis-Gimeno, J A
2004-01-01
To evaluate quantitatively the intraobserver reproducibility of measurements of the retinal nerve fiber layer (RNFL) in healthy subjects and an ocular hypertensive population using two nerve fiber analyzers. Sixty eyes of normal (n=30) and ocular hypertensive subjects (n=30) were consecutively recruited for this study and underwent a complete ophthalmologic examination and achromatic automated perimetry. RNFL were measured using scanning laser polarimeter (GDx-VCC) and optical coherence tomography (OCT Model 3000). Reproducibility of the RNFL measurements obtained with both nerve fiber analyzers were compared using the coefficient of variation. In both groups the authors found fair correlations between the two methods in all ratio and thickness parameters. The mean coefficient of variation for measurement of the variables ranged from 2.24% to 13.12% for GDx-VCC, and from 5.01% to 9.24% for OCT Model 3000. The authors could not detect any significant differences between healthy and ocular hypertensive eyes, although in normal eyes the correlations improved slightly. Nevertheless, the test-retest correlation was slightly better for GDx-VCC than for OCT Model 3000 (5.55% and 7.11%, respectively). Retinal mapping software of both nerve fiber analyzers allows reproducible measurement of RNFL in both healthy subjects and ocular hypertensive eyes, and shows fair correlations and good intraobserver reproducibility. However, in our study, GDx showed a better test-retest correlation.
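The coefficient of variation used above to compare the test-retest performance of the two instruments is straightforward to compute. The sketch below is a generic illustration with hypothetical RNFL thickness values, not the study's analysis software.

```python
# Intraobserver coefficient of variation (CV): the standard deviation of
# repeated measurements on the same subject, expressed as a percentage of
# their mean. Lower CV means better test-retest reproducibility.

import math

def coefficient_of_variation(repeats):
    """CV (%) of one subject's repeated measurements (sample SD / mean)."""
    n = len(repeats)
    mean = sum(repeats) / n
    var = sum((x - mean) ** 2 for x in repeats) / (n - 1)
    return 100.0 * math.sqrt(var) / mean

# Three hypothetical repeat scans of one eye's mean RNFL thickness (micrometres)
cv = coefficient_of_variation([98.0, 101.0, 99.5])
```

Averaging this per-subject CV across a cohort gives the kind of summary figure reported above (e.g. 5.55% for GDx-VCC versus 7.11% for OCT).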
1981-07-01
process is observed over all of (0,1], the reproducing kernel Hilbert space (RKHS) techniques developed by Parzen (1961a, 1961b) may be used to construct... covariance kernel, R, for the process (1.1) is the reproducing kernel for a reproducing kernel Hilbert space (RKHS) which will be denoted as H(R) (c.f. ...(2.6), it is known that (c.f. Eubank, Smith and Smith (1981a, 1981b)), i) H(R) is a Hilbert function space consisting of functions which satisfy, for f ∈ H
All about Me: Reproducible Activity Sheets To Develop Self-Esteem in Your Students.
ERIC Educational Resources Information Center
Palomares, Susanna
This document contains a set of reproducible activity sheets for teachers to use in enhancing the self-esteem of their students. Designed to supplement other approaches being used by teachers, the activities in this book can be used to infuse esteem-building activities into the core curriculum. The activities are organized around several…
Rodriguez, Carly A.; Smith, Emily R.; Villamor, Eduardo; Zavaleta, Nelly; Respicio-Torres, Graciela; Contreras, Carmen; Perea, Sara; Jimenez, Judith; Tintaya, Karen; Lecca, Leonid; Murray, Megan B.; Franke, Molly F.
2017-01-01
Tools to assess intake among children in Latin America are limited. We developed and assessed the reproducibility and validity of a semi-quantitative food frequency questionnaire (FFQ) administered to children, adolescents, and their caregivers in Lima, Peru. We conducted 24-h diet recalls (DRs) and focus groups to develop a locally-tailored FFQ prototype for children aged 0–14 years. To validate the FFQ, we administered two FFQs and three DRs to children and/or their caregivers (N = 120) over six months. We examined FFQ reproducibility by quartile agreement and Pearson correlation coefficients, and validity by quartile agreement and correlation with DRs. For reproducibility, quartile agreement ranged from 60–77% with correlations highest for vitamins A and C (0.31). Age-adjusted correlations for the mean DR and the second-administered FFQ were highest in the 0–7 age group, in which the majority of caregivers completed the FFQ on behalf of the child (total fat; 0.67) and in the 8–14 age group, in which both the child and caregiver completed the FFQ together (calcium, niacin; 0.54); correlations were <0.10 for most nutrients in the 8–14 age group in which the caregiver completed the FFQ on the child’s behalf. The FFQ was reproducible and the first developed and validated to assess various nutrients in children and adolescents in Peru. PMID:29036893
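The two reproducibility measures named above, Pearson correlation between the two FFQ administrations and same-quartile classification agreement, can be sketched as follows. The data here are hypothetical intakes, not the Lima cohort's measurements.

```python
# Hedged illustration of FFQ reproducibility metrics: correlate the two
# administrations and count how often a participant lands in the same
# intake quartile both times.

import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def quartile(values, v):
    """0-based quartile of v within the distribution of values."""
    below = sum(1 for u in values if u < v)
    return min(3, 4 * below // len(values))

def same_quartile_fraction(ffq1, ffq2):
    """Fraction of participants classified into the same quartile twice."""
    hits = sum(1 for a, b in zip(ffq1, ffq2)
               if quartile(ffq1, a) == quartile(ffq2, b))
    return hits / len(ffq1)

# Hypothetical energy intakes (kcal/day) at the two administrations
ffq1 = [1200, 1500, 1800, 2100, 1300, 1600, 1900, 2200]
ffq2 = [1250, 1450, 1850, 2000, 1350, 1700, 1800, 2300]
r = pearson(ffq1, ffq2)
agreement = same_quartile_fraction(ffq1, ffq2)
```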
Reproducibility2020: Progress and priorities
Freedman, Leonard P.; Venugopalan, Gautham; Wisman, Rosann
2017-01-01
The preclinical research process is a cycle of idea generation, experimentation, and reporting of results. The biomedical research community relies on the reproducibility of published discoveries to create new lines of research and to translate research findings into therapeutic applications. Since 2012, when scientists from Amgen reported that they were able to reproduce only 6 of 53 "landmark" preclinical studies, the biomedical research community has been discussing the scale of the reproducibility problem and developing initiatives to address critical challenges. The Global Biological Standards Institute (GBSI) released the "Case for Standards" in 2013, one of the first comprehensive reports to address the rising concern of irreproducible biomedical research. Further attention was drawn to issues that limit scientific self-correction, including reporting and publication bias, underpowered studies, lack of open access to methods and data, and lack of clearly defined standards and guidelines in areas such as reagent validation. To evaluate the progress made towards reproducibility since 2013, GBSI identified and examined initiatives designed to advance quality and reproducibility. Through this process, we identified key roles for funders, journals, researchers and other stakeholders and recommended actions for future progress. This paper describes our findings and conclusions. PMID:28620458
IMPROVEMENTS IN EPOXY RESIN EMBEDDING METHODS
Luft, John H.
1961-01-01
Epoxy embedding methods of Glauert and Kushida have been modified so as to yield rapid, reproducible, and convenient embedding methods for electron microscopy. The sections are robust and tissue damage is less than with methacrylate embedding. PMID:13764136
Evolution of Magnetized Liner Inertial Fusion (MagLIF) Targets
Fooks, J. A.; Carlson, L. C.; Fitzsimmons, P.; ...
2017-12-19
Here, the magnetized liner inertial fusion (MagLIF) experimental campaign conducted at the University of Rochester’s Laboratory for Laser Energetics (LLE) has evolved significantly since its start in 2014. Scientific requirements and OMEGA EP system technology both have progressed, resulting in necessary and available updates to the target design. These include, but are not limited to: optimizing target dimensions and aspect ratios to maximize survival at desired pressures; coating target components to enhance physics diagnosis; precision-machining diagnostic windows along the axis of the target; improving fiducial placement reproducibility and reducing subsequent assembly time by 50%; and implementing gas-pressure transducers on the targets. In addition, target fabrication techniques have changed and improved, allowing for simpler target reproducibility and decreased assembly time. To date, eleven variations of targets have been fabricated, with successful target fielding ranging from 1 to 20 atm internal pressure and a maximum survivability of 33 atm.
Δ isobars and nuclear saturation
NASA Astrophysics Data System (ADS)
Ekström, A.; Hagen, G.; Morris, T. D.; Papenbrock, T.; Schwartz, P. D.
2018-02-01
We construct a nuclear interaction in chiral effective field theory with explicit inclusion of the Δ-isobar Δ(1232) degree of freedom at all orders up to next-to-next-to-leading order (NNLO). We use pion-nucleon (πN) low-energy constants (LECs) from a Roy-Steiner analysis of πN scattering data, optimize the LECs in the contact potentials up to NNLO to reproduce low-energy nucleon-nucleon scattering phase shifts, and constrain the three-nucleon interaction at NNLO to reproduce the binding energy and point-proton radius of 4He. For heavier nuclei we use the coupled-cluster method to compute binding energies, radii, and neutron skins. We find that radii and binding energies are much improved for interactions with explicit inclusion of Δ(1232), while Δ-less interactions produce nuclei that are not bound with respect to breakup into α particles. The saturation of nuclear matter is significantly improved, and its symmetry energy is consistent with empirical estimates.
The development of accurate and high quality radiotherapy treatment delivery
NASA Astrophysics Data System (ADS)
Griffiths, Susan E.
Accurate radiotherapy delivery is required for curing cancer. Historical radiotherapy accuracy studies at Leeds (1983-1991) are discussed in the context of a period when radiographers were not involved in practice design. The seminal research was unique in being led by a radiographer practitioner, and in prospectively studying the accuracy of different techniques within one department. The viability of alignment of treatment beams with marks painted on a patient's skin varied daily, and, using film, I showed that the alignment of treatment on anatomy varied. I then led 6 sequential studies with collaborating oncologists. Unique outcomes were in identifying the origins of treatment inaccuracies, and in implementing and evidencing changes in multi-disciplinary practice, thus improving accuracy and reproducibility generally and achieving accuracy for the pelvis to within current norms. Innovations included: discontinuation of painted skin marks and development of whole-body patient positioning using lasers, tattoos, and standardised supports; unification of set-up conditions through planning and treatment; planning normal tissue margins round target tissue to allow for inaccuracies (1985); improved manual shielding methods; changed equipment usage, its quality assurance and design; and influence on the development of portal imaging and image analysis. Consequences and current implications: the research, still cited internationally, contributed to clinical management of lymphoma, and critically underpins contemporary practice. It led to my becoming the first radiographer invited into multi-disciplinary collaborative work, to advise in the first multi-centre clinical trials to consider treatment delivery accuracy, contribute to books written from within other disciplines and inform guidelines for good practice, so helping to improve practices, with recent publications. I thus led my profession into research activity.
Later work included development of a national staffing formula for radiotherapy Centres, and contributing to the evidence-base for improved National radiotherapy resourcing. I recently researched and developed a textbook (second edition) on quality in treatment delivery.
Synthetic Biology Open Language (SBOL) Version 2.2.0.
Cox, Robert Sidney; Madsen, Curtis; McLaughlin, James Alastair; Nguyen, Tramy; Roehner, Nicholas; Bartley, Bryan; Beal, Jacob; Bissell, Michael; Choi, Kiri; Clancy, Kevin; Grünberg, Raik; Macklin, Chris; Misirli, Goksel; Oberortner, Ernst; Pocock, Matthew; Samineni, Meher; Zhang, Michael; Zhang, Zhen; Zundel, Zach; Gennari, John H; Myers, Chris; Sauro, Herbert; Wipat, Anil
2018-04-02
Synthetic biology builds upon the techniques and successes of genetics, molecular biology, and metabolic engineering by applying engineering principles to the design of biological systems. The field still faces substantial challenges, including long development times, high rates of failure, and poor reproducibility. One method to ameliorate these problems would be to improve the exchange of information about designed systems between laboratories. The synthetic biology open language (SBOL) has been developed as a standard to support the specification and exchange of biological design information in synthetic biology, filling a need not satisfied by other pre-existing standards. This document details version 2.2.0 of SBOL that builds upon version 2.1.0 published in last year's JIB special issue. In particular, SBOL 2.2.0 includes improved description and validation rules for genetic design provenance, an extension to support combinatorial genetic designs, a new class to add non-SBOL data as attachments, a new class for genetic design implementations, and a description of a methodology to describe the entire design-build-test-learn cycle within the SBOL data model.
NASA Technical Reports Server (NTRS)
Revankar, Vithal; Hlavacek, Vladimir
1991-01-01
The chemical vapor deposition (CVD) synthesis of fibers capable of effectively reinforcing intermetallic matrices at elevated temperatures, for potential applications in high-temperature composite materials, is described. This process was used because of its advantages over other fiber synthesis processes. It is extremely important to produce these fibers with good, reproducible, and controlled growth rates; however, the complex interplay of mass and energy transfer, blended with the fluid dynamics, makes this a formidable task. The design and development of a CVD reactor assembly and system to synthesize TiB2, CrB, B4C, and TiC fibers was performed. Residual thermal analysis for estimating stresses arising from thermal expansion mismatch was carried out, and various techniques for improving the mechanical properties of the fibers were elaborated. The crystal structure and orientation of the TiB2 fiber are discussed. An overall view of the CVD process to develop CrB2, TiB2, and other high-performance ceramic fibers is presented.
Donor insemination: eugenic and feminist implications.
Hanson, F A
2001-09-01
One concern regarding developments in genetics is that, when techniques such as genetic engineering become safe and affordable, people will use them for positive eugenics: to "improve" their offspring by endowing them with exceptional qualities. Another is whether new reproductive technologies are being used to improve the condition of women or as the tools of a patriarchal system that appropriates female functions to itself and exploits women to further its own ends. Donor insemination is relevant to both of these issues. The degree to which people have used donor insemination in the past for positive eugenic purposes may give some insight into the likelihood of developing technologies being so used in the future. Donor insemination provides women with the opportunity to reproduce with only the most remote involvement of a man. To what degree do women take advantage of this to liberate themselves from male dominance? Through questionnaires and interviews, women who have used donor insemination disclosed their criteria for selecting sperm donors. The results are analyzed for the prevalence of positive eugenic criteria in the selection process and women's attitudes toward minimizing the male role in reproduction.
Modeling seasonal variability of fecal coliform in natural surface waters using the modified SWAT
NASA Astrophysics Data System (ADS)
Cho, Kyung Hwa; Pachepsky, Yakov A.; Kim, Minjeong; Pyo, JongCheol; Park, Mi-Hyun; Kim, Young Mo; Kim, Jung-Woo; Kim, Joon Ha
2016-04-01
Fecal coliforms are indicators of pathogens, and understanding their fate and transport in surface waters is therefore important for protecting drinking water sources and public health. We compiled fecal coliform observations from four different sites in the USA and Korea and found a seasonal variability with a significant connection to temperature levels. In all observations, fecal coliform concentrations were relatively higher in summer and lower during the winter season. This could be explained by the seasonal dominance of either growth or die-off of bacteria in soil and in-stream. Existing hydrologic models, however, have limitations in simulating the seasonal variability of fecal coliform. The soil and in-stream bacterial modules of the Soil and Water Assessment Tool (SWAT) model are oversimplified in that they exclude simulations of alternating bacterial growth. This study develops a new bacteria subroutine for SWAT in an attempt to improve its prediction accuracy. We introduced critical temperatures as a parameter to simulate the onset of bacterial growth/die-off and to reproduce the seasonal variability of bacteria. The module developed in this study will improve modeling for environmental management schemes.
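The critical-temperature idea above can be sketched as first-order kinetics whose sign switches at a threshold. The rate constants and critical temperature below are assumed illustrative values, not the parameters of the authors' SWAT module.

```python
# Minimal sketch of temperature-switched bacterial kinetics: below a
# critical temperature the fecal coliform population decays (die-off
# dominates); above it, net growth dominates.

import math

def step_concentration(conc, temp_c, dt_days,
                       t_crit=15.0, k_growth=0.3, k_dieoff=0.5):
    """Advance concentration by dt_days of first-order growth or die-off,
    chosen by comparing the water temperature to the critical value."""
    k = k_growth if temp_c >= t_crit else -k_dieoff
    return conc * math.exp(k * dt_days)

summer = step_concentration(100.0, 25.0, 1.0)  # warm water: net growth
winter = step_concentration(100.0, 5.0, 1.0)   # cold water: die-off
```

Run over a seasonal temperature cycle, this reproduces the higher-in-summer, lower-in-winter pattern described above.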
Murray, Christopher J L; Laakso, Thomas; Shibuya, Kenji; Hill, Kenneth; Lopez, Alan D
2007-09-22
Global efforts have increased the accuracy and timeliness of estimates of under-5 mortality; however, these estimates fail to use all data available, do not use transparent and reproducible methods, do not distinguish predictions from measurements, and provide no indication of uncertainty around point estimates. We aimed to develop new reproducible methods and reanalyse existing data to elucidate detailed time trends. We merged available databases, added to them when possible, and then applied Loess regression to estimate past trends and forecast to 2015 for 172 countries. We developed uncertainty estimates based on different model specifications and estimated levels and trends in neonatal, post-neonatal, and childhood mortality. Global under-5 mortality has fallen from 110 (109-110) per 1000 in 1980 to 72 (70-74) per 1000 in 2005. Child deaths worldwide have decreased from 13.5 (13.4-13.6) million in 1980 to an estimated 9.7 (9.5-10.0) million in 2005. Global under-5 mortality is expected to decline by 27% from 1990 to 2015, substantially less than the target of Millennium Development Goal 4 (MDG4) of a 67% decrease. Several regions in Latin America, north Africa, the Middle East, Europe, and southeast Asia have had consistent annual rates of decline in excess of 4% over 35 years. Global progress on MDG4 is dominated by slow reductions in sub-Saharan Africa, which also has the slowest rates of decline in fertility. Globally, we are not doing a better job of reducing child mortality now than we were three decades ago. Further improvements in the quality and timeliness of child-mortality measurements should be possible by more fully using existing datasets and applying standard analytical strategies.
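The Loess regression used above is locally weighted linear regression; a pure-Python sketch with the standard tricube kernel follows. The smoothing span and data are illustrative, not the study's mortality database or analysis code.

```python
# Hedged sketch of Loess: for each query point, fit a straight line by
# weighted least squares, with weights falling off (tricube kernel) over
# the k nearest neighbours, and evaluate the line at the query point.

import math

def loess_point(xs, ys, x0, span=0.5):
    """Loess estimate at x0 using a fraction `span` of the data."""
    n = len(xs)
    k = max(2, int(span * n))
    h = sorted(abs(x - x0) for x in xs)[k - 1] or 1e-12  # local bandwidth
    w = [(1 - min(1.0, abs(x - x0) / h) ** 3) ** 3 for x in xs]
    sw = sum(w)
    mx = sum(wi * x for wi, x in zip(w, xs)) / sw
    my = sum(wi * y for wi, y in zip(w, ys)) / sw
    num = sum(wi * (x - mx) * (y - my) for wi, x, y in zip(w, xs, ys))
    den = sum(wi * (x - mx) ** 2 for wi, x in zip(w, xs)) or 1e-12
    return my + (num / den) * (x0 - mx)
```

Evaluating `loess_point` over a grid of years yields a smoothed mortality trend; extrapolating past the last observation is how forecasts to 2015 would be distinguished from measurements.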
Development of a Consistent and Reproducible Porcine Scald Burn Model
Kempf, Margit; Kimble, Roy; Cuttle, Leila
2016-01-01
There are very few porcine burn models that replicate scald injuries similar to those encountered by children. We have developed a robust porcine burn model capable of creating reproducible scald burns for a wide range of burn conditions. The study was conducted with juvenile Large White pigs, creating replicates of burn combinations: 50°C for 1, 2, 5 and 10 minutes and 60°C, 70°C, 80°C and 90°C for 5 seconds. Visual wound examination, biopsies and Laser Doppler Imaging were performed at 1 and 24 hours and at 3 and 7 days post-burn. A consistent water temperature was maintained within the scald device for long durations (49.8 ± 0.1°C when set at 50°C). The macroscopic and histologic appearance was consistent between replicates of burn conditions. For 50°C water, 10 minute duration burns showed significantly deeper tissue injury than all shorter durations at 24 hours post-burn (p ≤ 0.0001), with damage seen to increase until day 3 post-burn. For 5 second duration burns, by day 7 post-burn the 80°C and 90°C scalds had damage detected significantly deeper in the tissue than the 70°C scalds (p ≤ 0.001). A reliable and safe model of porcine scald burn injury has been successfully developed. The novel apparatus with continually refreshed water improves consistency of scald creation for long exposure times. This model allows the pathophysiology of scald burn wound creation and progression to be examined. PMID:27612153
NASA Astrophysics Data System (ADS)
Chirra, Prathyush; Leo, Patrick; Yim, Michael; Bloch, B. Nicolas; Rastinehad, Ardeshir R.; Purysko, Andrei; Rosen, Mark; Madabhushi, Anant; Viswanath, Satish
2018-02-01
The recent advent of radiomics has enabled the development of prognostic and predictive tools which use routine imaging, but a key question that still remains is how reproducible these features may be across multiple sites and scanners. This is especially relevant in the context of MRI data, where signal intensity values lack tissue-specific, quantitative meaning, as well as being dependent on acquisition parameters (magnetic field strength, image resolution, type of receiver coil). In this paper we present the first empirical study of the reproducibility of 5 different radiomic feature families in a multi-site setting; specifically, for characterizing prostate MRI appearance. Our cohort comprised 147 patient T2w MRI datasets from 4 different sites, all of which were first pre-processed to correct for acquisition-related artifacts such as bias field, differing voxel resolutions, as well as intensity drift (non-standardness). 406 3D voxel-wise radiomic features were extracted and evaluated in a cross-site setting to determine how reproducible they were within a relatively homogeneous non-tumor tissue region, using 2 different measures of reproducibility: the Multivariate Coefficient of Variation and the Instability Score. Our results demonstrated that Haralick features were most reproducible between all 4 sites. By comparison, Laws features were among the least reproducible between sites, as well as performing highly variably across their entire parameter space. Similarly, the Gabor feature family demonstrated good cross-site reproducibility, but for certain parameter combinations alone. These trends indicate that despite extensive pre-processing, only a subset of radiomic features and associated parameters may be reproducible enough for use within radiomics-based machine learning classifier schemes.
Cui, Xiao-Yan; Huo, Zhong-Gang; Xin, Zhong-Hua; Tian, Xiao; Zhang, Xiao-Dong
2013-07-01
Three-dimensional (3D) copying of artificial ears and pistol printing are pushing laser three-dimensional copying technology into a new era. Laser three-dimensional scanning is a fresh field in laser application and plays an irreplaceable part in three-dimensional copying; its accuracy is the highest among all present copying techniques. The reproducibility degree marks the geometric agreement of the copied object with the original, and is the most important quality index in laser three-dimensional copying. In the present paper, the error of laser three-dimensional copying was analyzed. The conclusion is that the processing of the laser-scanned point cloud is the key technique for reducing the error and increasing the reproducibility degree. The main innovation of this paper is as follows: on the basis of traditional ant colony optimization, the rational ant colony optimization algorithm proposed by the authors was applied to laser three-dimensional copying as a new algorithm and was put into practice. Compared with the customary algorithm, rational ant colony optimization shows distinct advantages in the data processing of laser three-dimensional copying, reducing the error and increasing the reproducibility degree of the copy.
Theory, development, and applicability of the surface water hydrologic model CASC2D
NASA Astrophysics Data System (ADS)
Downer, Charles W.; Ogden, Fred L.; Martin, William D.; Harmon, Russell S.
2002-02-01
Numerical tests indicate that Hortonian runoff mechanisms benefit from scaling effects that non-Hortonian runoff mechanisms do not share. This potentially makes Hortonian watersheds more amenable to physically based modelling provided that the physically based model employed properly accounts for rainfall distribution and initial soil moisture conditions, to which these types of model are highly sensitive. The distributed Hortonian runoff model CASC2D has been developed and tested for the US Army over the past decade. The purpose of the model is to provide the Army with superior predictions of runoff and stream-flow compared with the standard lumped parameter model HEC-1. The model is also to be used to help minimize negative effects on the landscape caused by US armed forces training activities. Development of the CASC2D model is complete and the model has been tested and applied at several locations. These applications indicate that the model can realistically reproduce hydrographs when properly applied. These applications also indicate that there may be many situations where the model is inadequate. Because of this, the Army is pursuing development of a new model, GSSHA, that will provide improved numerical stability and incorporate additional stream-flow-producing mechanisms and improved hydraulics.
Richardson, Peter M; Jackson, Scott; Parrott, Andrew J; Nordon, Alison; Duckett, Simon B; Halse, Meghan E
2018-07-01
Signal amplification by reversible exchange (SABRE) is a hyperpolarisation technique that catalytically transfers nuclear polarisation from parahydrogen, the singlet nuclear isomer of H2, to a substrate in solution. The SABRE exchange reaction is carried out in a polarisation transfer field (PTF) of tens of gauss before transfer to a stronger magnetic field for nuclear magnetic resonance (NMR) detection. In the simplest implementation, polarisation transfer is achieved by shaking the sample in the stray field of a superconducting NMR magnet. Although convenient, this method suffers from limited reproducibility and cannot be used with NMR spectrometers that do not have appreciable stray fields, such as benchtop instruments. Here, we use a simple hand-held permanent magnet array to provide the necessary PTF during sample shaking. We find that the use of this array provides a 25% increase in SABRE enhancement over the stray field approach, while also providing improved reproducibility. Arrays with a range of PTFs were tested, and the PTF-dependent SABRE enhancements were found to be in excellent agreement with comparable experiments carried out using an automated flow system where an electromagnet is used to generate the PTF. We anticipate that this approach will improve the efficiency and reproducibility of SABRE experiments carried out using manual shaking and will be particularly useful for benchtop NMR, where a suitable stray field is not readily accessible. The ability to construct arrays with a range of PTFs will also enable the rapid optimisation of SABRE enhancement as a function of PTF for new substrate and catalyst systems. © 2017 The Authors Magnetic Resonance in Chemistry Published by John Wiley & Sons Ltd.
Aagten-Murphy, David; Cappagli, Giulia; Burr, David
2014-03-01
Expert musicians are able to time their actions accurately and consistently during a musical performance. We investigated how musical expertise influences the ability to reproduce auditory intervals and how this generalises across different techniques and sensory modalities. We first compared various reproduction strategies and interval lengths, to examine the effects in general and to optimise experimental conditions for testing the effect of music, and found that the effects were robust and consistent across different paradigms. Focussing on a 'ready-set-go' paradigm, subjects reproduced time intervals drawn from distributions varying in total length (176, 352 or 704 ms) or in the number of discrete intervals within the total length (3, 5, 11 or 21 discrete intervals). Overall, Musicians performed more veridically than Non-Musicians, and all subjects reproduced auditory-defined intervals more accurately than visually-defined intervals. However, Non-Musicians, particularly with visual stimuli, consistently exhibited a substantial and systematic regression towards the mean interval. When subjects judged intervals from distributions of longer total length they tended to regress more towards the mean, while the ability to discriminate between discrete intervals within the distribution had little influence on subject error. These results are consistent with a Bayesian model that minimizes reproduction errors by incorporating a central tendency prior weighted by the subject's own temporal precision relative to the current distribution of intervals. Finally, a strong correlation was observed between the duration of formal musical training and total reproduction errors in both modalities (accounting for 30% of the variance). Taken together these results demonstrate that formal musical training improves temporal reproduction, and that this improvement transfers from audition to vision.
They further demonstrate the flexibility of sensorimotor mechanisms in adapting to different task conditions to minimise temporal estimation errors. © 2013.
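The Bayesian account described above reduces, in its simplest Gaussian form, to precision-weighted averaging of the observed interval with the mean of the interval distribution; a minimal sketch (the specific model in the paper may differ, and all values here are illustrative):

```python
def bayes_reproduction(t_obs, prior_mean, sigma_obs, sigma_prior):
    """Posterior-mean interval estimate: Gaussian likelihood x Gaussian prior.

    t_obs       : observed interval [ms]
    prior_mean  : mean of the current interval distribution [ms]
    sigma_obs   : subject's sensory/temporal noise [ms]
    sigma_prior : spread of the interval distribution [ms]
    """
    # weight on the observation: high own precision -> little regression
    w = sigma_prior**2 / (sigma_prior**2 + sigma_obs**2)
    return w * t_obs + (1 - w) * prior_mean
```

This reproduces the qualitative pattern reported: noisier observers (larger `sigma_obs`, e.g. Non-Musicians with visual stimuli) regress more strongly towards the mean of the distribution.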
Precipitate statistics in an Al-Mg-Si-Cu alloy from scanning precession electron diffraction data
NASA Astrophysics Data System (ADS)
Sunde, J. K.; Paulsen, Ø.; Wenner, S.; Holmestad, R.
2017-09-01
The key microstructural feature providing strength to age-hardenable Al alloys is nanoscale precipitates. Alloy development requires a reliable statistical assessment of these precipitates, in order to link the microstructure with material properties. Here, it is demonstrated that scanning precession electron diffraction combined with computational analysis enables the semi-automated extraction of precipitate statistics in an Al-Mg-Si-Cu alloy. Among the main findings is the precipitate number density, which agrees well with that obtained by a conventional method based on manual counting and measurement. By virtue of its objectivity in data analysis, our methodology is therefore seen as an advantageous alternative to existing routines, offering reproducibility and efficiency in alloy statistics. Additional results include improved qualitative information on phase distributions. The developed procedure is generic and applicable to any material containing nanoscale precipitates.
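The number density mentioned above is, in the simplest stereological form, the particle count divided by the probed foil volume; a minimal sketch (the paper's actual correction scheme is not given in the abstract, and the surface-truncation term here is one common convention):

```python
def precipitate_number_density(n_counted, area_nm2, thickness_nm,
                               mean_length_nm=0.0):
    """Precipitate number density [nm^-3] from a TEM/SPED map.

    n_counted      : number of precipitates counted in the mapped region
    area_nm2       : mapped area [nm^2]
    thickness_nm   : local foil thickness [nm]
    mean_length_nm : optional mean precipitate length [nm], added to the
                     thickness to correct for particles cut by the surfaces
    """
    volume = area_nm2 * (thickness_nm + mean_length_nm)
    return n_counted / volume
```

For example, 100 precipitates in a 1000 nm x 1000 nm map of a 100 nm thick foil give 1e-6 nm^-3, i.e. 1000 precipitates per µm^3.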
Label-Free Toxin Detection by Means of Time-Resolved Electrochemical Impedance Spectroscopy
Chai, Changhoon; Takhistov, Paul
2010-01-01
The real-time detection of trace concentrations of biological toxins requires significant improvement over the detection methods reported in the literature. To develop a highly sensitive and selective detection device, it is necessary to determine the optimal measuring conditions for the electrochemical sensor in three domains: time, frequency, and polarization potential. In this work we utilized time-resolved electrochemical impedance spectroscopy for the detection of trace concentrations of Staphylococcus enterotoxin B (SEB). An anti-SEB antibody was attached to the nano-porous aluminum surface using a 3-aminopropyltriethoxysilane/glutaraldehyde coupling system. This immobilization method allows fabrication of a highly reproducible and stable sensing device. Using the developed immobilization procedure and an optimized detection regime, it is possible to determine the presence of SEB at levels as low as 10 pg/mL within 15 minutes. PMID:22315560
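The abstract does not specify the equivalent-circuit model used to interpret the impedance spectra; a common starting point in electrochemical impedance spectroscopy is a simplified Randles cell, sketched below with purely illustrative component values:

```python
import math

def randles_impedance(freq_hz, r_s, r_ct, c_dl):
    """Impedance of a simplified Randles cell: R_s in series with (R_ct || C_dl).

    r_s  : solution resistance [ohm]
    r_ct : charge-transfer resistance [ohm] (sensitive to antibody-toxin binding)
    c_dl : double-layer capacitance [F]
    """
    omega = 2 * math.pi * freq_hz
    z_c = 1.0 / (1j * omega * c_dl)            # capacitive branch
    z_parallel = (r_ct * z_c) / (r_ct + z_c)   # R_ct in parallel with C_dl
    return r_s + z_parallel
```

At low frequency the magnitude approaches R_s + R_ct, at high frequency it collapses to R_s; sweeping frequency (and, as in this work, tracking the spectrum over time and polarization potential) lets binding events be read out as a change in R_ct.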