Sample records for experimental protocol designed

  1. Latency correction of event-related potentials between different experimental protocols

    NASA Astrophysics Data System (ADS)

    Iturrate, I.; Chavarriaga, R.; Montesano, L.; Minguez, J.; Millán, JdR

    2014-06-01

    Objective. A fundamental issue in EEG event-related potentials (ERPs) studies is the amount of data required to have an accurate ERP model. This also impacts the time required to train a classifier for a brain-computer interface (BCI). This issue is mainly due to the poor signal-to-noise ratio and the large fluctuations of the EEG caused by several sources of variability. One of these sources is directly related to the experimental protocol or application designed, and may affect the amplitude or latency of ERPs. This usually prevents BCI classifiers from generalizing among different experimental protocols. In this paper, we analyze the effect of the amplitude and the latency variations among different experimental protocols based on the same type of ERP. Approach. We present a method to analyze and compensate for the latency variations in BCI applications. The algorithm has been tested on two widely used ERPs (P300 and observation error potentials), in three experimental protocols in each case. We report the ERP analysis and single-trial classification. Main results. The results obtained show that the designed experimental protocols significantly affect the latency of the recorded potentials but not the amplitudes. Significance. These results show how the use of latency-corrected data can be used to generalize the BCIs, reducing the calibration time when facing a new experimental protocol.
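
A minimal illustration of the general idea of latency correction (aligning single-trial epochs to a template by cross-correlation); this is a generic sketch, not the authors' algorithm, and the signal shapes and shift range below are invented for the example.

```python
# Sketch: estimate per-trial latency jitter against a template and undo it.
import numpy as np

def latency_align(trials, template, max_shift=50):
    """trials: (n_trials, n_samples); template: (n_samples,).
    Returns latency-corrected trials and the estimated per-trial lags (samples)."""
    aligned = np.empty_like(trials)
    lags = np.zeros(trials.shape[0], dtype=int)
    for i, x in enumerate(trials):
        scores = [np.dot(np.roll(x, -lag), template)
                  for lag in range(-max_shift, max_shift + 1)]
        best = int(np.argmax(scores)) - max_shift   # lag with maximal correlation
        lags[i] = best
        aligned[i] = np.roll(x, -best)              # shift the trial back into register
    return aligned, lags

# Toy usage: noisy trials whose peak latency jitters around a Gaussian template.
rng = np.random.default_rng(0)
t = np.arange(256)
template = np.exp(-0.5 * ((t - 128) / 10.0) ** 2)
trials = np.array([np.roll(template, int(rng.integers(-20, 21)))
                   + 0.1 * rng.standard_normal(256) for _ in range(40)])
aligned, lags = latency_align(trials, template)
```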

  2. Design and Evaluation of Complex Moving HIFU Treatment Protocols

    NASA Astrophysics Data System (ADS)

    Kargl, Steven G.; Andrew, Marilee A.; Kaczkowski, Peter J.; Brayman, Andrew A.; Crum, Lawrence A.

    2005-03-01

    The use of moving high-intensity focused ultrasound (HIFU) treatment protocols is of interest in achieving efficient formation of large-volume thermal lesions in tissue. Judicious protocol design is critical in order to avoid collateral damage to healthy tissues outside the treatment zone. A KZK-BHTE model, extended to simulate multiple, moving scans in tissue, is used to investigate protocol design considerations. Prediction and experimental observations are presented which 1) validate the model, 2) illustrate how to assess the effects of acoustic nonlinearity, and 3) demonstrate how to assess and control collateral damage such as prefocal lesion formation and lesion formation resulting from thermal conduction without direct HIFU exposure. Experimental data consist of linear and circular scan protocols delivered over a range of exposure regimes in ex vivo bovine liver.
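
As background to the collateral-damage assessments described, here is a minimal sketch of the standard CEM43 thermal-dose metric commonly paired with bioheat (BHTE) simulations; it is illustrative only and is not the authors' KZK-BHTE code.

```python
# CEM43: cumulative equivalent minutes at 43 C for a temperature-time history.
import numpy as np

def cem43(temps_c, dt_s):
    """temps_c: temperature history in deg C, sampled every dt_s seconds.
    Uses the common convention R = 0.5 above 43 C and R = 0.25 below."""
    temps_c = np.asarray(temps_c, dtype=float)
    r = np.where(temps_c >= 43.0, 0.5, 0.25)
    return float(np.sum(r ** (43.0 - temps_c)) * dt_s / 60.0)

# Example: 60 s at 50 C corresponds to ~128 equivalent minutes at 43 C.
print(cem43([50.0] * 60, dt_s=1.0))
```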

  3. 75 FR 23775 - Agency Information Collection Activities; Proposed Collection; Comment Request; Experimental...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-04

    ... for research entitled ``Experimental Study of Patient Information Prototypes.'' This study is designed..., the working group refined several prototypes and designed a study to investigate the usefulness of... Design and Protocol The study is experimental and will have two independent variables in a 3 x 2 design...

  4. Selection for production-related traits in Pelargonium zonale: improved design and analysis make all the difference

    PubMed Central

    Molenaar, Heike; Glawe, Martin; Boehm, Robert; Piepho, Hans-Peter

    2017-01-01

    Ornamental plant variety improvement is limited by current phenotyping approaches and neglected use of experimental designs. The present study was conducted to show the benefits of using an experimental design and corresponding analysis in ornamental breeding regarding simulated response to selection in Pelargonium zonale for production-related traits. This required establishment of phenotyping protocols for root formation and stem cutting counts, with which 974 genotypes were assessed in a two-phase experimental design. The present paper evaluates this protocol. The possibility of varietal improvement through indirect selection on secondary traits such as branch count and flower count was assessed by genetic correlations. Simulated response to selection varied greatly, depending on the genotypic variances of the breeding population and traits. A varietal improvement of over 20% is possible for stem cutting count, root formation, branch count and flower count. In contrast, indirect selection of stem cutting count by branch count or flower count was found to be ineffective. The established phenotypic protocols and two-phase experimental designs are valuable tools for breeding of P. zonale. PMID:28243453
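
For readers unfamiliar with simulated response to selection, a generic sketch based on the textbook breeder's equation follows; the heritabilities, genetic correlation and selection intensity are hypothetical and are not values from the study.

```python
# Expected direct and correlated response to truncation selection.
import math

def direct_response(i, h2, sigma_p):
    """R = i * h^2 * sigma_P for selection with intensity i on the trait itself."""
    return i * h2 * sigma_p

def correlated_response(i, h_x, h_y, r_g, sigma_p_y):
    """CR_y = i * h_x * h_y * r_g * sigma_P(y): response in trait y when selecting
    on trait x, given square-root heritabilities and genetic correlation r_g."""
    return i * h_x * h_y * r_g * sigma_p_y

# Hypothetical numbers for illustration only:
i = 1.755                                    # selection intensity, top 10% selected
print(direct_response(i, h2=0.4, sigma_p=5.0))
print(correlated_response(i, h_x=math.sqrt(0.3), h_y=math.sqrt(0.4),
                          r_g=0.2, sigma_p_y=5.0))
```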

  5. Selection for production-related traits in Pelargonium zonale: improved design and analysis make all the difference.

    PubMed

    Molenaar, Heike; Glawe, Martin; Boehm, Robert; Piepho, Hans-Peter

    2017-01-01

    Ornamental plant variety improvement is limited by current phenotyping approaches and neglected use of experimental designs. The present study was conducted to show the benefits of using an experimental design and corresponding analysis in ornamental breeding regarding simulated response to selection in Pelargonium zonale for production-related traits. This required establishment of phenotyping protocols for root formation and stem cutting counts, with which 974 genotypes were assessed in a two-phase experimental design. The present paper evaluates this protocol. The possibility of varietal improvement through indirect selection on secondary traits such as branch count and flower count was assessed by genetic correlations. Simulated response to selection varied greatly, depending on the genotypic variances of the breeding population and traits. A varietal improvement of over 20% is possible for stem cutting count, root formation, branch count and flower count. In contrast, indirect selection of stem cutting count by branch count or flower count was found to be ineffective. The established phenotypic protocols and two-phase experimental designs are valuable tools for breeding of P. zonale .

  6. A Range Finding Protocol to Support Design for Transcriptomics Experimentation: Examples of In-Vitro and In-Vivo Murine UV Exposure

    PubMed Central

    van Oostrom, Conny T.; Jonker, Martijs J.; de Jong, Mark; Dekker, Rob J.; Rauwerda, Han; Ensink, Wim A.; de Vries, Annemieke; Breit, Timo M.

    2014-01-01

    In transcriptomics research, design for experimentation by carefully considering biological, technological, practical and statistical aspects is very important, because the experimental design space is essentially limitless. Usually, the ranges of variable biological parameters of the design space are based on common practices and in turn on phenotypic endpoints. However, specific sub-cellular processes might only be partially reflected by phenotypic endpoints or outside the associated parameter range. Here, we provide a generic protocol for range finding in design for transcriptomics experimentation based on small-scale gene-expression experiments to help in the search for the right location in the design space by analyzing the activity of already known genes of relevant molecular mechanisms. Two examples illustrate the applicability: in-vitro UV-C exposure of mouse embryonic fibroblasts and in-vivo UV-B exposure of mouse skin. Our pragmatic approach is based on: framing a specific biological question and associated gene-set, performing a wide-ranged experiment without replication, eliminating potentially non-relevant genes, and determining the experimental ‘sweet spot’ by gene-set enrichment plus dose-response correlation analysis. Examination of many cellular processes that are related to UV response, such as DNA repair and cell-cycle arrest, revealed that basically each cellular (sub-) process is active at its own specific spot(s) in the experimental design space. Hence, the use of range finding, based on an affordable protocol like this, enables researchers to conveniently identify the ‘sweet spot’ for their cellular process of interest in an experimental design space and might have far-reaching implications for experimental standardization. PMID:24823911
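
A toy sketch of the kind of dose-response correlation analysis the protocol relies on (per-gene correlation with dose, summarized over a gene set of interest); the data and gene-set indices are simulated, and this is not the authors' pipeline.

```python
# Correlate each gene's expression with dose and summarize over a gene set.
import numpy as np
from scipy.stats import spearmanr

def gene_set_dose_response(expr, doses, gene_set_idx):
    """expr: (n_genes, n_samples) expression matrix; doses: (n_samples,) exposure
    levels; gene_set_idx: indices of the genes in the process of interest.
    Returns per-gene Spearman correlations with dose and their mean over the set."""
    cors = np.array([spearmanr(expr[g], doses)[0] for g in gene_set_idx])
    return cors, float(np.nanmean(cors))

# Simulated example: 6 exposure levels, 100 genes, the first 10 dose-responsive.
rng = np.random.default_rng(1)
doses = np.array([0.0, 5.0, 10.0, 20.0, 40.0, 80.0])
expr = rng.standard_normal((100, doses.size))
expr[:10] += 0.05 * doses
cors, set_score = gene_set_dose_response(expr, doses, range(10))
print(set_score)          # a high mean correlation marks an informative dose range
```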

  7. 40 CFR 792.120 - Protocol.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the test system. (8) A description of the experimental design, including methods for the control of... 40 Protection of Environment 31 2010-07-01 2010-07-01 true Protocol. 792.120 Section 792.120... at which the study is being conducted. (4) The proposed experimental start and termination dates. (5...

  8. Evaluating the Process of Generating a Clinical Trial Protocol

    PubMed Central

    Franciosi, Lui G.; Butterfield, Noam N.; MacLeod, Bernard A.

    2002-01-01

    The research protocol is the principal document in the conduct of a clinical trial. Its generation requires knowledge about the research problem, the potential experimental confounders, and the relevant Good Clinical Practices for conducting the trial. However, such information is not always available to authors during the writing process. A checklist of over 80 items has been developed to better understand the considerations made by authors in generating a protocol. It is based on the most cited requirements for designing and implementing the randomised controlled trial. Items are categorised according to the trial's research question, experimental design, statistics, ethics, and standard operating procedures. This quality assessment tool evaluates the extent that a generated protocol deviates from the best-planned clinical trial.

  9. Using an Animal Group Vigilance Practical Session to Give Learners a "Heads-Up" to Problems in Experimental Design

    ERIC Educational Resources Information Center

    Rands, Sean A.

    2011-01-01

    The design of experimental ecological fieldwork is difficult to teach to classes, particularly when protocols for data collection are normally carefully controlled by the class organiser. Normally, reinforcement of some of the problems of experimental design, such as the avoidance of pseudoreplication and the use of appropriate sampling techniques, does not occur…

  10. ProtocolNavigator: emulation-based software for the design, documentation and reproduction of biological experiments.

    PubMed

    Khan, Imtiaz A; Fraser, Adam; Bray, Mark-Anthony; Smith, Paul J; White, Nick S; Carpenter, Anne E; Errington, Rachel J

    2014-12-01

    Experimental reproducibility is fundamental to the progress of science. Irreproducible research decreases the efficiency of basic biological research and drug discovery and impedes experimental data reuse. A major contributing factor to irreproducibility is difficulty in interpreting complex experimental methodologies and designs from written text and in assessing variations among different experiments. Current bioinformatics initiatives either are focused on computational research reproducibility (i.e. data analysis) or laboratory information management systems. Here, we present a software tool, ProtocolNavigator, which addresses the largely overlooked challenges of interpretation and assessment. It provides a biologist-friendly open-source emulation-based tool for designing, documenting and reproducing biological experiments. ProtocolNavigator was implemented in Python 2.7, using the wx module to build the graphical user interface. It is a platform-independent software and freely available from http://protocolnavigator.org/index.html under the GPL v2 license. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  11. Experimental high-speed network

    NASA Astrophysics Data System (ADS)

    McNeill, Kevin M.; Klein, William P.; Vercillo, Richard; Alsafadi, Yasser H.; Parra, Miguel V.; Dallas, William J.

    1993-09-01

    Many existing local area networking protocols currently applied in medical imaging were originally designed for relatively low-speed, low-volume networking. These protocols utilize small packet sizes appropriate for text based communication. Local area networks of this type typically provide raw bandwidth under 125 MHz. These older network technologies are not optimized for the low delay, high data traffic environment of a totally digital radiology department. Some current implementations use point-to-point links when greater bandwidth is required. However, the use of point-to-point communications for a total digital radiology department network presents many disadvantages. This paper describes work on an experimental multi-access local area network called XFT. The work includes the protocol specification, and the design and implementation of network interface hardware and software. The protocol specifies the Physical and Data Link layers (OSI layers 1 & 2) for a fiber-optic based token ring providing a raw bandwidth of 500 MHz. The protocol design and implementation of the XFT interface hardware includes many features to optimize image transfer and provide flexibility for additional future enhancements which include: a modular hardware design supporting easy portability to a variety of host system buses, a versatile message buffer design providing 16 MB of memory, and the capability to extend the raw bandwidth of the network to 3.0 GHz.

  12. Guidelines for experimental design protocol and validation procedure for the measurement of heat resistance of microorganisms in milk.

    PubMed

    Condron, Robin; Farrokh, Choreh; Jordan, Kieran; McClure, Peter; Ross, Tom; Cerf, Olivier

    2015-01-02

    Studies on the heat resistance of dairy pathogens are a vital part of assessing the safety of dairy products. However, harmonized methodology for the study of heat resistance of food pathogens is lacking, even though there is a need for such harmonized experimental design protocols and for harmonized validation procedures for heat treatment studies. Such an approach is of particular importance to allow international agreement on appropriate risk management of emerging potential hazards for human and animal health. This paper is working toward establishment of a harmonized protocol for the study of the heat resistance of pathogens, identifying critical issues for establishment of internationally agreed protocols, including a harmonized framework for reporting and interpretation of heat inactivation studies of potentially pathogenic microorganisms. Copyright © 2014 Elsevier B.V. All rights reserved.
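
As background to such heat-inactivation studies, here is a sketch of the standard log-linear survivor-curve quantities (D- and z-values) that harmonized protocols typically report; the numbers are toy values and this is not part of the paper's proposed protocol.

```python
# D-value: time for a 1-log10 reduction at a given temperature.
# z-value: temperature increase that reduces D tenfold.
import numpy as np

def d_value(times_min, log10_counts):
    """Fit log10 N = a - t/D by least squares and return D (minutes)."""
    slope, _ = np.polyfit(times_min, log10_counts, 1)
    return -1.0 / slope

def z_value(temps_c, d_values_min):
    """Fit log10 D = b - T/z and return z (deg C)."""
    slope, _ = np.polyfit(temps_c, np.log10(d_values_min), 1)
    return -1.0 / slope

# Toy example: a 2 min D-value at 63 C and a 0.2 min D-value at 72 C give z = 9 C.
print(d_value([0, 2, 4, 6], [7.0, 6.0, 5.0, 4.0]))   # -> 2.0
print(z_value([63.0, 72.0], [2.0, 0.2]))              # -> 9.0
```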

  13. 40 CFR 160.120 - Protocol.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... system. (8) A description of the experimental design, including methods for the control of bias. (9... being conducted. (4) The proposed experimental start and termination dates. (5) Justification for...

  14. 40 CFR 160.120 - Protocol.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... system. (8) A description of the experimental design, including methods for the control of bias. (9... being conducted. (4) The proposed experimental start and termination dates. (5) Justification for...

  15. 40 CFR 160.120 - Protocol.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... system. (8) A description of the experimental design, including methods for the control of bias. (9... being conducted. (4) The proposed experimental start and termination dates. (5) Justification for...

  16. 40 CFR 160.120 - Protocol.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... system. (8) A description of the experimental design, including methods for the control of bias. (9... being conducted. (4) The proposed experimental start and termination dates. (5) Justification for...

  17. 40 CFR 160.120 - Protocol.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... system. (8) A description of the experimental design, including methods for the control of bias. (9... being conducted. (4) The proposed experimental start and termination dates. (5) Justification for...

  18. Comparison of a Stimulus Equivalence Protocol and Traditional Lecture for Teaching Single-Subject Designs

    ERIC Educational Resources Information Center

    Lovett, Sadie; Rehfeldt, Ruth Anne; Garcia, Yors; Dunning, Johnna

    2011-01-01

    This study compared the effects of a computer-based stimulus equivalence protocol to a traditional lecture format in teaching single-subject experimental design concepts to undergraduate students. Participants were assigned to either an equivalence or a lecture group, and performance on a paper-and-pencil test that targeted relations among the…

  19. Active SAmpling Protocol (ASAP) to Optimize Individual Neurocognitive Hypothesis Testing: A BCI-Inspired Dynamic Experimental Design.

    PubMed

    Sanchez, Gaëtan; Lecaignard, Françoise; Otman, Anatole; Maby, Emmanuel; Mattout, Jérémie

    2016-01-01

    The relatively young field of Brain-Computer Interfaces has promoted the use of electrophysiology and neuroimaging in real-time. In the meantime, cognitive neuroscience studies, which make extensive use of functional exploration techniques, have evolved toward model-based experiments and fine hypothesis testing protocols. Although these two developments are mostly unrelated, we argue that, brought together, they may trigger an important shift in the way experimental paradigms are being designed, which should prove fruitful to both endeavors. This change simply consists in using real-time neuroimaging in order to optimize advanced neurocognitive hypothesis testing. We refer to this new approach as the instantiation of an Active SAmpling Protocol (ASAP). As opposed to classical (static) experimental protocols, ASAP implements online model comparison, enabling the optimization of design parameters (e.g., stimuli) during the course of data acquisition. This follows the well-known principle of sequential hypothesis testing. What is radically new, however, is our ability to perform online processing of the huge amount of complex data that brain imaging techniques provide. This is all the more relevant at a time when physiological and psychological processes are beginning to be approached using more realistic, generative models which may be difficult to tease apart empirically. Based upon Bayesian inference, ASAP proposes a generic and principled way to optimize experimental design adaptively. In this perspective paper, we summarize the main steps in ASAP. Using synthetic data we illustrate its superiority in selecting the right perceptual model compared to a classical design. Finally, we briefly discuss its future potential for basic and clinical neuroscience as well as some remaining challenges.
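
A toy sketch of the loop ASAP describes (online Bayesian comparison of two candidate models, with each stimulus chosen to maximize expected information gain); the two response models and the Bernoulli observations are invented for illustration and do not reproduce the authors' implementation.

```python
# Adaptive design: pick the stimulus that best discriminates two candidate models.
import numpy as np

def bernoulli_lik(y, p):
    return p if y == 1 else 1.0 - p

def entropy(w):
    w = np.clip(w, 1e-12, 1.0)
    return -float(np.sum(w * np.log(w)))

def expected_info_gain(post, p1, p2):
    """Expected entropy reduction over the two models for one candidate stimulus,
    where p1 and p2 are the models' predicted probabilities of a positive response."""
    gain = 0.0
    for y in (0, 1):
        p_y = post[0] * bernoulli_lik(y, p1) + post[1] * bernoulli_lik(y, p2)
        new_post = np.array([post[0] * bernoulli_lik(y, p1),
                             post[1] * bernoulli_lik(y, p2)]) / p_y
        gain += p_y * (entropy(post) - entropy(new_post))
    return gain

# Two invented models of response probability as a function of stimulus level.
model1 = lambda s: 0.2 + 0.6 * s
model2 = lambda s: 0.5 + 0.0 * s
stimuli = np.linspace(0.0, 1.0, 11)

post = np.array([0.5, 0.5])                  # uniform prior over the two models
rng = np.random.default_rng(2)
for _ in range(30):                          # adaptive "trials"
    s = stimuli[int(np.argmax([expected_info_gain(post, model1(x), model2(x))
                               for x in stimuli]))]
    y = int(rng.random() < model1(s))        # simulate responses from model 1
    post = post * [bernoulli_lik(y, model1(s)), bernoulli_lik(y, model2(s))]
    post = post / post.sum()
print(post)                                  # posterior should favour model 1
```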

  20. Environmental Compliance Assessment Protocol-Centers for Disease Control and Prevention (ECAP-CDC)

    DTIC Science & Technology

    1993-10-01

    propellers, or appliances. 2. Military weapons or equipment designed for combat use. 3. Rockets or equipment designed for research, or experimental or...should be reproduced and used during the assessment to take notes. It is designed to be inserted between each page of the protocols, allowing the...procedures are designed as an aid and should not be considered exhaus- tive. Use of the guide requires the evaluator’s judgement to play a role in

  1. EXACT2: the semantics of biomedical protocols

    PubMed Central

    2014-01-01

    Background The reliability and reproducibility of experimental procedures is a cornerstone of scientific practice. There is a pressing technological need for the better representation of biomedical protocols to enable other agents (human or machine) to better reproduce results. A framework that captures all information required for the replication of experimental protocols is essential for achieving reproducibility. Methods We have developed the ontology EXACT2 (EXperimental ACTions) that is designed to capture the full semantics of biomedical protocols required for their reproducibility. To construct EXACT2 we manually inspected hundreds of published and commercial biomedical protocols from several areas of biomedicine. After establishing a clear pattern for extracting the required information we utilized text-mining tools to translate the protocols into a machine amenable format. We have verified the utility of EXACT2 through the successful processing of previously 'unseen' (not used for the construction of EXACT2) protocols. Results The paper reports on a fundamentally new version of EXACT2 that supports the semantically-defined representation of biomedical protocols. The ability of EXACT2 to capture the semantics of biomedical procedures was verified through a text mining use case. In this use case, EXACT2 is used as a reference model for text mining tools to identify terms pertinent to experimental actions, and their properties, in biomedical protocols expressed in natural language. An EXACT2-based framework for the translation of biomedical protocols to a machine amenable format is proposed. Conclusions The EXACT2 ontology is sufficient to record, in a machine processable form, the essential information about biomedical protocols. EXACT2 defines explicit semantics of experimental actions, and can be used by various computer applications. It can serve as a reference model for the translation of biomedical protocols in natural language into a semantically-defined format. PMID:25472549
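
Purely as a hypothetical illustration of what a semantically explicit, machine-processable record of an experimental action can look like (this is not EXACT2's actual schema):

```python
# Hypothetical structure: an experimental action, its objects, and the
# properties needed to reproduce it, in a machine-amenable form.
from dataclasses import dataclass, field

@dataclass
class ExperimentalAction:
    action: str                                       # e.g. "centrifuge", "incubate"
    objects: list                                     # what the action is applied to
    properties: dict = field(default_factory=dict)    # speed, duration, temperature...

protocol_step = ExperimentalAction(
    action="centrifuge",
    objects=["cell lysate"],
    properties={"speed": "13000 rpm", "duration": "10 min", "temperature": "4 C"},
)
```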

  2. 40 CFR 792.120 - Protocol.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... the test system. (8) A description of the experimental design, including methods for the control of... at which the study is being conducted. (4) The proposed experimental start and termination dates. (5...

  3. 40 CFR 792.120 - Protocol.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... the test system. (8) A description of the experimental design, including methods for the control of... at which the study is being conducted. (4) The proposed experimental start and termination dates. (5...

  4. 40 CFR 792.120 - Protocol.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... the test system. (8) A description of the experimental design, including methods for the control of... at which the study is being conducted. (4) The proposed experimental start and termination dates. (5...

  5. 40 CFR 792.120 - Protocol.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... the test system. (8) A description of the experimental design, including methods for the control of... at which the study is being conducted. (4) The proposed experimental start and termination dates. (5...

  6. Guided Inquiry in a Biochemistry Laboratory Course Improves Experimental Design Ability

    ERIC Educational Resources Information Center

    Goodey, Nina M.; Talgar, Cigdem P.

    2016-01-01

    Many biochemistry laboratory courses expose students to laboratory techniques through pre-determined experiments in which students follow stepwise protocols provided by the instructor. This approach fails to provide students with sufficient opportunities to practice experimental design and critical thinking. Ten inquiry modules were created for a…

  7. The effects of incubation temperature and experimental design on heart rates of lizard embryos.

    PubMed

    Hulbert, Austin C; Mitchell, Timothy S; Hall, Joshua M; Guiffre, Cassia M; Douglas, Danielle C; Warner, Daniel A

    2017-08-01

    Many studies of phenotypic plasticity alter environmental conditions during embryonic development, yet only measure phenotypes at the neonatal stage (after embryonic development). However, measuring aspects of embryo physiology enhances our understanding of how environmental factors immediately affect embryos, which aids our understanding of developmental plasticity. While current research on reptile developmental plasticity has demonstrated that fluctuating incubation temperatures affect development differently than constant temperatures, most research on embryo physiology is still performed with constant temperature experiments. In this study, we noninvasively measured embryonic heart rates of the brown anole (Anolis sagrei), across ecologically relevant fluctuating temperatures. We incubated eggs under temperatures measured from potential nests in the field and examined how heart rates change through a diel cycle and throughout embryonic development. We also evaluated how experimental design (e.g., repeated vs. single measures designs, constant vs. fluctuating temperatures) and different protocols (e.g., removing eggs from incubators) might influence heart rate. We found that heart rates were correlated with daily temperature and increased through development. Our findings suggest that experimenters have reasonable flexibility in choosing an experimental design to address their questions; however, some aspects of design and protocol can potentially influence estimations of heart rates. Overall, we present the first ecologically relevant measures of anole embryonic heart rates and provide recommendations for experimental designs for future experiments. © 2017 Wiley Periodicals, Inc.

  8. Experimental demonstration of the anti-maser

    NASA Astrophysics Data System (ADS)

    Mazzocco, Anthony; Aviles, Michael; Andrews, Jim; Dawson, Nathan; Crescimanno, Michael

    2012-10-01

    We denote by ``anti-maser'' a coherent perfect absorption (CPA) process in the radio frequency domain. We demonstrate several experimental realizations of the anti-maser suitable for an advanced undergraduate laboratory. Students designed, assembled and tested these devices, as well as the inexpensive laboratory setup and experimental protocol for displaying various CPA phenomena.

  9. Protocols for efficient simulations of long-time protein dynamics using coarse-grained CABS model.

    PubMed

    Jamroz, Michal; Kolinski, Andrzej; Kmiecik, Sebastian

    2014-01-01

    Coarse-grained (CG) modeling is a well-acknowledged simulation approach for getting insight into long-time scale protein folding events at reasonable computational cost. Depending on the design of a CG model, the simulation protocols vary from highly case-specific-requiring user-defined assumptions about the folding scenario-to more sophisticated blind prediction methods for which only a protein sequence is required. Here we describe the framework protocol for the simulations of long-term dynamics of globular proteins, with the use of the CABS CG protein model and sequence data. The simulations can start from a random or a selected (e.g., native) structure. The described protocol has been validated using experimental data for protein folding model systems-the prediction results agreed well with the experimental results.

  10. The Effects of Certain Background Noises on the Performance of a Voice Recognition System.

    DTIC Science & Technology

    1980-09-01

    Principles in Experimental Design. New York: McGraw-Hill, 1962. Woodworth, R.S. and H. Schlosberg, Experimental Psychology, (Revised edition), New...collection sheet APPENDIX II EXPERIMENTAL PROTOCOL AND SUBJECTS’ INSTRUCTIONS THIS IS AN EXPERIMENT DESIGNED TO EVALUATE SOME VOICE RECOGNITION EQUIPMENT. I...37. CDR Paul Chatelier OUSD R&E Room 3D129 Pentagon Washington, D.C. 20301 38. Ralph Cleveland NFMSO Code 9333 Mechanicsburg, PA 17055 39. Clay Coler

  11. Experimental Optimal Single Qubit Purification in an NMR Quantum Information Processor

    PubMed Central

    Hou, Shi-Yao; Sheng, Yu-Bo; Feng, Guan-Ru; Long, Gui-Lu

    2014-01-01

    High quality single qubits are the building blocks in quantum information processing. But they are vulnerable to environmental noise. To overcome noise, purification techniques, which generate qubits with higher purities from qubits with lower purities, have been proposed. Purifications have attracted much interest and been widely studied. However, the full experimental demonstration of an optimal single qubit purification protocol proposed by Cirac, Ekert and Macchiavello [Phys. Rev. Lett. 82, 4344 (1999), the CEM protocol] more than one and half decades ago, still remains an experimental challenge, as it requires more complicated networks and a higher level of precision controls. In this work, we design an experiment scheme that realizes the CEM protocol with explicit symmetrization of the wave functions. The purification scheme was successfully implemented in a nuclear magnetic resonance quantum information processor. The experiment fully demonstrated the purification protocol, and showed that it is an effective way of protecting qubits against errors and decoherence. PMID:25358758

  12. Development, implementation, and experimentation of parametric routing protocol for sensor networks

    NASA Astrophysics Data System (ADS)

    Nassr, Matthew S.; Jun, Jangeun; Eidenbenz, Stephan J.; Frigo, Janette R.; Hansson, Anders A.; Mielke, Angela M.; Smith, Mark C.

    2006-09-01

    The development of a scalable and reliable routing protocol for sensor networks is traced from a theoretical beginning to positive simulation results to the end of verification experiments in large and heavily loaded networks. Design decisions and explanations as well as implementation hurdles are presented to give a complete picture of protocol development. Additional software and hardware is required to accurately test the performance of our protocol in field experiments. In addition, the developed protocol is tested in TinyOS on Mica2 motes against well-established routing protocols frequently used in sensor networks. Our protocol proves to outperform the standard (MINTRoute) and the trivial (Gossip) in a variety of different scenarios.

  13. A virtual experimenter to increase standardization for the investigation of placebo effects.

    PubMed

    Horing, Bjoern; Newsome, Nathan D; Enck, Paul; Babu, Sabarish V; Muth, Eric R

    2016-07-18

    Placebo effects are mediated by expectancy, which is highly influenced by psychosocial factors of a treatment context. These factors are difficult to standardize. Furthermore, dedicated placebo research often necessitates single-blind deceptive designs where biases are easily introduced. We propose a study protocol employing a virtual experimenter - a computer program designed to deliver treatment and instructions - for the purpose of standardization and reduction of biases when investigating placebo effects. To evaluate the virtual experimenter's efficacy in inducing placebo effects via expectancy manipulation, we suggest a partially blinded, deceptive design with a baseline/retest pain protocol (hand immersions in hot water bath). Between immersions, participants will receive an (actually inert) medication. Instructions pertaining to the medication will be delivered by one of three metaphors: The virtual experimenter, a human experimenter, and an audio/text presentation (predictor "Metaphor"). The second predictor includes falsely informing participants that the medication is an effective pain killer, or correctly informing them that it is, in fact, inert (predictor "Instruction"). Analysis will be performed with hierarchical linear modelling, with a sample size of N = 50. Results from two pilot studies are presented that indicate the viability of the pain protocol (N = 33), and of the virtual experimenter software and placebo manipulation (N = 48). It will be challenging to establish full comparability between all metaphors used for instruction delivery, and to account for participant differences in acceptance of their virtual interaction partner. Once established, the presence of placebo effects would suggest that the virtual experimenter exhibits sufficient cues to be perceived as a social agent. He could consequently provide a convenient platform to investigate effects of experimenter behavior, or other experimenter characteristics, e.g., sex, age, race/ethnicity or professional status. More general applications are possible, for example in psychological research such as bias research, or virtual reality research. Potential applications also exist for standardizing clinical research by documenting and communicating instructions used in clinical trials.
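
A minimal sketch of how the described 3 x 2 design with repeated pain measurements could be fit as a hierarchical linear model (fixed effects for metaphor, instruction and phase; random intercepts per participant); the column names are assumptions for illustration and this is not the registered analysis.

```python
# Mixed-effects model for a 3 (Metaphor) x 2 (Instruction) design with
# baseline/retest pain measurements nested within participants.
import pandas as pd
import statsmodels.formula.api as smf

def fit_placebo_model(df: pd.DataFrame):
    """df columns (assumed): subject, metaphor, instruction, phase, pain."""
    model = smf.mixedlm("pain ~ metaphor * instruction * phase",
                        data=df, groups=df["subject"])
    return model.fit()
```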

  14. Design and Implementation of a Secure Modbus Protocol

    NASA Astrophysics Data System (ADS)

    Fovino, Igor Nai; Carcano, Andrea; Masera, Marcelo; Trombetta, Alberto

    The interconnectivity of modern and legacy supervisory control and data acquisition (SCADA) systems with corporate networks and the Internet has significantly increased the threats to critical infrastructure assets. Meanwhile, traditional IT security solutions such as firewalls, intrusion detection systems and antivirus software are relatively ineffective against attacks that specifically target vulnerabilities in SCADA protocols. This paper describes a secure version of the Modbus SCADA protocol that incorporates integrity, authentication, non-repudiation and anti-replay mechanisms. Experimental results using a power plant testbed indicate that the augmented protocol provides good security functionality without significant overhead.
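
A generic illustration of two of the mechanisms mentioned, message integrity (HMAC) and anti-replay (monotonic sequence numbers), wrapped around a Modbus-style application frame; this is not the paper's actual protocol design.

```python
# Integrity + anti-replay wrapper around an application PDU.
import hmac, hashlib, struct

KEY = b"shared-secret-key"      # placeholder; a real deployment needs key management

def wrap(seq: int, pdu: bytes) -> bytes:
    body = struct.pack(">I", seq) + pdu
    tag = hmac.new(KEY, body, hashlib.sha256).digest()
    return body + tag

def unwrap(frame: bytes, last_seq: int):
    body, tag = frame[:-32], frame[-32:]
    if not hmac.compare_digest(tag, hmac.new(KEY, body, hashlib.sha256).digest()):
        raise ValueError("integrity check failed")
    seq = struct.unpack(">I", body[:4])[0]
    if seq <= last_seq:
        raise ValueError("replayed or out-of-order frame")
    return seq, body[4:]

frame = wrap(seq=1, pdu=bytes([0x01, 0x03, 0x00, 0x00, 0x00, 0x0A]))  # example request
seq, pdu = unwrap(frame, last_seq=0)
```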

  15. Cure Cycle Design Methodology for Fabricating Reactive Resin Matrix Fiber Reinforced Composites: A Protocol for Producing Void-free Quality Laminates

    NASA Technical Reports Server (NTRS)

    Hou, Tan-Hung

    2014-01-01

    For the fabrication of resin matrix fiber reinforced composite laminates, a workable cure cycle (i.e., temperature and pressure profiles as a function of processing time) is needed and is critical for achieving void-free laminate consolidation. Design of such a cure cycle is not trivial, especially when dealing with reactive matrix resins. An empirical "trial and error" approach has been used as common practice in the composite industry. Such an approach is not only costly, but also ineffective at establishing the optimal processing conditions for a specific resin/fiber composite system. In this report, a rational "processing science" based approach is established, and a universal cure cycle design protocol is proposed. Following this protocol, a workable and optimal cure cycle can be readily and rationally designed for most reactive resin systems in a cost effective way. This design protocol has been validated through experimental studies of several reactive polyimide composites for a wide spectrum of usage that has been documented in the previous publications.

  16. Gradual training of alpacas to the confinement of metabolism pens reduces stress when normal excretion behavior is accommodated.

    PubMed

    Lund, Kirrin E; Maloney, Shane K; Milton, John T B; Blache, Dominique

    2012-01-01

    Confinement in metabolism pens may provoke a stress response in alpacas that will reduce the welfare of the animal and jeopardize the validity of scientific results obtained in such pens. In this study, we tested a protocol designed to successfully train alpacas to be held in a specially designed metabolism pen so that the animals' confinement would not jeopardize their welfare. We hypothesized that the alpacas would show fewer behaviors associated with a response to stress as training gradually progressed, and that they would adapt to being in the confinement of the metabolism pen. The training protocol was successful at introducing alpacas to the metabolism pens, and it did reduce the incidence of behavioral responses to stress as the training progressed. The success of the training protocol may be attributed to the progressive nature of the training, the tailoring of the protocol to suit alpacas, and the use of positive reinforcement. This study demonstrated that both animal welfare and the validity of the scientific outcomes could be maximized by the gradual training of experimental animals, thereby minimizing the stress imposed on the animals during experimental procedures.

  17. Issues in designing transport layer multicast facilities

    NASA Technical Reports Server (NTRS)

    Dempsey, Bert J.; Weaver, Alfred C.

    1990-01-01

    Multicasting denotes a facility in a communications system for providing efficient delivery from a message's source to some well-defined set of locations using a single logical address. While modern network hardware supports multidestination delivery, first generation Transport Layer protocols (e.g., the DoD Transmission Control Protocol (TCP) (15) and ISO TP-4 (41)) did not anticipate the changes over the past decade in underlying network hardware, transmission speeds, and communication patterns that have enabled and driven the interest in reliable multicast. Much recent research has focused on integrating the underlying hardware multicast capability with the reliable services of Transport Layer protocols. Here, we explore the communication issues surrounding the design of such a reliable multicast mechanism. Approaches and solutions from the literature are discussed, and four experimental Transport Layer protocols that incorporate reliable multicast are examined.
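
For context, a minimal sketch of the underlying IP multicast facility that reliable multicast transports build on, using standard sockets; the group address and port are arbitrary examples.

```python
# One-shot IP multicast sender and receiver.
import socket, struct

GROUP, PORT = "224.1.1.1", 5007

def sender(message: bytes):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
    sock.sendto(message, (GROUP, PORT))

def receiver():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))
    mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return sock.recvfrom(65535)   # blocks until one multicast datagram arrives
```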

  18. An Alternative Approach to "Identification of Unknowns": Designing a Protocol to Verify the Identities of Nitrogen Fixing Bacteria.

    PubMed

    Martinez-Vaz, Betsy M; Denny, Roxanne; Young, Nevin D; Sadowsky, Michael J

    2015-12-01

    Microbiology courses often include a laboratory activity on the identification of unknown microbes. This activity consists of providing students with microbial cultures and running biochemical assays to identify the organisms. This approach lacks molecular techniques such as sequencing of genes encoding 16S rRNA, which is currently the method of choice for identification of unknown bacteria. A laboratory activity was developed to teach students how to identify microorganisms using 16S rRNA polymerase chain reaction (PCR) and validate microbial identities using biochemical techniques. We hypothesized that designing an experimental protocol to confirm the identity of a bacterium would improve students' knowledge of microbial identification techniques and the physiological characteristics of bacterial species. Nitrogen-fixing bacteria were isolated from the root nodules of Medicago truncatula and prepared for 16S rRNA PCR analysis. Once DNA sequencing revealed the identity of the organisms, the students designed experimental protocols to verify the identity of rhizobia. An assessment was conducted by analyzing pre- and posttest scores and by grading students' verification protocols and presentations. Posttest scores were higher than pretest scores at or below p = 0.001. Normalized learning gains (G) showed an improvement of students' knowledge of microbial identification methods (LO4, G = 0.46), biochemical properties of nitrogen-fixing bacteria (LO3, G = 0.45), and the events leading to the establishment of nitrogen-fixing symbioses (LO1&2, G = 0.51, G = 0.37). An evaluation of verification protocols also showed significant improvement with a p value of less than 0.001.

  19. Protocol for fermionic positive-operator-valued measures

    NASA Astrophysics Data System (ADS)

    Arvidsson-Shukur, D. R. M.; Lepage, H. V.; Owen, E. T.; Ferrus, T.; Barnes, C. H. W.

    2017-11-01

    In this paper we present a protocol for the implementation of a positive-operator-valued measure (POVM) on massive fermionic qubits. We present methods for implementing nondispersive qubit transport, spin rotations, and spin polarizing beam-splitter operations. Our scheme attains linear opticslike control of the spatial extent of the qubits by considering ground-state electrons trapped in the minima of surface acoustic waves in semiconductor heterostructures. Furthermore, we numerically simulate a high-fidelity POVM that carries out Procrustean entanglement distillation in the framework of our scheme, using experimentally realistic potentials. Our protocol can be applied not only to pure ensembles with particle pairs of known identical entanglement, but also to realistic ensembles of particle pairs with a distribution of entanglement entropies. This paper provides an experimentally realizable design for future quantum technologies.

  20. Experimental verification of multipartite entanglement in quantum networks

    PubMed Central

    McCutcheon, W.; Pappa, A.; Bell, B. A.; McMillan, A.; Chailloux, A.; Lawson, T.; Mafu, M.; Markham, D.; Diamanti, E.; Kerenidis, I.; Rarity, J. G.; Tame, M. S.

    2016-01-01

    Multipartite entangled states are a fundamental resource for a wide range of quantum information processing tasks. In particular, in quantum networks, it is essential for the parties involved to be able to verify if entanglement is present before they carry out a given distributed task. Here we design and experimentally demonstrate a protocol that allows any party in a network to check if a source is distributing a genuinely multipartite entangled state, even in the presence of untrusted parties. The protocol remains secure against dishonest behaviour of the source and other parties, including the use of system imperfections to their advantage. We demonstrate the verification protocol in a three- and four-party setting using polarization-entangled photons, highlighting its potential for realistic photonic quantum communication and networking applications. PMID:27827361

  1. Diverse Protocols for Correlative Super-Resolution Fluorescence Imaging and Electron Microscopy of Cells and Tissue

    DTIC Science & Technology

    2016-05-25

    tissue is critical to biology. Many factors determine optimal experimental design, including attainable localization precision, ultrastructural...both imaging modalities. Examples include: weak tissue preservation protocols resulting in poor ultrastructure, e.g. mitochondrial cristae membranes...tension effects during sample drying that may result in artifacts44. Samples dried in the presence of polyvinyl alcohol do not have the haziness

  2. Protocol vulnerability detection based on network traffic analysis and binary reverse engineering.

    PubMed

    Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing

    2017-01-01

    Network protocol vulnerability detection plays an important role in many domains, including protocol security analysis, application security, and network intrusion detection. In this study, by analyzing the general fuzzing method of network protocols, we propose a novel approach that combines network traffic analysis with the binary reverse engineering method. For network traffic analysis, the block-based protocol description language is introduced to construct test scripts, while the binary reverse engineering method employs the genetic algorithm with a fitness function designed to focus on code coverage. This combination leads to a substantial improvement in fuzz testing for network protocols. We build a prototype system and use it to test several real-world network protocol implementations. The experimental results show that the proposed approach detects vulnerabilities more efficiently and effectively than general fuzzing methods such as SPIKE.
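
A schematic sketch of a coverage-guided genetic fuzzing loop of the kind described; the coverage_of() fitness below is a stand-in for the code coverage that the paper obtains from binary instrumentation.

```python
# Genetic fuzzing loop: mutate protocol messages, keep the ones with best fitness.
import random

def mutate(msg: bytes) -> bytes:
    b = bytearray(msg)
    i = random.randrange(len(b))
    b[i] ^= 1 << random.randrange(8)      # flip one random bit
    return bytes(b)

def coverage_of(msg: bytes) -> int:
    """Stand-in fitness for illustration: in a real system this would be the
    code coverage reported by instrumenting the protocol implementation."""
    return len(set(msg))

def evolve(seeds, generations=20, pop_size=32):
    population = list(seeds)
    for _ in range(generations):
        children = [mutate(random.choice(population)) for _ in range(pop_size)]
        population = sorted(set(population + children),
                            key=coverage_of, reverse=True)[:pop_size]
    return population

corpus = evolve(seeds=[b"\x00\x01GET /status\r\n"])
```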

  3. Compositional Verification of a Communication Protocol for a Remotely Operated Vehicle

    NASA Technical Reports Server (NTRS)

    Goodloe, Alwyn E.; Munoz, Cesar A.

    2009-01-01

    This paper presents the specification and verification in the Prototype Verification System (PVS) of a protocol intended to facilitate communication in an experimental remotely operated vehicle used by NASA researchers. The protocol is defined as a stack-layered com- position of simpler protocols. It can be seen as the vertical composition of protocol layers, where each layer performs input and output message processing, and the horizontal composition of different processes concurrently inhabiting the same layer, where each process satisfies a distinct requirement. It is formally proven that the protocol components satisfy certain delivery guarantees. Compositional techniques are used to prove these guarantees also hold in the composed system. Although the protocol itself is not novel, the methodology employed in its verification extends existing techniques by automating the tedious and usually cumbersome part of the proof, thereby making the iterative design process of protocols feasible.

  4. Marine Resiliency Study II

    DTIC Science & Technology

    2016-07-06

    prevention or treatment protocols, or the use of new technology (e.g. MEG ). 5. In coordination with HQMC, NIMH and Army STARRS, to determine...experimental designs such as targeted prevention or treatment protocols or the use of new technology (e.g. MEG ) to identify biomarkers. A specific goal of the...blast sensors, and to analyze MEG data in relation to blast event outcomes during field training. Of the enrolled Marines in the Demonstration Project 4

  5. Predicting Silk Fiber Mechanical Properties through Multiscale Simulation and Protein Design.

    PubMed

    Rim, Nae-Gyune; Roberts, Erin G; Ebrahimi, Davoud; Dinjaski, Nina; Jacobsen, Matthew M; Martín-Moldes, Zaira; Buehler, Markus J; Kaplan, David L; Wong, Joyce Y

    2017-08-14

    Silk is a promising material for biomedical applications, and much research is focused on how application-specific, mechanical properties of silk can be designed synthetically through proper amino acid sequences and processing parameters. This protocol describes an iterative process between research disciplines that combines simulation, genetic synthesis, and fiber analysis to better design silk fibers with specific mechanical properties. Computational methods are used to assess the protein polymer structure as it forms an interconnected fiber network through shearing and how this process affects fiber mechanical properties. Model outcomes are validated experimentally with the genetic design of protein polymers that match the simulation structures, fiber fabrication from these polymers, and mechanical testing of these fibers. Through iterative feedback between computation, genetic synthesis, and fiber mechanical testing, this protocol will enable a priori prediction capability of recombinant material mechanical properties via insights from the resulting molecular architecture of the fiber network based entirely on the initial protein monomer composition. This style of protocol may be applied to other fields where a research team seeks to design a biomaterial with biomedical application-specific properties. This protocol highlights when and how the three research groups (simulation, synthesis, and engineering) should be interacting to arrive at the most effective method for predictive design of their material.

  6. Open source software to control Bioflo bioreactors.

    PubMed

    Burdge, David A; Libourel, Igor G L

    2014-01-01

    Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design cannot be performed without developing custom software. In addition, support for third party or custom designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32 and 64 bit windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW.

  7. Open Source Software to Control Bioflo Bioreactors

    PubMed Central

    Burdge, David A.; Libourel, Igor G. L.

    2014-01-01

    Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design cannot be performed without developing custom software. In addition, support for third party or custom designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32 and 64 bit windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW. PMID:24667828
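
Purely illustrative: one way a CSV setpoint schedule of the kind mentioned could look and be parsed; the actual column names and file format expected by the software may differ.

```python
# Parse a simple time/parameter/setpoint schedule from CSV text.
import csv, io

CSV_PROTOCOL = """time_min,parameter,setpoint
0,temperature,37.0
0,agitation,200
120,pH,6.8
480,agitation,400
"""

def load_schedule(text):
    steps = []
    for row in csv.DictReader(io.StringIO(text)):
        steps.append((float(row["time_min"]), row["parameter"], float(row["setpoint"])))
    return sorted(steps)          # apply setpoint changes in time order

for t, param, value in load_schedule(CSV_PROTOCOL):
    print(f"at {t:g} min set {param} to {value:g}")
```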

  8. Of taps and toilets: quasi-experimental protocol for evaluating community-demand-driven projects.

    PubMed

    Pattanayak, Subhrendu K; Poulos, Christine; Yang, Jui-Chen; Patil, Sumeet R; Wendland, Kelly J

    2009-09-01

    Sustainable and equitable access to safe water and adequate sanitation are widely acknowledged as vital, yet neglected, development goals. Water supply and sanitation (WSS) policies are justified because of the usual efficiency criteria, but also major equity concerns. Yet, to date there are few scientific impact evaluations showing that WSS policies are effective in delivering social welfare outcomes. This lack of an evaluation culture is partly because WSS policies are characterized by diverse mechanisms, broad goals and the increasing importance of decentralized delivery, and partly because programme administrators are unaware of appropriate methods. We describe a protocol for a quasi-experimental evaluation of a community-demand-driven programme for water and sanitation in rural India, which addresses several evaluation challenges. After briefly reviewing policy and implementation issues in the sector, we describe key features of our protocol, including control group identification, pre-post measurement, programme theory, sample sufficiency and robust indicators. At its core, our protocol proposes to combine propensity score matching and difference-in-difference estimation. We conclude by briefly summarizing how quasi-experimental impact evaluations can address key issues in WSS policy design and when such evaluations are needed.
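
A skeleton of the estimation strategy named in the abstract, propensity score matching combined with difference-in-differences; this is a generic sketch with an assumed variable layout, not the study's actual specification.

```python
# Propensity score matching (nearest neighbour) + difference-in-differences.
import numpy as np
from sklearn.linear_model import LogisticRegression

def psm_did(X, treated, y_pre, y_post):
    """X: (n, k) baseline covariates; treated: 0/1 programme indicator;
    y_pre / y_post: outcome measured before and after the programme."""
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    t_idx = np.where(treated == 1)[0]
    c_idx = np.where(treated == 0)[0]
    # match each treated unit to the control with the closest propensity score
    matches = c_idx[np.argmin(np.abs(ps[c_idx][None, :] - ps[t_idx][:, None]), axis=1)]
    return np.mean((y_post[t_idx] - y_pre[t_idx]) - (y_post[matches] - y_pre[matches]))

# Simulated example with a true programme effect of 0.5.
rng = np.random.default_rng(3)
n = 200
X = rng.standard_normal((n, 3))
treated = (rng.random(n) < 1 / (1 + np.exp(-X[:, 0]))).astype(int)
y_pre = X[:, 0] + rng.standard_normal(n)
y_post = y_pre + 0.5 * treated + rng.standard_normal(n)
print(psm_did(X, treated, y_pre, y_post))
```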

  9. A Smart Sensor for Defending against Clock Glitching Attacks on the I2C Protocol in Robotic Applications

    PubMed Central

    Jiménez-Naharro, Raúl; Gómez-Bravo, Fernando; Medina-García, Jonathan; Sánchez-Raya, Manuel; Gómez-Galán, Juan Antonio

    2017-01-01

    This paper presents a study about hardware attacking and clock signal vulnerability. It considers a particular type of attack on the clock signal in the I2C protocol, and proposes the design of a new sensor for detecting and defending against this type of perturbation. The analysis of the attack and the defense is validated by means of a configurable experimental platform that emulates a differential drive robot. A set of experimental results confirm the interest of the studied vulnerabilities and the efficiency of the proposed sensor in defending against this type of situation. PMID:28346337
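
A software analogue of the detection idea (the paper's sensor is implemented in hardware): flag clock intervals shorter than the minimum period allowed by the bus configuration; the timing threshold below is an assumed example.

```python
# Flag SCL pulses that arrive faster than the configured minimum clock period.
MIN_PERIOD_US = 10.0          # e.g. 100 kHz standard-mode I2C -> 10 us nominal period

def detect_glitches(edge_times_us):
    """edge_times_us: timestamps (microseconds) of successive SCL rising edges."""
    glitches = []
    for t0, t1 in zip(edge_times_us, edge_times_us[1:]):
        if (t1 - t0) < MIN_PERIOD_US:
            glitches.append((t0, t1))   # pulse arrived too early: likely glitch
    return glitches

print(detect_glitches([0.0, 10.1, 20.2, 22.0, 32.1]))   # flags the 20.2 -> 22.0 gap
```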

  10. A General Small-Scale Reactor To Enable Standardization and Acceleration of Photocatalytic Reactions.

    PubMed

    Le, Chi Chip; Wismer, Michael K; Shi, Zhi-Cai; Zhang, Rui; Conway, Donald V; Li, Guoqing; Vachal, Petr; Davies, Ian W; MacMillan, David W C

    2017-06-28

    Photocatalysis for organic synthesis has experienced an exponential growth in the past 10 years. However, the variety of experimental procedures that have been reported to perform photon-based catalyst excitation has hampered the establishment of general protocols to convert visible light into chemical energy. To address this issue, we have designed an integrated photoreactor for enhanced photon capture and catalyst excitation. Moreover, the evaluation of this new reactor in eight photocatalytic transformations that are widely employed in medicinal chemistry settings has confirmed significant performance advantages of this optimized design while enabling a standardized protocol.

  11. (abstract) Experimental Results From Internetworking Data Applications Over Various Wireless Networks Using a Single Flexible Error Control Protocol

    NASA Technical Reports Server (NTRS)

    Kanai, T.; Kramer, M.; McAuley, A. J.; Nowack, S.; Pinck, D. S.; Ramirez, G.; Stewart, I.; Tohme, H.; Tong, L.

    1995-01-01

    This paper describes results from several wireless field trials in New Jersey, California, and Colorado, conducted jointly by researchers at Bellcore, JPL, and US West over the course of 1993 and 1994. During these trials, applications communicated over multiple wireless networks including satellite, low power PCS, high power cellular, packet data, and the wireline Public Switched Telecommunications Network (PSTN). Key goals included 1) designing data applications and an API suited to mobile users, 2) investigating internetworking issues, 3) characterizing wireless networks under various field conditions, and 4) comparing the performance of different protocol mechanisms over the diverse networks and applications. We describe experimental results for different protocol mechanisms and parameters, such as acknowledgment schemes and packet sizes. We show the need for powerful error control mechanisms such as selective acknowledgements and combining data from multiple transmissions. We highlight the possibility of a common protocol for all wireless networks, from micro-cellular PCS to satellite networks.
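
A toy sketch of the selective-acknowledgement bookkeeping discussed, where the receiver reports exactly which packets arrived so the sender retransmits only the gaps; this is not the field-trial protocol itself.

```python
# Receiver-side SACK state: cumulative ack plus a bitmap over the window.
def sack_report(base_seq, received, window=32):
    """received: set of sequence numbers seen so far.
    Returns (cumulative_ack, bitmap) where bit i refers to base_seq + 1 + i."""
    cum = base_seq
    while cum + 1 in received:
        cum += 1
    bitmap = [(cum + 1 + i) in received for i in range(window)]
    return cum, bitmap

def missing(cum, bitmap):
    return [cum + 1 + i for i, ok in enumerate(bitmap) if not ok]

cum, bitmap = sack_report(0, received={1, 2, 3, 5, 6, 9})
print(cum, missing(cum, bitmap)[:5])   # cumulative ack 3; first gaps 4, 7, 8, ...
```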

  12. A Web Resource for Standardized Benchmark Datasets, Metrics, and Rosetta Protocols for Macromolecular Modeling and Design.

    PubMed

    Ó Conchúir, Shane; Barlow, Kyle A; Pache, Roland A; Ollikainen, Noah; Kundert, Kale; O'Meara, Matthew J; Smith, Colin A; Kortemme, Tanja

    2015-01-01

    The development and validation of computational macromolecular modeling and design methods depend on suitable benchmark datasets and informative metrics for comparing protocols. In addition, if a method is intended to be adopted broadly in diverse biological applications, there needs to be information on appropriate parameters for each protocol, as well as metrics describing the expected accuracy compared to experimental data. In certain disciplines, there exist established benchmarks and public resources where experts in a particular methodology are encouraged to supply their most efficient implementation of each particular benchmark. We aim to provide such a resource for protocols in macromolecular modeling and design. We present a freely accessible web resource (https://kortemmelab.ucsf.edu/benchmarks) to guide the development of protocols for protein modeling and design. The site provides benchmark datasets and metrics to compare the performance of a variety of modeling protocols using different computational sampling methods and energy functions, providing a "best practice" set of parameters for each method. Each benchmark has an associated downloadable benchmark capture archive containing the input files, analysis scripts, and tutorials for running the benchmark. The captures may be run with any suitable modeling method; we supply command lines for running the benchmarks using the Rosetta software suite. We have compiled initial benchmarks for the resource spanning three key areas: prediction of energetic effects of mutations, protein design, and protein structure prediction, each with associated state-of-the-art modeling protocols. With the help of the wider macromolecular modeling community, we hope to expand the variety of benchmarks included on the website and continue to evaluate new iterations of current methods as they become available.

  13. Considerations for the design and execution of protocols for animal research and treatment to improve reproducibility and standardization: "DEPART well-prepared and ARRIVE safely".

    PubMed

    Smith, M M; Clarke, E C; Little, C B

    2017-03-01

    To review the factors in experimental design that contribute to poor translation of pre-clinical research to therapies for patients with osteoarthritis (OA) and how this might be improved. Narrative review of the literature, and evaluation of the different stages of design, conduct and analysis of studies using animal models of OA to define specific issues that might reduce quality of evidence and how this can be minimised. Preventing bias and improving experimental rigour and reporting are important modifiable factors to improve translation from pre-clinical animal models to successful clinical trials of therapeutic agents. Despite publication and adoption by many journals of guidelines such as Animals in Research: Reporting In Vivo Experiments (ARRIVE), experimental animal studies published in leading rheumatology journals are still deficient in their reporting. In part, this may be caused by researchers first consulting these guidelines after the completion of experiments, at the time of publication. This review discusses factors that can (1) bias the outcome of experimental studies using animal models of osteoarthritis or (2) alter the quality of evidence for translation. We propose a checklist to consult prior to starting experiments: the Design and Execution of Protocols for Animal Research and Treatment (DEPART). Following DEPART during the design phase will enable completion of the ARRIVE checklist at the time of publication, and thus improve the quality of evidence for inclusion of experimental animal research in meta-analyses and systematic reviews: "DEPART well-prepared and ARRIVE safely". Copyright © 2016 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  14. iLAP: a workflow-driven software for experimental protocol development, data acquisition and analysis

    PubMed Central

    2009-01-01

    Background In recent years, the genome biology community has expended considerable effort to confront the challenges of managing heterogeneous data in a structured and organized way and developed laboratory information management systems (LIMS) for both raw and processed data. On the other hand, electronic notebooks were developed to record and manage scientific data, and facilitate data-sharing. Software which enables both management of large datasets and digital recording of laboratory procedures would serve a real need in laboratories using medium and high-throughput techniques. Results We have developed iLAP (Laboratory data management, Analysis, and Protocol development), a workflow-driven information management system specifically designed to create and manage experimental protocols, and to analyze and share laboratory data. The system combines experimental protocol development, wizard-based data acquisition, and high-throughput data analysis into a single, integrated system. We demonstrate the power and the flexibility of the platform using a microscopy case study based on a combinatorial multiple fluorescence in situ hybridization (m-FISH) protocol and 3D-image reconstruction. iLAP is freely available under the open source license AGPL from http://genome.tugraz.at/iLAP/. Conclusion iLAP is a flexible and versatile information management system, which has the potential to close the gap between electronic notebooks and LIMS and can therefore be of great value for a broad scientific community. PMID:19941647

  15. Using semantics for representing experimental protocols.

    PubMed

    Giraldo, Olga; García, Alexander; López, Federico; Corcho, Oscar

    2017-11-13

    An experimental protocol is a sequence of tasks and operations executed to perform experimental research in biological and biomedical areas, e.g. biology, genetics, immunology, neurosciences, virology. Protocols often include references to equipment, reagents, descriptions of critical steps, troubleshooting and tips, as well as any other information that researchers deem important for facilitating the reusability of the protocol. Although experimental protocols are central to reproducibility, the descriptions are often cursory. There is a need for a unified framework with respect to the syntactic structure and the semantics for representing experimental protocols. In this paper we present the "SMART Protocols ontology", an ontology for representing experimental protocols. Our ontology represents the protocol as a workflow with domain specific knowledge embedded within a document. We also present the Sample Instrument Reagent Objective (SIRO) model, which represents the minimal common information shared across experimental protocols. SIRO was conceived in the same realm as the Patient Intervention Comparison Outcome (PICO) model that supports search, retrieval and classification purposes in evidence based medicine. We evaluate our approach against a set of competency questions modeled as SPARQL queries and processed against a set of published and unpublished protocols modeled with the SP Ontology and the SIRO model. Our approach makes it possible to answer queries such as "Which protocols use tumor tissue as a sample?". Improving reporting structures for experimental protocols requires collective efforts from authors, peer reviewers, editors and funding bodies. The SP Ontology is a contribution towards this goal. We build upon previous experiences and bring together the views of researchers managing protocols in their laboratory work. Website: https://smartprotocols.github.io/.
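
    The competency-question evaluation can be pictured with a toy query. The sketch below uses Python and rdflib to answer "Which protocols use tumor tissue as a sample?" over a tiny in-memory graph; the namespace and term names (ex:Protocol, ex:hasSample) are hypothetical placeholders, not the published SMART Protocols or SIRO vocabulary.

```python
# Toy illustration of a competency question answered with SPARQL.
# The namespace and property names are hypothetical placeholders.
from rdflib import Graph, Namespace, Literal, RDF

EX = Namespace("http://example.org/protocol#")

g = Graph()
g.add((EX.protocol1, RDF.type, EX.Protocol))
g.add((EX.protocol1, EX.hasSample, Literal("tumor tissue")))
g.add((EX.protocol2, RDF.type, EX.Protocol))
g.add((EX.protocol2, EX.hasSample, Literal("peripheral blood")))

query = """
PREFIX ex: <http://example.org/protocol#>
SELECT ?protocol WHERE {
    ?protocol a ex:Protocol ;
              ex:hasSample "tumor tissue" .
}
"""

for row in g.query(query):
    print(row.protocol)  # -> http://example.org/protocol#protocol1
```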

  16. Mutation-based learning to improve student autonomy and scientific inquiry skills in a large genetics laboratory course.

    PubMed

    Wu, Jinlu

    2013-01-01

    Laboratory education can play a vital role in developing a learner's autonomy and scientific inquiry skills. In an innovative, mutation-based learning (MBL) approach, students were instructed to redesign a teacher-designed standard experimental protocol by a "mutation" method in a molecular genetics laboratory course. Students could choose to delete, add, reverse, or replace certain steps of the standard protocol to explore questions of interest to them in a given experimental scenario. They wrote experimental proposals to address their rationales and hypotheses for the "mutations"; conducted experiments in parallel, according to both standard and mutated protocols; and then compared and analyzed results to write individual lab reports. Various autonomy-supportive measures were provided in the entire experimental process. Analyses of student work and feedback suggest that students using the MBL approach 1) spend more time discussing experiments, 2) use more scientific inquiry skills, and 3) find the increased autonomy afforded by MBL more enjoyable than do students following regimented instructions in a conventional "cookbook"-style laboratory. Furthermore, the MBL approach does not incur an obvious increase in labor and financial costs, which makes it feasible for easy adaptation and implementation in a large class.

  17. Experimental study on all-fiber-based unidimensional continuous-variable quantum key distribution

    NASA Astrophysics Data System (ADS)

    Wang, Xuyang; Liu, Wenyuan; Wang, Pu; Li, Yongmin

    2017-06-01

    We experimentally demonstrated an all-fiber-based unidimensional continuous-variable quantum key distribution (CV QKD) protocol and analyzed its security under collective attack in realistic conditions. A pulsed balanced homodyne detector, which could not be accessed by eavesdroppers, with phase-insensitive efficiency and electronic noise, was considered. Furthermore, a modulation method and an improved relative phase-locking technique with one amplitude modulator and one phase modulator were designed. The relative phase could be locked precisely with a standard deviation of 0.5° and a mean of almost zero. Secret key bit rates of 5.4 kbps and 700 bps were achieved for transmission fiber lengths of 30 and 50 km, respectively. The protocol, which simplified the CV QKD system and reduced the cost, displayed a performance comparable to that of a symmetrical counterpart under realistic conditions. It is expected that the developed protocol can facilitate the practical application of the CV QKD.

  18. Understanding Photography as Applied Chemistry: Using Talbot's Calotype Process to Introduce Chemistry to Design Students

    ERIC Educational Resources Information Center

    Rösch, Esther S.; Helmerdig, Silke

    2017-01-01

    Early photography processes were predestined to combine chemistry and art. William Henry Fox Talbot is one of the early photography pioneers. In 2-3 day workshops, design students without a major background in chemistry are able to define a reproducible protocol for Talbot's gallic acid containing calotype process. With the experimental concept…

  19. Wear-screening and joint simulation studies vs. materials selection and prosthesis design.

    PubMed

    Clarke, I C

    1982-01-01

    Satisfactory friction and wear performance of orthopaedic biomaterials is an essential criterion for both hemiarthroplasty and total joint replacements. This report will chart the clinical historical experience of candidate biomaterials with their wear resistance and compare/contrast these data to experimental test predictions. The latter review will encompass publications dealing with both joint simulators and the more basic friction and wear screening devices. Special consideration will be given to the adequacy of the test protocol, the design of the experimental machines, and the accuracy of the measurement techniques. The discussion will then center on clinical reality vs. experimental adequacy and summarize current developments.

  20. RosettaScripts: a scripting language interface to the Rosetta macromolecular modeling suite.

    PubMed

    Fleishman, Sarel J; Leaver-Fay, Andrew; Corn, Jacob E; Strauch, Eva-Maria; Khare, Sagar D; Koga, Nobuyasu; Ashworth, Justin; Murphy, Paul; Richter, Florian; Lemmon, Gordon; Meiler, Jens; Baker, David

    2011-01-01

    Macromolecular modeling and design are increasingly useful in basic research, biotechnology, and teaching. However, the absence of a user-friendly modeling framework that provides access to a wide range of modeling capabilities is hampering the wider adoption of computational methods by non-experts. RosettaScripts is an XML-like language for specifying modeling tasks in the Rosetta framework. RosettaScripts provides access to protocol-level functionalities, such as rigid-body docking and sequence redesign, and allows fast testing and deployment of complex protocols without need for modifying or recompiling the underlying C++ code. We illustrate these capabilities with RosettaScripts protocols for the stabilization of proteins, the generation of computationally constrained libraries for experimental selection of higher-affinity binding proteins, loop remodeling, small-molecule ligand docking, design of ligand-binding proteins, and specificity redesign in DNA-binding proteins.

  1. Near-optimal protocols in complex nonequilibrium transformations

    DOE PAGES

    Gingrich, Todd R.; Rotskoff, Grant M.; Crooks, Gavin E.; ...

    2016-08-29

    The development of sophisticated experimental means to control nanoscale systems has motivated efforts to design driving protocols that minimize the energy dissipated to the environment. Computational models are a crucial tool in this practical challenge. In this paper, we describe a general method for sampling an ensemble of finite-time, nonequilibrium protocols biased toward a low average dissipation. In addition, we show that this scheme can be carried out very efficiently in several limiting cases. As an application, we sample the ensemble of low-dissipation protocols that invert the magnetization of a 2D Ising model and explore how the diversity of the protocols varies in response to constraints on the average dissipation. In this example, we find that there is a large set of protocols with average dissipation close to the optimal value, which we argue is a general phenomenon.
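
    One common way to formalize an ensemble "biased toward a low average dissipation" is to reweight a reference distribution over protocols by an exponential of the mean dissipated work. The expression below is a generic formalization offered as an assumption, not a transcription of the paper's equations:

```latex
% Hypothetical exponential bias on the average dissipation of a protocol \Lambda.
P_s[\Lambda] \;\propto\; P_0[\Lambda]\,
  \exp\!\bigl(-s\,\langle W_{\mathrm{diss}}[\Lambda]\rangle\bigr),
\qquad s \ge 0,
```

    where P_0 is an unbiased reference ensemble of finite-time protocols and increasing s concentrates the sampled ensemble on progressively lower-dissipation protocols.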

  2. Evaluating optimal therapy robustness by virtual expansion of a sample population, with a case study in cancer immunotherapy

    PubMed Central

    Barish, Syndi; Ochs, Michael F.; Sontag, Eduardo D.; Gevertz, Jana L.

    2017-01-01

    Cancer is a highly heterogeneous disease, exhibiting spatial and temporal variations that pose challenges for designing robust therapies. Here, we propose the VEPART (Virtual Expansion of Populations for Analyzing Robustness of Therapies) technique as a platform that integrates experimental data, mathematical modeling, and statistical analyses for identifying robust optimal treatment protocols. VEPART begins with time course experimental data for a sample population, and a mathematical model fit to aggregate data from that sample population. Using nonparametric statistics, the sample population is amplified and used to create a large number of virtual populations. At the final step of VEPART, robustness is assessed by identifying and analyzing the optimal therapy (perhaps restricted to a set of clinically realizable protocols) across each virtual population. As proof of concept, we have applied the VEPART method to study the robustness of treatment response in a mouse model of melanoma subject to treatment with immunostimulatory oncolytic viruses and dendritic cell vaccines. Our analysis (i) showed that every scheduling variant of the experimentally used treatment protocol is fragile (nonrobust) and (ii) discovered an alternative region of dosing space (lower oncolytic virus dose, higher dendritic cell dose) for which a robust optimal protocol exists. PMID:28716945
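
    The "virtual expansion" step can be pictured as a nonparametric bootstrap. The sketch below illustrates the general idea rather than the authors' code; fit_model and best_protocol are hypothetical placeholders for the model-fitting and protocol-ranking steps.

```python
# Sketch of virtual-population generation by nonparametric bootstrap:
# resample subjects with replacement, refit the model to each resampled
# population, then ask how often each candidate protocol comes out optimal.
# fit_model() and best_protocol() are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(0)

def make_virtual_populations(subject_timecourses, n_virtual=500):
    """subject_timecourses: array of shape (n_subjects, n_timepoints)."""
    n_subjects = subject_timecourses.shape[0]
    for _ in range(n_virtual):
        idx = rng.integers(0, n_subjects, size=n_subjects)  # sample with replacement
        yield subject_timecourses[idx].mean(axis=0)          # aggregate curve

def robustness(subject_timecourses, candidate_protocols, fit_model, best_protocol):
    """Fraction of virtual populations for which each protocol is optimal."""
    counts = {p: 0 for p in candidate_protocols}
    virtuals = list(make_virtual_populations(subject_timecourses))
    for curve in virtuals:
        params = fit_model(curve)                            # placeholder model fit
        counts[best_protocol(params, candidate_protocols)] += 1
    return {p: c / len(virtuals) for p, c in counts.items()}
```

    A protocol whose optimality survives across most virtual populations would be called robust in this picture; one that is optimal only for the original aggregate fit would be fragile.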

  3. FDA approved drugs complexed to their targets: evaluating pose prediction accuracy of docking protocols.

    PubMed

    Bohari, Mohammed H; Sastry, G Narahari

    2012-09-01

    Efficient drug discovery programs can be designed by utilizing existing pools of knowledge from already approved drugs. One way this can be achieved is by repositioning drugs approved for some indications to newer indications. The complex of a drug with its target gives fundamental insight into molecular recognition and a clear understanding of the putative binding site. Five popular docking protocols, Glide, Gold, FlexX, Cdocker and LigandFit, have been evaluated on a dataset of 199 FDA-approved drug-target complexes for their accuracy in predicting the experimental pose. Performance for all the protocols is assessed at default settings, with a root mean square deviation (RMSD) between the experimental ligand pose and the docked pose of less than 2.0 Å as the success criterion in predicting the pose. Glide (38.7%) is found to be the most accurate in the top-ranked pose and Cdocker (58.8%) in the top RMSD pose. Ligand flexibility is a major bottleneck causing failure of docking protocols to correctly predict the pose. The resolution of the crystal structure shows an inverse relationship with the performance of a docking protocol. All the protocols perform optimally when a balanced mix of hydrophilic and hydrophobic interactions, or a dominant hydrophilic interaction, exists. Overall, across 16 different target classes, hydrophobic interactions dominate in the binding site; maximum success is achieved for all the docking protocols in the nuclear hormone receptor class, while performance for the rest of the classes varied by individual protocol.
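
    The 2.0 Å success criterion reduces to a simple computation. The sketch below is generic (not tied to any of the five docking packages) and assumes the experimental and docked poses list the same heavy atoms in the same order.

```python
# Heavy-atom RMSD between an experimental and a docked ligand pose,
# assuming identical atom ordering in both coordinate arrays (angstroms).
import numpy as np

def pose_rmsd(experimental_xyz, docked_xyz):
    """Both arguments: arrays of shape (n_atoms, 3)."""
    diff = np.asarray(experimental_xyz) - np.asarray(docked_xyz)
    return float(np.sqrt((diff ** 2).sum(axis=1).mean()))

def docking_success(experimental_xyz, docked_xyz, cutoff=2.0):
    """Benchmark success criterion: RMSD below 2.0 angstroms."""
    return pose_rmsd(experimental_xyz, docked_xyz) < cutoff

if __name__ == "__main__":
    experimental = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [3.0, 0.0, 0.0]])
    docked = experimental + np.array([0.5, 0.5, 0.0])   # rigid shift of ~0.71 A
    print(pose_rmsd(experimental, docked), docking_success(experimental, docked))
```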

  4. The effect of repeated laser stimuli to ink-marked skin on skin temperature-recommendations for a safe experimental protocol in humans.

    PubMed

    Madden, Victoria J; Catley, Mark J; Grabherr, Luzia; Mazzola, Francesca; Shohag, Mohammad; Moseley, G Lorimer

    2016-01-01

    Background. Nd:YAP laser is widely used to investigate the nociceptive and pain systems, generating perceptual and laser-evoked neurophysiological responses. A major procedural concern for the use of Nd:YAP laser stimuli in experimental research is the risk of skin damage. The absorption of Nd:YAP laser stimuli is greater in darker skin, or in pale skin that has been darkened with ink, prompting some ethics boards to refuse approval to experimenters wishing to track stimulus location by marking the skin with ink. Some research questions, however, require laser stimuli to be delivered at particular locations or within particular zones, a requirement that is very difficult to achieve if marking the skin is not possible. We thoroughly searched the literature for experimental evidence and protocol recommendations for safe delivery of Nd:YAP laser stimuli over marked skin, but found nothing. Methods. We designed an experimental protocol to define safe parameters for the use of Nd:YAP laser stimuli over skin that has been marked with black dots, and used thermal imaging to assess the safety of the procedure at the forearm and the back. Results. Using thermal imaging and repeated laser stimulation to ink-marked skin, we demonstrated that skin temperature did not increase progressively across the course of the experiment, and that the small change in temperature seen at the forearm was reversed during the rest periods between blocks. Furthermore, no participant experienced skin damage due to the procedure. Conclusion. This protocol offers parameters for safe, confident and effective experimentation using repeated Nd:YAP laser on skin marked with ink, thus paving the way for investigations that depend on it.

  5. What is the best?: simple versus visitor restricted rest period.

    PubMed

    Silvius-Byron, Stephanie A; Florimonte, Christine; Panganiban, Elizabeth G; Ulmer, Janice Fitzgerald

    2014-05-01

    The aim of this study was to compare a highly structured planned rest protocol that includes visitor and healthcare personnel restrictions with a simple planned rest period that encourages patients to rest during a designated time without restriction of visitors and healthcare personnel. Many acute care hospitals have begun to restrict visitors and nonessential health team interventions during specific times despite the lack of experimentally designed studies. Using a convenience sample of 52 intermediate care unit patients, a randomized experimental design study compared a highly structured planned rest protocol with restriction of visitors/healthcare personnel to a simple planned rest period without restrictions. The primary outcome variable was the patient's perceived quality of rest after a 2-hour rest period. Intermediate care patients' perception of rest and sleep during a designated rest period was similar whether elaborate rest strategies were used, including visitor and healthcare personnel restrictions, or whether it was only suggested that they rest and the door to their room was closed. The restriction of visitors and healthcare personnel during a 2-hour rest period did not improve the patient's perception of rest or how long it took them to go to sleep.

  6. Engineering platform and experimental protocol for design and evaluation of a neurally-controlled powered transfemoral prosthesis.

    PubMed

    Zhang, Fan; Liu, Ming; Harper, Stephen; Lee, Michael; Huang, He

    2014-07-22

    To enable intuitive operation of powered artificial legs, an interface between user and prosthesis that can recognize the user's movement intent is desired. A novel neural-machine interface (NMI) based on neuromuscular-mechanical fusion developed in our previous study has demonstrated a great potential to accurately identify the intended movement of transfemoral amputees. However, this interface has not yet been integrated with a powered prosthetic leg for true neural control. This study aimed to report (1) a flexible platform to implement and optimize neural control of powered lower limb prosthesis and (2) an experimental setup and protocol to evaluate neural prosthesis control on patients with lower limb amputations. First a platform based on a PC and a visual programming environment were developed to implement the prosthesis control algorithms, including NMI training algorithm, NMI online testing algorithm, and intrinsic control algorithm. To demonstrate the function of this platform, in this study the NMI based on neuromuscular-mechanical fusion was hierarchically integrated with intrinsic control of a prototypical transfemoral prosthesis. One patient with a unilateral transfemoral amputation was recruited to evaluate our implemented neural controller when performing activities, such as standing, level-ground walking, ramp ascent, and ramp descent continuously in the laboratory. A novel experimental setup and protocol were developed in order to test the new prosthesis control safely and efficiently. The presented proof-of-concept platform and experimental setup and protocol could aid the future development and application of neurally-controlled powered artificial legs.

  7. Optimal experimental design for parameter estimation of a cell signaling model.

    PubMed

    Bandara, Samuel; Schlöder, Johannes P; Eils, Roland; Bock, Hans Georg; Meyer, Tobias

    2009-11-01

    Differential equation models that describe the dynamic changes of biochemical signaling states are important tools to understand cellular behavior. An essential task in building such representations is to infer the affinities, rate constants, and other parameters of a model from actual measurement data. However, intuitive measurement protocols often fail to generate data that restrict the range of possible parameter values. Here we utilized a numerical method to iteratively design optimal live-cell fluorescence microscopy experiments in order to reveal pharmacological and kinetic parameters of a phosphatidylinositol 3,4,5-trisphosphate (PIP(3)) second messenger signaling process that is deregulated in many tumors. The experimental approach included the activation of endogenous phosphoinositide 3-kinase (PI3K) by chemically induced recruitment of a regulatory peptide, reversible inhibition of PI3K using a kinase inhibitor, and monitoring of the PI3K-mediated production of PIP(3) lipids using the pleckstrin homology (PH) domain of Akt. We found that an intuitively planned and established experimental protocol did not yield data from which relevant parameters could be inferred. Starting from a set of poorly defined model parameters derived from the intuitively planned experiment, we calculated concentration-time profiles for both the inducing and the inhibitory compound that would minimize the predicted uncertainty of parameter estimates. Two cycles of optimization and experimentation were sufficient to narrowly confine the model parameters, with the mean variance of estimates dropping more than sixty-fold. Thus, optimal experimental design proved to be a powerful strategy to minimize the number of experiments needed to infer biological parameters from a cell signaling assay.
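
    The core idea, choosing the next experiment to minimize the predicted uncertainty of the parameter estimates, can be sketched with a local Fisher-information (D-optimality) criterion. The code below is a generic illustration with a toy exponential-decay model, not the authors' algorithm for the PIP3 signaling system.

```python
# Generic sketch of model-based optimal experimental design: among candidate
# designs, pick the one whose predicted Fisher information matrix implies the
# smallest parameter-estimate covariance (D-optimality, i.i.d. Gaussian noise).
import numpy as np

def d_optimality(sensitivity_matrix, noise_sd=1.0):
    """log-determinant of the Fisher information matrix."""
    S = np.asarray(sensitivity_matrix)           # shape (n_measurements, n_params)
    fim = S.T @ S / noise_sd**2
    sign, logdet = np.linalg.slogdet(fim)
    return logdet if sign > 0 else -np.inf

def best_design(candidate_designs, sensitivities):
    """Return the candidate design that maximizes the D-optimality score."""
    return max(candidate_designs, key=lambda d: d_optimality(sensitivities(d)))

# Toy model y(t) = A * exp(-k t); columns are dy/dA and dy/dk at the sample times.
def toy_sensitivities(sample_times, A=1.0, k=0.5):
    t = np.asarray(sample_times)
    return np.column_stack((np.exp(-k * t), -A * t * np.exp(-k * t)))

designs = {"early": [0.1, 0.2, 0.3], "spread": [0.1, 2.0, 6.0], "late": [5.0, 6.0, 7.0]}
chosen = best_design(designs, lambda name: toy_sensitivities(designs[name]))
print("most informative sampling schedule:", chosen)  # -> "spread" for these numbers
```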

  8. High speed bus technology development

    NASA Astrophysics Data System (ADS)

    Modrow, Marlan B.; Hatfield, Donald W.

    1989-09-01

    The development and demonstration of the High Speed Data Bus system, a 50 million bits per second (Mbps) local data network intended for avionics applications in advanced military aircraft, is described. The Advanced System Avionics (ASA)/PAVE PILLAR program provided the avionics architecture concept and basic requirements. Designs for wire and fiber optic media were produced and hardware demonstrations were performed. An efficient, robust token-passing protocol was developed and partially demonstrated. The requirements specifications, the trade-offs made, and the resulting designs for both a coaxial wire media system and a fiber optics design are examined. Also, the development of a message-oriented media access protocol is described, from requirements definition through analysis, simulation and experimentation. Finally, the testing and demonstrations conducted on the breadboard and brassboard hardware are presented.

  9. Rapid Sampling of Hydrogen Bond Networks for Computational Protein Design.

    PubMed

    Maguire, Jack B; Boyken, Scott E; Baker, David; Kuhlman, Brian

    2018-05-08

    Hydrogen bond networks play a critical role in determining the stability and specificity of biomolecular complexes, and the ability to design such networks is important for engineering novel structures, interactions, and enzymes. One key feature of hydrogen bond networks that makes them difficult to rationally engineer is that they are highly cooperative and are not energetically favorable until the hydrogen bonding potential has been satisfied for all buried polar groups in the network. Existing computational methods for protein design are ill-equipped for creating these highly cooperative networks because they rely on energy functions and sampling strategies that are focused on pairwise interactions. To enable the design of complex hydrogen bond networks, we have developed a new sampling protocol in the molecular modeling program Rosetta that explicitly searches for sets of amino acid mutations that can form self-contained hydrogen bond networks. For a given set of designable residues, the protocol often identifies many alternative sets of mutations/networks, and we show that it can readily be applied to large sets of residues at protein-protein interfaces or in the interior of proteins. The protocol builds on a recently developed method in Rosetta for designing hydrogen bond networks that has been experimentally validated for small symmetric systems but was not extensible to many larger protein structures and complexes. The sampling protocol we describe here not only recapitulates previously validated designs with performance improvements but also yields viable hydrogen bond networks for cases where the previous method fails, such as the design of large, asymmetric interfaces relevant to engineering protein-based therapeutics.

  10. Experimental adaptive quantum tomography of two-qubit states

    NASA Astrophysics Data System (ADS)

    Struchalin, G. I.; Pogorelov, I. A.; Straupe, S. S.; Kravtsov, K. S.; Radchenko, I. V.; Kulik, S. P.

    2016-01-01

    We report an experimental realization of adaptive Bayesian quantum state tomography for two-qubit states. Our implementation is based on the adaptive experimental design strategy proposed in the work by Huszár and Houlsby [F. Huszár and N. M. T. Houlsby, Phys. Rev. A 85, 052120 (2012); 10.1103/PhysRevA.85.052120] and provides an optimal measurement approach in terms of the information gain. We address the practical questions which one faces in any experimental application: the influence of technical noise and the behavior of the tomographic algorithm for an easy-to-implement class of factorized measurements. In an experiment with polarization states of entangled photon pairs, we observe a lower instrumental noise floor and superior reconstruction accuracy for nearly pure states of the adaptive protocol compared to a nonadaptive protocol. At the same time, we show that for the mixed states, the restriction to factorized measurements results in no advantage for adaptive measurements, so general measurements have to be used.
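
    The information-gain criterion can be illustrated with a one-parameter toy analogue rather than the full two-qubit experiment: estimate a single phase phi of the state (|0> + e^{i phi}|1>)/sqrt(2), and at each step choose the measurement setting that maximizes the expected information gain under the current posterior. The measurement model and grid below are illustrative assumptions.

```python
# Toy analogue of adaptive Bayesian tomography: greedy choice of the next
# measurement setting by expected information gain (mutual information
# between the next outcome and the unknown phase under the current posterior).
import numpy as np

phi_grid = np.linspace(0.0, 2 * np.pi, 360, endpoint=False)
posterior = np.full_like(phi_grid, 1.0 / phi_grid.size)     # flat prior over phi

def outcome_prob(theta):
    """P(outcome '+' | phi, theta) for a measurement basis rotated by theta."""
    return 0.5 * (1.0 + np.cos(phi_grid - theta))

def entropy(p):
    p = np.clip(p, 1e-12, 1.0)
    return float(-np.sum(p * np.log(p)))

def expected_info_gain(theta, post):
    p_plus = outcome_prob(theta)
    marg_plus = float(np.clip(np.sum(post * p_plus), 1e-12, 1 - 1e-12))
    post_plus = post * p_plus / marg_plus
    post_minus = post * (1.0 - p_plus) / (1.0 - marg_plus)
    return entropy(post) - (marg_plus * entropy(post_plus)
                            + (1.0 - marg_plus) * entropy(post_minus))

rng = np.random.default_rng(1)
settings = np.linspace(0.0, np.pi, 16)
true_phi = 1.3                                   # unknown to the "experimenter"

for _ in range(10):                              # a few adaptive rounds
    theta = max(settings, key=lambda t: expected_info_gain(t, posterior))
    outcome_plus = rng.random() < 0.5 * (1.0 + np.cos(true_phi - theta))
    likelihood = outcome_prob(theta) if outcome_plus else 1.0 - outcome_prob(theta)
    posterior = posterior * likelihood
    posterior /= posterior.sum()

print("MAP estimate of phi:", phi_grid[np.argmax(posterior)])
```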

  11. Mutation-Based Learning to Improve Student Autonomy and Scientific Inquiry Skills in a Large Genetics Laboratory Course

    PubMed Central

    Wu, Jinlu

    2013-01-01

    Laboratory education can play a vital role in developing a learner's autonomy and scientific inquiry skills. In an innovative, mutation-based learning (MBL) approach, students were instructed to redesign a teacher-designed standard experimental protocol by a “mutation” method in a molecular genetics laboratory course. Students could choose to delete, add, reverse, or replace certain steps of the standard protocol to explore questions of interest to them in a given experimental scenario. They wrote experimental proposals to address their rationales and hypotheses for the “mutations”; conducted experiments in parallel, according to both standard and mutated protocols; and then compared and analyzed results to write individual lab reports. Various autonomy-supportive measures were provided in the entire experimental process. Analyses of student work and feedback suggest that students using the MBL approach 1) spend more time discussing experiments, 2) use more scientific inquiry skills, and 3) find the increased autonomy afforded by MBL more enjoyable than do students following regimented instructions in a conventional “cookbook”-style laboratory. Furthermore, the MBL approach does not incur an obvious increase in labor and financial costs, which makes it feasible for easy adaptation and implementation in a large class. PMID:24006394

  12. How to customize a bona fide psychotherapy for generalized anxiety disorder? A two-arms, patient blinded, ABAB crossed-therapist randomized clinical implementation trial design [IMPLEMENT 2.0].

    PubMed

    Flückiger, Christoph; Wolfer, Christine; Held, Judith; Hilpert, Peter; Rubel, Julian; Allemand, Mathias; Zinbarg, Richard E; Vîslă, Andreea

    2018-04-03

    Bona fide psychotherapy approaches are effective treatments for generalized anxiety disorder (GAD) compared to no-treatment conditions. Treatment manuals and protocols allow a relatively high degree of freedom for the way therapists implement these overall treatment packages, and there is a systematic lack of knowledge on how therapists should customize these treatments. The present study experimentally examines two implementation strategies of customizing a bona fide psychotherapy approach based on a 16-session time-limited cognitive-behavioral therapy (CBT) protocol and their relation to the post-session and ultimate treatment outcomes. This trial contrasts two different implementation strategies of how to customize the in-session structure of a manual-based CBT protocol for GAD. The patients will be randomly assigned to two implementation conditions: (1) a systematic focus on subtle changes lasting from 7 to 20 min at the check-in phase of every psychotherapy session and (2) a state-of-the-art (SOTA) check-in phase lasting several minutes mainly focused on the session goals. Potential therapist effects will be examined based on an ABAB crossed-therapist design. Treatment outcomes will be assessed at the following times: post-session outcomes, treatment outcome at post assessment, and 6- as well as 12-month follow-up. The proposed randomized clinical implementation trial addresses the clinically relevant question of how to customize a bona fide psychotherapy protocol, experimentally contrasting two implementation strategies. Through the development and testing of the proposed implementation design, this trial has the potential to inform therapists about efficacious implementation strategies of how to customize a manual-based treatment protocol with respect to the timing of the in-session structure. This trial was registered at ClinicalTrials.gov (NCT03079336) on March 14, 2017.

  13. Optimal Experiment Design for Thermal Characterization of Functionally Graded Materials

    NASA Technical Reports Server (NTRS)

    Cole, Kevin D.

    2003-01-01

    The purpose of the project was to investigate methods to accurately verify that designed materials meet thermal specifications. The project involved heat transfer calculations and optimization studies, and no laboratory experiments were performed. One part of the research involved study of materials in which conduction heat transfer predominates. Results include techniques to choose among several experimental designs, and protocols for determining the optimum experimental conditions for determination of thermal properties. Metal foam materials were also studied in which both conduction and radiation heat transfer are present. Results of this work include procedures to optimize the design of experiments to accurately measure both conductive and radiative thermal properties. Detailed results in the form of three journal papers have been appended to this report.

  14. Branched Nerve Allografts to Improve Outcomes in Facial Composite Tissue Transplantation

    DTIC Science & Technology

    2017-12-01

    ...(Ethicon, Inc., Somerville, N.J.). Postoperatively, animals were recovered per standard protocol in the animal care facility. Experimental Design ... a human xenograft with or without oral Tacrolimus. Electrophysiologic assessments were performed pre-operatively and at the study endpoint (24 weeks...

  15. Design and development of compact monitoring system for disaster remote health centres.

    PubMed

    Santhi, S; Sadasivam, G S

    2015-02-01

    The objective is to enhance rapid communication between the patient and the doctor through a newly proposed routing protocol at the mobile node; the proposed model is applied to a telemedicine application during disaster recovery management. In this paper, the Energy Efficient Link Stability Routing Protocol (EELSRP) has been developed in both simulation and a real-time implementation. The framework is designed for the immediate treatment of affected persons in remote areas, especially at the time of a disaster when no hospital is nearby. In case of disasters, there might be an outbreak of infectious diseases. In such cases, the patient's medical record is also transferred by the field operator from the disaster site to the hospital to facilitate identification of the disease-causing agent and prescription of the necessary medication. The heterogeneous networking framework provides reliable, energy-efficient and speedy communication between the patient and the doctor using the proposed routing protocol at the mobile node. The performance of the simulation and real-time versions of EELSRP has been analyzed, and experimental results demonstrate the efficiency of the real-time version. The packet delivery ratio and throughput of the real-time version of EELSRP are increased by 3% and 10%, respectively, compared to the simulated version, while end-to-end delay and energy consumption are reduced by 10% and 2%, respectively.

  16. A Mobile Satellite Experiment (MSAT-X) network definition

    NASA Technical Reports Server (NTRS)

    Wang, Charles C.; Yan, Tsun-Yee

    1990-01-01

    The network architecture development of the Mobile Satellite Experiment (MSAT-X) project for the past few years is described. The results and findings of the network research activities carried out under the MSAT-X project are summarized. A framework is presented upon which the Mobile Satellite Systems (MSSs) operator can design a commercial network. A sample network configuration and its capability are also included under the projected scenario. The Communication Interconnection aspect of the MSAT-X network is discussed. In the MSAT-X network structure two basic protocols are presented: the channel access protocol, and the link connection protocol. The error-control techniques used in the MSAT-X project and the packet structure are also discussed. A description of two testbeds developed for experimentally simulating the channel access protocol and link control protocol, respectively, is presented. A sample network configuration and some future network activities of the MSAT-X project are also presented.

  17. Refining animal models in fracture research: seeking consensus in optimising both animal welfare and scientific validity for appropriate biomedical use.

    PubMed

    Auer, Jorg A; Goodship, Allen; Arnoczky, Steven; Pearce, Simon; Price, Jill; Claes, Lutz; von Rechenberg, Brigitte; Hofmann-Amtenbrinck, Margarethe; Schneider, Erich; Müller-Terpitz, R; Thiele, F; Rippe, Klaus-Peter; Grainger, David W

    2007-08-01

    In an attempt to establish some consensus on the proper use and design of experimental animal models in musculoskeletal research, AOVET (the veterinary specialty group of the AO Foundation) in concert with the AO Research Institute (ARI), and the European Academy for the Study of Scientific and Technological Advance, convened a group of musculoskeletal researchers, veterinarians, legal experts, and ethicists to discuss, in a frank and open forum, the use of animals in musculoskeletal research. The group narrowed the field to fracture research. The consensus opinion resulting from this workshop can be summarized as follows: Anaesthesia and pain management protocols for research animals should follow standard protocols applied in clinical work for the species involved. This will improve morbidity and mortality outcomes. A database should be established to facilitate selection of anaesthesia and pain management protocols for specific experimental surgical procedures and adopted as an International Standard (IS) according to animal species selected. A list of 10 golden rules and requirements for conduction of animal experiments in musculoskeletal research was drawn up comprising 1) Intelligent study designs to receive appropriate answers; 2) Minimal complication rates (5 to max. 10%); 3) Defined end-points for both welfare and scientific outputs analogous to quality assessment (QA) audit of protocols in GLP studies; 4) Sufficient details for materials and methods applied; 5) Potentially confounding variables (genetic background, seasonal, hormonal, size, histological, and biomechanical differences); 6) Post-operative management with emphasis on analgesia and follow-up examinations; 7) Study protocols to satisfy criteria established for a "justified animal study"; 8) Surgical expertise to conduct surgery on animals; 9) Pilot studies as a critical part of model validation and powering of the definitive study design; 10) Criteria for funding agencies to include requirements related to animal experiments as part of the overall scientific proposal review protocols. Such agencies are also encouraged to seriously consider and adopt the recommendations described here when awarding funds for specific projects. Specific new requirements and mandates related both to improving the welfare and scientific rigour of animal-based research models are urgently needed as part of international harmonization of standards.

  18. Impact of Process Protocol Design on Virtual Team Effectiveness

    ERIC Educational Resources Information Center

    Cordes, Christofer Sean

    2013-01-01

    This dissertation examined the influence of action process dimensions on team decision performance, and attitudes toward team work environment and procedures given different degrees of collaborative technology affordance. Process models were used to provide context for understanding team behavior in the experimental task, and clarify understanding…

  19. Physics-based enzyme design: predicting binding affinity and catalytic activity.

    PubMed

    Sirin, Sarah; Pearlman, David A; Sherman, Woody

    2014-12-01

    Computational enzyme design is an emerging field that has yielded promising success stories, but where numerous challenges remain. Accurate methods to rapidly evaluate possible enzyme design variants could provide significant value when combined with experimental efforts by reducing the number of variants needed to be synthesized and speeding the time to reach the desired endpoint of the design. To that end, extending our computational methods to model the fundamental physical-chemical principles that regulate activity in a protocol that is automated and accessible to a broad population of enzyme design researchers is essential. Here, we apply a physics-based implicit solvent MM-GBSA scoring approach to enzyme design and benchmark the computational predictions against experimentally determined activities. Specifically, we evaluate the ability of MM-GBSA to predict changes in affinity for a steroid binder protein, catalytic turnover for a Kemp eliminase, and catalytic activity for α-Gliadin peptidase variants. Using the enzyme design framework developed here, we accurately rank the most experimentally active enzyme variants, suggesting that this approach could provide enrichment of active variants in real-world enzyme design applications. © 2014 Wiley Periodicals, Inc.
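
    The ability to "accurately rank the most experimentally active enzyme variants" is typically quantified with a rank correlation. The snippet below shows the computation with SciPy on illustrative placeholder numbers, not data from the study.

```python
# Rank correlation between computed scores and measured activities.
# The numbers are illustrative placeholders, not data from the paper.
from scipy.stats import spearmanr

predicted_scores = [-45.2, -38.7, -52.1, -30.4, -47.9]   # more negative = better
experimental_activity = [12.0, 6.5, 18.3, 2.1, 14.7]     # arbitrary activity units

# Negate the scores so that "better predicted" and "higher activity" point the
# same way before correlating the ranks.
rho, p_value = spearmanr([-s for s in predicted_scores], experimental_activity)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```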

  20. REFOLDdb: a new and sustainable gateway to experimental protocols for protein refolding.

    PubMed

    Mizutani, Hisashi; Sugawara, Hideaki; Buckle, Ashley M; Sangawa, Takeshi; Miyazono, Ken-Ichi; Ohtsuka, Jun; Nagata, Koji; Shojima, Tomoki; Nosaki, Shohei; Xu, Yuqun; Wang, Delong; Hu, Xiao; Tanokura, Masaru; Yura, Kei

    2017-04-24

    More than 7000 papers related to "protein refolding" have been published to date, with approximately 300 reports each year during the last decade. Whilst some of these papers provide experimental protocols for protein refolding, a survey in the structural life science communities showed a necessity for a comprehensive database for refolding techniques. We therefore have developed a new resource, "REFOLDdb", that collects refolding techniques into a single, searchable repository to help researchers develop refolding protocols for proteins of interest. We based our resource on the existing REFOLD database, which has not been updated since 2009. We redesigned the data format to be more concise, allowing consistent representations among data entries compared with the original REFOLD database. The remodeled data architecture enhances the search efficiency and improves the sustainability of the database. After an exhaustive literature search we added experimental refolding protocols from reports published from 2009 to early 2017. In addition to this new data, we fully converted and integrated existing REFOLD data into our new resource. REFOLDdb contains 1877 entries as of March 17th, 2017, and is freely available at http://p4d-info.nig.ac.jp/refolddb/. REFOLDdb is a unique database for the life sciences research community, providing annotated information for designing new refolding protocols and customizing existing methodologies. We envisage that this resource will find wide utility across broad disciplines that rely on the production of pure, active, recombinant proteins. Furthermore, the database also provides a useful overview of the recent trends and statistics in refolding technology development.

  1. Quantum work fluctuations in connection with the Jarzynski equality.

    PubMed

    Jaramillo, Juan D; Deng, Jiawen; Gong, Jiangbin

    2017-10-01

    A result of great theoretical and experimental interest, the Jarzynski equality predicts a free energy change ΔF of a system at inverse temperature β from an ensemble average of nonequilibrium exponential work, i.e., 〈e^{-βW}〉=e^{-βΔF}. The number of experimental work values needed to reach a given accuracy of ΔF is determined by the variance of e^{-βW}, denoted var(e^{-βW}). We discover in this work that var(e^{-βW}) in both harmonic and anharmonic Hamiltonian systems can systematically diverge in nonadiabatic work protocols, even when the adiabatic protocols do not suffer from such divergence. This divergence may be regarded as a type of dynamically induced phase transition in work fluctuations. For a quantum harmonic oscillator with time-dependent trapping frequency as a working example, any nonadiabatic work protocol is found to yield a diverging var(e^{-βW}) at sufficiently low temperatures, markedly different from the classical behavior. The divergence of var(e^{-βW}) indicates the too-far-from-equilibrium nature of a nonadiabatic work protocol and makes it compulsory to apply designed control fields to suppress the quantum work fluctuations in order to test the Jarzynski equality.
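
    The experimental relevance of var(e^{-βW}) follows from the usual sampling argument (stated here as standard statistics, not a formula quoted from the paper): for N independent work measurements,

```latex
% Jarzynski estimator from N work samples and its standard error.
\widehat{\langle e^{-\beta W}\rangle}_N \;=\; \frac{1}{N}\sum_{i=1}^{N} e^{-\beta W_i},
\qquad
\mathrm{SE}\!\left[\widehat{\langle e^{-\beta W}\rangle}_N\right]
  \;=\; \sqrt{\frac{\operatorname{var}\!\left(e^{-\beta W}\right)}{N}},
```

    so the number of measurements required for a target accuracy grows in proportion to var(e^{-βW}), and a diverging variance makes the free-energy estimate impractical without variance-suppressing control of the protocol.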

  2. An Umeclidinium membrane sensor; Two-step optimization strategy for improved responses.

    PubMed

    Yehia, Ali M; Monir, Hany H

    2017-09-01

    In the scientific context of membrane sensors and improved experimentation, we devised an experimentally designed protocol for sensor optimization. A two-step strategy was implemented for the analysis of Umeclidinium bromide (UMEC), a novel quinuclidine-based muscarinic antagonist used for maintenance treatment of symptoms associated with chronic obstructive pulmonary disease. First, membrane components were screened for the ideal ion exchanger, ionophore and plasticizer using three categorical factors at three levels in a Taguchi design. Second, experimentally designed optimization was followed in order to tune the sensor for the finest responses. Twelve experiments were carried out in random order in a continuous-factor design. Nernstian response, detection limit and selectivity were assigned as responses in these designs. The optimized membrane sensor contained tetrakis[3,5-bis(trifluoromethyl)phenyl]borate (0.44 wt%) and calix[6]arene (0.43 wt%) in 50.00% PVC plasticized with 49.13 wt% 2-nitrophenyl octyl ether. This sensor, along with an optimum concentration of inner filling solution (2×10⁻⁴ mol L⁻¹ UMEC) and 2 h of soaking time, attained the design objectives. The Nernstian response approached 59.7 mV/decade and the detection limit decreased by about two orders of magnitude (8×10⁻⁸ mol L⁻¹) through this optimization protocol. The proposed sensor was validated for UMEC determination in its linear range (3.16×10⁻⁷ to 1×10⁻³ mol L⁻¹) and challenged for selective discrimination of other congeners and inorganic cations. Results of INCRUSE ELLIPTA® inhalation powder analyses obtained from the proposed sensor and the manufacturer's UPLC method were statistically compared. Moreover, the proposed sensor was successfully used for the determination of UMEC in plasma samples. Copyright © 2017 Elsevier B.V. All rights reserved.
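
    For context, the reported 59.7 mV/decade response can be compared with the ideal Nernst slope for a monovalent cation at 25 °C:

```latex
% Ideal Nernstian slope for a monovalent ion (z = 1) at T = 298.15 K.
S = \frac{2.303\,R\,T}{z\,F}
  = \frac{2.303 \times 8.314\ \mathrm{J\,mol^{-1}\,K^{-1}} \times 298.15\ \mathrm{K}}
         {1 \times 96485\ \mathrm{C\,mol^{-1}}}
  \approx 59.2\ \mathrm{mV/decade},
```

    so the optimized sensor's slope is close to the theoretical value.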

  3. Update of the ATTRACT force field for the prediction of protein-protein binding affinity.

    PubMed

    Chéron, Jean-Baptiste; Zacharias, Martin; Antonczak, Serge; Fiorucci, Sébastien

    2017-06-05

    Determining protein-protein interactions is still a major challenge for molecular biology. Docking protocols have come of age in predicting the structure of macromolecular complexes. However, they still lack the accuracy to estimate binding affinities, the thermodynamic quantity that drives the formation of a complex. Here, an updated version of the protein-protein ATTRACT force field aiming at predicting experimental binding affinities is reported. It has been designed on a dataset of 218 protein-protein complexes. The correlation between the experimental and predicted affinities reaches 0.6, outperforming most of the available protocols. Focusing on subsets of rigid and flexible complexes, the performance rises to 0.76 and 0.69, respectively. © 2017 Wiley Periodicals, Inc.

  4. Tactile Sensing Reflexes for Advanced Prosthetic Hands

    DTIC Science & Technology

    2016-10-01

    ...mos.) 100% • Design and build "mechanical egg" test equipment (1-2 mos.) (Abandoned, alternate approach developed) • Develop experimental protocol ... of a "mechanical egg" or force-measuring object was rejected in favor of this common cracker. • A comprehensive bench-top study was...

  5. Detection of VEGF-A(xxx)b isoforms in human tissues.

    PubMed

    Bates, David O; Mavrou, Athina; Qiu, Yan; Carter, James G; Hamdollah-Zadeh, Maryam; Barratt, Shaney; Gammons, Melissa V; Millar, Ann B; Salmon, Andrew H J; Oltean, Sebastian; Harper, Steven J

    2013-01-01

    Vascular Endothelial Growth Factor-A (VEGF-A) can be generated as multiple isoforms by alternative splicing. Two families of isoforms have been described in humans, pro-angiogenic isoforms typified by VEGF-A165a, and anti-angiogenic isoforms typified by VEGF-A165b. The practical determination of expression levels of alternative isoforms of the same gene may be complicated by experimental protocols that favour one isoform over another, and the use of specific positive and negative controls is essential for the interpretation of findings on expression of the isoforms. Here we address some of the difficulties in experimental design when investigating alternative splicing of VEGF isoforms, and discuss the use of appropriate control paradigms. We demonstrate why use of specific control experiments can prevent assumptions that VEGF-A165b is not present, when in fact it is. We reiterate, and confirm previously published experimental design protocols that demonstrate the importance of using positive controls. These include using known target sequences to show that the experimental conditions are suitable for PCR amplification of VEGF-A165b mRNA for both q-PCR and RT-PCR and to ensure that mispriming does not occur. We also provide evidence that demonstrates that detection of VEGF-A165b protein in mice needs to be tightly controlled to prevent detection of mouse IgG by a secondary antibody. We also show that human VEGF165b protein can be immunoprecipitated from cultured human cells and that immunoprecipitating VEGF-A results in protein that is detected by VEGF-A165b antibody. These findings support the conclusion that more information on the biology of VEGF-A165b isoforms is required, and confirm the importance of the experimental design in such investigations, including the use of specific positive and negative controls.

  6. Integrated design, execution, and analysis of arrayed and pooled CRISPR genome-editing experiments.

    PubMed

    Canver, Matthew C; Haeussler, Maximilian; Bauer, Daniel E; Orkin, Stuart H; Sanjana, Neville E; Shalem, Ophir; Yuan, Guo-Cheng; Zhang, Feng; Concordet, Jean-Paul; Pinello, Luca

    2018-05-01

    CRISPR (clustered regularly interspaced short palindromic repeats) genome-editing experiments offer enormous potential for the evaluation of genomic loci using arrayed single guide RNAs (sgRNAs) or pooled sgRNA libraries. Numerous computational tools are available to help design sgRNAs with optimal on-target efficiency and minimal off-target potential. In addition, computational tools have been developed to analyze deep-sequencing data resulting from genome-editing experiments. However, these tools are typically developed in isolation and oftentimes are not readily translatable into laboratory-based experiments. Here, we present a protocol that describes in detail both the computational and benchtop implementation of an arrayed and/or pooled CRISPR genome-editing experiment. This protocol provides instructions for sgRNA design with CRISPOR (computational tool for the design, evaluation, and cloning of sgRNA sequences), experimental implementation, and analysis of the resulting high-throughput sequencing data with CRISPResso (computational tool for analysis of genome-editing outcomes from deep-sequencing data). This protocol allows for design and execution of arrayed and pooled CRISPR experiments in 4-5 weeks by non-experts, as well as computational data analysis that can be performed in 1-2 d by both computational and noncomputational biologists alike using web-based and/or command-line versions.

  7. Fasting: a major limitation for resistance exercise training effects in rodents

    PubMed Central

    das Neves, W.; de Oliveira, L.F.; da Silva, R.P.; Alves, C.R.R.; Lancha, A.H.

    2017-01-01

    Protocols that mimic resistance exercise training (RET) in rodents present several limitations, one of them being the electrical stimulus, which is beyond the physiological context observed in humans. Recently, our group developed a conditioning system device that does not use electric shock to stimulate rats, but includes fasting periods before each RET session. The current study was designed to test whether cumulative fasting periods have some influence on skeletal muscle mass and function. Three sets of male Wistar rats were used in the current study. The first set of rats was submitted to a RET protocol without food restriction. However, rats were not able to perform exercise properly. The second and third sets were then randomly assigned into three experimental groups: 1) untrained control rats, 2) untrained rats submitted to fasting periods, and 3) rats submitted to RET including fasting periods before each RET session. While the second set of rats performed a short RET protocol (i.e., an adaptation protocol for 3 weeks), the third set of rats performed a longer RET protocol including overload (i.e., 8 weeks). After the short-term protocol, cumulative fasting periods promoted loss of weight (P<0.001). After the longer RET protocol, no difference was observed for body mass, extensor digitorum longus (EDL) morphology or skeletal muscle function (P>0.05 for all). Despite no effects on EDL mass, soleus muscle displayed significant atrophy in the fasting experimental groups (P<0.01). Altogether, these data indicate that fasting is a major limitation for RET in rats. PMID:29185588

  8. Fasting: a major limitation for resistance exercise training effects in rodents.

    PubMed

    das Neves, W; de Oliveira, L F; da Silva, R P; Alves, C R R; Lancha, A H

    2017-11-17

    Protocols that mimic resistance exercise training (RET) in rodents present several limitations, one of them being the electrical stimulus, which is beyond the physiological context observed in humans. Recently, our group developed a conditioning system device that does not use electric shock to stimulate rats, but includes fasting periods before each RET session. The current study was designed to test whether cumulative fasting periods have some influence on skeletal muscle mass and function. Three sets of male Wistar rats were used in the current study. The first set of rats was submitted to a RET protocol without food restriction. However, rats were not able to perform exercise properly. The second and third sets were then randomly assigned into three experimental groups: 1) untrained control rats, 2) untrained rats submitted to fasting periods, and 3) rats submitted to RET including fasting periods before each RET session. While the second set of rats performed a short RET protocol (i.e., an adaptation protocol for 3 weeks), the third set of rats performed a longer RET protocol including overload (i.e., 8 weeks). After the short-term protocol, cumulative fasting periods promoted loss of weight (P<0.001). After the longer RET protocol, no difference was observed for body mass, extensor digitorum longus (EDL) morphology or skeletal muscle function (P>0.05 for all). Despite no effects on EDL mass, soleus muscle displayed significant atrophy in the fasting experimental groups (P<0.01). Altogether, these data indicate that fasting is a major limitation for RET in rats.

  9. Design and interpretation of cell trajectory assays

    PubMed Central

    Bowden, Lucie G.; Simpson, Matthew J.; Baker, Ruth E.

    2013-01-01

    Cell trajectory data are often reported in the experimental cell biology literature to distinguish between different types of cell migration. Unfortunately, there is no accepted protocol for designing or interpreting such experiments and this makes it difficult to quantitatively compare different published datasets and to understand how changes in experimental design influence our ability to interpret different experiments. Here, we use an individual-based mathematical model to simulate the key features of a cell trajectory experiment. This shows that our ability to correctly interpret trajectory data is extremely sensitive to the geometry and timing of the experiment, the degree of motility bias and the number of experimental replicates. We show that cell trajectory experiments produce data that are most reliable when the experiment is performed in a quasi-one-dimensional geometry with a large number of identically prepared experiments conducted over a relatively short time-interval rather than a few trajectories recorded over particularly long time-intervals. PMID:23985736
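
    The individual-based model behind such simulations is essentially a (possibly biased) random walk. The sketch below uses illustrative parameters, not the authors' calibrated model, to show how many short replicates and one long trajectory can be compared for the same total observation time.

```python
# Minimal individual-based simulation of cell trajectories as a biased 2D
# random walk. Step length, bias strength and durations are illustrative only.
import numpy as np

rng = np.random.default_rng(42)

def simulate_trajectory(n_steps, step_len=1.0, bias=0.0, bias_angle=0.0):
    """bias in [0, 1]: 0 = unbiased walk, 1 = every step along bias_angle."""
    angles = np.where(rng.random(n_steps) < bias,
                      bias_angle,
                      rng.uniform(0.0, 2 * np.pi, n_steps))
    steps = step_len * np.column_stack((np.cos(angles), np.sin(angles)))
    return np.vstack(([0.0, 0.0], np.cumsum(steps, axis=0)))  # (n_steps + 1, 2)

# Same total observation time split two ways: 50 short replicates vs. 1 long run.
short = [simulate_trajectory(20, bias=0.2)[-1] / 20 for _ in range(50)]
long_run = simulate_trajectory(1000, bias=0.2)[-1] / 1000
print("mean drift per step, short replicates:", np.mean(short, axis=0))
print("drift per step, single long trajectory:", long_run)
```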

  10. Efficient generation of monoclonal antibodies from single rhesus macaque antibody secreting cells.

    PubMed

    Meng, Weixu; Li, Leike; Xiong, Wei; Fan, Xuejun; Deng, Hui; Bett, Andrew J; Chen, Zhifeng; Tang, Aimin; Cox, Kara S; Joyce, Joseph G; Freed, Daniel C; Thoryk, Elizabeth; Fu, Tong-Ming; Casimiro, Danilo R; Zhang, Ningyan; A Vora, Kalpit; An, Zhiqiang

    2015-01-01

    Nonhuman primates (NHPs) are used as a preclinical model for vaccine development, and the antibody profiles to experimental vaccines in NHPs can provide critical information for both vaccine design and translation to clinical efficacy. However, an efficient protocol for generating monoclonal antibodies from single antibody secreting cells of NHPs is currently lacking. In this study we established a robust protocol for cloning immunoglobulin (IG) variable domain genes from single rhesus macaque (Macaca mulatta) antibody secreting cells. A sorting strategy was developed using a panel of molecular markers (CD3, CD19, CD20, surface IgG, intracellular IgG, CD27, Ki67 and CD38) to identify the kinetics of B cell response after vaccination. Specific primers for the rhesus macaque IG genes were designed and validated using cDNA isolated from macaque peripheral blood mononuclear cells. Cloning efficiency was averaged at 90% for variable heavy (VH) and light (VL) domains, and 78.5% of the clones (n = 335) were matched VH and VL pairs. Sequence analysis revealed that diverse IGHV subgroups (for VH) and IGKV and IGLV subgroups (for VL) were represented in the cloned antibodies. The protocol was tested in a study using an experimental dengue vaccine candidate. About 26.6% of the monoclonal antibodies cloned from the vaccinated rhesus macaques react with the dengue vaccine antigens. These results validate the protocol for cloning monoclonal antibodies in response to vaccination from single macaque antibody secreting cells, which have general applicability for determining monoclonal antibody profiles in response to other immunogens or vaccine studies of interest in NHPs.

  11. Building a generalized distributed system model

    NASA Technical Reports Server (NTRS)

    Mukkamala, R.

    1992-01-01

    The key elements in the second year (1991-92) of our project are: (1) implementation of the distributed system prototype; (2) successful passing of the candidacy examination and a PhD proposal acceptance by the funded student; (3) design of storage efficient schemes for replicated distributed systems; and (4) modeling of gracefully degrading reliable computing systems. In the third year of the project (1992-93), we propose to: (1) complete the testing of the prototype; (2) enhance the functionality of the modules by enabling the experimentation with more complex protocols; (3) use the prototype to verify the theoretically predicted performance of locking protocols, etc.; and (4) work on issues related to real-time distributed systems. This should result in efficient protocols for these systems.

  12. Web-accessible molecular modeling with Rosetta: The Rosetta Online Server that Includes Everyone (ROSIE).

    PubMed

    Moretti, Rocco; Lyskov, Sergey; Das, Rhiju; Meiler, Jens; Gray, Jeffrey J

    2018-01-01

    The Rosetta molecular modeling software package provides a large number of experimentally validated tools for modeling and designing proteins, nucleic acids, and other biopolymers, with new protocols being added continually. While freely available to academic users, external usage is limited by the need for expertise in the Unix command line environment. To make Rosetta protocols available to a wider audience, we previously created a web server called Rosetta Online Server that Includes Everyone (ROSIE), which provides a common environment for hosting web-accessible Rosetta protocols. Here we describe a simplification of the ROSIE protocol specification format, one that permits easier implementation of Rosetta protocols. Whereas the previous format required creating multiple separate files in different locations, the new format allows specification of the protocol in a single file. This new, simplified protocol specification has more than doubled the number of Rosetta protocols available under ROSIE. These new applications include pKa determination, lipid accessibility calculation, ribonucleic acid redesign, protein-protein docking, protein-small molecule docking, symmetric docking, antibody docking, cyclic toxin docking, critical binding peptide determination, and mapping small molecule binding sites. ROSIE is freely available to academic users at http://rosie.rosettacommons.org. © 2017 The Protein Society.

  13. Design and assessment of engineered CRISPR-Cpf1 and its use for genome editing.

    PubMed

    Li, Bin; Zeng, Chunxi; Dong, Yizhou

    2018-05-01

    Cpf1, a CRISPR endonuclease discovered in Prevotella and Francisella 1 bacteria, offers an alternative platform for CRISPR-based genome editing beyond the commonly used CRISPR-Cas9 system originally discovered in Streptococcus pyogenes. This protocol enables the design of engineered CRISPR-Cpf1 components, both CRISPR RNAs (crRNAs) to guide the endonuclease and Cpf1 mRNAs to express the endonuclease protein, and provides experimental procedures for effective genome editing using this system. We also describe quantification of genome-editing activity and off-target effects of the engineered CRISPR-Cpf1 in human cell lines using both T7 endonuclease I (T7E1) assay and targeted deep sequencing. This protocol enables rapid construction and identification of engineered crRNAs and Cpf1 mRNAs to enhance genome-editing efficiency using the CRISPR-Cpf1 system, as well as assessment of target specificity within 2 months. This protocol may also be appropriate for fine-tuning other types of CRISPR systems.
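
    As a rough illustration of the first design step named in the record (choosing crRNA target sites), the sketch below scans a sequence for the T-rich Cas12a/Cpf1 PAM (TTTV, located 5' of the protospacer) and extracts candidate protospacers. It is a toy scan under those generic rules only; the published protocol also covers off-target assessment, mRNA construction, and activity quantification, none of which is modelled here.

```python
import re

PAM = re.compile(r"TTT[ACG]")            # Cas12a/Cpf1 TTTV PAM, located 5' of the protospacer
COMP = str.maketrans("ACGT", "TGCA")

def revcomp(seq):
    return seq.translate(COMP)[::-1]

def candidate_spacers(seq, spacer_len=23):
    """Yield (strand, pam_position, protospacer); '-' hits are reported in
    reverse-complement coordinates."""
    seq = seq.upper()
    for strand, s in (("+", seq), ("-", revcomp(seq))):
        for m in PAM.finditer(s):
            proto = s[m.end():m.end() + spacer_len]
            if len(proto) == spacer_len:
                yield strand, m.start(), proto

for hit in candidate_spacers("ACGTTTTACCTGGAAGTCCATGGACTTCAAGGTACGTACGTACGT"):
    print(hit)
```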

  14. Real-time intravital microscopy of individual nanoparticle dynamics in liver and tumors of live mice

    PubMed Central

    van de Ven, Anne L; Kim, Pilhan; Ferrari, Mauro; Yun, Seok Hyun

    2013-01-01

    Intravital microscopy is emerging as an important experimental tool for the research and development of multi-functional therapeutic nanoconstructs. The direct visualization of nanoparticle dynamics within live animals provides invaluable insights into the mechanisms that regulate nanotherapeutics transport and cell-particle interactions. Here we present a protocol to image the dynamics of nanoparticles within the liver and tumors of live mice immediately following systemic injection using a high-speed (30-400 fps) confocal or multi-photon laser-scanning fluorescence microscope. Techniques for quantifying the real-time accumulation and cellular association of individual particles with a size ranging from several tens of nanometers to micrometers are described, as well as an experimental strategy for labeling Kupffer cells in the liver in vivo. Experimental design considerations and controls are provided, as well as minimum equipment requirements. The entire protocol takes approximately 4-8 hours and yields quantitative information. These techniques can serve to study a wide range of kinetic parameters that drive nanotherapeutics delivery, uptake, and treatment response. PMID:25383179

  15. An Enhanced Reservation-Based MAC Protocol for IEEE 802.15.4 Networks

    PubMed Central

    Afonso, José A.; Silva, Helder D.; Macedo, Pedro; Rocha, Luis A.

    2011-01-01

    The IEEE 802.15.4 Medium Access Control (MAC) protocol is an enabling standard for wireless sensor networks. In order to support applications requiring dedicated bandwidth or bounded delay, it provides a reservation-based scheme named Guaranteed Time Slot (GTS). However, the GTS scheme presents some drawbacks, such as inefficient bandwidth utilization and support to a maximum of only seven devices. This paper presents eLPRT (enhanced Low Power Real Time), a new reservation-based MAC protocol that introduces several performance enhancing features in comparison to the GTS scheme. This MAC protocol builds on top of LPRT (Low Power Real Time) and includes various mechanisms designed to increase data transmission reliability against channel errors, improve bandwidth utilization and increase the number of supported devices. A motion capture system based on inertial and magnetic sensors has been used to validate the protocol. The effectiveness of the performance enhancements introduced by each of the new features is demonstrated through the provision of both simulation and experimental results. PMID:22163826
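
    The GTS limitation that motivates eLPRT can be made concrete with a toy allocator. The sketch below (an illustration, not the eLPRT algorithm) models an 802.15.4-style contention-free period in which at most seven guaranteed time slots can be granted, so an eighth device is refused even when free slots remain.

```python
class GTSAllocator:
    """Toy model of 802.15.4 GTS: at most 7 guaranteed-time-slot allocations
    drawn from the superframe's contention-free period (slot count is illustrative)."""
    MAX_GTS = 7

    def __init__(self, cfp_slots=9):
        self.cfp_slots = cfp_slots
        self.alloc = {}               # device id -> number of slots granted

    def request(self, device, slots):
        if len(self.alloc) >= self.MAX_GTS:
            return False              # GTS descriptor limit reached
        if sum(self.alloc.values()) + slots > self.cfp_slots:
            return False              # not enough contention-free slots left
        self.alloc[device] = slots
        return True

gts = GTSAllocator()
print([gts.request(f"node{i}", 1) for i in range(9)])   # 8th and 9th requests fail
```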

  16. Effect of two complex training protocols of back squats in blood indicators of muscular damage in military athletes

    PubMed Central

    Ojeda, Álvaro Huerta; Ríos, Luis Chirosa; Barrilao, Rafael Guisado; Ríos, Ignacio Chirosa; Serrano, Pablo Cáceres

    2016-01-01

    [Purpose] The aim of this study was to determine the variations in blood indicators of muscular damage after application of two complex training programs for back squats. [Subjects and Methods] Seven military athletes were the subjects of this study. The study had a quasi-experimental cross-over intra-subject design. Two complex training protocols were applied, and the variables measured were cortisol, metabolic creatine kinase, and total creatine kinase. Student's t-test was used for the statistical analysis. [Results] Twenty-four hours post-effort, both protocols showed a significant decrease in cortisol level; however, metabolic creatine kinase and total creatine kinase levels showed a significant increase. [Conclusion] Both protocols lowered the main blood indicator of muscular damage considered (cortisol), indicating that the training load did not generate significant muscular damage in the 24-hour post-exercise period. PMID:27313356

  17. Effect of two complex training protocols of back squats in blood indicators of muscular damage in military athletes.

    PubMed

    Ojeda, Álvaro Huerta; Ríos, Luis Chirosa; Barrilao, Rafael Guisado; Ríos, Ignacio Chirosa; Serrano, Pablo Cáceres

    2016-05-01

    [Purpose] The aim of this study was to determine the variations in blood indicators of muscular damage after application of two complex training programs for back squats. [Subjects and Methods] Seven military athletes were the subjects of this study. The study had a quasi-experimental cross-over intra-subject design. Two complex training protocols were applied, and the variables measured were cortisol, metabolic creatine kinase, and total creatine kinase. Student's t-test was used for the statistical analysis. [Results] Twenty-four hours post-effort, both protocols showed a significant decrease in cortisol level; however, metabolic creatine kinase and total creatine kinase levels showed a significant increase. [Conclusion] Both protocols lowered the main blood indicator of muscular damage considered (cortisol), indicating that the training load did not generate significant muscular damage in the 24-hour post-exercise period.

  18. Refining animal models in fracture research: seeking consensus in optimising both animal welfare and scientific validity for appropriate biomedical use

    PubMed Central

    Auer, Jorg A; Goodship, Allen; Arnoczky, Steven; Pearce, Simon; Price, Jill; Claes, Lutz; von Rechenberg, Brigitte; Hofmann-Amtenbrinck, Margarethe; Schneider, Erich; Müller-Terpitz, R; Thiele, F; Rippe, Klaus-Peter; Grainger, David W

    2007-01-01

    Background In an attempt to establish some consensus on the proper use and design of experimental animal models in musculoskeletal research, AOVET (the veterinary specialty group of the AO Foundation) in concert with the AO Research Institute (ARI), and the European Academy for the Study of Scientific and Technological Advance, convened a group of musculoskeletal researchers, veterinarians, legal experts, and ethicists to discuss, in a frank and open forum, the use of animals in musculoskeletal research. Methods The group narrowed the field to fracture research. The consensus opinion resulting from this workshop can be summarized as follows: Results & Conclusion Anaesthesia and pain management protocols for research animals should follow standard protocols applied in clinical work for the species involved. This will improve morbidity and mortality outcomes. A database should be established to facilitate selection of anaesthesia and pain management protocols for specific experimental surgical procedures and adopted as an International Standard (IS) according to animal species selected. A list of 10 golden rules and requirements for conduction of animal experiments in musculoskeletal research was drawn up comprising 1) Intelligent study designs to receive appropriate answers; 2) Minimal complication rates (5 to max. 10%); 3) Defined end-points for both welfare and scientific outputs analogous to quality assessment (QA) audit of protocols in GLP studies; 4) Sufficient details for materials and methods applied; 5) Potentially confounding variables (genetic background, seasonal, hormonal, size, histological, and biomechanical differences); 6) Post-operative management with emphasis on analgesia and follow-up examinations; 7) Study protocols to satisfy criteria established for a "justified animal study"; 8) Surgical expertise to conduct surgery on animals; 9) Pilot studies as a critical part of model validation and powering of the definitive study design; 10) Criteria for funding agencies to include requirements related to animal experiments as part of the overall scientific proposal review protocols. Such agencies are also encouraged to seriously consider and adopt the recommendations described here when awarding funds for specific projects. Specific new requirements and mandates related both to improving the welfare and scientific rigour of animal-based research models are urgently needed as part of international harmonization of standards. PMID:17678534

  19. Design preferences and cognitive styles: experimentation by automated website synthesis.

    PubMed

    Leung, Siu-Wai; Lee, John; Johnson, Chris; Robertson, David

    2012-06-29

    This article aims to demonstrate computational synthesis of Web-based experiments in undertaking experimentation on relationships among the participants' design preference, rationale, and cognitive test performance. The exemplified experiments were computationally synthesised, including the websites as materials, experiment protocols as methods, and cognitive tests as protocol modules. This work also exemplifies the use of a website synthesiser as an essential instrument enabling the participants to explore different possible designs, which were generated on the fly, before selecting their preferred designs. The participants were given interactive tree and table generators so that they could explore different ways of presenting causality information in tables and trees as the visualisation formats. The participants gave their preference ratings for the available designs, as well as their rationale (criteria) for their design decisions. The participants were also asked to take four cognitive tests, which focus on aspects of visualisation and analogy-making. The relationships among preference ratings, rationale, and the results of the cognitive tests were analysed by conservative non-parametric statistics, including the Wilcoxon test, the Kruskal-Wallis test, and Kendall correlation. In the test, 41 of the 64 participants preferred graphical (tree-form) to tabular presentation. Despite the popular preference for graphical presentation, the tabular presentation was generally rated easier to interpret than the graphical presentation, especially by those who scored lower in the visualisation and analogy-making tests. This piece of evidence helps generate the hypothesis that design preferences are related to specific cognitive abilities. Without the use of computational synthesis, the experimental setup and scientific results would have been impractical to obtain.

  20. Leveraging CyVerse Resources for De Novo Comparative Transcriptomics of Underserved (Non-model) Organisms

    PubMed Central

    Joyce, Blake L.; Haug-Baltzell, Asher K.; Hulvey, Jonathan P.; McCarthy, Fiona; Devisetty, Upendra Kumar; Lyons, Eric

    2017-01-01

    This workflow allows novice researchers to leverage advanced computational resources such as cloud computing to carry out pairwise comparative transcriptomics. It also serves as a primer for biologists to develop data scientist computational skills, e.g. executing bash commands, visualization and management of large data sets. All command line code and further explanations of each command or step can be found on the wiki (https://wiki.cyverse.org/wiki/x/dgGtAQ). The Discovery Environment and Atmosphere platforms are connected together through the CyVerse Data Store. As such, once the initial raw sequencing data have been uploaded there is no further need to transfer large data files over an Internet connection, minimizing the amount of time needed to conduct analyses. This protocol is designed to analyze only two experimental treatments or conditions; differential gene expression analysis is conducted through pairwise comparisons and will not be suitable to test multiple factors. This workflow is also designed to be manual rather than automated. Each step must be executed and investigated by the user, yielding a better understanding of the data and analytical outputs, and therefore better results for the user. Once complete, this protocol will yield de novo assembled transcriptome(s) for underserved (non-model) organisms without the need to map to previously assembled reference genomes (which are usually not available for underserved organisms). These de novo transcriptomes are further used in pairwise differential gene expression analysis to investigate genes differing between two experimental conditions. Differentially expressed genes are then functionally annotated to understand the genetic response organisms have to experimental conditions. In total, the data derived from this protocol are used to test hypotheses about biological responses of underserved organisms. PMID:28518075
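
    The final pairwise comparison step reduces, conceptually, to thresholding per-gene statistics. The fragment below is a generic illustration of that filtering logic (thresholds and gene names are placeholders), not part of the CyVerse workflow itself, which relies on dedicated differential-expression tools.

```python
def classify_genes(results, lfc_cut=1.0, padj_cut=0.05):
    """Label each gene as 'up', 'down' or 'ns' from a pairwise comparison.
    `results` maps gene id -> (log2 fold change, adjusted p-value)."""
    calls = {}
    for gene, (log2fc, padj) in results.items():
        if padj < padj_cut and abs(log2fc) >= lfc_cut:
            calls[gene] = "up" if log2fc > 0 else "down"
        else:
            calls[gene] = "ns"
    return calls

demo = {"geneA": (2.3, 0.001), "geneB": (-1.8, 0.02), "geneC": (0.4, 0.6)}
print(classify_genes(demo))   # {'geneA': 'up', 'geneB': 'down', 'geneC': 'ns'}
```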

  1. Long-Term Functional Side-Effects of Stimulants and Sedatives in Drosophila melanogaster

    PubMed Central

    Matsagas, Kennedy; Lim, David B.; Horwitz, Marc; Rizza, Cristina L.; Mueller, Laurence D.; Villeponteau, Bryant; Rose, Michael R.

    2009-01-01

    Background Small invertebrate animals, such as nematodes and fruit flies, are increasingly being used to test candidate drugs both for specific therapeutic purposes and for long-term health effects. Some of the protocols used in these experiments feature such experimental design features as lifelong virginity and very low densities. By contrast, the ability of both fruit flies and nematodes to resist stress is frequently correlated with their longevity and other functional measures, suggesting that low-stress assays are not necessarily the only useful protocol for testing the long-term effects of drugs. Methodology/Principal Findings Here we report an alternative protocol for fruit fly drug-testing that maximizes reproductive opportunities and other types of interaction, with moderately high population densities. We validate this protocol using two types of experimental tests: 1. We show that this protocol detects previously well-established genetic differences between outbred fruit fly populations. 2. We show that this protocol is able to distinguish among the long-term effects of similar types of drugs within two broad categories, stimulants and tranquilizers. Conclusions Large-scale fly drug testing can be conducted using mixed-sex high-density cage assays. We find that the commonly-used stimulants caffeine and theobromine differ dramatically in their chronic functional effects, theobromine being more benign. Likewise, we find that two generic pharmaceutical tranquilizers, lithium carbonate and valproic acid, differ dramatically in their chronic effects, lithium being more benign. However, these findings do not necessarily apply to human subjects, and we thus do not recommend the use of any one substance over any other. PMID:19668379

  2. Feasibility and effects of preventive home visits for at-risk older people: design of a randomized controlled trial.

    PubMed

    Cutchin, Malcolm P; Coppola, Susan; Talley, Vibeke; Svihula, Judie; Catellier, Diane; Shank, Kendra Heatwole

    2009-12-03

    The search for preventive methods to mitigate functional decline and unwanted relocation by older adults living in the community is important. Preventive home visit (PHV) models use infrequent but regular visits to older adults by trained practitioners with the goal of maintaining function and quality of life. Evidence about PHV efficacy is mixed but generally supportive. Yet interventions have rarely combined a comprehensive (biopsychosocial) occupational therapy intervention protocol with a home visit to older adults. There is a particular need in the USA to create and examine such a protocol. The study is a single-blind randomized controlled pilot trial designed to assess the feasibility, and to obtain preliminary efficacy estimates, of an intervention consisting of preventive home visits to community-dwelling older adults. An occupational therapy-based preventive home visit (PHV) intervention was developed and is being implemented and evaluated using a repeated measures design. We recruited a sample of 110 from a population of older adults (75+) who were screened and found to be at-risk for functional decline. Participants are currently living in the community (not in assisted living or a skilled nursing facility) in one of three central North Carolina counties. After consent, participants were randomly assigned into experimental and comparison groups. The experimental group receives the intervention 4 times over a 12 month follow-up period while the comparison group receives a minimal intervention of mailed printed materials. Pre- and post-intervention measures are being gathered by questionnaires administered face-to-face by a treatment-blinded research associate. Key outcome measures include functional ability, participation, life satisfaction, self-rated health, and depression. Additional information is collected from participants in the experimental group during the intervention to assess the feasibility of the intervention and potential modifiers. Fidelity is being addressed and measured across several domains. Feasibility indications to date are positive. Although the protocol has some limitations, we expect to learn enough about the intervention, delivery and effects to support a larger trial with a more stringent design and enhanced statistical power. ClinicalTrials.gov ID NCT00985283.

  3. Towards ethically improved animal experimentation in the study of animal reproduction.

    PubMed

    Blache, D; Martin, G B; Maloney, S K

    2008-07-01

    The ethics of animal-based research is a continuing area of debate, but ethical research protocols do not prevent scientific progress. In this paper, we argue that our current knowledge of the factors that affect reproductive processes provides researchers with a solid foundation upon which they can conduct more ethical research and simultaneously produce data of higher quality. We support this argument by showing how a deep understanding of the genetics, nutrition and temperament of our experimental animals can improve compliance with two of the '3 Rs', reduction and refinement, simply by offering better control over the variance in our experimental model. The outcome is a better experimental design, on both ethical and scientific grounds.

  4. Self-Regulated Learning Using Multimedia Programs in Dentistry Postgraduate Students: A Multimethod Approach

    ERIC Educational Resources Information Center

    Lloret, Miguel; Aguila, Estela; Lloret, Alejandro

    2009-01-01

    The purpose of this study was to examine the effect of a multimedia computing program on the production of activities and self-regulated learning processes in 18 students of a dentistry postgraduate programme (Celaya, Mexico). A multi-method design (quasi-experimental, pretest-posttest, and qualitative: think-aloud protocol) was used. Self-regulated…

  5. Inquiry in the Large-Enrollment Science Classroom: Simulating a Research Investigation

    ERIC Educational Resources Information Center

    Reeve, Suzanne; Hammond, Jennetta W.; Bradshaw, William S.

    2004-01-01

    We conduct research workshops twice each semester in our cell biology lecture course. Instead of solely analyzing data obtained by others, students form groups to design research questions and experimental protocols on a given topic. The main focus is the process of scientific thinking, not simply obtaining a correct product. (Contains 3 tables…

  6. Mutation-Based Learning to Improve Student Autonomy and Scientific Inquiry Skills in a Large Genetics Laboratory Course

    ERIC Educational Resources Information Center

    Wu, Jinlu

    2013-01-01

    Laboratory education can play a vital role in developing a learner's autonomy and scientific inquiry skills. In an innovative, mutation-based learning (MBL) approach, students were instructed to redesign a teacher-designed standard experimental protocol by a "mutation" method in a molecular genetics laboratory course. Students could…

  7. Recasting a traditional laboratory practical as a "Design-your-own protocol" to teach a universal research skill.

    PubMed

    Whitworth, David E

    2016-07-08

    Laboratory-based practical classes are a common feature of life science teaching, during which students learn how to perform experiments and generate/interpret data. Practical classes are typically instructional, concentrating on providing topic- and technique-specific skills, however to produce research-capable graduates it is also important to develop generic practical skills. To provide an opportunity for students to develop the skills needed to create bespoke protocols for experimental benchwork, a traditional practical was repurposed. Students were given a list of available resources and an experimental goal, and directed to create a bench protocol to achieve the aim (measuring the iron in hemoglobin). In a series of teaching events students received feedback from staff, and peers prototyped the protocols, before protocols were finally implemented. Graduates highlighted this exercise as one of the most important of their degrees, primarily because of the clear relevance of the skills acquired to professional practice. The exercise exemplifies a range of pedagogic principles, but arguably its most important innovation is that it repurposed a pre-existing practical. This had the benefits of automatically providing scaffolding to direct the students' thought processes, while retaining the advantages of a "discovery learning" exercise, and allowing facile adoption of the approach across the sector. © 2016 by The International Union of Biochemistry and Molecular Biology, 44(4):377-380, 2016.

  8. Heralded quantum repeater based on the scattering of photons off single emitters using parametric down-conversion source.

    PubMed

    Song, Guo-Zhu; Wu, Fang-Zhou; Zhang, Mei; Yang, Guo-Jian

    2016-06-28

    Quantum repeater is the key element in quantum communication and quantum information processing. Here, we investigate the possibility of achieving a heralded quantum repeater based on the scattering of photons off single emitters in one-dimensional waveguides. We design the compact quantum circuits for nonlocal entanglement generation, entanglement swapping, and entanglement purification, and discuss the feasibility of our protocols with current experimental technology. In our scheme, we use a parametric down-conversion source instead of ideal single-photon sources to realize the heralded quantum repeater. Moreover, our protocols can turn faulty events into the detection of photon polarization, and the fidelity can reach 100% in principle. Our scheme is attractive and scalable, since it can be realized with artificial solid-state quantum systems. With developed experimental technique on controlling emitter-waveguide systems, the repeater may be very useful in long-distance quantum communication.
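
    The entanglement-swapping step mentioned in the record rests on a standard textbook identity: a Bell-state measurement on the two inner photons leaves the two outer photons entangled. The identity is reproduced below for orientation only; it is not the authors' specific waveguide circuit.

```latex
% Standard entanglement-swapping identity (textbook form, not the authors' circuit):
% a Bell-state measurement on photons 2 and 3 leaves photons 1 and 4 entangled.
\[
|\Phi^{+}\rangle_{12}\otimes|\Phi^{+}\rangle_{34}
  = \tfrac{1}{2}\Bigl(
      |\Phi^{+}\rangle_{14}|\Phi^{+}\rangle_{23}
    + |\Phi^{-}\rangle_{14}|\Phi^{-}\rangle_{23}
    + |\Psi^{+}\rangle_{14}|\Psi^{+}\rangle_{23}
    + |\Psi^{-}\rangle_{14}|\Psi^{-}\rangle_{23}
    \Bigr)
\]
```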

  9. Heralded quantum repeater based on the scattering of photons off single emitters using parametric down-conversion source

    PubMed Central

    Song, Guo-Zhu; Wu, Fang-Zhou; Zhang, Mei; Yang, Guo-Jian

    2016-01-01

    Quantum repeater is the key element in quantum communication and quantum information processing. Here, we investigate the possibility of achieving a heralded quantum repeater based on the scattering of photons off single emitters in one-dimensional waveguides. We design the compact quantum circuits for nonlocal entanglement generation, entanglement swapping, and entanglement purification, and discuss the feasibility of our protocols with current experimental technology. In our scheme, we use a parametric down-conversion source instead of ideal single-photon sources to realize the heralded quantum repeater. Moreover, our protocols can turn faulty events into the detection of photon polarization, and the fidelity can reach 100% in principle. Our scheme is attractive and scalable, since it can be realized with artificial solid-state quantum systems. With developed experimental technique on controlling emitter-waveguide systems, the repeater may be very useful in long-distance quantum communication. PMID:27350159

  10. MsLDR-creator: a web service to design msLDR assays.

    PubMed

    Bormann, Felix; Dahl, Andreas; Sers, Christine

    2012-03-01

    MsLDR-creator is a free web service to design assays for the new DNA methylation detection method msLDR. The service provides the user with all necessary information about the oligonucleotides required for the measurement of a given CpG within a sequence of interest. The oligonucleotide parameters are calculated with the nearest-neighbour approach to achieve optimal behaviour during the experimental procedure. In addition, to help users get started with msLDR, further information, such as protocols and hints and tricks, is provided.
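
    The nearest-neighbour melting-temperature calculation mentioned here is a standard computation. As a hedged illustration (not the msLDR-creator implementation), Biopython's MeltingTemp module evaluates it directly; the salt and oligonucleotide concentrations below are placeholder values.

```python
# Illustration only: nearest-neighbour Tm calculation via Biopython,
# not the msLDR-creator service itself. Requires `pip install biopython`.
from Bio.SeqUtils import MeltingTemp as mt

probe = "AGTCTGGGACGGCGCGGCAATCGCA"
# Tm_NN uses nearest-neighbour thermodynamics; salt and oligo concentrations
# can be passed to match the intended experimental conditions.
tm = mt.Tm_NN(probe, Na=50, dnac1=250, dnac2=250)
print(f"Predicted Tm: {tm:.1f} °C")
```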

  11. Architecture Design and Experimental Platform Demonstration of Optical Network based on OpenFlow Protocol

    NASA Astrophysics Data System (ADS)

    Xing, Fangyuan; Wang, Honghuan; Yin, Hongxi; Li, Ming; Luo, Shenzi; Wu, Chenguang

    2016-02-01

    With the extensive deployment of cloud computing and data centres, and with services constantly emerging, bursty big-data traffic has brought huge challenges to optical networks. Consequently, the software-defined optical network (SDON), which combines optical networks with software-defined networking (SDN), has attracted much attention. In this paper, an OpenFlow-enabled optical node for use in optical cross-connects (OXCs) and reconfigurable optical add/drop multiplexers (ROADMs) is proposed, and an open-source OpenFlow controller is extended with new routing strategies. In addition, an experimental platform based on the OpenFlow protocol for software-defined optical networks is designed. The feasibility and availability of the OpenFlow-enabled optical nodes and the extended OpenFlow controller are validated through connectivity, protection-switching, and load-balancing experiments on this test platform.

  12. Automated recycling of chemistry for virtual screening and library design.

    PubMed

    Vainio, Mikko J; Kogej, Thierry; Raubacher, Florian

    2012-07-23

    An early stage drug discovery project needs to identify a number of chemically diverse and attractive compounds. These hit compounds are typically found through high-throughput screening campaigns. The diversity of the chemical libraries used in screening is therefore important. In this study, we describe a virtual high-throughput screening system called Virtual Library. The system automatically "recycles" validated synthetic protocols and available starting materials to generate a large number of virtual compound libraries, and allows for fast searches in the generated libraries using a 2D fingerprint based screening method. Virtual Library links the returned virtual hit compounds back to experimental protocols to quickly assess the synthetic accessibility of the hits. The system can be used as an idea generator for library design to enrich the screening collection and to explore the structure-activity landscape around a specific active compound.
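
    The 2D-fingerprint search at the core of such a system can be illustrated with RDKit (an assumption for illustration; the record does not state which toolkit Virtual Library uses). The sketch ranks a few placeholder library compounds by Tanimoto similarity to a query structure.

```python
# Illustration of a 2D-fingerprint screen, not the Virtual Library code itself.
# Requires RDKit (e.g. `conda install -c conda-forge rdkit`).
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

library_smiles = ["CCOc1ccccc1", "c1ccccc1O", "CC(=O)Oc1ccccc1C(=O)O"]
query = Chem.MolFromSmiles("COc1ccccc1")

def fp(mol):
    # Morgan (ECFP-like) bit-vector fingerprint, radius 2, 2048 bits
    return AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)

query_fp = fp(query)
lib_fps = [fp(Chem.MolFromSmiles(s)) for s in library_smiles]

# Rank virtual products by Tanimoto similarity to the query compound.
scores = DataStructs.BulkTanimotoSimilarity(query_fp, lib_fps)
for smi, score in sorted(zip(library_smiles, scores), key=lambda t: -t[1]):
    print(f"{score:.2f}  {smi}")
```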

  13. A multiple-alignment based primer design algorithm for genetically highly variable DNA targets

    PubMed Central

    2013-01-01

    Background Primer design for highly variable DNA sequences is difficult, and experimental success requires attention to many interacting constraints. The advent of next-generation sequencing methods allows the investigation of rare variants otherwise hidden deep in large populations, but requires attention to population diversity and primer localization in relatively conserved regions, in addition to recognized constraints typically considered in primer design. Results Design constraints include degenerate sites to maximize population coverage, matching of melting temperatures, optimizing de novo sequence length, finding optimal bio-barcodes to allow efficient downstream analyses, and minimizing risk of dimerization. To facilitate primer design addressing these and other constraints, we created a novel computer program (PrimerDesign) that automates this complex procedure. We show its powers and limitations and give examples of successful designs for the analysis of HIV-1 populations. Conclusions PrimerDesign is useful for researchers who want to design DNA primers and probes for analyzing highly variable DNA populations. It can be used to design primers for PCR, RT-PCR, Sanger sequencing, next-generation sequencing, and other experimental protocols targeting highly variable DNA samples. PMID:23965160
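
    One of the listed constraints, degenerate sites for population coverage, can be illustrated with a short sketch that collapses each alignment column into a single IUPAC symbol. The frequency cut-off and sequences are placeholders; the published PrimerDesign tool combines this constraint with melting-temperature matching, barcoding, and dimerization checks that are not modelled here.

```python
# Toy illustration of one constraint from the record: collapsing an alignment column
# into an IUPAC degenerate base to maximise population coverage.
IUPAC = {
    frozenset("A"): "A", frozenset("C"): "C", frozenset("G"): "G", frozenset("T"): "T",
    frozenset("AG"): "R", frozenset("CT"): "Y", frozenset("CG"): "S", frozenset("AT"): "W",
    frozenset("GT"): "K", frozenset("AC"): "M", frozenset("CGT"): "B", frozenset("AGT"): "D",
    frozenset("ACT"): "H", frozenset("ACG"): "V", frozenset("ACGT"): "N",
}

def degenerate_consensus(alignment, min_freq=0.05):
    """Per column, keep bases above `min_freq` and encode them as one IUPAC symbol."""
    consensus = []
    for column in zip(*alignment):
        counts = {b: column.count(b) for b in set(column) if b in "ACGT"}
        if not counts:
            consensus.append("N")        # column of gaps/ambiguities
            continue
        total = sum(counts.values())
        kept = frozenset(b for b, n in counts.items() if n / total >= min_freq)
        consensus.append(IUPAC[kept])
    return "".join(consensus)

aln = ["ACGTTGCA", "ACGTCGCA", "ACATTGCA", "ACGTTGCA"]
print(degenerate_consensus(aln))         # ACRTYGCA
```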

  14. An efficacious oral health care protocol for immunocompromised patients.

    PubMed

    Solomon, C S; Shaikh, A B; Arendorf, T M

    1995-01-01

    A twice-weekly oral and perioral examination was provided to 120 patients receiving antineoplastic therapy. Sixty patients were monitored while following the traditional hospital oral care protocol (chlorhexidine, hydrogen peroxide, sodium bicarbonate, thymol glycol, benzocaine mouthrinse, and nystatin). The mouth care protocol was then changed (experimental protocol = chlorhexidine, benzocaine lozenges, amphotericin B lozenges), and patients were monitored until the sample size matched that of the hospital mouth care regime. There was a statistically significant reduction in oral complications upon introduction and maintenance of the experimental protocol.

  15. A review of blood sample handling and pre-processing for metabolomics studies.

    PubMed

    Hernandes, Vinicius Veri; Barbas, Coral; Dudzik, Danuta

    2017-09-01

    Metabolomics has been found to be applicable to a wide range of clinical studies, bringing a new era for improving clinical diagnostics, early disease detection, therapy prediction and treatment efficiency monitoring. A major challenge in metabolomics, particularly untargeted studies, is the extremely diverse and complex nature of biological specimens. Despite great advances in the field, there is still a fundamental need to consider pre-analytical variability that can introduce bias into the subsequent analytical process, decrease the reliability of the results, and confound final research outcomes. Many researchers are mainly focused on the instrumental aspects of the biomarker discovery process, and sample-related variables sometimes seem to be overlooked. To bridge the gap, critical information and standardized protocols regarding experimental design and sample handling and pre-processing are highly desired. Characterization of the range of variation among sample collection methods is necessary to prevent misinterpretation of results and to ensure that observed differences are not due to an experimental bias caused by inconsistencies in sample processing. Herein, a systematic discussion of pre-analytical variables affecting metabolomics studies based on blood-derived samples is performed. Furthermore, we provide a set of recommendations concerning experimental design, collection, pre-processing procedures and storage conditions as a practical review that can guide and serve for the standardization of protocols and reduction of undesirable variation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Supporting Learning with Weblogs in Science Education: A Comparison of Blogging and Hand-Written Reflective Writing with and without Prompts

    ERIC Educational Resources Information Center

    Petko, Dominik; Egger, Nives; Graber, Marc

    2014-01-01

    The goal of this study was to compare how weblogs and traditional handwritten reflective learning protocols compare regarding the use of cognitive and metacognitive strategies for knowledge acquisition as well as learning gains in secondary school students. The study used a quasi-experimental control group design with repeated measurements…

  17. Iowa Virtual Literacy Protocol: A Pre-Experimental Design Using Kurzweil 3000 Text-to-Speech Software with Incarcerated Adult Learners

    ERIC Educational Resources Information Center

    McCulley, Yvette K.

    2012-01-01

    The problem: The increasingly competitive global economy demands literate, educated workers. Both men and women experience the effects of education on employment rates and income. Racial and ethnic minorities, English language learners, and especially those with prison records are most deeply affected by the economic consequences of dropping out…

  18. How to Improve Pupils' Literacy? A Cost-Effectiveness Analysis of a French Educational Project

    ERIC Educational Resources Information Center

    Massoni, Sebastien; Vergnaud, Jean-Christophe

    2012-01-01

    The "Action Lecture" program is an innovative teaching method run in some nursery and primary schools in Paris and designed to improve pupils' literacy. We report the results of an evaluation of this program. We describe the experimental protocol that was built to estimate the program's impact on several types of indicators. Data were…

  19. Lightweight and confidential data discovery and dissemination for wireless body area networks.

    PubMed

    He, Daojing; Chan, Sammy; Zhang, Yan; Yang, Haomiao

    2014-03-01

    As a special type of sensor network, a wireless body area network (WBAN) provides an economical solution to real-time monitoring and reporting of patients' physiological data. After a WBAN is deployed, it is sometimes necessary to disseminate data into the network through wireless links to adjust configuration parameters of body sensors or to distribute management commands and queries to sensors. A number of such protocols have been proposed recently, but they all focus on how to ensure reliability and overlook security vulnerabilities. Taking into account the unique features and application requirements of a WBAN, this paper presents the design, implementation, and evaluation of a secure, lightweight, confidential, and denial-of-service-resistant data discovery and dissemination protocol for WBANs that ensures the disseminated data items are not altered or tampered with. Based on multiple one-way key hash chains, our protocol provides instantaneous authentication and can tolerate node compromise. Besides the theoretical analysis that demonstrates the security and performance of the proposed protocol, this paper also reports the experimental evaluation of our protocol in a network of resource-limited sensor nodes, which shows its efficiency in practice. In particular, extensive security analysis shows that our protocol is provably secure.
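
    The one-way key hash chain underlying the authentication scheme is a well-known construction. The sketch below illustrates the general idea (generation from a secret seed, disclosure in reverse order of creation, verification against a public commitment); it is not the authors' WBAN protocol.

```python
# Minimal sketch of a one-way key hash chain (illustration, not the paper's protocol).
import hashlib

def make_chain(seed: bytes, length: int):
    """Build K_n, ..., K_0 where K_{i-1} = H(K_i); keys are disclosed newest-last."""
    chain = [seed]
    for _ in range(length):
        chain.append(hashlib.sha256(chain[-1]).digest())
    return chain[::-1]   # chain[0] is the public commitment; later keys are disclosed over time

def verify(disclosed_key: bytes, commitment: bytes, max_steps: int) -> bool:
    """A receiver holding the commitment checks that repeated hashing reaches it."""
    k = disclosed_key
    for _ in range(max_steps):
        if k == commitment:
            return True
        k = hashlib.sha256(k).digest()
    return k == commitment

chain = make_chain(b"base-station-secret", 5)
commitment, first_key = chain[0], chain[1]
print(verify(first_key, commitment, max_steps=5))   # True: disclosed key is authentic
```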

  20. Design and creation of an experimental program of advanced training in reconstructive microsurgery.

    PubMed

    Lorenzo, Andrés R; Alvarez, Angel; Garcia-Barreiro, Juan; Centeno, Alberto; Lopez, Eduardo; Martelo, Francisco

    2006-01-01

    In this study, we design an experimental protocol for the purpose of enhancing performance in microsurgery training. It is based on five free tissue transfer exercises in the rat (epigastric cutaneous flap, saphenous fasciocutaneous flap, epigastric neurovascular flap, saphenous muscular flap, and hindlimb replantation), which simulate the principal clinical procedures of reconstructive microsurgery. The first part of the study consists of an anatomical review of the flaps in 5 rats, and in the second part we carried out the free transfer of flaps on 25 rats divided into 5 groups. To differentiate between them, we created a mathematical function, referred to as the difficulty of a microsurgical exercise, which enabled us to establish a scale of progression for training, ranging from the easiest to the most difficult. In conclusion, we believe that this protocol is a useful instrument, as it allows for a more precise assessment of microsurgical capacity due to enhanced accuracy in the reproduction of global procedures and the fact that the quantification of progress in training is based on clinical monitoring after 7 days. (c) 2006 Wiley-Liss, Inc. Microsurgery, 2006.

  1. An analytical approach to test and design upper limb prosthesis.

    PubMed

    Veer, Karan

    2015-01-01

    In this work the signal acquisition technique, the analysis models, and the design protocols of the prosthesis are discussed. Different methods to estimate the motion intended by the amputee from surface electromyogram (SEMG) signals, based on time- and frequency-domain parameters, are presented. The experiments showed that the techniques used can discriminate significantly among four independent activities of the amputee using a dual-channel set-up. Further, based on the experimental results, the design and operation of an artificial arm are covered in two parts: the electronics design and the mechanical assembly. Finally, the developed hand prosthesis allows amputees to perform daily routine activities easily.
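
    The time-domain SEMG parameters referred to above typically include mean absolute value, root mean square, waveform length, and zero crossings. The sketch below computes these common features on a synthetic window; it is an illustration of the feature family, not the paper's exact feature set or classifier.

```python
# Common SEMG time-domain features (illustrative; not the paper's exact feature set).
import numpy as np

def semg_features(x: np.ndarray, zc_threshold: float = 0.01) -> dict:
    """Mean absolute value, RMS, waveform length and zero crossings of one window."""
    mav = float(np.mean(np.abs(x)))
    rms = float(np.sqrt(np.mean(x ** 2)))
    wl = float(np.sum(np.abs(np.diff(x))))
    signs = x[:-1] * x[1:]
    zc = int(np.sum((signs < 0) & (np.abs(np.diff(x)) >= zc_threshold)))
    return {"MAV": mav, "RMS": rms, "WL": wl, "ZC": zc}

rng = np.random.default_rng(0)
window = rng.normal(0, 0.1, 256)          # stand-in for a 256-sample SEMG window
print(semg_features(window))
```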

  2. [Study protocol of a prevention of recurrent suicidal behaviour program based on case management (PSyMAC)].

    PubMed

    Sáiz, Pilar A; Rodríguez-Revuelta, Julia; González-Blanco, Leticia; Burón, Patricia; Al-Halabí, Susana; Garrido, Marlen; García-Alvarez, Leticia; García-Portilla, Paz; Bobes, Julio

    2014-01-01

    Prevention of suicidal behaviour is a public health priority in the European Union. A previous suicide attempt is the best risk predictor for future attempts, as well as for completed suicide. The primary aim of this article is to describe a controlled study protocol designed for the prevention of recurrent suicidal behaviour that proposes case management, including a psychoeducation program, compared with the standard intervention (PSyMAC). Patients admitted from January 2011 to June 2013 to the emergency room of the Hospital Universitario Central de Asturias were evaluated using a protocol including sociodemographic, psychiatric, and psychosocial assessment. Patients were randomly assigned to either a group receiving continuous case management including participation in a psychoeducation program (experimental group), or a control group receiving standard care. The primary objective is to examine whether the time until recurrent suicidal behaviour in the experimental group differs significantly from that of the control group. PSyMAC proposes low-cost interventions that are easily adaptable to the usual clinical setting and can help compensate for the lack of specific action protocols and suicidal behaviour prevention programs in our country. The evaluation of the PSyMAC results will determine its real effectiveness as a case-management program to reduce suicidal risk. Copyright © 2013 SEP y SEPB. Published by Elsevier España. All rights reserved.

  3. A smart checkpointing scheme for improving the reliability of clustering routing protocols.

    PubMed

    Min, Hong; Jung, Jinman; Kim, Bongjae; Cho, Yookun; Heo, Junyoung; Yi, Sangho; Hong, Jiman

    2010-01-01

    In wireless sensor networks, system architectures and applications are designed to consider both resource constraints and scalability, because such networks are composed of numerous sensor nodes with various sensors and actuators, small memories, low-power microprocessors, radio modules, and batteries. Clustering routing protocols based on data aggregation schemes aimed at minimizing packet numbers have been proposed to meet these requirements. In clustering routing protocols, the cluster head plays an important role. The cluster head collects data from its member nodes and aggregates the collected data. To improve reliability and reduce recovery latency, we propose a checkpointing scheme for the cluster head. In the proposed scheme, backup nodes monitor and checkpoint the current state of the cluster head periodically. We also derive the checkpointing interval that maximizes reliability while using the same amount of energy consumed by clustering routing protocols that operate without checkpointing. Experimental comparisons with existing non-checkpointing schemes show that our scheme reduces both energy consumption and recovery latency.
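
    The recovery-latency benefit that checkpointing brings to the cluster head can be illustrated with a toy Monte Carlo estimate. The paper itself derives the energy-constrained optimal interval analytically; the round length and interval below are placeholders.

```python
# Toy Monte Carlo of the recovery-latency benefit of checkpointing the cluster head
# (illustrative only; the paper derives the optimal interval analytically).
import random

def mean_recovery_work(round_len=100.0, ckpt_interval=None, trials=100_000):
    """Average amount of aggregated work lost when the head fails at a uniform time."""
    lost = 0.0
    for _ in range(trials):
        t_fail = random.uniform(0, round_len)
        if ckpt_interval is None:
            lost += t_fail                      # everything since the round started
        else:
            lost += t_fail % ckpt_interval      # only work since the last checkpoint
    return lost / trials

random.seed(0)
print(mean_recovery_work())                     # ~50: no checkpointing
print(mean_recovery_work(ckpt_interval=20.0))   # ~10: checkpoint every 20 time units
```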

  4. A Smart Checkpointing Scheme for Improving the Reliability of Clustering Routing Protocols

    PubMed Central

    Min, Hong; Jung, Jinman; Kim, Bongjae; Cho, Yookun; Heo, Junyoung; Yi, Sangho; Hong, Jiman

    2010-01-01

    In wireless sensor networks, system architectures and applications are designed to consider both resource constraints and scalability, because such networks are composed of numerous sensor nodes with various sensors and actuators, small memories, low-power microprocessors, radio modules, and batteries. Clustering routing protocols based on data aggregation schemes aimed at minimizing packet numbers have been proposed to meet these requirements. In clustering routing protocols, the cluster head plays an important role. The cluster head collects data from its member nodes and aggregates the collected data. To improve reliability and reduce recovery latency, we propose a checkpointing scheme for the cluster head. In the proposed scheme, backup nodes monitor and checkpoint the current state of the cluster head periodically. We also derive the checkpointing interval that maximizes reliability while using the same amount of energy consumed by clustering routing protocols that operate without checkpointing. Experimental comparisons with existing non-checkpointing schemes show that our scheme reduces both energy consumption and recovery latency. PMID:22163389

  5. Duration Dependent Effect of Static Stretching on Quadriceps and Hamstring Muscle Force.

    PubMed

    Alizadeh Ebadi, Leyla; Çetin, Ebru

    2018-03-13

    The aim of this study was to determine the acute effect of static stretching on hamstring and quadriceps isokinetic strength when applied for various durations to elite athletes, to investigate the effect of different static stretching durations on isokinetic strength, and finally to determine the optimal stretching duration. Fifteen elite male athletes from two different sport branches (10 football and five basketball) participated in this study. The experimental protocol consisted of 17 repetitive static stretching exercises for the hamstring and quadriceps muscle groups, performed according to the indicated experimental protocols: (A) 5 min jogging; (B) 5 min jogging followed by 15 s static stretching; (C) 5 min jogging followed by 30 s static stretching; (D) 5 min jogging followed by 45 s static stretching. Immediately after each protocol, an isokinetic strength test consisting of five repetitions at 60°/s and 20 repetitions at 180°/s was recorded for the right leg with the Isomed 2000 device. The Friedman analysis of variance test was employed for data analysis. According to the analyses, 5 min of jogging and 15 s of stretching increased isokinetic strength, whereas 30 and 45 s of stretching caused a decrease.

  6. Duration Dependent Effect of Static Stretching on Quadriceps and Hamstring Muscle Force

    PubMed Central

    Çetin, Ebru

    2018-01-01

    The aim of this study was to determine the acute effect of static stretching on hamstring and quadriceps isokinetic strength when applied for various durations to elite athletes, to investigate the effect of different static stretching durations on isokinetic strength, and finally to determine the optimal stretching duration. Fifteen elite male athletes from two different sport branches (10 football and five basketball) participated in this study. The experimental protocol consisted of 17 repetitive static stretching exercises for the hamstring and quadriceps muscle groups, performed according to the indicated experimental protocols: (A) 5 min jogging; (B) 5 min jogging followed by 15 s static stretching; (C) 5 min jogging followed by 30 s static stretching; (D) 5 min jogging followed by 45 s static stretching. Immediately after each protocol, an isokinetic strength test consisting of five repetitions at 60°/s and 20 repetitions at 180°/s was recorded for the right leg with the Isomed 2000 device. The Friedman analysis of variance test was employed for data analysis. According to the analyses, 5 min of jogging and 15 s of stretching increased isokinetic strength, whereas 30 and 45 s of stretching caused a decrease.

  7. A Passive Testing Approach for Protocols in Wireless Sensor Networks

    PubMed Central

    Che, Xiaoping; Maag, Stephane; Tan, Hwee-Xian; Tan, Hwee-Pink; Zhou, Zhangbing

    2015-01-01

    Smart systems are today increasingly developed with the number of wireless sensor devices drastically increasing. They are implemented within several contexts throughout our environment. Thus, sensed data transported in ubiquitous systems are important, and the way to carry them must be efficient and reliable. For that purpose, several routing protocols have been proposed for wireless sensor networks (WSN). However, one stage that is often neglected before their deployment is the conformance testing process, a crucial and challenging step. Compared to active testing techniques commonly used in wired networks, passive approaches are more suitable to the WSN environment. While some works propose to specify the protocol with state models or to analyze them with simulators and emulators, we here propose a logic-based approach for formally specifying some functional requirements of a novel WSN routing protocol. We provide an algorithm to evaluate these properties on collected protocol execution traces. Further, we demonstrate the efficiency and suitability of our approach by its application into common WSN functional properties, as well as specific ones designed from our own routing protocol. We provide relevant testing verdicts through a real indoor testbed and the implementation of our protocol. Furthermore, the flexibility, genericity and practicability of our approach have been proven by the experimental results. PMID:26610495
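
    Passive testing boils down to evaluating properties over recorded traces rather than injecting stimuli. The fragment below checks one invented request/reply property over a toy trace; the paper's approach uses a formal logic over protocol data units, which this sketch does not reproduce.

```python
# Sketch of passive testing: evaluate a functional property over a collected trace
# (illustrative; property and trace format are invented for this example).
def every_request_answered(trace, timeout=5):
    """Property: every 'data_request' is followed by a 'data_reply' for the same
    node within `timeout` subsequent trace events."""
    for i, event in enumerate(trace):
        if event["type"] == "data_request":
            window = trace[i + 1 : i + 1 + timeout]
            if not any(e["type"] == "data_reply" and e["node"] == event["node"]
                       for e in window):
                return "FAIL", event
    return "PASS", None

trace = [
    {"type": "data_request", "node": 7},
    {"type": "beacon", "node": 2},
    {"type": "data_reply", "node": 7},
    {"type": "data_request", "node": 9},
]
print(every_request_answered(trace))   # FAIL on the unanswered request from node 9
```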

  8. A Passive Testing Approach for Protocols in Wireless Sensor Networks.

    PubMed

    Che, Xiaoping; Maag, Stephane; Tan, Hwee-Xian; Tan, Hwee-Pink; Zhou, Zhangbing

    2015-11-19

    Smart systems are today increasingly developed with the number of wireless sensor devices drastically increasing. They are implemented within several contexts throughout our environment. Thus, sensed data transported in ubiquitous systems are important, and the way to carry them must be efficient and reliable. For that purpose, several routing protocols have been proposed for wireless sensor networks (WSN). However, one stage that is often neglected before their deployment is the conformance testing process, a crucial and challenging step. Compared to active testing techniques commonly used in wired networks, passive approaches are more suitable to the WSN environment. While some works propose to specify the protocol with state models or to analyze them with simulators and emulators, we here propose a logic-based approach for formally specifying some functional requirements of a novel WSN routing protocol. We provide an algorithm to evaluate these properties on collected protocol execution traces. Further, we demonstrate the efficiency and suitability of our approach by its application into common WSN functional properties, as well as specific ones designed from our own routing protocol. We provide relevant testing verdicts through a real indoor testbed and the implementation of our protocol. Furthermore, the flexibility, genericity and practicability of our approach have been proven by the experimental results.

  9. Optimisation of the synthesis of vancomycin-selective molecularly imprinted polymer nanoparticles using automatic photoreactor

    PubMed Central

    2014-01-01

    A novel optimized protocol for solid-state synthesis of molecularly imprinted polymer nanoparticles (nanoMIPs) with specificity for antibiotic vancomycin is described. The experimental objective was optimization of the synthesis parameters (factors) affecting the yield of obtained nanoparticles which have been synthesized using the first prototype of an automated solid-phase synthesizer. Applications of experimental design (or design of experiments) in optimization of nanoMIP yield were carried out using MODDE 9.0 software. The factors chosen in the model were the amount of functional monomers in the polymerization mixture, irradiation time, temperature during polymerization, and elution temperature. In general, it could be concluded that the irradiation time is the most important and the temperature was the least important factor which influences the yield of nanoparticles. Overall, the response surface methodology proved to be an effective tool in reducing time required for optimization of complex experimental conditions. PMID:24685151
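
    The screening stage of such a design-of-experiments study can be sketched as a two-level full factorial over the four factors named in the abstract. Factor names and levels below are illustrative placeholders, not the settings used with MODDE 9.0.

```python
# Hedged sketch of a two-level full factorial over the four factors named in the record
# (factor names and levels are illustrative, not the values used in the study).
from itertools import product

factors = {
    "monomer_amount_mg": (20, 60),
    "irradiation_time_min": (1, 5),
    "polymerisation_temp_C": (4, 25),
    "elution_temp_C": (40, 60),
}

runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(runs), "runs in the 2^4 full factorial")
for run in runs[:3]:
    print(run)
# The measured yields for these runs would then be fitted with a response-surface
# model (e.g. quadratic in the factors) to locate the optimum.
```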

  10. Optimisation of the synthesis of vancomycin-selective molecularly imprinted polymer nanoparticles using automatic photoreactor.

    PubMed

    Muzyka, Kateryna; Karim, Khalku; Guerreiro, Antonio; Poma, Alessandro; Piletsky, Sergey

    2014-03-31

    A novel optimized protocol for solid-state synthesis of molecularly imprinted polymer nanoparticles (nanoMIPs) with specificity for antibiotic vancomycin is described. The experimental objective was optimization of the synthesis parameters (factors) affecting the yield of obtained nanoparticles which have been synthesized using the first prototype of an automated solid-phase synthesizer. Applications of experimental design (or design of experiments) in optimization of nanoMIP yield were carried out using MODDE 9.0 software. The factors chosen in the model were the amount of functional monomers in the polymerization mixture, irradiation time, temperature during polymerization, and elution temperature. In general, it could be concluded that the irradiation time is the most important and the temperature was the least important factor which influences the yield of nanoparticles. Overall, the response surface methodology proved to be an effective tool in reducing time required for optimization of complex experimental conditions.

  11. Optimisation of the synthesis of vancomycin-selective molecularly imprinted polymer nanoparticles using automatic photoreactor

    NASA Astrophysics Data System (ADS)

    Muzyka, Kateryna; Karim, Khalku; Guerreiro, Antonio; Poma, Alessandro; Piletsky, Sergey

    2014-03-01

    A novel optimized protocol for solid-state synthesis of molecularly imprinted polymer nanoparticles (nanoMIPs) with specificity for antibiotic vancomycin is described. The experimental objective was optimization of the synthesis parameters (factors) affecting the yield of obtained nanoparticles which have been synthesized using the first prototype of an automated solid-phase synthesizer. Applications of experimental design (or design of experiments) in optimization of nanoMIP yield were carried out using MODDE 9.0 software. The factors chosen in the model were the amount of functional monomers in the polymerization mixture, irradiation time, temperature during polymerization, and elution temperature. In general, it could be concluded that the irradiation time is the most important and the temperature was the least important factor which influences the yield of nanoparticles. Overall, the response surface methodology proved to be an effective tool in reducing time required for optimization of complex experimental conditions.

  12. Variation in Research Designs Used to Test the Effectiveness of Dissemination and Implementation Strategies: A Review.

    PubMed

    Mazzucca, Stephanie; Tabak, Rachel G; Pilar, Meagan; Ramsey, Alex T; Baumann, Ana A; Kryzer, Emily; Lewis, Ericka M; Padek, Margaret; Powell, Byron J; Brownson, Ross C

    2018-01-01

    The need for optimal study designs in dissemination and implementation (D&I) research is increasingly recognized. Despite the wide range of study designs available for D&I research, we lack understanding of the types of designs and methodologies that are routinely used in the field. This review assesses the designs and methodologies in recently proposed D&I studies and provides resources to guide design decisions. We reviewed 404 study protocols published in the journal Implementation Science from 2/2006 to 9/2017. Eligible studies tested the efficacy or effectiveness of D&I strategies (i.e., not effectiveness of the underlying clinical or public health intervention); had a comparison by group and/or time; and used ≥1 quantitative measure. Several design elements were extracted: design category (e.g., randomized); design type [e.g., cluster randomized controlled trial (RCT)]; data type (e.g., quantitative); D&I theoretical framework; levels of treatment assignment, intervention, and measurement; and country in which the research was conducted. Each protocol was double-coded, and discrepancies were resolved through discussion. Of the 404 protocols reviewed, 212 (52%) studies tested one or more implementation strategy across 208 manuscripts, therefore meeting inclusion criteria. Of the included studies, 77% utilized randomized designs, primarily cluster RCTs. The use of alternative designs (e.g., stepped wedge) increased over time. Fewer studies were quasi-experimental (17%) or observational (6%). Many study design categories (e.g., controlled pre-post, matched pair cluster design) were represented by only one or two studies. Most articles proposed quantitative and qualitative methods (61%), with the remaining 39% proposing only quantitative. Half of protocols (52%) reported using a theoretical framework to guide the study. The four most frequently reported frameworks were Consolidated Framework for Implementing Research and RE-AIM ( n  = 16 each), followed by Promoting Action on Research Implementation in Health Services and Theoretical Domains Framework ( n  = 12 each). While several novel designs for D&I research have been proposed (e.g., stepped wedge, adaptive designs), the majority of the studies in our sample employed RCT designs. Alternative study designs are increasing in use but may be underutilized for a variety of reasons, including preference of funders or lack of awareness of these designs. Promisingly, the prevalent use of quantitative and qualitative methods together reflects methodological innovation in newer D&I research.

  13. Design of freeze-drying processes for pharmaceuticals: practical advice.

    PubMed

    Tang, Xiaolin; Pikal, Michael J

    2004-02-01

    Design of freeze-drying processes is often approached with a "trial and error" experimental plan or, worse yet, the protocol used in the first laboratory run is adopted without further attempts at optimization. Consequently, commercial freeze-drying processes are often neither robust nor efficient. It is our thesis that design of an "optimized" freeze-drying process is not particularly difficult for most products, as long as some simple rules based on well-accepted scientific principles are followed. It is the purpose of this review to discuss the scientific foundations of the freeze-drying process design and then to consolidate these principles into a set of guidelines for rational process design and optimization. General advice is given concerning common stability issues with proteins, but unusual and difficult stability issues are beyond the scope of this review. Control of ice nucleation and crystallization during the freezing step is discussed, and the impact of freezing on the rest of the process and final product quality is reviewed. Representative freezing protocols are presented. The significance of the collapse temperature and the thermal transition, denoted Tg', are discussed, and procedures for the selection of the "target product temperature" for primary drying are presented. Furthermore, guidelines are given for selection of the optimal shelf temperature and chamber pressure settings required to achieve the target product temperature without thermal and/or mass transfer overload of the freeze dryer. Finally, guidelines and "rules" for optimization of secondary drying and representative secondary drying protocols are presented.
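
    As a toy illustration of one design step discussed in the review (selecting the target product temperature for primary drying), the sketch below applies a commonly cited rule of thumb of staying a couple of degrees below both the collapse temperature and Tg'. The 2 °C margin and the example values are assumptions made for illustration, not recommendations taken verbatim from the article.

      # Toy illustration of choosing a target product temperature for primary drying
      # from the collapse temperature (Tc) and Tg'. The 2 degC safety margin is a
      # commonly cited rule of thumb, used here as an assumption.
      def target_product_temperature(tc_celsius: float, tg_prime_celsius: float,
                                     safety_margin: float = 2.0) -> float:
          """Stay below both the collapse temperature and Tg' by a safety margin."""
          return min(tc_celsius, tg_prime_celsius) - safety_margin

      # Example: an amorphous formulation with Tc of -31 degC and Tg' of -32 degC
      # (illustrative values only).
      print(target_product_temperature(-31.0, -32.0))   # -> -34.0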

  14. On-line data display

    NASA Astrophysics Data System (ADS)

    Lang, Sherman Y. T.; Brooks, Martin; Gauthier, Marc; Wein, Marceli

    1993-05-01

    A data display system for embedded realtime systems has been developed for use as an operator's user interface and debugging tool. The motivation for development of the On-Line Data Display (ODD) has come from several sources. In particular, the design reflects the needs of researchers developing an experimental mobile robot within our laboratory. A proliferation of specialized user interfaces revealed a need for a flexible communications and graphical data display system. At the same time, the system had to be readily extensible for arbitrary graphical display formats required to meet the data visualization needs of the researchers. The system defines a communication protocol transmitting 'datagrams' between tasks executing on the realtime system and virtual devices displaying the data in a meaningful way on a graphical workstation. The communication protocol multiplexes logical channels on a single data stream. The current implementation consists of a server for the Harmony realtime operating system and an application written for the Macintosh computer. Flexibility requirements resulted in a highly modular server design and a layered, modular, object-oriented design for the Macintosh part of the system. Users assign data types to specific channels at run time. Devices are then instantiated by the user and connected to channels to receive datagrams. The current suite of device types does not provide enough functionality for most users' specialized needs; instead, the system design allows the creation of new device types with modest programming effort. The protocol, design, and use of the system are discussed.
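
    As a rough illustration of the channel-multiplexing idea described above, the sketch below (in Python rather than the C of the original Harmony/Macintosh implementation) tags each datagram with a logical channel id and dispatches it to whichever virtual devices are connected to that channel. The framing format and class names are assumptions, not the actual ODD protocol.

      # Minimal sketch of channel-multiplexed datagrams dispatched to display devices.
      import struct

      HEADER = struct.Struct("!HI")  # channel id (uint16), payload length (uint32)

      def pack_datagram(channel: int, payload: bytes) -> bytes:
          return HEADER.pack(channel, len(payload)) + payload

      class Dispatcher:
          def __init__(self):
              self.devices = {}          # channel id -> list of callables ("virtual devices")

          def connect(self, channel: int, device):
              self.devices.setdefault(channel, []).append(device)

          def feed(self, stream: bytes):
              offset = 0
              while offset + HEADER.size <= len(stream):
                  channel, length = HEADER.unpack_from(stream, offset)
                  payload = stream[offset + HEADER.size : offset + HEADER.size + length]
                  offset += HEADER.size + length
                  for device in self.devices.get(channel, []):
                      device(payload)

      # Usage: a "strip chart" device on channel 1, a raw logger on channel 2.
      d = Dispatcher()
      d.connect(1, lambda p: print("chart sample:", struct.unpack("!f", p)[0]))
      d.connect(2, lambda p: print("log:", p.decode()))
      d.feed(pack_datagram(1, struct.pack("!f", 3.14)) + pack_datagram(2, b"robot ready"))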

  15. Integration of a Computer-Based Consultant into the Clinical Setting*

    PubMed Central

    Bischoff, Miriam B.; Shortliffe, Edward H.

    1983-01-01

    Studies of the attitudes of medical personnel regarding computer-based clinical consultation systems have shown that successful programs must be designed not only to satisfy a need for expert-level advice but also to fit smoothly into the daily routine of physician/users. Planning for system use should accordingly be emphasized in all aspects of the system design. ONCOCIN is an oncology protocol management system that assists physicians with the management of outpatients enrolled in experimental cancer chemotherapy protocols. ONCOCIN was designed for initial implementation in the Stanford Oncology Day Care Center, where it has been in limited use since May of 1981. The clinic's physicians currently use the system daily in the management of patients with Hodgkin's and non-Hodgkin's lymphoma. This work has allowed us to study physician-computer interaction and to explore artificial intelligence research issues. This paper discusses the practical issues to consider when designing a consultation system for physicians and the logistical issues to address when integrating such a system into a clinic setting. We describe how ONCOCIN has addressed these issues, the problems encountered, their resolution, and the lessons learned.

  16. The Behavior of TCP and Its Extensions in Space

    NASA Technical Reports Server (NTRS)

    Wang, Ruhai; Horan, Stephen

    2001-01-01

    The performance of the Transmission Control Protocol (TCP) in space has been examined through simulation and experimental tests over several years at the National Aeronautics and Space Administration (NASA), the Department of Defense (DoD), and universities. At New Mexico State University (NMSU), we have been concentrating on studying the performance of two protocol suites: the file transfer protocol (ftp) running over the Transmission Control Protocol/Internet Protocol (TCP/IP) stack, and the file protocol (fp) running over the Space Communications Protocol Standards (SCPS)-Transport Protocol (TP) developed under the Consultative Committee for Space Data Systems (CCSDS) standards process. SCPS-TP is considered to be TCP's extension for space communications. This dissertation experimentally studies the behavior of TCP and SCPS-TP by running the protocol suites over both the Space-to-Ground Link Simulator (SGLS) test-bed and a realistic satellite link. The study concentrates on comparing protocol behavior by plotting the averaged file transfer times for different experimental configurations and analyzing them using Statistical Analysis System (SAS)-based procedures. The effects of different link delays and various bit-error rates (BERs) on each protocol's performance are also studied, and linear regression models are built for the experiments over the SGLS test-bed to reflect the relationships between file transfer time and various transmission conditions.
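
    The regression modelling mentioned above can be illustrated with a minimal least-squares fit of file transfer time against link delay and bit-error rate. The data points below are synthetic and the two-predictor linear form is an assumption for illustration, not the NMSU measurements or their SAS models.

      # Illustrative sketch: linear model of file transfer time vs. delay and BER.
      import numpy as np

      delay_s   = np.array([0.0, 0.25, 0.5, 0.25, 0.5, 0.0])      # one-way link delay, s
      log10_ber = np.array([-7.0, -7.0, -7.0, -5.0, -5.0, -5.0])  # bit-error rate (log10)
      xfer_time = np.array([12.0, 19.5, 27.2, 24.8, 33.0, 17.9])  # file transfer time, s

      X = np.column_stack([np.ones(len(delay_s)), delay_s, log10_ber])
      (b0, b_delay, b_ber), *_ = np.linalg.lstsq(X, xfer_time, rcond=None)
      print(f"time ~ {b0:.1f} + {b_delay:.1f}*delay + {b_ber:.1f}*log10(BER)")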

  17. Experimental purification of single qubits.

    PubMed

    Ricci, M; De Martini, F; Cerf, N J; Filip, R; Fiurásek, J; Macchiavello, C

    2004-10-22

    We report the experimental realization of the purification protocol for single qubits sent through a depolarizing channel. The qubits are associated with polarization states of single photons and the protocol is achieved by means of passive linear optical elements. The present approach may represent a convenient alternative to the distillation and error correction protocols of quantum information.

  18. Dietary Intake Following Experimentally Restricted Sleep in Adolescents

    PubMed Central

    Beebe, Dean W.; Simon, Stacey; Summer, Suzanne; Hemmer, Stephanie; Strotman, Daniel; Dolan, Lawrence M.

    2013-01-01

    Study Objective: To examine the relationship between sleep and dietary intake in adolescents using an experimental sleep restriction protocol. Design: Randomized crossover sleep restriction-extension paradigm. Setting: Sleep obtained and monitored at home, diet measured during an office visit. Participants: Forty-one typically developing adolescents age 14-16 years. Interventions: The 3-week protocol consisting of a baseline week designed to stabilize the circadian rhythm, followed randomly by 5 consecutive nights of sleep restriction (6.5 hours in bed Monday-Friday) versus healthy sleep duration (10 hours in bed), a 2-night washout period, and a 5-night crossover period. Measurements: Sleep was monitored via actigraphy and teens completed validated 24-hour diet recall interviews following each experimental condition. Results: Paired-sample t-tests examined differences between conditions for consumption of key macronutrients and choices from dietary categories. Compared with the healthy sleep condition, sleep-restricted adolescents' diets were characterized by higher glycemic index and glycemic load and a trend toward more calories and carbohydrates, with no differences in fat or protein consumption. Exploratory analyses revealed the consumption of significantly more desserts and sweets during sleep restriction than healthy sleep. Conclusions: Chronic sleep restriction during adolescence appears to cause increased consumption of foods with a high glycemic index, particularly desserts/sweets. The chronic sleep restriction common in adolescence may cause changes in dietary behaviors that increase risk of obesity and associated morbidity. Citation: Beebe DW; Simon S; Summer S; Hemmer S; Strotman D; Dolan LM. Dietary intake following experimentally restricted sleep in adolescents. SLEEP 2013;36(6):827-834. PMID:23729925

  19. Maternal Opioid Treatment: Human Experimental Research (MOTHER) – Approach, Issues, and Lessons Learned

    PubMed Central

    Jones, Hendrée E.; Fischer, Gabriele; Heil, Sarah H.; Kaltenbach, Karol; Martin, Peter R.; Coyle, Mara G.; Selby, Peter; Stine, Susan M.; O’Grady, Kevin E.; Arria, Amelia M.

    2015-01-01

    Aims The Maternal Opioid Treatment: Human Experimental Research (MOTHER) project, an eight-site randomized, double-blind, double-dummy, flexible-dosing, parallel-group clinical trial is described. This study is the most current – and single most comprehensive – research effort to investigate the safety and efficacy of maternal and prenatal exposure to methadone and buprenorphine. Methods The MOTHER study design is outlined, and its basic features are presented. Conclusions At least seven important lessons have been learned from the MOTHER study: (1) an interdisciplinary focus improves the design and methods of a randomized clinical trial; (2) multiple sites in a clinical trial present continuing challenges to the investigative team due to variations in recruitment goals, patient populations, and hospital practices that in turn differentially impact recruitment rates, treatment compliance, and attrition; (3) study design and protocols must be flexible in order to meet the unforeseen demands of both research and clinical management; (4) staff turnover needs to be addressed with a proactive focus on both hiring and training; (5) the implementation of a protocol for the treatment of a particular disorder may identify important ancillary clinical issues worthy of investigation; (6) timely tracking of data in a multi-site trial is both demanding and unforgiving; and, (7) complex multi-site trials pose unanticipated challenges that complicate the choice of statistical methods, thereby placing added demands on investigators to effectively communicate their results. PMID:23106924

  20. Slow drilling speeds for single-drill implant bed preparation. Experimental in vitro study.

    PubMed

    Delgado-Ruiz, R A; Velasco Ortega, E; Romanos, G E; Gerhke, S; Newen, I; Calvo-Guirado, J L

    2018-01-01

    To evaluate the real-time bone temperature changes during preparation of the implant bed with a single-drill protocol using different drill designs and different slow drilling speeds in artificial type IV bone. For this experimental in vitro study, 600 implant bed preparations were performed in 10 bovine bone disks using three test slow drilling speeds (50/150/300 rpm) and a control drilling speed (1200 rpm). The temperature at crestal and apical areas and the time variations produced during drilling with three drill designs of similar diameter and length but different geometry were recorded with real-time thermographic analysis. Statistical analysis was performed by two-way analysis of variance. Multiple comparisons of temperatures and time across the different drill designs and speeds were performed with Tukey's test. T Max values for the control drilling speed with all the drill designs (D1, D2, and D3 at 1200 rpm) were higher than those for the test speeds, by 11 ± 1.32 °C (p < 0.05). The comparison of T Max within the test groups showed that drilling at 50 rpm resulted in the lowest temperature increment (22.11 ± 0.8 °C) compared with the other slow drilling speeds of 150 rpm (24.752 ± 1.1 °C) and 300 rpm (25.977 ± 1.2 °C) (p < 0.042). Temperature behavior at crestal and apical areas was similar, being lower for the slow drilling speeds than for the control drilling speed. Slow drilling speeds required significantly more time to finish the preparation of the implant bed (50 rpm > 150 rpm > 300 rpm > control; p < 0.05). A single-drill protocol with slow drilling speeds (50, 150, and 300 rpm) without irrigation in type IV bone increases the temperature at the coronal and apical levels but remains below the critical threshold of 47 °C. The drill design in single-drill protocols using slow speeds (50, 150, and 300 rpm) does not influence the thermal variations. The time required to accomplish the implant bed preparation with a single-drill protocol in type IV bone is influenced by the drilling speed and not by the drill design: as the speed decreases, more time is required.

  1. A Study of Quality of Service Communication for High-Speed Packet-Switching Computer Sub-Networks

    NASA Technical Reports Server (NTRS)

    Cui, Zhenqian

    1999-01-01

    In this thesis, we analyze various factors that affect quality of service (QoS) communication in high-speed, packet-switching sub-networks. We hypothesize that sub-network-wide bandwidth reservation and guaranteed CPU processing power at endpoint systems for handling data traffic are indispensable to achieving hard end-to-end quality of service. Different bandwidth reservation strategies, traffic characterization schemes, and scheduling algorithms affect the network resources and CPU usage as well as the extent to which QoS can be achieved. In order to analyze those factors, we design and implement a communication layer. Our experimental analysis supports our research hypothesis. The Resource ReSerVation Protocol (RSVP) is designed to realize resource reservation. Our analysis of RSVP shows that using RSVP alone is insufficient to provide hard end-to-end quality of service in a high-speed sub-network. Analysis of the IEEE 802.1p protocol also supports the research hypothesis.

  2. High-throughput real-time quantitative reverse transcription PCR.

    PubMed

    Bookout, Angie L; Cummins, Carolyn L; Mangelsdorf, David J; Pesola, Jean M; Kramer, Martha F

    2006-02-01

    Extensive detail on the application of the real-time quantitative polymerase chain reaction (QPCR) for the analysis of gene expression is provided in this unit. The protocols are designed for high-throughput, 384-well-format instruments, such as the Applied Biosystems 7900HT, but may be modified to suit any real-time PCR instrument. QPCR primer and probe design and validation are discussed, and three relative quantitation methods are described: the standard curve method, the efficiency-corrected DeltaCt method, and the comparative cycle time, or DeltaDeltaCt method. In addition, a method is provided for absolute quantification of RNA in unknown samples. RNA standards are subjected to RT-PCR in the same manner as the experimental samples, thus accounting for the reaction efficiencies of both procedures. This protocol describes the production and quantitation of synthetic RNA molecules for real-time and non-real-time RT-PCR applications.
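
    The comparative cycle time (ΔΔCt) method mentioned above reduces to a short calculation; the sketch below shows it with invented Ct values and assumes approximately 100% amplification efficiency for both target and reference genes.

      # Worked example of the comparative cycle time (delta-delta-Ct) method;
      # Ct values are invented purely for illustration.
      def fold_change_ddct(ct_target_treated, ct_ref_treated,
                           ct_target_control, ct_ref_control):
          """Relative expression of the target gene, treated vs. control,
          normalized to a reference gene, assuming ~100% PCR efficiency."""
          delta_ct_treated = ct_target_treated - ct_ref_treated
          delta_ct_control = ct_target_control - ct_ref_control
          delta_delta_ct = delta_ct_treated - delta_ct_control
          return 2.0 ** (-delta_delta_ct)

      # Target amplifies 2 cycles earlier (after normalization) in the treated
      # sample relative to control, i.e. roughly a 4-fold induction.
      print(fold_change_ddct(24.0, 18.0, 26.0, 18.0))   # -> 4.0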

  3. Developments in fiber optics for distribution automation

    NASA Technical Reports Server (NTRS)

    Kirkham, H.; Friend, H.; Jackson, S.; Johnston, A.

    1991-01-01

    An optical fiber based communications system of unusual design is described. The system consists of a network of optical fibers overlaid on the distribution system. It is configured as a large number of interconnected rings, with some spurs. Protocols for access to and control of the network are described. Because of the way they function, the protocols are collectively called AbNET, in commemoration of the microbiologists' abbreviation Ab for antibody. Optical data links that could be optically powered are described. There are two versions, each of which has a good frequency response and minimal filtering requirements. In one, a conventional FM pulse train is used at the transmitter, and a novel form of phase-locked loop is used as demodulator. In the other, the FM transmitter is replaced with a pulse generator arranged so that the period between pulses represents the modulating signal. Transmitter and receiver designs, including temperature compensation methods, are presented. Experimental results are given.

  4. E-novo: an automated workflow for efficient structure-based lead optimization.

    PubMed

    Pearce, Bradley C; Langley, David R; Kang, Jia; Huang, Hongwei; Kulkarni, Amit

    2009-07-01

    An automated E-Novo protocol designed as a structure-based lead optimization tool was prepared through Pipeline Pilot with existing CHARMm components in Discovery Studio. A scaffold core having 3D binding coordinates of interest is generated from a ligand-bound protein structural model. Ligands of interest are generated from the scaffold using an R-group fragmentation/enumeration tool within E-Novo, with their cores aligned. The ligand side chains are conformationally sampled and are subjected to core-constrained protein docking, using a modified CHARMm-based CDOCKER method to generate top poses along with CDOCKER energies. In the final stage of E-Novo, a physics-based binding energy scoring function ranks the top ligand CDOCKER poses using a more accurate Molecular Mechanics-Generalized Born with Surface Area method. Correlation of the calculated ligand binding energies with experimental binding affinities was used to validate protocol performance. Inhibitors of Src tyrosine kinase, CDK2 kinase, beta-secretase, factor Xa, HIV protease, and thrombin were used to test the protocol using published ligand crystal structure data within reasonably defined binding sites. In-house Respiratory Syncytial Virus inhibitor data were used as a more challenging test set using a hand-built binding model. Least-squares fits for all data sets suggested reasonable validation of the protocol within the context of observed ligand binding poses. The E-Novo protocol provides a convenient all-in-one structure-based design process for rapid assessment and scoring of lead optimization libraries.
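
    The final validation step described above (correlating calculated binding energies with experimental affinities) can be illustrated with a short least-squares fit. The energies and pIC50 values below are synthetic placeholders, not data from the E-Novo study, and numpy stands in for whatever statistics tooling was actually used.

      # Sketch of the validation step: least-squares correlation of calculated
      # binding energies with experimental affinities (all values synthetic).
      import numpy as np

      calc_energy_kcal = np.array([-45.2, -52.1, -48.7, -60.3, -55.0, -41.8])  # MM-GBSA-style scores
      exp_pic50        = np.array([  5.8,   6.9,   6.1,   8.2,   7.4,   5.2])  # experimental affinities

      slope, intercept = np.polyfit(calc_energy_kcal, exp_pic50, 1)
      predicted = slope * calc_energy_kcal + intercept
      ss_res = np.sum((exp_pic50 - predicted) ** 2)
      ss_tot = np.sum((exp_pic50 - exp_pic50.mean()) ** 2)
      print(f"pIC50 ~ {slope:.3f}*E + {intercept:.2f},  R2 = {1 - ss_res / ss_tot:.2f}")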

  5. Weathering of iron sulfides under Mars surface ambient conditions

    NASA Technical Reports Server (NTRS)

    Blackburn, T. R.

    1981-01-01

    The study of iron sulfide surface alternation reactions under Mars' surface ambient conditions begun during 1980 was extended through improved irradiation design and experimental protocols. A wider range of humidities and more intense irradiation were incorporated in the study. X-ray photoelectron spectra of irradiated chips suggest formation of FeSO4, FeCO3, and an iron oxide on the iron sulfide substrates studied.

  6. Inflectional Morphology and Dyslexia: Italian Children's Performance in a Nonword Pluralization Task

    ERIC Educational Resources Information Center

    Vender, Maria; Mantione, Federica; Savazzi, Silvia; Delfitto, Denis; Melloni, Chiara

    2017-01-01

    In this study, we present the results of an original experimental protocol designed to assess the performance in a pluralization task of 52 Italian children divided into two groups: 24 children with developmental dyslexia (mean age 10.0 years old) and 28 typically developing children (mean age 9.11 years old). Our task, inspired by Berko's Wug…

  7. Experimental eavesdropping attack against Ekert's protocol based on Wigner's inequality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bovino, F. A.; Colla, A. M.; Castagnoli, G.

    2003-09-01

    We experimentally implemented an eavesdropping attack against the Ekert protocol for quantum key distribution based on the Wigner inequality. We demonstrate a serious lack of security of this protocol when the eavesdropper gains total control of the source. In addition we tested a modified Wigner inequality which should guarantee a secure quantum key distribution.

  8. Design and Fabrication of N-Alkyl-Polyethylenimine-Stabilized Iron Oxide Nanoclusters for Gene Delivery

    PubMed Central

    Liu, Gang; Wang, Zhiyong; Lee, Seulki; Ai, Hua; Chen, Xiaoyuan

    2013-01-01

    With the rapid development of nanotechnology, inorganic magnetic nanoparticles, especially iron oxide nanoparticles (IOs), have emerged as great vehicles for biomedical diagnostic and therapeutic applications. In order to rationally design IO-based gene delivery nanovectors, surface modification is essential and determines the loading and release of the gene of interest. Here we highlight the basic concepts and applications of nonviral gene delivery vehicles based on low molecular weight N-alkyl polyethylenimine-stabilized IOs. The experimental protocols related to these topics are described in this chapter. PMID:22568910

  9. Secure and scalable deduplication of horizontally partitioned health data for privacy-preserving distributed statistical computation.

    PubMed

    Yigzaw, Kassaye Yitbarek; Michalas, Antonis; Bellika, Johan Gustav

    2017-01-03

    Techniques have been developed to compute statistics on distributed datasets without revealing private information except the statistical results. However, duplicate records in a distributed dataset may lead to incorrect statistical results. Therefore, to increase the accuracy of the statistical analysis of a distributed dataset, secure deduplication is an important preprocessing step. We designed a secure protocol for the deduplication of horizontally partitioned datasets with deterministic record linkage algorithms. We provided a formal security analysis of the protocol in the presence of semi-honest adversaries. The protocol was implemented and deployed across three microbiology laboratories located in Norway, and we ran experiments on the datasets in which the number of records for each laboratory varied. Experiments were also performed on simulated microbiology datasets and data custodians connected through a local area network. The security analysis demonstrated that the protocol protects the privacy of individuals and data custodians under a semi-honest adversarial model. More precisely, the protocol remains secure with the collusion of up to N - 2 corrupt data custodians. The total runtime for the protocol scales linearly with the addition of data custodians and records. One million simulated records distributed across 20 data custodians were deduplicated within 45 s. The experimental results showed that the protocol is more efficient and scalable than previous protocols for the same problem. The proposed deduplication protocol is efficient and scalable for practical uses while protecting the privacy of patients and data custodians.
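
    As an illustration of the general problem the protocol addresses, the sketch below shows a deliberately simplified approach in which custodians pseudonymize linkage fields with a keyed hash (HMAC) before comparison. This is not the authors' secure multi-party protocol, which additionally tolerates up to N - 2 colluding custodians; it only conveys how deterministic record linkage can operate on transformed rather than raw identifiers. The key, field names, and records are assumptions for the example.

      # Simplified cross-custodian duplicate detection via a shared-key HMAC
      # over quasi-identifiers (NOT the paper's secure multi-party protocol).
      import hmac, hashlib

      SHARED_KEY = b"agreed-out-of-band"   # assumption: custodians pre-share a secret key

      def pseudonym(record: dict) -> str:
          """Deterministic keyed hash of the linkage fields (deterministic record linkage)."""
          linkage = "|".join([record["name"].lower().strip(), record["dob"]])
          return hmac.new(SHARED_KEY, linkage.encode(), hashlib.sha256).hexdigest()

      custodian_a = [{"name": "Kari Nordmann", "dob": "1975-04-02"}]
      custodian_b = [{"name": "kari nordmann ", "dob": "1975-04-02"},
                     {"name": "Ola Hansen", "dob": "1981-11-23"}]

      hashes_a = {pseudonym(r) for r in custodian_a}
      duplicates = [r for r in custodian_b if pseudonym(r) in hashes_a]
      print(f"{len(duplicates)} duplicate record(s) detected across custodians")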

  10. Multivariate Analysis of High Through-Put Adhesively Bonded Single Lap Joints: Experimental and Workflow Protocols

    DTIC Science & Technology

    2016-06-01

    Front matter only is available for this record: ARL-TR-7696, June 2016, US Army Research Laboratory. Multivariate Analysis of High Through-Put Adhesively Bonded Single Lap Joints: Experimental and Workflow Protocols, by Robert E Jensen, Daniel C DeSchepper, and David P Flanagan. Approved for public release; distribution unlimited. List of tables: Table 1, Single-lap-joint experimental parameters; Table 2, Survey ...

  11. Complete Bell-state analysis for superconducting-quantum-interference-device qubits with a transitionless tracking algorithm

    NASA Astrophysics Data System (ADS)

    Kang, Yi-Hao; Chen, Ye-Hong; Shi, Zhi-Cheng; Huang, Bi-Hua; Song, Jie; Xia, Yan

    2017-08-01

    We propose a protocol for complete Bell-state analysis for two superconducting-quantum-interference-device qubits. The Bell-state analysis could be completed by using a sequence of microwave pulses designed by the transitionless tracking algorithm, which is a useful method in the technique of shortcut to adiabaticity. After the whole process, the information for distinguishing four Bell states will be encoded on two auxiliary qubits, while the Bell states remain unchanged. One can read out the information by detecting the auxiliary qubits. Thus the Bell-state analysis is nondestructive. The numerical simulations show that the protocol possesses a high success probability of distinguishing each Bell state with current experimental technology even when decoherence is taken into account. Thus, the protocol may have potential applications for the information readout in quantum communications and quantum computations in superconducting quantum networks.

  12. High-rate measurement-device-independent quantum cryptography

    NASA Astrophysics Data System (ADS)

    Pirandola, Stefano; Ottaviani, Carlo; Spedalieri, Gaetana; Weedbrook, Christian; Braunstein, Samuel L.; Lloyd, Seth; Gehring, Tobias; Jacobsen, Christian S.; Andersen, Ulrik L.

    2015-06-01

    Quantum cryptography achieves a formidable task—the remote distribution of secret keys by exploiting the fundamental laws of physics. Quantum cryptography is now headed towards solving the practical problem of constructing scalable and secure quantum networks. A significant step in this direction has been the introduction of measurement-device independence, where the secret key between two parties is established by the measurement of an untrusted relay. Unfortunately, although qubit-implemented protocols can reach long distances, their key rates are typically very low, unsuitable for the demands of a metropolitan network. Here we show, theoretically and experimentally, that a solution can come from the use of continuous-variable systems. We design a coherent-state network protocol able to achieve remarkably high key rates at metropolitan distances, in fact three orders of magnitude higher than those currently achieved. Our protocol could be employed to build high-rate quantum networks where devices securely connect to nearby access points or proxy servers.

  13. Peer-led diabetes self-management programme for community-dwelling older people in China: study protocol for a quasi-experimental design.

    PubMed

    Shen, Huixia; Edwards, Helen; Courtney, Mary; McDowell, Jan; Wu, Ming

    2012-12-01

    This paper presents a protocol for a new peer-led self-management programme for community-dwelling older people with diabetes in Shanghai, China. The increasing prevalence of type 2 diabetes poses major public health challenges. Appropriate education programmes could help people with diabetes to achieve self-management and better health outcomes. Providing education programmes to the fast-growing number of people with diabetes presents a real challenge to the Chinese healthcare system, which is strained by personnel and funding shortages. Empirical literature and expert opinions suggest that peer education programmes are promising. Quasi-experimental. This study is a non-equivalent control group design (protocol approved in January, 2008). A total of 190 people, with 95 participants in each group, will be recruited from two different, but similar, communities. The programme, based on Social Cognitive Theory, will consist of basic diabetes instruction and social support and self-efficacy enhancing group activities. Basic diabetes instruction sessions will be delivered by health professionals, whereas social support and self-efficacy enhancing group activities will be led by peer leaders. Outcome variables include: self-efficacy, social support, self-management behaviours, depressive status, quality of life and healthcare utilization, which will be measured at baseline, 4 and 12 weeks. This theory-based programme tailored to Chinese patients has potential for improving diabetes self-management and subsequent health outcomes. In addition, the delivery mode, through involvement of peer leaders and existing community networks, is especially promising considering the healthcare resource shortages in China. © 2012 Blackwell Publishing Ltd.

  14. Use of video observation and motor imagery on jumping performance in national rhythmic gymnastics athletes.

    PubMed

    Battaglia, Claudia; D'Artibale, Emanuele; Fiorilli, Giovanni; Piazza, Marina; Tsopani, Despina; Giombini, Arrigo; Calcagno, Giuseppe; di Cagno, Alessandra

    2014-12-01

    The aim of this study was to evaluate whether a mental training protocol could improve gymnastic jumping performance. Seventy-two rhythmic gymnasts were randomly divided into an experimental and control group. At baseline, experimental group completed the Movement Imagery Questionnaire Revised (MIQ-R) to assess the gymnast ability to generate movement imagery. A repeated measures design was used to compare two different types of training aimed at improving jumping performance: (a) video observation and PETTLEP mental training associated with physical practice, for the experimental group, and (b) physical practice alone for the control group. Before and after six weeks of training, their jumping performance was measured using the Hopping Test (HT), Drop Jump (DJ), and Counter Movement Jump (CMJ). Results revealed differences between jumping parameters F(1,71)=11.957; p<.01, and between groups F(1,71)=10.620; p<.01. In the experimental group there were significant correlations between imagery ability and the post-training Flight Time of the HT, r(34)=-.295, p<.05 and the DJ, r(34)=-.297, p<.05. The application of the protocol described herein was shown to improve jumping performance, thereby preserving the elite athlete's energy for other tasks. Copyright © 2014 Elsevier B.V. All rights reserved.

  15. MIDAS: a practical Bayesian design for platform trials with molecularly targeted agents.

    PubMed

    Yuan, Ying; Guo, Beibei; Munsell, Mark; Lu, Karen; Jazaeri, Amir

    2016-09-30

    Recent success of immunotherapy and other targeted therapies in cancer treatment has led to an unprecedented surge in the number of novel therapeutic agents that need to be evaluated in clinical trials. Traditional phase II clinical trial designs were developed for evaluating one candidate treatment at a time and thus not efficient for this task. We propose a Bayesian phase II platform design, the multi-candidate iterative design with adaptive selection (MIDAS), which allows investigators to continuously screen a large number of candidate agents in an efficient and seamless fashion. MIDAS consists of one control arm, which contains a standard therapy as the control, and several experimental arms, which contain the experimental agents. Patients are adaptively randomized to the control and experimental agents based on their estimated efficacy. During the trial, we adaptively drop inefficacious or overly toxic agents and 'graduate' the promising agents from the trial to the next stage of development. Whenever an experimental agent graduates or is dropped, the corresponding arm opens immediately for testing the next available new agent. Simulation studies show that MIDAS substantially outperforms the conventional approach. The proposed design yields a significantly higher probability for identifying the promising agents and dropping the futile agents. In addition, MIDAS requires only one master protocol, which streamlines trial conduct and substantially decreases the overhead burden. Copyright © 2016 John Wiley & Sons, Ltd.

  16. MIDAS: A Practical Bayesian Design for Platform Trials with Molecularly Targeted Agents

    PubMed Central

    Yuan, Ying; Guo, Beibei; Munsell, Mark; Lu, Karen; Jazaeri, Amir

    2016-01-01

    Recent success of immunotherapy and other targeted therapies in cancer treatment has led to an unprecedented surge in the number of novel therapeutic agents that need to be evaluated in clinical trials. Traditional phase II clinical trial designs were developed for evaluating one candidate treatment at a time, and thus not efficient for this task. We propose a Bayesian phase II platform design, the Multi-candidate Iterative Design with Adaptive Selection (MIDAS), which allows investigators to continuously screen a large number of candidate agents in an efficient and seamless fashion. MIDAS consists of one control arm, which contains a standard therapy as the control, and several experimental arms, which contain the experimental agents. Patients are adaptively randomized to the control and experimental agents based on their estimated efficacy. During the trial, we adaptively drop inefficacious or overly toxic agents and “graduate” the promising agents from the trial to the next stage of development. Whenever an experimental agent graduates or is dropped, the corresponding arm opens immediately for testing the next available new agent. Simulation studies show that MIDAS substantially outperforms the conventional approach. The proposed design yields a significantly higher probability for identifying the promising agents and dropping the futile agents. In addition, MIDAS requires only one master protocol, which streamlines trial conduct and substantially decreases the overhead burden. PMID:27112322

  17. Interface design for CMOS-integrated Electrochemical Impedance Spectroscopy (EIS) biosensors.

    PubMed

    Manickam, Arun; Johnson, Christopher Andrew; Kavusi, Sam; Hassibi, Arjang

    2012-10-29

    Electrochemical Impedance Spectroscopy (EIS) is a powerful electrochemical technique to detect biomolecules. EIS has the potential of carrying out label-free and real-time detection, and in addition, can be easily implemented using electronic integrated circuits (ICs) that are built through standard semiconductor fabrication processes. This paper focuses on the various design and optimization aspects of EIS ICs, particularly the bio-to-semiconductor interface design. We discuss, in detail, considerations such as the choice of the electrode surface in view of IC manufacturing, surface linkers, and development of optimal bio-molecular detection protocols. We also report experimental results, using both macro- and micro-electrodes to demonstrate the design trade-offs and ultimately validate our optimization procedures.

  18. Mobile phone-based clinical guidance for rural health providers in India.

    PubMed

    Gautham, Meenakshi; Iyengar, M Sriram; Johnson, Craig W

    2015-12-01

    There are few tried and tested mobile technology applications to enhance and standardize the quality of health care by frontline rural health providers in low-resource settings. We developed a media-rich, mobile phone-based clinical guidance system for management of fevers, diarrhoeas and respiratory problems by rural health providers. Using a randomized control design, we field tested this application with 16 rural health providers and 128 patients at two rural/tribal sites in Tamil Nadu, Southern India. Protocol compliance for both groups, phone usability, acceptability and patient feedback for the experimental group were evaluated. Linear mixed-model analyses showed statistically significant improvements in protocol compliance in the experimental group. Usability and acceptability among patients and rural health providers were very high. Our results indicate that mobile phone-based, media-rich procedural guidance applications have significant potential for achieving consistently standardized quality of care by diverse frontline rural health providers, with patient acceptance. © The Author(s) 2014.

  19. Mussel micronucleus cytome assay.

    PubMed

    Bolognesi, Claudia; Fenech, Michael

    2012-05-17

    The micronucleus (MN) assay is one of the most widely used genotoxicity biomarkers in aquatic organisms, providing an efficient measure of chromosomal DNA damage occurring as a result of either chromosome breakage or chromosome mis-segregation during mitosis. The MN assay is today applied in laboratory and field studies using hemocytes and gill cells from bivalves, mainly from the genera Mytilus. These represent 'sentinel' organisms because of their ability to survive under polluted conditions and to accumulate both organic and inorganic pollutants. Because the mussel MN assay also includes scoring of different cell types, including necrotic and apoptotic cells and other nuclear anomalies, it is in effect an MN cytome assay. The mussel MN cytome (MUMNcyt) assay protocol we describe here reports the recommended experimental design, sample size, cell preparation, cell fixation and staining methods. The protocol also includes criteria and photomicrographs for identifying different cell types and scoring criteria for micronuclei (MNi) and nuclear buds. The complete procedure requires approximately 10 h for each experimental point/sampling station (ten animals).

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ishihara, T

    Currently, the problem at hand is distributing identical copies of OEP and filter software to a large number of farm nodes. One common method used to transfer this software is unicast. The unicast protocol faces the problem of repetitiously sending the same data over the network; since the sending rate is limited, this process becomes a bottleneck. One possible solution therefore lies in creating a reliable multicast protocol. A specific type of multicast protocol is the Bulk Multicast Protocol [4]. This system consists of one sender distributing data to many receivers. The sender delivers data packets at a given rate. In response, each receiver replies to the sender with a status packet containing information about packet loss in the form of Negative Acknowledgments. The probability that a receiver sends a status packet back to the sender is 1/N, where N is the number of receivers, so the protocol is designed to produce approximately one status packet for each data packet sent. In this project, we were able to show that the complete transfer of a file to multiple receivers was about 12 times faster with multicast than with unicast. The implementation of this experimental protocol shows remarkable improvement in mass data transfer to a large number of farm machines.
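
    The feedback rule described above (each of the N receivers replies with probability 1/N, giving roughly one status packet per data packet) can be checked with a few lines of simulation. This is a toy model, not the Bulk Multicast Protocol implementation used in the project.

      # Toy simulation of the status-packet feedback rule: N receivers each reply
      # with probability 1/N, so the sender sees about one status packet per data packet.
      import random

      def simulate(num_receivers: int, num_data_packets: int, seed: int = 1) -> float:
          rng = random.Random(seed)
          status_packets = 0
          for _ in range(num_data_packets):
              for _ in range(num_receivers):
                  if rng.random() < 1.0 / num_receivers:   # each receiver replies w.p. 1/N
                      status_packets += 1
          return status_packets / num_data_packets

      print(simulate(num_receivers=50, num_data_packets=10_000))   # ~1.0 status packet per data packet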

  1. Cognitive behavioral therapy program for cannabis use cessation in first-episode psychosis patients: study protocol for a randomized controlled trial.

    PubMed

    González-Ortega, Itxaso; Echeburúa, Enrique; García-Alocén, Adriana; Vega, Patricia; González-Pinto, Ana

    2016-07-29

    The high rate of cannabis use among patients with first-episode psychosis (FEP), as well as the associated negative impact on illness course and treatment outcomes, underlines the need for effective interventions in these populations. However, to date, there have been few clinical treatment trials (of pharmacological or psychological interventions) that have specifically focused on addressing comorbid cannabis use among these patients. The aim of this paper is to describe the design of a study protocol for a randomized controlled trial in which the objective is to assess the efficacy of a specific cognitive behavioral therapy program for cannabis cessation in patients with FEP compared to standard treatment (psychoeducation). This is a single-blind randomized study with 1 year of follow-up. Patients are to be randomly assigned to one of two treatments: (1) specific cognitive behavioral therapy for cannabis cessation composed of 1-hour sessions once a week for 16 weeks, in addition to pharmacological treatment scheduled by the psychiatrist, or (2) a control group (psychoeducation + pharmacological treatment) following the same format as the experimental group. Participants in both groups will be evaluated at baseline (pre-treatment), at 16 weeks (post-treatment), and at 3 and 6 months and 1 year of follow-up. The primary outcome will be that patients in the experimental group will have greater cannabis cessation than patients in the control group at post-treatment. The secondary outcome will be that the experimental group will have better clinical and functional outcomes than the control group. This study provides the description of a clinical trial design based on specific cognitive behavioral therapy for cannabis cessation in FEP patients, aiming to improve clinical and functional outcome, as well as tackling the addictive disorder. NCT02319746 ClinicalTrials.gov Identifier. ClinicalTrials.gov Protocol and Results Registration System (PRS) Receipt Release Date: 15 December 2014.

  2. Opportunistic detection of atrial fibrillation in subjects aged 65 years or older in primary care: a randomised clinical trial of efficacy. DOFA-AP study protocol

    PubMed Central

    2012-01-01

    Background Clinical Practice Guidelines recommend using peripheral blood pulse measurement as a screening test for Atrial Fibrillation. However, there is no adequate evidence supporting the efficacy of such a procedure in primary care clinical practice. This paper describes a study protocol designed to verify whether early opportunistic screening for Atrial Fibrillation by measuring the blood pulse is more effective than regular practice in subjects aged 65 years or older attending primary care centers. Methods/design A cluster-randomized controlled trial conducted in Primary Care Centers of the Spanish National Health Service. A total of 269 physicians and nurses will be allocated to one of the two arms of the trial by stratified randomization with a 3:2 ratio (three practitioners will be assigned to the Control Group for every two practitioners assigned to the Experimental Group). As many as 12 870 patients aged 65 years or older and meeting eligibility criteria will be recruited (8 580 will be allocated to the Experimental Group and 4 290 to the Control Group). Randomization and allocation to trial groups will be carried out by a central computer system. The Experimental Group practitioners will conduct opportunistic case finding for patients with Atrial Fibrillation, while the Control Group practitioners will follow the regular guidelines. The first step will be finding new Atrial Fibrillation cases. A descriptive and inferential analysis will be performed (bivariate and multivariate by multilevel logistic regression analysis). Discussion If our hypothesis is confirmed, we expect Primary Care professionals to take a more proactive approach and adopt a new protocol when a patient meeting the established screening criteria is identified. Finally, we expect this measure to be incorporated into Clinical Practice Guidelines. Trial registration The study is registered as NCT01291953 (ClinicalTrials.gov) PMID:23130754

  3. The MaxQuant computational platform for mass spectrometry-based shotgun proteomics.

    PubMed

    Tyanova, Stefka; Temu, Tikira; Cox, Juergen

    2016-12-01

    MaxQuant is one of the most frequently used platforms for mass-spectrometry (MS)-based proteomics data analysis. Since its first release in 2008, it has grown substantially in functionality and can be used in conjunction with more MS platforms. Here we present an updated protocol covering the most important basic computational workflows, including those designed for quantitative label-free proteomics, MS1-level labeling and isobaric labeling techniques. This protocol presents a complete description of the parameters used in MaxQuant, as well as of the configuration options of its integrated search engine, Andromeda. This protocol update describes an adaptation of an existing protocol that substantially modifies the technique. Important concepts of shotgun proteomics and their implementation in MaxQuant are briefly reviewed, including different quantification strategies and the control of false-discovery rates (FDRs), as well as the analysis of post-translational modifications (PTMs). The MaxQuant output tables, which contain information about quantification of proteins and PTMs, are explained in detail. Furthermore, we provide a short version of the workflow that is applicable to data sets with simple and standard experimental designs. The MaxQuant algorithms are efficiently parallelized on multiple processors and scale well from desktop computers to servers with many cores. The software is written in C# and is freely available at http://www.maxquant.org.

  4. Immunosuppression for in vivo research: state-of-the-art protocols and experimental approaches

    PubMed Central

    Diehl, Rita; Ferrara, Fabienne; Müller, Claudia; Dreyer, Antje Y; McLeod, Damian D; Fricke, Stephan; Boltze, Johannes

    2017-01-01

    Almost every experimental treatment strategy using non-autologous cell, tissue or organ transplantation is tested in small and large animal models before clinical translation. Because these strategies require immunosuppression in most cases, immunosuppressive protocols are a key element in transplantation experiments. However, standard immunosuppressive protocols are often applied without detailed knowledge regarding their efficacy within the particular experimental setting and in the chosen model species. Optimization of such protocols is pertinent to the translation of experimental results to human patients and thus warrants further investigation. This review summarizes current knowledge regarding immunosuppressive drug classes as well as their dosages and application regimens with consideration of species-specific drug metabolization and side effects. It also summarizes contemporary knowledge of novel immunomodulatory strategies, such as the use of mesenchymal stem cells or antibodies. Thus, this review is intended to serve as a state-of-the-art compendium for researchers to refine applied experimental immunosuppression and immunomodulation strategies to enhance the predictive value of preclinical transplantation studies. PMID:27721455

  5. Recovery After Prolonged Bed-Rest Deconditioning

    NASA Technical Reports Server (NTRS)

    Greenleaf, John E.; Quach, David T.

    2003-01-01

    Recovery data were analyzed from normal healthy test subjects maintained in the horizontal or head-down body position in well-controlled bed rest (BR) studies in which adherence to the well-designed protocol was monitored. Because recovery data were almost always of secondary importance to the data collected during the BR period, there was little consistency in the recovery experimental designs regarding control factors (e.g., diet or exercise), duration, or timing of data collection. Thus, only about half of the BR studies that provided appropriate data were analyzed here. These recovery data were sorted into two groups: those from BR protocols of less than 37 days, and those from protocols greater than 36 days. There was great disparity between the two groups in the responses that remained unchanged at the end of BR, and likewise in the variables that required more than 40 days for recovery; for example, some immune variables required more than 180 days. Knowledge of the recovery process after BR in healthy people should assist rehabilitation workers in differentiating "healthy" BR recovery responses from those attributable to the infirmity of sick or injured patients; this should result in more appropriate and efficient health care.

  6. The effect of standardized patient feedback in teaching surgical residents informed consent: results of a pilot study.

    PubMed

    Leeper-Majors, Kristine; Veale, James R; Westbrook, Thomas S; Reed, Kendall

    2003-01-01

    The purpose of this pilot study was to determine the effectiveness of using feedback from a standardized patient (SP) to teach a surgical resident (SR) informed consent (IC) protocol. Four general case types of increasing difficulty were tested in a longitudinal experimental design format. The four types of cases were appendectomy, cholecystectomy, colorectal cancer, and breast cancer. Eight SRs at varying years since completion of medical school served as subjects--four in the experimental group (received performance feedback from an SP) and four in the control group (received no SP feedback). Both the control and experimental groups participated in two patient encounters per case type. The first patient encounter served as the pretest, and the second patient encounter was the posttest. In each encounter, an SP rated the resident on 14 measures using an open-ended seven-point rating scale adapted and modified from the Brown University Interpersonal Skill Evaluation (BUISE). Each resident also reviewed a videotape of an expert giving IC between the pretest and the posttest for basic instructional protocol. Random stratified sampling was used to equally distribute the residents by postgraduate years. A total of 16 SPs were used in this study. All patient/SR encounters were videotaped. There was a statistically significant overall change--pretest to posttest and across cases (p = 0.001). The group effect was statistically significant (p = 0.000), with the experimental group averaging about 10 points greater than the control group. Standardized patient feedback is an effective modality in teaching surgical residents informed consent protocol. This conclusion is tentative, due to the limitations of sample size. The results of this study support continued research on the effects of standardized patient feedback to teach informed consent to surgical residents.

  7. The Social Dimension of Stress: Experimental Manipulations of Social Support and Social Identity in the Trier Social Stress Test

    PubMed Central

    Frisch, Johanna U.; Häusser, Jan A.; van Dick, Rolf; Mojzisch, Andreas

    2015-01-01

    In many situations humans are influenced by the behavior of other people and their relationships with them. For example, in stressful situations supportive behavior of other people as well as positive social relationships can act as powerful resources to cope with stress. In order to study the interplay between these variables, this protocol describes two effective experimental manipulations of social relationships and supportive behavior in the laboratory. In the present article, these two manipulations are implemented in the Trier Social Stress Test (TSST)—a standard stress induction paradigm in which participants are subjected to a simulated job interview. More precisely, we propose (a) a manipulation of the relationship between different protagonists in the TSST by making a shared social identity salient and (b) a manipulation of the behavior of the TSST-selection committee, which acts either supportively or unsupportively. These two experimental manipulations are designed in a modular fashion and can be applied independently of each other but can also be combined. Moreover, these two manipulations can also be integrated into other stress protocols and into other standardized social interactions such as trust games, negotiation tasks, or other group tasks. PMID:26649856

  8. The BIMDA shuttle flight mission - A low cost MPS payload

    NASA Technical Reports Server (NTRS)

    Holemans, Jaak; Cassanto, John M.; Morrison, Dennis; Rose, Alan; Luttges, Marvin

    1990-01-01

    The design, operation, and experimental protocol of the Bioserve-ITA Materials Dispersion Apparatus Payload (BIMDA) to be flown on the Space Shuttle on STS-37 are described. The aim of BIMDA is to investigate the methods and commercial potential of biomedical and fluid science applications in the microgravity environment. The BIMDA payload operations are diagrammed, and the payload components and experiments are listed, including the investigators and sponsoring institutions.

  9. Comparative Resuscitation Measures for Drug Toxicities Utilizing Lipid Emulsions

    DTIC Science & Technology

    2015-01-13

    experimental, mixed research design Methods: For each drug studied, seven swine were assigned to eight ACLS or BLS protocol resuscitation groups ... studied drug overdose. For example, with bupivacaine, seventy-one percent of the epinephrine/lipid group survived compared to 19% of all the groups ... surviving. The Epinephrine only group yielded three survivors and the Lipid emulsion only group yielded one survivor. No swine in the CPR only or ...

  10. Effects of organic matter removal, soil compaction and vegetation control on 10th year biomass and foliar nutrition: LTSP continent-wide comparisons

    Treesearch

    Felix Ponder Jr.; Robert L. Fleming; Shannon Berch; Matt D. Busse; John D. Elioff; Paul W. Hazlett; Richard D. Kabzems; J. Marty Kranabetter; David M. Morris; Deborah Page-Dumroese; Brian J. Palik; Robert F. Powers; Felipe G. Sanchez; D. Andrew Scott; Richard H. Stagg; Douglas M. Stone; David H. Young; Jianwei Zhang; Kim H. Ludovici; Daniel W. McKenney; Debbie S Mossa; Paul T. Sanborn; Richard A. Voldseth

    2012-01-01

    We examined 10th year above-ground planted tree and total stand biomass, and planted tree foliar N and P concentrations across gradients in soil disturbance at 45 North American Long-Term Soil Productivity (LTSP) installations. While ranging across several climate regions, these installations all share a common experimental design with similar measurement protocols....

  11. The effects of a walking program on older Chinese American immigrants with hypertension: a pretest and posttest quasi-experimental design.

    PubMed

    Chiang, Chun-Ying; Sun, Fan-Ko

    2009-01-01

    Hypertension is known to have high rates among Chinese Americans. Identifying culturally specific interventions to reduce sedentary behavior may be effective in reducing hypertension. This study examines the effects of an 8-week walking program with and without cultural modification. The study used a 2-group, pretest and posttest, quasi-experimental design. A total sample of 128 Chinese American immigrants with hypertension were assigned to walking groups. The results showed that the walking program had no significant effects upon participant blood pressure or walking endurance. The results also revealed that individuals in the maintenance stage walked longer than those in the preparation stage. A comparison of demographic data showed that subjects with a lower level of education walked more minutes per week, which contributed to lower systolic blood pressures among this group as compared with those with a higher level of education. These results suggest that this walking protocol, when translated into Chinese and when accompanied by a weekly telephone reminder and other interactions with a Chinese-speaking nurse, is appropriate to use without additional cultural modification. Future research should examine other components of Chinese culture or should apply this protocol for a longer period of time.

  12. Comparison of protocols for measuring cosmetic ingredient distribution in human and pig skin.

    PubMed

    Gerstel, D; Jacques-Jamin, C; Schepky, A; Cubberley, R; Eilstein, J; Grégoire, S; Hewitt, N; Klaric, M; Rothe, H; Duplan, H

    2016-08-01

    The Cosmetics Europe Skin Bioavailability and Metabolism Task Force aims to improve the measurement and prediction of the bioavailability of topically exposed compounds for risk assessment. Key parameters of the experimental design of the skin penetration studies were compared. Penetration studies with frozen human and pig skin were conducted in two laboratories, according to the SCCS and OECD 428 guidelines. The disposition in skin was measured 24 h after finite topical doses of caffeine, resorcinol and 7-ethoxycoumarin. The bioavailability distribution in skin layers of cold and radiolabelled chemicals was comparable. Furthermore, the distribution of each chemical was comparable in human and pig skin. The protocol was reproducible across the two laboratories. There were small differences in the amount of chemical detected in the skin layers, which were attributed to differences in washing procedures and the anatomical sites of the skin used. In conclusion, these studies support the use of pig skin as an alternative source of skin should the availability of human skin become a limiting factor. If radiolabelled chemicals are not available, cold chemicals can be used, provided that the influence of chemical stability, reactivity or metabolism on the experimental design and the relevance of the data obtained is considered. Copyright © 2016. Published by Elsevier Ltd.

  13. Biofeedback treatment of constipation: a critical review.

    PubMed

    Heymen, Steve; Jones, Kenneth R; Scarlett, Yolanda; Whitehead, William E

    2003-09-01

    This review was designed to 1) critically examine the research design used in investigations of biofeedback for pelvic floor dyssynergia, 2) compare the various biofeedback treatment protocols for pelvic floor dyssynergia-type constipation used in this research, 3) identify factors that influence treatment outcome, and 4) identify goals for future biofeedback research for pelvic floor dyssynergia. A comprehensive review of both the pediatric and adult research from 1970 to 2002 on "biofeedback for constipation" was conducted using a Medline search in all languages. Only prospective studies including five or more subjects that described the treatment protocol were included. In addition, a meta-analysis of these studies was performed to compare the outcome of different biofeedback protocols for treating constipation. Thirty-eight studies were reviewed, and sample size, treatment protocol, outcome rates, number of sessions, and etiology are shown in a table. Ten studies using a parallel treatment design were reviewed in detail, including seven that randomized subjects to treatment groups. A meta-analysis (weighted by subjects) was performed to compare the results of two treatment protocols prevalent in the literature. The mean success rate of studies using pressure biofeedback (78 percent) was superior (P = 0.018) to the mean success rate for studies using electromyography biofeedback (70 percent). However, the mean success rates comparing studies using intra-anal electromyography sensors to studies using perianal electromyography sensors were 69 and 72 percent, respectively, indicating no advantages for one type of electromyography protocol over the other (P = 0.428). In addition to the varied protocols and instrumentation used, there also are inconsistencies in the literature regarding the severity and etiology of symptoms, patient selection criteria, and the definition of a successful outcome. Finally, no anatomic, physiologic, or demographic variables were identified that would assist in predicting successful outcome. Having significant psychological symptoms was identified as a factor that may influence treatment outcome, but this requires further study. Although most studies report positive results using biofeedback to treat constipation, quality research is lacking. Specific recommendations are made for future investigations to 1) improve experimental design, 2) clearly define outcome measures, 3) identify the etiology and severity of symptoms, 4) determine which treatment protocol and which component of treatment is most effective for different types of subjects, 5) systematically explore the role of psychopathology in this population, 6) use an adequate sample size that allows for meaningful analysis, and 7) include long-term follow-up data.

  14. Translational cognitive endocrinology: Designing rodent experiments with the goal to ultimately enhance cognitive health in women

    PubMed Central

    Mennenga, S.E.; Bimonte-Nelson, H.A.

    2014-01-01

    Understanding the cognitive impact of endogenously derived, and exogenously administered, hormone alterations is necessary for developing hormone treatments to support healthy brain function in women, especially during aging. The increasing number of studies in the burgeoning area of translational cognitive neuroendocrinology has revealed numerous factors that influence the extent and direction of female steroid effects on cognition. Here, we discuss the decision processes underlying the design of rodent hormone manipulation experiments evaluating learning and memory. It is noted that even when beginning with a clear hypothesis-driven question, there are numerous factors to consider in order to solidify a sound experimental design that will yield clean, interpretable results. Decisions and considerations include: age of animals at hormone administration and test, ovariectomy implementation, when to administer hormones relative to ovarian hormone loss, how and whether to monitor the estrous cycle if animals are ovary-intact, dose of hormone, administration route of hormone, hormone treatment confirmation protocols, handling procedures required for hormone administration and treatment confirmation, cognitive domains to be tested and which mazes should be utilized to test these cognitive domains, and control measures to be used. A balanced view of optimal design and realistic experimental practice and protocol is presented. The emerging results from translational cognitive neuroendocrinology studies have been diverse, but also enlightening and exciting as we realize the broad scope and powerful nature of ovarian hormone effects on the brain and its function. We must design, implement, and interpret hormone and cognition experiments with sensitivity to these tenets, acknowledging and respecting the breadth and depth of the impact gonadal hormones have on brain functioning and its rich plasticity. PMID:23391594

  15. The importance of communication in promoting voluntary participation in an experimental trial: A qualitative study based on the assessment of the gamma-interferon test for the diagnosis of bovine tuberculosis in France

    PubMed Central

    Dufour, Barbara; Praud, Anne

    2017-01-01

    Understanding the factors leading each stakeholder to participate in an experimental trial is a key element for improving trial set-up and for identifying selection bias in statistical analyses. An experimental protocol, validated by the European Commission, was developed in France to assess whether the gamma-interferon test is sufficiently accurate to replace the second intradermal skin test in cases of suspected bovine tuberculosis. Implemented between 2013 and 2015, this experimental trial was based on voluntary participation. To determine and understand the motivation or reluctance of farmers to take part in this trial, we carried out a sociological survey in France. Our study was based on semi-structured interviews with the farmers and other stakeholders involved. The analysis of the findings demonstrated that shortening the lock-up period during tuberculosis suspicion, through the use of a gamma-interferon test, was an important aim and a genuine challenge for the animal health stakeholders. However, some farmers did not wish to continue the trial because it could potentially have drastic consequences for them. Moreover, misunderstandings and confusion concerning the objectives and consequences of the trial led stakeholders to reject it forcefully. Based on our results, we offer some recommendations: clear and appropriate communication tools should be prepared to explain the protocol and its aims. In addition, these types of animal health trials should be designed with the stakeholders' interests in mind. This study provides a better understanding of farmer motivations and stakeholder influences on trial participation and outcomes. The findings can be used to help design trials so that they promote participation by farmers and by all animal health stakeholders in general. PMID:28973018

  16. Link-state-estimation-based transmission power control in wireless body area networks.

    PubMed

    Kim, Seungku; Eom, Doo-Seop

    2014-07-01

    This paper presents a novel transmission power control protocol to extend the lifetime of sensor nodes and to increase the link reliability in wireless body area networks (WBANs). We first experimentally investigate the properties of the link states using the received signal strength indicator (RSSI). We then propose a practical transmission power control protocol based on both short- and long-term link-state estimations. Both the short- and long-term link-state estimations enable the transceiver to adapt the transmission power level and target the RSSI threshold range, respectively, to simultaneously satisfy the requirements of energy efficiency and link reliability. Finally, the performance of the proposed protocol is experimentally evaluated in two experimental scenarios-body posture change and dynamic body motion-and compared with the typical WBAN transmission power control protocols, a real-time reactive scheme, and a dynamic postural position inference mechanism. From the experimental results, it is found that the proposed protocol increases the lifetime of the sensor nodes by a maximum of 9.86% and enhances the link reliability by reducing the packet loss by a maximum of 3.02%.
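
    As an illustration of the general idea (not the authors' exact algorithm), the sketch below adapts a discrete transmit-power level from exponentially smoothed RSSI estimates so that the short-term link state stays inside a target window; the power levels, window bounds, and smoothing constants are assumptions made purely for this example.

```python
# Minimal sketch of RSSI-driven transmission power control, illustrating the general
# idea only (not the authors' exact algorithm). The power levels, target RSSI window,
# and smoothing constants below are assumptions made for this example.

TX_LEVELS = list(range(-25, 1, 5))             # hypothetical dBm settings: -25 ... 0
RSSI_TARGET_LOW, RSSI_TARGET_HIGH = -90, -80   # assumed target RSSI window (dBm)
ALPHA_SHORT, ALPHA_LONG = 0.5, 0.05            # short-/long-term smoothing factors


class PowerController:
    def __init__(self):
        self.level = len(TX_LEVELS) // 2       # start mid-range
        self.short_est = None                  # short-term link-state estimate
        self.long_est = None                   # long-term link-state estimate

    def update(self, rssi_dbm):
        """Update link-state estimates from an ACK's RSSI and adapt the power level."""
        self.short_est = rssi_dbm if self.short_est is None else (
            ALPHA_SHORT * rssi_dbm + (1 - ALPHA_SHORT) * self.short_est)
        self.long_est = rssi_dbm if self.long_est is None else (
            ALPHA_LONG * rssi_dbm + (1 - ALPHA_LONG) * self.long_est)

        # The short-term estimate drives immediate steps toward the target window;
        # a long-term estimate could additionally retune the window itself.
        if self.short_est < RSSI_TARGET_LOW and self.level < len(TX_LEVELS) - 1:
            self.level += 1                    # link too weak: raise power
        elif self.short_est > RSSI_TARGET_HIGH and self.level > 0:
            self.level -= 1                    # link stronger than needed: save energy
        return TX_LEVELS[self.level]


ctrl = PowerController()
print(ctrl.update(-95), ctrl.update(-93))      # weak link: power steps up
```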

  17. Continuous-variable protocol for oblivious transfer in the noisy-storage model.

    PubMed

    Furrer, Fabian; Gehring, Tobias; Schaffner, Christian; Pacher, Christoph; Schnabel, Roman; Wehner, Stephanie

    2018-04-13

    Cryptographic protocols are the backbone of our information society. This includes two-party protocols which offer protection against distrustful players. Such protocols can be built from a basic primitive called oblivious transfer. We present and experimentally demonstrate here a quantum protocol for oblivious transfer for optical continuous-variable systems, and prove its security in the noisy-storage model. This model allows us to establish security by sending more quantum signals than an attacker can reliably store during the protocol. The security proof is based on uncertainty relations that we derive for continuous-variable systems and that differ from the ones used in quantum key distribution. We experimentally demonstrate the proposed oblivious transfer protocol in a proof-of-principle experiment for various channel losses by using entangled two-mode squeezed states measured with balanced homodyne detection. Our work enables the implementation of arbitrary two-party quantum cryptographic protocols with continuous-variable communication systems.

  18. Sampling and measurement protocols for long-term silvicultural studies on the Penobscot Experimental Forest

    Treesearch

    Justin D. Waskiewicz; Laura S. Kenefic; Nicole S. Rogers; Joshua J. Puhlick; John C. Brissette; Richard J. Dionne

    2015-01-01

    The U.S. Forest Service, Northern Research Station has been conducting research on the silviculture of northern conifers on the Penobscot Experimental Forest (PEF) in Maine since 1950. Formal study plans provide guidance and specifications for the experimental treatments, but documentation is also needed to ensure consistency in data collection and sampling protocols....

  19. Quasi-experimental study designs series-paper 11: supporting the production and use of health systems research syntheses that draw on quasi-experimental study designs.

    PubMed

    Lavis, John N; Bärnighausen, Till; El-Jardali, Fadi

    2017-09-01

    To describe the infrastructure available to support the production of policy-relevant health systems research syntheses, particularly those incorporating quasi-experimental evidence, and the tools available to support the use of these syntheses. Literature review. The general challenges associated with the available infrastructure include its sporadic nature or limited coverage of issues and countries, whereas the specific challenges related to policy-relevant syntheses of quasi-experimental evidence include the lack of a mechanism to register synthesis titles and scoping review protocols, the limited number of groups preparing user-friendly summaries, and the difficulty of finding quasi-experimental studies for inclusion in rapid syntheses and research syntheses more generally. Although some new tools have emerged in recent years, such as guidance workbooks and citizen briefs and panels, challenges in using the available tools to support the use of policy-relevant syntheses of quasi-experimental evidence arise because such studies can be harder for policymakers and stakeholders to commission and understand. Policymakers, stakeholders, and researchers need to expand the coverage and institutionalize the use of the available infrastructure and tools to support the use of health system research syntheses containing quasi-experimental evidence. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Examining clinical supervision as a mechanism for changes in practice: a research protocol.

    PubMed

    Dilworth, Sophie; Higgins, Isabel; Parker, Vicki; Kelly, Brian; Turner, Jane

    2014-02-01

    This paper describes the research protocol for a study exploring if and how clinical supervision facilitates change in practice relating to psychosocial aspects of care for Health Professionals who have been trained to deliver a psychosocial intervention to adults with cancer. There is a recognized need to implement care that is in line with clinical practice guidelines for the psychosocial care of adults with cancer. Clinical supervision is recommended as a means to support Health Professionals in providing the recommended psychosocial care. A qualitative design embedded within an experimental, stepped wedge randomized control trial. The study will use discourse analysis to analyse audio-recorded data collected in clinical supervision sessions that are being delivered as one element of a large randomized control trial. The sessions will be attended primarily by nurses, but will also include physiotherapists, radiation therapists, and occupational therapists. The Health Professionals are participants in a randomized control trial designed to reduce anxiety and depression in distressed adults with cancer. The sessions will be facilitated by psychiatrists experienced in psycho-oncology and the provision of clinical supervision. The proposed research is designed specifically to facilitate exploration of the mechanisms by which clinical supervision enables Health Professionals to deliver a brief, tailored psychosocial intervention in the context of their everyday practice. This is the first study to use discourse analysis embedded within an experimental randomized control trial to explore the mechanisms of change generated within clinical supervision by analysing the discourse within the clinical supervision sessions. © 2013 John Wiley & Sons Ltd.

  1. Regular Topologies for Gigabit Wide-Area Networks. Volume 1

    NASA Technical Reports Server (NTRS)

    Shacham, Nachum; Denny, Barbara A.; Lee, Diane S.; Khan, Irfan H.; Lee, Danny Y. C.; McKenney, Paul

    1994-01-01

    In general terms, this project aimed at the analysis and design of techniques for very high-speed networking. The formal objectives of the project were to: (1) Identify switch and network technologies for wide-area networks that interconnect a large number of users and can provide individual data paths at gigabit/s rates; (2) Quantitatively evaluate and compare existing and proposed architectures and protocols, identify their strength and growth potentials, and ascertain the compatibility of competing technologies; and (3) Propose new approaches to existing architectures and protocols, and identify opportunities for research to overcome deficiencies and enhance performance. The project was organized into two parts: 1. The design, analysis, and specification of techniques and protocols for very-high-speed network environments. In this part, SRI has focused on several key high-speed networking areas, including Forward Error Control (FEC) for high-speed networks in which data distortion is the result of packet loss, and the distribution of broadband, real-time traffic in multiple user sessions. 2. Congestion Avoidance Testbed Experiment (CATE). This part of the project was done within the framework of the DARTnet experimental T1 national network. The aim of the work was to advance the state of the art in benchmarking DARTnet's performance and traffic control by developing support tools for network experimentation, by designing benchmarks that allow various algorithms to be meaningfully compared, and by investigating new queueing techniques that better satisfy the needs of best-effort and reserved-resource traffic. This document is the final technical report describing the results obtained by SRI under this project. The report consists of three volumes: Volume 1 contains a technical description of the network techniques developed by SRI in the areas of FEC and multicast of real-time traffic. Volume 2 describes the work performed under CATE. Volume 3 contains the source code of all software developed under CATE.
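
    The FEC work above targets networks in which distortion means lost packets rather than bit errors. As a toy illustration of packet-level erasure coding (not the coding scheme developed under this project), the sketch below adds one XOR parity packet per group so a receiver can rebuild any single lost packet.

```python
# Toy packet-level erasure coding: one XOR parity packet per group of equal-length data
# packets lets the receiver rebuild any single lost packet. This illustrates the general
# idea of FEC against packet loss only; it is not the coding scheme developed by SRI.
from functools import reduce

def make_parity(packets):
    """XOR equal-length packets together into one parity packet."""
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), packets)

def recover(survivors, parity):
    """Rebuild the single missing packet from the surviving packets plus the parity."""
    return make_parity(survivors + [parity])

group = [b"pkt0data", b"pkt1data", b"pkt2data"]
parity = make_parity(group)
rebuilt = recover([group[0], group[2]], parity)   # packet 1 was "lost"
assert rebuilt == group[1]
```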

  2. EXPERIMENTAL PROTOCOL FOR DETERMINING PROTOLYSIS REACTION RATE CONSTANTS

    EPA Science Inventory

    An experimental protocol to determine photolysis rates of chemicals which photolyze relatively rapidly in the gas phase has been developed. This procedure provides a basis for evaluating the relative importance of one atmospheric reaction pathway (i.e., photolysis) for organic su...

  3. Challenges for Preclinical Investigations of Human Biofield Modalities

    PubMed Central

    Gronowicz, Gloria; Bengston, William

    2015-01-01

    Preclinical models for studying the effects of the human biofield have great potential to advance our understanding of human biofield modalities, which include external qigong, Johrei, Reiki, therapeutic touch, healing touch, polarity therapy, pranic healing, and other practices. A short history of Western biofield studies using preclinical models is presented and demonstrates numerous and consistent examples of human biofields significantly affecting biological systems both in vitro and in vivo. Methodological issues arising from these studies and practical solutions in experimental design are presented. Important questions still left unanswered with preclinical models include variable reproducibility, dosing, intentionality of the practitioner, best preclinical systems, and mechanisms. Input from the biofield practitioners in the experimental design is critical to improving experimental outcomes; however, the development of standard criteria for uniformity of practice and for inclusion of multiple practitioners is needed. Research in human biofield studies involving preclinical models promises a better understanding of the mechanisms underlying the efficacy of biofield therapies and will be important in guiding clinical protocols and integrating treatments with conventional medical therapies. PMID:26665042

  4. [The Effect of Structured Group Reminiscence Therapy on the Life Satisfaction of Institutionalized Elderly].

    PubMed

    Chen, Shu-Mei; Kuo, Chien-Lin; Chen, Mei-Rong; Lee, Lai-Ling; Lee, Pi-Yueh; Wang, Shu-Fen

    2016-08-01

    Long-term care institutions have become an option for older people who are dependent in activities of daily living. However, insufficient attention has been focused on assessing the life satisfaction of those currently residing in these institutions in Taiwan. Previous research indicates that group reminiscence may improve the life satisfaction of older adults. However, there is currently no consensus regarding the implementation and evaluation of reminiscence interventions. To examine the effect of a structured group reminiscence protocol on the life satisfaction of institutionalized older adults. The study used a quasi-experimental design. A total of 48 older adults were recruited by convenience sampling from two long-term care institutions in southern Taiwan. The experimental group (n = 23) received 8 weeks of structured group reminiscence for 40 minutes weekly, while the control group (n = 25) received routine care from the institution. Both groups were evaluated using a life-satisfaction questionnaire before and after the intervention and again four weeks later. Life satisfaction scores for the two groups were statistically similar on the pre-test and significantly different on both post-test questionnaires. The scores for the experimental and control groups were pre-test: 24.22 vs 23.36 (p = .063); post-test I: 27.22 vs 23.32 (p < .001); and post-test II: 26.43 vs 23.00 (p < .001). The mean post-test scores for the experimental group were significantly higher than the pre-test score (p < .001). The generalized estimating equation test showed that the overall life satisfaction score for the experimental group increased by 0.85 points more than that of the control group (p = .042), a significant difference. The results support that the 8-week structured group reminiscence protocol effectively enhances life satisfaction in older adults. The results of this study may be referenced in the continuing education of nurses working in long-term care institutions in the context of helping nurses organize, facilitate, and evaluate this protocol.

  5. Brainlab: A Python Toolkit to Aid in the Design, Simulation, and Analysis of Spiking Neural Networks with the NeoCortical Simulator.

    PubMed

    Drewes, Rich; Zou, Quan; Goodman, Philip H

    2009-01-01

    Neuroscience modeling experiments often involve multiple complex neural network and cell model variants, complex input stimuli and input protocols, followed by complex data analysis. Coordinating all this complexity becomes a central difficulty for the experimenter. The Python programming language, along with its extensive library packages, has emerged as a leading "glue" tool for managing all sorts of complex programmatic tasks. This paper describes a toolkit called Brainlab, written in Python, that leverages Python's strengths for the task of managing the general complexity of neuroscience modeling experiments. Brainlab was also designed to overcome the major difficulties of working with the NCS (NeoCortical Simulator) environment in particular. Brainlab is an integrated model-building, experimentation, and data analysis environment for the powerful parallel spiking neural network simulator system NCS.

  6. Brainlab: A Python Toolkit to Aid in the Design, Simulation, and Analysis of Spiking Neural Networks with the NeoCortical Simulator

    PubMed Central

    Drewes, Rich; Zou, Quan; Goodman, Philip H.

    2008-01-01

    Neuroscience modeling experiments often involve multiple complex neural network and cell model variants, complex input stimuli and input protocols, followed by complex data analysis. Coordinating all this complexity becomes a central difficulty for the experimenter. The Python programming language, along with its extensive library packages, has emerged as a leading “glue” tool for managing all sorts of complex programmatic tasks. This paper describes a toolkit called Brainlab, written in Python, that leverages Python's strengths for the task of managing the general complexity of neuroscience modeling experiments. Brainlab was also designed to overcome the major difficulties of working with the NCS (NeoCortical Simulator) environment in particular. Brainlab is an integrated model-building, experimentation, and data analysis environment for the powerful parallel spiking neural network simulator system NCS. PMID:19506707

  7. Electro-mechanical probe positioning system for large volume plasma device

    NASA Astrophysics Data System (ADS)

    Sanyasi, A. K.; Sugandhi, R.; Srivastava, P. K.; Srivastav, Prabhakar; Awasthi, L. M.

    2018-05-01

    An automated electro-mechanical system for the positioning of plasma diagnostics has been designed and implemented in a Large Volume Plasma Device (LVPD). The system consists of 12 electro-mechanical assemblies, which are orchestrated using the Modbus communication protocol on 4-wire RS485 communications to meet the experimental requirements. Each assembly has a lead screw-based mechanical structure, Wilson feed-through-based vacuum interface, bipolar stepper motor, micro-controller-based stepper drive, and optical encoder for online positioning correction of probes. The novelty of the system lies in the orchestration of multiple drives on a single interface, fabrication and installation of the system for a large experimental device like the LVPD, in-house developed software, and adopted architectural practices. The paper discusses the design, description of hardware and software interfaces, and performance results in LVPD.
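
    For readers unfamiliar with Modbus RTU, the sketch below composes a standard "Write Single Register" (function 0x06) frame with its CRC-16, the kind of command a stepper drive on an RS-485 bus might accept; the slave address, register number, and target value are hypothetical and do not reflect the LVPD system's actual register map.

```python
# Sketch of a Modbus RTU "Write Single Register" (function 0x06) frame of the kind a
# stepper drive on an RS-485 bus might accept. The slave address, register number, and
# target value are hypothetical and do not reflect the LVPD system's actual register map.
import struct

def crc16_modbus(frame: bytes) -> int:
    """Standard Modbus CRC-16 (initial value 0xFFFF, reflected polynomial 0xA001)."""
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc

def write_single_register(slave: int, register: int, value: int) -> bytes:
    pdu = struct.pack(">BBHH", slave, 0x06, register, value)  # data fields are big-endian
    return pdu + struct.pack("<H", crc16_modbus(pdu))         # CRC is sent low byte first

# e.g. command hypothetical drive #3 to a step count of 1500 via register 0x0010
frame = write_single_register(slave=3, register=0x0010, value=1500)
print(frame.hex())
```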

  8. Orbiting Quarantine Facility. The Antaeus report, summary

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Requirements for handling extraterrestrial samples in an orbiting quarantine facility are examined. The major concepts and findings of the study are outlined. One approach that could be taken for receiving, containing, and analyzing samples returned from the surface of Mars in a mission analogous to the lunar return missions of the late 1960s and early 1970s is described. It constructs a general mission scenario and presents an overall systems design, including an approach to cost assessment. Particular attention is paid to the design of system hardware components and to the elaboration of an experimental protocol.

  9. Design and fabrication of N-alkyl-polyethylenimine-stabilized iron oxide nanoclusters for gene delivery.

    PubMed

    Liu, Gang; Wang, Zhiyong; Lee, Seulki; Ai, Hua; Chen, Xiaoyuan

    2012-01-01

    With the rapid development of nanotechnology, inorganic magnetic nanoparticles, especially iron oxide nanoparticles (IOs), have emerged as great vehicles for biomedical diagnostic and therapeutic applications. In order to rationally design IO-based gene delivery nanovectors, surface modification is essential and determines the loading and release of the gene of interest. Here we highlight the basic concepts and applications of nonviral gene delivery vehicles based on low molecular weight N-alkyl polyethylenimine-stabilized IOs. The experimental protocols related to these topics are described in this chapter. Copyright © 2012 Elsevier Inc. All rights reserved.

  10. A critical evaluation of the experimental design of studies of mechanism based enzyme inhibition, with implications for in vitro-in vivo extrapolation.

    PubMed

    Ghanbari, F; Rowland-Yeo, K; Bloomer, J C; Clarke, S E; Lennard, M S; Tucker, G T; Rostami-Hodjegan, A

    2006-04-01

    The published literature on mechanism based inhibition (MBI) of CYPs was evaluated with respect to experimental design, methodology and data analysis. Significant variation was apparent in the dilution factor, ratio of preincubation to incubation times and probe substrate concentrations used, and there were some anomalies in the estimation of the associated kinetic parameters (k_inact, K_I, r). The impact of the application of inaccurate values of k_inact and K_I when extrapolating to the extent of inhibition in vivo is likely to be greatest for those compounds of intermediate inhibitory potency, but this also depends on the fraction of the net clearance of substrate subject to MBI and the pre-systemic and systemic exposure to the inhibitor. For potent inhibitors, the experimental procedure is unlikely to have a material influence on the maximum inhibition. Nevertheless, the bias in the values of the kinetic parameters may influence the time for recovery of enzyme activity following re-synthesis of the enzyme. Careful attention to the design of in vitro experiments to obtain accurate kinetic parameters is necessary for a reliable prediction of different aspects of the in vivo consequences of MBI. The review calls for experimental studies to quantify the impact of study design in studies of MBI, with a view to better harmonisation of protocols.
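
    For reference, the extrapolations discussed above rest on the standard MBI relation in which the observed inactivation rate saturates with inhibitor concentration; the short sketch below evaluates it with purely illustrative parameter values.

```python
# Standard mechanism-based inhibition relation: the observed first-order inactivation
# rate at inhibitor concentration [I] is k_obs = k_inact * [I] / (K_I + [I]).
# The parameter values below are purely illustrative.

def k_obs(inhibitor_conc_uM, k_inact_per_min, K_I_uM):
    return k_inact_per_min * inhibitor_conc_uM / (K_I_uM + inhibitor_conc_uM)

for conc in (0.5, 2.0, 10.0, 50.0):            # inhibitor concentrations in uM
    rate = k_obs(conc, k_inact_per_min=0.05, K_I_uM=5.0)
    print(f"[I] = {conc:5.1f} uM -> k_obs = {rate:.4f} per min")
```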

  11. Hard real-time closed-loop electrophysiology with the Real-Time eXperiment Interface (RTXI)

    PubMed Central

    George, Ansel; Dorval, Alan D.; Christini, David J.

    2017-01-01

    The ability to experimentally perturb biological systems has traditionally been limited to static pre-programmed or operator-controlled protocols. In contrast, real-time control allows dynamic probing of biological systems with perturbations that are computed on-the-fly during experimentation. Real-time control applications for biological research are available; however, these systems are costly and often restrict the flexibility and customization of experimental protocols. The Real-Time eXperiment Interface (RTXI) is an open source software platform for achieving hard real-time data acquisition and closed-loop control in biological experiments while retaining the flexibility needed for experimental settings. RTXI has enabled users to implement complex custom closed-loop protocols in single cell, cell network, animal, and human electrophysiology studies. RTXI is also used as a free and open source, customizable electrophysiology platform in open-loop studies requiring online data acquisition, processing, and visualization. RTXI is easy to install, can be used with an extensive range of external experimentation and data acquisition hardware, and includes standard modules for implementing common electrophysiology protocols. PMID:28557998

  12. Phase II Clinical Trial of Intraoral Grafting of Human Tissue Engineered Oral Mucosa

    DTIC Science & Technology

    2017-10-01

    experimental arm subject in the small defect study. A protocol amendment in early 2017 revised the study inclusionary criteria to include all non ... group phase II study to assess the safety and efficacy for use of human EVPOME for soft tissue intraoral grafting procedures compared to the “gold

  13. Three-Dimensional Computer Graphics Brain-Mapping Project

    DTIC Science & Technology

    1988-03-24

    1975-76, one of these brains was hand digitized. It was then reconstructed three dimensionally, using an Evans and Sutherland Picture System 2. This ... Yakovlev Collection, we use the Evans and Sutherland Picture System 2 which we have been employing for this purpose for a dozen years. Its virtue is ... careful, experimentally designed new protocol (See Figure 20). Most of these heads were imaged with Computed Tomography, thanks to Clint Stiles of Picker

  14. Coexistence of twitch potentiation and tetanic force decline in rat hindlimb muscle

    NASA Technical Reports Server (NTRS)

    Rankin, Lucinda L.; Enoka, Roger M.; Volz, Kathryn A.; Stuart, Douglas G.

    1988-01-01

    The effect of whole-muscle fatigue on the isometric twitch was investigated in various hindlimb muscles of anesthetized rats, using an experimental protocol designed to assess the levels of fatigability in motor units. The results of EMG and force measurements revealed the existence of a linear relationship between fatigability and the magnitude of the twitch force following the fatigue test in both soleus and extensor digitorum longus muscles.

  15. Improving practices in nanomedicine through near real-time pharmacokinetic analysis

    NASA Astrophysics Data System (ADS)

    Magaña, Isidro B.

    More than a decade into the development of gold nanoparticles, with multiple clinical trials underway, ongoing pre-clinical research continues towards better understanding in vivo interactions. The goal is treatment optimization through improved best practices. In an effort to collect information for healthcare providers enabling informed decisions in a relevant time frame, instrumentation for real-time plasma concentration (multi-wavelength photoplethysmography) and protocols for rapid elemental analysis (energy dispersive X-Ray fluorescence) of biopsied tumor tissue have been developed in a murine model. An initial analysis, designed to demonstrate the robust nature and utility of the techniques, revealed that area under the bioavailability curve (AUC) alone does not currently inform tumor accumulation with a high degree of accuracy (R2=0.56), marginally better than injected dose (R2=0.46). This finding suggests that the control of additional experimental and physiological variables (chosen through modeling efforts) may yield more predictable tumor accumulation. Subject core temperature, blood pressure, and tumor perfusion are evaluated relative to particle uptake in a murine tumor model. New research efforts are also focused on adjuvant therapies that are employed to modify circulation parameters, including the AUC, of nanorods and gold nanoshells. Preliminary studies demonstrated a greater than 300% increase in average AUC using a reticuloendothelial blockade agent versus control groups. Given a better understanding of the relative importance of the physiological factors that influence rates of tumor accumulation, a set of experimental best practices is presented. This dissertation outlines the experimental protocols conducted, and discusses the real-world needs discovered and how these needs became specifications of developed protocols.
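
    As a minimal illustration of one quantity discussed above, the sketch below computes an AUC from sampled plasma concentrations with the linear trapezoidal rule; the time points and concentrations are invented and do not come from this work.

```python
# Minimal AUC calculation from sampled plasma concentrations (linear trapezoidal rule).
# The time points and concentrations below are invented for illustration only.

def auc_trapezoid(times_h, conc):
    points = list(zip(times_h, conc))
    return sum((t2 - t1) * (c1 + c2) / 2.0
               for (t1, c1), (t2, c2) in zip(points, points[1:]))

times = [0, 0.5, 1, 2, 4, 8, 24]                  # hours post-injection
concs = [0.0, 45.0, 60.0, 52.0, 30.0, 12.0, 1.5]  # hypothetical ug Au per mL plasma
print("AUC ~", auc_trapezoid(times, concs), "ug*h/mL")
```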

  16. Predicting the affinity of Farnesoid X Receptor ligands through a hierarchical ranking protocol: a D3R Grand Challenge 2 case study

    NASA Astrophysics Data System (ADS)

    Réau, Manon; Langenfeld, Florent; Zagury, Jean-François; Montes, Matthieu

    2018-01-01

    The Drug Design Data Resource (D3R) Grand Challenges are blind contests organized to assess the state-of-the-art methods accuracy in predicting binding modes and relative binding free energies of experimentally validated ligands for a given target. The second stage of the D3R Grand Challenge 2 (GC2) was focused on ranking 102 compounds according to their predicted affinity for Farnesoid X Receptor. In this task, our workflow was ranked 5th out of the 77 submissions in the structure-based category. Our strategy consisted in (1) a combination of molecular docking using AutoDock 4.2 and manual edition of available structures for binding poses generation using SeeSAR, (2) the use of HYDE scoring for pose selection, and (3) a hierarchical ranking using HYDE and MM/GBSA. In this report, we detail our pose generation and ligands ranking protocols and provide guidelines to be used in a prospective computer aided drug design program.
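
    As a generic illustration of hierarchical (consensus) ranking, and not the authors' exact HYDE/MM-GBSA procedure, the sketch below orders compounds by the sum of their ranks under two independent scoring functions with invented scores.

```python
# Generic sketch of a consensus ranking step: order compounds by the sum of their ranks
# under two independent scoring functions. This illustrates the idea only; it is not the
# authors' exact HYDE / MM/GBSA procedure, and the scores below are invented.

def rank(scores, lower_is_better=True):
    order = sorted(scores, key=scores.get, reverse=not lower_is_better)
    return {compound: i + 1 for i, compound in enumerate(order)}

score_1 = {"cpd_A": -32.1, "cpd_B": -28.4, "cpd_C": -35.0}   # lower = better
score_2 = {"cpd_A": -47.2, "cpd_B": -41.0, "cpd_C": -39.5}   # lower = better

r1, r2 = rank(score_1), rank(score_2)
consensus = sorted(score_1, key=lambda c: r1[c] + r2[c])
print(consensus)   # ['cpd_A', 'cpd_C', 'cpd_B'] for these invented scores
```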

  17. Application of SNMP on CATV

    NASA Astrophysics Data System (ADS)

    Huang, Hong-bin; Liu, Wei-ping; Chen, Shun-er; Zheng, Liming

    2005-02-01

    This paper proposes a new type of CATV network management system built on a general-purpose MCU that supports SNMP. The function and operation of each module in the system, including physical-layer communication, protocol processing, and data processing, are analyzed from both the hardware and the software perspectives. In this design, the management system uses an IP metropolitan area network as the data transmission channel, and every managed object in the management structure has an SNMP agent. The SNMP agent developed here contains four functional modules: a physical-layer communication module, a protocol-processing module, an internal data-processing module, and an MIB management module. The structure and function of each module are designed and demonstrated, and the related hardware circuits, software flow, and experimental results are presented. Furthermore, by introducing an RTOS into the firmware, the MCU can handle multi-threaded tasks such as driving the fast Ethernet controller, TCP/IP processing, and serial-port monitoring, which greatly improves CPU efficiency.

  18. Translating evidence-based protocol of wound drain management for total joint arthroplasty into practice: A quasi-experimental study.

    PubMed

    Tsang, Lap Fung; Cheng, Hang Cheong; Ho, Hon Shuen; Hsu, Yung Chak; Chow, Chiu Man; Law, Heung Wah; Fong, Lup Chau; Leung, Lok Ming; Kong, Ivy Ching Yan; Chan, Chi Wai; Sham, Alice So Yuen

    2016-05-01

    Although various drains have long been used in total joint replacement, evidence suggests that inconsistent practice exists in the use of drainage systems, including intermittently applying suction or leaving the drain free of suction, and variations in the optimal timing for wound drain removal. A comprehensive systematic review of the available evidence up to 2013 was conducted in a previous study, and a protocol was adapted for clinical application according to the summary of the retrieved information (Tsang, 2015). The aims were to determine whether the protocol could reduce post-operative blood loss and blood transfusion and to develop a record form to enhance communication about drainage between surgeons and nurses. A quasi-experimental time-series design was undertaken. In the conventional group, surgeons ordered free drainage if the drain output was more than 300 ml, and the timing of drain removal was based on their professional judgement. In the protocol group, both the method of drainage and the timing of drain removal were dependent on the drainage output. A standardized record form was developed to guide operating room and orthopaedic ward nurses in managing the drainage system. The drain was removed significantly earlier in the protocol group. The rate of blood loss in the first post-operative hour was extremely low in the protocol group because of the clamping effect. Blood loss in volume during the first three hours in the protocol group was significantly lower than that in the conventional group. Clamping was necessary in only 11.1% and 4% of cases at the three- and four-hour post-operative time points, respectively; no clamping was required at the two- and eight-hour time points. There was no significant difference between the two groups in blood loss at drain removal or in the blood transfusions required after drain removal. This is the first clinical study to develop an evidence-based protocol to manage wound drains effectively in Hong Kong. Total blood loss and blood transfusions were not significantly different between the conventional and protocol groups. A standard documentation form is beneficial for enhancing communication between doctors and nurses as well as for monitoring and observing drainage effectively. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. "May I Buy a Pack of Marlboros, Please?" A Systematic Review of Evidence to Improve the Validity and Impact of Youth Undercover Buy Inspections.

    PubMed

    Lee, Joseph G L; Gregory, Kyle R; Baker, Hannah M; Ranney, Leah M; Goldstein, Adam O

    2016-01-01

    Most smokers become addicted to tobacco products before they are legally able to purchase these products. We systematically reviewed the literature on protocols to assess underage purchase and their ecological validity. We conducted a systematic search in May 2015 in PubMed and PsycINFO. We independently screened records for inclusion. We conducted a narrative review and examined implications of two types of legal authority for protocols that govern underage buy enforcement in the United States: criminal (state-level laws prohibiting sales to youth) and administrative (federal regulations prohibiting sales to youth). Ten studies experimentally assessed underage buy protocols and 44 studies assessed the association between youth characteristics and tobacco sales. Protocols that mimicked real-world youth behaviors were consistently associated with substantially greater likelihood of a sale to a youth. Many of the tested protocols appear to be designed for compliance with criminal law rather than administrative enforcement in ways that limited ecological validity. This may be due to concerns about entrapment. For administrative enforcement in particular, entrapment may be less of an issue than commonly thought. Commonly used underage buy protocols poorly represent the reality of youths' access to tobacco from retailers. Compliance check programs should allow youth to present themselves naturally and attempt to match the community's demographic makeup.

  20. "May I Buy a Pack of Marlboros, Please?" A Systematic Review of Evidence to Improve the Validity and Impact of Youth Undercover Buy Inspections

    PubMed Central

    Lee, Joseph G. L.; Gregory, Kyle R.; Baker, Hannah M.; Ranney, Leah M.; Goldstein, Adam O.

    2016-01-01

    Most smokers become addicted to tobacco products before they are legally able to purchase these products. We systematically reviewed the literature on protocols to assess underage purchase and their ecological validity. We conducted a systematic search in May 2015 in PubMed and PsycINFO. We independently screened records for inclusion. We conducted a narrative review and examined implications of two types of legal authority for protocols that govern underage buy enforcement in the United States: criminal (state-level laws prohibiting sales to youth) and administrative (federal regulations prohibiting sales to youth). Ten studies experimentally assessed underage buy protocols and 44 studies assessed the association between youth characteristics and tobacco sales. Protocols that mimicked real-world youth behaviors were consistently associated with substantially greater likelihood of a sale to a youth. Many of the tested protocols appear to be designed for compliance with criminal law rather than administrative enforcement in ways that limited ecological validity. This may be due to concerns about entrapment. For administrative enforcement in particular, entrapment may be less of an issue than commonly thought. Commonly used underage buy protocols poorly represent the reality of youths' access to tobacco from retailers. Compliance check programs should allow youth to present themselves naturally and attempt to match the community’s demographic makeup. PMID:27050671

  1. Shortcuts to Adiabaticity in Transport of a Single Trapped Ion

    NASA Astrophysics Data System (ADS)

    An, Shuoming; Lv, Dingshun; Campo, Adolfo Del; Kim, Kihwan

    2015-05-01

    We report an experimental study on shortcuts to adiabaticity in the transport of a single 171Yb+ ion trapped in a harmonic potential. In these driving schemes, the application of a force induces a nonadiabatic dynamics in which excitations are tailored so as to preserve the ion motional state in the ground state upon completion of the process. We experimentally apply the laser induced force and realize three different protocols: (1) a transitionless driving with a counterdiabatic term out of phase with the displacement force, (2) a classical protocol assisted by counterdiabatic fields in phase with the main force, (3) and an engineered transport protocol based on the Fourier transform of the trap acceleration. We experimentally compare and discuss the robustness of these protocols under given experimental limitations such as trap frequency drifts. This work was supported by the National Basic Research Program of China under Grants No. 2011CBA00300 (No. 2011CBA00301), the National Natural Science Foundation of China 11374178, and the University of Massachusetts Boston (No. P20150000029279).

  2. Experimentally superposing two pure states with partial prior knowledge

    NASA Astrophysics Data System (ADS)

    Li, Keren; Long, Guofei; Katiyar, Hemant; Xin, Tao; Feng, Guanru; Lu, Dawei; Laflamme, Raymond

    2017-02-01

    Superposition, arguably the most fundamental property of quantum mechanics, lies at the heart of quantum information science. However, how to create the superposition of any two unknown pure states remains a daunting challenge. Recently, it was proved that such a quantum protocol does not exist if the two input states are completely unknown, whereas a probabilistic protocol is still available with some prior knowledge about the input states [M. Oszmaniec et al., Phys. Rev. Lett. 116, 110403 (2016), 10.1103/PhysRevLett.116.110403]. The knowledge is that both of the two input states have nonzero overlaps with some given referential state. In this work, we experimentally realize the probabilistic protocol for superposing two pure states in a three-qubit nuclear magnetic resonance system. We demonstrate the feasibility of the protocol by preparing a family of input states, and the average fidelity between the prepared state and the expected superposition state is over 99%. Moreover, we experimentally illustrate a limitation of the protocol: it is likely to fail or to yield very low fidelity if the nonzero overlaps approach zero. Our experimental implementation can be extended to more complex situations and other quantum systems.

  3. A Survey on Underwater Acoustic Sensor Network Routing Protocols.

    PubMed

    Li, Ning; Martínez, José-Fernán; Meneses Chaus, Juan Manuel; Eckert, Martina

    2016-03-22

    Underwater acoustic sensor networks (UASNs) have become more and more important in ocean exploration applications, such as ocean monitoring, pollution detection, ocean resource management, underwater device maintenance, etc. In underwater acoustic sensor networks, since the routing protocol guarantees reliable and effective data transmission from the source node to the destination node, routing protocol design is an attractive topic for researchers. Many routing algorithms have been proposed in recent years. To present the current state of development of UASN routing protocols, we review herein the UASN routing protocol designs reported in recent years. In this paper, all the routing protocols have been classified into different groups according to their characteristics and routing algorithms, such as the non-cross-layer design routing protocol, the traditional cross-layer design routing protocol, and the intelligent algorithm based routing protocol. This is also the first paper that introduces intelligent algorithm-based UASN routing protocols. In addition, in this paper, we investigate the development trends of UASN routing protocols, which can provide researchers with clear and direct insights for further research.

  4. A Survey on Underwater Acoustic Sensor Network Routing Protocols

    PubMed Central

    Li, Ning; Martínez, José-Fernán; Meneses Chaus, Juan Manuel; Eckert, Martina

    2016-01-01

    Underwater acoustic sensor networks (UASNs) have become more and more important in ocean exploration applications, such as ocean monitoring, pollution detection, ocean resource management, underwater device maintenance, etc. In underwater acoustic sensor networks, since the routing protocol guarantees reliable and effective data transmission from the source node to the destination node, routing protocol design is an attractive topic for researchers. Many routing algorithms have been proposed in recent years. To present the current state of development of UASN routing protocols, we review herein the UASN routing protocol designs reported in recent years. In this paper, all the routing protocols have been classified into different groups according to their characteristics and routing algorithms, such as the non-cross-layer design routing protocol, the traditional cross-layer design routing protocol, and the intelligent algorithm based routing protocol. This is also the first paper that introduces intelligent algorithm-based UASN routing protocols. In addition, in this paper, we investigate the development trends of UASN routing protocols, which can provide researchers with clear and direct insights for further research. PMID:27011193

  5. Design and Implementation of a MAC Protocol for Timely and Reliable Delivery of Command and Data in Dynamic Wireless Sensor Networks

    PubMed Central

    Oh, Hoon; Van Vinh, Phan

    2013-01-01

    This paper proposes and implements a new TDMA-based MAC protocol for providing timely and reliable delivery of data and command for monitoring and control networks. In this kind of network, sensor nodes are required to sense data from the monitoring environment periodically and then send the data to a sink. The sink determines whether the environment is safe or not by analyzing the acquired data. Sometimes, a command or control message is sent from the sink to a particular node or a group of nodes to execute the services or request further interested data. The proposed MAC protocol enables bidirectional communication, controls active and sleep modes of a sensor node to conserve energy, and addresses the problem of load unbalancing between the nodes near a sink and the other nodes. It can improve reliability of communication significantly while extending network lifetime. These claims are supported by the experimental results. PMID:24084116

  6. Design and implementation of a MAC protocol for timely and reliable delivery of command and data in dynamic wireless sensor networks.

    PubMed

    Oh, Hoon; Van Vinh, Phan

    2013-09-30

    This paper proposes and implements a new TDMA-based MAC protocol for providing timely and reliable delivery of data and command for monitoring and control networks. In this kind of network, sensor nodes are required to sense data from the monitoring environment periodically and then send the data to a sink. The sink determines whether the environment is safe or not by analyzing the acquired data. Sometimes, a command or control message is sent from the sink to a particular node or a group of nodes to execute the services or request further interested data. The proposed MAC protocol enables bidirectional communication, controls active and sleep modes of a sensor node to conserve energy, and addresses the problem of load unbalancing between the nodes near a sink and the other nodes. It can improve reliability of communication significantly while extending network lifetime. These claims are supported by the experimental results.
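
    As a minimal illustration of the TDMA idea underlying such a MAC (not the paper's exact protocol), the sketch below assigns each node one uplink slot per superframe, reserves slot 0 for the sink's downlink commands, and derives a per-node wake/sleep plan; slot counts and durations are invented.

```python
# Minimal TDMA superframe sketch: each sensor node gets one uplink slot per frame and can
# sleep outside it; slot 0 is reserved for the sink's downlink commands. Slot counts and
# durations are invented, and this is not the paper's exact protocol.

SLOT_MS = 20
NODES = ["node1", "node2", "node3", "node4"]

def build_schedule(nodes):
    schedule = {0: "sink-downlink"}                       # command/beacon slot
    schedule.update({i + 1: n for i, n in enumerate(nodes)})
    return schedule

def node_plan(node, schedule):
    """Wake/sleep plan for one node (a real node would also wake for slot 0 commands)."""
    my_slot = next(s for s, owner in schedule.items() if owner == node)
    frame_ms = len(schedule) * SLOT_MS
    return {"wake_at_ms": my_slot * SLOT_MS,
            "tx_for_ms": SLOT_MS,
            "sleep_for_ms": frame_ms - SLOT_MS}

schedule = build_schedule(NODES)
print(node_plan("node3", schedule))   # wakes at 60 ms, transmits 20 ms, sleeps 80 ms
```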

  7. Computational Biology Methods for Characterization of Pluripotent Cells.

    PubMed

    Araúzo-Bravo, Marcos J

    2016-01-01

    Pluripotent cells are a powerful tool for regenerative medicine and drug discovery. Several techniques have been developed to induce pluripotency, or to extract pluripotent cells from different tissues and biological fluids. However, the characterization of pluripotency requires tedious, expensive, time-consuming, and not always reliable wet-lab experiments; thus, an easy, standard quality-control protocol for pluripotency assessment remains to be established. Here to help comes the use of high-throughput techniques, and in particular the employment of gene expression microarrays, which has become a complementary technique for cellular characterization. Research has shown that transcriptomics comparison with a reference Embryonic Stem Cell (ESC) is a good approach to assess pluripotency. Under the premise that the best protocol is computer software source code, here I propose and explain line by line a software protocol coded in R-Bioconductor for pluripotency assessment based on the comparison of transcriptomics data of pluripotent cells with a reference ESC. I provide advice for experimental design, warn about possible pitfalls, and give guidance for results interpretation.
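
    The published protocol is written in R/Bioconductor and is not reproduced here; as a language-neutral illustration of its core idea only, the Python sketch below scores a sample by correlating its expression profile with an ESC reference, using invented marker-gene values.

```python
# The published protocol is written in R/Bioconductor; this Python fragment only
# illustrates its core idea: score pluripotency by how strongly a sample's expression
# profile correlates with an embryonic stem cell (ESC) reference. Values are invented.
from statistics import correlation   # Pearson correlation, Python 3.10+

esc_reference = {"POU5F1": 12.1, "NANOG": 11.4, "SOX2": 10.8, "GATA6": 3.2, "T": 2.9}
sample        = {"POU5F1": 11.7, "NANOG": 10.9, "SOX2": 10.1, "GATA6": 4.0, "T": 3.5}

genes = sorted(esc_reference)
r = correlation([esc_reference[g] for g in genes], [sample[g] for g in genes])
print(f"Pearson r vs. ESC reference: {r:.3f}")   # closer to 1 suggests an ESC-like profile
```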

  8. 40 CFR Appendix A - Protocol for Using an Electrochemical Analyzer to Determine Oxygen and Carbon Monoxide...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., and Process Heaters Using Portable Analyzers”, EMC Conditional Test Protocol 30 (CTM-30), Gas Research... cell design(s) conforming to this protocol will determine the analytical range for each gas component..., selective gas scrubbers, etc.) to meet the design specifications of this protocol. Do not make changes to...

  9. 40 CFR Appendix A - Protocol for Using an Electrochemical Analyzer to Determine Oxygen and Carbon Monoxide...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., and Process Heaters Using Portable Analyzers”, EMC Conditional Test Protocol 30 (CTM-30), Gas Research... cell design(s) conforming to this protocol will determine the analytical range for each gas component..., selective gas scrubbers, etc.) to meet the design specifications of this protocol. Do not make changes to...

  10. Experimental animal studies of radon and cigarette smoke

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cross, F.T.; Dagle, G.E.; Gies, R.A.

    Cigarette-smoking is a dominant cause of lung cancer and confounds risk assessment of exposure to radon decay products. Evidence in humans on the interaction between cigarette-smoking and exposure to radon decay products, although limited, indicates a possible synergy. Experimental animal data, in addition to showing synergy, also show a decrease or no change in risk with added cigarette-smoke exposures. This article reviews previous animal data developed at Compagnie Generale des Matieres Nucleaires and Pacific Northwest Laboratory (PNL) on mixed exposures to radon and cigarette smoke, and highlights new initiation-promotion-initiation (IPI) studies at PNL that were designed within the framework of a two-mutation carcinogenesis model. Also presented are the PNL exposure system, experimental protocols, dosimetry, and biological data observed to date in IPI animals.

  11. A Skyline Plugin for Pathway-Centric Data Browsing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Degan, Michael G.; Ryadinskiy, Lillian; Fujimoto, Grant M.

    For targeted proteomics to be broadly adopted in biological laboratories as a routine experimental protocol, wet-bench biologists must be able to approach SRM assay design in the same way they approach biological experimental design. Most often, biological hypotheses are envisioned in a set of protein interactions, networks and pathways. We present a plugin for the popular Skyline tool that presents public mass spectrometry data in a pathway-centric view to assist users in browsing available data and determining how to design quantitative experiments. Selected proteins and their underlying mass spectra are imported to Skyline for further assay design (transition selection). The same plugin can be used for hypothesis-driven DIA data analysis, again utilizing the pathway view to help narrow down the set of proteins which will be investigated. The plugin is backed by the PNNL Biodiversity Library, a corpus of 3 million peptides from >100 organisms, and the draft human proteome. Users can upload personal data to the plugin to use the pathway navigation prior to importing their own data into Skyline.

  12. A Skyline Plugin for Pathway-Centric Data Browsing

    NASA Astrophysics Data System (ADS)

    Degan, Michael G.; Ryadinskiy, Lillian; Fujimoto, Grant M.; Wilkins, Christopher S.; Lichti, Cheryl F.; Payne, Samuel H.

    2016-11-01

    For targeted proteomics to be broadly adopted in biological laboratories as a routine experimental protocol, wet-bench biologists must be able to approach selected reaction monitoring (SRM) and parallel reaction monitoring (PRM) assay design in the same way they approach biological experimental design. Most often, biological hypotheses are envisioned in a set of protein interactions, networks, and pathways. We present a plugin for the popular Skyline tool that presents public mass spectrometry data in a pathway-centric view to assist users in browsing available data and determining how to design quantitative experiments. Selected proteins and their underlying mass spectra are imported to Skyline for further assay design (transition selection). The same plugin can be used for hypothesis-driven data-independent acquisition (DIA) data analysis, again utilizing the pathway view to help narrow down the set of proteins that will be investigated. The plugin is backed by the Pacific Northwest National Laboratory (PNNL) Biodiversity Library, a corpus of 3 million peptides from >100 organisms, and the draft human proteome. Users can upload personal data to the plugin to use the pathway navigation prior to importing their own data into Skyline.

  13. P-value interpretation and alpha allocation in clinical trials.

    PubMed

    Moyé, L A

    1998-08-01

    Although much value has been placed on type I error event probabilities in clinical trials, interpretive difficulties often arise that are directly related to clinical trial complexity. Deviations of the trial execution from its protocol, the presence of multiple treatment arms, and the inclusion of multiple end points complicate the interpretation of an experiment's reported alpha level. The purpose of this manuscript is to formulate the discussion of P values (and power for studies showing no significant differences) on the basis of the event whose relative frequency they represent. Experimental discordance (discrepancies between the protocol's directives and the experiment's execution) is linked to difficulty in alpha and beta interpretation. Mild experimental discordance leads to an acceptable adjustment for alpha or beta, while severe discordance results in their corruption. Finally, guidelines are provided for allocating type I error among a collection of end points in a prospectively designed, randomized controlled clinical trial. When considering secondary end point inclusion in clinical trials, investigators should increase the sample size to preserve the type I error rates at acceptable levels.
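
    As one simple, common way to implement the prospective allocation of type I error across endpoints described above (not necessarily the author's preferred scheme), the sketch below uses a Bonferroni-style split in which the endpoint-wise alphas sum to the overall alpha; the particular split is illustrative.

```python
# One common way to prospectively allocate type I error across endpoints is a simple
# Bonferroni-style split in which the endpoint-wise alphas sum to the overall alpha.
# The particular split below (most alpha on the primary endpoint) is illustrative only.

ALPHA_TOTAL = 0.05
allocation = {"primary endpoint": 0.04,
              "secondary endpoint 1": 0.005,
              "secondary endpoint 2": 0.005}

assert abs(sum(allocation.values()) - ALPHA_TOTAL) < 1e-9

def significant(endpoint, p_value):
    """Declare significance only against the alpha pre-allocated to that endpoint."""
    return p_value < allocation[endpoint]

print(significant("secondary endpoint 1", 0.01))   # False: 0.01 is not below 0.005
```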

  14. Understanding Gulf War Illness: An Integrative Modeling Approach

    DTIC Science & Technology

    2017-10-01

    group (Task 2; Subtask 1). The latest iteration of this analysis focused on n=11 control and n=11 DFP-exposed animals without corticosterone... Groups: 1 – Control Group (Male and Female Intact); 2 – GWI Model – Cort+DFP (Male and Female OVX); 3 – Control OVX (Female); 4 – GWI Model + Enbrel... The designed protocol comprises five basic experimental groups: Group 1 (Untreated Control, no toxic exposure); Group 2 (Toxic exposed, no

  15. Modeling of the Space Station Freedom data management system

    NASA Technical Reports Server (NTRS)

    Johnson, Marjory J.

    1990-01-01

    The Data Management System (DMS) is the information and communications system onboard Space Station Freedom (SSF). Extensive modeling of the DMS is being conducted throughout NASA to aid in the design and development of this vital system. Activities at NASA Ames Research Center to model the DMS network infrastructure are discussed, with a focus on the modeling of the Fiber Distributed Data Interface (FDDI) token-ring protocol and experimental testbedding of networking aspects of the DMS.

  16. Impact of study design on development and evaluation of an activity-type classifier.

    PubMed

    van Hees, Vincent T; Golubic, Rajna; Ekelund, Ulf; Brage, Søren

    2013-04-01

    Methods to classify activity types are often evaluated with an experimental protocol involving prescribed physical activities under confined (laboratory) conditions, which may not reflect real-life conditions. The present study aims to evaluate how study design may affect classifier performance in real life. Twenty-eight healthy participants (21-53 yr) were asked to wear nine triaxial accelerometers while performing 58 activity types selected to simulate activities in real life. For each sensor location, logistic classifiers were trained in subsets of up to 8 activities to distinguish between walking and nonwalking activities and were then evaluated in all 58 activities. Different weighting factors were used to convert the resulting confusion matrices into an estimate of the confusion matrix as it would apply in the real-life setting, by creating four different real-life scenarios as well as one traditional laboratory scenario. The sensitivity of a classifier estimated with a traditional laboratory protocol is within the range of estimates derived from real-life scenarios for any body location. The specificity, however, was systematically overestimated by the traditional laboratory scenario. Walking time was systematically overestimated, except for lower back sensor data (range: 7-757%). In conclusion, classifier performance under confined conditions may not accurately reflect classifier performance in real life. Future studies that aim to evaluate activity classification methods should pay special attention to how representative the experimental conditions are of real-life conditions.
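
    The key analytic step, reweighting a laboratory confusion matrix by a real-life activity-time budget, can be sketched as follows; the counts and weights are hypothetical and are not the study's data.

      import numpy as np

      # Rows = true class (walking, non-walking), columns = predicted class.
      # Counts from a hypothetical laboratory protocol.
      lab_confusion = np.array([[90.0, 10.0],     # true walking
                                [15.0, 185.0]])   # true non-walking

      # Row-wise rates (the sensitivity/specificity structure is preserved).
      rates = lab_confusion / lab_confusion.sum(axis=1, keepdims=True)

      # Hypothetical real-life time budget: 10% walking, 90% non-walking.
      real_life_weights = np.array([0.10, 0.90])

      # Estimated real-life confusion matrix (per unit of monitored time).
      real_confusion = rates * real_life_weights[:, None]

      sensitivity, specificity = rates[0, 0], rates[1, 1]
      est_walking = real_confusion[:, 0].sum()        # time predicted as walking
      true_walking = real_life_weights[0]
      print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f}")
      print(f"estimated walking fraction {est_walking:.3f} vs true {true_walking:.3f} "
            f"({100 * (est_walking / true_walking - 1):+.0f}% bias)")

    Because specificity is below one and walking occupies only a small share of real life, even a modest false-positive rate inflates the estimated walking time, which is exactly the bias that the traditional laboratory scenario conceals.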

  17. In silico simulations of experimental protocols for cardiac modeling.

    PubMed

    Carro, Jesus; Rodriguez, Jose Felix; Pueyo, Esther

    2014-01-01

    A mathematical model of the cardiac action potential (AP) involves the sum of different transmembrane ionic currents and the balance of intracellular ionic concentrations. Each ionic current is described by an equation involving several effects. There are a number of model parameters that must be identified using specific experimental protocols in which the effects are considered as independent. However, when the model complexity grows, the interaction between effects becomes increasingly important. Therefore, model parameters identified by considering the different effects as independent might be misleading. In this work, a novel methodology consisting of performing in silico simulations of the experimental protocol and then comparing experimental and simulated outcomes is proposed for model parameter identification and validation. The potential of the methodology is demonstrated by validating voltage-dependent L-type calcium current (ICaL) inactivation in recently proposed human ventricular AP models with different formulations. Our results show large differences between ICaL inactivation as calculated from the model equation and ICaL inactivation from the in silico simulations, due to the interaction between effects and/or to the experimental protocol. Our results suggest that, when proposing any new model formulation, consistency between that formulation and the experimental data it aims to reproduce needs to be verified first, considering all involved factors.
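
    The central idea, that inactivation "measured" by simulating the experimental protocol can differ from the inactivation curve written into the model equations, can be illustrated with a toy gating variable. The gate kinetics, parameters, and two-pulse protocol below are hypothetical and are not taken from any of the human ventricular AP models discussed.

      import numpy as np

      # Toy steady-state inactivation curve and time constant for a hypothetical gate.
      h_inf = lambda v: 1.0 / (1.0 + np.exp((v + 30.0) / 6.0))
      tau_h = lambda v: 20.0 + 180.0 * np.exp(-((v + 40.0) / 25.0) ** 2)   # ms

      def protocol_availability(v_cond, t_cond=300.0, v_hold=-80.0, dt=0.1):
          """Simulate a two-pulse voltage-clamp protocol: hold, conditioning pulse of
          finite duration, then read the gate as the 'available' fraction."""
          h = h_inf(v_hold)                    # fully equilibrated at the holding potential
          for _ in range(int(t_cond / dt)):    # forward-Euler relaxation during the pulse
              h += dt * (h_inf(v_cond) - h) / tau_h(v_cond)
          return h

      for v in np.arange(-80, 11, 10.0):
          sim = protocol_availability(v)       # what the simulated protocol reports
          eq = h_inf(v)                        # what the model equation states
          print(f"V={v:6.1f} mV  protocol={sim:.3f}  equation={eq:.3f}  diff={sim - eq:+.3f}")

    Because the conditioning pulse is short relative to the slow time constant near -40 mV, the protocol-derived availability overestimates the equation's steady-state value, mirroring the kind of protocol-dependent discrepancy reported for ICaL.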

  18. Design of a stateless low-latency router architecture for green software-defined networking

    NASA Astrophysics Data System (ADS)

    Saldaña Cercós, Silvia; Ramos, Ramon M.; Ewald Eller, Ana C.; Martinello, Magnos; Ribeiro, Moisés R. N.; Manolova Fagertun, Anna; Tafur Monroy, Idelfonso

    2015-01-01

    Expanding software-defined networking (SDN) to transport networks requires new strategies to deal with the large number of flows that future core networks will have to face. New south-bound protocols within SDN have been proposed to benefit from having the control plane detached from the data plane, offering a cost- and energy-efficient forwarding engine. This paper presents an overview of a new approach named KeyFlow to simultaneously reduce latency, jitter, and power consumption in core network nodes. Results on an emulation platform indicate that round-trip time (RTT) can be reduced by more than 50% compared to the reference protocol OpenFlow, especially when flow tables are densely populated. Jitter reduction has been demonstrated experimentally on a NetFPGA-based platform, and a 57.3% power consumption reduction has been achieved.

  19. Assessing the hodgepodge of non-mapped reads in bacterial transcriptomes: real or artifactual RNA chimeras?

    PubMed

    Lloréns-Rico, Verónica; Serrano, Luis; Lluch-Senar, Maria

    2014-07-29

    RNA sequencing methods have already altered our view of the extent and complexity of bacterial and eukaryotic transcriptomes, revealing rare transcript isoforms (circular RNAs, RNA chimeras) that could play an important role in their biology. We performed an analysis of chimera formation by four different computational approaches, including a custom-designed pipeline, to study the transcriptomes of M. pneumoniae and P. aeruginosa, as well as mixtures of both. We found that rare transcript isoforms detected by conventional analysis pipelines could be artifacts of the experimental procedure used in library preparation, and that they are protocol-dependent. Using a customized pipeline, we show that the choice of library preparation protocol and of the pipeline used to analyze the results is crucial for identifying real chimeric RNAs.

  20. Experimental Quantum Coin Tossing

    NASA Astrophysics Data System (ADS)

    Molina-Terriza, G.; Vaziri, A.; Ursin, R.; Zeilinger, A.

    2005-01-01

    In this Letter we present the first implementation of a quantum coin-tossing protocol. This protocol belongs to a class of “two-party” cryptographic problems, where the communication partners distrust each other. As with a number of such two-party protocols, the best implementation of quantum coin tossing requires qutrits, resulting in higher security than using qubits. In this way, we have also performed the first complete quantum communication protocol with qutrits. In our experiment the two partners succeeded in remotely tossing a row of coins using photons entangled in the orbital angular momentum. We also show the experimental bounds on a possible cheater and the ways of detecting cheating.

  1. Quantum fingerprinting with coherent states and a constant mean number of photons

    NASA Astrophysics Data System (ADS)

    Arrazola, Juan Miguel; Lütkenhaus, Norbert

    2014-06-01

    We present a protocol for quantum fingerprinting that is ready to be implemented with current technology and is robust to experimental errors. The basis of our scheme is an implementation of the signal states in terms of a coherent state in a superposition of time-bin modes. Experimentally, this requires only the ability to prepare coherent states of low amplitude and to interfere them in a balanced beam splitter. The states used in the protocol are arbitrarily close in trace distance to states of O(log2 n) qubits, thus exhibiting an exponential separation in abstract communication complexity compared to the classical case. The protocol uses a number of optical modes that is proportional to the size n of the input bit strings but a total mean photon number that is constant and independent of n. Given the expended resources, our protocol achieves a task that is provably impossible using classical communication only. In fact, even in the presence of realistic experimental errors and loss, we show that there exists a large range of input sizes for which our quantum protocol transmits an amount of information that can be more than two orders of magnitude smaller than that of a classical fingerprinting protocol.

  2. Quality by design: optimization of a freeze-drying cycle via design space in case of heterogeneous drying behavior and influence of the freezing protocol.

    PubMed

    Pisano, Roberto; Fissore, Davide; Barresi, Antonello A; Brayard, Philippe; Chouvenc, Pierre; Woinet, Bertrand

    2013-02-01

    This paper shows how to optimize the primary drying phase, for both product quality and drying time, of a parenteral formulation via design space. A non-steady-state model, parameterized with experimentally determined heat and mass transfer coefficients, is used to define the design space when the heat transfer coefficient varies with the position of the vial in the array. The calculations recognize both equipment and product constraints, and also take into account model parameter uncertainty. Examples are given of cycles designed for the same formulation, but varying the freezing conditions and the freeze-dryer scale. These are then compared in terms of drying time. Furthermore, the impact of inter-vial variability on the design space, and therefore on the optimized cycle, is addressed. In this regard, a simplified method is presented for cycle design, which reduces the experimental effort required for system qualification. The use of mathematical modeling is demonstrated to be very effective not only for cycle development, but also for solving problems of process transfer. This study showed that inter-vial variability remains significant when vials are loaded on plastic trays, and how inter-vial variability can be taken into account during process design.
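
    The kind of calculation that underlies a primary-drying design space can be sketched with a steady-state heat and mass balance at the sublimation front. This is an editorial simplification of the authors' non-steady-state model: the vial heat-transfer coefficient Kv, the dried-layer resistance Rp, and the operating conditions are assumed round numbers, and the ice vapor pressure uses a Clausius-Clapeyron approximation.

      import numpy as np
      from scipy.optimize import brentq

      DH_SUB_KG = 2.84e6        # J/kg, sublimation enthalpy of ice (approx.)
      DH_SUB_MOL = 51059.0      # J/mol, used in the vapor-pressure approximation
      R_GAS = 8.314
      KV = 20.0                 # W m^-2 K^-1, vial heat-transfer coefficient (assumed)
      RP = 1.0e5                # Pa m^2 s kg^-1, dried-layer resistance (assumed)

      def p_ice(T):
          """Approximate saturation vapor pressure over ice, Pa (Clausius-Clapeyron
          anchored at the triple point)."""
          return 611.657 * np.exp(DH_SUB_MOL / R_GAS * (1.0 / 273.16 - 1.0 / T))

      def product_temperature(T_shelf, P_chamber):
          """Steady-state front temperature: heat delivered through the vial,
          KV*(Ts - Tp), balances heat consumed by sublimation, dHs*flux."""
          balance = lambda Tp: KV * (T_shelf - Tp) - DH_SUB_KG * (p_ice(Tp) - P_chamber) / RP
          return brentq(balance, 210.0, T_shelf - 1e-3)

      Ts, Pc = 268.15, 10.0     # shelf at -5 degC, chamber at 10 Pa (hypothetical)
      Tp = product_temperature(Ts, Pc)
      flux = (p_ice(Tp) - Pc) / RP
      print(f"product temperature ~ {Tp - 273.15:.1f} degC, "
            f"sublimation flux ~ {flux * 3600:.2f} kg m^-2 h^-1")

    Sweeping shelf temperature and chamber pressure through such a balance, and rejecting combinations where the product temperature exceeds its critical value, is what traces out a design space; the paper's contribution is to do this with a dynamic model while accounting for the vial-position dependence and uncertainty of Kv.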

  3. The Challenge of Reproducibility and Accuracy in Nutrition Research: Resources and Pitfalls

    PubMed Central

    Kuszak, Adam J; Williamson, John S; Hopp, D Craig; Betz, Joseph M

    2016-01-01

    Inconsistent and contradictory results from nutrition studies conducted by different investigators continue to emerge, in part because of the inherent variability of natural products, as well as the unknown and therefore uncontrolled variables in study populations and experimental designs. Given these challenges inherent in nutrition research, it is critical for the progress of the field that researchers strive to minimize variability within studies and enhance comparability between studies by optimizing the characterization, control, and reporting of products, reagents, and model systems used, as well as the rigor and reporting of experimental designs, protocols, and data analysis. Here we describe some recent developments relevant to research on plant-derived products used in nutrition research, highlight some resources for optimizing the characterization and reporting of research using these products, and describe some of the pitfalls that may be avoided by adherence to these recommendations. PMID:26980822

  4. Complexities in Ferret Influenza Virus Pathogenesis and Transmission Models

    PubMed Central

    Eckert, Alissa M.; Tumpey, Terrence M.; Maines, Taronna R.

    2016-01-01

    Ferrets are widely employed to study the pathogenicity, transmissibility, and tropism of influenza viruses. However, inherent variations in inoculation methods, sampling schemes, and experimental designs are often overlooked when contextualizing or aggregating data between laboratories, leading to potential confusion or misinterpretation of results. Here, we provide a comprehensive overview of parameters to consider when planning an experiment using ferrets, collecting data from the experiment, and placing results in context with previously performed studies. This review offers information that is of particular importance for researchers in the field who rely on ferret data but do not perform the experiments themselves. Furthermore, this review highlights the breadth of experimental designs and techniques currently available to study influenza viruses in this model, underscoring the wide heterogeneity of protocols currently used for ferret studies while demonstrating the wealth of information which can benefit risk assessments of emerging influenza viruses. PMID:27412880

  5. Complexities in Ferret Influenza Virus Pathogenesis and Transmission Models.

    PubMed

    Belser, Jessica A; Eckert, Alissa M; Tumpey, Terrence M; Maines, Taronna R

    2016-09-01

    Ferrets are widely employed to study the pathogenicity, transmissibility, and tropism of influenza viruses. However, inherent variations in inoculation methods, sampling schemes, and experimental designs are often overlooked when contextualizing or aggregating data between laboratories, leading to potential confusion or misinterpretation of results. Here, we provide a comprehensive overview of parameters to consider when planning an experiment using ferrets, collecting data from the experiment, and placing results in context with previously performed studies. This review offers information that is of particular importance for researchers in the field who rely on ferret data but do not perform the experiments themselves. Furthermore, this review highlights the breadth of experimental designs and techniques currently available to study influenza viruses in this model, underscoring the wide heterogeneity of protocols currently used for ferret studies while demonstrating the wealth of information which can benefit risk assessments of emerging influenza viruses. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  6. In silico design of smart binders to anthrax PA

    NASA Astrophysics Data System (ADS)

    Sellers, Michael; Hurley, Margaret M.

    2012-06-01

    The development of smart peptide binders requires an understanding of the fundamental mechanisms of recognition, which has remained an elusive grail of the research community for decades. Recent advances in automated discovery and synthetic library science provide a wealth of information to probe fundamental details of binding and facilitate the development of improved models for a priori prediction of affinity and specificity. Here we present the modeling portion of an iterative experimental/computational study to produce high-affinity peptide binders to the Protective Antigen (PA) of Bacillus anthracis. The result is a general-purpose, HPC-oriented, Python-based toolkit based upon powerful third-party freeware, which is designed to provide a better understanding of peptide-protein interactions and ultimately to predict and measure new smart peptide binder candidates. We present an improved simulation protocol with flexible peptide docking to the Anthrax Protective Antigen, reported within the context of experimental data presented in a companion work.

  7. Experimental Design and Data Analysis Issues Contribute to Inconsistent Results of C-Bouton Changes in Amyotrophic Lateral Sclerosis.

    PubMed

    Dukkipati, S Shekar; Chihi, Aouatef; Wang, Yiwen; Elbasiouny, Sherif M

    2017-01-01

    The possible presence of pathological changes in cholinergic synaptic inputs [cholinergic boutons (C-boutons)] is a contentious topic within the ALS field. Conflicting data reported on this issue make it difficult to assess the roles of these synaptic inputs in ALS. Our objective was to determine whether the reported changes are truly statistically and biologically significant and why replication is problematic. This is an urgent question, as C-boutons are an important regulator of spinal motoneuron excitability, and pathological changes in motoneuron excitability are present throughout disease progression. Using male mice of the SOD1-G93A high-expresser transgenic (G93A) mouse model of ALS, we examined C-boutons on spinal motoneurons. We performed histological analysis at high statistical power, which showed no difference in C-bouton size in G93A versus wild-type motoneurons throughout disease progression. In an attempt to examine the underlying reasons for our failure to replicate reported changes, we performed further histological analyses using several variations on experimental design and data analysis that were reported in the ALS literature. This analysis showed that factors related to experimental design, such as grouping unit, sampling strategy, and blinding status, potentially contribute to the discrepancy in published data on C-bouton size changes. Next, we systematically analyzed the impact of study design variability and potential bias on reported results from experimental and preclinical studies of ALS. Strikingly, we found that practices such as blinding and power analysis are not systematically reported in the ALS field. Protocols to standardize experimental design and minimize bias are thus critical to advancing the ALS field.
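
    The call for routine power analysis can be made concrete with a standard a priori calculation; the standardized effect size below is hypothetical, not the study's measured C-bouton difference.

      from statsmodels.stats.power import TTestIndPower

      # A priori power analysis for a two-group comparison (hypothetical effect size).
      n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05,
                                                power=0.8, alternative='two-sided')
      print(f"grouping units needed per genotype ~ {n_per_group:.0f}")

    Choosing the grouping unit (animal versus individual motoneuron) changes the effective n entering such a calculation, which is one of the design factors the authors identify as driving the discrepant C-bouton results.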

  8. A Protocol for Using Gene Set Enrichment Analysis to Identify the Appropriate Animal Model for Translational Research.

    PubMed

    Weidner, Christopher; Steinfath, Matthias; Wistorf, Elisa; Oelgeschläger, Michael; Schneider, Marlon R; Schönfelder, Gilbert

    2017-08-16

    Recent studies that compared transcriptomic datasets of human diseases with datasets from mouse models using traditional gene-to-gene comparison techniques resulted in contradictory conclusions regarding the relevance of animal models for translational research. A major reason for the discrepancies between different gene expression analyses is the arbitrary filtering of differentially expressed genes. Furthermore, the comparison of single genes between different species and platforms is often limited by technical variance, leading to misinterpretation of the concordance/discordance between data from human and animal models. Thus, standardized approaches for systematic data analysis are needed. To overcome subjective gene filtering and ineffective gene-to-gene comparisons, we recently demonstrated that gene set enrichment analysis (GSEA) has the potential to avoid these problems. Therefore, we developed a standardized protocol for the use of GSEA to distinguish between appropriate and inappropriate animal models for translational research. This protocol is not suited to predicting a priori how to design new model systems, as it requires existing experimental omics data. However, the protocol describes how to interpret existing data in a standardized manner in order to select the most suitable animal model, thus avoiding unnecessary animal experiments and misleading translational studies.
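
    At the core of GSEA is a weighted running-sum enrichment score computed over a ranked gene list. The sketch below is a simplified illustration of that statistic (weighting exponent p = 1, tiny hypothetical inputs), not the protocol's actual tool chain.

      import numpy as np

      def enrichment_score(ranked_genes, ranked_scores, gene_set):
          """Simplified GSEA running-sum enrichment score (weight p = 1)."""
          in_set = np.isin(ranked_genes, list(gene_set))
          hit_weights = np.abs(ranked_scores) * in_set
          p_hit = np.cumsum(hit_weights) / hit_weights.sum()
          p_miss = np.cumsum(~in_set) / (~in_set).sum()
          running = p_hit - p_miss
          return running[np.argmax(np.abs(running))]

      # Hypothetical ranked list (e.g., genes ranked by a disease-vs-model contrast).
      genes = np.array(["G1", "G2", "G3", "G4", "G5", "G6", "G7", "G8"])
      scores = np.array([2.5, 2.1, 1.4, 0.7, -0.3, -1.1, -1.8, -2.6])
      pathway = {"G1", "G3", "G4"}        # hypothetical pathway gene set
      print(f"ES = {enrichment_score(genes, scores, pathway):+.2f}")

    In the protocol's setting, ranked lists from the human disease and from each candidate model would be scored against the same gene sets and their enrichment profiles compared, so that concordance is judged at the gene-set level rather than gene by gene.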

  9. Rosetta:MSF: a modular framework for multi-state computational protein design.

    PubMed

    Löffler, Patrick; Schmitz, Samuel; Hupfeld, Enrico; Sterner, Reinhard; Merkl, Rainer

    2017-06-01

    Computational protein design (CPD) is a powerful technique to engineer existing proteins or to design novel ones that display desired properties. Rosetta is a software suite including algorithms for computational modeling and analysis of protein structures and offers many elaborate protocols created to solve highly specific tasks of protein engineering. Most of Rosetta's protocols optimize sequences based on a single conformation (i.e., design state). However, challenging CPD objectives like multi-specificity design or the concurrent consideration of positive and negative design goals demand the simultaneous assessment of multiple states. This is why we have developed the multi-state framework MSF that facilitates the implementation of Rosetta's single-state protocols in a multi-state environment and made available two frequently used protocols. Utilizing MSF, we demonstrated for one of these protocols that multi-state design yields a 15% higher performance than single-state design on a ligand-binding benchmark consisting of structural conformations. With this protocol, we designed de novo nine retro-aldolases on a conformational ensemble deduced from a (βα)8-barrel protein. All variants displayed measurable catalytic activity, testifying to a high success rate for this concept of multi-state enzyme design.

  10. Rosetta:MSF: a modular framework for multi-state computational protein design

    PubMed Central

    Hupfeld, Enrico; Sterner, Reinhard

    2017-01-01

    Computational protein design (CPD) is a powerful technique to engineer existing proteins or to design novel ones that display desired properties. Rosetta is a software suite including algorithms for computational modeling and analysis of protein structures and offers many elaborate protocols created to solve highly specific tasks of protein engineering. Most of Rosetta’s protocols optimize sequences based on a single conformation (i.e., design state). However, challenging CPD objectives like multi-specificity design or the concurrent consideration of positive and negative design goals demand the simultaneous assessment of multiple states. This is why we have developed the multi-state framework MSF that facilitates the implementation of Rosetta’s single-state protocols in a multi-state environment and made available two frequently used protocols. Utilizing MSF, we demonstrated for one of these protocols that multi-state design yields a 15% higher performance than single-state design on a ligand-binding benchmark consisting of structural conformations. With this protocol, we designed de novo nine retro-aldolases on a conformational ensemble deduced from a (βα)8-barrel protein. All variants displayed measurable catalytic activity, testifying to a high success rate for this concept of multi-state enzyme design. PMID:28604768

  11. Fixation and Commitment while Designing and Its Measurement

    ERIC Educational Resources Information Center

    Gero, John S.

    2011-01-01

    This paper introduces the notion that fixation and commitment while designing can be measured by studying the protocol of the design session. It is hypothesized that the dynamic entropy of the linkograph of the protocol provides the basis for such a measurement. The hypothesis is empirically tested using a design protocol and the results…

  12. Design and Verification of a Distributed Communication Protocol

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    The safety of remotely operated vehicles depends on the correctness of the distributed protocol that facilitates the communication between the vehicle and the operator. A failure in this communication can result in catastrophic loss of the vehicle. To complicate matters, the communication system may be required to satisfy several, possibly conflicting, requirements. The design of protocols is typically an informal process based on successive iterations of a prototype implementation. Yet distributed protocols are notoriously difficult to get correct using such informal techniques. We present a formal specification of the design of a distributed protocol intended for use in a remotely operated vehicle, which is built from the composition of several simpler protocols. We demonstrate proof strategies that allow us to prove properties of each component protocol individually while ensuring that the property is preserved in the composition forming the entire system. Given that designs are likely to evolve as additional requirements emerge, we show how we have automated most of the repetitive proof steps to enable verification of rapidly changing designs.

  13. An adaptable, low cost test-bed for unmanned vehicle systems research

    NASA Astrophysics Data System (ADS)

    Goppert, James M.

    2011-12-01

    An unmanned vehicle systems test-bed has been developed. The test-bed has been designed to accommodate hardware changes and various vehicle types and algorithms. The creation of this test-bed allows research teams to focus on algorithm development and employ a common well-tested experimental framework. The ArduPilotOne autopilot was developed to provide the necessary level of abstraction for multiple vehicle types. The autopilot was also designed to be highly integrated with the Mavlink protocol for Micro Air Vehicle (MAV) communication. Mavlink is the native protocol for QGroundControl, a MAV ground control program. Features were added to QGroundControl to accommodate outdoor usage. Next, the Mavsim toolbox was developed for Scicoslab to allow hardware-in-the-loop testing, control design and analysis, and estimation algorithm testing and verification. In order to obtain linear models of aircraft dynamics, the JSBSim flight dynamics engine was extended to use a probabilistic Nelder-Mead simplex method. The JSBSim aircraft dynamics were compared with wind-tunnel data collected. Finally, a structured methodology for successive loop closure control design is proposed. This methodology is demonstrated along with the rest of the test-bed tools on a quadrotor, a fixed wing RC plane, and a ground vehicle. Test results for the ground vehicle are presented.

  14. Predicting the behavior of microfluidic circuits made from discrete elements

    PubMed Central

    Bhargava, Krisna C.; Thompson, Bryant; Iqbal, Danish; Malmstadt, Noah

    2015-01-01

    Microfluidic devices can be used to execute a variety of continuous-flow analytical and synthetic chemistry protocols with a great degree of precision. The growing availability of additive manufacturing has enabled the design of microfluidic devices with new functionality and complexity. However, these devices are prone to larger manufacturing variation than is typical of those made with micromachining or soft lithography. In this report, we demonstrate a design-for-manufacturing workflow that addresses performance variation at the microfluidic element and circuit level, in the context of mass manufacturing and additive manufacturing. Our approach relies on discrete microfluidic elements that are characterized by their terminal hydraulic resistance and associated tolerance. Network analysis is employed to construct simple analytical design rules for model microfluidic circuits. Monte Carlo analysis is employed at both the individual element and circuit level to establish expected performance metrics for several specific circuit configurations. A protocol based on osmometry is used to experimentally probe mixing behavior in circuits in order to validate these approaches. The overall workflow is applied to two application circuits with immediate use on the benchtop: series and parallel mixing circuits that are modularly programmable, virtually predictable, highly precise, and operable by hand. PMID:26516059
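
    The element-level workflow, propagating a hydraulic-resistance tolerance through a circuit with network analysis and Monte Carlo sampling, can be sketched for a simple two-inlet junction; the resistances, tolerances, and the 1 kPa drive below are hypothetical.

      import numpy as np

      rng = np.random.default_rng(0)
      N = 10_000                       # Monte Carlo samples
      delta_p = 1000.0                 # Pa, common driving pressure (hypothetical)

      # Two inlet channels with nominal hydraulic resistance and a manufacturing
      # tolerance (std as a fraction of nominal), meeting at a junction held at 0 Pa.
      R1 = rng.normal(1.0e12, 0.10 * 1.0e12, N)    # Pa*s/m^3, 10% tolerance
      R2 = rng.normal(2.0e12, 0.10 * 2.0e12, N)

      # Hagen-Poiseuille analogue of Ohm's law: Q = dP / R in each branch.
      Q1, Q2 = delta_p / R1, delta_p / R2
      mix_ratio = Q1 / (Q1 + Q2)       # fraction of stream 1 in the combined output

      print(f"mixing ratio: nominal {2/3:.3f}, mean {mix_ratio.mean():.3f}, "
            f"std {mix_ratio.std():.3f}")
      print(f"95% interval: {np.percentile(mix_ratio, [2.5, 97.5]).round(3)}")

    The same resistance-network treatment extends to full series or parallel mixing circuits, which is how element-level tolerances are propagated into the circuit-level performance estimates described above.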

  15. Preliminary experimental results from a MARS Micro-CT system.

    PubMed

    He, Peng; Yu, Hengyong; Thayer, Patrick; Jin, Xin; Xu, Qiong; Bennett, James; Tappenden, Rachael; Wei, Biao; Goldstein, Aaron; Renaud, Peter; Butler, Anthony; Butler, Phillip; Wang, Ge

    2012-01-01

    The Medipix All Resolution System (MARS) is a commercial spectral/multi-energy micro-CT scanner designed and assembled by MARS Bioimaging Ltd. in New Zealand. This system utilizes the state-of-the-art Medipix photon-counting, energy-discriminating detector technology developed by a collaboration at the European Organization for Nuclear Research (CERN). In this paper, we report our preliminary experimental results using this system, including geometrical alignment, photon energy characterization, protocol optimization, and spectral image reconstruction. We produced our scan datasets with a multi-material phantom, and then applied the ordered-subset simultaneous algebraic reconstruction technique (OS-SART) to reconstruct images in different energy ranges and principal component analysis (PCA) to evaluate spectral deviation among the energy ranges.
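
    The spectral post-processing step can be sketched as PCA across the energy-bin reconstructions; the "images" below are synthetic random data standing in for reconstructed slices, so only the data layout (energy bins as features, pixels as observations) is the point.

      import numpy as np
      from sklearn.decomposition import PCA

      # Synthetic stand-in for reconstructed slices: 5 energy bins of a 64x64 slice.
      n_bins, ny, nx = 5, 64, 64
      rng = np.random.default_rng(1)
      energy_stack = rng.random((n_bins, ny, nx))

      # Treat each pixel as an observation with one feature per energy bin.
      X = energy_stack.reshape(n_bins, -1).T       # shape (pixels, bins)
      pca = PCA(n_components=3)
      components = pca.fit_transform(X)            # spectrally decorrelated pixel values
      pc_images = components.T.reshape(3, ny, nx)  # back to image form

      print("explained variance ratio:", pca.explained_variance_ratio_.round(3))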

  16. Minimizing irreversible losses in quantum systems by local counterdiabatic driving

    PubMed Central

    Sels, Dries; Polkovnikov, Anatoli

    2017-01-01

    Counterdiabatic driving protocols have been proposed [Demirplak M, Rice SA (2003) J Phys Chem A 107:9937–9945; Berry M (2009) J Phys A Math Theor 42:365303] as a means to make fast changes in the Hamiltonian without exciting transitions. Such driving in principle allows one to realize arbitrarily fast annealing protocols or implement fast dissipationless driving, circumventing standard adiabatic limitations requiring infinitesimally slow rates. These ideas were tested and used both experimentally and theoretically in small systems, but in larger chaotic systems, it is known that exact counterdiabatic protocols do not exist. In this work, we develop a simple variational approach allowing one to find the best possible counterdiabatic protocols given physical constraints, like locality. These protocols are easy to derive and implement both experimentally and numerically. We show that, using these approximate protocols, one can drastically suppress heating and increase the fidelity of quantum annealing protocols in complex many-particle systems. In the fast limit, these protocols provide an effective dual description of adiabatic dynamics, where the coupling constant plays the role of time and the counterdiabatic term plays the role of the Hamiltonian. PMID:28461472
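
    For a single spin-1/2, the exact counterdiabatic term referred to above is known in closed form, H_CD = (B x dB/dt)·σ / (2|B|²), and a short simulation shows how adding it suppresses excitations during a fast sweep. This is only the textbook two-level baseline that the paper generalizes variationally; the sweep parameters are arbitrary.

      import numpy as np
      from scipy.linalg import expm

      sx = np.array([[0, 1], [1, 0]], dtype=complex)
      sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
      sz = np.array([[1, 0], [0, -1]], dtype=complex)

      delta, eps0, T = 1.0, 10.0, 0.5           # transverse field, sweep amplitude, fast ramp time
      eps = lambda t: eps0 * (2 * t / T - 1)    # linear sweep of the longitudinal field
      deps = 2 * eps0 / T                       # d(eps)/dt

      def evolve(with_cd, steps=4000):
          dt = T / steps
          psi = np.linalg.eigh(0.5 * (delta * sx + eps(0.0) * sz))[1][:, 0]   # initial ground state
          for k in range(steps):
              t = (k + 0.5) * dt
              H = 0.5 * (delta * sx + eps(t) * sz)
              if with_cd:
                  # exact two-level counterdiabatic term: (B x dB/dt)_y / (2|B|^2), dDelta/dt = 0
                  H = H - 0.5 * delta * deps / (delta**2 + eps(t)**2) * sy
              psi = expm(-1j * H * dt) @ psi
          ground = np.linalg.eigh(0.5 * (delta * sx + eps(T) * sz))[1][:, 0]
          return abs(np.vdot(ground, psi)) ** 2

      print(f"ground-state fidelity without CD: {evolve(False):.3f}")
      print(f"ground-state fidelity with CD:    {evolve(True):.3f}")

    In a many-body system no such exact local term exists, which is where the paper's variational construction of approximate local counterdiabatic protocols comes in.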

  17. Establishing the first institutional animal care and use committee in Egypt.

    PubMed

    Fahmy, Sohair R; Gaafar, Khadiga

    2016-04-09

    Although animal research ethics committees (AREC) are well established in Western countries, this field is weakly developed and its concept is poorly understood in the Middle East and North Africa region. Our main objective was to introduce the concept and requirements of ethical approaches to dealing with experimental animals in research and teaching in Egypt. Due to its very recent inception, the Cairo University Faculty of Science IACUC decided to operate in accordance with the Guide for the Care and Use of Laboratory Animals, 8th Edition, 2011 (the Guide), since Egypt has not yet compiled its own guide. Fifty protocols were reviewed in 2013-2014: only ten protocols were reviewed in 2013, while forty were reviewed in 2014. In 2013 all protocols were approved; in 2014, 35 protocols were approved, 4 were deferred, and 1 was rejected. Master's theses (MSc) research protocols constituted the majority of the total reviewed protocols. This is attributed to the decision of the Board of the Faculty of Science, Cairo University, in September 2013 that approval by the IACUC is mandatory before conducting any research involving animals or registering theses. The first IACUC was established at the Faculty of Science, Cairo University, in 2012. The challenges encountered by the committee were diverse, such as the absence of laws that control the use of animal models in scientific research, the lack of guidelines (protocols for experimental animals in research), and the lack of mandatory ethical approval for experimental animal research.

  18. From theory to experimental design-Quantifying a trait-based theory of predator-prey dynamics.

    PubMed

    Laubmeier, A N; Wootton, Kate; Banks, J E; Bommarco, Riccardo; Curtsdotter, Alva; Jonsson, Tomas; Roslin, Tomas; Banks, H T

    2018-01-01

    Successfully applying theoretical models to natural communities and predicting ecosystem behavior under changing conditions is the backbone of predictive ecology. However, the experiments required to test these models are dictated by practical constraints, and models are often opportunistically validated against data for which they were never intended. Alternatively, we can inform and improve experimental design by an in-depth pre-experimental analysis of the model, generating experiments better targeted at testing the validity of a theory. Here, we describe this process for a specific experiment. Starting from food web ecological theory, we formulate a model and design an experiment to optimally test the validity of the theory, supplementing traditional design considerations with model analysis. The experiment itself will be run and described in a separate paper. The theory we test is that trophic population dynamics are dictated by species traits, and we study this in a community of terrestrial arthropods. We start from the Allometric Trophic Network (ATN) model and hypothesize that including habitat use, in addition to body mass, is necessary to better model trophic interactions. We therefore formulate new terms that account for micro-habitat use as well as intra- and interspecific interference in the ATN model. We design an experiment and an effective sampling regime to test this model and the underlying assumptions about the traits dominating trophic interactions. We arrive at a detailed sampling protocol to maximize information content in the empirical data obtained from the experiment and, relying on theoretical analysis of the proposed model, explore potential shortcomings of our design. Consequently, since this is a "pre-experimental" exercise aimed at improving the links between hypothesis formulation, model construction, experimental design and data collection, we hasten to publish our findings before analyzing data from the actual experiment, thus setting the stage for strong inference.
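
    The backbone of the ATN approach, consumer-resource dynamics whose rates scale allometrically with body mass, can be sketched for a single consumer-resource pair. The parameter values and mass scaling below are illustrative (Yodzis-Innes-style), and the habitat-use and interference terms that the authors add are not included.

      import numpy as np
      from scipy.integrate import odeint

      # Allometric metabolic rate: x ~ (m_C / m_R) ** -0.25 (illustrative constant).
      m_ratio = 100.0                    # consumer/resource body-mass ratio
      x = 0.3 * m_ratio ** -0.25         # consumer metabolic rate (per unit time)
      y = 6.0                            # maximum consumption relative to metabolism
      B0, h = 0.5, 1.2                   # half-saturation density, Hill exponent
      r, K, e = 1.0, 1.0, 0.85           # resource growth, capacity, assimilation

      def atn(B, t):
          R, C = B
          F = R**h / (B0**h + R**h)                    # functional response
          dR = r * R * (1 - R / K) - x * y * C * F / e
          dC = x * C * (y * F - 1)                     # growth minus maintenance
          return [dR, dC]

      t = np.linspace(0, 200, 2000)
      traj = odeint(atn, [0.8, 0.2], t)
      print("final biomasses (R, C):", traj[-1].round(3))

    In the paper's setting, equations of this kind are extended with trait-dependent habitat-use and interference terms and fitted to data from the planned experiment, which is what the pre-experimental analysis of identifiability and sampling frequency is meant to support.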

  19. Opportunistic detection of atrial fibrillation in subjects aged 65 years or older in primary care: a randomised clinical trial of efficacy. DOFA-AP study protocol.

    PubMed

    Pérula-de-Torres, Luis Á; Martínez-Adell, Miguel Á; González-Blanco, Virginia; Baena-Díez, José M; Martín-Rioboó, Enrique; Parras-Rejano, Juan M; González-Lama, Jesús; Martín-Alvarez, Remedios; Ruiz-Moral, Roger; Fernández-García, José Á; Pérez-Díaz, Modesto; Ruiz-de-Castroviejo, Joaquin; Pérula-de-Torres, Carlos; Valero-Martín, Antonio; Roldán-Villalobos, Ana; Criado-Larumbe, Margarita; Burdoy-Joaquín, Emili; Coma-Solé, Montserrat; Cervera-León, Mercè; Cuixart-Costa, Lluís

    2012-10-30

    Clinical Practice Guidelines recommend using peripheral blood pulse measuring as a screening test for Atrial Fibrillation. However, there is no adequate evidence supporting the efficacy of such a procedure in primary care clinical practice. This paper describes a study protocol designed to verify whether early opportunistic screening for Atrial Fibrillation by measuring blood pulse is more effective than regular practice in subjects aged 65 years or older attending primary care centers. A cluster-randomized controlled trial will be conducted in Primary Care Centers of the Spanish National Health Service. A total of 269 physicians and nurses will be allocated to one of the two arms of the trial by stratified randomization with a 3:2 ratio (three practitioners will be assigned to the Control Group for every two practitioners assigned to the Experimental Group). As many as 12 870 patients aged 65 years or older and meeting eligibility criteria will be recruited (8 580 will be allocated to the Experimental Group and 4 290 to the Control Group). Randomization and allocation to trial groups will be carried out by a central computer system. The Experimental Group practitioners will conduct an opportunistic case finding for patients with Atrial Fibrillation, while the Control Group practitioners will follow the regular guidelines. The first step will be finding new Atrial Fibrillation cases. A descriptive and inferential analysis will be performed (bivariate and multivariate by multilevel logistic regression analysis). If our hypothesis is confirmed, we expect Primary Care professionals to take a more proactive approach and adopt a new protocol when a patient meeting the established screening criteria is identified. Finally, we expect this measure to be incorporated into Clinical Practice Guidelines. The study is registered as NCT01291953 (ClinicalTrials.gov).
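
    The 3:2 stratified allocation of practitioners can be illustrated with a permuted-block scheme; the practitioner identifiers, strata, and block size below are hypothetical, and only the ratio and the idea of randomizing within strata follow the protocol described.

      import random
      from collections import defaultdict

      random.seed(42)

      def allocate(practitioners, strata):
          """Allocate practitioners 3:2 (control:experimental) within each stratum
          using permuted blocks of five."""
          groups = defaultdict(list)
          by_stratum = defaultdict(list)
          for p, s in zip(practitioners, strata):
              by_stratum[s].append(p)
          for s, members in by_stratum.items():
              random.shuffle(members)
              for i in range(0, len(members), 5):
                  block = ["control"] * 3 + ["experimental"] * 2
                  random.shuffle(block)
                  for p, arm in zip(members[i:i + 5], block):
                      groups[arm].append((s, p))
          return groups

      practitioners = [f"prac{i:03d}" for i in range(20)]
      strata = ["urban" if i % 2 == 0 else "rural" for i in range(20)]   # hypothetical strata
      groups = allocate(practitioners, strata)
      print({arm: len(members) for arm, members in groups.items()})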

  20. Real-time Electrophysiology: Using Closed-loop Protocols to Probe Neuronal Dynamics and Beyond

    PubMed Central

    Linaro, Daniele; Couto, João; Giugliano, Michele

    2015-01-01

    Experimental neuroscience is witnessing an increased interest in the development and application of novel, often complex closed-loop protocols, where the stimulus applied depends in real time on the response of the system. Recent applications range from the implementation of virtual reality systems for studying motor responses both in mice [1] and in zebrafish [2], to control of seizures following cortical stroke using optogenetics [3]. A key advantage of closed-loop techniques resides in the capability of probing higher dimensional properties that are not directly accessible or that depend on multiple variables, such as neuronal excitability [4] and reliability, while at the same time maximizing the experimental throughput. In this contribution and in the context of cellular electrophysiology, we describe how to apply a variety of closed-loop protocols to the study of the response properties of pyramidal cortical neurons, recorded intracellularly with the patch clamp technique in acute brain slices from the somatosensory cortex of juvenile rats. As no commercially available or open source software provides all the features required for efficiently performing the experiments described here, a new software toolbox called LCG [5] was developed, whose modular structure maximizes reuse of computer code and facilitates the implementation of novel experimental paradigms. Stimulation waveforms are specified using a compact meta-description and full experimental protocols are described in text-based configuration files. Additionally, LCG has a command-line interface that is suited for repetition of trials and automation of experimental protocols. PMID:26132434
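
    The defining feature of such closed-loop protocols, a stimulus that is updated at every step from the response just recorded, can be illustrated with a simulated leaky integrate-and-fire neuron whose firing rate is clamped to a target by an integral controller. This is an editorial stand-in for the idea, not LCG itself, and all parameters are illustrative.

      import numpy as np

      # Leaky integrate-and-fire parameters (illustrative values).
      tau_m, v_rest, v_thresh, v_reset = 20e-3, -70e-3, -50e-3, -65e-3
      R_m, dt = 100e6, 1e-4                        # membrane resistance (ohm), time step (s)
      target_rate, gain = 10.0, 2e-12              # Hz, integral gain (A per Hz per s)

      rng = np.random.default_rng(3)
      v, I, rate_estimate, spikes = v_rest, 100e-12, 0.0, []

      for step in range(int(30.0 / dt)):           # 30 s simulated closed-loop session
          # plant: one Euler step of the neuron driven by the current stimulus plus noise
          noise = 20e-12 * rng.standard_normal()
          v += dt / tau_m * (v_rest - v + R_m * (I + noise))
          spike = v >= v_thresh
          if spike:
              v = v_reset
              spikes.append(step * dt)
          # controller: online rate estimate (1 s low-pass) and integral update of the stimulus
          rate_estimate += dt * (spike / dt - rate_estimate)
          I += gain * (target_rate - rate_estimate) * dt

      achieved = sum(1 for s in spikes if s > 15.0) / 15.0
      print(f"target {target_rate:.1f} Hz, achieved {achieved:.1f} Hz over the last 15 s")

    The same structure, measure, update the stimulus, and apply it within the same time step, is what distinguishes closed-loop protocols from conventional pre-scripted stimulation.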

  1. Wet scrubbing of biomass producer gas tars using vegetable oil

    NASA Astrophysics Data System (ADS)

    Bhoi, Prakashbhai Ramabhai

    The overall aims of this research study were to generate novel design data and to develop an equilibrium-stage-based thermodynamic model of a vegetable-oil-based wet scrubbing system for the removal of model tar compounds (benzene, toluene and ethylbenzene) found in biomass producer gas. The specific objectives were to design, fabricate and evaluate a vegetable-oil-based wet scrubbing system and to optimize the design and operating variables, i.e., packed bed height, vegetable oil type, solvent temperature, and solvent flow rate. The experimental wet packed-bed scrubbing system includes a liquid distributor specifically designed to distribute a highly viscous vegetable oil uniformly and a mixing section designed to generate a desired concentration of tar compounds in a simulated air stream. A gas chromatography/mass spectrometry method and calibration protocol was developed to quantify the tar compounds. Experimental data were analyzed statistically using the analysis of variance (ANOVA) procedure. Statistical analysis showed that both soybean and canola oils are potential solvents, providing comparable removal efficiency of tar compounds. The experimental height equivalent to a theoretical plate (HETP) was determined as 0.11 m for the vegetable-oil-based scrubbing system. Packed bed height and solvent temperature had a highly significant effect (p < 0.05) on the removal of the model tar compounds. The packing-specific constants, Ch and CP,0, for the Billet and Schultes pressure drop correlation were determined as 2.52 and 2.93, respectively. The equilibrium-stage-based thermodynamic model predicted the removal efficiency of the model tar compounds within 1-6%, 1-4% and 1-2% of the experimental data for benzene, toluene and ethylbenzene, respectively, at a solvent temperature of 30 °C. The NRTL-PR property model and UNIFAC for estimating binary interaction parameters are recommended for modeling the absorption of tar compounds in vegetable oils. Bench-scale experimental data from the wet scrubbing system would be useful in the design and operation of a pilot-scale vegetable-oil-based system, and the process model, validated using the experimental data, would be a key design tool for the design and optimization of such a system.
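
    The link between the equilibrium-stage model and the packed-bed design can be illustrated with the Kremser equation: the number of theoretical stages required for a target removal, multiplied by the experimentally determined HETP, gives a required packing height. The stream compositions, equilibrium slope, and liquid-to-gas ratio below are hypothetical; only the 0.11 m HETP is taken from the abstract.

      import numpy as np

      def kremser_stages(y_in, y_out, x_in, m, L_over_G):
          """Number of theoretical stages for a dilute absorber (Kremser equation)."""
          A = L_over_G / m                       # absorption factor
          ratio = (y_in - m * x_in) / (y_out - m * x_in)
          return np.log(ratio * (1 - 1 / A) + 1 / A) / np.log(A)

      HETP = 0.11                                # m, from the experimental study
      y_in, removal = 2.0e-3, 0.95               # inlet tar mole fraction, target removal (hypothetical)
      y_out = y_in * (1 - removal)
      N = kremser_stages(y_in, y_out, x_in=0.0, m=0.05, L_over_G=0.2)   # hypothetical m, L/G
      print(f"theoretical stages ~ {N:.1f}, required packed height ~ {N * HETP:.2f} m")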

  2. How to design a single-cell RNA-sequencing experiment: pitfalls, challenges and perspectives.

    PubMed

    Dal Molin, Alessandra; Di Camillo, Barbara

    2018-01-31

    The sequencing of the transcriptome of single cells, or single-cell RNA-sequencing, has now become the dominant technology for the identification of novel cell types in heterogeneous cell populations or for the study of stochastic gene expression. In recent years, various experimental methods and computational tools for analysing single-cell RNA-sequencing data have been proposed. However, most of them are tailored to different experimental designs or biological questions, and in many cases their performance has not yet been benchmarked, making it difficult for a researcher to choose the optimal single-cell transcriptome sequencing (scRNA-seq) experiment and analysis workflow. In this review, we aim to provide an overview of the currently available experimental and computational methods developed to handle single-cell RNA-sequencing data and, based on their peculiarities, we suggest possible analysis frameworks depending on specific experimental designs. We also evaluate the challenges, open questions and future perspectives in the field. In particular, we go through the different steps of scRNA-seq experimental protocols such as cell isolation, messenger RNA capture, reverse transcription, amplification and use of quantitative standards such as spike-ins and Unique Molecular Identifiers (UMIs). We then analyse the current methodological challenges related to preprocessing, alignment, quantification, normalization, batch effect correction and methods to control for confounding effects. © The Author(s) 2018. Published by Oxford University Press. All rights reserved.
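
    One of the preprocessing steps discussed, library-size normalization of a UMI count matrix followed by a crude selection of highly variable genes, can be sketched in plain numpy; the count matrix below is synthetic and the thresholds are arbitrary.

      import numpy as np

      rng = np.random.default_rng(7)
      counts = rng.poisson(1.0, size=(200, 1000))      # cells x genes, synthetic UMI counts

      # Library-size (counts-per-10k) normalization and log transform.
      lib_size = counts.sum(axis=1, keepdims=True)
      norm = np.log1p(counts / lib_size * 1e4)

      # Rank genes by variance of the normalized expression (a crude HVG selection).
      gene_var = norm.var(axis=0)
      hvg = np.argsort(gene_var)[::-1][:100]           # top 100 highly variable genes
      print("median library size:", int(np.median(lib_size)),
            "| top HVG indices:", hvg[:5])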

  3. Manual Therapy, Therapeutic Patient Education, and Therapeutic Exercise, an Effective Multimodal Treatment of Nonspecific Chronic Neck Pain: A Randomized Controlled Trial.

    PubMed

    Beltran-Alacreu, Hector; López-de-Uralde-Villanueva, Ibai; Fernández-Carnero, Josué; La Touche, Roy

    2015-10-01

    The aim of this study was to determine the effectiveness of a multimodal treatment in the short and medium term for disability in nonspecific chronic neck pain. The design of this study is a single-blinded randomized controlled trial carried out in a university research laboratory. Forty-five patients between 18 and 65 yrs with nonspecific chronic neck pain were included in this study. Each patient was treated eight times over a 4-wk period. The sample was divided into three groups: control group, subjected to a protocol of manual therapy; experimental group 1, subjected to a protocol of manual therapy and therapeutic patient education; and experimental group 2, subjected to manual therapy, therapeutic patient education, and a therapeutic exercise protocol. Assessments were performed at baseline and at 4, 8, and 16 wks using the following measurements: the Neck Disability Index, the 11-item Tampa Scale of Kinesiophobia, the Fear Avoidance Beliefs Questionnaire, the Neck Flexor Muscle Endurance Test, and the Visual Analog Fatigue Scale. The nonparametric Kruskal-Wallis test for the Neck Disability Index showed statistically significant differences between baseline outcomes and all follow-up periods (P < 0.01). In the Kruskal-Wallis test, differences were found for the Visual Analog Fatigue Scale and the Neck Flexor Muscle Endurance Test in the follow-ups at 8 and 16 wks (P < 0.05). Analysis of variance for group × time interaction showed statistically significant changes (Tampa Scale of Kinesiophobia, F = 3.613, P = 0.005; Fear Avoidance Beliefs Questionnaire, F = 2.803, P = 0.022). Minimal detectable changes were obtained in both experimental groups for the 11-item Tampa Scale of Kinesiophobia but not in the control group. Differences between experimental groups and the control group were found in the short and medium term. A multimodal treatment is a good method for reducing disability in patients with nonspecific chronic neck pain in the short and medium term.

  4. Deterministic generation of remote entanglement with active quantum feedback

    DOE PAGES

    Martin, Leigh; Motzoi, Felix; Li, Hanhan; ...

    2015-12-10

    We develop and study protocols for deterministic remote entanglement generation using quantum feedback, without relying on an entangling Hamiltonian. In order to formulate the most effective experimentally feasible protocol, we introduce the notion of average-sense locally optimal feedback protocols, which do not require real-time quantum state estimation, a difficult component of real-time quantum feedback control. We use this notion of optimality to construct two protocols that can deterministically create maximal entanglement: a semiclassical feedback protocol for low-efficiency measurements and a quantum feedback protocol for high-efficiency measurements. The latter reduces to direct feedback in the continuous-time limit, whose dynamics can be modeled by a Wiseman-Milburn feedback master equation, which yields an analytic solution in the limit of unit measurement efficiency. Our formalism can smoothly interpolate between continuous-time and discrete-time descriptions of feedback dynamics and we exploit this feature to derive a superior hybrid protocol for arbitrary nonunit measurement efficiency that switches between quantum and semiclassical protocols. Lastly, we show using simulations incorporating experimental imperfections that deterministic entanglement of remote superconducting qubits may be achieved with current technology using the continuous-time feedback protocol alone.

  5. Extra-Vehicular Activity (EVA) glove evaluation test protocol

    NASA Technical Reports Server (NTRS)

    Hinman-Sweeney, E. M.

    1994-01-01

    One of the most critical components of a space suit is the gloves, yet gloves have traditionally presented significant design challenges. With continued efforts at glove development, a method for evaluating glove performance is needed. This paper presents a pressure-glove evaluation protocol. A description of this evaluation protocol, and its development is provided. The protocol allows comparison of one glove design to another, or any one design to bare-handed performance. Gloves for higher pressure suits may be evaluated at current and future design pressures to drive out differences in performance due to pressure effects. Using this protocol, gloves may be evaluated during design to drive out design problems and determine areas for improvement, or fully mature designs may be evaluated with respect to mission requirements. Several different test configurations are presented to handle these cases. This protocol was run on a prototype glove. The prototype was evaluated at two operating pressures and in the unpressurized state, with results compared to bare-handed performance. Results and analysis from this test series are provided, as is a description of the configuration used for this test.

  6. EXPERIMENTAL KETOSIS IN MAN. 'COLD KETOSIS' COMPARED WITH POST-EXERCISE KETOSIS AND NUTRITIONAL KETOSIS.

    DTIC Science & Technology

    This investigation was designed to answer three questions: (1) Does repetition of a ketosis following a 10-mile walk cause adaptive responses? (2) Does repeated exposure to cold result in a diminished ketotic response? (3) Do women show a post-exercise ketosis like men? Protocols for the three... exercise ketosis similar to that shown by men, despite much individual variability. Prolonged moderate exercise, exposure to cold and starvation all produce similar metabolic effects. (Author)

  7. Association between post-game recovery protocols, physical and perceived recovery, and performance in elite Australian Football League players.

    PubMed

    Bahnert, Andrew; Norton, Kevin; Lock, Phillip

    2013-03-01

    To determine the associations between post-game recovery protocols and physical and perceptual recovery, and game performance in Australian Football League players. A longitudinal quasi-experimental study design was used across a season. A full squad of 44 footballers was monitored weekly across a 23-game season. Players were required to choose from a number of recovery modalities available immediately post-game. These included floor stretching, pool stretching, bike active recovery, pool active recovery, cold-water immersion, contrast therapy and use of a compression garment. Perceptual measures of recovery were recorded throughout the week and a test of physical performance was conducted two days post-game. Game performance ratings were also recorded. The associations between the post-game recovery protocols chosen and players' perceived recovery, and physical and game performances were determined by the association rule data-mining strategy. Statistically significant associations were found between a number of post-game recovery protocols and perceptual recovery. In general, players who chose cold-water immersion, floor stretching, no active recovery (neither bike or pool) and the use of a compression garment post-game, had an increased probability of reporting greater perceptual recovery across the following week, relative to all other permutations of recovery protocols chosen. There were no associations found between post-game recovery protocol combinations and physical recovery. No associations were found between the post-game recovery methods and the next game performance. Perceptual recovery among players was enhanced through the selection of specific combinations of recovery protocols post game. However, no links were found between recovery protocols and physical or game performance measures. Copyright © 2012 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  8. Rational design of small-molecule stabilizers of spermine synthase dimer by virtual screening and free energy-based approach.

    PubMed

    Zhang, Zhe; Martiny, Virginie; Lagorce, David; Ikeguchi, Yoshihiko; Alexov, Emil; Miteva, Maria A

    2014-01-01

    Snyder-Robinson Syndrome (SRS) is a rare mental retardation disorder caused by the malfunctioning of an enzyme, spermine synthase (SMS), which functions as a homo-dimer. The malfunctioning of SMS in SRS patients is associated with several identified missense mutations that occur away from the active site. This investigation deals with a particular SRS-causing mutation, the G56S mutation, which was shown computationally and experimentally to destabilize the SMS homo-dimer and thus to abolish SMS enzymatic activity. As a proof of concept, we explore the possibility of restoring the enzymatic activity of the malfunctioning SMS mutant G56S by stabilizing the dimer through small-molecule binding at the mutant homo-dimer interface. For this purpose, we designed an in silico protocol that couples virtual screening and a binding free-energy-based approach to identify potential small-molecule binders of the destabilized G56S dimer, with the goal of stabilizing it and thus increasing SMS G56S mutant activity. The protocol resulted in an extensive list of plausible stabilizers, among which we selected and tested 51 compounds experimentally for their capability to increase SMS G56S mutant enzymatic activity. In silico analysis of the experimentally identified stabilizers suggested five distinctive chemical scaffolds. This investigation suggests that druggable pockets exist in the vicinity of the mutation sites at protein-protein interfaces which can be used to alter the disease-causing effects by small-molecule binding. The identified chemical scaffolds are drug-like and can serve as original starting points for the development of lead molecules to rescue the disease-causing effects of the Snyder-Robinson syndrome, for which no effective treatment currently exists.

  9. Strategies for Achieving High Sequencing Accuracy for Low Diversity Samples and Avoiding Sample Bleeding Using Illumina Platform

    PubMed Central

    Mitra, Abhishek; Skrzypczak, Magdalena; Ginalski, Krzysztof; Rowicka, Maga

    2015-01-01

    Sequencing microRNA, reduced representation sequencing, Hi-C technology and any method requiring the use of in-house barcodes result in sequencing libraries with low initial sequence diversity. Sequencing such data on the Illumina platform typically produces low quality data due to the limitations of the Illumina cluster calling algorithm. Moreover, even in the case of diverse samples, these limitations are causing substantial inaccuracies in multiplexed sample assignment (sample bleeding). Such inaccuracies are unacceptable in clinical applications, and in some other fields (e.g. detection of rare variants). Here, we discuss how both problems with quality of low-diversity samples and sample bleeding are caused by incorrect detection of clusters on the flowcell during initial sequencing cycles. We propose simple software modifications (Long Template Protocol) that overcome this problem. We present experimental results showing that our Long Template Protocol remarkably increases data quality for low diversity samples, as compared with the standard analysis protocol; it also substantially reduces sample bleeding for all samples. For comprehensiveness, we also discuss and compare experimental results from alternative approaches to sequencing low diversity samples. First, we discuss how the low diversity problem, if caused by barcodes, can be avoided altogether at the barcode design stage. Second and third, we present modified guidelines, which are more stringent than the manufacturer’s, for mixing low diversity samples with diverse samples and lowering cluster density, which in our experience consistently produces high quality data from low diversity samples. Fourth and fifth, we present rescue strategies that can be applied when sequencing results in low quality data and when there is no more biological material available. In such cases, we propose that the flowcell be re-hybridized and sequenced again using our Long Template Protocol. Alternatively, we discuss how analysis can be repeated from saved sequencing images using the Long Template Protocol to increase accuracy. PMID:25860802

  10. A quasi-experimental, before-after trial examining the impact of an emergency department mechanical ventilator protocol on clinical outcomes and lung-protective ventilation in acute respiratory distress syndrome

    PubMed Central

    Fuller, Brian M.; Ferguson, Ian T.; Mohr, Nicholas M.; Drewry, Anne M.; Palmer, Christopher; Wessman, Brian T.; Ablordeppey, Enyo; Keeperman, Jacob; Stephens, Robert J.; Briscoe, Cristopher C.; Kolomiets, Angelina A.; Hotchkiss, Richard S.; Kollef, Marin H.

    2017-01-01

    Objective To evaluate the impact of an emergency department (ED) mechanical ventilation protocol on clinical outcomes and adherence to lung-protective ventilation in patients with acute respiratory distress syndrome (ARDS). Design Quasi-experimental, before-after trial. Setting ED and intensive care units (ICU) of an academic center. Patients Mechanically ventilated ED patients experiencing ARDS while in the ED or after admission to the ICU. Interventions An ED ventilator protocol which targeted parameters in need of quality improvement, as identified by prior work: 1) lung-protective tidal volume; 2) appropriate setting of positive end-expiratory pressure (PEEP); 3) oxygen weaning; and 4) head-of-bed elevation. Measurements and Main Results A total of 229 patients (186 pre-intervention group, 43 intervention group) were studied. In the ED, the intervention was associated with significant changes (P < 0.01 for all) in tidal volume, PEEP, respiratory rate, oxygen administration, and head-of-bed elevation. There was a reduction in ED tidal volume from 8.1 mL/kg PBW (7.0 – 9.1) to 6.4 mL/kg PBW (6.1 – 6.7), and an increase in lung-protective ventilation from 11.1% to 61.5%, P < 0.01. The intervention was associated with a reduction in mortality from 54.8% to 39.5% (OR 0.38, 95% CI 0.17 – 0.83, P = 0.02), and a 3.9 day increase in ventilator-free days, P = 0.01. Conclusions This before-after study of mechanically ventilated patients with ARDS demonstrates that implementing a mechanical ventilator protocol in the ED is feasible, and associated with improved clinical outcomes. PMID:28157140
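
    The lung-protective target is defined per kilogram of predicted body weight (PBW), computed from sex and height; the sketch below uses the standard ARDSNet/Devine formula with a hypothetical patient.

      def predicted_body_weight(height_cm, sex):
          """ARDSNet/Devine predicted body weight in kg."""
          base = 50.0 if sex == "male" else 45.5
          return base + 0.91 * (height_cm - 152.4)

      def lung_protective_vt(height_cm, sex, ml_per_kg=6.0):
          """Target tidal volume (mL) at the given mL/kg PBW setting."""
          return ml_per_kg * predicted_body_weight(height_cm, sex)

      # Hypothetical patient: 175 cm male.
      pbw = predicted_body_weight(175, "male")
      print(f"PBW = {pbw:.1f} kg, 6 mL/kg target = {lung_protective_vt(175, 'male'):.0f} mL, "
            f"8.1 mL/kg (pre-intervention median) = {lung_protective_vt(175, 'male', 8.1):.0f} mL")

    For this hypothetical patient, the pre-intervention median of 8.1 mL/kg PBW corresponds to roughly 150 mL more per breath than the 6 mL/kg target, which is the gap the ED protocol was designed to close.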

  11. Serious gaming during multidisciplinary rehabilitation for patients with complex chronic pain or fatigue complaints: study protocol for a controlled trial and process evaluation

    PubMed Central

    Joosen, Margot C W; Mert, Agali; Zedlitz, Aglaia; Vrijhoef, Hubertus J M

    2017-01-01

    Introduction Many individuals suffer from chronic pain or functional somatic syndromes and face boundaries for diminishing functional limitations by means of biopsychosocial interventions. Serious gaming could complement multidisciplinary interventions through enjoyment and independent accessibility. A study protocol is presented for studying whether, how, for which patients and under what circumstances, serious gaming improves patient health outcomes during regular multidisciplinary rehabilitation. Methods and analysis A mixed-methods design is described that prioritises a two-armed naturalistic quasi-experiment. An experimental group is composed of patients who follow serious gaming during an outpatient multidisciplinary programme at two sites of a Dutch rehabilitation centre. Control group patients follow the same programme without serious gaming in two similar sites. Multivariate mixed-modelling analysis is planned for assessing how much variance in 250 patient records of routinely monitored pain intensity, pain coping and cognition, fatigue and psychopathology outcomes is attributable to serious gaming. Embedded qualitative methods include unobtrusive collection and analyses of stakeholder focus group interviews, participant feedback and semistructured patient interviews. Process analyses are carried out by a systematic approach of mixing qualitative and quantitative methods at various stages of the research. Ethics and dissemination The Ethics Committee of the Tilburg School of Social and Behavioural Sciences approved the research after reviewing the protocol for the protection of patients’ interests in conformity to the letter and rationale of the applicable laws and research practice (EC 2016.25t). Findings will be presented in research articles and international scientific conferences. Trial registration number A prospective research protocol for the naturalistic quasi-experimental outcome evaluation was entered in the Dutch trial register (registration number: NTR6020; Pre-results). PMID:28600377
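
    As a rough illustration of the planned mixed-modelling analysis, the Python sketch below fits a linear mixed model with a random intercept per patient using statsmodels on synthetic data. All variable names (pain_intensity, serious_gaming, visit, patient_id) and effect sizes are hypothetical stand-ins for the routinely monitored outcomes, not the trial's actual data.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Synthetic stand-in for routinely monitored outcomes (names and effects are made up).
        rng = np.random.default_rng(0)
        n_patients, n_visits = 60, 4
        df = pd.DataFrame({
            "patient_id": np.repeat(np.arange(n_patients), n_visits),
            "serious_gaming": np.repeat(rng.integers(0, 2, n_patients), n_visits),
            "visit": np.tile(np.arange(n_visits), n_patients),
        })
        patient_effect = np.repeat(rng.normal(0, 1.0, n_patients), n_visits)
        df["pain_intensity"] = (6 - 0.3 * df["visit"] - 0.5 * df["serious_gaming"]
                                + patient_effect + rng.normal(0, 1.0, len(df)))

        # Random intercept per patient; fixed effects for treatment arm and visit.
        model = smf.mixedlm("pain_intensity ~ serious_gaming + visit",
                            data=df, groups=df["patient_id"])
        print(model.fit().summary())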

  12. Gut Microbiome Standardization in Control and Experimental Mice.

    PubMed

    McCoy, Kathy D; Geuking, Markus B; Ronchi, Francesca

    2017-04-03

    Mouse models are used extensively to study human health and to investigate the mechanisms underlying human disease. In the past, most animal studies were performed without taking into consideration the impact of the microbiota. However, the microbiota that colonizes all body surfaces, including the gastrointestinal tract, respiratory tract, genitourinary tract, and skin, heavily impacts nearly every aspect of host physiology. When performing studies utilizing mouse models it is critical to understand that the microbiome is heavily impacted by environmental factors, including (but not limited to) food, bedding, caging, and temperature. In addition, stochastic changes in the microbiota can occur over time that also play a role in shaping microbial composition. These factors lead to massive variability in the composition of the microbiota between animal facilities and research institutions, and even within a single facility. Lack of experimental reproducibility between research groups has highlighted the necessity for rigorously controlled experimental designs in order to standardize the microbiota between control and experimental animals. Well controlled experiments are mandatory in order to reduce variability and allow correct interpretation of experimental results, not just of host-microbiome studies but of all mouse models of human disease. The protocols presented are aimed to design experiments that control the microbiota composition between different genetic strains of experimental mice within an animal unit. © 2017 by John Wiley & Sons, Inc. Copyright © 2017 John Wiley & Sons, Inc.

  13. Supervised versus unsupervised technology-based levodopa monitoring in Parkinson's disease: an intrasubject comparison.

    PubMed

    Lopane, Giovanna; Mellone, Sabato; Corzani, Mattia; Chiari, Lorenzo; Cortelli, Pietro; Calandra-Buonaura, Giovanna; Contin, Manuela

    2018-06-01

    We aimed to assess the intrasubject reproducibility of a technology-based levodopa (LD) therapeutic monitoring protocol administered in supervised versus unsupervised conditions in patients with Parkinson's disease (PD). The study design was pilot, intrasubject, single center, open and prospective. Twenty patients were recruited. Patients performed a standardized monitoring protocol instrumented by an ad hoc embedded platform after their usual first morning LD dose in two different randomized ambulatory sessions: one under a physician's supervision, the other self-administered. The protocol is made up of serial motor and non-motor tests, including alternate finger tapping, Timed Up and Go test, and measurement of blood pressure. Primary motor outcomes included comparisons of intrasubject LD subacute motor response patterns over the 3-h test in the two experimental conditions. Secondary outcomes were the number of intrasession serial test repetitions due to technical or handling errors and patients' satisfaction with the unsupervised LD monitoring protocol. Intrasubject LD motor response patterns were concordant between the two study sessions in all patients but one. Platform handling problems averaged 4% of total planned serial tests for both sessions. Ninety-five percent of patients were satisfied with the self-administered LD monitoring protocol. To our knowledge, this study is the first to explore the potential of unsupervised technology-based objective motor and non-motor tasks to monitor subacute LD dosing effects in PD patients. The results are promising for future telemedicine applications.

  14. Loaded hip thrust-based PAP protocol effects on acceleration and sprint performance of handball players.

    PubMed

    Dello Iacono, Antonio; Padulo, Johnny; Seitz, Laurent D

    2018-06-01

    This study aimed to investigate the acute effects of two barbell hip thrust-based (BHT) post-activation potentiation (PAP) protocols on subsequent sprint performance. Using a crossover design, eighteen handball athletes performed maximal 15-m sprints before and 15s, 4min and 8min after two experimental protocols consisting of BHT loaded with either 50% or 85% 1RM (50PAP and 85PAP, respectively), in order to profile the transient PAP effects. The resulting sprint performances were significantly impaired at 15s only after the 85PAP protocol, which induced likely and very likely greater decreases compared to the 50PAP. At 4min and 8min, significant improvements and very likely beneficial effects were observed in the 10m and 15m performances following both protocols. Significant differences were found when comparing the two PAPs over time; the results suggested very likely greater performance improvements in 10m following the 85PAP after 4min and 8min, and possible greater performance improvements in 15m after 4min. Positive correlations between BHT 1RM values and the greatest individual PAP responses on sprint performance were found. This investigation showed that both moderate and intensive BHT exercises can induce a PAP response, but the effects may differ according to the recovery following the potentiating stimulus and the individual's strength level.

  15. Experimental aspect of solid-state nuclear magnetic resonance studies of biomaterials such as bones.

    PubMed

    Singh, Chandan; Rai, Ratan Kumar; Sinha, Neeraj

    2013-01-01

    Solid-state nuclear magnetic resonance (SSNMR) spectroscopy is an increasingly popular technique for probing micro-structural details of biomaterials such as bone with picometer resolution. Because SSNMR methods probe such high-resolution structural details, the handling of bone samples and the experimental protocol are crucial aspects of any study. We present here the first report of the effect of various experimental protocols and bone-sample handling methods on measured SSNMR parameters. Several popular SSNMR experiments were performed on intact cortical bone collected from a fresh animal immediately after removal, and the results were compared with those from bone samples preserved under different conditions. We find that the best experimental conditions for SSNMR measurements of bone correspond to preservation at -20 °C and in 70% ethanol solution. Various other SSNMR parameters were compared across the different experimental conditions. Our study has helped identify the best experimental protocol for SSNMR studies of bone and will further aid the application of SSNMR to large animal models of bone disease, where statistically significant results are required. © 2013 Elsevier Inc. All rights reserved.

  16. Designing of routing algorithms in autonomous distributed data transmission system for mobile computing devices with ‘WiFi-Direct’ technology

    NASA Astrophysics Data System (ADS)

    Nikitin, I. A.; Sherstnev, V. S.; Sherstneva, A. I.; Botygin, I. A.

    2017-02-01

    The paper discusses the results of a survey of existing routing protocols for wireless networks and their main features. Based on these protocol data, routing protocols for wireless networks built on the ‘WiFi-Direct’ technology are designed, including route-search algorithms and phone-directory exchange algorithms. Algorithms that dispense with the IP protocol were designed, which increases their efficiency by working only with the MAC addresses of the devices. The developed algorithms are intended for mobile software engineering on the Android platform. Simpler algorithms and message formats than those of well-known routing protocols, together with the rejection of IP, allow the developed protocols to run on more primitive mobile devices. Deploying the protocols in industrial settings makes it possible to create data transmission networks among workstations and mobile robots without any access points.
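
    To make the IP-free, MAC-address-only routing idea concrete, here is a minimal Python sketch of a flooding-style route search over a hypothetical WiFi-Direct topology; the adjacency list and the breadth-first search are illustrative and are not taken from the algorithms described in the paper.

        from collections import deque

        # Hypothetical adjacency list of WiFi-Direct links, keyed by MAC address only.
        links = {
            "AA:00:00:00:00:01": ["AA:00:00:00:00:02", "AA:00:00:00:00:03"],
            "AA:00:00:00:00:02": ["AA:00:00:00:00:01", "AA:00:00:00:00:04"],
            "AA:00:00:00:00:03": ["AA:00:00:00:00:01"],
            "AA:00:00:00:00:04": ["AA:00:00:00:00:02"],
        }

        def find_route(source_mac, target_mac):
            """Breadth-first route discovery over MAC addresses, mimicking a flooded route request."""
            queue, visited = deque([[source_mac]]), {source_mac}
            while queue:
                path = queue.popleft()
                if path[-1] == target_mac:
                    return path
                for neighbour in links.get(path[-1], []):
                    if neighbour not in visited:
                        visited.add(neighbour)
                        queue.append(path + [neighbour])
            return None

        print(find_route("AA:00:00:00:00:03", "AA:00:00:00:00:04"))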

  17. The Maximal Oxygen Uptake Verification Phase: a Light at the End of the Tunnel?

    PubMed

    Schaun, Gustavo Z

    2017-12-08

    Commonly performed during an incremental test to exhaustion, maximal oxygen uptake (V̇O2max) assessment has become a recurring practice in clinical and experimental settings. To validate the test, several criteria were proposed. In this context, the plateau in oxygen uptake (V̇O2) is inconsistent in its frequency, reducing its usefulness as a robust method to determine "true" V̇O2max. Moreover, secondary criteria previously suggested, such as expiratory exchange ratios or percentages of maximal heart rate, are highly dependent on protocol design and often are achieved at V̇O2 percentages well below V̇O2max. Thus, an alternative method termed the verification phase was proposed. Currently, it is clear that the verification phase can be a practical and sensitive method to confirm V̇O2max; however, procedures to conduct it are not standardized across the literature and no previous research has tried to summarize how it has been employed. Therefore, in this review the knowledge on the verification phase was updated, while suggestions on how it can be performed (e.g. intensity, duration, recovery) were provided according to population and protocol design. Future studies should focus on identifying a verification protocol feasible for different populations and on comparing square-wave and multistage verification phases. Additionally, studies assessing verification phases in different patient populations are still warranted.
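
    As an illustration of the verification-phase logic discussed above, the sketch below accepts the incremental-test value as V̇O2max only if the verification-phase peak does not exceed it by more than a tolerance; the 3% tolerance is an assumption, since, as the review notes, procedures are not standardized across the literature.

        def vo2max_confirmed(incremental_peak, verification_peak, tolerance=0.03):
            """Treat V̇O2max as verified if the verification-phase peak does not exceed the
            incremental-test peak by more than the chosen tolerance (3% here, an assumption)."""
            return verification_peak <= incremental_peak * (1.0 + tolerance)

        print(vo2max_confirmed(52.0, 53.1))   # True: 53.1 is within 3% of 52.0 mL/kg/min
        print(vo2max_confirmed(52.0, 55.0))   # False: the verification value clearly exceeds it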

  18. Characterization of human plasma proteome dynamics using deuterium oxide

    PubMed Central

    Wang, Ding; Liem, David A; Lau, Edward; Ng, Dominic CM; Bleakley, Brian J; Cadeiras, Martin; Deng, Mario C; Lam, Maggie PY; Ping, Peipei

    2016-01-01

    Purpose High-throughput quantification of human protein turnover via in vivo administration of deuterium oxide (2H2O) is a powerful new approach to examine potential disease mechanisms. Its immediate clinical translation is contingent upon characterizations of the safety and hemodynamic effects of in vivo administration of 2H2O to human subjects. Experimental design We recruited 10 healthy human subjects with a broad demographic variety to evaluate the safety, feasibility, efficacy, and reproducibility of 2H2O intake for studying protein dynamics. We designed a protocol where each subject orally consumed weight-adjusted doses of 70% 2H2O daily for 14 days to enrich body water and proteins with deuterium. Plasma proteome dynamics was measured using a high-resolution MS method we recently developed. Results This protocol was successfully applied in 10 human subjects to characterize the endogenous turnover rates of 542 human plasma proteins, the largest such human dataset to-date. Throughout the study, we did not detect physiological effects or signs of discomfort from 2H2O consumption. Conclusions and clinical relevance Our investigation supports the utility of a 2H2O intake protocol that is safe, accessible, and effective for clinical investigations of large-scale human protein turnover dynamics. This workflow shows promising clinical translational value for examining plasma protein dynamics in human diseases. PMID:24946186
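
    For readers interested in how turnover rates are typically derived from such labelling data, the sketch below fits a single-pool first-order incorporation model, f(t) = 1 - exp(-kt), to a hypothetical deuterium-labelling time course; the data points and the single-pool assumption are illustrative and do not reproduce the study's analysis pipeline.

        import numpy as np
        from scipy.optimize import curve_fit

        def label_incorporation(t_days, k):
            """Single-pool first-order model: fraction of deuterium-labelled protein at time t."""
            return 1.0 - np.exp(-k * t_days)

        # Hypothetical labelling time course for one plasma protein (fractions, not real data).
        t = np.array([0, 2, 4, 7, 10, 14], dtype=float)
        f = np.array([0.0, 0.18, 0.33, 0.50, 0.63, 0.75])

        (k_fit,), _ = curve_fit(label_incorporation, t, f, p0=[0.1])
        half_life_days = np.log(2) / k_fit
        print(f"turnover rate k = {k_fit:.3f} /day, half-life = {half_life_days:.1f} days")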

  19. A protocol for rat in vitro fertilization during conventional laboratory working hours.

    PubMed

    Aoto, Toshihiro; Takahashi, Ri-ichi; Ueda, Masatsugu

    2011-12-01

    In vitro fertilization (IVF) is a valuable technique for the propagation of experimental animals. IVF has typically been used in mice to rapidly expand breeding colonies and create large numbers of embryos. However, applications of IVF in rat breeding experiments have stalled due to the inconvenient laboratory work schedules imposed by current IVF protocols for this species. Here, we developed a new rat IVF protocol that consists of experimental steps performed during common laboratory working hours. Our protocol can be completed within 12 h by shortening the period of sperm capacitation from 5 to 1 h and the fertilization time from 10 to 8 h in human tubal fluid (HTF) medium. This new protocol generated an excellent birth rate and was applicable not only to closed colony rat strains, such as Wistar, Long-Evans, and Sprague-Dawley (SD), but also to the inbred Lewis strain. Moreover, Wistar and Long-Evans embryos prepared by this protocol were successfully frozen by vitrification and later successfully thawed and resuscitated. This protocol is practical and can be easily adopted by laboratory workers.

  20. Serverification of Molecular Modeling Applications: The Rosetta Online Server That Includes Everyone (ROSIE)

    PubMed Central

    Conchúir, Shane Ó.; Der, Bryan S.; Drew, Kevin; Kuroda, Daisuke; Xu, Jianqing; Weitzner, Brian D.; Renfrew, P. Douglas; Sripakdeevong, Parin; Borgo, Benjamin; Havranek, James J.; Kuhlman, Brian; Kortemme, Tanja; Bonneau, Richard; Gray, Jeffrey J.; Das, Rhiju

    2013-01-01

    The Rosetta molecular modeling software package provides experimentally tested and rapidly evolving tools for the 3D structure prediction and high-resolution design of proteins, nucleic acids, and a growing number of non-natural polymers. Despite its free availability to academic users and improving documentation, use of Rosetta has largely remained confined to developers and their immediate collaborators due to the code’s difficulty of use, the requirement for large computational resources, and the unavailability of servers for most of the Rosetta applications. Here, we present a unified web framework for Rosetta applications called ROSIE (Rosetta Online Server that Includes Everyone). ROSIE provides (a) a common user interface for Rosetta protocols, (b) a stable application programming interface for developers to add additional protocols, (c) a flexible back-end to allow leveraging of computer cluster resources shared by RosettaCommons member institutions, and (d) centralized administration by the RosettaCommons to ensure continuous maintenance. This paper describes the ROSIE server infrastructure, a step-by-step ‘serverification’ protocol for use by Rosetta developers, and the deployment of the first nine ROSIE applications by six separate developer teams: Docking, RNA de novo, ERRASER, Antibody, Sequence Tolerance, Supercharge, Beta peptide design, NCBB design, and VIP redesign. As illustrated by the number and diversity of these applications, ROSIE offers a general and speedy paradigm for serverification of Rosetta applications that incurs negligible cost to developers and lowers barriers to Rosetta use for the broader biological community. ROSIE is available at http://rosie.rosettacommons.org. PMID:23717507

  1. Optimizing experimental procedures for quantitative evaluation of crop plant performance in high throughput phenotyping systems

    PubMed Central

    Junker, Astrid; Muraya, Moses M.; Weigelt-Fischer, Kathleen; Arana-Ceballos, Fernando; Klukas, Christian; Melchinger, Albrecht E.; Meyer, Rhonda C.; Riewe, David; Altmann, Thomas

    2015-01-01

    Detailed and standardized protocols for plant cultivation in environmentally controlled conditions are an essential prerequisite to conduct reproducible experiments with precisely defined treatments. Setting up appropriate and well defined experimental procedures is thus crucial for the generation of solid evidence and indispensable for successful plant research. Non-invasive and high throughput (HT) phenotyping technologies offer the opportunity to monitor and quantify performance dynamics of several hundreds of plants at a time. Compared to small scale plant cultivations, HT systems have much higher demands, from a conceptual and a logistic point of view, on experimental design, as well as the actual plant cultivation conditions, and the image analysis and statistical methods for data evaluation. Furthermore, cultivation conditions need to be designed that elicit plant performance characteristics corresponding to those under natural conditions. This manuscript describes critical steps in the optimization of procedures for HT plant phenotyping systems. Starting with the model plant Arabidopsis, HT-compatible methods were tested, and optimized with regard to growth substrate, soil coverage, watering regime, experimental design (considering environmental inhomogeneities) in automated plant cultivation and imaging systems. As revealed by metabolite profiling, plant movement did not affect the plants' physiological status. Based on these results, procedures for maize HT cultivation and monitoring were established. Variation of maize vegetative growth in the HT phenotyping system did match well with that observed in the field. The presented results outline important issues to be considered in the design of HT phenotyping experiments for model and crop plants. It thereby provides guidelines for the setup of HT experimental procedures, which are required for the generation of reliable and reproducible data of phenotypic variation for a broad range of applications. PMID:25653655
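
    One common way to handle the environmental inhomogeneities mentioned above is a randomized complete block layout, sketched below in Python; the number of lanes and the genotype list are hypothetical, and the design shown is a generic illustration rather than the layout used in the study.

        import random

        def randomized_complete_blocks(genotypes, n_blocks, seed=1):
            """Each block (e.g. a conveyor lane) receives every genotype once, in its own random order."""
            rng = random.Random(seed)
            layout = {}
            for block in range(1, n_blocks + 1):
                order = genotypes[:]
                rng.shuffle(order)
                layout[f"lane_{block}"] = order
            return layout

        genotypes = [f"G{i:02d}" for i in range(1, 7)]   # six hypothetical genotypes
        for lane, order in randomized_complete_blocks(genotypes, n_blocks=3).items():
            print(lane, order)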

  2. Sim3C: simulation of Hi-C and Meta3C proximity ligation sequencing technologies.

    PubMed

    DeMaere, Matthew Z; Darling, Aaron E

    2018-02-01

    Chromosome conformation capture (3C) and Hi-C DNA sequencing methods have rapidly advanced our understanding of the spatial organization of genomes and metagenomes. Many variants of these protocols have been developed, each with their own strengths. Currently there is no systematic means for simulating sequence data from this family of sequencing protocols, potentially hindering the advancement of algorithms to exploit this new datatype. We describe a computational simulator that, given simple parameters and reference genome sequences, will simulate Hi-C sequencing on those sequences. The simulator models the basic spatial structure in genomes that is commonly observed in Hi-C and 3C datasets, including the distance-decay relationship in proximity ligation, differences in the frequency of interaction within and across chromosomes, and the structure imposed by cells. A means to model the 3D structure of randomly generated topologically associating domains is provided. The simulator considers several sources of error common to 3C and Hi-C library preparation and sequencing methods, including spurious proximity ligation events and sequencing error. We have introduced the first comprehensive simulator for 3C and Hi-C sequencing protocols. We expect the simulator to have use in testing of Hi-C data analysis algorithms, as well as more general value for experimental design, where questions such as the required depth of sequencing, enzyme choice, and other decisions can be made in advance in order to ensure adequate statistical power with respect to experimental hypothesis testing.
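
    As a minimal illustration of the distance-decay relationship such a simulator must reproduce, the Python sketch below draws intra-chromosomal pair separations from a truncated power law by inverse-transform sampling; the decay exponent and separation bounds are assumptions, not Sim3C's actual parameterization.

        import numpy as np

        def sample_pair_separations(n_pairs, min_sep=1_000, max_sep=10_000_000, alpha=1.0, seed=0):
            """Draw genomic separations with p(s) proportional to s**(-alpha) on [min_sep, max_sep],
            using inverse-transform sampling of the truncated power law (alpha = 1 handled separately)."""
            rng = np.random.default_rng(seed)
            u = rng.random(n_pairs)
            if np.isclose(alpha, 1.0):
                return min_sep * (max_sep / min_sep) ** u
            a = 1.0 - alpha
            return (min_sep**a + u * (max_sep**a - min_sep**a)) ** (1.0 / a)

        seps = sample_pair_separations(100_000)
        print("median separation (bp):", int(np.median(seps)))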

  3. Skin Microbiome Surveys Are Strongly Influenced by Experimental Design.

    PubMed

    Meisel, Jacquelyn S; Hannigan, Geoffrey D; Tyldsley, Amanda S; SanMiguel, Adam J; Hodkinson, Brendan P; Zheng, Qi; Grice, Elizabeth A

    2016-05-01

    Culture-independent studies to characterize skin microbiota are increasingly common, due in part to affordable and accessible sequencing and analysis platforms. Compared to culture-based techniques, DNA sequencing of the bacterial 16S ribosomal RNA (rRNA) gene or whole metagenome shotgun (WMS) sequencing provides more precise microbial community characterizations. Most widely used protocols were developed to characterize microbiota of other habitats (i.e., gastrointestinal) and have not been systematically compared for their utility in skin microbiome surveys. Here we establish a resource for the cutaneous research community to guide experimental design in characterizing skin microbiota. We compare two widely sequenced regions of the 16S rRNA gene to WMS sequencing for recapitulating skin microbiome community composition, diversity, and genetic functional enrichment. We show that WMS sequencing most accurately recapitulates microbial communities, but sequencing of hypervariable regions 1-3 of the 16S rRNA gene provides highly similar results. Sequencing of hypervariable region 4 poorly captures skin commensal microbiota, especially Propionibacterium. WMS sequencing, which is resource and cost intensive, provides evidence of a community's functional potential; however, metagenome predictions based on 16S rRNA sequence tags closely approximate WMS genetic functional profiles. This study highlights the importance of experimental design for downstream results in skin microbiome surveys. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  4. Skin microbiome surveys are strongly influenced by experimental design

    PubMed Central

    Meisel, Jacquelyn S.; Hannigan, Geoffrey D.; Tyldsley, Amanda S.; SanMiguel, Adam J.; Hodkinson, Brendan P.; Zheng, Qi; Grice, Elizabeth A.

    2016-01-01

    Culture-independent studies to characterize skin microbiota are increasingly common, due in part to affordable and accessible sequencing and analysis platforms. Compared to culture-based techniques, DNA sequencing of the bacterial 16S ribosomal RNA (rRNA) gene or whole metagenome shotgun (WMS) sequencing provide more precise microbial community characterizations. Most widely used protocols were developed to characterize microbiota of other habitats (i.e. gastrointestinal), and have not been systematically compared for their utility in skin microbiome surveys. Here we establish a resource for the cutaneous research community to guide experimental design in characterizing skin microbiota. We compare two widely sequenced regions of the 16S rRNA gene to WMS sequencing for recapitulating skin microbiome community composition, diversity, and genetic functional enrichment. We show that WMS sequencing most accurately recapitulates microbial communities, but sequencing of hypervariable regions 1-3 of the 16S rRNA gene provides highly similar results. Sequencing of hypervariable region 4 poorly captures skin commensal microbiota, especially Propionibacterium. WMS sequencing, which is resource- and cost-intensive, provides evidence of a community’s functional potential; however, metagenome predictions based on 16S rRNA sequence tags closely approximate WMS genetic functional profiles. This work highlights the importance of experimental design for downstream results in skin microbiome surveys. PMID:26829039

  5. Functional genomics platform for pooled screening and mammalian genetic interaction maps

    PubMed Central

    Kampmann, Martin; Bassik, Michael C.; Weissman, Jonathan S.

    2014-01-01

    Systematic genetic interaction maps in microorganisms are powerful tools for identifying functional relationships between genes and defining the function of uncharacterized genes. We have recently implemented this strategy in mammalian cells as a two-stage approach. First, genes of interest are robustly identified in a pooled genome-wide screen using complex shRNA libraries. Second, phenotypes for all pairwise combinations of hit genes are measured in a double-shRNA screen and used to construct a genetic interaction map. Our protocol allows for rapid pooled screening under various conditions without a requirement for robotics, in contrast to arrayed approaches. Each stage of the protocol can be implemented in ~2 weeks, with additional time for analysis and generation of reagents. We discuss considerations for screen design, and present complete experimental procedures as well as a full computational analysis suite for identification of hits in pooled screens and generation of genetic interaction maps. While the protocols outlined here were developed for our original shRNA-based approach, they can be applied more generally, including to CRISPR-based approaches. PMID:24992097
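
    As a minimal sketch of how a pairwise genetic interaction score can be derived from such double-perturbation measurements, the snippet below uses a multiplicative null model (epsilon = observed double phenotype minus the product of the single phenotypes); the phenotype values are invented, and the scoring details of the published pipeline differ.

        def interaction_score(fitness_a, fitness_b, fitness_ab):
            """Genetic interaction under a multiplicative null model:
            epsilon = observed double-perturbation fitness - product of single-perturbation fitnesses."""
            return fitness_ab - fitness_a * fitness_b

        # Hypothetical relative growth phenotypes (wild type = 1.0)
        single = {"geneA": 0.8, "geneB": 0.7}
        double_ab = 0.40   # measured double-shRNA phenotype (made up for illustration)

        eps = interaction_score(single["geneA"], single["geneB"], double_ab)
        print(f"epsilon = {eps:+.2f}  (negative suggests a synthetic-sick/synergistic interaction)")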

  6. The use of high-fidelity human patient simulation as an evaluative tool in the development of clinical research protocols and procedures.

    PubMed

    Wright, Melanie C; Taekman, Jeffrey M; Barber, Linda; Hobbs, Gene; Newman, Mark F; Stafford-Smith, Mark

    2005-12-01

    Errors in clinical research can be costly, in terms of patient safety, data integrity, and data collection. Data inaccuracy in early subjects of a clinical study may be associated with problems in the design of the protocol, procedures, and data collection tools. High-fidelity patient simulation centers provide an ideal environment to apply human-centered design to clinical trial development. A draft of a complex clinical protocol was designed, evaluated and modified using a high-fidelity human patient simulator in the Duke University Human Simulation and Patient Safety Center. The process included walk-throughs, detailed modifications of the protocol and development of procedural aids. Training of monitors and coordinators provided an opportunity for observation of performance that was used to identify further improvements to the protocol. Evaluative steps were used to design the research protocol and procedures. Iterative modifications were made to the protocol and data collection tools. The success in use of human simulation in the preparation of a complex clinical drug trial suggests the benefits of human patient simulation extend beyond training and medical equipment evaluation. Human patient simulation can provide a context for informal expert evaluation of clinical protocol design and for formal "rehearsal" to evaluate the efficacy of procedures and support tools.

  7. Status and trends monitoring of riparian and aquatic habitat in the Olympic Experimental State Forest: Monitoring protocols

    Treesearch

    Teodora Minkova; Alex D. Foster

    2017-01-01

    Presented here are the monitoring protocols for the Status and Trends Monitoring of Riparian and Aquatic Habitats project in the Olympic Experimental State Forest (OESF). The procedures yield the empirical data needed to address key uncertainties regarding the integration of timber production and habitat conservation across landscapes and assess progress toward...

  8. SPAR: a security- and power-aware routing protocol for wireless ad hoc and sensor networks

    NASA Astrophysics Data System (ADS)

    Oberoi, Vikram; Chigan, Chunxiao

    2005-05-01

    Wireless Ad Hoc and Sensor Networks (WAHSNs) are vulnerable to extensive attacks and operate under severe resource constraints. To meet the security needs, many security enhancements have been proposed; likewise, from the resource-constraint perspective, many power-aware schemes have been proposed to conserve battery power. However, we observe that for severely resource-limited and extremely vulnerable WAHSNs, considering security or power (or any other resource) alone in protocol design is inadequate for building truly "secure-and-useful" WAHSNs. For example, from the resource-constraint perspective, we identify a potential problem, Security-Capable-Congestion (SCC) behavior, in WAHSN routing protocols that consider only security. Conversely, design approaches concerned only with scarce resources, such as many power-aware WAHSN protocols, leave security unconsidered and are undesirable in many WAHSN application scenarios. Motivated by these observations, we propose a co-design approach in which both high security and efficient resource consumption are targeted in WAHSN protocol design. Specifically, we propose a novel routing protocol based on this co-design approach, the Security- and Power-Aware Routing (SPAR) protocol. In SPAR, routing decisions are made using both security and power as routing criteria. The SPAR mechanism is routing-protocol independent and can therefore be broadly integrated into any existing WAHSN routing protocol. Simulation results show that SPAR significantly outperforms WAHSN routing protocols in which security or power alone is considered. This finding demonstrates that the proposed security- and resource-aware co-design approach is promising for truly "secure-and-useful" WAHSNs.
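
    The following Python sketch illustrates the co-design idea of weighting both security and power in a route metric; the cost function and weights are illustrative assumptions and are not the published SPAR metric.

        def route_cost(link_states, w_security=0.5, w_power=0.5):
            """Composite cost of a route: each link contributes a security risk term (1 - trust)
            and an energy term (1 - residual battery fraction of the forwarding node).
            The weights are illustrative, not the actual SPAR metric."""
            return sum(w_security * (1.0 - trust) + w_power * (1.0 - battery)
                       for trust, battery in link_states)

        # Two candidate routes described as (link trust, forwarder residual battery) pairs.
        secure_but_tired = [(0.9, 0.2), (0.95, 0.3)]
        fresh_but_risky  = [(0.6, 0.9), (0.7, 0.8)]

        for name, route in [("secure_but_tired", secure_but_tired),
                            ("fresh_but_risky", fresh_but_risky)]:
            print(name, round(route_cost(route), 3))   # lower cost wins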

  9. Weight training in youth-growth, maturation, and safety: an evidence-based review.

    PubMed

    Malina, Robert M

    2006-11-01

    To review the effects of resistance training programs on pre- and early-pubertal youth in the context of response, potential influence on growth and maturation, and occurrence of injury. Evidence-based review. Twenty-two reports dealing with experimental resistance training protocols, excluding isometric programs, in pre- and early-pubertal youth, were reviewed in the context of subject characteristics, training protocol, responses, and occurrence of injury. Experimental programs most often used isotonic machines and free weights, 2- and 3-day protocols, and 8- and 12-week durations, with significant improvements in muscular strength during childhood and early adolescence. Strength gains were lost during detraining. Experimental resistance training programs did not influence growth in height and weight of pre- and early-adolescent youth, and changes in estimates of body composition were variable and quite small. Only 10 studies systematically monitored injuries, and only three injuries were reported. Estimated injury rates were 0.176, 0.053, and 0.055 per 100 participant-hours in the respective programs. Experimental training protocols with weights and resistance machines and with supervision and low instructor/participant ratios are relatively safe and do not negatively impact growth and maturation of pre- and early-pubertal youth.

  10. Experimental demonstration of an OpenFlow based software-defined optical network employing packet, fixed and flexible DWDM grid technologies on an international multi-domain testbed.

    PubMed

    Channegowda, M; Nejabati, R; Rashidi Fard, M; Peng, S; Amaya, N; Zervas, G; Simeonidou, D; Vilalta, R; Casellas, R; Martínez, R; Muñoz, R; Liu, L; Tsuritani, T; Morita, I; Autenrieth, A; Elbers, J P; Kostecki, P; Kaczmarek, P

    2013-03-11

    Software-defined networking (SDN) and flexible-grid optical transport are two key technologies that allow network operators to customize their infrastructure to application requirements, thereby minimizing the extra capital and operational costs required for hosting new applications. In this paper, we report for the first time on the design, implementation, and demonstration of a novel OpenFlow-based SDN unified control plane allowing seamless operation across heterogeneous, state-of-the-art optical and packet transport domains. We verify and experimentally evaluate OpenFlow protocol extensions for flexible DWDM grid transport technology, along with their integration with fixed DWDM grid and layer-2 packet switching.

  11. Unitary n-designs via random quenches in atomic Hubbard and spin models: Application to the measurement of Rényi entropies

    NASA Astrophysics Data System (ADS)

    Vermersch, B.; Elben, A.; Dalmonte, M.; Cirac, J. I.; Zoller, P.

    2018-02-01

    We present a general framework for the generation of random unitaries based on random quenches in atomic Hubbard and spin models, forming approximate unitary n-designs, and their application to the measurement of Rényi entropies. We generalize our protocol presented in Elben et al. [Phys. Rev. Lett. 120, 050406 (2018), 10.1103/PhysRevLett.120.050406] to a broad class of atomic and spin-lattice models. We further present an in-depth numerical and analytical study of experimental imperfections, including the effect of decoherence and statistical errors, and discuss connections of our approach with many-body quantum chaos.
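
    As a point of reference for what the randomized-measurement protocol estimates, the Python sketch below computes the purity and second-order Rényi entropy, S2 = -log2 Tr(ρ²), directly from a small density matrix; the noisy two-qubit Bell state is an arbitrary example, and the sketch does not implement the random-quench estimator itself.

        import numpy as np

        def renyi2_entropy(rho):
            """Second-order Rényi entropy S2 = -log2 Tr(rho^2)."""
            purity = np.trace(rho @ rho).real
            return -np.log2(purity)

        # Arbitrary two-qubit example: a Bell state mixed with white noise (p is illustrative).
        bell = np.zeros((4, 1))
        bell[0, 0] = bell[3, 0] = 1 / np.sqrt(2)
        p = 0.8
        rho = p * (bell @ bell.T) + (1 - p) * np.eye(4) / 4

        print("purity:", round(np.trace(rho @ rho).real, 4))     # 0.73 for this example
        print("S2 (bits):", round(renyi2_entropy(rho), 4))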

  12. Water Adsorption and Dissociation on Polycrystalline Copper Oxides: Effects of Environmental Contamination and Experimental Protocol

    DOE PAGES

    Trotochaud, Lena; Head, Ashley R.; Pletincx, Sven; ...

    2017-11-02

    We use ambient-pressure X-ray photoelectron spectroscopy (APXPS) to study chemical changes, including hydroxylation and water adsorption, at copper oxide surfaces from ultrahigh vacuum to ambient relative humidities of ~5%. Polycrystalline CuO and Cu2O surfaces were prepared by selective oxidation of metallic copper foils. For both oxides, hydroxylation occurs readily, even at high-vacuum conditions. Hydroxylation on both oxides plateaus near ~0.01% relative humidity (RH) at a coverage of ~1 monolayer. In contrast to previous studies, neither oxide shows significant accumulation of molecular water; rather, both surfaces show a high affinity for adventitious carbon contaminants. Results of isobaric and isothermic experiments are compared, and the strengths and potential drawbacks of each method are discussed. We also provide critical evaluations of the effects of the hot filament of the ion pressure gauge on the reactivity of gas-phase species, the peak fitting procedure on the quantitative analysis of spectra, and rigorous accounting of carbon contamination on data analysis and interpretation. Lastly, this work underscores the importance of considering experimental design and data analysis protocols during APXPS experiments with water vapor in order to minimize misinterpretations arising from these factors.

  13. Water Adsorption and Dissociation on Polycrystalline Copper Oxides: Effects of Environmental Contamination and Experimental Protocol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trotochaud, Lena; Head, Ashley R.; Pletincx, Sven

    We use ambient-pressure X-ray photoelectron spectroscopy (APXPS) to study chemical changes, including hydroxylation and water adsorption, at copper oxide surfaces from ultrahigh vacuum to ambient relative humidities of ~5%. Polycrystalline CuO and Cu2O surfaces were prepared by selective oxidation of metallic copper foils. For both oxides, hydroxylation occurs readily, even at high-vacuum conditions. Hydroxylation on both oxides plateaus near ~0.01% relative humidity (RH) at a coverage of ~1 monolayer. In contrast to previous studies, neither oxide shows significant accumulation of molecular water; rather, both surfaces show a high affinity for adventitious carbon contaminants. Results of isobaric and isothermic experiments are compared, and the strengths and potential drawbacks of each method are discussed. We also provide critical evaluations of the effects of the hot filament of the ion pressure gauge on the reactivity of gas-phase species, the peak fitting procedure on the quantitative analysis of spectra, and rigorous accounting of carbon contamination on data analysis and interpretation. Lastly, this work underscores the importance of considering experimental design and data analysis protocols during APXPS experiments with water vapor in order to minimize misinterpretations arising from these factors.

  14. Small-molecule ligand docking into comparative models with Rosetta

    PubMed Central

    Combs, Steven A; DeLuca, Samuel L; DeLuca, Stephanie H; Lemmon, Gordon H; Nannemann, David P; Nguyen, Elizabeth D; Willis, Jordan R; Sheehan, Jonathan H; Meiler, Jens

    2017-01-01

    Structure-based drug design is frequently used to accelerate the development of small-molecule therapeutics. Although substantial progress has been made in X-ray crystallography and nuclear magnetic resonance (NMR) spectroscopy, the availability of high-resolution structures is limited owing to the frequent inability to crystallize or obtain sufficient NMR restraints for large or flexible proteins. Computational methods can be used to both predict unknown protein structures and model ligand interactions when experimental data are unavailable. This paper describes a comprehensive and detailed protocol using the Rosetta modeling suite to dock small-molecule ligands into comparative models. In the protocol presented here, we review the comparative modeling process, including sequence alignment, threading and loop building. Next, we cover docking a small-molecule ligand into the protein comparative model. In addition, we discuss criteria that can improve ligand docking into comparative models. Finally, and importantly, we present a strategy for assessing model quality. The entire protocol is presented on a single example selected solely for didactic purposes. The results are therefore not representative and do not replace benchmarks published elsewhere. We also provide an additional tutorial so that the user can gain hands-on experience in using Rosetta. The protocol should take 5–7 h, with additional time allocated for computer generation of models. PMID:23744289

  15. Attacks exploiting deviation of mean photon number in quantum key distribution and coin tossing

    NASA Astrophysics Data System (ADS)

    Sajeed, Shihan; Radchenko, Igor; Kaiser, Sarah; Bourgoin, Jean-Philippe; Pappa, Anna; Monat, Laurent; Legré, Matthieu; Makarov, Vadim

    2015-03-01

    The security of quantum communication using a weak coherent source requires an accurate knowledge of the source's mean photon number. Finite calibration precision or an active manipulation by an attacker may cause the actual emitted photon number to deviate from the known value. We model effects of this deviation on the security of three quantum communication protocols: the Bennett-Brassard 1984 (BB84) quantum key distribution (QKD) protocol without decoy states, Scarani-Acín-Ribordy-Gisin 2004 (SARG04) QKD protocol, and a coin-tossing protocol. For QKD we model both a strong attack using technology possible in principle and a realistic attack bounded by today's technology. To maintain the mean photon number in two-way systems, such as plug-and-play and relativistic quantum cryptography schemes, bright pulse energy incoming from the communication channel must be monitored. Implementation of a monitoring detector has largely been ignored so far, except for ID Quantique's commercial QKD system Clavis2. We scrutinize this implementation for security problems and show that designing a hack-proof pulse-energy-measuring detector is far from trivial. Indeed, the first implementation has three serious flaws confirmed experimentally, each of which may be exploited in a cleverly constructed Trojan-horse attack. We discuss requirements for a loophole-free implementation of the monitoring detector.
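
    To see why the mean photon number matters, the sketch below evaluates the multi-photon emission probability of a Poissonian weak coherent source, P(n ≥ 2) = 1 - e^(-μ)(1 + μ), for a nominal μ and two assumed upward deviations; the numbers are illustrative, not those analysed in the paper.

        import math

        def multiphoton_probability(mu):
            """Probability that a Poissonian weak coherent pulse contains two or more photons."""
            return 1.0 - math.exp(-mu) * (1.0 + mu)

        nominal_mu = 0.1
        for deviation in (1.0, 1.5, 2.0):   # calibration- or attacker-induced scaling of mu (assumed)
            mu = nominal_mu * deviation
            print(f"mu = {mu:.2f}: P(n >= 2) = {multiphoton_probability(mu):.4f}")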

  16. Interreality for the management and training of psychological stress: study protocol for a randomized controlled trial

    PubMed Central

    2013-01-01

    Background Psychological stress occurs when an individual perceives that environmental demands tax or exceed his or her adaptive capacity. Its association with severe health and emotional diseases, points out the necessity to find new efficient strategies to treat it. Moreover, psychological stress is a very personal problem and requires training focused on the specific needs of individuals. To overcome the above limitations, the INTERSTRESS project suggests the adoption of a new paradigm for e-health - Interreality - that integrates contextualized assessment and treatment within a hybrid environment, bridging the physical and the virtual worlds. According to this premise, the aim of this study is to investigate the advantages of using advanced technologies, in combination with cognitive behavioral therapy (CBT), based on a protocol for reducing psychological stress. Methods/Design The study is designed as a randomized controlled trial. It includes three groups of approximately 50 subjects each who suffer from psychological stress: (1) the experimental group, (2) the control group, (3) the waiting list group. Participants included in the experimental group will receive a treatment based on cognitive behavioral techniques combined with virtual reality, biofeedback and mobile phone, while the control group will receive traditional stress management CBT-based training, without the use of new technologies. The wait-list group will be reassessed and compared with the two other groups five weeks after the initial evaluation. After the reassessment, the wait-list patients will randomly receive one of the two other treatments. Psychometric and physiological outcomes will serve as quantitative dependent variables, while subjective reports of participants will be used as the qualitative dependent variable. Discussion What we would like to show with the present trial is that bridging virtual experiences, used to learn coping skills and emotional regulation, with real experiences using advanced technologies (virtual reality, advanced sensors and smartphones) is a feasible way to address actual limitations of existing protocols for psychological stress. Trial registration http://clinicaltrials.gov/ct2/show/NCT01683617 PMID:23806013

  17. Developing leadership capacity for guideline use: a pilot cluster randomized control trial.

    PubMed

    Gifford, Wendy A; Davies, Barbara L; Graham, Ian D; Tourangeau, Ann; Woodend, A Kirsten; Lefebre, Nancy

    2013-02-01

    The importance of leadership in influencing nurses' use of clinical guidelines has been well documented. However, little is known about how to develop and evaluate leadership interventions for guideline use. The purpose of this study was to pilot a leadership intervention designed to influence nurses' use of guideline recommendations when caring for patients with diabetic foot ulcers in home care nursing. This paper reports on the feasibility of implementing the study protocol, the trial findings related to nursing process outcomes, and leadership behaviors. A mixed methods pilot study was conducted with a post-only cluster randomized controlled trial and descriptive qualitative interviews. Four units were randomized to control or experimental groups. Clinical and management leadership teams participated in a 12-week leadership intervention (workshop, teleconferences). Participants received summarized chart audit data, identified goals for change, and created a team leadership action. Criteria to assess feasibility of the protocol included: design, intervention, measures, and data collection procedures. For the trial, chart audits compared differences in nursing process outcomes. Primary outcome: an 8-item nursing assessment score. Secondary outcome: a 5-item score of nursing care based on goals for change identified by intervention participants. Qualitative interviews described leadership behaviors that influenced guideline use. Conducting this pilot showed that some aspects of the study protocol were feasible, while others require further development. The trial observed no significant difference in the primary outcome. A significant increase was observed in the 5-item score chosen by intervention participants (p = 0.02). In the experimental group, more relations-oriented leadership behaviors, audit and feedback, and reminders were described as leadership strategies. Findings suggest that a leadership intervention has the potential to influence nurses' use of guideline recommendations, but further work is required to refine the intervention and outcome measures. A taxonomy of leadership behaviors is proposed to inform future research. © 2012 The authors. World Views on Evidence-Based Nursing © Sigma Theta Tau International.

  18. Design exploration and verification platform, based on high-level modeling and FPGA prototyping, for fast and flexible digital communication in physics experiments

    NASA Astrophysics Data System (ADS)

    Magazzù, G.; Borgese, G.; Costantino, N.; Fanucci, L.; Incandela, J.; Saponara, S.

    2013-02-01

    In many research fields with harsh operating conditions, such as high energy physics (HEP), astrophysics, nuclear medicine, or space engineering, the use of fast and flexible digital communication protocols is becoming more and more important. A well-structured, validated top-down design flow for developing a new protocol for the control and readout of front-end electronics is therefore very useful. To this aim, and to reduce development time, costs, and risks, this paper describes an innovative design/verification flow applied, as an example case study, to a new communication protocol called FF-LYNX. After describing the main FF-LYNX features, the paper presents: the definition of a parametric SystemC-based Integrated Simulation Environment (ISE) for high-level protocol definition and validation; the set-up of figures of merit to drive the design-space exploration; the use of the ISE for early analysis of the achievable performance when adopting the new communication protocol and its interfaces in a new (or upgraded) physics experiment; the design of VHDL IP cores for the TX and RX protocol interfaces; their implementation on an FPGA-based emulator for functional verification; and finally the modification of the FPGA-based emulator for testing the ASIC chipset that implements the rad-tolerant protocol interfaces. For every step, significant results are shown to underline the usefulness of this design and verification approach, which can be applied to the development of any new digital protocol for smart detectors in physics experiments.

  19. Standardization of a Videofluoroscopic Swallow Study Protocol to Investigate Dysphagia in Dogs.

    PubMed

    Harris, R A; Grobman, M E; Allen, M J; Schachtel, J; Rawson, N E; Bennett, B; Ledyayev, J; Hopewell, B; Coates, J R; Reinero, C R; Lever, T E

    2017-03-01

    Videofluoroscopic swallow study (VFSS) is the gold standard for diagnosis of dysphagia in veterinary medicine but lacks standardized protocols that emulate physiologic feeding practices. Age impacts swallow function in humans but has not been evaluated by VFSS in dogs. To develop a protocol with custom kennels designed to allow free-feeding of 3 optimized formulations of contrast media and diets that address limitations of current VFSS protocols. We hypothesized that dogs evaluated by a free-feeding VFSS protocol would show differences in objective swallow metrics based on age. Healthy juvenile, adult, and geriatric dogs (n = 24). Prospective, experimental study. Custom kennels were developed to maintain natural feeding behaviors during VFSS. Three food consistencies (thin liquid, pureed food, and dry kibble) were formulated with either iohexol or barium to maximize palatability and voluntary prehension. Dogs were evaluated by 16 swallow metrics and compared across age groups. Development of a standardized VFSS protocol resulted in successful collection of swallow data in healthy dogs. No significant differences in swallow metrics were observed among age groups. Substantial variability was observed in healthy dogs when evaluated under these physiologic conditions. Features typically attributed to pathologic states, such as gastric reflux, were seen in healthy dogs. Development of a VFSS protocol that reflects natural feeding practices may allow emulation of physiology resulting in clinical signs of dysphagia. Age did not result in significant changes in swallow metrics, but additional studies are needed, particularly in light of substantial normal variation. Copyright © 2017 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.

  20. Research on low-latency MAC protocols for wireless sensor networks

    NASA Astrophysics Data System (ADS)

    He, Chenguang; Sha, Xuejun; Lee, Chankil

    2007-11-01

    Energy efficiency should not be the only design goal in MAC protocols for wireless sensor networks, which rely on battery-operated computing and sensing devices. Low-latency operation becomes just as important as energy efficiency when the traffic load is very heavy or when real-time constraints apply, as in tracking or locating applications. This paper introduces some causes of the time delays inherent in a multi-hop network using existing WSN MAC protocols, illustrates the importance of low-latency MAC design for wireless sensor networks, and presents three MACs as examples of low-latency protocols designed specifically to address sleep delay, wait delay, and wakeup delay, respectively. The paper also discusses design trade-offs with an emphasis on low latency, points out the advantages and disadvantages of each protocol, and offers design considerations and suggestions for MAC protocols in future applications and research.
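
    As a back-of-the-envelope illustration of sleep delay in duty-cycled MACs, the sketch below uses the simplification that a packet arriving at a uniformly random time waits, on average, half a listen/sleep cycle at every forwarding hop; the cycle lengths and hop count are assumed values, and contention and wakeup overheads are ignored.

        def expected_sleep_delay_ms(cycle_ms, hops):
            """Simplified expectation: a packet arriving at a uniformly random time waits on
            average half of the listen/sleep cycle at every forwarding hop (contention ignored)."""
            return 0.5 * cycle_ms * hops

        for cycle in (100, 500, 1000):   # listen/sleep period in milliseconds (assumed values)
            print(f"cycle {cycle} ms, 10 hops -> ~{expected_sleep_delay_ms(cycle, 10):.0f} ms end-to-end sleep delay")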

  1. Methodology capture: discriminating between the "best" and the rest of community practice

    PubMed Central

    Eales, James M; Pinney, John W; Stevens, Robert D; Robertson, David L

    2008-01-01

    Background The methodologies we use both enable and help define our research. However, as experimental complexity has increased the choice of appropriate methodologies has become an increasingly difficult task. This makes it difficult to keep track of available bioinformatics software, let alone the most suitable protocols in a specific research area. To remedy this we present an approach for capturing methodology from literature in order to identify and, thus, define best practice within a field. Results Our approach is to implement data extraction techniques on the full-text of scientific articles to obtain the set of experimental protocols used by an entire scientific discipline, molecular phylogenetics. Our methodology for identifying methodologies could in principle be applied to any scientific discipline, whether or not computer-based. We find a number of issues related to the nature of best practice, as opposed to community practice. We find that there is much heterogeneity in the use of molecular phylogenetic methods and software, some of which is related to poor specification of protocols. We also find that phylogenetic practice exhibits field-specific tendencies that have increased through time, despite the generic nature of the available software. We used the practice of highly published and widely collaborative researchers ("expert" researchers) to analyse the influence of authority on community practice. We find expert authors exhibit patterns of practice common to their field and therefore act as useful field-specific practice indicators. Conclusion We have identified a structured community of phylogenetic researchers performing analyses that are customary in their own local community and significantly different from those in other areas. Best practice information can help to bridge such subtle differences by increasing communication of protocols to a wider audience. We propose that the practice of expert authors from the field of evolutionary biology is the closest to contemporary best practice in phylogenetic experimental design. Capturing best practice is, however, a complex task and should also acknowledge the differences between fields such as the specific context of the analysis. PMID:18761740
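
    A minimal sketch of the extraction idea, assuming a simple dictionary-plus-regular-expression approach rather than the richer pipeline used in the study: scan methods text for mentions of known phylogenetics software and count them. The tool list and the example sentence are illustrative.

        import re
        from collections import Counter

        # Illustrative dictionary of phylogenetics tools; the real study used a richer extraction pipeline.
        known_tools = ["MrBayes", "PhyML", "RAxML", "PAUP", "ClustalW", "MUSCLE", "BEAST"]
        pattern = re.compile(r"\b(" + "|".join(map(re.escape, known_tools)) + r")\b")

        methods_text = ("Alignments were generated with ClustalW and trees were inferred "
                        "in MrBayes; ML analyses were repeated in PhyML.")

        mentions = Counter(pattern.findall(methods_text))
        print(mentions)   # expected: one mention each of ClustalW, MrBayes, PhyML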

  2. Damping measurements in flowing water

    NASA Astrophysics Data System (ADS)

    Coutu, A.; Seeley, C.; Monette, C.; Nennemann, B.; Marmont, H.

    2012-11-01

    Fluid-structure interaction (FSI), in the form of mass loading and damping, governs the dynamic response of water turbines, such as Francis turbines. Water added mass and damping are both critical quantities in evaluating the dynamic response of the turbine component. Although the effect of fluid added mass is well documented, fluid damping, a critical quantity to limit vibration amplitudes during service, and therefore to help avoiding possible failure of the turbines, has received much less attention in the literature. This paper presents an experimental investigation of damping due to FSI. The experimental setup, designed to create dynamic characteristics similar to the ones of Francis turbine blades is discussed, together with the experimental protocol and examples of measurements obtained. The paper concludes with the calculated damping values and a discussion on the impact of the observed damping behaviour on the response of hydraulic turbine blades to FSI.

  3. Morphological Study of Langmuir Polymer Films by means of Atomic Force Microscopy and MD Simulations

    NASA Astrophysics Data System (ADS)

    Reiter, Renate; Knecht, Volker; Chandran, Sivasurender; Reiter, Günter

    In general it is difficult to reproduce well-defined morphologies of Langmuir polymer films (LPFs) because they have a high propensity to form non-equilibrium states. We present a systematic study based on different compression protocols designed to allow relaxation of LPFs under well-defined conditions. The homopolypeptide poly-γ-benzyl-L-glutamate (PBLG) was chosen for this study because it is a well-investigated system representative of the relaxational behaviour of rod-like molecules, which is expected to be less complex than that of coiled polymer molecules. Our results demonstrate that experimentally manipulating the course of relaxation in LPFs has a tremendous impact on the ordering of the molecules. Coarse-grained molecular dynamics simulations were performed under comparable conditions. The results match the experimental observations reasonably well and allow us to zoom into molecular details that are not resolved experimentally.

  4. SPP: A data base processor data communications protocol

    NASA Technical Reports Server (NTRS)

    Fishwick, P. A.

    1983-01-01

    The design and implementation of a data communications protocol for the Intel Data Base Processor (DBP) is defined. The protocol is termed SPP (Service Port Protocol) since it enables data transfer between the host computer and the DBP service port. The protocol implementation is extensible in that it is explicitly layered and the protocol functionality is hierarchically organized. Extensive trace and performance capabilities have been supplied with the protocol software to permit optional efficient monitoring of the data transfer between the host and the Intel data base processor. Machine independence was considered to be an important attribute during the design and implementation of SPP. The protocol source is fully commented and is included in Appendix A of this report.

  5. Does Posting Facebook Status Updates Increase or Decrease Loneliness? An Online Social Networking Experiment

    PubMed Central

    Deters, Fenne große; Mehl, Matthias R.

    2013-01-01

    Online social networking is a pervasive but empirically understudied phenomenon. Strong public opinions on its consequences exist but are backed up by little empirical evidence and almost no causally-conclusive, experimental research. The current study tested the psychological effects of posting status updates on Facebook using an experimental design. For one week, participants in the experimental condition were asked to post more than they usually do, whereas participants in the control condition received no instructions. Participants added a lab “Research Profile” as a Facebook friend allowing for the objective documentation of protocol compliance, participants’ status updates, and friends’ responses. Results revealed (1) that the experimentally-induced increase in status updating activity reduced loneliness, (2) that the decrease in loneliness was due to participants feeling more connected to their friends on a daily basis and (3) that the effect of posting on loneliness was independent of direct social feedback (i.e. responses) by friends. PMID:24224070

  6. Using Backward Design in Education Research: A Research Methods Essay †

    PubMed Central

    Jensen, Jamie L.; Bailey, Elizabeth G.; Kummer, Tyler A.; Weber, K. Scott

    2017-01-01

    Education research within the STEM disciplines applies a scholarly approach to teaching and learning, with the intent of better understanding how people learn and of improving pedagogy at the undergraduate level. Most of the professionals practicing in this field have ‘crossed over’ from other disciplinary fields and thus have faced challenges in becoming experts in a new discipline. In this article, we offer a novel framework for approaching education research design called Backward Design in Education Research. It is patterned on backward curricular design and provides a three-step, systematic approach to designing education projects: 1) Define a research question that leads to a testable causal hypothesis based on a theoretical rationale; 2) Choose or design the assessment instruments to test the research hypothesis; and 3) Develop an experimental protocol that will be effective in testing the research hypothesis. This approach provides a systematic method to develop and carry out evidence-based research design. PMID:29854045

  7. Preparation Prerequisites for Effective Irrigation of Apical Root Canal: A Critical Review.

    PubMed

    Tziafas, Dimitrios; Alraeesi, Dana; Al Hormoodi, Reem; Ataya, Maamoun; Fezai, Hessa; Aga, Nausheen

    2017-10-01

    It is well recognized that disinfection of the complex root canal system at the apical root canal remains the most critical therapeutic measure to treat apical periodontitis. Observational and experimental data in relation to the anatomy of the apical root canal in different tooth types and the cross-sectional diameters of the apical part of the most commonly used hand and rotary files are critically reviewed. The present data analysis confirms that the challenging issue of antibacterial efficacy of modern preparation protocols in non-surgical endodontics requires more attention to apical root canal irrigation as a balance between safety and effectiveness. Ex vivo investigations clearly indicate that a specific design of the chemo-mechanical preparation is needed at the onset of RCT, more particularly in infected teeth. Design should be based on specific anatomical parameters, and must determine the appropriate size and taper of preparation as prerequisites for effective and safe apical irrigation. The optimal irrigation protocols might be designed on the basis of technical specifications of the preparation procedures, such as the penetration depth, the type of the needle, the required time for continuous irrigant flow, the concentration of NaOCl, and the activation parameters. Key words: Endodontics, root canal treatment, instrumentation, irrigation, apical root canal.

  8. Modeling and performance analysis using extended fuzzy-timing Petri nets for networked virtual environments.

    PubMed

    Zhou, Y; Murata, T; Defanti, T A

    2000-01-01

    Despite their attractive properties, networked virtual environments (net-VEs) are notoriously difficult to design, implement, and test due to the concurrency, real-time and networking features in these systems. Net-VEs demand high quality-of-service (QoS) requirements on the network to maintain natural and real-time interactions among users. The current practice for net-VE design is basically trial and error, empirical, and totally lacks formal methods. This paper proposes to apply a Petri net formal modeling technique to a net-VE, NICE (narrative immersive constructionist/collaborative environment), predict the net-VE performance based on simulation, and improve the net-VE performance. NICE is essentially a network of collaborative virtual reality systems called the CAVE (CAVE automatic virtual environment). First, we introduce extended fuzzy-timing Petri net (EFTN) modeling and analysis techniques. Then, we present EFTN models of the CAVE, NICE, and the transport layer protocol used in NICE: the transmission control protocol (TCP). We present a possibility analysis based on the EFTN model for the CAVE. Then, by using these models and Design/CPN as the simulation tool, we conducted various simulations to study real-time behavior, network effects and performance (latencies and jitters) of NICE. Our simulation results are consistent with experimental data.
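    The EFTN formalism (fuzzy time stamps, possibility analysis) is well beyond a few lines, but the token-game semantics that timed Petri net simulation builds on can be sketched briefly. The following toy firing loop, with random conflict resolution and per-transition delay functions, is only a minimal illustration under simplifying assumptions; it is not the authors' EFTN models of the CAVE, NICE, or TCP.

        import random

        def simulate(places, transitions, max_firings=20):
            """Minimal timed Petri net token game: fire one enabled transition at a time."""
            clock, trace = 0.0, []
            for _ in range(max_firings):
                enabled = [name for name, t in transitions.items()
                           if all(places[p] >= 1 for p in t["in"])]
                if not enabled:
                    break
                name = random.choice(enabled)        # resolve conflicts at random
                t = transitions[name]
                for p in t["in"]:
                    places[p] -= 1                   # consume input tokens
                clock += t["delay"]()                # sample a firing delay (stand-in for latency/jitter)
                for p in t["out"]:
                    places[p] += 1                   # produce output tokens
                trace.append((round(clock, 3), name))
            return trace

        # Hypothetical two-place exchange of a message between a client and a server.
        places = {"client_ready": 1, "server_ready": 0}
        transitions = {
            "send":  {"in": ["client_ready"], "out": ["server_ready"],
                      "delay": lambda: random.uniform(0.01, 0.03)},
            "reply": {"in": ["server_ready"], "out": ["client_ready"],
                      "delay": lambda: random.uniform(0.01, 0.03)},
        }
        print(simulate(places, transitions))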

  9. A new rapid kindling variant for induction of cortical epileptogenesis in freely moving rats

    PubMed Central

    Morales, Juan Carlos; Álvarez-Ferradas, Carla; Roncagliolo, Manuel; Fuenzalida, Marco; Wellmann, Mario; Nualart, Francisco Javier; Bonansco, Christian

    2014-01-01

    Kindling, one of the most widely used models of experimental epilepsy, is based on daily electrical stimulation in several brain structures. Unlike the classic or slow kindling protocols (SK), the rapid kindling types (RK) described until now require continuous stimulation at suprathreshold intensities applied directly to the same brain structure used for subsequent electrophysiological and immunohistochemical studies, usually the hippocampus. However, the cellular changes observed in these rapid protocols, such as astrogliosis and neuronal loss, could be due to experimental manipulation more than to epileptogenesis-related alterations. Here, we developed a new RK protocol in order to generate an improved model of temporal lobe epilepsy (TLE) which allows gradual progression of the epilepsy as well as obtaining an epileptic hippocampus, thus avoiding direct surgical manipulation and electric stimulation over this structure. This new protocol consists of basolateral amygdala (BLA) stimulation with 10 trains of biphasic pulses (10 s; 50 Hz) per day with 20-min intervals, during 3 consecutive days, using a subconvulsive and subthreshold intensity, which guarantees tissue integrity. The progression of epileptic activity was evaluated in freely moving rats through electroencephalographic (EEG) recordings from cortex and amygdala, accompanied by synchronized video recordings. Moreover, we assessed the effectiveness of the RK protocol and the establishment of epilepsy by evaluating cellular alterations of hippocampal slices from kindled rats. The RK protocol induced convulsive states similar to SK protocols but in 3 days, with persistently lowered threshold to seizure induction and epileptogenic-dependent cellular changes in amygdala projection areas. We concluded that this novel RK protocol introduces a new variant of the chronic epileptogenesis models in freely moving rats, which is faster, highly reproducible and causes minimum cell damage with respect to that observed in other experimental models of epilepsy. PMID:25100948

  10. A new rapid kindling variant for induction of cortical epileptogenesis in freely moving rats.

    PubMed

    Morales, Juan Carlos; Alvarez-Ferradas, Carla; Roncagliolo, Manuel; Fuenzalida, Marco; Wellmann, Mario; Nualart, Francisco Javier; Bonansco, Christian

    2014-01-01

    Kindling, one of the most widely used models of experimental epilepsy, is based on daily electrical stimulation in several brain structures. Unlike the classic or slow kindling protocols (SK), the rapid kindling types (RK) described until now require continuous stimulation at suprathreshold intensities applied directly to the same brain structure used for subsequent electrophysiological and immunohistochemical studies, usually the hippocampus. However, the cellular changes observed in these rapid protocols, such as astrogliosis and neuronal loss, could be due to experimental manipulation more than to epileptogenesis-related alterations. Here, we developed a new RK protocol in order to generate an improved model of temporal lobe epilepsy (TLE) which allows gradual progression of the epilepsy as well as obtaining an epileptic hippocampus, thus avoiding direct surgical manipulation and electric stimulation over this structure. This new protocol consists of basolateral amygdala (BLA) stimulation with 10 trains of biphasic pulses (10 s; 50 Hz) per day with 20-min intervals, during 3 consecutive days, using a subconvulsive and subthreshold intensity, which guarantees tissue integrity. The progression of epileptic activity was evaluated in freely moving rats through electroencephalographic (EEG) recordings from cortex and amygdala, accompanied by synchronized video recordings. Moreover, we assessed the effectiveness of the RK protocol and the establishment of epilepsy by evaluating cellular alterations of hippocampal slices from kindled rats. The RK protocol induced convulsive states similar to SK protocols but in 3 days, with persistently lowered threshold to seizure induction and epileptogenic-dependent cellular changes in amygdala projection areas. We concluded that this novel RK protocol introduces a new variant of the chronic epileptogenesis models in freely moving rats, which is faster, highly reproducible and causes minimum cell damage with respect to that observed in other experimental models of epilepsy.
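    As a purely arithmetical companion to the stimulation schedule stated in the abstract (10 trains per day, 10 s at 50 Hz, 20-min inter-train intervals, 3 consecutive days), the short sketch below enumerates train onset times and total pulse counts; the session start times are hypothetical and the snippet is bookkeeping only, not a stimulation or acquisition script.

        # Schedule parameters taken from the abstract; session start times are hypothetical.
        TRAINS_PER_DAY = 10
        TRAIN_DURATION_S = 10      # seconds of stimulation per train
        PULSE_RATE_HZ = 50         # biphasic pulses per second
        INTER_TRAIN_MIN = 20       # minutes between train onsets
        DAYS = 3

        pulses_per_train = TRAIN_DURATION_S * PULSE_RATE_HZ       # 500 pulses per train
        total_pulses = pulses_per_train * TRAINS_PER_DAY * DAYS   # 15,000 pulses over 3 days

        for day in range(1, DAYS + 1):
            onsets_min = [i * INTER_TRAIN_MIN for i in range(TRAINS_PER_DAY)]
            print(f"day {day}: train onsets (min from session start) = {onsets_min}")

        print(f"pulses per train: {pulses_per_train}, total pulses: {total_pulses}")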

  11. The Role of Additional Pulses in Electropermeabilization Protocols

    PubMed Central

    Suárez, Cecilia; Soba, Alejandro; Maglietti, Felipe; Olaiz, Nahuel; Marshall, Guillermo

    2014-01-01

    Electropermeabilization (EP)-based protocols, such as those applied in medicine, food processing or environmental management, are well established and widely used. The applied voltage, as well as tissue electric conductivity, are of utmost importance for assessing final electropermeabilized area and thus EP effectiveness. Experimental results from the literature report that, under certain EP protocols, consecutive pulses increase tissue electric conductivity and even the permeabilization amount. Here we introduce a theoretical model that takes into account this effect in the application of an EP-based protocol, and its validation with experimental measurements. The theoretical model describes the electric field distribution by a nonlinear Laplace equation with a variable conductivity coefficient depending on the electric field, the temperature and the quantity of pulses, and Pennes' bioheat equation for temperature variations. In the experiments, a vegetable tissue model (potato slice) is used for measuring electric currents and tissue electropermeabilized area in different EP protocols. Experimental measurements show that, during sequential pulses and keeping constant the applied voltage, the electric current density and the blackened (electropermeabilized) area increase. This behavior can only be attributed to a rise in the electric conductivity due to a higher number of pulses. Accordingly, we present a theoretical modeling of an EP protocol that correctly predicts the increase in electric current density observed experimentally during the addition of pulses. The model also demonstrates that the electric current increase is due to a rise in the electric conductivity, in turn induced by temperature and pulse number, with no significant changes in the electric field distribution. The EP model introduced, based on a novel formulation of the electric conductivity, leads to a more realistic description of the EP phenomenon, hopefully providing more accurate predictions of treatment outcomes. PMID:25437512
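    For readers who want the model in symbols, a schematic statement of the coupled system described above is given here in LaTeX notation; the symbols are conventional choices and may differ from the authors' own.

        \nabla \cdot \big( \sigma(|\nabla \phi|,\, T,\, n)\, \nabla \phi \big) = 0, \qquad \mathbf{E} = -\nabla \phi,

        \rho c \, \frac{\partial T}{\partial t} = \nabla \cdot (k \nabla T) + \omega_b \rho_b c_b \,(T_a - T) + \sigma\, |\nabla \phi|^2,

    where sigma is the electric conductivity depending on the local field magnitude, the temperature T, and the number of applied pulses n, and the last term is the Joule heating source; the perfusion term is written in the standard Pennes form and may be absent in the vegetable-tissue experiments.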

  12. Experimental design to evaluate directed adaptive mutation in Mammalian cells.

    PubMed

    Bordonaro, Michael; Chiaro, Christopher R; May, Tobias

    2014-12-09

    We describe the experimental design for a methodological approach to determine whether directed adaptive mutation occurs in mammalian cells. Identification of directed adaptive mutation would have profound practical significance for a wide variety of biomedical problems, including disease development and resistance to treatment. In adaptive mutation, the genetic or epigenetic change is not random; instead, the presence and type of selection influences the frequency and character of the mutation event. Adaptive mutation can contribute to the evolution of microbial pathogenesis, cancer, and drug resistance, and may become a focus of novel therapeutic interventions. Our experimental approach was designed to distinguish between 3 types of mutation: (1) random mutations that are independent of selective pressure, (2) undirected adaptive mutations that arise when selective pressure induces a general increase in the mutation rate, and (3) directed adaptive mutations that arise when selective pressure induces targeted mutations that specifically influence the adaptive response. The purpose of this report is to introduce an experimental design and describe limited pilot experiment data (not to describe a complete set of experiments); hence, it is an early report. An experimental design based on immortalization of mouse embryonic fibroblast cells is presented that links clonal cell growth to reversal of an inactivating polyadenylation site mutation. Thus, cells exhibit growth only in the presence of both the countermutation and an inducing agent (doxycycline). The type and frequency of mutation in the presence or absence of doxycycline will be evaluated. Additional experimental approaches would determine whether the cells exhibit a generalized increase in mutation rate and/or whether the cells show altered expression of error-prone DNA polymerases or of mismatch repair proteins. We performed the initial stages of characterizing our system and have limited preliminary data from several pilot experiments. Cell growth and DNA sequence data indicate that we have identified a cell clone that exhibits several suitable characteristics, although further study is required to identify a more optimal cell clone. The experimental approach is based on a quantum biological model of basis-dependent selection describing a novel mechanism of adaptive mutation. This project is currently inactive due to lack of funding. However, consistent with the objective of early reports, we describe a proposed study that has not produced publishable results, but is worthy of report because of the hypothesis, experimental design, and protocols. We outline the project's rationale and experimental design, with its strengths and weaknesses, to stimulate discussion and analysis, and lay the foundation for future studies in this field.

  13. Guidance from an NIH Workshop on Designing, Implementing, and Reporting Clinical Studies of Soy Interventions1–4

    PubMed Central

    Klein, Marguerite A.; Nahin, Richard L.; Messina, Mark J.; Rader, Jeanne I.; Thompson, Lilian U.; Badger, Thomas M.; Dwyer, Johanna T.; Kim, Young S.; Pontzer, Carol H.; Starke-Reed, Pamela E.; Weaver, Connie M.

    2010-01-01

    The NIH sponsored a scientific workshop, “Soy Protein/Isoflavone Research: Challenges in Designing and Evaluating Intervention Studies,” July 28–29, 2009. The workshop goal was to provide guidance for the next generation of soy protein/isoflavone human research. Session topics included population exposure to soy; the variability of the human response to soy; product composition; methods, tools, and resources available to estimate exposure and protocol adherence; and analytical methods to assess soy in foods and supplements and analytes in biologic fluids and other tissues. The intent of the workshop was to address the quality of soy studies, not the efficacy or safety of soy. Prior NIH workshops and an evidence-based review questioned the quality of data from human soy studies. If clinical studies are pursued, investigators need to ensure that the experimental designs are optimal and the studies properly executed. The workshop participants identified methodological issues that may confound study results and interpretation. Scientifically sound and useful options for dealing with these issues were discussed. The resulting guidance is presented in this document with a brief rationale. The guidance is specific to soy clinical research and does not address nonsoy-related factors that should also be considered in designing and reporting clinical studies. This guidance may be used by investigators, journal editors, study sponsors, and protocol reviewers for a variety of purposes, including designing and implementing trials, reporting results, and interpreting published epidemiological and clinical studies. PMID:20392880

  14. Design and analysis issues for economic analysis alongside clinical trials.

    PubMed

    Marshall, Deborah A; Hux, Margaret

    2009-07-01

    Clinical trials can offer a valuable and efficient opportunity to collect the health resource use and outcomes data for economic evaluation. However, economic and clinical studies differ fundamentally in the question they seek to answer. The design and analysis of trial-based cost-effectiveness studies require special consideration, as reviewed in this article. Traditional randomized controlled trials, using an experimental design with a controlled protocol, are designed to measure safety and efficacy for product registration. Cost-effectiveness analysis seeks to measure effectiveness in the context of routine clinical practice, and requires collection of health care resources to allow estimation of cost over an equal timeframe for each treatment alternative. In assessing suitability of a trial for economic data collection, the comparator treatment and other protocol factors need to reflect current clinical practice and the trial follow-up must be sufficiently long to capture important costs and effects. The broadest available population and a measure of effectiveness reflecting important benefits for patients are preferred for economic analyses. Special analytical issues include dealing with missing and censored cost data, assessing uncertainty of the incremental cost-effectiveness ratio, and accounting for the underlying heterogeneity in patient subgroups. Careful consideration also needs to be given to data from multinational studies since practice patterns can differ across countries. Although clinical trials can be an efficient opportunity to collect data for economic evaluation, careful consideration of the suitability of the study design is required, and appropriate analytical methods must be applied to obtain rigorous results.
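    One of the analytical issues listed above, assessing the uncertainty of the incremental cost-effectiveness ratio (ICER), is commonly handled with a nonparametric bootstrap over patient-level cost and effect pairs. The sketch below illustrates that idea on made-up per-patient data; the sample sizes, distributions, and percentile interval are illustrative assumptions rather than recommendations from the article, and net-benefit or acceptability-curve approaches are often preferred when the effect difference can change sign.

        import numpy as np

        rng = np.random.default_rng(42)

        def bootstrap_icer(cost_new, eff_new, cost_old, eff_old, n_boot=5000):
            """Percentile bootstrap of the incremental cost-effectiveness ratio."""
            icers = []
            for _ in range(n_boot):
                i = rng.integers(0, len(cost_new), len(cost_new))   # resample new-treatment arm
                j = rng.integers(0, len(cost_old), len(cost_old))   # resample comparator arm
                d_cost = cost_new[i].mean() - cost_old[j].mean()
                d_eff = eff_new[i].mean() - eff_old[j].mean()
                if d_eff != 0:
                    icers.append(d_cost / d_eff)
            return np.percentile(icers, [2.5, 50, 97.5])

        # Hypothetical per-patient costs (currency units) and effects (e.g., QALYs).
        cost_new = rng.normal(12000, 3000, 150); eff_new = rng.normal(1.9, 0.5, 150)
        cost_old = rng.normal(9000, 2500, 150);  eff_old = rng.normal(1.6, 0.5, 150)
        print("ICER (2.5th, 50th, 97.5th percentile):",
              bootstrap_icer(cost_new, eff_new, cost_old, eff_old))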

  15. Experimental Demonstration of a Cheap and Accurate Phase Estimation

    DOE PAGES

    Rudinger, Kenneth; Kimmel, Shelby; Lobser, Daniel; ...

    2017-05-11

    We demonstrate an experimental implementation of robust phase estimation (RPE) to learn the phase of a single-qubit rotation on a trapped Yb+ ion qubit. Here, we show this phase can be estimated with an uncertainty below 4 × 10^-4 rad using as few as 176 total experimental samples, and our estimates exhibit Heisenberg scaling. Unlike standard phase estimation protocols, RPE neither assumes perfect state preparation and measurement, nor requires access to ancillae. We cross-validate the results of RPE with the more resource-intensive protocol of gate set tomography.
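    Robust phase estimation reaches Heisenberg-like scaling by combining measurement records from exponentially growing numbers of gate repetitions, with each generation refining and unwrapping the running estimate. The sketch below is a toy classical simulation of that unwrapping logic under idealized sampling assumptions; the shot counts and generation count are arbitrary, and it is not the authors' estimator or experimental pipeline.

        import numpy as np

        rng = np.random.default_rng(7)

        def toy_rpe(theta_true, generations=8, shots=30):
            """Toy robust-phase-estimation loop: each generation measures the phase
            accumulated by 2**k repetitions and unwraps it against the running estimate."""
            est = 0.0
            for k in range(generations):
                reps = 2 ** k
                # Idealized "cosine" and "sine" measurement circuits for the repeated rotation.
                p_cos = (1 + np.cos(reps * theta_true)) / 2
                p_sin = (1 + np.sin(reps * theta_true)) / 2
                c = rng.binomial(shots, p_cos) / shots
                s = rng.binomial(shots, p_sin) / shots
                wrapped = np.arctan2(2 * s - 1, 2 * c - 1)           # estimate of reps*theta mod 2*pi
                m = np.round((reps * est - wrapped) / (2 * np.pi))   # branch consistent with prior estimate
                est = (wrapped + 2 * np.pi * m) / reps
            return est

        theta = 0.481
        print("true phase:", theta, "estimated phase:", toy_rpe(theta))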

  16. Variability in Criteria for Emergency Medical Services Routing of Acute Stroke Patients to Designated Stroke Center Hospitals.

    PubMed

    Dimitrov, Nikolay; Koenig, William; Bosson, Nichole; Song, Sarah; Saver, Jeffrey L; Mack, William J; Sanossian, Nerses

    2015-09-01

    Comprehensive stroke systems of care include routing to the nearest designated stroke center hospital, bypassing non-designated hospitals. Routing protocols are implemented at the state or county level and vary in qualification criteria and determination of destination hospital. We surveyed all counties in the state of California for presence and characteristics of their prehospital stroke routing protocols. Each county's local emergency medical services agency (LEMSA) was queried for the presence of a stroke routing protocol. We reviewed these protocols for method of stroke identification and criteria for patient transport to a stroke center. Thirty-three LEMSAs serve 58 counties in California with populations ranging from 1,175 to nearly 10 million. Fifteen LEMSAs (45%) had stroke routing protocols, covering 23 counties (40%) and 68% of the state population. Counties with protocols had higher population density (1,500 vs. 140 persons per square mile). In the six counties without designated stroke centers, patients meeting criteria were transported out of county. Stroke identification in the field was achieved using the Cincinnati Prehospital Stroke Screen in 72%, Los Angeles Prehospital Stroke Screen in 7% and a county-specific protocol in 22%. California EMS prehospital acute stroke routing protocols cover 68% of the state population and vary in characteristics including activation by symptom onset time and destination facility features, reflecting matching of system design to local geographic resources.

  17. Mathematical Model Formulation And Validation Of Water And Solute Transport In Whole Hamster Pancreatic Islets

    PubMed Central

    Benson, Charles T.; Critser, John K.

    2014-01-01

    Optimization of cryopreservation protocols for cells and tissues requires accurate models of heat and mass transport. Model selection often depends on the configuration of the tissue. Here, a mathematical and conceptual model of water and solute transport for whole hamster pancreatic islets has been developed and experimentally validated incorporating fundamental biophysical data from previous studies on individual hamster islet cells while retaining whole-islet structural information. It describes coupled transport of water and solutes through the islet by three methods: intracellularly, intercellularly, and in combination. In particular we use domain decomposition techniques to couple a transmembrane flux model with an interstitial mass transfer model. The only significant undetermined variable is the cellular surface area which is in contact with the intercellularly transported solutes, A_is. The model was validated and A_is determined using a 3 × 3 factorial experimental design blocked for experimental day. Whole islet physical experiments were compared with model predictions at three temperatures, three perfusing solutions, and three islet size groups. A mean of 4.4 islets were compared at each of the 27 experimental conditions and found to correlate with a coefficient of determination of 0.87 ± 0.06 (mean ± S.D.). Only the treatment variable of perfusing solution was found to be significant (p < 0.05). We have devised a model that retains much of the intrinsic geometric configuration of the system, and thus fewer laboratory experiments are needed to determine model parameters and thus to develop new optimized cryopreservation protocols. Additionally, extensions to ovarian follicles and other concentric tissue structures may be made. PMID:24950195
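    The transmembrane flux component of such models is commonly written in the two-parameter (2P) formalism: water volume changes with the osmotic imbalance (hydraulic conductivity Lp) and a permeating solute moves down its concentration gradient (permeability Ps). The sketch below integrates that textbook formalism for a single compartment with constant surface area using forward Euler; the parameter values are generic illustrations, not the fitted hamster-islet values, and the domain-decomposed whole-islet coupling of the paper is not reproduced.

        # Two-parameter (2P) membrane transport sketch: water moves with the osmotic
        # gradient (Lp), a permeating solute such as a cryoprotectant moves down its
        # concentration gradient (Ps). Illustrative, roughly physiological numbers only.
        Lp, Ps = 3.0e-14, 5.0e-8        # m/(Pa*s), m/s
        A = 1.0e-9                      # membrane surface area, m^2 (held constant)
        R, T = 8.314, 295.0             # gas constant, temperature
        Vw = 1.0e-15                    # osmotically active water volume, m^3
        N_imp = 300.0 * Vw              # impermeant internal osmoles (300 mol/m^3 at t = 0)
        N_cpa = 0.0                     # internal moles of permeating solute
        c_salt_ext, c_cpa_ext = 300.0, 1000.0   # external concentrations, mol/m^3

        dt, t_end = 0.01, 120.0
        trajectory = []
        for step in range(int(t_end / dt)):
            c_cpa_int = N_cpa / Vw
            osm_int = (N_imp + N_cpa) / Vw
            osm_ext = c_salt_ext + c_cpa_ext
            dVw = -Lp * A * R * T * (osm_ext - osm_int) * dt   # water efflux if outside is hypertonic
            dN = Ps * A * (c_cpa_ext - c_cpa_int) * dt         # solute influx down its gradient
            Vw += dVw
            N_cpa += dN
            trajectory.append(Vw)

        print("minimum relative water volume:", min(trajectory) / 1.0e-15)
        print("final relative water volume:  ", trajectory[-1] / 1.0e-15)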

  18. Update on the MRI Core of the Alzheimer's Disease Neuroimaging Initiative

    PubMed Central

    Jack, Clifford R; Bernstein, Matt A; Borowski, Bret J; Gunter, Jeffrey L; Fox, Nick C; Thompson, Paul M; Schuff, Norbert; Krueger, Gunnar; Killiany, Ronald J; DeCarli, Charles S; Dale, Anders M; Weiner, Michael W

    2010-01-01

    Functions of the ADNI MRI core fall into three categories: (1) those of the central MRI core lab at Mayo Clinic, Rochester, Minnesota, needed to generate high quality MRI data in all subjects at each time point; (2) those of the funded ADNI MRI core imaging analysis groups responsible for analyzing the MRI data, and (3) the joint function of the entire MRI core in designing and problem solving MR image acquisition, pre-processing and analyses methods. The primary objective of ADNI was and continues to be improving methods for clinical trials in Alzheimer's disease. Our approach to the present (“ADNI-GO”) and future (“ADNI-2”, if funded) MRI protocol will be to maintain MRI methodological consistency in previously enrolled “ADNI-1” subjects who are followed longitudinally in ADNI-GO and ADNI-2. We will modernize and expand the MRI protocol for all newly enrolled ADNI-GO and ADNI-2 subjects. All newly enrolled subjects will be scanned at 3T with a core set of three sequence types: 3D T1-weighted volume, FLAIR, and a long TE gradient echo volumetric acquisition for micro hemorrhage detection. In addition to this core ADNI-GO and ADNI-2 protocol, we will perform vendor specific pilot sub-studies of arterial spin labeling perfusion, resting state functional connectivity and diffusion tensor imaging. One each of these sequences will be added to the core protocol on systems from each MRI vendor. These experimental sub-studies are designed to demonstrate the feasibility of acquiring useful data in a multi-center (but single vendor) setting for these three emerging MRI applications. PMID:20451869

  19. Update on the magnetic resonance imaging core of the Alzheimer's disease neuroimaging initiative.

    PubMed

    Jack, Clifford R; Bernstein, Matt A; Borowski, Bret J; Gunter, Jeffrey L; Fox, Nick C; Thompson, Paul M; Schuff, Norbert; Krueger, Gunnar; Killiany, Ronald J; Decarli, Charles S; Dale, Anders M; Carmichael, Owen W; Tosun, Duygu; Weiner, Michael W

    2010-05-01

    Functions of the Alzheimer's Disease Neuroimaging Initiative (ADNI) magnetic resonance imaging (MRI) core fall into three categories: (1) those of the central MRI core laboratory at Mayo Clinic, Rochester, Minnesota, needed to generate high quality MRI data in all subjects at each time point; (2) those of the funded ADNI MRI core imaging analysis groups responsible for analyzing the MRI data; and (3) the joint function of the entire MRI core in designing and problem solving MR image acquisition, pre-processing, and analyses methods. The primary objective of ADNI was and continues to be improving methods for clinical trials in Alzheimer's disease. Our approach to the present ("ADNI-GO") and future ("ADNI-2," if funded) MRI protocol will be to maintain MRI methodological consistency in the previously enrolled "ADNI-1" subjects who are followed up longitudinally in ADNI-GO and ADNI-2. We will modernize and expand the MRI protocol for all newly enrolled ADNI-GO and ADNI-2 subjects. All newly enrolled subjects will be scanned at 3T with a core set of three sequence types: 3D T1-weighted volume, FLAIR, and a long TE gradient echo volumetric acquisition for micro hemorrhage detection. In addition to this core ADNI-GO and ADNI-2 protocol, we will perform vendor-specific pilot sub-studies of arterial spin-labeling perfusion, resting state functional connectivity, and diffusion tensor imaging. One of these sequences will be added to the core protocol on systems from each MRI vendor. These experimental sub-studies are designed to demonstrate the feasibility of acquiring useful data in a multicenter (but single vendor) setting for these three emerging MRI applications.

  20. Feasibility and Emotional Impact of Experimentally Extending Sleep in Short-Sleeping Adolescents.

    PubMed

    Van Dyk, Tori R; Zhang, Nanhua; Catlin, Perry A; Cornist, Kaylin; McAlister, Shealan; Whitacre, Catharine; Beebe, Dean W

    2017-09-01

    Published experimental sleep manipulation protocols for adolescents have been limited to the summer, limiting causal conclusions about how short sleep affects them on school nights, when they are most likely to restrict their sleep. This study assesses the feasibility and emotional impact of a school-night sleep manipulation protocol to test the effects of lengthening sleep in habitually short-sleeping adolescents. High school students aged 14-18 years who habitually slept 5-7 hours on school nights participated in a 5-week experimental sleep manipulation protocol. Participants completed a baseline week followed in randomized counterbalanced order by two experimental conditions lasting 2 weeks each: prescribed habitual sleep (HAB; sleep time set to match baseline) and sleep extension (EXT; 1.5-hour increase in time in bed from HAB). All sleep was obtained at home, monitored with actigraphy. Data on adherence, protocol acceptability, mood and behavior were collected at the end of each condition. Seventy-six adolescents enrolled in the study, with 54 retained through all 5 weeks. Compared to HAB, during EXT, participants averaged an additional 72.6 minutes/night of sleep (p < .001) and had reduced symptoms of sleepiness, anger, vigor, fatigue, and confusion (p < .05). The large majority of parents (98%) and adolescents (100%) said they would "maybe" or "definitely" recommend the study to another family. An experimental, school-night sleep manipulation protocol can be feasibly implemented which directly tests the potential protective effects of lengthening sleep. Many short-sleeping adolescents would benefit emotionally from sleeping longer, supporting public health efforts to promote adolescent sleep on school nights.

  1. Analysis of the contribution of experimental bias, experimental noise, and inter-subject biological variability on the assessment of developmental trajectories in diffusion MRI studies of the brain.

    PubMed

    Sadeghi, Neda; Nayak, Amritha; Walker, Lindsay; Okan Irfanoglu, M; Albert, Paul S; Pierpaoli, Carlo

    2015-04-01

    Metrics derived from the diffusion tensor, such as fractional anisotropy (FA) and mean diffusivity (MD) have been used in many studies of postnatal brain development. A common finding of previous studies is that these tensor-derived measures vary widely even in healthy populations. This variability can be due to inherent inter-individual biological differences as well as experimental noise. Moreover, when comparing different studies, additional variability can be introduced by different acquisition protocols. In this study we examined scans of 61 individuals (aged 4-22 years) from the NIH MRI study of normal brain development. Two scans were collected with different protocols (low and high resolution). Our goal was to separate the contributions of biological variability and experimental noise to the overall measured variance, as well as to assess potential systematic effects related to the use of different protocols. We analyzed FA and MD in seventeen regions of interest. We found that biological variability for both FA and MD varies widely across brain regions; biological variability is highest for FA in the lateral part of the splenium and body of the corpus callosum along with the cingulum and the superior longitudinal fasciculus, and for MD in the optic radiations and the lateral part of the splenium. These regions with high inter-individual biological variability are the most likely candidates for assessing genetic and environmental effects in the developing brain. With respect to protocol-related effects, the lower resolution acquisition resulted in higher MD and lower FA values for the majority of regions compared with the higher resolution protocol. However, the majority of the regions did not show any age-protocol interaction, indicating similar trajectories were obtained irrespective of the protocol used.
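    For reference, the two tensor-derived metrics analyzed in this study are simple functions of the diffusion tensor eigenvalues; the sketch below shows their standard definitions (the study's region-of-interest statistics and variance decomposition are not reproduced, and the example eigenvalues are illustrative).

        import numpy as np

        def md_fa(eigenvalues):
            """Mean diffusivity and fractional anisotropy from the three
            eigenvalues of a diffusion tensor."""
            lam = np.asarray(eigenvalues, dtype=float)
            md = lam.mean()
            fa = np.sqrt(1.5 * np.sum((lam - md) ** 2) / np.sum(lam ** 2))
            return md, fa

        # Example: a prolate tensor typical of coherent white matter (units: mm^2/s).
        print(md_fa([1.6e-3, 0.4e-3, 0.4e-3]))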

  2. Quantum computing on encrypted data

    NASA Astrophysics Data System (ADS)

    Fisher, K. A. G.; Broadbent, A.; Shalm, L. K.; Yan, Z.; Lavoie, J.; Prevedel, R.; Jennewein, T.; Resch, K. J.

    2014-01-01

    The ability to perform computations on encrypted data is a powerful tool for protecting privacy. Recently, protocols to achieve this on classical computing systems have been found. Here, we present an efficient solution to the quantum analogue of this problem that enables arbitrary quantum computations to be carried out on encrypted quantum data. We prove that an untrusted server can implement a universal set of quantum gates on encrypted quantum bits (qubits) without learning any information about the inputs, while the client, knowing the decryption key, can easily decrypt the results of the computation. We experimentally demonstrate, using single photons and linear optics, the encryption and decryption scheme on a set of gates sufficient for arbitrary quantum computations. As our protocol requires few extra resources compared with other schemes it can be easily incorporated into the design of future quantum servers. These results will play a key role in enabling the development of secure distributed quantum systems.

  3. Quantum computing on encrypted data.

    PubMed

    Fisher, K A G; Broadbent, A; Shalm, L K; Yan, Z; Lavoie, J; Prevedel, R; Jennewein, T; Resch, K J

    2014-01-01

    The ability to perform computations on encrypted data is a powerful tool for protecting privacy. Recently, protocols to achieve this on classical computing systems have been found. Here, we present an efficient solution to the quantum analogue of this problem that enables arbitrary quantum computations to be carried out on encrypted quantum data. We prove that an untrusted server can implement a universal set of quantum gates on encrypted quantum bits (qubits) without learning any information about the inputs, while the client, knowing the decryption key, can easily decrypt the results of the computation. We experimentally demonstrate, using single photons and linear optics, the encryption and decryption scheme on a set of gates sufficient for arbitrary quantum computations. As our protocol requires few extra resources compared with other schemes it can be easily incorporated into the design of future quantum servers. These results will play a key role in enabling the development of secure distributed quantum systems.
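    The encryption step in schemes of this kind is typically the quantum one-time pad: the client applies X^a Z^b with secret random bits (a, b), so that without the key the state looks maximally mixed, while the key holder can invert the pad exactly. The sketch below checks both properties for a single qubit with plain linear algebra; it illustrates the padding idea only, not the delegated-gate protocol or the key-update rules of the paper.

        import numpy as np

        I = np.eye(2)
        X = np.array([[0, 1], [1, 0]], dtype=complex)
        Z = np.array([[1, 0], [0, -1]], dtype=complex)

        def pad(a, b):
            """Quantum one-time pad operator X^a Z^b for key bits (a, b)."""
            return np.linalg.matrix_power(X, a) @ np.linalg.matrix_power(Z, b)

        psi = np.array([np.cos(0.3), np.exp(1j * 0.7) * np.sin(0.3)])   # arbitrary qubit state

        # Averaging the encrypted density matrix over all four keys gives I/2:
        # without the key, the server learns nothing about psi.
        rho_avg = sum(pad(a, b) @ np.outer(psi, psi.conj()) @ pad(a, b).conj().T
                      for a in (0, 1) for b in (0, 1)) / 4
        print(np.allclose(rho_avg, I / 2))   # True

        # With the key, decryption is exact: Z^b X^a undoes X^a Z^b (up to a global phase).
        a, b = 1, 1
        decrypted = np.linalg.matrix_power(Z, b) @ np.linalg.matrix_power(X, a) @ (pad(a, b) @ psi)
        print(np.allclose(np.abs(np.vdot(decrypted, psi)), 1.0))   # True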

  4. Microencapsulation of Hepatocytes and Mesenchymal Stem Cells for Therapeutic Applications.

    PubMed

    Meier, Raphael P H; Montanari, Elisa; Morel, Philippe; Pimenta, Joël; Schuurman, Henk-Jan; Wandrey, Christine; Gerber-Lemaire, Sandrine; Mahou, Redouan; Bühler, Leo H

    2017-01-01

    Encapsulated hepatocyte transplantation and encapsulated mesenchymal stem cell transplantation are newly developed potential treatments for acute and chronic liver diseases, respectively. Cells are microencapsulated in biocompatible semipermeable alginate-based hydrogels. Microspheres protect cells against antibodies and immune cells, while allowing nutrients, small/medium size proteins and drugs to diffuse inside and outside the polymer matrix. Microencapsulated cells are assessed in vitro and designed for experimental transplantation and for future clinical applications. Here, we describe the protocol for microencapsulation of hepatocytes and mesenchymal stem cells within hybrid poly(ethylene glycol)-alginate hydrogels.

  5. High-fidelity readout in circuit quantum electrodynamics using the Jaynes-Cummings nonlinearity.

    PubMed

    Reed, M D; DiCarlo, L; Johnson, B R; Sun, L; Schuster, D I; Frunzio, L; Schoelkopf, R J

    2010-10-22

    We demonstrate a qubit readout scheme that exploits the Jaynes-Cummings nonlinearity of a superconducting cavity coupled to transmon qubits. We find that, in the strongly driven dispersive regime of this system, there is the unexpected onset of a high-transmission "bright" state at a critical power which depends sensitively on the initial qubit state. A simple and robust measurement protocol exploiting this effect achieves a single-shot fidelity of 87% using a conventional sample design and experimental setup, and at least 61% fidelity to joint correlations of three qubits.
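    For reference, the nonlinearity exploited here originates from the Jaynes-Cummings coupling between the cavity mode and the qubit; in standard notation (not necessarily the authors'), the Hamiltonian reads, in LaTeX:

        H = \hbar \omega_r \left( a^{\dagger} a + \tfrac{1}{2} \right) + \frac{\hbar \omega_q}{2} \sigma_z + \hbar g \left( a^{\dagger} \sigma^{-} + a \sigma^{+} \right),

    where omega_r and omega_q are the cavity and qubit frequencies and g their coupling; the anharmonic dressed-level structure of this Hamiltonian is what makes the critical power for the high-transmission "bright" state depend on the qubit state.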

  6. Confocal Raman Microspectroscopy: The Measurement of VX Depth Profiles in Hairless Guinea Pig Skin and the Evaluation of RSDL

    DTIC Science & Technology

    2015-02-01

    The experimental protocol was approved by the Animal Care and Use Committee at the United States Army Medical Research... Laboratory Animals and the Animal Welfare Act of 1966 (P.L. 89-544), as amended. ...neat VX (0.3 µl, 13-14 x LD50), using a specially designed template that allowed repeated Raman measurements on the same skin location. Animals were...

  7. Experimental support for an immunological approach to the search for life on other planets.

    PubMed

    Schweitzer, Mary Higby; Wittmeyer, Jennifer; Avci, Recep; Pincus, Seth

    2005-02-01

    We propose a three-phase approach to test for evidence of life in extraterrestrial samples. The approach capitalizes on the flexibility, sensitivity, and specificity of antibody-antigen interactions. Data are presented to support the first phase, in which various extraction protocols are compared for efficiency, and in which a preliminary suite of antibodies are tested against various antigens. The antigens and antibodies were chosen on the basis of criteria designed to optimize the detection of extraterrestrial biomarkers unique to living or once-living organisms.

  8. Representing the work of medical protocols for organizational simulation.

    PubMed Central

    Fridsma, D. B.

    1998-01-01

    Developing and implementing patient care protocols within a specific organizational setting requires knowledge of the protocol, the organization, and the way in which the organization does its work. Computer-based simulation tools have been used in many industries to provide managers with prospective insight into problems of work process and organization design mismatch. Many of these simulation tools are designed for well-understood routine work processes in which there are few contingent tasks. In this paper, we describe theoretical extensions that make it possible to simulate medical protocols using an information-processing theory framework. These simulations will allow medical administrators to test different protocol and organizational designs before actually using them within a particular clinical setting. PMID:9929231

  9. Operator assistant systems - An experimental approach using a telerobotics application

    NASA Technical Reports Server (NTRS)

    Boy, Guy A.; Mathe, Nathalie

    1993-01-01

    This article presents a knowledge-based system methodology for developing operator assistant (OA) systems in dynamic and interactive environments. This is a problem both of training and design, which is the subject of this article. Design includes both design of the system to be controlled and design of procedures for operating this system. A specific knowledge representation is proposed for representing the corresponding system and operational knowledge. This representation is based on the situation recognition and analytical reasoning paradigm. It tries to make explicit common factors involved in both human and machine intelligence, including perception and reasoning. An OA system based on this representation has been developed for space telerobotics. Simulations have been carried out with astronauts and the resulting protocols have been analyzed. Results show the relevance of the approach and have been used for improving the knowledge representation and the OA architecture.

  10. Enriching peptide libraries for binding affinity and specificity through computationally directed library design

    PubMed Central

    Foight, Glenna Wink; Chen, T. Scott; Richman, Daniel; Keating, Amy E.

    2017-01-01

    Peptide reagents with high affinity or specificity for their target protein interaction partner are of utility for many important applications. Optimization of peptide binding by screening large libraries is a proven and powerful approach. Libraries designed to be enriched in peptide sequences that are predicted to have desired affinity or specificity characteristics are more likely to yield success than random mutagenesis. We present a library optimization method in which the choice of amino acids to encode at each peptide position can be guided by available experimental data or structure-based predictions. We discuss how to use analysis of predicted library performance to inform rounds of library design. Finally, we include protocols for more complex library design procedures that consider the chemical diversity of the amino acids at each peptide position and optimize a library score based on a user-specified input model. PMID:28236241

  11. Enriching Peptide Libraries for Binding Affinity and Specificity Through Computationally Directed Library Design.

    PubMed

    Foight, Glenna Wink; Chen, T Scott; Richman, Daniel; Keating, Amy E

    2017-01-01

    Peptide reagents with high affinity or specificity for their target protein interaction partner are of utility for many important applications. Optimization of peptide binding by screening large libraries is a proven and powerful approach. Libraries designed to be enriched in peptide sequences that are predicted to have desired affinity or specificity characteristics are more likely to yield success than random mutagenesis. We present a library optimization method in which the choice of amino acids to encode at each peptide position can be guided by available experimental data or structure-based predictions. We discuss how to use analysis of predicted library performance to inform rounds of library design. Finally, we include protocols for more complex library design procedures that consider the chemical diversity of the amino acids at each peptide position and optimize a library score based on a user-specified input model.
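    To make the idea of choosing which amino acids to encode at each position concrete, the toy sketch below does a greedy budget allocation: it starts from the top-scoring residue at every position and keeps admitting the next best-scoring residue anywhere in the peptide as long as the combinatorial library size stays within the screening capacity. The scores, positions, and budget are entirely hypothetical, and the procedure is far simpler than the chapter's optimization framework (which also handles degenerate codons and user-specified scoring models).

        from math import prod

        # Hypothetical per-position scores: higher = more likely to give binders.
        position_scores = {
            1: {"A": 0.9, "L": 0.7, "S": 0.2, "D": 0.1},
            2: {"W": 0.8, "F": 0.8, "Y": 0.6, "G": 0.1},
            3: {"E": 0.9, "D": 0.5, "N": 0.4, "K": 0.2},
        }
        LIBRARY_BUDGET = 24   # maximum number of distinct sequences we can screen

        # Start from the single best residue per position, then greedily add the residue
        # (anywhere in the peptide) with the highest score that keeps the library in budget.
        allowed = {pos: [max(scores, key=scores.get)] for pos, scores in position_scores.items()}
        candidates = sorted(
            ((score, pos, aa) for pos, scores in position_scores.items()
             for aa, score in scores.items() if aa not in allowed[pos]),
            reverse=True)
        for score, pos, aa in candidates:
            new_size = prod(len(allowed[p]) + (1 if p == pos else 0) for p in allowed)
            if new_size <= LIBRARY_BUDGET:
                allowed[pos].append(aa)

        print(allowed, "library size:", prod(len(v) for v in allowed.values()))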

  12. Design of clinical trials for new drugs in animals.

    PubMed

    Muser, R K

    1980-05-15

    Consideration of the intended purpose, an effort to ask specific questions that can be answered by specific data, selection of the correct values to be measured, and investigators who are able to conduct the study, will make it possible to design clinical studies properly. The ethical need to avoid unnecessary suffering of experimental animals and rules and regulations in effect at the time of clinical trials will dictate certain aspects of the design. The type of product to be evaluated, its mode of administration, the type of animal in which the drug will be used, the type of investigator who will collect the data, and the individual who will use the drug, will determine the mechanical part of the protocol. All of these factors need to be considered and discussed with all participants, which may include biostatisticians, regulatory authorities, investigators, and other individuals, to ensure that a rational design is chosen.

  13. Preventive Neuromuscular Training for Young Female Athletes: Comparison of Coach and Athlete Compliance Rates

    PubMed Central

    Sugimoto, Dai; Mattacola, Carl G.; Bush, Heather M.; Thomas, Staci M.; Foss, Kim D. Barber; Myer, Gregory D.; Hewett, Timothy E.

    2017-01-01

    Context: Fewer athletic injuries and lower anterior cruciate ligament injury incidence rates were noted in studies of neuromuscular-training (NMT) interventions that had high compliance rates. However, several groups have demonstrated that preventive NMT interventions were limited by low compliance rates. Objective: To descriptively analyze coach and athlete compliance with preventive NMT and compare the compliance between study arms as well as among school levels and sports. Design: Randomized, controlled clinical trial. Setting: Middle and high school athletic programs. Participants or Other Participants: A total of 52 teams, comprising 547 female athletes, were randomly assigned to the experimental or control group and followed for 1 athletic season. Intervention(s): The experimental group (n = 30 teams [301 athletes]: 12 basketball teams [125 athletes], 6 soccer teams [74 athletes], and 12 volleyball teams [102 athletes]) participated in an NMT program aimed at reducing traumatic knee injuries through a trunk-stabilization and hip-strengthening program. The control group (n = 22 teams [246 athletes]: 11 basketball teams [116 athletes], 5 soccer teams [68 athletes], and 6 volleyball teams [62 athletes]) performed a resistive rubber-band running program. Main Outcome Measure(s): Compliance with the assigned intervention protocols (3 times per week during the preseason [mean = 3.4 weeks] and 2 times per week in-season [mean = 11.9 weeks] of coaches [coach compliance] and athletes [athlete compliance]) was measured descriptively. Using an independent t test, we compared coach and athlete compliance between the study arms. A 2-way analysis of variance was calculated to compare differences between coach and athlete compliance by school level (middle and high schools) and sport (basketball, soccer, and volleyball). Results: The protocols were completed at a mean rate of 1.3 ± 1.1 times per week during the preseason and 1.2 ± 0.5 times per week in-season. A total of 88.4% of athletes completed 2/3 of the intervention sessions. Coach compliance was greater in the experimental group than in the control group (P = .014). Coach compliance did not differ by sport but was greater at the high school than the middle school (P = .001) level. Athlete compliance did not differ by study arm, sport, or school level. Conclusions: Athletes received instruction in about 50% of each protocol. Nearly 90% of athletes performed more than 2/3 of the assigned NMT interventions. The assigned intervention was performed more often in the experimental arm compared with the control arm. Coaches at the high school level complied with the given protocol more than middle school coaches did. Athletes complied well with the protocol, but coaches did not, especially at the middle school level. PMID:27977300

  14. The Spanish Version of the Unified Protocol for Transdiagnostic Treatment of Emotional Disorders in Adolescents (UP-A) Adapted as a School-Based Anxiety and Depression Prevention Program: Study Protocol for a Cluster Randomized Controlled Trial.

    PubMed

    García-Escalera, Julia; Valiente, Rosa M; Chorot, Paloma; Ehrenreich-May, Jill; Kennedy, Sarah M; Sandín, Bonifacio

    2017-08-21

    Anxiety and depression are common, impairing conditions that evidence high comorbidity rates in adolescence. The Unified Protocol for Transdiagnostic Treatment of Emotional Disorders in Adolescents (UP-A) is one of the few existing resources aimed at applying transdiagnostic treatment principles to target core dysfunctions associated with both anxiety and depression within a single protocol. To our knowledge, this is the first study examining the efficacy of the UP-A adapted as a universal preventive intervention program. The primary aim of this study is to examine whether the Spanish version of the UP-A is more effective than a waitlist (WL) control group in reducing and preventing symptoms of anxiety and depression when employed as a universal, classroom-based preventive intervention. The secondary aim is to investigate changes in a broad range of secondary outcome measures, including negative and positive affect, anxiety sensitivity, emotional avoidance, top problems ratings, school grades, depression and anxiety-related interference, self-esteem, life satisfaction, quality of life, conduct problems, hyperactivity/inattention symptoms, peer problems, prosocial behavior, school adjustment, and discipline problems. Other aims are to assess a range of possible predictors of intervention effects and to examine the feasibility and the acceptability of implementing UP-A in a prevention group format and in a school setting. A cluster, randomized, WL, controlled trial design with classroom as the unit of randomization was used in this study. Five classes including a total of 152 adolescents were randomized to the experimental or WL control groups. Participants in the experimental group received 9 55-minute sessions delivered by advanced doctoral and masters students in clinical psychology. The WL control group will receive the intervention once the 3-month follow-up assessment is completed. We have recruited participants to the cluster randomized controlled trial (RCT) and have conducted the intervention with the experimental group. We expect the WL control group to complete the intervention in July 2017. Data analysis will take place during the second semester of 2017. We expect the experimental group to outperform the WL control group at post-intervention and 3-month follow-up. We also expect the WL control group to show improvements in primary and secondary outcome measures after receiving the intervention. Results will have implications for researchers, families, and education providers. Clinicaltrials.gov NCT03123991; https://clinicaltrials.gov/ct2/show/NCT03123991 (Archived by WebCite at http://www.webcitation.org/6qp7GIzcR).

  15. Cognitive Protocol Stack Design

    DTIC Science & Technology

    2015-12-30

    In the ARO "Cognitive Protocol Stack Design" project we proposed cognitive networking solutions published in international... areas related to cognitive networking, opening new lines of research that it was not possible to forecast at the beginning of the project.

  16. j5 DNA assembly design automation.

    PubMed

    Hillson, Nathan J

    2014-01-01

    Modern standardized methodologies, described in detail in the previous chapters of this book, have enabled the software-automated design of optimized DNA construction protocols. This chapter describes how to design (combinatorial) scar-less DNA assembly protocols using the web-based software j5. j5 assists biomedical and biotechnological researchers in constructing DNA by automating the design of optimized protocols for flanking homology sequence as well as type IIS endonuclease-mediated DNA assembly methodologies. Unlike any other software tool available today, j5 designs scar-less combinatorial DNA assembly protocols, performs a cost-benefit analysis to identify which portions of an assembly process would be less expensive to outsource to a DNA synthesis service provider, and designs hierarchical DNA assembly strategies to mitigate anticipated poor assembly junction sequence performance. Software integrated with j5 adds significant value to the j5 design process through graphical user-interface enhancement and downstream liquid-handling robotic laboratory automation.
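    To illustrate the kind of task such software automates, the sketch below generates junction primers for a flanking-homology (overlap-based) assembly of an ordered list of parts: each forward primer carries a 5' tail copied from the end of the upstream part so that adjacent fragments share homology. The sequences, overlap length, and annealing length are hypothetical, and the sketch omits everything that makes j5 useful in practice (melting-temperature optimization, cost analysis, combinatorial scar-less design, hierarchical planning).

        def reverse_complement(seq: str) -> str:
            return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

        def overlap_primers(parts, overlap=20, anneal=20):
            """For each part in a circular assembly, emit a forward primer whose 5' tail
            is the 3' end of the previous part, creating shared flanking homology."""
            primers = []
            for i, (name, seq) in enumerate(parts):
                prev_seq = parts[i - 1][1]                 # circular: part 0 overlaps the last part
                tail = prev_seq[-overlap:]                 # homology to the upstream neighbour
                fwd = tail + seq[:anneal]                  # 5' homology tail + annealing region
                rev = reverse_complement(seq[-anneal:])    # plain reverse primer on this part
                primers.append((name, fwd, rev))
            return primers

        # Short hypothetical sequences standing in for promoter / gene / terminator parts.
        parts = [
            ("promoter",   "ATGCGTACGTTAGCATCGATCGTACGATCGATCGGATCCATGCATCGATCGTACGATCG"),
            ("gene",       "ATGAAAGCTGCTAGCTAGCTACGATCGATCGATCGATCGTAGCTAGCTAGCATCGATCG"),
            ("terminator", "TTAGCGCGCGATATCGCGCTAGCTAGCTAGCTACGCGCGATATCGCGCTAGCTAGGTAC"),
        ]
        for name, fwd, rev in overlap_primers(parts):
            print(f"{name}: fwd 5'-{fwd}-3'  rev 5'-{rev}-3'")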

  17. Histomorphometric assessment of bone necrosis produced by two cryosurgery protocols using liquid nitrogen: an experimental study on rat femurs.

    PubMed

    Costa, Fábio Wildson Gurgel; Brito, Gerly Anne de Castro; Pessoa, Rosana Maria Andrade; Studart-Soares, Eduardo Costa

    2011-01-01

    The aim of this study was to evaluate the effects of liquid nitrogen cryosurgery on the femoral diaphysis of rats. The femoral diaphyses of 42 Wistar rats were exposed to three local and sequential applications of liquid nitrogen for 1 or 2 min, intercalated with periods of 5 min of passive thawing. The animals were sacrificed after 1, 2, 4 and 12 weeks and the specimens obtained were processed and analyzed histomorphometrically. The depth and extent of peak bone necrosis were 124.509 µm and 2087.094 µm for the 1-min protocol, respectively, and 436.424 µm and 12046.426 µm for the 2-min protocol. Peak necrosis was observed in the second experimental week with both cryotherapy protocols. The present results indicate that the 2-min protocol produced more marked bone necrosis than the 1-min protocol. Although our results cannot be entirely extrapolated to clinical practice, they contribute to the understanding of the behavior of bone tissue submitted to different cycles of liquid nitrogen freezing and may serve as a basis for new studies.

  18. Simulation-based Randomized Comparative Assessment of Out-of-Hospital Cardiac Arrest Resuscitation Bundle Completion by Emergency Medical Service Teams Using Standard Life Support or an Experimental Automation-assisted Approach.

    PubMed

    Choi, Bryan; Asselin, Nicholas; Pettit, Catherine C; Dannecker, Max; Machan, Jason T; Merck, Derek L; Merck, Lisa H; Suner, Selim; Williams, Kenneth A; Jay, Gregory D; Kobayashi, Leo

    2016-12-01

    Effective resuscitation of out-of-hospital cardiac arrest (OHCA) patients is challenging. Alternative resuscitative approaches using electromechanical adjuncts may improve provider performance. Investigators applied simulation to study the effect of an experimental automation-assisted, goal-directed OHCA management protocol on EMS providers' resuscitation performance relative to standard protocols and equipment. Two-provider (emergency medical technicians (EMT)-B and EMT-I/C/P) teams were randomized to control or experimental group. Each team engaged in 3 simulations: baseline simulation (standard roles); repeat simulation (standard roles); and abbreviated repeat simulation (reversed roles, i.e., basic life support provider performing ALS tasks). Control teams used standard OHCA protocols and equipment (with high-performance cardiopulmonary resuscitation training intervention); for second and third simulations, experimental teams performed chest compression, defibrillation, airway, pulmonary ventilation, vascular access, medication, and transport tasks with goal-directed protocol and resuscitation-automating devices. Videorecorders and simulator logs collected resuscitation data. Ten control and 10 experimental teams comprised 20 EMT-B's; 1 EMT-I, 8 EMT-C's, and 11 EMT-P's; study groups were not fully matched. Both groups suboptimally performed chest compressions and ventilations at baseline. For their second simulations, control teams performed similarly except for reduced on-scene time, and experimental teams improved their chest compressions (P=0.03), pulmonary ventilations (P<0.01), and medication administration (P=0.02); changes in their performance of chest compression, defibrillation, airway, and transport tasks did not attain significance against control teams' changes. Experimental teams maintained performance improvements during reversed-role simulations. Simulation-based investigation into OHCA resuscitation revealed considerable variability and improvable deficiencies in small EMS teams. Goal-directed, automation-assisted OHCA management augmented select resuscitation bundle element performance without comprehensive improvement.

  19. Security of modified Ping-Pong protocol in noisy and lossy channel

    PubMed Central

    Han, Yun-Guang; Yin, Zhen-Qiang; Li, Hong-Wei; Chen, Wei; Wang, Shuang; Guo, Guang-Can; Han, Zheng-Fu

    2014-01-01

    The “Ping-Pong” (PP) protocol is a two-way quantum key protocol based on entanglement. In this protocol, Bob prepares one maximally entangled pair of qubits, and sends one qubit to Alice. Then, Alice performs some necessary operations on this qubit and sends it back to Bob. Although this protocol was proposed in 2002, its security in the noisy and lossy channel has not been proven. In this report, we add a simple and experimentally feasible modification to the original PP protocol, and prove the security of this modified PP protocol against collective attacks when the noisy and lossy channel is taken into account. Simulation results show that our protocol is practical. PMID:24816899

  20. Security of modified Ping-Pong protocol in noisy and lossy channel.

    PubMed

    Han, Yun-Guang; Yin, Zhen-Qiang; Li, Hong-Wei; Chen, Wei; Wang, Shuang; Guo, Guang-Can; Han, Zheng-Fu

    2014-05-12

    The "Ping-Pong" (PP) protocol is a two-way quantum key protocol based on entanglement. In this protocol, Bob prepares one maximally entangled pair of qubits, and sends one qubit to Alice. Then, Alice performs some necessary operations on this qubit and sends it back to Bob. Although this protocol was proposed in 2002, its security in the noisy and lossy channel has not been proven. In this report, we add a simple and experimentally feasible modification to the original PP protocol, and prove the security of this modified PP protocol against collective attacks when the noisy and lossy channel is taken into account. Simulation results show that our protocol is practical.

  1. Development and evaluation of a study design typology for human research.

    PubMed

    Carini, Simona; Pollock, Brad H; Lehmann, Harold P; Bakken, Suzanne; Barbour, Edward M; Gabriel, Davera; Hagler, Herbert K; Harper, Caryn R; Mollah, Shamim A; Nahm, Meredith; Nguyen, Hien H; Scheuermann, Richard H; Sim, Ida

    2009-11-14

    A systematic classification of study designs would be useful for researchers, systematic reviewers, readers, and research administrators, among others. As part of the Human Studies Database Project, we developed the Study Design Typology to standardize the classification of study designs in human research. We then performed a multiple observer masked evaluation of active research protocols in four institutions according to a standardized protocol. Thirty-five protocols were classified by three reviewers each into one of nine high-level study designs for interventional and observational research (e.g., N-of-1, Parallel Group, Case Crossover). Rater classification agreement was moderately high for the 35 protocols (Fleiss' kappa = 0.442) and higher still for the 23 quantitative studies (Fleiss' kappa = 0.463). We conclude that our typology shows initial promise for reliably distinguishing study design types for quantitative human research.
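    Fleiss' kappa, the agreement statistic reported above, can be computed directly from a subjects-by-categories count matrix. The sketch below implements the textbook formula on a small made-up example (three raters, six protocols, three design categories); it does not use the study's actual ratings.

        import numpy as np

        def fleiss_kappa(counts):
            """counts[i, j] = number of raters assigning subject i to category j;
            every row must sum to the same number of raters n."""
            counts = np.asarray(counts, dtype=float)
            N, _ = counts.shape
            n = counts[0].sum()
            p_j = counts.sum(axis=0) / (N * n)                       # overall category proportions
            P_i = (np.sum(counts ** 2, axis=1) - n) / (n * (n - 1))  # per-subject agreement
            P_bar, P_e = P_i.mean(), np.sum(p_j ** 2)
            return (P_bar - P_e) / (1 - P_e)

        # Toy example: 6 protocols, 3 raters, 3 design categories.
        ratings = [
            [3, 0, 0],
            [2, 1, 0],
            [0, 3, 0],
            [1, 1, 1],
            [0, 0, 3],
            [2, 0, 1],
        ]
        print(round(fleiss_kappa(ratings), 3))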

  2. Biodistribution of the boron carriers boronophenylalanine (BPA) and/or decahydrodecaborate (GB-10) for Boron Neutron Capture Therapy (BNCT) in an experimental model of lung metastases.

    PubMed

    Trivillin, V A; Garabalino, M A; Colombo, L L; González, S J; Farías, R O; Monti Hughes, A; Pozzi, E C C; Bortolussi, S; Altieri, S; Itoiz, M E; Aromando, R F; Nigg, D W; Schwint, A E

    2014-06-01

    BNCT was proposed for the treatment of diffuse, non-resectable tumors in the lung. We performed boron biodistribution studies with 5 administration protocols employing the boron carriers BPA and/or GB-10 in an experimental model of disseminated lung metastases in rats. All 5 protocols were non-toxic and showed preferential tumor boron uptake versus lung. Absolute tumor boron concentration values were therapeutically useful (25-76 ppm) for 3 protocols. Dosimetric calculations indicate that BNCT at RA-3 would be potentially therapeutic without exceeding radiotolerance in the lung. © 2013 Published by Elsevier Ltd.

  3. Biodistribution of the boron carriers boronophenylalanine (BPA) and/or decahydrodecaborate (GB-10) for Boron Neutron Capture Therapy (BNCT) in an experimental model of lung metastases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D.W. Nigg; Various Others

    BNCT was proposed for the treatment of diffuse, non-resectable tumors in the lung. We performed boron biodistribution studies with 5 administration protocols employing the boron carriers BPA and/or GB-10 in an experimental model of disseminated lung metastases in rats. All 5 protocols were non-toxic and showed preferential tumor boron uptake versus lung. Absolute tumor boron concentration values were therapeutically useful (25–76 ppm) for 3 protocols. Dosimetric calculations indicate that BNCT at RA-3 would be potentially therapeutic without exceeding radiotolerance in the lung.

  4. Recommendations for Benchmarking Preclinical Studies of Nanomedicines.

    PubMed

    Dawidczyk, Charlene M; Russell, Luisa M; Searson, Peter C

    2015-10-01

    Nanoparticle-based delivery systems provide new opportunities to overcome the limitations associated with traditional small-molecule drug therapy for cancer and to achieve both therapeutic and diagnostic functions in the same platform. Preclinical trials are generally designed to assess therapeutic potential and not to optimize the design of the delivery platform. Consequently, progress in developing design rules for cancer nanomedicines has been slow, hindering progress in the field. Despite the large number of preclinical trials, several factors restrict comparison and benchmarking of different platforms, including variability in experimental design, reporting of results, and the lack of quantitative data. To solve this problem, we review the variables involved in the design of preclinical trials and propose a protocol for benchmarking that we recommend be included in in vivo preclinical studies of drug-delivery platforms for cancer therapy. This strategy will contribute to building the scientific knowledge base that enables development of design rules and accelerates the translation of new technologies. ©2015 American Association for Cancer Research.

  5. Perspective: Recommendations for benchmarking pre-clinical studies of nanomedicines

    PubMed Central

    Dawidczyk, Charlene M.; Russell, Luisa M.; Searson, Peter C.

    2015-01-01

    Nanoparticle-based delivery systems provide new opportunities to overcome the limitations associated with traditional small molecule drug therapy for cancer, and to achieve both therapeutic and diagnostic functions in the same platform. Pre-clinical trials are generally designed to assess therapeutic potential and not to optimize the design of the delivery platform. Consequently, progress in developing design rules for cancer nanomedicines has been slow, hindering progress in the field. Despite the large number of pre-clinical trials, several factors restrict comparison and benchmarking of different platforms, including variability in experimental design, reporting of results, and the lack of quantitative data. To solve this problem, we review the variables involved in the design of pre-clinical trials and propose a protocol for benchmarking that we recommend be included in in vivo pre-clinical studies of drug delivery platforms for cancer therapy. This strategy will contribute to building the scientific knowledge base that enables development of design rules and accelerates the translation of new technologies. PMID:26249177

  6. Defining standardized protocols for determining the efficacy of a postmilking teat disinfectant following experimental exposure of teats to mastitis pathogens.

    PubMed

    Schukken, Y H; Rauch, B J; Morelli, J

    2013-04-01

    The objective of this paper was to define standardized protocols for determining the efficacy of a postmilking teat disinfectant following experimental exposure of teats to both Staphylococcus aureus and Streptococcus agalactiae. The standardized protocols describe the selection of cows and herds and define the critical points in performing experimental exposure, performing bacterial culture, evaluating the culture results, and finally performing statistical analyses and reporting of the results. The protocols define both negative control and positive control trials. For negative control trials, the protocol states that an efficacy of reducing new intramammary infections (IMI) of at least 40% is required for a teat disinfectant to be considered effective. For positive control trials, noninferiority to a control disinfectant with a published efficacy of reducing new IMI of at least 70% is required. Sample sizes for both negative and positive control trials are calculated. Positive control trials are expected to require a large trial size. Statistical analysis methods are defined and, in the proposed methods, the rate of IMI may be analyzed using generalized linear mixed models. The efficacy of the test product can be evaluated while controlling for important covariates and confounders in the trial. Finally, standards for reporting are defined and reporting considerations are discussed. The use of the defined protocol is shown through presentation of the results of a recent trial of a test product against a negative control. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
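
    The protocol's sample-size calculations are not reproduced here, but as a hedged illustration of the kind of two-proportion calculation involved in a negative control trial (assuming, purely for example, a 20% new-IMI rate in unprotected control quarters and the required 40% relative reduction), statsmodels gives:

    ```python
    from statsmodels.stats.proportion import proportion_effectsize
    from statsmodels.stats.power import NormalIndPower

    # Illustrative assumptions (not from the protocol): 20% new-IMI rate in
    # control quarters and a 40% relative reduction in the test group.
    p_control = 0.20
    p_test = p_control * (1 - 0.40)   # 12%

    effect = proportion_effectsize(p_test, p_control)   # Cohen's h
    n_per_group = NormalIndPower().solve_power(
        effect_size=effect, alpha=0.05, power=0.80, alternative="two-sided"
    )
    print(f"quarters needed per group: {n_per_group:.0f}")
    ```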

  7. An operational open-end file transfer protocol for mobile satellite communications

    NASA Technical Reports Server (NTRS)

    Wang, Charles; Cheng, Unjeng; Yan, Tsun-Yee

    1988-01-01

    This paper describes an operational open-end file transfer protocol which includes the connecting procedure, data transfer, and relinquishment procedure for mobile satellite communications. The protocol makes use of the frame level and packet level formats of the X.25 standard for the data link layer and network layer, respectively. The structure of a testbed for experimental simulation of this protocol over a mobile fading channel is also introduced.

  8. Nutritional Management in Enterocutaneous Fistula. What is the evidence?

    PubMed Central

    BADRASAWI, Manal; SHAHAR, Suzana; SAGAP, Ismail

    2015-01-01

    The management of enterocutaneous fistula (ECF) is challenging. It remains associated with morbidity and mortality, despite advancements in medical and surgical therapies. Early nutritional support using parenteral, enteral or fistuloclysis routes is essential to reverse catabolism and replace nutrient, fluid and electrolyte losses. This study aims to review the current literature on the management of ECF. Fistula classification has an impact on calorie and protein requirements. Early nutritional support with parenteral nutrition, enteral nutrition or fistuloclysis played a significant role in the management outcome. Published literature on the nutritional management of ECF is mostly retrospective and lacks experimental design. Prospective studies do not investigate nutritional assessment or management experimentally. Individualising the nutritional management protocol was recommended due to the absence of management guidelines for ECF patients. PMID:26715903

  9. The Integration of Family and Group Therapy as an Alternative to Juvenile Incarceration: A Quasi-Experimental Evaluation Using Parenting with Love and Limits.

    PubMed

    Karam, Eli A; Sterrett, Emma M; Kiaer, Lynn

    2017-06-01

    The current study employed a quasi-experimental design using both intent-to-treat and protocol adherence analysis of 155 moderate- to high-risk juvenile offenders to evaluate the effectiveness of Parenting with Love and Limits® (PLL), an integrative group and family therapy approach. Youth completing PLL had significantly lower rates of recidivism than the comparison group. Parents also reported statistically significant improvements in youth behavior. Lengths of service were also significantly shorter for the treatment sample than the matched comparison group by an average of 4 months. This study contributes to the literature by suggesting that intensive community-based combined family and group treatment is effective in curbing recidivism among high-risk juveniles. © 2015 Family Process Institute.

  10. Time-dependent Increase in the Network Response to the Stimulation of Neuronal Cell Cultures on Micro-electrode Arrays.

    PubMed

    Gertz, Monica L; Baker, Zachary; Jose, Sharon; Peixoto, Nathalia

    2017-05-29

    Micro-electrode arrays (MEAs) can be used to investigate drug toxicity, design paradigms for next-generation personalized medicine, and study network dynamics in neuronal cultures. In contrast with more traditional methods, such as patch-clamping, which can only record activity from a single cell, MEAs can record simultaneously from multiple sites in a network, without requiring the arduous task of placing each electrode individually. Moreover, numerous control and stimulation configurations can be easily applied within the same experimental setup, allowing for a broad range of dynamics to be explored. One of the key dynamics of interest in these in vitro studies has been the extent to which cultured networks display properties indicative of learning. Mouse neuronal cells cultured on MEAs display an increase in response following training induced by electrical stimulation. This protocol demonstrates how to culture neuronal cells on MEAs; successfully record from over 95% of the plated dishes; establish a protocol to train the networks to respond to patterns of stimulation; and sort, plot, and interpret the results from such experiments. The use of a proprietary system for stimulating and recording neuronal cultures is demonstrated. Software packages are also used to sort neuronal units. A custom-designed graphical user interface is used to visualize post-stimulus time histograms, inter-burst intervals, and burst duration, as well as to compare the cellular response to stimulation before and after a training protocol. Finally, representative results and future directions of this research effort are discussed.
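
    As an illustration of one of the analyses mentioned (the post-stimulus time histogram), the sketch below bins spike times relative to stimulation times with numpy; the array names, window, and bin width are illustrative assumptions, not the authors' settings or software.

    ```python
    import numpy as np

    def psth(spike_times_s, stim_times_s, window_s=0.5, bin_ms=10):
        """Post-stimulus time histogram: mean spike rate (Hz) in bins after each stimulus."""
        bin_s = bin_ms / 1000.0
        edges = np.arange(0.0, window_s + bin_s, bin_s)
        counts = np.zeros(len(edges) - 1)
        for t0 in stim_times_s:
            rel = spike_times_s[(spike_times_s >= t0) & (spike_times_s < t0 + window_s)] - t0
            counts += np.histogram(rel, bins=edges)[0]
        rate_hz = counts / (len(stim_times_s) * bin_s)
        return edges[:-1], rate_hz

    # Illustrative use with synthetic spike and stimulus times
    rng = np.random.default_rng(0)
    spikes = np.sort(rng.uniform(0, 60, 3000))   # 60 s of background spiking
    stims = np.arange(1.0, 59.0, 2.0)            # stimulation every 2 s
    t, rate = psth(spikes, stims)
    print(rate[:5])
    ```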

  11. The impact of protocol on nurses' role stress: a longitudinal perspective.

    PubMed

    Dodd-McCue, Diane; Tartaglia, Alexander; Veazey, Kenneth W; Streetman, Pamela S

    2005-04-01

    The study examined the impact of a protocol directed at increasing organ donation on the role stress and work attitudes of critical care nurses involved in potential organ donation cases. The research examined whether the protocol could positively affect nurses' perceptions of role stress and, if so, whether the resulting improvements in the work environment could be sustained over time. The Family Communication Coordinator (FCC) protocol promotes effective communication during potential organ donation cases using a multidisciplinary team approach. Previous research found it associated with improved donation outcomes and with improved perceptions of role stress by critical care nurses. However, the previous study lacked the methodological rigor necessary to determine causality and sustainability over time. The study used a quasi-experimental prospective longitudinal design. The sample included critical care nurses who had experience with potential organ donation cases under the protocol. Survey data were collected at 4 points over 2 years. Surveys used previously validated and reliable measures of role stress (role ambiguity, role conflict, role overload) and work attitudes (commitment, satisfaction). Interviews supplemented these data. The nurses' perceptions of role stress associated with potential organ donation cases dropped dramatically after the protocol was implemented. All measures of role stress, particularly role ambiguity and role conflict, showed statistically significant and sustained improvement. Nurses' professional, unit, and hospital commitment and satisfaction reflected an increasingly positive workplace. The results demonstrate that the FCC protocol positively influenced the workplace through its impact on role stress over the first 2 years following its implementation. The findings suggest that similar protocols may be appropriate for improving the critical care environment by reducing the stress and uncertainty of professionals involved in other end-of-life situations. However, the most striking implication relates to the reality of the workplace: meeting the goals of improved patient care outcomes and those of improving the healthcare work environment are not mutually exclusive and may be mutually essential.

  12. Reliability of Vibrating Mesh Technology.

    PubMed

    Gowda, Ashwin A; Cuccia, Ann D; Smaldone, Gerald C

    2017-01-01

    For delivery of inhaled aerosols, vibrating mesh systems are more efficient than jet nebulizers and do not require added gas flow. We assessed the reliability of a vibrating mesh nebulizer (Aerogen Solo, Aerogen Ltd, Galway, Ireland) suitable for use in mechanical ventilation. An initial observational study was performed with 6 nebulizers to determine run time and efficiency using normal saline and distilled water. Nebulizers were run until cessation of aerosol production was noted, with residual volume and run time recorded. Three controllers were used to assess the impact of the controller on nebulizer function. Following the observational study, a more detailed experimental protocol was performed using 20 nebulizers. For this analysis, 2 controllers were used, and time to cessation of aerosol production was noted. Gravimetric techniques were used to measure residual volume. Total nebulization time and residual volume were recorded. Failure was defined as premature cessation of aerosol production, represented by a residual volume of > 10% of the nebulizer charge. In the initial observational protocol, an unexpected sporadic failure rate of 25% was noted in 55 experimental runs. In the experimental protocol, a failure rate of 30% was noted in 40 experimental runs. Failed runs in the experimental protocol exhibited a wide range of retained volumes, averaging (mean ± SD) 36 ± 21.3% compared with 3.2 ± 1.5% (P = .001) in successful runs. Small but significant differences existed in nebulization time between controllers. Aerogen Solo nebulization was often randomly interrupted, with a wide range of retained volumes. Copyright © 2017 by Daedalus Enterprises.

  13. Detecting and removing multiplicative spatial bias in high-throughput screening technologies.

    PubMed

    Caraus, Iurie; Mazoure, Bogdan; Nadon, Robert; Makarenkov, Vladimir

    2017-10-15

    Considerable attention has been paid recently to improve data quality in high-throughput screening (HTS) and high-content screening (HCS) technologies widely used in drug development and chemical toxicity research. However, several environmentally- and procedurally-induced spatial biases in experimental HTS and HCS screens decrease measurement accuracy, leading to increased numbers of false positives and false negatives in hit selection. Although effective bias correction methods and software have been developed over the past decades, almost all of these tools have been designed to reduce the effect of additive bias only. Here, we address the case of multiplicative spatial bias. We introduce three new statistical methods meant to reduce multiplicative spatial bias in screening technologies. We assess the performance of the methods with synthetic and real data affected by multiplicative spatial bias, including comparisons with current bias correction methods. We also describe a wider data correction protocol that integrates methods for removing both assay and plate-specific spatial biases, which can be either additive or multiplicative. The methods for removing multiplicative spatial bias and the data correction protocol are effective in detecting and cleaning experimental data generated by screening technologies. As our protocol is of a general nature, it can be used by researchers analyzing current or next-generation high-throughput screens. The AssayCorrector program, implemented in R, is available on CRAN. makarenkov.vladimir@uqam.ca. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
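
    The authors' methods are implemented in the R package AssayCorrector, whose API is not reproduced here. Purely to illustrate what a multiplicative row/column bias model looks like, the numpy sketch below corrects a synthetic plate by dividing out robust row and column factors (a multiplicative analogue of median polish, not the paper's algorithm):

    ```python
    import numpy as np

    def correct_multiplicative_bias(plate):
        """Divide out robust (median-based) row and column factors from a plate matrix.

        Illustrative multiplicative analogue of median polish; not the AssayCorrector method.
        """
        plate = plate.astype(float)
        row_factor = np.median(plate, axis=1, keepdims=True)
        col_factor = np.median(plate / row_factor, axis=0, keepdims=True)
        corrected = plate / (row_factor * col_factor)
        return corrected * np.median(plate)   # restore the plate's overall scale

    # Synthetic 8x12 plate with multiplicative row/column bias
    rng = np.random.default_rng(1)
    truth = rng.normal(100, 5, size=(8, 12))
    bias = np.outer(np.linspace(0.8, 1.2, 8), np.linspace(0.9, 1.1, 12))
    observed = truth * bias
    print(np.std(observed - truth), np.std(correct_multiplicative_bias(observed) - truth))
    ```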

  14. Improved injection needles facilitate germline transformation of the buckeye butterfly Junonia coenia.

    PubMed

    Beaudette, Kahlia; Hughes, Tia M; Marcus, Jeffrey M

    2014-01-01

    Germline transformation with transposon vectors is an important tool for insect genetics, but progress in developing transformation protocols for butterflies has been limited by high post-injection ova mortality. Here we present an improved glass injection needle design for injecting butterfly ova that increases survival in three Nymphalid butterfly species. Using the needles to genetically transform the common buckeye butterfly Junonia coenia, the hatch rate for injected Junonia ova was 21.7%, the transformation rate was 3%, and the overall experimental efficiency was 0.327%, a substantial improvement over previous results in other butterfly species. Improved needle design and a higher efficiency of transformation should permit the deployment of transposon-based genetic tools in a broad range of less fecund lepidopteran species.

  15. Microencapsulation of Corrosion Indicators for Smart Coatings

    NASA Technical Reports Server (NTRS)

    Li, Wenyan; Buhrow, Jerry W.; Jolley, Scott T.; Calle, Luz M.; Hanna, Joshua S.; Rawlins, James W.

    2011-01-01

    A multifunctional smart coating for the autonomous detection, indication, and control of corrosion has been developed based on microencapsulation technology. This paper summarizes the development, optimization, and testing of microcapsules specifically designed for early detection and indication of corrosion when incorporated into a smart coating. Results from experiments designed to test the ability of the microcapsules to detect and indicate corrosion, when blended into several paint systems, show that these experimental coatings generate a color change, indicative of spot-specific corrosion events, that can be observed with the naked eye within hours rather than the hundreds of hours or months typical of standard accelerated corrosion test protocols. Key words: smart coating, corrosion detection, microencapsulation, microcapsule, pH-sensitive microcapsule, corrosion indicator, corrosion sensing paint

  16. Practical quantum appointment scheduling

    NASA Astrophysics Data System (ADS)

    Touchette, Dave; Lovitz, Benjamin; Lütkenhaus, Norbert

    2018-04-01

    We propose a protocol based on coherent states and linear optics operations for solving the appointment-scheduling problem. Our main protocol leaks strictly less information about each party's input than the optimal classical protocol, even when considering experimental errors. Along with the ability to generate constant-amplitude coherent states over two modes, this protocol requires the ability to transfer these modes back-and-forth between the two parties multiple times with very low losses. The implementation requirements are thus still challenging. Along the way, we develop tools to study quantum information cost of interactive protocols in the finite regime.

  17. Jog Your Mind: methodology and challenges of conducting evaluative research in partnership with community organizations.

    PubMed

    Bier, Nathalie; Lorthios-Guilledroit, Agathe; Nour, Kareen; Parisien, Manon; Ellemberg, Dave; Laforest, Sophie

    2015-01-01

    Jog Your Mind is a community-based program aimed at empowering elderly people to maintain their cognitive abilities using a multi-strategic approach that includes cognitively stimulating activities, mnemonic strategies, and strategies to promote healthy behaviors. It is offered to elderly individuals without known or diagnosed cognitive impairment by volunteers or community practitioners over ten weekly sessions. This paper describes the protocol of a quasi-experimental study designed to evaluate Jog Your Mind. Community organizations responsible for recruiting participants were assigned either to the experimental group (participating in the Jog Your Mind program) or to the control group (one-year waiting list). All participants were interviewed at baseline (T1), after the program (T2), and 12 months after baseline (T3). Primary outcomes were the use of everyday memory strategies and aids and subjective memory functioning in daily life. Secondary outcomes included attitudes, knowledge, and behaviors related to cognitive vitality and cognitive abilities (memory and executive functions). Program delivery, organizational and environmental variables were recorded to document the implementation process. Twenty-three community organizations recruited a total of 294 community-dwelling elderly individuals at T1. Between T1 and T3, an attrition rate of 15.2% was observed. Jog Your Mind is one of the few programs targeting cognition among older adults that is offered in community settings by community practitioners. The protocol described here was designed to maximize the generalizability of the results while achieving scientific rigor. It can serve as an example to guide future research aiming to evaluate health interventions under natural conditions.

  18. Methods for processing high-throughput RNA sequencing data.

    PubMed

    Ares, Manuel

    2014-11-03

    High-throughput sequencing (HTS) methods for analyzing RNA populations (RNA-Seq) are gaining rapid application to many experimental situations. The steps in an RNA-Seq experiment require thought and planning, especially because the expense in time and materials is currently higher and the protocols are far less routine than those used for other high-throughput methods, such as microarrays. As always, good experimental design will make analysis and interpretation easier. Having a clear biological question, an idea about the best way to do the experiment, and an understanding of the number of replicates needed will make the entire process more satisfying. Whether the goal is capturing transcriptome complexity from a tissue or identifying small fragments of RNA cross-linked to a protein of interest, conversion of the RNA to cDNA followed by direct sequencing using the latest methods is a developing practice, with new technical modifications and applications appearing every day. Even more rapid are the development and improvement of methods for analysis of the very large amounts of data that arrive at the end of an RNA-Seq experiment, making considerations regarding reproducibility, validation, visualization, and interpretation increasingly important. This introduction is designed to review and emphasize a pathway of analysis from experimental design through data presentation that is likely to be successful, with the recognition that better methods are right around the corner. © 2014 Cold Spring Harbor Laboratory Press.

  19. Revisiting control establishments for emerging energy hubs

    NASA Astrophysics Data System (ADS)

    Nasirian, Vahidreza

    Emerging small-scale energy systems, i.e., microgrids and smart grids, rely on centralized controllers for voltage regulation, load sharing, and economic dispatch. However, the central controller is a single point of failure in such a design, as failure of either the controller or the attached communication links can render the entire system inoperable. This work seeks alternative distributed control structures to improve system reliability and scalability. A cooperative distributed controller is proposed that uses a noise-resilient voltage estimator and handles global voltage regulation and load sharing across a DC microgrid. Distributed adaptive droop control is also investigated as an alternative solution. A droop-free distributed control is offered to handle voltage/frequency regulation and load sharing in AC systems. This solution does not require frequency measurement and thus features fast frequency regulation. Distributed economic dispatch is also studied, where a distributed protocol is designed that drives generation units to merge their incremental costs into a consensus and thus pushes the entire system to generate at minimum cost. Experimental verifications and Hardware-in-the-Loop (HIL) simulations are used to study the efficacy of the proposed control protocols.
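
    As a toy sketch of the consensus idea behind distributed economic dispatch (a simplification under assumed quadratic costs and a fixed averaging matrix, not the dissertation's protocol), generators can iteratively average their incremental costs while a mismatch term drives total output toward demand:

    ```python
    import numpy as np

    # Assumed quadratic costs C_i(P) = a_i P^2 + b_i P, so incremental cost lambda_i = 2 a_i P_i + b_i
    a = np.array([0.04, 0.06, 0.05])
    b = np.array([2.0, 1.5, 1.8])
    demand = 300.0

    # Row-stochastic averaging weights over a 3-node communication graph
    W = np.array([[0.50, 0.25, 0.25],
                  [0.25, 0.50, 0.25],
                  [0.25, 0.25, 0.50]])

    lam = b.copy()        # initial incremental-cost estimates
    eps = 0.0005          # gain on the power-mismatch feedback
    for _ in range(2000):
        P = (lam - b) / (2 * a)         # each unit's output at its current lambda
        mismatch = demand - P.sum()     # a fully distributed scheme would estimate this locally
        lam = W @ lam + eps * mismatch  # consensus step plus mismatch correction

    P = (lam - b) / (2 * a)
    print("incremental costs:", np.round(lam, 3))   # converge to a common lambda
    print("dispatch:", np.round(P, 1), "total:", round(P.sum(), 1))
    ```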

  20. Detecting and overcoming systematic bias in high-throughput screening technologies: a comprehensive review of practical issues and methodological solutions.

    PubMed

    Caraus, Iurie; Alsuwailem, Abdulaziz A; Nadon, Robert; Makarenkov, Vladimir

    2015-11-01

    Significant efforts have been made recently to improve data throughput and data quality in screening technologies related to drug design. The modern pharmaceutical industry relies heavily on high-throughput screening (HTS) and high-content screening (HCS) technologies, which include small molecule, complementary DNA (cDNA) and RNA interference (RNAi) types of screening. Data generated by these screening technologies are subject to several environmental and procedural systematic biases, which introduce errors into the hit identification process. We first review systematic biases typical of HTS and HCS screens. We highlight that study design issues and the way in which data are generated are crucial for providing unbiased screening results. Considering various data sets, including the publicly available ChemBank data, we assess the rates of systematic bias in experimental HTS by using plate-specific and assay-specific error detection tests. We describe the main data normalization and correction techniques and introduce a general data preprocessing protocol. This protocol can be recommended for academic and industrial researchers involved in the analysis of current or next-generation HTS data. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
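
    One normalization step that a preprocessing protocol of this kind typically includes is robust per-plate standardization. The sketch below is a generic median/MAD z-score on a synthetic plate, shown only to make the idea concrete; it is not the authors' pipeline:

    ```python
    import numpy as np

    def robust_zscore(plate):
        """Median/MAD-based z-scores for one plate; robust to a small fraction of hits."""
        med = np.median(plate)
        mad = np.median(np.abs(plate - med))
        return (plate - med) / (1.4826 * mad)   # 1.4826 makes MAD consistent with sigma

    rng = np.random.default_rng(2)
    plate = rng.normal(1000, 50, size=(16, 24))   # synthetic 384-well plate readout
    plate[3, 7] = 400                             # one strong "hit" well
    z = robust_zscore(plate)
    hits = np.argwhere(z < -3)                    # simple hit-selection threshold
    print(hits)
    ```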

  1. Relative binding affinity prediction of farnesoid X receptor in the D3R Grand Challenge 2 using FEP.

    PubMed

    Schindler, Christina; Rippmann, Friedrich; Kuhn, Daniel

    2018-01-01

    Physics-based free energy simulations have increasingly become an important tool for predicting binding affinity, and the recent introduction of automated protocols has also paved the way towards more widespread use in the pharmaceutical industry. The D3R 2016 Grand Challenge 2 provided an opportunity to blindly test the commercial free energy calculation protocol FEP+ and assess its performance relative to other affinity prediction methods. The present D3R free energy prediction challenge was built around two experimental data sets involving inhibitors of farnesoid X receptor (FXR), which is a promising anticancer drug target. The FXR binding site is predominantly hydrophobic with few conserved interaction motifs and strong induced fit effects, making it a challenging target for molecular modeling and drug design. For both data sets, we achieved reasonable prediction accuracy (RMSD ≈ 1.4 kcal/mol, rank 3-4 according to RMSD out of 20 submissions), comparable to that of state-of-the-art methods in the field. Our D3R results boosted our confidence in the method and strengthened our desire to expand its applications in future in-house drug design projects.

  2. Relative binding affinity prediction of farnesoid X receptor in the D3R Grand Challenge 2 using FEP+

    NASA Astrophysics Data System (ADS)

    Schindler, Christina; Rippmann, Friedrich; Kuhn, Daniel

    2018-01-01

    Physics-based free energy simulations have increasingly become an important tool for predicting binding affinity, and the recent introduction of automated protocols has also paved the way towards more widespread use in the pharmaceutical industry. The D3R 2016 Grand Challenge 2 provided an opportunity to blindly test the commercial free energy calculation protocol FEP+ and assess its performance relative to other affinity prediction methods. The present D3R free energy prediction challenge was built around two experimental data sets involving inhibitors of farnesoid X receptor (FXR), which is a promising anticancer drug target. The FXR binding site is predominantly hydrophobic with few conserved interaction motifs and strong induced fit effects, making it a challenging target for molecular modeling and drug design. For both data sets, we achieved reasonable prediction accuracy (RMSD ≈ 1.4 kcal/mol, rank 3-4 according to RMSD out of 20 submissions), comparable to that of state-of-the-art methods in the field. Our D3R results boosted our confidence in the method and strengthened our desire to expand its applications in future in-house drug design projects.
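
    For readers unfamiliar with the metrics quoted above, the accuracy of a set of predicted affinities is typically summarized by the RMSD against experiment and a rank correlation; the sketch below uses placeholder values, not the D3R FXR data:

    ```python
    import numpy as np
    from scipy.stats import kendalltau

    # Placeholder binding free energies in kcal/mol (not the D3R FXR data)
    experimental = np.array([-9.1, -8.4, -7.9, -10.2, -8.8, -7.2])
    predicted    = np.array([-8.6, -8.9, -7.5, -9.4, -8.1, -7.8])

    rmsd = np.sqrt(np.mean((predicted - experimental) ** 2))
    tau, p_value = kendalltau(experimental, predicted)
    print(f"RMSD = {rmsd:.2f} kcal/mol, Kendall tau = {tau:.2f}")
    ```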

  3. Estimation and uncertainty analysis of dose response in an inter-laboratory experiment

    NASA Astrophysics Data System (ADS)

    Toman, Blaza; Rösslein, Matthias; Elliott, John T.; Petersen, Elijah J.

    2016-02-01

    An inter-laboratory experiment for the evaluation of toxic effects of NH2-polystyrene nanoparticles on living human cancer cells was performed with five participating laboratories. Previously published results from nanocytotoxicity assays are often contradictory, mostly due to challenges related to producing a reliable cytotoxicity assay protocol for use with nanomaterials. Specific challenges include reproducibility in preparing nanoparticle dispersions, biological variability from testing living cell lines, and the potential for nano-related interference effects. In this experiment, such challenges were addressed by developing a detailed experimental protocol and using a specially designed 96-well plate layout which incorporated a range of control measurements to assess multiple factors such as nanomaterial interference, pipetting accuracy, cell seeding density, and instrument performance. Detailed data analysis of these control measurements showed that good control of the experiments was attained by all participants in most cases. The main measurement objective of the study was the estimation of a dose-response relationship between concentration of the nanoparticles and metabolic activity of the living cells, under several experimental conditions. The dose curve estimation was achieved by embedding a three-parameter logistic curve in a three-level Bayesian hierarchical model, accounting for uncertainty due to all known experimental conditions as well as between-laboratory variability in a top-down manner. Computation was performed using Markov Chain Monte Carlo methods. The fit of the model was evaluated using Bayesian posterior predictive probabilities and found to be satisfactory.
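
    The central estimand is a three-parameter logistic dose-response curve. The study embeds it in a three-level Bayesian hierarchical model fitted by MCMC; the sketch below is only a single-laboratory, least-squares version of the same curve on synthetic data, to make the functional form concrete:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic3(dose, top, ec50, hill):
        """Three-parameter logistic: response falls from `top` towards 0 with increasing dose."""
        return top / (1.0 + (dose / ec50) ** hill)

    # Synthetic metabolic-activity readings over a nanoparticle concentration series (illustrative)
    dose = np.array([1, 3, 10, 30, 100, 300], dtype=float)
    resp = np.array([0.98, 0.95, 0.80, 0.45, 0.15, 0.05])
    resp_noisy = resp + np.random.default_rng(3).normal(0, 0.02, resp.size)

    params, cov = curve_fit(logistic3, dose, resp_noisy, p0=[1.0, 30.0, 1.0])
    top, ec50, hill = params
    print(f"top={top:.2f}, EC50={ec50:.1f}, Hill slope={hill:.2f}")
    ```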

  4. Simulation of transmission electron microscope images of biological specimens.

    PubMed

    Rullgård, H; Ofverstedt, L-G; Masich, S; Daneholt, B; Oktem, O

    2011-09-01

    We present a new approach to simulate electron cryo-microscope images of biological specimens. The framework for simulation consists of two parts; the first is a phantom generator that generates a model of a specimen suitable for simulation, the second is a transmission electron microscope simulator. The phantom generator calculates the scattering potential of an atomic structure in aqueous buffer and allows the user to define the distribution of molecules in the simulated image. The simulator includes a well defined electron-specimen interaction model based on the scalar Schrödinger equation, the contrast transfer function for optics, and a noise model that includes shot noise as well as detector noise including detector blurring. To enable optimal performance, the simulation framework also includes a calibration protocol for setting simulation parameters. To test the accuracy of the new framework for simulation, we compare simulated images to experimental images recorded of the Tobacco Mosaic Virus (TMV) in vitreous ice. The simulated and experimental images show good agreement with respect to contrast variations depending on dose and defocus. Furthermore, random fluctuations present in experimental and simulated images exhibit similar statistical properties. The simulator has been designed to provide a platform for development of new instrumentation and image processing procedures in single particle electron microscopy, two-dimensional crystallography and electron tomography with well documented protocols and an open source code into which new improvements and extensions are easily incorporated. © 2011 The Authors Journal of Microscopy © 2011 Royal Microscopical Society.
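
    The optics part of such a simulator is built around the contrast transfer function. As a rough illustration (sign and envelope conventions vary between packages, and this is not the authors' exact parameterization), the phase-contrast CTF can be evaluated as sin(χ(k)) with χ combining defocus and spherical aberration:

    ```python
    import numpy as np

    def ctf_phase_contrast(k_nm, defocus_nm, cs_mm=2.0, voltage_kv=300.0):
        """Phase-contrast CTF = sin(chi(k)), with chi from defocus and spherical aberration.

        k_nm: spatial frequency in 1/nm. Sign and envelope conventions vary between packages.
        """
        cs_nm = cs_mm * 1e6
        # Relativistically corrected electron wavelength in nm (approximate formula)
        v = voltage_kv * 1e3
        lam_nm = 1.22639 / np.sqrt(v * (1.0 + 0.97845e-6 * v))
        chi = np.pi * lam_nm * defocus_nm * k_nm**2 - 0.5 * np.pi * cs_nm * lam_nm**3 * k_nm**4
        return np.sin(chi)

    k = np.linspace(0, 4.0, 500)                     # up to 4 nm^-1 (2.5 A)
    ctf = ctf_phase_contrast(k, defocus_nm=1500.0)   # 1.5 um underfocus, illustrative
    print(ctf[:5])
    ```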

  5. Pharmacology-based toxicity assessment: towards quantitative risk prediction in humans.

    PubMed

    Sahota, Tarjinder; Danhof, Meindert; Della Pasqua, Oscar

    2016-05-01

    Despite ongoing efforts to better understand the mechanisms underlying safety and toxicity, ~30% of the attrition in drug discovery and development is still due to safety concerns. Changes in current practice regarding the assessment of safety and toxicity are required to reduce late stage attrition and enable effective development of novel medicines. This review focuses on the implications of empirical evidence generation for the evaluation of safety and toxicity during drug development. A shift in paradigm is needed to (i) ensure that pharmacological concepts are incorporated into the evaluation of safety and toxicity; (ii) facilitate the integration of historical evidence and thereby the translation of findings across species as well as between in vitro and in vivo experiments and (iii) promote the use of experimental protocols tailored to address specific safety and toxicity questions. Based on historical examples, we highlight the challenges for the early characterisation of the safety profile of a new molecule and discuss how model-based methodologies can be applied for the design and analysis of experimental protocols. Issues relative to the scientific rationale are categorised and presented as a hierarchical tree describing the decision-making process. Focus is given to four different areas, namely, optimisation, translation, analytical construct and decision criteria. From a methodological perspective, the relevance of quantitative methods for estimation and extrapolation of risk from toxicology and safety pharmacology experimental protocols, such as points of departure and potency, is discussed in light of advancements in population and Bayesian modelling techniques (e.g. non-linear mixed effects modelling). Their use in the evaluation of pharmacokinetics (PK) and pharmacokinetic-pharmacodynamic relationships (PKPD) has enabled great insight into the dose rationale for medicines in humans, both in terms of efficacy and adverse events. Comparable benefits can be anticipated for the assessment of safety and toxicity profile of novel molecules. © The Author 2016. Published by Oxford University Press on behalf of the UK Environmental Mutagen Society. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  6. Effects of a Straw Phonation Protocol on Acoustic and Perceptual Measures of an SATB Chorus.

    PubMed

    Manternach, Jeremy N; Daugherty, James F

    2017-12-29

    Recent scholarship has suggested that semi-occluded vocal tract (SOVT) exercises may increase vocal economy of individuals by reducing vocal effort while maintaining or increasing acoustic output. Choral singers, however, may use different resonance techniques or change voicing behaviors in an effort to hear their own sound in relation to others. One investigation revealed significant increases in a choir's mean spectral energy after participating in a straw phonation protocol. However, that study reported only acoustic measures and did not include choristers' perceptions of the choral sound and their own voicing efficiency. The purpose of this study was to measure the effect of a straw phonation protocol on acoustic (long-term average spectrum) and perceptual (self-report) measures of the choral sound of an intact soprano, alto, tenor, and bass (SATB) choir. This is a quasi-experimental, one-group, pretest-posttest design. An SATB choir (N = 48 singers) performed a Renaissance motet, participated in a 4-minute voicing protocol with a small straw, and then sang the motet a second time. They completed the same procedure later in the rehearsal. Long-term average spectrum results indicated no statistically significant mean changes in spectral energy after the SOVT protocols. Most participants, however, perceived that the choir sounded better (78.26%) and that their own vocal production was more efficient or comfortable (73.91%) following the protocol. Choristers perceived less vocal effort while maintaining vocal output after straw phonation, which may feasibly align with extant solo research. More research may determine whether this result is due specifically to SOVTs. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  7. Full-Scale Wind-Tunnel Investigation of Wing-Cooling Ducts Effects of Propeller Slipstream, Special Report

    NASA Technical Reports Server (NTRS)

    Nickle, F. R.; Freeman, Arthur B.

    1939-01-01

    The safety of remotely operated vehicles depends on the correctness of the distributed protocol that facilitates the communication between the vehicle and the operator. A failure in this communication can result in catastrophic loss of the vehicle. To complicate matters, the communication system may be required to satisfy several, possibly conflicting, requirements. The design of protocols is typically an informal process based on successive iterations of a prototype implementation. Yet distributed protocols are notoriously difficult to get correct using such informal techniques. We present a formal specification of the design of a distributed protocol intended for use in a remotely operated vehicle, which is built from the composition of several simpler protocols. We demonstrate proof strategies that allow us to prove properties of each component protocol individually while ensuring that the property is preserved in the composition forming the entire system. Given that designs are likely to evolve as additional requirements emerge, we show how we have automated most of the repetitive proof steps to enable verification of rapidly changing designs.

  8. Minutiae Matching with Privacy Protection Based on the Combination of Garbled Circuit and Homomorphic Encryption

    PubMed Central

    Li, Mengxing; Zhao, Jian; Yang, Mei; Kang, Lijun; Wu, Lili

    2014-01-01

    Biometrics plays an important role in authentication applications since biometric traits are strongly linked to their holders. With the increasing growth of e-commerce and e-government, one can expect that biometric-based authentication systems may be deployed over open networks in the near future. However, due to its openness, the Internet poses a great challenge to the security and privacy of biometric authentication. Biometric data cannot be revoked, so it is of paramount importance that biometric data be handled in a secure way. In this paper we present a scheme achieving privacy-preserving fingerprint authentication between two parties, in which the fingerprint minutiae matching algorithm is completed in the encrypted domain. To improve efficiency, we exploit homomorphic encryption as well as garbled circuits to design the protocol. Our goal is to provide protection for the security of the template in storage and the data privacy of the two parties in the transaction. The experimental results show that the proposed authentication protocol runs efficiently. Therefore, the protocol can run over open networks and help to alleviate concerns about the security and privacy of biometric applications over open networks. PMID:24711729

  9. Minutiae matching with privacy protection based on the combination of garbled circuit and homomorphic encryption.

    PubMed

    Li, Mengxing; Feng, Quan; Zhao, Jian; Yang, Mei; Kang, Lijun; Wu, Lili

    2014-01-01

    Biometrics plays an important role in authentication applications since biometric traits are strongly linked to their holders. With the increasing growth of e-commerce and e-government, one can expect that biometric-based authentication systems may be deployed over open networks in the near future. However, due to its openness, the Internet poses a great challenge to the security and privacy of biometric authentication. Biometric data cannot be revoked, so it is of paramount importance that biometric data be handled in a secure way. In this paper we present a scheme achieving privacy-preserving fingerprint authentication between two parties, in which the fingerprint minutiae matching algorithm is completed in the encrypted domain. To improve efficiency, we exploit homomorphic encryption as well as garbled circuits to design the protocol. Our goal is to provide protection for the security of the template in storage and the data privacy of the two parties in the transaction. The experimental results show that the proposed authentication protocol runs efficiently. Therefore, the protocol can run over open networks and help to alleviate concerns about the security and privacy of biometric applications over the open networks.

  10. Soft chelating irrigation protocol optimizes bonding quality of Resilon/Epiphany root fillings.

    PubMed

    De-Deus, Gustavo; Namen, Fátima; Galan, João; Zehnder, Matthias

    2008-06-01

    This study was designed to test the impact of either a strong (MTAD) or a soft (1-hydroxyethylidene-1,1-bisphosphonate [HEBP]) chelating solution on the bond strength of Resilon/Epiphany root fillings. Both 17% EDTA and the omission of a chelator from the irrigation protocol were used as reference treatments. Forty extracted human upper lateral incisors were prepared using different irrigation protocols (n = 10): G1: NaOCl, G2: NaOCl + 17% EDTA, G3: NaOCl + BioPure MTAD (Dentsply/Tulsa, Tulsa, OK), and G4: NaOCl + 18% HEBP. The teeth were obturated and then prepared for micropush-out assessment using root slices of 1 mm thickness. Loading was performed on a universal testing machine at a speed of 0.5 mm/min. One-way analysis of variance and Tukey multiple comparisons were used to compare the results among the experimental groups. EDTA- and MTAD-treated samples revealed intermediate bond strengths (0.3-3.6 MPa). The lowest bond strengths were achieved in NaOCl-treated samples (0.3-1.2 MPa, p < 0.05). The highest bond strengths were reached in the HEBP-treated samples (3.1-6.1 MPa, p < 0.05). Under the present in vitro conditions, the soft chelating irrigation protocol (18% HEBP) optimized the bonding quality of Resilon/Epiphany (Resilon Research LLC, Madison, CT) root fillings.
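
    The statistical analysis named here, one-way ANOVA followed by Tukey's multiple comparisons across the four irrigation groups, can be sketched generically as below; the bond-strength values are invented for illustration and are not the study's measurements:

    ```python
    import numpy as np
    from scipy.stats import f_oneway
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    # Invented push-out bond strengths (MPa) for four irrigation groups
    rng = np.random.default_rng(4)
    groups = {
        "NaOCl":      rng.normal(0.8, 0.3, 10),
        "NaOCl+EDTA": rng.normal(2.0, 0.8, 10),
        "NaOCl+MTAD": rng.normal(2.2, 0.8, 10),
        "NaOCl+HEBP": rng.normal(4.5, 1.0, 10),
    }

    f_stat, p = f_oneway(*groups.values())
    print(f"one-way ANOVA: F={f_stat:.2f}, p={p:.4g}")

    values = np.concatenate(list(groups.values()))
    labels = np.repeat(list(groups.keys()), [len(v) for v in groups.values()])
    print(pairwise_tukeyhsd(values, labels, alpha=0.05))
    ```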

  11. Central Composite Design Optimization of Zinc Removal from Contaminated Soil, Using Citric Acid as Biodegradable Chelant.

    PubMed

    Asadzadeh, Farrokh; Maleki-Kaklar, Mahdi; Soiltanalinejad, Nooshin; Shabani, Farzin

    2018-02-08

    Citric acid (CA) was evaluated in terms of its efficiency as a biodegradable chelating agent in removing zinc (Zn) from heavily contaminated soil using a soil washing process. To determine preliminary ranges of variables in the washing process, single-factor experiments were carried out with different CA concentrations, pH levels and washing times. Optimization of batch washing conditions followed, using a response surface methodology (RSM) based central composite design (CCD) approach. CCD-predicted values and experimental results showed strong agreement, with an R² value of 0.966. Maximum removal of 92.8% occurred with a CA concentration of 167.6 mM, pH of 4.43, and washing time of 30 min as optimal variable values. A leaching column experiment followed, to examine the efficiency of the optimum conditions established by the CCD model. A comparison of the two soil washing techniques indicated that the removal efficiency of the column experiment (85.8%) closely matched that of the batch experiment (92.8%). The methodology supporting the research experimentation for optimizing Zn removal may be useful in the design of protocols for practical engineering soil decontamination applications.
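
    To make the design structure concrete, a rotatable central composite design for three coded factors (here standing in for CA concentration, pH, and washing time) consists of the 2³ factorial corners, axial points at ±α, and replicated center points. The sketch below generates such a generic design matrix; it is not the authors' exact run list:

    ```python
    import itertools
    import numpy as np

    def central_composite(n_factors=3, n_center=6):
        """Coded design points of a rotatable CCD: factorial corners, axial points, centers."""
        corners = np.array(list(itertools.product([-1.0, 1.0], repeat=n_factors)))
        alpha = (2 ** n_factors) ** 0.25          # rotatability: alpha = (2^k)^(1/4)
        axial = np.zeros((2 * n_factors, n_factors))
        for i in range(n_factors):
            axial[2 * i, i] = -alpha
            axial[2 * i + 1, i] = alpha
        centers = np.zeros((n_center, n_factors))
        return np.vstack([corners, axial, centers])

    design = central_composite()
    print(design.shape)   # (8 + 6 + 6, 3) = (20, 3) coded runs
    # Coded levels would then be mapped to the actual factor ranges, e.g. CA concentration, pH, time.
    ```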

  12. Effects of Guided Written Disclosure Protocol on mood states and psychological symptoms among parents of off-therapy acute lymphoblastic leukemia children.

    PubMed

    Martino, Maria Luisa; Freda, Maria Francesca; Camera, Flavia

    2013-06-01

    This study assesses the effects of Guided Written Disclosure Protocol on psychological distress in mothers and fathers of off-therapy acute lymphoblastic leukemia children. An experimental group participated in the writing intervention with a control group subject only to test-taking standards. The Symptom Questionnaire and Profile of Mood States were administered at baseline, post-intervention, and follow-up. Guided Written Disclosure Protocol had significant effects on the progressive reduction of anxiety, depression, somatic symptoms, hostility, tension-anxiety, and fatigue-inertia within the experimental group. However, the control group distress levels tended to worsen over time. The mediating role of emotional processing was highlighted.

  13. Nonpainful wide-area compression inhibits experimental pain.

    PubMed

    Honigman, Liat; Bar-Bachar, Ofrit; Yarnitsky, David; Sprecher, Elliot; Granovsky, Yelena

    2016-09-01

    Compression therapy, a well-recognized treatment for lymphoedema and venous disorders, pressurizes limbs and generates massive non-noxious afferent sensory barrages. The aim of this study was to determine whether such afferent activity has an analgesic effect when applied to the lower limbs, hypothesizing that larger compression areas induce stronger analgesic effects, and whether this effect correlates with conditioned pain modulation (CPM). Thirty young healthy subjects received painful heat and pressure stimuli (47°C for 30 seconds, forearm; 300 kPa for 15 seconds, wrist) before and during 3 compression protocols covering either SMALL (up to ankles), MEDIUM (up to knees), or LARGE (up to hips) compression areas. Conditioned pain modulation (heat pain conditioned by noxious cold water) was tested before and after each compression protocol. The LARGE protocol induced more analgesia for heat than the SMALL protocol (P < 0.001). The analgesic effect interacted with gender (P = 0.015). The LARGE protocol was more efficient for females, whereas the MEDIUM protocol was more efficient for males. Pressure pain was reduced by all protocols (P < 0.001) with no differences between protocols and no gender effect. Conditioned pain modulation was more efficient than the compression-induced analgesia. For the LARGE protocol, precompression CPM efficiency positively correlated with compression-induced analgesia. Large body area compression exerts an area-dependent analgesic effect on experimental pain stimuli. The observed correlation with pain inhibition in response to robust non-noxious sensory stimulation may suggest that compression therapy shares similar mechanisms with inhibitory pain modulation assessed through CPM.

  14. Assessment of three AC electroosmotic flow protocols for mixing in microfluidic channel.

    PubMed

    Chen, Jia-Kun; Weng, Chi-Neng; Yang, Ruey-Jen

    2009-05-07

    This study performs an experimental investigation into the micromixing capabilities of three different protocols of AC electroosmotic flow (AC EOF), namely capacitive charging (CC), Faradaic charging (FC) and asymmetric polarization (AP). The results reveal that the vortices generated by the FC protocol (at frequencies around 50-350 Hz) are stronger than those induced by the CC protocol (at frequencies higher than 350 Hz), and therefore provide an improved mixing effect. However, in the FC protocol, the frequency of the external AC voltage must be carefully controlled to avoid damaging the electrodes as a result of Faradaic reactions. The experimental results indicate that the AP protocol (applied voltages and frequency of V(1) = 1 V(pp) and V(2) = 20 V(pp)/5 kHz) induces more powerful vortices than either the CC protocol or the FC protocol, and therefore yields a better mixing performance. Two AP-based micromixers are fabricated with symmetric and asymmetric electrode configurations, respectively. The mixing indices achieved by the two devices after an elapsed time of 60 seconds are found to be 56.49% and 71.77%, respectively. This result shows that, of the two devices, the asymmetric electrode configuration represents a more suitable choice for micromixers in microfluidic devices.

  15. Coping-Infused Dialogue through Patient-Preferred Live Music: A Medical Music Therapy Protocol and Randomized Pilot Study for Hospitalized Organ Transplant Patients.

    PubMed

    Hogan, Tyler James; Silverman, Michael J

    2015-01-01

    Solid organ transplant patients often experience a variety of psychosocial stressors that can lead to distress and may hinder successful recovery. Using coping-infused dialogue (CID) through patient- preferred live music (PPLM) music therapy sessions may improve mood and decrease pain while also imparting psychoeducational knowledge concerning the identification of local and global problems and coping skills. The purpose of this pilot study was to develop a coping-based medical music therapy protocol that combines coping-infused dialogue (CID) with patient-preferred live music (PPLM) and measure the effects of the resulting CID-PPLM protocol on mood (positive and negative affect) and pain in hospitalized transplant patients. Our study used a pre-/posttest single-session wait-list control design. Participants (N=25) were randomly assigned to experimental (CID-PPLM) or control (usual care) conditions. Participants in the CID-PPLM condition received a single 30-minute session that integrated stressor identification and knowledge of coping skills (CID) with patient-preferred live music (PPLM). Results indicated no between-group differences at pretest and significant correlations between pre- and posttest measures. Concerning posttest ANCOVA analyses, there were significant between-group differences in positive affect, negative affect, and pain, with experimental participants having more favorable posttest scores than control participants. Effect sizes were in the medium-to-large range for positive affect (η2=.198), negative affect (η2=.422), and pain (η2=.303). CID through receptive PPLM may be an effective protocol for improving mood and decreasing pain in organ transplant recipients. MT interventions can be an important tool to develop rapport and enhance outcomes with patients. As greater engagement during interventions may have stronger treatment effects, we recommend future research examining patient engagement as a potential mediator of intervention effects, as well as the number of sessions required to maximize clinical outcomes. © the American Music Therapy Association 2015. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  16. Serious gaming during multidisciplinary rehabilitation for patients with complex chronic pain or fatigue complaints: study protocol for a controlled trial and process evaluation.

    PubMed

    Vugts, Miel A P; Joosen, Margot C W; Mert, Agali; Zedlitz, Aglaia; Vrijhoef, Hubertus J M

    2017-06-08

    Many individuals suffer from chronic pain or functional somatic syndromes and face barriers to diminishing their functional limitations by means of biopsychosocial interventions. Serious gaming could complement multidisciplinary interventions through enjoyment and independent accessibility. A study protocol is presented for studying whether, how, for which patients and under what circumstances serious gaming improves patient health outcomes during regular multidisciplinary rehabilitation. A mixed-methods design is described that prioritises a two-armed naturalistic quasi-experiment. An experimental group is composed of patients who follow serious gaming during an outpatient multidisciplinary programme at two sites of a Dutch rehabilitation centre. Control group patients follow the same programme without serious gaming at two similar sites. Multivariate mixed-modelling analysis is planned for assessing how much variance in 250 patient records of routinely monitored pain intensity, pain coping and cognition, fatigue and psychopathology outcomes is attributable to serious gaming. Embedded qualitative methods include unobtrusive collection and analyses of stakeholder focus group interviews, participant feedback and semistructured patient interviews. Process analyses are carried out by a systematic approach of mixing qualitative and quantitative methods at various stages of the research. The Ethics Committee of the Tilburg School of Social and Behavioural Sciences approved the research after reviewing the protocol for the protection of patients' interests in conformity with the letter and rationale of the applicable laws and research practice (EC 2016.25t). Findings will be presented in research articles and at international scientific conferences. A prospective research protocol for the naturalistic quasi-experimental outcome evaluation was entered in the Dutch trial register (registration number: NTR6020; Pre-results). © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  17. An example problem illustrating the application of the national lime association mixture design and testing protocol (MDTP) to ascertain engineering properties of lime-treated subgrades for mechanistic pavement design/analysis.

    DOT National Transportation Integrated Search

    2001-09-01

    This document presents an example of mechanistic design and analysis using a mix design and testing protocol. More specifically, it addresses the structural properties of lime-treated subgrade, subbase, and base layers through mechanistic design ...

  18. BioBlocks: Programming Protocols in Biology Made Easier.

    PubMed

    Gupta, Vishal; Irimia, Jesús; Pau, Iván; Rodríguez-Patón, Alfonso

    2017-07-21

    The methods to execute biological experiments are evolving. Affordable fluid handling robots and on-demand biology enterprises are making automating entire experiments a reality. Automation offers the benefit of high-throughput experimentation, rapid prototyping, and improved reproducibility of results. However, learning to automate and codify experiments is a difficult task as it requires programming expertise. Here, we present a web-based visual development environment called BioBlocks for describing experimental protocols in biology. It is based on Google's Blockly and Scratch, and requires little or no experience in computer programming to automate the execution of experiments. The experiments can be specified, saved, modified, and shared between multiple users in an easy manner. BioBlocks is open-source and can be customized to execute protocols on local robotic platforms or remotely, that is, in the cloud. It aims to serve as a de facto open standard for programming protocols in Biology.

  19. DTN routing in body sensor networks with dynamic postural partitioning.

    PubMed

    Quwaider, Muhannad; Biswas, Subir

    2010-11-01

    This paper presents novel store-and-forward packet routing algorithms for Wireless Body Area Networks (WBAN) with frequent postural partitioning. A prototype WBAN has been constructed for experimentally characterizing on-body topology disconnections in the presence of ultra short range radio links, unpredictable RF attenuation, and human postural mobility. On-body DTN routing protocols are then developed using a stochastic link cost formulation, capturing multi-scale topological localities in human postural movements. Performance of the proposed protocols is evaluated experimentally and via simulation, and is compared with a number of existing single-copy DTN routing protocols and an on-body packet flooding mechanism that serves as a performance benchmark with a delay lower bound. It is shown that, via multi-scale modeling of the spatio-temporal locality of on-body link disconnection patterns, the proposed algorithms can provide better routing performance than a number of existing probabilistic, opportunistic, and utility-based DTN routing protocols in the literature.

  20. A Within-subjects Experimental Protocol to Assess the Effects of Social Input on Infant EEG.

    PubMed

    St John, Ashley M; Kao, Katie; Chita-Tegmark, Meia; Liederman, Jacqueline; Grieve, Philip G; Tarullo, Amanda R

    2017-05-03

    Despite the importance of social interactions for infant brain development, little research has assessed functional neural activation while infants socially interact. Electroencephalography (EEG) power is an advantageous technique to assess infant functional neural activation. However, many studies record infant EEG only during one baseline condition. This protocol describes a paradigm that is designed to comprehensively assess infant EEG activity in both social and nonsocial contexts as well as tease apart how different types of social inputs differentially relate to infant EEG. The within-subjects paradigm includes four controlled conditions. In the nonsocial condition, infants view objects on computer screens. The joint attention condition involves an experimenter directing the infant's attention to pictures. The joint attention condition includes three types of social input: language, face-to-face interaction, and the presence of joint attention. Differences in infant EEG between the nonsocial and joint attention conditions could be due to any of these three types of input. Therefore, two additional conditions (one with language input while the experimenter is hidden behind a screen and one with face-to-face interaction) were included to assess the driving contextual factors in patterns of infant neural activation. Representative results demonstrate that infant EEG power varied by condition, both overall and differentially by brain region, supporting the functional nature of infant EEG power. This technique is advantageous in that it includes conditions that are clearly social or nonsocial and allows for examination of how specific types of social input relate to EEG power. This paradigm can be used to assess how individual differences in age, affect, socioeconomic status, and parent-infant interaction quality relate to the development of the social brain. Based on the demonstrated functional nature of infant EEG power, future studies should consider the role of EEG recording context and design conditions that are clearly social or nonsocial.
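    A minimal sketch of how condition-wise EEG power could be computed from segmented recordings is shown below. The sampling rate, frequency band, channel groupings and synthetic data are assumptions made for the example and do not reproduce the paper's analysis pipeline.

```python
# Illustrative sketch: mean 6-9 Hz power per condition and region from
# segmented infant EEG. Sampling rate, band edges and channel groupings are
# assumptions for the example, not the paper's analysis parameters.
import numpy as np
from scipy.signal import welch

FS = 250                                            # assumed sampling rate in Hz
REGIONS = {"frontal": [0, 1], "parietal": [2, 3]}   # channel indices (hypothetical montage)

def band_power(segment, fs=FS, band=(6.0, 9.0)):
    """Mean power in `band` for a 1-D EEG segment using Welch's method."""
    freqs, psd = welch(segment, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def condition_power(epochs_by_condition):
    """epochs_by_condition: dict mapping condition -> array (n_epochs, n_channels, n_samples)."""
    out = {}
    for condition, epochs in epochs_by_condition.items():
        for region, channels in REGIONS.items():
            vals = [band_power(epoch[ch]) for epoch in epochs for ch in channels]
            out[(condition, region)] = float(np.mean(vals))
    return out

# Usage with synthetic data standing in for real recordings.
rng = np.random.default_rng(0)
fake = {c: rng.standard_normal((12, 4, FS * 2)) for c in ["nonsocial", "joint_attention"]}
print(condition_power(fake))
```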

  1. Quantum dense key distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Degiovanni, I.P.; Ruo Berchera, I.; Castelletto, S.

    2004-03-01

    This paper proposes a protocol for quantum dense key distribution. The protocol combines the benefits of quantum dense coding and quantum key distribution and is able to generate shared secret keys four times more efficiently than the Bennett-Brassard 1984 protocol. We prove the security of this scheme against individual eavesdropping attacks and present preliminary experimental results showing its feasibility.

  2. Proof-of-principle experimental realization of a qubit-like qudit-based quantum key distribution scheme

    NASA Astrophysics Data System (ADS)

    Wang, Shuang; Yin, Zhen-Qiang; Chau, H. F.; Chen, Wei; Wang, Chao; Guo, Guang-Can; Han, Zheng-Fu

    2018-04-01

    In comparison to qubit-based protocols, qudit-based quantum key distribution protocols generally allow two cooperative parties to share unconditionally secure keys under higher channel noise. However, it is generally very hard to prepare and measure the required quantum states in qudit-based protocols. One exception is the recently proposed, highly error-tolerant qudit-based protocol known as Chau15 (Chau 2015 Phys. Rev. A 92 062324). Remarkably, the state preparation and measurement in this protocol can be done relatively easily since the required states are phase encoded almost like the diagonal basis states of a qubit. Here we report the first proof-of-principle demonstration of the Chau15 protocol. One highlight of our experiment is that its post-processing is practical and one-way, while the original proposal in Chau (2015 Phys. Rev. A 92 062324) relies on complicated two-way post-processing, which is a great challenge in experiment. In addition, by manipulating time-bin qudits and measuring with a variable-delay interferometer, our realization is extensible to qudits of higher dimensionality and confirms the experimental feasibility of the Chau15 protocol.

  3. Special Plans and Operations: Assessment of Allegations Concerning Traumatic Brain Injury Research Integrity in Iraq

    DTIC Science & Technology

    2011-03-31

    protocols conducted in Iraq. His office had been designated by the ... A research protocol is a formal document detailing the study methodology and the ... Human Research Protections Program plan requires scientific peer review to ensure that research is scientifically sound in its design and methods, and ... of the approved research protocol and IRB minutes, revealed that there was no mention of "active rehabilitation and exercise" under the design

  4. The Xpress Transfer Protocol (XTP): A tutorial (expanded version)

    NASA Technical Reports Server (NTRS)

    Sanders, Robert M.; Weaver, Alfred C.

    1990-01-01

    The Xpress Transfer Protocol (XTP) is a reliable, real-time, lightweight transfer-layer protocol. Current transport layer protocols such as DoD's Transmission Control Protocol (TCP) and ISO's Transport Protocol (TP) were not designed for the next generation of high speed, interconnected reliable networks such as fiber distributed data interface (FDDI) and the gigabit/second wide area networks. Unlike all previous transport layer protocols, XTP is being designed to be implemented in hardware as a VLSI chip set. By streamlining the protocol, combining the transport and network layers and utilizing the increased speed and parallelization possible with a VLSI implementation, XTP will be able to provide the end-to-end data transmission rates demanded in high speed networks without compromising reliability and functionality. This paper describes the operation of the XTP protocol and, in particular, its error, flow and rate control; inter-networking addressing mechanisms; and multicast support features, as defined in the XTP Protocol Definition Revision 3.4.

  5. EASE (Experimental Assembly of Structures in EVA) overview of selected results

    NASA Technical Reports Server (NTRS)

    Akin, David L.

    1987-01-01

    Experimental Assembly of Structures in EVA (EASE) objectives, experimental protocol, neutral buoyancy simulation, task time distribution, assembly task performance, metabolic rate/biomedical readouts are summarized. This presentation is shown in charts, figures, and graphs.

  6. Effects of a peer support programme for youth social services employees experiencing potentially traumatic events: a protocol for a prospective cohort study

    PubMed Central

    Guay, Stephane; Tremblay, Nicole; Goncalves, Jane; Bilodeau, Henriette; Geoffrion, Steve

    2017-01-01

    Introduction The use of peer support programmes to help workers experiencing potentially traumatic events (PTE) has increased in high-risk organisations in recent decades. However, the scientific evidence for their effectiveness is still very limited. This paper aims to describe the protocol of a prospective cohort study that assesses the effects of a peer support programme on the psychological well-being, work functioning and support needs of youth social services employees exposed to a PTE at work. Methods and analysis This is a mixed-methods prospective study that will examine workers' evolution four times over a 12-month period in Canada. This study involves: (1) quantitative data obtained through self-administered questionnaires among 222 workers, and (2) qualitative in-depth interviews with a subsample of 45 workers. This study will compare findings from a cohort that received the support of a peer following a PTE (peer support-experimental protocol) as part of the experimental protocol of the Montreal Youth Social Services-University Institute (MYSS-UI), a second group of workers who did not ask for peer support (no peer support-experimental protocol) but were part of MYSS-UI, and a third group that received standard organisational support from the Monteregie Youth Social Services (MYSS) (standard organisational protocol). Ethics and dissemination The protocol and informed consent form complied with the ethics guidelines of the MYSS-UI. The Research Ethics Board of MYSS-UI and MYSS reviewed and accepted the protocol as required. The results of the study will be published in peer-reviewed journals, presented at research and general public conferences, and disseminated via a public report for the institute that funded the project and for all workers. Results of this study will influence decision making regarding intervention policies following PTE, and peer support interventions may be expanded throughout youth social services in Canada and worldwide. PMID:28647721

  7. Mathematical model formulation and validation of water and solute transport in whole hamster pancreatic islets.

    PubMed

    Benson, James D; Benson, Charles T; Critser, John K

    2014-08-01

    Optimization of cryopreservation protocols for cells and tissues requires accurate models of heat and mass transport. Model selection often depends on the configuration of the tissue. Here, a mathematical and conceptual model of water and solute transport for whole hamster pancreatic islets has been developed and experimentally validated incorporating fundamental biophysical data from previous studies on individual hamster islet cells while retaining whole-islet structural information. It describes coupled transport of water and solutes through the islet by three methods: intracellularly, intercellularly, and in combination. In particular we use domain decomposition techniques to couple a transmembrane flux model with an interstitial mass transfer model. The only significant undetermined variable is the cellular surface area which is in contact with the intercellularly transported solutes, A_is. The model was validated and A_is determined using a 3×3 factorial experimental design blocked for experimental day. Whole islet physical experiments were compared with model predictions at three temperatures, three perfusing solutions, and three islet size groups. A mean of 4.4 islets were compared at each of the 27 experimental conditions and found to correlate with a coefficient of determination of 0.87±0.06 (mean ± SD). Only the treatment variable of perfusing solution was found to be significant (p<0.05). We have devised a model that retains much of the intrinsic geometric configuration of the system, and thus fewer laboratory experiments are needed to determine model parameters and thus to develop new optimized cryopreservation protocols. Additionally, extensions to ovarian follicles and other concentric tissue structures may be made. Copyright © 2014 Elsevier Inc. All rights reserved.
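    For readers unfamiliar with the underlying transmembrane formalism, the sketch below integrates a classic two-parameter water/solute transport model for a single compartment exposed to a permeating solute. All parameter values are placeholders, and the sketch omits the interstitial pathway and domain-decomposition coupling that the whole-islet model adds.

```python
# Minimal sketch of two-parameter cell water/solute transport during exposure
# to a permeating cryoprotectant (CPA). Parameter values are placeholders; the
# whole-islet model in the paper additionally couples an interstitial pathway.
from scipy.integrate import solve_ivp

Lp = 0.2                      # hydraulic conductivity (um/min/atm), placeholder
Ps = 1.0                      # solute permeability (um/min), placeholder
A = 1000.0                    # membrane surface area (um^2), placeholder
RT = 0.08205 * 295            # gas constant * temperature (L atm / mol)

M_ext_salt, M_ext_cpa = 0.15, 1.5      # external osmolalities (mol/L), step change at t=0
V_w0 = 6000.0                          # initial cell water volume (um^3), placeholder
n_salt = 0.15 * V_w0 * 1e-15           # impermeant intracellular solute (mol), isotonic start

def rhs(t, y):
    Vw, ns = y                          # water volume (um^3), intracellular CPA (mol)
    M_in_salt = n_salt / (Vw * 1e-15)
    M_in_cpa = ns / (Vw * 1e-15)
    # Water follows the osmotic pressure difference; CPA follows its gradient.
    dVw = -Lp * A * RT * ((M_ext_salt + M_ext_cpa) - (M_in_salt + M_in_cpa))
    dns = Ps * A * 1e-15 * (M_ext_cpa - M_in_cpa)
    return [dVw, dns]

sol = solve_ivp(rhs, (0.0, 10.0), [V_w0, 0.0], max_step=0.01)
print("final water volume (um^3):", round(sol.y[0, -1], 1))
```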

  8. A Protocol for Evaluating Contextual Design Principles

    PubMed Central

    Stamps, Arthur

    2014-01-01

    This paper explains how scientific data can be incorporated into urban design decisions, such as evaluating contextual design principles. The recommended protocols are based on the Cochrane Reviews that have been widely used in medical research. The major concepts of a Cochrane Review are explained, as well as the underlying mathematics. The underlying math is meta-analysis. Data are reported for three applications and seven contextual design policies. It is suggested that use of the Cochrane protocols will be of great assistance to planners by providing scientific data that can be used to evaluate the efficacies of contextual design policies prior to implementing those policies. PMID:25431448
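    The pooling arithmetic at the core of a Cochrane-style review can be illustrated with a fixed-effect, inverse-variance sketch; the effect sizes and standard errors below are invented for illustration and are not data from the paper.

```python
# Fixed-effect inverse-variance pooling, the arithmetic at the core of a
# Cochrane-style meta-analysis. Effect sizes below are invented for
# illustration and are not the paper's data.
import math

# (effect size, standard error) per study, e.g. standardized preference differences.
studies = [(0.42, 0.15), (0.31, 0.20), (0.55, 0.12)]

weights = [1.0 / se ** 2 for _, se in studies]
pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

print(f"pooled effect = {pooled:.3f}")
print(f"95% CI = [{pooled - 1.96 * pooled_se:.3f}, {pooled + 1.96 * pooled_se:.3f}]")
```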

  9. Virtual laboratories: new opportunities for collaborative water science

    NASA Astrophysics Data System (ADS)

    Ceola, Serena; Arheimer, Berit; Bloeschl, Guenter; Baratti, Emanuele; Capell, Rene; Castellarin, Attilio; Freer, Jim; Han, Dawei; Hrachowitz, Markus; Hundecha, Yeshewatesfa; Hutton, Christopher; Lindström, Goran; Montanari, Alberto; Nijzink, Remko; Parajka, Juraj; Toth, Elena; Viglione, Alberto; Wagener, Thorsten

    2015-04-01

    Reproducibility and repeatability of experiments are the fundamental prerequisites that allow researchers to validate results and share hydrological knowledge, experience and expertise in the light of global water management problems. Virtual laboratories offer new opportunities to enable these prerequisites since they allow experimenters to share data, tools and pre-defined experimental procedures (i.e. protocols). Here we present the outcomes of a first collaborative numerical experiment undertaken by five different international research groups in a virtual laboratory to address the key issues of reproducibility and repeatability. Moving from the definition of accurate and detailed experimental protocols, a rainfall-runoff model was independently applied to 15 European catchments by the research groups and model results were collectively examined through a web-based discussion. We found that a detailed modelling protocol was crucial to ensure the comparability and reproducibility of the proposed experiment across groups. Our results suggest that sharing comprehensive and precise protocols and running the experiments within a controlled environment (e.g. virtual laboratory) is as fundamental as sharing data and tools for ensuring experiment repeatability and reproducibility across the broad scientific community and thus advancing hydrology in a more coherent way.

  10. Effectiveness of the implementation of an evidence-based nursing model using participatory action research in oncohematology: research protocol.

    PubMed

    Abad-Corpa, Eva; Meseguer-Liza, Cristobal; Martínez-Corbalán, José Tomás; Zárate-Riscal, Lourdes; Caravaca-Hernández, Amor; Paredes-Sidrach de Cardona, Antonio; Carrillo-Alcaraz, Andrés; Delgado-Hito, Pilar; Cabrero-García, Julio

    2010-08-01

    To generate changes in nursing practice by introducing an evidence-based clinical practice (EBCP) model through a participatory process, and to evaluate the effectiveness of the changes in terms of nurse-sensitive outcomes (NSO). For international nursing science, it is necessary to explore the reasons for supporting EBCP and to evaluate its real repercussions and effectiveness. A mixed-methods study with a sequential transformative design will be conducted in the bone marrow transplant unit of a tertiary-level Spanish hospital, in two time periods of >12 months (date of approval of the protocol: 2006). To evaluate the effectiveness of the intervention, we will use a prospective quasi-experimental design with two non-equivalent and non-concurrent groups. NSO and patient health data will be collected: (a) impact on psycho-social adjustment; (b) patient satisfaction; (c) symptom control; (d) adverse effects. All patients admitted during the study period will be included, as will all staff working on the unit during the participatory action research (PAR). The PAR design will be adopted from a constructivist paradigm perspective, following Checkland's "Soft Systems" theoretical model. Qualitative techniques will be used: 2-hour group meetings with nursing professionals, to be recorded and transcribed. Field diaries (participants and researchers) will be drawn up and data analysis will be carried out by content analysis. PAR is a rigorous research method for introducing changes into practice to improve NSO.

  11. 3D printing of versatile reactionware for chemical synthesis.

    PubMed

    Kitson, Philip J; Glatzel, Stefan; Chen, Wei; Lin, Chang-Gen; Song, Yu-Fei; Cronin, Leroy

    2016-05-01

    In recent decades, 3D printing (also known as additive manufacturing) techniques have moved beyond their traditional applications in the fields of industrial manufacturing and prototyping to increasingly find roles in scientific research contexts, such as synthetic chemistry. We present a general approach for the production of bespoke chemical reactors, termed reactionware, using two different approaches to extrusion-based 3D printing. This protocol describes the printing of an inert polypropylene (PP) architecture with the concurrent printing of soft material catalyst composites, using two different 3D printer setups. The steps of the PROCEDURE describe the design and preparation of a 3D digital model of the desired reactionware device and the preparation of this model for use with fused deposition modeling (FDM) type 3D printers. The protocol then further describes the preparation of composite catalyst-silicone materials for incorporation into the 3D-printed device and the steps required to fabricate a reactionware device. This combined approach allows versatility in the design and use of reactionware based on the specific needs of the experimental user. To illustrate this, we present a detailed procedure for the production of one such reactionware device that will result in the production of a sealed reactor capable of effecting a multistep organic synthesis. Depending on the design time of the 3D model, and including time for curing and drying of materials, this procedure can be completed in ∼3 d.

  12. Generation of concatenated Greenberger-Horne-Zeilinger-type entangled coherent state based on linear optics

    NASA Astrophysics Data System (ADS)

    Guo, Rui; Zhou, Lan; Gu, Shi-Pu; Wang, Xing-Fu; Sheng, Yu-Bo

    2017-03-01

    The concatenated Greenberger-Horne-Zeilinger (C-GHZ) state is a new type of multipartite entangled state with potential applications in future quantum information processing. In this paper, we propose a protocol for approximately constructing arbitrary C-GHZ entangled states. Differently from previous protocols, each logic qubit is encoded in a coherent state. The protocol is based on linear optics and is feasible with current experimental technology. It may be useful for quantum information processing based on the C-GHZ state.

  13. Translational Neuromodulation: Approximating Human Transcranial Magnetic Stimulation Protocols In Rats

    PubMed Central

    Vahabzadeh-Hagh, Andrew M.; Muller, Paul A.; Gersner, Roman; Zangen, Abraham; Rotenberg, Alexander

    2015-01-01

    Objective Transcranial magnetic stimulation (TMS) is a well-established clinical protocol with numerous potential therapeutic and diagnostic applications. Yet, much work remains in the elucidation of TMS mechanisms, optimization of protocols, and in development of novel therapeutic applications. As with many technologies, the key to these issues lies in the proper experimentation and translation of TMS methods to animal models, among which rat models have proven popular. A significant increase in the number of rat TMS publications has necessitated analysis of their relevance to human work. We therefore review the essential principles necessary for the approximation of human TMS protocols in rats as well as specific methods that addressed these issues in published studies. Materials and Methods We performed an English language literature search combined with our own experience and data. We address issues that we see as important in the translation of human TMS methods to rat models and provide a summary of key accomplishments in these areas. Results An extensive literature review illustrated the growth of rodent TMS studies in recent years. Current advances in the translation of single, paired-pulse, and repetitive stimulation paradigms to rodent models are presented. The importance of TMS in the generation of data for preclinical trials is also highlighted. Conclusions Rat TMS has several limitations when considering parallels between animal and human stimulation. However, it has proven to be a useful tool in the field of translational brain stimulation and will likely continue to aid in the design and implementation of stimulation protocols for therapeutic and diagnostic applications. PMID:22780329

  14. Real-Time QoS Routing Protocols in Wireless Multimedia Sensor Networks: Study and Analysis.

    PubMed

    Alanazi, Adwan; Elleithy, Khaled

    2015-09-02

    Many routing protocols have been proposed for wireless sensor networks. These routing protocols are almost always based on energy efficiency. However, recent advances in complementary metal-oxide semiconductor (CMOS) cameras and small microphones have led to the development of Wireless Multimedia Sensor Networks (WMSN) as a class of wireless sensor networks which pose additional challenges. The transmission of imaging and video data needs routing protocols with both energy efficiency and Quality of Service (QoS) characteristics in order to guarantee the efficient use of the sensor nodes and effective access to the collected data. Also, with the integration of real-time applications in Wireless Sensor Networks (WSNs), the use of QoS routing protocols is not only becoming a significant topic, but is also gaining the attention of researchers. In designing an efficient QoS routing protocol, reliability and guaranteed end-to-end delay are critical requirements, while energy must still be conserved. Thus, considerable research has been focused on designing energy-efficient and robust QoS routing protocols. In this paper, we present a state-of-the-art survey of real-time QoS routing protocols that have already been proposed for WMSNs. This paper categorizes the real-time QoS routing protocols into probabilistic and deterministic protocols. In addition, both categories are classified into soft and hard real-time protocols by highlighting the QoS issues including the limitations and features of each protocol. Furthermore, we have compared the performance of mobility-aware, query-based real-time QoS routing protocols from each category using Network Simulator-2 (NS2). This paper also focuses on the design challenges and future research directions as well as highlights the characteristics of each QoS routing protocol.

  15. Real-Time QoS Routing Protocols in Wireless Multimedia Sensor Networks: Study and Analysis

    PubMed Central

    Alanazi, Adwan; Elleithy, Khaled

    2015-01-01

    Many routing protocols have been proposed for wireless sensor networks. These routing protocols are almost always based on energy efficiency. However, recent advances in complementary metal-oxide semiconductor (CMOS) cameras and small microphones have led to the development of Wireless Multimedia Sensor Networks (WMSN) as a class of wireless sensor networks which pose additional challenges. The transmission of imaging and video data needs routing protocols with both energy efficiency and Quality of Service (QoS) characteristics in order to guarantee the efficient use of the sensor nodes and effective access to the collected data. Also, with the integration of real-time applications in Wireless Sensor Networks (WSNs), the use of QoS routing protocols is not only becoming a significant topic, but is also gaining the attention of researchers. In designing an efficient QoS routing protocol, reliability and guaranteed end-to-end delay are critical requirements, while energy must still be conserved. Thus, considerable research has been focused on designing energy-efficient and robust QoS routing protocols. In this paper, we present a state-of-the-art survey of real-time QoS routing protocols that have already been proposed for WMSNs. This paper categorizes the real-time QoS routing protocols into probabilistic and deterministic protocols. In addition, both categories are classified into soft and hard real-time protocols by highlighting the QoS issues including the limitations and features of each protocol. Furthermore, we have compared the performance of mobility-aware, query-based real-time QoS routing protocols from each category using Network Simulator-2 (NS2). This paper also focuses on the design challenges and future research directions as well as highlights the characteristics of each QoS routing protocol. PMID:26364639

  16. A model-guided symbolic execution approach for network protocol implementations and vulnerability detection.

    PubMed

    Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing

    2017-01-01

    Formal techniques have been devoted to analyzing whether network protocol specifications violate security policies; however, these methods cannot detect vulnerabilities in the implementations of the network protocols themselves. Symbolic execution can be used to analyze the paths of the network protocol implementations, but for stateful network protocols, it is difficult to reach the deep states of the protocol. This paper proposes a novel model-guided approach to detect vulnerabilities in network protocol implementations. Our method first abstracts a finite state machine (FSM) model, then utilizes the model to guide the symbolic execution. This approach achieves high coverage of both the code and the protocol states. The proposed method is implemented and applied to test numerous real-world network protocol implementations. The experimental results indicate that the proposed method is more effective than traditional fuzzing methods such as SPIKE at detecting vulnerabilities in the deep states of network protocol implementations.
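    The idea of letting an abstracted FSM steer exploration toward deep protocol states can be conveyed with a toy sketch. The state machine, message alphabet and the stand-in implementation below are invented; a real setup would drive a symbolic execution engine rather than a concrete parser.

```python
# Toy sketch of model-guided exploration: an abstracted protocol FSM supplies
# message sequences that reach deep states, and only those sequences are handed
# to the (here, concrete) implementation under test. The FSM, messages and the
# stand-in implementation are invented for illustration.
from itertools import product

FSM = {
    ("INIT", "HELLO"): "NEGOTIATING",
    ("NEGOTIATING", "KEYEX"): "AUTHENTICATED",
    ("AUTHENTICATED", "DATA"): "AUTHENTICATED",
    ("AUTHENTICATED", "BYE"): "CLOSED",
}
ALPHABET = ["HELLO", "KEYEX", "DATA", "BYE"]

def fsm_accepts(seq, start="INIT"):
    state = start
    for msg in seq:
        state = FSM.get((state, msg))
        if state is None:
            return None
    return state

def implementation_under_test(seq):
    # Stand-in for a real protocol implementation; fails in a deep state.
    if list(seq[:3]) == ["HELLO", "KEYEX", "DATA"]:
        raise RuntimeError("buffer overflow in DATA handler")

# Only model-valid sequences are explored, pruning the input space and
# concentrating effort on deep protocol states.
for seq in product(ALPHABET, repeat=3):
    if fsm_accepts(seq) is None:
        continue
    try:
        implementation_under_test(seq)
    except RuntimeError as err:
        print("vulnerability trigger:", seq, "->", err)
```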

  17. Finite-key analysis for the 1-decoy state QKD protocol

    NASA Astrophysics Data System (ADS)

    Rusca, Davide; Boaron, Alberto; Grünenfelder, Fadri; Martin, Anthony; Zbinden, Hugo

    2018-04-01

    It has been shown that in the asymptotic case of infinite key length, the 2-decoy state Quantum Key Distribution (QKD) protocol outperforms the 1-decoy state protocol. Here, we present a finite-key analysis of the 1-decoy method. Interestingly, we find that for practical block sizes of up to 10^8 bits, the 1-decoy protocol achieves higher secret key rates than the 2-decoy protocol for almost all experimental settings. Since using only one decoy is also easier to implement, we conclude that it is the best choice for QKD in most common practical scenarios.

  18. Experimental Determination of Demand Response Control Models and Cost of Control for Ensembles of Window-Mount Air Conditioners

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geller, Drew Adam; Backhaus, Scott N.

    Control of consumer electrical devices for providing electrical grid services is expanding in both the scope and the diversity of loads that are engaged in control, but there are few experimentally-based models of these devices suitable for control designs and for assessing the cost of control. A laboratory-scale test system is developed to experimentally evaluate the use of a simple window-mount air conditioner for electrical grid regulation services. The experimental test bed is a single, isolated air conditioner embedded in a test system that both emulates the thermodynamics of an air conditioned room and also isolates the air conditioner from the real-world external environmental and human variables that perturb the careful measurements required to capture a model that fully characterizes both the control response functions and the cost of control. The control response functions and cost of control are measured using harmonic perturbation of the temperature set point and a test protocol that further isolates the air conditioner from low frequency environmental variability.
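    The measurement idea, perturbing the thermostat set point harmonically and recording the resulting on/off power, can be sketched with a toy first-order room model. Every parameter below is invented and the model is far simpler than the laboratory test bed described.

```python
# Toy sketch of the measurement idea: perturb a thermostat set point with a
# small sinusoid and record the on/off power of a simulated air conditioner.
# The first-order room model and every parameter below are invented.
import math

C = 2.0e5       # room thermal capacitance (J/K)
UA = 80.0       # envelope conductance to outdoors (W/K)
Q_AC = -3000.0  # cooling power when the compressor is on (W)
T_OUT = 32.0    # outdoor temperature (C)
DEADBAND = 0.5  # thermostat hysteresis half-width (K)

def simulate(setpoint0=24.0, amp=0.3, period=600.0, dt=1.0, duration=3600.0):
    T, on, power_trace = 26.0, False, []
    for k in range(int(duration / dt)):
        t = k * dt
        setpoint = setpoint0 + amp * math.sin(2 * math.pi * t / period)
        if T > setpoint + DEADBAND:
            on = True
        elif T < setpoint - DEADBAND:
            on = False
        q = UA * (T_OUT - T) + (Q_AC if on else 0.0)   # net heat flow into the room
        T += dt * q / C
        power_trace.append(abs(Q_AC) if on else 0.0)   # crude electrical demand proxy
    return power_trace

trace = simulate()
print("mean demand proxy (W):", sum(trace) / len(trace))
```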

  19. Effectiveness of music therapy as an aid to neurorestoration of children with severe neurological disorders.

    PubMed

    Bringas, Maria L; Zaldivar, Marilyn; Rojas, Pedro A; Martinez-Montes, Karelia; Chongo, Dora M; Ortega, Maria A; Galvizu, Reynaldo; Perez, Alba E; Morales, Lilia M; Maragoto, Carlos; Vera, Hector; Galan, Lidice; Besson, Mireille; Valdes-Sosa, Pedro A

    2015-01-01

    This study was a two-armed parallel group design aimed at testing real world effectiveness of a music therapy (MT) intervention for children with severe neurological disorders. The control group received only the standard neurorestoration program and the experimental group received an additional MT "Auditory Attention plus Communication protocol" just before the usual occupational and speech therapy. Multivariate Item Response Theory (MIRT) identified a neuropsychological status-latent variable manifested in all children and which exhibited highly significant changes only in the experimental group. Changes in brain plasticity also occurred in the experimental group, as evidenced using a Mismatch Event Related paradigm which revealed significant post intervention positive responses in the latency range between 308 and 400 ms in frontal regions. LORETA EEG source analysis identified prefrontal and midcingulate regions as differentially activated by the MT in the experimental group. Taken together, our results showing improved attention and communication as well as changes in brain plasticity in children with severe neurological impairments, confirm the importance of MT for the rehabilitation of patients across a wide range of dysfunctions.

  20. Mindfulness-based intervention for teenagers with cancer: study protocol for a randomized controlled trial

    PubMed Central

    2013-01-01

    Background Individuals living with cancer must learn to face not only the physical symptoms of their condition, but also the anxiety and uncertainty related to the progression of the disease, the anticipation of physical and emotional pain related to illness and treatment, the significant changes implied in living with cancer, as well as the fear of recurrence after remission. Mindfulness-based meditation constitutes a promising option to alleviate these manifestations. Methods/Design This article presents the rationale and protocol development for a research project aimed at evaluating the effects of a mindfulness-based meditation intervention on quality of life, sleep, and mood in adolescents with cancer compared to a control group. A prospective, longitudinal, experimental design involving three time points (baseline, post-intervention, and follow-up) and two groups (experimental and control) was developed for this project. Participants will be assigned randomly to either group. Eligible participants are adolescents aged 11 to 18 years with a diagnosis of cancer, with no specific selection/exclusion based on type, stage, or trajectory of cancer. A final sample size of 28 participants is targeted. Adolescents in the experimental group will complete the mindfulness meditation intervention, taught by two trained therapists. The intervention will comprise eight weekly sessions, lasting 90 min each. Once the follow-up assessment is completed by the experimental group, wait-list controls will be offered to complete the mindfulness-based program. Intra-group analyses will serve to evaluate the impact of the mindfulness-based meditation intervention on quality of life, sleep, and mood pre-post intervention, as well as at follow-up. Analyses will also be used to carry out inter-group comparisons between the experimental group and the wait-list controls. Voluntary participation, risk of attrition, and the small sample size are potential limitations of this project. Despite these possible limitations, this project will be one of very few aimed at improving quality of life, sleep, and mood in adolescents living with cancer; it will evaluate the potential benefits of such a practice for both the psychological and physical health of youth with cancer and help in creating mindfulness-based intervention programs that provide the necessary psychological help to adolescents living with cancer. Trial registration Trial registration number: NCT01783418 PMID:23663534

  1. Device USB interface and software development for electric parameter measuring instrument

    NASA Astrophysics Data System (ADS)

    Li, Deshi; Chen, Jian; Wu, Yadong

    2003-09-01

    This paper discusses USB interface and software development for general-purpose devices. Taking the PDIUSBD12, which supports a parallel interface, as an example, its technical characteristics are analysed. Interface circuits with an 80C52 single-chip microcomputer and with a TMS320C54-series digital signal processor are designed, and the address allocation and register access are analysed. In accordance with the USB 1.1 standard protocol, the device software and an application-layer protocol are designed. The paper also designs the data exchange protocol and implements the system functions.

  2. Resource use and costs of type 2 diabetes patients receiving managed or protocolized primary care: a controlled clinical trial.

    PubMed

    van der Heijden, Amber A W A; de Bruijne, Martine C; Feenstra, Talitha L; Dekker, Jacqueline M; Baan, Caroline A; Bosmans, Judith E; Bot, Sandra D M; Donker, Gé A; Nijpels, Giel

    2014-06-25

    The increasing prevalence of diabetes is associated with increased health care use and costs. Innovations to improve the quality of care, manage the increasing demand for health care and control the growth of health care costs are needed. The aim of this study is to evaluate the care process and costs of managed, protocolized and usual care for type 2 diabetes patients from a societal perspective. In two distinct regions of the Netherlands, both managed and protocolized diabetes care were implemented. Managed care was characterized by centralized organization, coordination, responsibility and centralized annual assessment. Protocolized care had a partly centralized organizational structure. Usual care was characterized by a decentralized organizational structure. Using a quasi-experimental control group pretest-posttest design, the care process (guideline adherence) and costs were compared between managed (n = 253), protocolized (n = 197), and usual care (n = 333). We made a distinction between direct health care costs, direct non-health care costs and indirect costs. Multivariate regression models were used to estimate differences in costs adjusted for confounding factors. Because of the skewed distribution of the costs, bootstrapping methods (5000 replications) with a bias-corrected and accelerated approach were used to estimate 95% confidence intervals (CI) around the differences in costs. Compared to usual and protocolized care, in managed care more patients were treated according to diabetes guidelines. Secondary health care use was higher in patients under usual care compared to managed and protocolized care. Compared to usual care, direct costs were significantly lower in managed care (-€1,181 (95% CI: -2,597 to -334)), while indirect costs were higher (€758 (95% CI: -353 to 2,701)), although not significantly. Direct, indirect and total costs were lower in protocolized care compared to usual care (though not significantly). Compared to usual care, managed care was significantly associated with better process in terms of diabetes care, fewer secondary care consultations and lower health care costs. The same trends were seen for protocolized care; however, they were not statistically significant. Current Controlled trials: ISRCTN66124817.

  3. Resource use and costs of type 2 diabetes patients receiving managed or protocolized primary care: a controlled clinical trial

    PubMed Central

    2014-01-01

    Background The increasing prevalence of diabetes is associated with increased health care use and costs. Innovations to improve the quality of care, manage the increasing demand for health care and control the growth of health care costs are needed. The aim of this study is to evaluate the care process and costs of managed, protocolized and usual care for type 2 diabetes patients from a societal perspective. Methods In two distinct regions of the Netherlands, both managed and protocolized diabetes care were implemented. Managed care was characterized by centralized organization, coordination, responsibility and centralized annual assessment. Protocolized care had a partly centralized organizational structure. Usual care was characterized by a decentralized organizational structure. Using a quasi-experimental control group pretest-posttest design, the care process (guideline adherence) and costs were compared between managed (n = 253), protocolized (n = 197), and usual care (n = 333). We made a distinction between direct health care costs, direct non-health care costs and indirect costs. Multivariate regression models were used to estimate differences in costs adjusted for confounding factors. Because of the skewed distribution of the costs, bootstrapping methods (5000 replications) with a bias-corrected and accelerated approach were used to estimate 95% confidence intervals (CI) around the differences in costs. Results Compared to usual and protocolized care, in managed care more patients were treated according to diabetes guidelines. Secondary health care use was higher in patients under usual care compared to managed and protocolized care. Compared to usual care, direct costs were significantly lower in managed care (-€1,181 (95% CI: -2,597 to -334)), while indirect costs were higher (€758 (95% CI: -353 to 2,701)), although not significantly. Direct, indirect and total costs were lower in protocolized care compared to usual care (though not significantly). Conclusions Compared to usual care, managed care was significantly associated with better process in terms of diabetes care, fewer secondary care consultations and lower health care costs. The same trends were seen for protocolized care; however, they were not statistically significant. Trial registration Current Controlled trials: ISRCTN66124817. PMID:24966055

  4. Achieving High Throughput for Data Transfer over ATM Networks

    NASA Technical Reports Server (NTRS)

    Johnson, Marjory J.; Townsend, Jeffrey N.

    1996-01-01

    File-transfer rates for ftp are often reported to be relatively slow, compared to the raw bandwidth available in emerging gigabit networks. While a major bottleneck is disk I/O, protocol issues impact performance as well. Ftp was developed and optimized for use over the TCP/IP protocol stack of the Internet. However, TCP has been shown to run inefficiently over ATM. In an effort to maximize network throughput, data-transfer protocols can be developed to run over UDP or directly over IP, rather than over TCP. If error-free transmission is required, techniques for achieving reliable transmission can be included as part of the transfer protocol. However, selected image-processing applications can tolerate a low level of errors in images that are transmitted over a network. In this paper we report on experimental work to develop a high-throughput protocol for unreliable data transfer over ATM networks. We attempt to maximize throughput by keeping the communications pipe full, but still keep packet loss under five percent. We use the Bay Area Gigabit Network Testbed as our experimental platform.

  5. Design, construction and testing of a DC bioeffects test enclosure for small animals. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frazier, M J; Preache, M M

    1980-11-01

    This final report describes both the engineering development of a DC bioeffects test enclosure for small laboratory animals, and the biological protocol for the use of such enclosures in the testing of animals to determine possible biological effects of the environment associated with HVDC transmission lines. The test enclosure which has been designed is a modular unit, which will house up to eight rat-sized animals in individual compartments. Multiple test enclosures can be used to test larger numbers of animals. A prototype test enclosure has been fabricated and tested to characterize its electrical performance characteristics. The test enclosure provides a simulation of the dominant environment associated with HVDC transmission lines; namely, a static electric field and an ion current density. A biological experimental design has been developed for assessing the effects of the dominant components of the HVDC transmission line environment.

  6. Design and Development of Layered Security: Future Enhancements and Directions in Transmission

    PubMed Central

    Shahzad, Aamir; Lee, Malrey; Kim, Suntae; Kim, Kangmin; Choi, Jae-Young; Cho, Younghwa; Lee, Keun-Kwang

    2016-01-01

    Today, security is a prominent issue when any type of communication is being undertaken. Like traditional networks, supervisory control and data acquisition (SCADA) systems suffer from a number of vulnerabilities. Numerous end-to-end security mechanisms have been proposed for the resolution of SCADA-system security issues, but due to insecure real-time protocol use and the reliance upon open protocols during Internet-based communication, these SCADA systems can still be compromised by security challenges. This study reviews the security challenges and issues that are commonly raised during SCADA/protocol transmissions and proposes a secure distributed-network protocol version 3 (DNP3) design, and the implementation of the security solution using a cryptography mechanism. Due to the insecurities found within SCADA protocols, the new development consists of a DNP3 protocol that has been designed as a part of the SCADA system, and the cryptographically derived security is deployed within the application layer as a part of the DNP3 stack. PMID:26751443

  7. Design and Development of Layered Security: Future Enhancements and Directions in Transmission.

    PubMed

    Shahzad, Aamir; Lee, Malrey; Kim, Suntae; Kim, Kangmin; Choi, Jae-Young; Cho, Younghwa; Lee, Keun-Kwang

    2016-01-06

    Today, security is a prominent issue when any type of communication is being undertaken. Like traditional networks, supervisory control and data acquisition (SCADA) systems suffer from a number of vulnerabilities. Numerous end-to-end security mechanisms have been proposed for the resolution of SCADA-system security issues, but due to insecure real-time protocol use and the reliance upon open protocols during Internet-based communication, these SCADA systems can still be compromised by security challenges. This study reviews the security challenges and issues that are commonly raised during SCADA/protocol transmissions and proposes a secure distributed-network protocol version 3 (DNP3) design, and the implementation of the security solution using a cryptography mechanism. Due to the insecurities found within SCADA protocols, the new development consists of a DNP3 protocol that has been designed as a part of the SCADA system, and the cryptographically derived security is deployed within the application layer as a part of the DNP3 stack.
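    The general idea of adding cryptographic protection at the application layer of a SCADA-style stack can be sketched with a simple HMAC wrapper. The frame layout below is invented; it is neither the paper's DNP3 design nor the standardized DNP3 Secure Authentication mechanism.

```python
# Generic sketch of application-layer message authentication for a SCADA-style
# protocol frame. It illustrates placing cryptography above the transport
# function; the frame layout is invented and this is not the paper's DNP3
# design nor IEEE 1815 Secure Authentication.
import hashlib
import hmac
import os
import struct

SHARED_KEY = os.urandom(32)   # in practice provisioned out of band

def wrap(app_payload: bytes, seq: int) -> bytes:
    header = struct.pack("!HB", seq & 0xFFFF, len(app_payload) & 0xFF)
    tag = hmac.new(SHARED_KEY, header + app_payload, hashlib.sha256).digest()
    return header + app_payload + tag

def unwrap(frame: bytes) -> bytes:
    header, rest = frame[:3], frame[3:]
    _seq, length = struct.unpack("!HB", header)
    payload, tag = rest[:length], rest[length:]
    expected = hmac.new(SHARED_KEY, header + payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("authentication failure: frame rejected")
    return payload

frame = wrap(b"READ_ANALOG_INPUTS", seq=1)
print(unwrap(frame))
```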

  8. [New integrated care model for older people admitted to Intermediate Care Units in Catalonia: A quasi-experimental study protocol].

    PubMed

    Santaeugènia, Sebastià J; García-Lázaro, Manuela; Alventosa, Ana María; Gutiérrez-Benito, Alícia; Monterde, Albert; Cunill, Joan

    To evaluate the clinical effectiveness of an intermediate care model based on a system of care focused on integrated care pathways compared to the traditional model of geriatric care (usual care) in Catalonia. The design is a quasi-experimental pre-post non-randomised study with a non-synchronous control group. The intervention consists of the development and implementation of integrated care pathways and the creation of specialised interdisciplinary teams in each of the processes. The two groups will be compared on demographic and clinical variables on admission and discharge, geriatric syndromes, and use of resources. This quasi-experimental study aims to assess the clinical impact of the transformation of a traditional model of geriatric care to an intermediate care model in an integrated healthcare organisation. It is believed that the results of this study may be useful for future randomised controlled studies. Copyright © 2016 SEGG. Published by Elsevier España, S.L.U. All rights reserved.

  9. Experimental Rectification of Entropy Production by Maxwell's Demon in a Quantum System

    NASA Astrophysics Data System (ADS)

    Camati, Patrice A.; Peterson, John P. S.; Batalhão, Tiago B.; Micadei, Kaonan; Souza, Alexandre M.; Sarthour, Roberto S.; Oliveira, Ivan S.; Serra, Roberto M.

    2016-12-01

    Maxwell's demon explores the role of information in physical processes. Employing information about microscopic degrees of freedom, this "intelligent observer" is capable of compensating entropy production (or extracting work), apparently challenging the second law of thermodynamics. In a modern standpoint, it is regarded as a feedback control mechanism and the limits of thermodynamics are recast incorporating information-to-energy conversion. We derive a trade-off relation between information-theoretic quantities empowering the design of an efficient Maxwell's demon in a quantum system. The demon is experimentally implemented as a spin-1/2 quantum memory that acquires information, and employs it to control the dynamics of another spin-1/2 system, through a natural interaction. Noise and imperfections in this protocol are investigated by the assessment of its effectiveness. This realization provides experimental evidence that the irreversibility in a nonequilibrium dynamics can be mitigated by assessing microscopic information and applying a feed-forward strategy at the quantum scale.

  10. Experimental Rectification of Entropy Production by Maxwell's Demon in a Quantum System.

    PubMed

    Camati, Patrice A; Peterson, John P S; Batalhão, Tiago B; Micadei, Kaonan; Souza, Alexandre M; Sarthour, Roberto S; Oliveira, Ivan S; Serra, Roberto M

    2016-12-09

    Maxwell's demon explores the role of information in physical processes. Employing information about microscopic degrees of freedom, this "intelligent observer" is capable of compensating entropy production (or extracting work), apparently challenging the second law of thermodynamics. In a modern standpoint, it is regarded as a feedback control mechanism and the limits of thermodynamics are recast incorporating information-to-energy conversion. We derive a trade-off relation between information-theoretic quantities empowering the design of an efficient Maxwell's demon in a quantum system. The demon is experimentally implemented as a spin-1/2 quantum memory that acquires information, and employs it to control the dynamics of another spin-1/2 system, through a natural interaction. Noise and imperfections in this protocol are investigated by the assessment of its effectiveness. This realization provides experimental evidence that the irreversibility in a nonequilibrium dynamics can be mitigated by assessing microscopic information and applying a feed-forward strategy at the quantum scale.

  11. Statistical Analysis on the Performance of Molecular Mechanics Poisson–Boltzmann Surface Area versus Absolute Binding Free Energy Calculations: Bromodomains as a Case Study

    PubMed Central

    2017-01-01

    Binding free energy calculations that make use of alchemical pathways are becoming increasingly feasible thanks to advances in hardware and algorithms. Although relative binding free energy (RBFE) calculations are starting to find widespread use, absolute binding free energy (ABFE) calculations are still being explored mainly in academic settings due to the high computational requirements and still uncertain predictive value. However, in some drug design scenarios, RBFE calculations are not applicable and ABFE calculations could provide an alternative. Computationally cheaper end-point calculations in implicit solvent, such as molecular mechanics Poisson–Boltzmann surface area (MMPBSA) calculations, could too be used if one is primarily interested in a relative ranking of affinities. Here, we compare MMPBSA calculations to previously performed absolute alchemical free energy calculations in their ability to correlate with experimental binding free energies for three sets of bromodomain–inhibitor pairs. Different MMPBSA approaches have been considered, including a standard single-trajectory protocol, a protocol that includes a binding entropy estimate, and protocols that take into account the ligand hydration shell. Despite the improvements observed with the latter two MMPBSA approaches, ABFE calculations were found to be overall superior in obtaining correlation with experimental affinities for the test cases considered. A difference in weighted average Pearson (r) and Spearman (ρ) correlations of 0.25 and 0.31 was observed when using a standard single-trajectory MMPBSA setup (r = 0.64 and ρ = 0.66 for ABFE; r = 0.39 and ρ = 0.35 for MMPBSA). The best performing MMPBSA protocols returned weighted average Pearson and Spearman correlations that were about 0.1 inferior to ABFE calculations: r = 0.55 and ρ = 0.56 when including an entropy estimate, and r = 0.53 and ρ = 0.55 when including explicit water molecules. Overall, the study suggests that ABFE calculations are indeed the more accurate approach, yet there is also value in MMPBSA calculations considering the lower compute requirements, and if agreement to experimental affinities in absolute terms is not of interest. Moreover, for the specific protein–ligand systems considered in this study, we find that including an explicit ligand hydration shell or a binding entropy estimate in the MMPBSA calculations resulted in significant performance improvements at a negligible computational cost. PMID:28786670
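    The comparison metrics reported above reduce to Pearson and Spearman correlations between computed and experimental affinities, optionally weighted by ligand-set size. The sketch below computes them with SciPy on invented numbers; the set names and values are placeholders, not data from the study.

```python
# Sketch of the correlation metrics used to compare methods: Pearson r and
# Spearman rho between computed and experimental binding free energies, plus a
# size-weighted average across ligand sets. All numbers below are invented.
import numpy as np
from scipy.stats import pearsonr, spearmanr

ligand_sets = {
    "set_A": (np.array([-8.1, -7.2, -9.0, -6.5]), np.array([-7.8, -7.5, -8.6, -6.9])),
    "set_B": (np.array([-6.0, -6.8, -5.5]),       np.array([-6.4, -6.1, -5.9])),
}

rows, weights = [], []
for name, (dg_exp, dg_calc) in ligand_sets.items():
    r, _ = pearsonr(dg_exp, dg_calc)
    rho, _ = spearmanr(dg_exp, dg_calc)
    rows.append((name, r, rho))
    weights.append(len(dg_exp))

weights = np.array(weights, dtype=float)
r_avg = np.average([row[1] for row in rows], weights=weights)
rho_avg = np.average([row[2] for row in rows], weights=weights)
for name, r, rho in rows:
    print(f"{name}: r={r:.2f}, rho={rho:.2f}")
print(f"weighted averages: r={r_avg:.2f}, rho={rho_avg:.2f}")
```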

  12. Cognitive-behavioral treatment groups for people with chronic physical illness in Hong Kong: reflections on a culturally attuned model.

    PubMed

    Wong, Daniel Fu Keung; Chau, Phyllis; Kwok, Anna; Kwan, Jackie

    2007-07-01

    This study describes and evaluates a cognitive-behavioral treatment group for people with chronic physical illness in Hong Kong. We developed a group protocol based on the understanding that Chinese people generally prefer a structured group format, expect group leaders to be active and directive, and are not used to expressing opinions and emotions in groups. The experimental and waitlist control groups had 38 and 35 participants, respectively. A standardized questionnaire was administered to all participants before and after the group treatment. Results suggest that members of the experimental group showed improvements in mental health, negative automatic thoughts, and negative emotions when compared to those in the waitlist control groups, and at the end of group treatment. Implications for designing and running a culturally attuned CBT group for Chinese people are discussed.

  13. PDB_REDO: automated re-refinement of X-ray structure models in the PDB.

    PubMed

    Joosten, Robbie P; Salzemann, Jean; Bloch, Vincent; Stockinger, Heinz; Berglund, Ann-Charlott; Blanchet, Christophe; Bongcam-Rudloff, Erik; Combet, Christophe; Da Costa, Ana L; Deleage, Gilbert; Diarena, Matteo; Fabbretti, Roberto; Fettahi, Géraldine; Flegel, Volker; Gisel, Andreas; Kasam, Vinod; Kervinen, Timo; Korpelainen, Eija; Mattila, Kimmo; Pagni, Marco; Reichstadt, Matthieu; Breton, Vincent; Tickle, Ian J; Vriend, Gert

    2009-06-01

    Structural biology, homology modelling and rational drug design require accurate three-dimensional macromolecular coordinates. However, the coordinates in the Protein Data Bank (PDB) have not all been obtained using the latest experimental and computational methods. In this study a method is presented for automated re-refinement of existing structure models in the PDB. A large-scale benchmark with 16 807 PDB entries showed that they can be improved in terms of fit to the deposited experimental X-ray data as well as in terms of geometric quality. The re-refinement protocol uses TLS models to describe concerted atom movement. The resulting structure models are made available through the PDB_REDO databank (http://www.cmbi.ru.nl/pdb_redo/). Grid computing techniques were used to overcome the computational requirements of this endeavour.

  14. Experimental Procedure for Warm Spinning of Cast Aluminum Components.

    PubMed

    Roy, Matthew J; Maijer, Daan M

    2017-02-01

    High performance, cast aluminum automotive wheels are increasingly being incrementally formed via flow forming/metal spinning at elevated temperatures to improve material properties. With a wide array of processing parameters which can affect both the shape attained and resulting material properties, this type of processing is notoriously difficult to commission. A simplified, light-duty version of the process has been designed and implemented for full-size automotive wheels. The apparatus is intended to assist in understanding the deformation mechanisms and the material response to this type of processing. An experimental protocol has been developed to prepare for, and subsequently perform forming trials and is described for as-cast A356 wheel blanks. The thermal profile attained, along with instrumentation details are provided. Similitude with full-scale forming operations which impart significantly more deformation at faster rates is discussed.

  15. Experimental Procedure for Warm Spinning of Cast Aluminum Components

    PubMed Central

    Roy, Matthew J.; Maijer, Daan M.

    2017-01-01

    High performance, cast aluminum automotive wheels are increasingly being incrementally formed via flow forming/metal spinning at elevated temperatures to improve material properties. With a wide array of processing parameters which can affect both the shape attained and resulting material properties, this type of processing is notoriously difficult to commission. A simplified, light-duty version of the process has been designed and implemented for full-size automotive wheels. The apparatus is intended to assist in understanding the deformation mechanisms and the material response to this type of processing. An experimental protocol has been developed to prepare for, and subsequently perform forming trials and is described for as-cast A356 wheel blanks. The thermal profile attained, along with instrumentation details are provided. Similitude with full-scale forming operations which impart significantly more deformation at faster rates is discussed. PMID:28190063

  16. Equipment and Protocols for Quasi-Static and Dynamic Tests of Very-High-Strength Concrete (VHSC) and High-Strength High-Ductility Concrete (HSHDC)

    DTIC Science & Technology

    2016-08-01

    quasi-static mechanical properties, deformation behavior, and damage mechanisms in HSHDC and compare the behavior with VHSC. 2. Develop experimental ... using the experimental setup described in Chapter 6. The quasi-static strain rate was approximately 10^-4/s. All panels tested have nominal dimensions ...

  17. Extended Theories of Gravitation. Observation Protocols and Experimental Tests

    NASA Astrophysics Data System (ADS)

    Fatibene, Lorenzo; Ferraris, Marco; Francaviglia, Mauro; Magnano, Guido

    2013-09-01

    Within the framework of extended theories of gravitation we shall discuss physical equivalences among different formalisms and classical tests. As suggested by the Ehlers-Pirani-Schild framework, conformal invariance will be preserved and its effect on observational protocols discussed. Accordingly, we shall review standard tests showing how Palatini f(R)-theories naturally pass solar system tests. Observation protocols will be discussed in this wider framework.

  18. SNMP-SI: A Network Management Tool Based on Slow Intelligence System Approach

    NASA Astrophysics Data System (ADS)

    Colace, Francesco; de Santo, Massimo; Ferrandino, Salvatore

    The last decade has witnessed an intense spread of computer networks, further accelerated by the introduction of wireless networks. This growth has been accompanied by a significant increase in network management problems. Especially in small companies, where no personnel are assigned to these tasks, the management of such networks is often complex and malfunctions can have significant impacts on their businesses. A possible solution is the adoption of the Simple Network Management Protocol. Simple Network Management Protocol (SNMP) is a standard protocol used to exchange network management information. It is part of the Transmission Control Protocol/Internet Protocol (TCP/IP) protocol suite. SNMP provides a tool for network administrators to manage network performance, find and solve network problems, and plan for network growth. SNMP has a big disadvantage: its simple design means that the information it deals with is neither detailed nor well organized enough to meet expanding modern networking requirements. Over the past years, much effort has been devoted to addressing the shortcomings of the Simple Network Management Protocol, and new frameworks have been developed; a promising approach involves the use of ontologies. This is the starting point of this paper, in which a novel approach to network management based on Slow Intelligence System methodologies and ontology-based techniques is proposed. A Slow Intelligence System is a general-purpose system characterized by its ability to improve performance over time through a process involving enumeration, propagation, adaptation, elimination and concentration. The proposed approach therefore aims to develop a system able to acquire information, according to the SNMP standard, from the various hosts in the managed networks and to apply solutions in order to solve problems. To check the feasibility of this model, first experimental results in a real scenario are shown.
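    The acquisition step such a tool performs, polling managed hosts over SNMP, could look like the sketch below, which uses the classic synchronous pysnmp high-level API. The host address, community string and OID are placeholders, and this is not the SNMP-SI implementation itself.

```python
# Minimal SNMP GET poll of a managed host, the kind of acquisition step a
# management tool performs before higher-level reasoning. Host, community and
# OID are placeholders; this is not the SNMP-SI implementation itself.
from pysnmp.hlapi import (
    SnmpEngine, CommunityData, UdpTransportTarget, ContextData,
    ObjectType, ObjectIdentity, getCmd,
)

def poll_sysdescr(host: str, community: str = "public") -> str:
    error_indication, error_status, _, var_binds = next(
        getCmd(
            SnmpEngine(),
            CommunityData(community, mpModel=1),               # SNMPv2c
            UdpTransportTarget((host, 161), timeout=2.0, retries=1),
            ContextData(),
            ObjectType(ObjectIdentity("SNMPv2-MIB", "sysDescr", 0)),
        )
    )
    if error_indication:
        raise RuntimeError(str(error_indication))
    if error_status:
        raise RuntimeError(error_status.prettyPrint())
    return " = ".join(x.prettyPrint() for x in var_binds[0])

print(poll_sysdescr("192.0.2.10"))   # placeholder address
```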

  19. Experimental evolution in silico: a custom-designed mathematical model for virulence evolution of Bacillus thuringiensis.

    PubMed

    Strauß, Jakob Friedrich; Crain, Philip; Schulenburg, Hinrich; Telschow, Arndt

    2016-08-01

    Most mathematical models on the evolution of virulence are based on epidemiological models that assume parasite transmission follows the mass action principle. In experimental evolution, however, mass action is often violated due to controlled infection protocols. This "theory-experiment mismatch" raises the question whether there is a need for new mathematical models to accommodate the particular characteristics of experimental evolution. Here, we explore the experimental evolution model system of Bacillus thuringiensis as a parasite and Caenorhabditis elegans as a host. Recent experimental studies with strict control of parasite transmission revealed that one-sided adaptation of B. thuringiensis with non-evolving hosts selects for intermediate or no virulence, sometimes coupled with parasite extinction. In contrast, host-parasite coevolution selects for high virulence and for hosts with strong resistance against B. thuringiensis. In order to explain the empirical results, we propose a new mathematical model that mimics the basic experimental set-up. The key assumptions are: (i) controlled parasite transmission (no mass action), (ii) discrete host generations, and (iii) context-dependent cost of toxin production. Our model analysis revealed the same basic trends as found in the experiments. Especially, we could show that resistant hosts select for highly virulent bacterial strains. Moreover, we found (i) that the evolved level of virulence is independent of the initial level of virulence, and (ii) that the average amount of bacteria ingested significantly affects the evolution of virulence with fewer bacteria ingested selecting for highly virulent strains. These predictions can be tested in future experiments. This study highlights the usefulness of custom-designed mathematical models in the analysis and interpretation of empirical results from experimental evolution. Copyright © 2016 The Authors. Published by Elsevier GmbH.. All rights reserved.

  20. Controlling for confounding variables in MS-omics protocol: why modularity matters.

    PubMed

    Smith, Rob; Ventura, Dan; Prince, John T

    2014-09-01

    As the field of bioinformatics research continues to grow, more and more novel techniques are proposed to meet new challenges and improvements upon solutions to long-standing problems. These include data processing techniques and wet lab protocol techniques. Although the literature is consistently thorough in experimental detail and variable-controlling rigor for wet lab protocol techniques, bioinformatics techniques tend to be less described and less controlled. As the validation or rejection of hypotheses rests on the experiment's ability to isolate and measure a variable of interest, we urge the importance of reducing confounding variables in bioinformatics techniques during mass spectrometry experimentation. © The Author 2013. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  1. Free-space quantum key distribution at night

    NASA Astrophysics Data System (ADS)

    Buttler, William T.; Hughes, Richard J.; Kwiat, Paul G.; Lamoreaux, Steve K.; Luther, Gabriel G.; Morgan, George L.; Nordholt, Jane E.; Peterson, C. Glen; Simmons, Charles M.

    1998-07-01

    An experimental free-space quantum key distribution (QKD) system has been tested over an outdoor optical path of approximately 1 km under nighttime conditions at Los Alamos National Laboratory. This system employs the Bennett 92 protocol; here we give a brief overview of this protocol, and describe our experimental implementation of it. An analysis of the system efficiency is presented as well as a description of our error detection protocol, which employs a 2D parity check scheme. Finally, the susceptibility of this system to eavesdropping by various techniques is determined, and the effectiveness of privacy amplification procedures is discussed. Our conclusions are that free-space QKD is both effective and secure; possible applications include the rekeying of satellites in low earth orbit.

  2. 23 CFR 1340.5 - Selection of observation sites.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE Survey Design Requirements § 1340.5 Selection of... observation sites. The survey design shall include at a minimum the following protocols: (1) Protocol when...

  3. 23 CFR 1340.5 - Selection of observation sites.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE Survey Design Requirements § 1340.5 Selection of... observation sites. The survey design shall include at a minimum the following protocols: (1) Protocol when...

  4. 23 CFR 1340.5 - Selection of observation sites.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE Survey Design Requirements § 1340.5 Selection of... observation sites. The survey design shall include at a minimum the following protocols: (1) Protocol when...

  5. Atomistic Simulations of Graphene Growth: From Kinetics to Mechanism.

    PubMed

    Qiu, Zongyang; Li, Pai; Li, Zhenyu; Yang, Jinlong

    2018-03-20

    Epitaxial growth is a promising strategy to produce high-quality graphene samples. At the same time, this method has great flexibility for industrial scale-up. To optimize growth protocols, it is essential to understand the underlying growth mechanisms. This is, however, very challenging, as the growth process is complicated and involves many elementary steps. Experimentally, atomic-scale in situ characterization methods are generally not feasible at the high temperature of graphene growth. Therefore, kinetics is the main experimental information available to study growth mechanisms. Theoretically, first-principles calculations routinely provide atomic structures and energetics but have a stringent limit on the accessible spatial and time scales. Such a gap between experiment and theory can be bridged by atomistic simulations that use first-principles atomic details as input and provide the overall growth kinetics, which can be directly compared with experiment, as output. Typically, system-specific approximations should be applied to make such simulations computationally feasible. By feeding kinetic Monte Carlo (kMC) simulations with first-principles parameters, we can directly simulate the graphene growth process and thus understand the growth mechanisms. Our simulations suggest that the carbon dimer is the dominant feeding species in the epitaxial growth of graphene on both Cu(111) and Cu(100) surfaces, which enables us to understand why the reaction is diffusion limited on Cu(111) but attachment limited on Cu(100). When hydrogen is explicitly considered in the simulation, the central role hydrogen plays in graphene growth is revealed, which solves the long-standing puzzle of why H2 should be fed in the chemical vapor deposition of graphene. The simulation results can be directly compared with the experimental kinetic data, if available. Our kMC simulations reproduce the experimentally observed quintic-like behavior of graphene growth on Ir(111). By checking the simulation results, we find that such nonlinearity is caused by lattice mismatch, and the induced growth front inhomogeneity can be universally used to predict growth behaviors in other heteroepitaxial systems. Notably, although experimental kinetics usually gives useful insight into atomic mechanisms, it can sometimes be misleading. Such pitfalls can be avoided via atomistic simulations, as demonstrated in our study of the graphene etching process. Growth protocols can be designed theoretically with computational kinetic and mechanistic information. By contrasting the different activation energies involved in an atom-exchange-based carbon penetration process for monolayer and bilayer graphene, we propose a three-step strategy to grow high-quality bilayer graphene. Based on first-principles parameters, a kinetic pathway toward the high-density, ordered N doping of epitaxial graphene on Cu(111) using a C5NCl5 precursor is also identified. These studies demonstrate that atomistic simulations can unambiguously produce or reproduce the kinetic information on graphene growth, which is pivotal to understanding the growth mechanism and designing better growth protocols. A similar strategy can be used in growth mechanism studies of other two-dimensional atomic crystals.
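    To make the kMC machinery above concrete, the toy sketch below implements the standard rejection-free kMC loop: Arrhenius rates, rate-weighted event selection, and exponentially distributed waiting times. The event names, prefactors and barriers are illustrative placeholders, not the first-principles parameters used in the work.

      # Toy rejection-free kinetic Monte Carlo loop (illustrative rates only).
      import math
      import random

      KB_EV = 8.617333262e-5   # Boltzmann constant, eV/K

      def arrhenius(prefactor_hz, barrier_ev, temperature_k):
          return prefactor_hz * math.exp(-barrier_ev / (KB_EV * temperature_k))

      def kmc_run(temperature_k=1300.0, n_steps=100_000, seed=1):
          rng = random.Random(seed)
          # Hypothetical elementary events with illustrative barriers (eV).
          events = {
              "dimer_diffusion": arrhenius(1e13, 0.5, temperature_k),
              "dimer_attachment": arrhenius(1e13, 1.0, temperature_k),
              "edge_detachment": arrhenius(1e13, 1.8, temperature_k),
          }
          names = list(events)
          rates = [events[n] for n in names]
          total_rate = sum(rates)
          time_s, counts = 0.0, {n: 0 for n in names}
          for _ in range(n_steps):
              # Pick an event with probability proportional to its rate.
              r = rng.random() * total_rate
              acc = 0.0
              for name, rate in zip(names, rates):
                  acc += rate
                  if r <= acc:
                      counts[name] += 1
                      break
              # Advance the clock by an exponential waiting time.
              time_s += -math.log(1.0 - rng.random()) / total_rate
          return time_s, counts

      if __name__ == "__main__":
          elapsed, counts = kmc_run()
          print(f"simulated time: {elapsed:.3e} s, event counts: {counts}")

    A production simulation would additionally track the lattice configuration so that the available events and their rates change with the local environment, which is where the first-principles input enters.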

  6. Nonpainful wide-area compression inhibits experimental pain

    PubMed Central

    Honigman, Liat; Bar-Bachar, Ofrit; Yarnitsky, David; Sprecher, Elliot; Granovsky, Yelena

    2016-01-01

    Abstract Compression therapy, a well-recognized treatment for lymphoedema and venous disorders, pressurizes limbs and generates massive non-noxious afferent sensory barrages. The aim of this study was to examine whether such afferent activity has an analgesic effect when applied on the lower limbs, hypothesizing that larger compression areas induce stronger analgesic effects, and whether this effect correlates with conditioned pain modulation (CPM). Thirty young healthy subjects received painful heat and pressure stimuli (47°C for 30 seconds, forearm; 300 kPa for 15 seconds, wrist) before and during 3 compression protocols of either SMALL (up to ankles), MEDIUM (up to knees), or LARGE (up to hips) compression areas. Conditioned pain modulation (heat pain conditioned by noxious cold water) was tested before and after each compression protocol. The LARGE protocol induced more analgesia for heat than the SMALL protocol (P < 0.001). The analgesic effect interacted with gender (P = 0.015): the LARGE protocol was more efficient for females, whereas the MEDIUM protocol was more efficient for males. Pressure pain was reduced by all protocols (P < 0.001) with no differences between protocols and no gender effect. Conditioned pain modulation was more efficient than the compression-induced analgesia. For the LARGE protocol, precompression CPM efficiency positively correlated with compression-induced analgesia. Large body area compression exerts an area-dependent analgesic effect on experimental pain stimuli. The observed correlation with pain inhibition in response to robust non-noxious sensory stimulation may suggest that compression therapy shares similar mechanisms with inhibitory pain modulation assessed through CPM. PMID:27152691

  7. Recommendations for Improving Identification and Quantification in Non-Targeted, GC-MS-Based Metabolomic Profiling of Human Plasma

    PubMed Central

    Wang, Hanghang; Muehlbauer, Michael J.; O’Neal, Sara K.; Newgard, Christopher B.; Hauser, Elizabeth R.; Shah, Svati H.

    2017-01-01

    The field of metabolomics as applied to human disease and health is rapidly expanding. In recent efforts of metabolomics research, greater emphasis has been placed on quality control and method validation. In this study, we report an experience with quality control and a practical application of method validation. Specifically, we sought to identify and modify steps in gas chromatography-mass spectrometry (GC-MS)-based, non-targeted metabolomic profiling of human plasma that could influence metabolite identification and quantification. Our experimental design included two studies: (1) a limiting-dilution study, which investigated the effects of dilution on analyte identification and quantification; and (2) a concentration-specific study, which compared the optimal plasma extract volume established in the first study with the volume used in the current institutional protocol. We confirmed that contaminants, concentration, repeatability and intermediate precision are major factors influencing metabolite identification and quantification. In addition, we established methods for improved metabolite identification and quantification, which were summarized to provide recommendations for experimental design of GC-MS-based non-targeted profiling of human plasma. PMID:28841195

  8. Toward the design of alkynylimidazole fluorophores: computational and experimental characterization of spectroscopic features in solution and in poly(methyl methacrylate).

    PubMed

    Barone, Vincenzo; Bellina, Fabio; Biczysko, Malgorzata; Bloino, Julien; Fornaro, Teresa; Latouche, Camille; Lessi, Marco; Marianetti, Giulia; Minei, Pierpaolo; Panattoni, Alessandro; Pucci, Andrea

    2015-10-28

    The possibilities offered by organic fluorophores in the preparation of advanced plastic materials have been increased by designing novel alkynylimidazole dyes featuring different push and pull groups. This new family of fluorescent dyes was synthesized by means of a one-pot sequential bromination-alkynylation of the heteroaromatic core, and their optical properties were investigated in tetrahydrofuran and in poly(methyl methacrylate). An efficient in silico pre-screening scheme was devised, consisting of a step-by-step procedure that employs computational methodologies to simulate electronic spectra with both simple vertical-energy and more sophisticated vibronic approaches. Such an approach was also extended to efficiently simulate one-photon absorption and emission spectra of the dyes in the polymer environment for their potential application in luminescent solar concentrators. Besides the specific applications of this novel material, the integration of computational and experimental techniques reported here provides an efficient protocol that can be applied to make a selection among similar dye candidates, which constitute the essential responsive part of these fluorescent plastic materials.

  9. Protein crystal growth in low gravity

    NASA Technical Reports Server (NTRS)

    Feigelson, Robert S.

    1990-01-01

    The effect of low gravity on the growth of protein crystals and on the parameters which affect growth and crystal quality was studied. The proper design of the flight hardware and experimental protocols is highly dependent on understanding the factors which influence the nucleation and growth of crystals of biological macromolecules. Those factors are therefore investigated, along with the body of knowledge which has been built up for small-molecule crystallization. These data also provide a basis of comparison for the results obtained from low-g experiments. The flows around growing crystals are detailed. The preliminary study of the growth of isocitrate lyase, the crystal morphologies found, and the preliminary X-ray results are discussed. The design of two apparatuses for protein crystal growth by temperature control is presented along with preliminary results.

  10. Safe bunker designing for the 18 MV Varian 2100 Clinac: a comparison between Monte Carlo simulation based upon data and new protocol recommendations.

    PubMed

    Beigi, Manije; Afarande, Fatemeh; Ghiasi, Hosein

    2016-01-01

    The aim of this study was to compare two bunkers for an 18 MV Varian 2100 Clinac accelerator: one designed using protocol recommendations alone and one designed using Monte Carlo (MC) simulation based on derived data. High-energy radiation therapy is associated with fast and thermal photoneutrons, and adequate shielding against these contaminant neutrons is recommended by the new IAEA and NCRP protocols. The latest protocols released by the IAEA (Safety Report No. 47) and NCRP Report No. 151 were used for the bunker design calculations, and MC-based data were also derived. The two bunkers, designed using the protocols and using the MC data, are presented and discussed. Regarding door thickness, the doors designed by MC simulation and by the Wu-McGinley analytical method were close in both BPE and lead thickness. For the primary and secondary barriers, MC simulation yielded 440.11 mm for the ordinary-concrete TVL, with a total required concrete thickness of 1709 mm; calculating the same quantities with the recommended analytical methods gave a required thickness of 1762 mm, using the protocol-recommended TVL of 445 mm for concrete. Additionally, for the secondary barrier a thickness of 752.05 mm was obtained. Our results showed that the MC simulation and the protocol recommendations are in good agreement for the contaminant radiation dose calculation. However, the differences between the analytical and MC simulation methods revealed that relying on only one method for bunker design may lead to underestimation or overestimation in dose and shielding calculations.
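    For orientation, the analytical route compared above follows the NCRP Report No. 151 barrier recipe: required transmission B = P d^2 / (W U T) and thickness t = TVL1 + (n - 1) TVLe with n = -log10(B). The sketch below applies it with illustrative inputs; none of the numbers are the study's values.

      # NCRP 151-style primary-barrier estimate (all inputs illustrative).
      import math

      def required_transmission(p_sv_per_week, d_m, w_gy_per_week, use_factor, occupancy):
          """B = P * d^2 / (W * U * T)."""
          return p_sv_per_week * d_m**2 / (w_gy_per_week * use_factor * occupancy)

      def barrier_thickness_mm(transmission, tvl1_mm, tvle_mm):
          """t = TVL1 + (n - 1) * TVLe with n = -log10(B)."""
          n_tvl = -math.log10(transmission)
          return tvl1_mm + (n_tvl - 1.0) * tvle_mm

      if __name__ == "__main__":
          b = required_transmission(p_sv_per_week=1e-4, d_m=6.0,
                                    w_gy_per_week=500.0, use_factor=0.25, occupancy=1.0)
          # Concrete TVLs of the order quoted in the text (~440-450 mm at 18 MV).
          t = barrier_thickness_mm(b, tvl1_mm=450.0, tvle_mm=430.0)
          print(f"B = {b:.2e}, concrete thickness ~ {t:.0f} mm")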

  11. Continuous-variable quantum key distribution based on a plug-and-play dual-phase-modulated coherent-states protocol

    NASA Astrophysics Data System (ADS)

    Huang, Duan; Huang, Peng; Wang, Tao; Li, Huasheng; Zhou, Yingming; Zeng, Guihua

    2016-09-01

    We propose and experimentally demonstrate a continuous-variable quantum key distribution (CV-QKD) protocol using dual-phase-modulated coherent states. We show that the modulation scheme of our protocol works equivalently to that of the Gaussian-modulated coherent-states (GMCS) protocol, but shows better experimental feasibility in the plug-and-play configuration. Besides, it waives the necessity of propagating a local oscillator (LO) between legitimate users and generates a real local LO for quantum measurement. Our protocol is proposed independently of the one-way GMCS QKD without sending a LO [Opt. Lett. 40, 3695 (2015), 10.1364/OL.40.003695; Phys. Rev. X 5, 041009 (2015), 10.1103/PhysRevX.5.041009; Phys. Rev. X 5, 041010 (2015), 10.1103/PhysRevX.5.041010]. In those recent works, the system stability suffers from polarization drifts induced by environmental perturbations, and two independent frequency-locked laser sources are necessary to achieve reliable coherent detection. In the proposed protocol, these problems can be resolved. We derive the security bounds for our protocol against collective attacks, and we also perform a proof-of-principle experiment to confirm the utility of our proposal in real-life applications. Such an efficient scheme provides a way of removing the security loopholes associated with the transmitting LO, which have been a notoriously hard problem in continuous-variable quantum communication.

  12. Design, rationale and feasibility of a multidimensional experimental protocol to study early life stress.

    PubMed

    Bartholomeusz, M Dillwyn; Bolton, Philip S; Callister, Robin; Skinner, Virginia; Hodgson, Deborah

    2017-09-01

    There is a rapidly accumulating body of evidence regarding the influential role of early life stress (ELS) in medical and psychiatric conditions. While self-report instruments, with their intrinsic limitations of recall, remain the primary means of detecting ELS in humans, biological measures are generally limited to a single biological system. This paper describes the design, rationale and feasibility of a study to simultaneously measure neuroendocrine, immune and autonomic nervous system (ANS) responses to psychological and physiological stressors in relation to ELS. Five healthy university students were recruited by advertisement. Exclusion criteria included chronic medical conditions, psychotic disorders, needle phobia, inability to tolerate pain, and the use of anti-inflammatory medications. Participants were clinically interviewed, and physiological recordings were made over a two-hour period before, during and after two acute stressors: the cold pressor test and recalling a distressing memory. The Childhood Trauma Questionnaire and the Parental Bonding Index were utilised to measure ELS. Other psychological measures of mood and personality were also administered. Measurements of heart rate, blood pressure, respiratory rate, skin conductance, skin blood flow and temporal plasma samples were successfully obtained before, during and after acute stress. Participants reported that the extensive psychological and multisystem physiological data collection and stress provocations were tolerable. Most (4/5) participants indicated a willingness to return to repeat the protocol, indicating acceptability. Our protocol is viable and safe in young, physically healthy adults and allows us to assess simultaneously neuroendocrine, immune and autonomic nervous system responses to stressors in persons assessed for ELS.

  13. Assessing the welfare of laboratory mice in their home environment using animal-based measures--a benchmarking tool.

    PubMed

    Spangenberg, Elin M F; Keeling, Linda J

    2016-02-01

    Welfare problems in laboratory mice can be a consequence of an ongoing experiment, or a characteristic of a particular genetic line, but in some cases, such as breeding animals, they are most likely to be a result of the design and management of the home cage. Assessment of the home cage environment is commonly performed using resource-based measures, like access to nesting material. However, animal-based measures (related to the health status and behaviour of the animals) can be used to assess the current welfare of animals regardless of the inputs applied (i.e. the resources or management). The aim of this study was to design a protocol for assessing the welfare of laboratory mice using only animal-based measures. The protocol, to be used as a benchmarking tool, assesses mouse welfare in the home cage and does not contain parameters related to experimental situations. It is based on parameters corresponding to the 12 welfare criteria established by the Welfare Quality® project. Selection of animal-based measures was performed by scanning existing published, web-based and informal protocols, and by choosing parameters that matched these criteria, were feasible in practice and, if possible, were already validated indicators of mouse welfare. The parameters should identify possible animal welfare problems and enable assessment directly in an animal room during cage cleaning procedures, without the need for extra equipment. Thermal comfort behaviours and positive emotional states are areas where more research is needed to find valid, reliable and feasible animal-based measures. © The Author(s) 2015.

  14. Efficient model checking of network authentication protocol based on SPIN

    NASA Astrophysics Data System (ADS)

    Tan, Zhi-hua; Zhang, Da-fang; Miao, Li; Zhao, Dan

    2013-03-01

    Model checking is a very useful technique for verifying network authentication protocols. In order to improve the efficiency of modeling and verifying such protocols with model checking technology, this paper first proposes a universal formal description method for the protocol. Combined with the model checker SPIN, the method can conveniently verify the properties of the protocol. By applying several model-simplification strategies, several protocols can be modeled efficiently and the state space of the model reduced. Compared with the previous literature, this paper achieves a higher degree of automation and better verification efficiency. Finally, based on the method described in the paper, we model and verify the Privacy and Key Management (PKM) authentication protocol. The experimental results show that the model checking method is effective and is useful for other authentication protocols.
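    SPIN itself verifies Promela models; as a language-agnostic illustration of the explicit-state search it automates, the toy sketch below breadth-first-searches the state space of a two-party challenge-response exchange and checks a simple safety property. It is a conceptual analog only, not the paper's formalization method and not the PKM protocol.

      # Toy explicit-state reachability check (conceptual analog of model checking).
      from collections import deque

      INITIAL = ("idle", "idle")   # (initiator_phase, responder_phase)

      def successors(state):
          init, resp = state
          if init == "idle":
              yield ("challenge_sent", resp)
          if init == "challenge_sent" and resp == "idle":
              yield (init, "response_sent")
          if init == "challenge_sent" and resp == "response_sent":
              yield ("authenticated", resp)

      def violates(state):
          # Safety property: the initiator never authenticates without a response.
          init, resp = state
          return init == "authenticated" and resp != "response_sent"

      def check():
          seen, queue = {INITIAL}, deque([INITIAL])
          while queue:
              state = queue.popleft()
              if violates(state):
                  return f"violation reached: {state}"
              for nxt in successors(state):
                  if nxt not in seen:
                      seen.add(nxt)
                      queue.append(nxt)
          return f"property holds over {len(seen)} reachable states"

      if __name__ == "__main__":
          print(check())

    State-space reduction strategies of the kind the paper applies correspond, in this picture, to collapsing or pruning states before they are enqueued.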

  15. Nonlinear least-squares data fitting in Excel spreadsheets.

    PubMed

    Kemmer, Gerdi; Keller, Sandro

    2010-02-01

    We describe an intuitive and rapid procedure for analyzing experimental data by nonlinear least-squares fitting (NLSF) in the most widely used spreadsheet program. Experimental data in x/y form and data calculated from a regression equation are inputted and plotted in a Microsoft Excel worksheet, and the sum of squared residuals is computed and minimized using the Solver add-in to obtain the set of parameter values that best describes the experimental data. The confidence of best-fit values is then visualized and assessed in a generally applicable and easily comprehensible way. Every user familiar with the most basic functions of Excel will be able to implement this protocol, without previous experience in data fitting or programming and without additional costs for specialist software. The application of this tool is exemplified using the well-known Michaelis-Menten equation characterizing simple enzyme kinetics. Only slight modifications are required to adapt the protocol to virtually any other kind of dataset or regression equation. The entire protocol takes approximately 1 h.
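    The same fit can be reproduced outside Excel; as a point of comparison, the sketch below performs the Michaelis-Menten regression with SciPy's nonlinear least-squares routine on a synthetic dataset (the substrate and rate values are invented for illustration).

      # Michaelis-Menten fit with SciPy instead of Excel/Solver (synthetic data).
      import numpy as np
      from scipy.optimize import curve_fit

      def michaelis_menten(s, vmax, km):
          """v = Vmax * [S] / (Km + [S])"""
          return vmax * s / (km + s)

      s = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0, 32.0])       # substrate, mM
      v = np.array([0.28, 0.48, 0.72, 0.95, 1.15, 1.25, 1.31])  # rate, a.u.

      popt, pcov = curve_fit(michaelis_menten, s, v, p0=[1.5, 2.0])
      perr = np.sqrt(np.diag(pcov))   # 1-sigma parameter uncertainties

      print(f"Vmax = {popt[0]:.3f} +/- {perr[0]:.3f}")
      print(f"Km   = {popt[1]:.3f} +/- {perr[1]:.3f}")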

  16. A reconfigurable visual-programming library for real-time closed-loop cellular electrophysiology

    PubMed Central

    Biró, István; Giugliano, Michele

    2015-01-01

    Most of the software platforms for cellular electrophysiology are limited in terms of flexibility, hardware support, ease of use, or re-configuration and adaptation for non-expert users. Moreover, advanced experimental protocols requiring real-time closed-loop operation to investigate excitability, plasticity, and dynamics are largely inaccessible to users without moderate to substantial computer proficiency. Here we present an approach based on MATLAB/Simulink, exploiting the benefits of LEGO-like visual programming and configuration combined with a small but easily extendable library of functional software components. We provide and validate several examples, implementing conventional and more sophisticated experimental protocols such as dynamic clamp or the combined use of intracellular and extracellular methods involving closed-loop real-time control. The functionality of each of these examples is demonstrated with relevant experiments. These can be used as a starting point to create and support a larger variety of electrophysiological tools and methods, hopefully extending the range of default techniques and protocols currently employed in experimental labs across the world. PMID:26157385
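    As an offline illustration of the dynamic-clamp idea mentioned above, the sketch below injects a virtual conductance current I = g(E_rev - V) into a simulated passive membrane on every loop cycle; the membrane model and all parameter values are illustrative stand-ins for the real cell, amplifier and Simulink blocks.

      # Offline dynamic-clamp sketch: a passive RC membrane stands in for the cell.
      import numpy as np

      DT = 1e-4           # s, loop period (10 kHz update rate)
      C_M = 100e-12       # F, membrane capacitance
      G_LEAK = 5e-9       # S, leak conductance
      E_LEAK = -70e-3     # V, leak reversal potential
      G_VIRTUAL = 2e-9    # S, virtual conductance added by the clamp
      E_VIRTUAL = 0.0     # V, reversal potential of the virtual conductance

      def run(duration_s=0.5):
          n = int(duration_s / DT)
          v = np.empty(n)
          v[0] = E_LEAK
          for i in range(1, n):
              # "Measure" V, compute the command current, "inject" it this cycle.
              i_clamp = G_VIRTUAL * (E_VIRTUAL - v[i - 1])
              i_leak = G_LEAK * (E_LEAK - v[i - 1])
              v[i] = v[i - 1] + DT * (i_leak + i_clamp) / C_M
          return v

      if __name__ == "__main__":
          v = run()
          print(f"steady-state V ~ {v[-1] * 1e3:.1f} mV")   # ~ -50 mV for these values

    In the real-time setting described above, the same update runs against the amplifier hardware at a fixed rate rather than against a simulated membrane.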

  17. Spina Bifida

    MedlinePlus

    ... of myelomeningocele through a National Institutes of Health experimental protocol (Management of Myelomeningocele Study, or MOMS). Fetal ... additional loss from occurring. The surgery is considered experimental and there are risks to the fetus as ...

  18. The Development of a Design and Construction Process Protocol to Support the Home Modification Process Delivered by Occupational Therapists

    PubMed Central

    Ormerod, Marcus; Newton, Rita

    2018-01-01

    Modifying the home environments of older people as they age in place is a well-established health and social care intervention. Using design and construction methods to redress any imbalance caused by the ageing process or disability within the home environment, occupational therapists are seen as the experts in this field of practice. However, the process used by occupational therapists when modifying home environments has been criticised for being disorganised and not founded on theoretical principles and concepts underpinning the profession. To address this issue, research was conducted to develop a design and construction process protocol specifically for home modifications. A three-stage approach was taken for the analysis of qualitative data generated from an online survey, completed by 135 occupational therapists in the UK. Using both the existing occupational therapy intervention process model and the design and construction process protocol as the theoretical frameworks, a 4-phase, 9-subphase design and construction process protocol for home modifications was developed. Overall, the study is innovative in developing the first process protocol for home modifications, potentially providing occupational therapists with a systematic and effective approach to the design and delivery of home modification services for older and disabled people. PMID:29682348

  19. The Development of a Design and Construction Process Protocol to Support the Home Modification Process Delivered by Occupational Therapists.

    PubMed

    Russell, Rachel; Ormerod, Marcus; Newton, Rita

    2018-01-01

    Modifying the home environments of older people as they age in place is a well-established health and social care intervention. Using design and construction methods to redress any imbalance caused by the ageing process or disability within the home environment, occupational therapists are seen as the experts in this field of practice. However, the process used by occupational therapists when modifying home environments has been criticised for being disorganised and not founded on theoretical principles and concepts underpinning the profession. To address this issue, research was conducted to develop a design and construction process protocol specifically for home modifications. A three-stage approach was taken for the analysis of qualitative data generated from an online survey, completed by 135 occupational therapists in the UK. Using both the existing occupational therapy intervention process model and the design and construction process protocol as the theoretical frameworks, a 4-phase, 9-subphase design and construction process protocol for home modifications was developed. Overall, the study is innovative in developing the first process protocol for home modifications, potentially providing occupational therapists with a systematic and effective approach to the design and delivery of home modification services for older and disabled people.

  20. UMDR: Multi-Path Routing Protocol for Underwater Ad Hoc Networks with Directional Antenna

    NASA Astrophysics Data System (ADS)

    Yang, Jianmin; Liu, Songzuo; Liu, Qipei; Qiao, Gang

    2018-01-01

    This paper presents a new routing scheme for underwater ad hoc networks based on directional antennas. Ad hoc networks with directional antennas have become a hot research topic because space reuse may increase network capacity. At present, researchers have applied traditional self-organizing routing protocols (such as DSR and AODV) [1] [2] to this type of network, with routing schemes based on the shortest-path metric. However, such routing schemes often suffer from long transmission delays and frequent link fragmentation along the intermediate nodes of the selected route. This is caused by a unique feature of directional transmission, often called “deafness”. In this paper, we take a different approach and explore the advantages of space reuse through multipath routing. The paper examines the validity of conventional routing schemes in underwater ad hoc networks with directional antennas and presents a special design of a multipath routing algorithm for directional transmission. The experimental results show a significant performance improvement in throughput and latency.
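    To illustrate the multipath idea in isolation, the sketch below finds edge-disjoint routes between two nodes by repeated breadth-first search, removing each discovered route's links before searching again; the toy topology and the procedure are illustrative and do not reproduce the UMDR protocol.

      # Edge-disjoint multipath sketch on a toy topology (illustrative only).
      from collections import deque

      def bfs_path(adj, src, dst):
          prev, queue = {src: None}, deque([src])
          while queue:
              u = queue.popleft()
              if u == dst:
                  path = [dst]
                  while prev[path[-1]] is not None:
                      path.append(prev[path[-1]])
                  return path[::-1]
              for v in adj.get(u, ()):
                  if v not in prev:
                      prev[v] = u
                      queue.append(v)
          return None

      def edge_disjoint_paths(adj, src, dst, k=2):
          adj = {u: set(vs) for u, vs in adj.items()}
          paths = []
          while len(paths) < k:
              path = bfs_path(adj, src, dst)
              if path is None:
                  break
              paths.append(path)
              for u, v in zip(path, path[1:]):   # remove used links in both directions
                  adj[u].discard(v)
                  adj.setdefault(v, set()).discard(u)
          return paths

      if __name__ == "__main__":
          topo = {"A": {"B", "C"}, "B": {"A", "D"}, "C": {"A", "D", "E"},
                  "D": {"B", "C", "E"}, "E": {"C", "D"}}
          print(edge_disjoint_paths(topo, "A", "E", k=2))

    A directional-antenna-aware scheme would additionally weight or exclude links whose use would leave intermediate nodes deaf to incoming traffic.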

  1. Transfection using DEAE-dextran.

    PubMed

    Gulick, T

    2001-05-01

    Transfection of cultured mammalian cells using diethylaminoethyl (DEAE)-dextran/DNA can be an attractive alternative to other transfection methods in many circumstances. The major advantages of the technique are its relative simplicity and speed, limited expense, and remarkably reproducible interexperimental and intraexperimental transfection efficiency. Disadvantages include inhibition of cell growth and induction of heterogeneous morphological changes in cells. Furthermore, the concentration of serum in the culture medium must be transiently reduced during the transfection. In general, DEAE-dextran DNA transfection is ideal for transient transfections with promoter/reporter plasmids in analyses of promoter and enhancer functions, and is suitable for overexpression of recombinant protein in transient transfections or for generation of stable cell lines using vectors designed to exist in the cell as episomes. This unit presents a general description of DEAE-dextran transfection, as well as two more specific protocols for typical experimental applications. The basic protocol is suitable for transfection of anchorage-dependent (attached) cells. For cells that grow in suspension, electroporation or lipofection is usually preferred, although DEAE-dextran-mediated transfection can be used.

  2. Transfection using DEAE-dextran.

    PubMed

    Gulick, Tod

    2003-08-01

    Transfection of cultured mammalian cells using diethylaminoethyl (DEAE)-dextran/DNA can be an attractive alternative to other transfection methods in many circumstances. The major advantages of the technique are its relative simplicity and speed, limited expense, and remarkably reproducible interexperimental and intraexperimental transfection efficiency. Disadvantages include inhibition of cell growth and induction of heterogeneous morphological changes in cells. Furthermore, the concentration of serum in the culture medium must be transiently reduced during the transfection. In general, DEAE-dextran DNA transfection is ideal for transient transfections with promoter/reporter plasmids in analyses of promoter and enhancer functions, and is suitable for overexpression of recombinant protein in transient transfections or for generation of stable cell lines using vectors designed to exist in the cell as episomes. This unit presents a general description of DEAE-dextran transfection, as well as two more specific protocols for typical experimental applications. The basic protocol is suitable for transfection of anchorage-dependent (attached) cells. For cells that grow in suspension, electroporation or lipofection is usually preferred, although DEAE-dextran-mediated transfection can be used.

  3. Transfection using DEAE-dextran.

    PubMed

    Gulick, T

    2001-05-01

    Transfection of cultured mammalian cells using diethylaminoethyl (DEAE)-dextran/DNA can be an attractive alternative to other transfection methods in many circumstances. The major advantages of the technique are its relative simplicity and speed, limited expense, and remarkably reproducible interexperimental and intraexperimental transfection efficiency. Disadvantages include inhibition of cell growth and induction of heterogeneous morphological changes in cells. Furthermore, the concentration of serum in the culture medium must be transiently reduced during the transfection. In general, DEAE-dextran DNA transfection is ideal for transient transfections with promoter/reporter plasmids in analyses of promoter and enhancer functions, and is suitable for overexpression of recombinant protein in transient transfections or for generation of stable cell lines using vectors designed to exist in the cell as episomes. This unit presents a general description of DEAE-dextran transfection, as well as two more specific protocols for typical experimental applications. The Basic Protocol is suitable for transfection of anchorage-dependent (attached) cells. For cells that grow in suspension, electroporation or lipofection is usually preferred, although DEAE-dextran-mediated transfection can be used.

  4. OpenSource lab-on-a-chip physiometer for accelerated zebrafish embryo biotests.

    PubMed

    Akagi, Jin; Hall, Chris J; Crosier, Kathryn E; Cooper, Jonathan M; Crosier, Philip S; Wlodkowic, Donald

    2014-01-02

    Zebrafish (Danio rerio) embryo assays have recently come into the spotlight as convenient experimental models in both biomedicine and ecotoxicology. Because the zebrafish is a small aquatic model organism, embryo assays allow for rapid physiological, embryo-, and genotoxic tests of drugs and environmental toxins that can be simply dissolved in water. This protocol describes the prototyping and application of an innovative, miniaturized, polymeric chip-based device capable of immobilizing a large number of living fish embryos for real-time and/or time-lapse microscopic examination. The device provides a physical address designation to each embryo during analysis, continuous perfusion of medium, and post-analysis specimen recovery. The miniaturized embryo array is a new concept for the immobilization and real-time drug perfusion of multiple individual developing zebrafish embryos inside the mesofluidic device. The OpenSource device presented in this protocol is particularly suitable for performing accelerated fish embryo biotests in ecotoxicology and phenotype-based pharmaceutical screening. Copyright © 2014 John Wiley & Sons, Inc.

  5. Study and Simulation of Enhancements for TCP (Transmission Control Protocol) Performance Over Noisy, High-Latency Links

    NASA Technical Reports Server (NTRS)

    Shepard, Timothy J.; Partridge, Craig; Coulter, Robert

    1997-01-01

    The designers of the TCP/IP protocol suite explicitly included support of satellites in their design goals. The goal of the Internet Project was to design a protocol which could be layered over different networking technologies to allow them to be concatenated into an internet. The results of this project included two protocols, IP and TCP. IP is the protocol used by all elements in the network, and it defines the standard packet format for IP datagrams. TCP is the end-to-end transport protocol commonly used between end systems on the Internet to derive a reliable bi-directional byte-pipe service from the underlying unreliable IP datagram service. Satellite links are explicitly mentioned in Vint Cerf's 2-page article which appeared in 1980 in CCR [2] to introduce the specifications for IP and TCP. In the past fifteen years, TCP has been demonstrated to work over many differing networking technologies, including over paths that include satellite links. So if satellite links were in the minds of the designers from the beginning, what is the problem? The problem is that the performance of TCP has in some cases been disappointing. A goal of the authors of the original specification of TCP was to specify only enough behavior to ensure interoperability. The specification left a number of important decisions, in particular how much data is to be sent and when, to the implementor. This was deliberately done. By leaving performance-related decisions to the implementor, the protocol TCP could be tuned and adapted to different networks and situations in the future without the need to revise the specification of the protocol or break interoperability. Interoperability would continue while future implementations would be allowed flexibility to adapt to needs which could not be anticipated at the time of the original protocol design.
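    One of the implementation decisions alluded to above, how much data to keep in flight, maps onto socket buffer sizing for high bandwidth-delay-product satellite paths. The sketch below computes the bandwidth-delay product for illustrative link parameters and requests matching buffers; the operating system may clamp the requested sizes.

      # Socket buffer sizing for a long-latency path (illustrative link figures).
      import socket

      LINK_RATE_BPS = 10_000_000    # 10 Mbit/s
      RTT_S = 0.550                 # ~550 ms geostationary round-trip time

      bdp_bytes = int(LINK_RATE_BPS / 8 * RTT_S)   # bandwidth-delay product
      print(f"bandwidth-delay product: {bdp_bytes} bytes")

      sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
      sock.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, bdp_bytes)
      sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, bdp_bytes)
      print("effective SO_SNDBUF:", sock.getsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF))
      sock.close()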

  6. In-Space Networking on NASA's SCAN Testbed

    NASA Technical Reports Server (NTRS)

    Brooks, David E.; Eddy, Wesley M.; Clark, Gilbert J.; Johnson, Sandra K.

    2016-01-01

    The NASA Space Communications and Navigation (SCaN) Testbed, an external payload onboard the International Space Station, is equipped with three software defined radios and a flight computer for supporting in-space communication research. New technologies being studied using the SCaN Testbed include advanced networking, coding, and modulation protocols designed to support the transition of NASA's mission systems from primarily point-to-point data links and preplanned routes towards the adaptive, autonomous internetworked operations needed to meet future mission objectives. Networking protocols implemented on the SCaN Testbed include the Advanced Orbiting Systems (AOS) link-layer protocol, Consultative Committee for Space Data Systems (CCSDS) Encapsulation Packets, Internet Protocol (IP), Space Link Extension (SLE), CCSDS File Delivery Protocol (CFDP), and Delay-Tolerant Networking (DTN) protocols including the Bundle Protocol (BP) and Licklider Transmission Protocol (LTP). The SCaN Testbed end-to-end system provides three S-band data links and one Ka-band data link to exchange space and ground data through NASA's Tracking Data Relay Satellite System or a direct-to-ground link to ground stations. The multiple data links and nodes provide several upgradable elements on both the space and ground systems. This paper will provide a general description of the testbed's system design and capabilities, discuss in detail the design and lessons learned in the implementation of the network protocols, and describe future plans for continuing research to meet the communication needs of evolving global space systems.

  7. Knowledge engineering tools for reasoning with scientific observations and interpretations: a neural connectivity use case.

    PubMed

    Russ, Thomas A; Ramakrishnan, Cartic; Hovy, Eduard H; Bota, Mihail; Burns, Gully A P C

    2011-08-22

    We address the goal of curating observations from published experiments in a generalizable form; reasoning over these observations to generate interpretations and then querying this interpreted knowledge to supply the supporting evidence. We present web-application software as part of the 'BioScholar' project (R01-GM083871) that fully instantiates this process for a well-defined domain: using tract-tracing experiments to study the neural connectivity of the rat brain. The main contribution of this work is to provide the first instantiation of a knowledge representation for experimental observations called 'Knowledge Engineering from Experimental Design' (KEfED) based on experimental variables and their interdependencies. The software has three parts: (a) the KEfED model editor - a design editor for creating KEfED models by drawing a flow diagram of an experimental protocol; (b) the KEfED data interface - a spreadsheet-like tool that permits users to enter experimental data pertaining to a specific model; (c) a 'neural connection matrix' interface that presents neural connectivity as a table of ordinal connection strengths representing the interpretations of tract-tracing data. This tool also allows the user to view experimental evidence pertaining to a specific connection. BioScholar is built in Flex 3.5. It uses Persevere (a noSQL database) as a flexible data store and PowerLoom® (a mature First Order Logic reasoning system) to execute queries using spatial reasoning over the BAMS neuroanatomical ontology. We first introduce the KEfED approach as a general approach and describe its possible role as a way of introducing structured reasoning into models of argumentation within new models of scientific publication. We then describe the design and implementation of our example application: the BioScholar software. This is presented as a possible biocuration interface and supplementary reasoning toolkit for a larger, more specialized bioinformatics system: the Brain Architecture Management System (BAMS).
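    A minimal sketch of the underlying idea, experimental variables linked by explicit dependencies that can be traversed when interpreting a measurement, is given below. The field names and the toy tract-tracing protocol are invented for illustration and do not reproduce the actual KEfED/BioScholar schema.

      # Toy dependency model of an experimental protocol (illustrative only).
      from dataclasses import dataclass, field

      @dataclass
      class Variable:
          name: str
          kind: str                              # e.g. "parameter" or "measurement"
          depends_on: list = field(default_factory=list)

      @dataclass
      class ProtocolModel:
          variables: dict = field(default_factory=dict)

          def add(self, name, kind, depends_on=()):
              self.variables[name] = Variable(name, kind, list(depends_on))

          def upstream(self, name):
              """All variables a given measurement ultimately depends on."""
              out, stack = set(), [name]
              while stack:
                  for dep in self.variables[stack.pop()].depends_on:
                      if dep not in out:
                          out.add(dep)
                          stack.append(dep)
              return out

      model = ProtocolModel()
      model.add("injection_site", "parameter")
      model.add("tracer", "parameter")
      model.add("labeling_density", "measurement", depends_on=["injection_site", "tracer"])
      model.add("connection_strength", "measurement", depends_on=["labeling_density"])

      print(model.upstream("connection_strength"))
      # expected members: labeling_density, injection_site, tracer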

  8. Knowledge engineering tools for reasoning with scientific observations and interpretations: a neural connectivity use case

    PubMed Central

    2011-01-01

    Background We address the goal of curating observations from published experiments in a generalizable form; reasoning over these observations to generate interpretations and then querying this interpreted knowledge to supply the supporting evidence. We present web-application software as part of the 'BioScholar' project (R01-GM083871) that fully instantiates this process for a well-defined domain: using tract-tracing experiments to study the neural connectivity of the rat brain. Results The main contribution of this work is to provide the first instantiation of a knowledge representation for experimental observations called 'Knowledge Engineering from Experimental Design' (KEfED) based on experimental variables and their interdependencies. The software has three parts: (a) the KEfED model editor - a design editor for creating KEfED models by drawing a flow diagram of an experimental protocol; (b) the KEfED data interface - a spreadsheet-like tool that permits users to enter experimental data pertaining to a specific model; (c) a 'neural connection matrix' interface that presents neural connectivity as a table of ordinal connection strengths representing the interpretations of tract-tracing data. This tool also allows the user to view experimental evidence pertaining to a specific connection. BioScholar is built in Flex 3.5. It uses Persevere (a noSQL database) as a flexible data store and PowerLoom® (a mature First Order Logic reasoning system) to execute queries using spatial reasoning over the BAMS neuroanatomical ontology. Conclusions We first introduce the KEfED approach as a general approach and describe its possible role as a way of introducing structured reasoning into models of argumentation within new models of scientific publication. We then describe the design and implementation of our example application: the BioScholar software. This is presented as a possible biocuration interface and supplementary reasoning toolkit for a larger, more specialized bioinformatics system: the Brain Architecture Management System (BAMS). PMID:21859449

  9. Effects of exercise with or without light exposure on sleep quality and hormone reponses

    PubMed Central

    Lee, Hayan; Kim, Sunho; Kim, Donghee

    2014-01-01

    [Purpose] The objectives of the present study were to determine the effect of sun exposure and aerobic exercise on quality of sleep and to investigate sleep-related hormonal responses in college-aged males. [Methods] A cross-over design was utilized. The subjects (N = 10), who had no physical problems or sleep disorders, performed 4 protocols: sun exposure only (for 30 minutes, EG1), aerobic exercise only (walking and jogging for 30 minutes, EG2), aerobic exercise with sun exposure (EG3), and control (no exercise and no sun exposure, EG4). Each protocol was performed 5 times per week, with a one-week break (wash-out period) between protocols to prevent carry-over effects from the previous protocol; the total test period was therefore 7 weeks (alternating one-week protocol periods and one-week breaks). Before and after each aerobic exercise session, the subjects stretched to warm up for 5 to 10 minutes. Surveys covering bedtime, wake-up time, sleep onset latency, and the Pittsburgh Sleep Quality Index (PSQI) were obtained before the test and after each protocol. After each protocol, the following sleep-related hormonal responses were measured: blood concentrations of melatonin, cortisol, epinephrine, and norepinephrine. One-way ANOVA was used to determine differences between protocols. Statistical significance was set at p < 0.05. [Results] Bedtime in EG4 was significantly later than in EG1 or EG3. Wake-up time in EG4 was significantly later than in EG1 or EG3. Sleep onset latency in EG4 was longer than in EG3. The quality of sleep in EG4 was lower than in EG3. The sleep cycle in EG4 was significantly shorter than in EG1. Blood melatonin concentrations in EG3 were significantly higher than in EG4. There were no significant differences in blood concentrations of cortisol, epinephrine, or norepinephrine among protocols, with values ordered from lowest to highest as EG1 < EG2 < EG3 < EG4. [Conclusion] The present data indicate that EG1 and EG3 showed positive sleep-related hormonal responses, sleep habits, and quality of sleep, suggesting that sun exposure or exercise with sun exposure may improve physical status and quality of life. PMID:25566466

  10. Designing Endocrine Disruption Out of the Next Generation of Chemicals

    PubMed Central

    Schug, T.T; Abagyan, R.; Blumberg, B.; Collins, T.J.; Crews, D.; DeFur, P.L.; Dickerson, S.M.; Edwards, T.M.; Gore, A.C.; Guillette, L.J.; Hayes, T.; Heindel, J.J.; Moores, A.; Patisaul, H.B.; Tal, T.L.; Thayer, K.A.; Vandenberg, L.N.; Warner, J.; Watson, C.S.; Saal, F.S. vom; Zoeller, R.T.; O’Brien, K.P.; Myers, J.P.

    2013-01-01

    A central goal of green chemistry is to avoid hazard in the design of new chemicals. This objective is best achieved when information about a chemical’s potential hazardous effects is obtained as early in the design process as feasible. Endocrine disruption is a type of hazard that to date has been inadequately addressed by both industrial and regulatory science. To aid chemists in avoiding this hazard, we propose an endocrine disruption testing protocol for use by chemists in the design of new chemicals. The Tiered Protocol for Endocrine Disruption (TiPED) has been created under the oversight of a scientific advisory committee composed of leading representatives from both green chemistry and the environmental health sciences. TiPED is conceived as a tool for new chemical design; thus, it starts with a chemist theoretically at “the drawing board.” It consists of five testing tiers ranging from broad in silico evaluation up through specific cell- and whole organism-based assays. To be effective at detecting endocrine disruption, a testing protocol must be able to measure potential hormone-like or hormone-inhibiting effects of chemicals, as well as the many possible interactions and signaling sequelae such chemicals may have with cell-based receptors. Accordingly, we have designed this protocol to broadly interrogate the endocrine system. The proposed protocol will not detect all possible mechanisms of endocrine disruption, because scientific understanding of these phenomena is advancing rapidly. To ensure that the protocol remains current, we have established a plan for incorporating new assays into the protocol as the science advances. In this paper we present the principles that should guide the science of testing new chemicals for endocrine disruption, as well as principles by which to evaluate individual assays for applicability, and laboratories for reliability. In a ‘proof-of-principle’ test, we ran 6 endocrine disrupting chemicals (EDCs) that act via different endocrinological mechanisms through the protocol using published literature. Each was identified as endocrine active by one or more tiers. We believe that this voluntary testing protocol will be a dynamic tool to facilitate efficient and early identification of potentially problematic chemicals, while ultimately reducing the risks to public health. PMID:25110461

  11. Designing Endocrine Disruption Out of the Next Generation of Chemicals.

    PubMed

    Schug, T T; Abagyan, R; Blumberg, B; Collins, T J; Crews, D; DeFur, P L; Dickerson, S M; Edwards, T M; Gore, A C; Guillette, L J; Hayes, T; Heindel, J J; Moores, A; Patisaul, H B; Tal, T L; Thayer, K A; Vandenberg, L N; Warner, J; Watson, C S; Saal, F S Vom; Zoeller, R T; O'Brien, K P; Myers, J P

    2013-01-01

    A central goal of green chemistry is to avoid hazard in the design of new chemicals. This objective is best achieved when information about a chemical's potential hazardous effects is obtained as early in the design process as feasible. Endocrine disruption is a type of hazard that to date has been inadequately addressed by both industrial and regulatory science. To aid chemists in avoiding this hazard, we propose an endocrine disruption testing protocol for use by chemists in the design of new chemicals. The Tiered Protocol for Endocrine Disruption (TiPED) has been created under the oversight of a scientific advisory committee composed of leading representatives from both green chemistry and the environmental health sciences. TiPED is conceived as a tool for new chemical design; thus, it starts with a chemist theoretically at "the drawing board." It consists of five testing tiers ranging from broad in silico evaluation up through specific cell- and whole organism-based assays. To be effective at detecting endocrine disruption, a testing protocol must be able to measure potential hormone-like or hormone-inhibiting effects of chemicals, as well as the many possible interactions and signaling sequelae such chemicals may have with cell-based receptors. Accordingly, we have designed this protocol to broadly interrogate the endocrine system. The proposed protocol will not detect all possible mechanisms of endocrine disruption, because scientific understanding of these phenomena is advancing rapidly. To ensure that the protocol remains current, we have established a plan for incorporating new assays into the protocol as the science advances. In this paper we present the principles that should guide the science of testing new chemicals for endocrine disruption, as well as principles by which to evaluate individual assays for applicability, and laboratories for reliability. In a 'proof-of-principle' test, we ran 6 endocrine disrupting chemicals (EDCs) that act via different endocrinological mechanisms through the protocol using published literature. Each was identified as endocrine active by one or more tiers. We believe that this voluntary testing protocol will be a dynamic tool to facilitate efficient and early identification of potentially problematic chemicals, while ultimately reducing the risks to public health.

  12. Modeling of Receptor Tyrosine Kinase Signaling: Computational and Experimental Protocols.

    PubMed

    Fey, Dirk; Aksamitiene, Edita; Kiyatkin, Anatoly; Kholodenko, Boris N

    2017-01-01

    The advent of systems biology has convincingly demonstrated that the integration of experiments and dynamic modelling is a powerful approach to understanding cellular network biology. Here we present experimental and computational protocols that are necessary for applying this integrative approach to quantitative studies of receptor tyrosine kinase (RTK) signaling networks. Signaling by RTKs controls multiple cellular processes, including the regulation of cell survival, motility, proliferation, differentiation, glucose metabolism, and apoptosis. We describe methods of model building and training on experimentally obtained quantitative datasets, as well as experimental methods of obtaining quantitative dose-response and temporal dependencies of protein phosphorylation and activities. The presented methods make possible (1) both the fine-grained modeling of complex signaling dynamics and the identification of salient, coarse-grained network structures (such as feedback loops) that bring about intricate dynamics, and (2) the experimental validation of dynamic models.
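    The sketch below shows the kind of ordinary-differential-equation model such protocols address, a ligand-activated receptor driving a downstream phosphorylation cycle with negative feedback, integrated with SciPy; the model structure and parameter values are illustrative and are not taken from the chapter.

      # Illustrative two-variable RTK-style signaling model integrated with SciPy.
      import numpy as np
      from scipy.integrate import solve_ivp

      def rhs(t, y, k_act=1.0, k_deact=0.5, k_phos=2.0, k_dephos=1.0, k_fb=3.0):
          r_active, x_p = y   # fraction of active receptor, phospho-protein
          dr = k_act * (1.0 - r_active) / (1.0 + k_fb * x_p) - k_deact * r_active
          dx = k_phos * r_active * (1.0 - x_p) - k_dephos * x_p
          return [dr, dx]

      sol = solve_ivp(rhs, (0.0, 20.0), y0=[0.0, 0.0], dense_output=True)
      t = np.linspace(0.0, 20.0, 5)
      print(np.round(sol.sol(t).T, 3))   # columns: active receptor, phospho-protein

    Model training of the kind the chapter describes would then adjust the rate constants so that trajectories like these reproduce measured dose-response and time-course data.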

  13. Rethinking the NTCIP Design and Protocols - Analyzing the Issues

    DOT National Transportation Integrated Search

    1998-03-03

    This working paper discusses the issues involved in changing the current draft NTCIP standard from an X.25-based protocol stack to an Internet-based protocol stack. It contains a methodology which could be used to change NTCIP's base protocols. This ...

  14. Effects of different self-assembled monolayers on thin-film morphology: a combined DFT/MD simulation protocol.

    PubMed

    Alberga, Domenico; Mangiatordi, Giuseppe Felice; Motta, Alessandro; Nicolotti, Orazio; Lattanzi, Gianluca

    2015-10-06

    Organic thin film transistors (OTFTs) are multilayer field-effect transistors that employ an organic conjugated material as the semiconductor. Several experimental groups have recently demonstrated that the insertion of an organic self-assembled monolayer (SAM) between the dielectric and the semiconductive layer is responsible for an appreciable improvement of OTFT performance, in terms of an increased charge carrier mobility caused by a higher degree of order in the organic semiconductor layer. Here, we describe a combined periodic density functional theory (DFT) and classical molecular dynamics (MD) protocol applied to four different SAMs and a pentacene monolayer deposited onto their surfaces. In particular, we investigate the morphology and the surface of the four SAMs and the translational, orientational, and nematic order of the monolayer through the calculation of several distribution functions and order parameters, pointing out the differences among the systems and relating them to known experimental results. Our calculations also suggest that small differences in the SAM molecular design produce remarkable differences in the SAM surface and monolayer order. In particular, our simulations explain how a SAM with a bulky terminal group results in an irregular and rough surface that leads to the deposition of a disordered semiconductive monolayer. On the contrary, SAMs with a small terminal group generate smooth surfaces with uninterrupted periodicity, thus favoring the formation of an ordered pentacene monolayer that increases the mobility of charge carriers and improves the overall performance of OTFT devices. Our results clearly point out that the in silico procedure presented here might be of help in tuning the design of SAMs in order to improve the quality of OTFT devices.
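    One of the order metrics mentioned above, the nematic order parameter, can be computed from molecular long-axis vectors as the largest eigenvalue of the Q-tensor; the sketch below does this for synthetic orientations standing in for axes extracted from an MD trajectory.

      # Nematic order parameter from molecular axis vectors (synthetic input).
      import numpy as np

      def nematic_order(axes):
          """axes: (N, 3) array of molecular long-axis vectors."""
          u = axes / np.linalg.norm(axes, axis=1, keepdims=True)
          q = 1.5 * np.einsum("ni,nj->ij", u, u) / len(u) - 0.5 * np.eye(3)
          return float(np.linalg.eigvalsh(q).max())   # ~1 ordered, ~0 isotropic

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          aligned = np.column_stack([0.1 * rng.standard_normal((500, 2)), np.ones(500)])
          isotropic = rng.standard_normal((500, 3))
          print(f"aligned:   S = {nematic_order(aligned):.2f}")
          print(f"isotropic: S = {nematic_order(isotropic):.2f}")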

  15. Modeling Collective Animal Behavior with a Cognitive Perspective: A Methodological Framework

    PubMed Central

    Weitz, Sebastian; Blanco, Stéphane; Fournier, Richard; Gautrais, Jacques; Jost, Christian; Theraulaz, Guy

    2012-01-01

    The last decades have seen an increasing interest in modeling collective animal behavior. Some studies try to reproduce as accurately as possible the collective dynamics and patterns observed in several animal groups with biologically plausible, individual behavioral rules. The objective is then essentially to demonstrate that the observed collective features may be the result of self-organizing processes involving quite simple individual behaviors. Other studies concentrate on the objective of establishing or enriching links between collective behavior research and cognitive or physiological research, which then requires that each individual rule be carefully validated. Here we discuss the methodological consequences of this additional requirement. Using the example of corpse clustering in ants, we first illustrate that it may be impossible to discriminate among alternative individual rules by considering only observational data collected at the group level. Six individual behavioral models are described: they are clearly distinct in terms of individual behaviors, they all reproduce satisfactorily the collective dynamics and distribution patterns observed in experiments, and we show theoretically that it is strictly impossible to discriminate two of these models even in the limit of an infinite amount of data, whatever the accuracy level. A set of methodological steps is then listed and discussed as practical ways to partially overcome this problem. They involve complementary experimental protocols specifically designed to address the behavioral rules successively, reserving group-level data for the overall model validation. In this context, we highlight the importance of maintaining a sharp distinction between model enunciation, with explicit references to validated biological concepts, and formal translation of these concepts in terms of quantitative state variables and fittable functional dependences. Illustrative examples are provided of the benefits expected during the often long and difficult process of refining a behavioral model, designing adapted experimental protocols and inverting model parameters. PMID:22761685

  16. Modeling collective animal behavior with a cognitive perspective: a methodological framework.

    PubMed

    Weitz, Sebastian; Blanco, Stéphane; Fournier, Richard; Gautrais, Jacques; Jost, Christian; Theraulaz, Guy

    2012-01-01

    The last decades have seen an increasing interest in modeling collective animal behavior. Some studies try to reproduce as accurately as possible the collective dynamics and patterns observed in several animal groups with biologically plausible, individual behavioral rules. The objective is then essentially to demonstrate that the observed collective features may be the result of self-organizing processes involving quite simple individual behaviors. Other studies concentrate on the objective of establishing or enriching links between collective behavior research and cognitive or physiological research, which then requires that each individual rule be carefully validated. Here we discuss the methodological consequences of this additional requirement. Using the example of corpse clustering in ants, we first illustrate that it may be impossible to discriminate among alternative individual rules by considering only observational data collected at the group level. Six individual behavioral models are described: they are clearly distinct in terms of individual behaviors, they all reproduce satisfactorily the collective dynamics and distribution patterns observed in experiments, and we show theoretically that it is strictly impossible to discriminate two of these models even in the limit of an infinite amount of data, whatever the accuracy level. A set of methodological steps is then listed and discussed as practical ways to partially overcome this problem. They involve complementary experimental protocols specifically designed to address the behavioral rules successively, reserving group-level data for the overall model validation. In this context, we highlight the importance of maintaining a sharp distinction between model enunciation, with explicit references to validated biological concepts, and formal translation of these concepts in terms of quantitative state variables and fittable functional dependences. Illustrative examples are provided of the benefits expected during the often long and difficult process of refining a behavioral model, designing adapted experimental protocols and inverting model parameters.

  17. An experimental and numerical investigation of phase change electrodes for therapeutic irreversible electroporation.

    PubMed

    Arena, Christopher B; Mahajan, Roop L; Nichole Rylander, Marissa; Davalos, Rafael V

    2013-11-01

    Irreversible electroporation (IRE) is a new technology for ablating aberrant tissue that utilizes pulsed electric fields (PEFs) to kill cells by destabilizing their plasma membrane. When treatments are planned correctly, the pulse parameters and location of the electrodes for delivering the pulses are selected to permit destruction of the target tissue without causing thermal damage to the surrounding structures. This allows for the treatment of surgically inoperable masses that are located near major blood vessels and nerves. In select cases of high-dose IRE, where a large ablation volume is desired without increasing the number of electrode insertions, it can become challenging to design a pulse protocol that is inherently nonthermal. To solve this problem we have developed a new electrosurgical device that requires no external equipment or protocol modifications. The design incorporates a phase change material (PCM) into the electrode core that melts during treatment and absorbs heat out of the surrounding tissue. Here, this idea is reduced to practice by testing hollow electrodes filled with gallium on tissue phantoms and monitoring temperature in real time. Additionally, the experimental data generated are used to validate a numerical model of the heat transfer problem, which is then applied to investigate the cooling performance of other classes of PCMs. The results indicate that metallic PCMs, such as gallium, are better suited than organics or salt hydrates for thermal management, because their comparatively higher thermal conductivity aids in heat dissipation. However, the melting point of the metallic PCM must be properly adjusted to ensure that the phase transition is not completed before the end of treatment. When translated clinically, phase change electrodes have the potential to continue to allow IRE to be performed safely near critical structures, even in high-dose cases.
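
    The heat transfer problem the authors model numerically can be sketched with a one-dimensional explicit finite-difference scheme in which the latent heat of the electrode core is handled through an apparent heat capacity over the melting range; the sketch below uses placeholder material values and geometry, not the paper's gallium/tissue-phantom parameters.

      # 1-D apparent-heat-capacity sketch of a phase-change electrode core next
      # to a Joule-heated region. All values are placeholders for illustration.
      import numpy as np

      length, nodes = 0.02, 100                    # 2 cm domain
      dx = length / nodes
      k, rho, cp = 0.5, 1000.0, 3600.0             # tissue-like properties
      latent, t_melt, half_range = 80e3, 29.8, 0.5 # PCM latent heat and melt point
      dt = 0.2 * dx**2 * rho * cp / k              # stable explicit time step

      T = np.full(nodes, 20.0)                     # room-temperature phantom
      pcm = slice(0, 10)                           # first 10 nodes = PCM core

      def c_eff(temp):
          """Apparent heat capacity: latent heat smeared over the melting range."""
          c = np.full_like(temp, cp)
          melting = np.abs(temp - t_melt) < half_range
          c[melting] += latent / (2 * half_range)
          return c

      for _ in range(2000):
          q = np.zeros(nodes); q[10:30] = 5e5      # heating zone beside the core (W/m^3)
          lap = np.zeros(nodes)
          lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
          c = np.full(nodes, cp); c[pcm] = c_eff(T[pcm])
          T[1:-1] += dt * (k * lap[1:-1] + q[1:-1]) / (rho * c[1:-1])

      print("peak temperature (C):", round(T.max(), 1))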

  18. Self-referenced continuous-variable quantum key distribution protocol

    DOE PAGES

    Soh, Daniel Beom Soo; Sarovar, Mohan; Brif, Constantin; ...

    2015-10-21

    We introduce a new continuous-variable quantum key distribution (CV-QKD) protocol, self-referenced CV-QKD, that eliminates the need for transmission of a high-power local oscillator between the communicating parties. In this protocol, each signal pulse is accompanied by a reference pulse (or a pair of twin reference pulses), used to align Alice’s and Bob’s measurement bases. The method of phase estimation and compensation based on the reference pulse measurement can be viewed as a quantum analog of intradyne detection used in classical coherent communication, which extracts the phase information from the modulated signal. We present a proof-of-principle, fiber-based experimental demonstration of the protocol and quantify the expected secret key rates by expressing them in terms of experimental parameters. Our analysis of the secret key rate fully takes into account the inherent uncertainty associated with the quantum nature of the reference pulse(s) and quantifies the limit at which the theoretical key rate approaches that of the respective conventional protocol that requires local oscillator transmission. The self-referenced protocol greatly simplifies the hardware required for CV-QKD, especially for potential integrated photonics implementations of transmitters and receivers, with minimum sacrifice of performance. Furthermore, it provides a pathway towards scalable integrated CV-QKD transceivers, a vital step towards large-scale QKD networks.
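
    A toy numerical illustration of the reference-pulse idea (not the authors' implementation or security analysis): the relative phase is estimated from the measured quadratures of a known reference pulse and the signal quadratures are rotated back accordingly. Amplitudes and noise levels below are invented.

      # Toy sketch: phase estimation from a reference pulse and compensation of
      # the signal quadratures.
      import numpy as np

      rng = np.random.default_rng(1)
      theta = rng.uniform(0, 2 * np.pi)            # unknown relative phase

      x_sig, p_sig = rng.normal(0, 2, 1000), rng.normal(0, 2, 1000)  # Alice's modulation
      x_ref, p_ref = 5.0, 0.0                                        # known reference

      def through_channel(x, p, noise=0.1):
          xr = np.cos(theta) * x - np.sin(theta) * p + rng.normal(0, noise, np.size(x))
          pr = np.sin(theta) * x + np.cos(theta) * p + rng.normal(0, noise, np.size(x))
          return xr, pr

      xs, ps = through_channel(x_sig, p_sig)
      xr, pr = through_channel(x_ref, p_ref)

      theta_hat = np.arctan2(np.mean(pr), np.mean(xr))   # phase from the reference pulse
      x_corr = np.cos(theta_hat) * xs + np.sin(theta_hat) * ps
      print("phase error (rad):", abs((theta_hat - theta + np.pi) % (2 * np.pi) - np.pi))
      print("correlation with Alice's x:", round(np.corrcoef(x_corr, x_sig)[0, 1], 3))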

  19. Self-referenced continuous-variable quantum key distribution protocol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soh, Daniel Beom Soo; Sarovar, Mohan; Brif, Constantin

    We introduce a new continuous-variable quantum key distribution (CV-QKD) protocol, self-referenced CV-QKD, that eliminates the need for transmission of a high-power local oscillator between the communicating parties. In this protocol, each signal pulse is accompanied by a reference pulse (or a pair of twin reference pulses), used to align Alice’s and Bob’s measurement bases. The method of phase estimation and compensation based on the reference pulse measurement can be viewed as a quantum analog of intradyne detection used in classical coherent communication, which extracts the phase information from the modulated signal. We present a proof-of-principle, fiber-based experimental demonstration of the protocol and quantify the expected secret key rates by expressing them in terms of experimental parameters. Our analysis of the secret key rate fully takes into account the inherent uncertainty associated with the quantum nature of the reference pulse(s) and quantifies the limit at which the theoretical key rate approaches that of the respective conventional protocol that requires local oscillator transmission. The self-referenced protocol greatly simplifies the hardware required for CV-QKD, especially for potential integrated photonics implementations of transmitters and receivers, with minimum sacrifice of performance. Furthermore, it provides a pathway towards scalable integrated CV-QKD transceivers, a vital step towards large-scale QKD networks.

  20. Self-Referenced Continuous-Variable Quantum Key Distribution Protocol

    NASA Astrophysics Data System (ADS)

    Soh, Daniel B. S.; Brif, Constantin; Coles, Patrick J.; Lütkenhaus, Norbert; Camacho, Ryan M.; Urayama, Junji; Sarovar, Mohan

    2015-10-01

    We introduce a new continuous-variable quantum key distribution (CV-QKD) protocol, self-referenced CV-QKD, that eliminates the need for transmission of a high-power local oscillator between the communicating parties. In this protocol, each signal pulse is accompanied by a reference pulse (or a pair of twin reference pulses), used to align Alice's and Bob's measurement bases. The method of phase estimation and compensation based on the reference pulse measurement can be viewed as a quantum analog of intradyne detection used in classical coherent communication, which extracts the phase information from the modulated signal. We present a proof-of-principle, fiber-based experimental demonstration of the protocol and quantify the expected secret key rates by expressing them in terms of experimental parameters. Our analysis of the secret key rate fully takes into account the inherent uncertainty associated with the quantum nature of the reference pulse(s) and quantifies the limit at which the theoretical key rate approaches that of the respective conventional protocol that requires local oscillator transmission. The self-referenced protocol greatly simplifies the hardware required for CV-QKD, especially for potential integrated photonics implementations of transmitters and receivers, with minimum sacrifice of performance. As such, it provides a pathway towards scalable integrated CV-QKD transceivers, a vital step towards large-scale QKD networks.

  1. A shorter and more specific oral sensitization-based experimental model of food allergy in mice.

    PubMed

    Bailón, Elvira; Cueto-Sola, Margarita; Utrilla, Pilar; Rodríguez-Ruiz, Judith; Garrido-Mesa, Natividad; Zarzuelo, Antonio; Xaus, Jordi; Gálvez, Julio; Comalada, Mònica

    2012-07-31

    Cow's milk protein allergy (CMPA) is one of the most prevalent human food-borne allergies, particularly in children. Experimental animal models have become critical tools with which to perform research on new therapeutic approaches and on the molecular mechanisms involved. However, oral food allergen sensitization in mice requires several weeks and is usually associated with nonspecific immune responses. To overcome these drawbacks, we have developed a new food allergy model that takes only two weeks while retaining the main characteristics of the allergic response to food antigens. The new model is characterized by oral sensitization of weaned Balb/c mice with 5 doses of purified cow's milk protein (CMP) plus cholera toxin (CT) for only two weeks, followed by a challenge with an intraperitoneal administration of the allergen at the end of the sensitization period. In parallel, we studied a conventional protocol that lasts for seven weeks, and also the nonspecific effects exerted by CT in both protocols. The shorter protocol achieves a clinical score similar to that of the original food allergy model without macroscopically affecting gut morphology or physiology. Moreover, the shorter protocol led to increased IL-4 production and a more selective antigen-specific IgG1 response. Finally, the extended CT administration during the sensitization period of the conventional protocol is responsible for the exacerbated immune response observed in that model. Therefore, the new model presented here allows a reduction not only in experimental time but also in the number of animals required per experiment, while maintaining the features of conventional allergy models. We propose that the new protocol reported here will contribute to advancing allergy research. Copyright © 2012 Elsevier B.V. All rights reserved.

  2. Gaussian error correction of quantum states in a correlated noisy channel.

    PubMed

    Lassen, Mikael; Berni, Adriano; Madsen, Lars S; Filip, Radim; Andersen, Ulrik L

    2013-11-01

    Noise is the main obstacle to the realization of fault-tolerant quantum information processing and secure communication over long distances. In this work, we propose a communication protocol relying on simple linear optics that optimally protects quantum states from non-Markovian or correlated noise. We implement the protocol experimentally and demonstrate the near-ideal protection of coherent and entangled states in an extremely noisy channel. Since all real-life channels exhibit pronounced non-Markovian behavior, the proposed protocol will have immediate implications for improving the performance of various quantum information protocols.

  3. C4MIP - The Coupled Climate-Carbon Cycle Model Intercomparison Project: experimental protocol for CMIP6

    NASA Astrophysics Data System (ADS)

    Jones, Chris D.; Arora, Vivek; Friedlingstein, Pierre; Bopp, Laurent; Brovkin, Victor; Dunne, John; Graven, Heather; Hoffman, Forrest; Ilyina, Tatiana; John, Jasmin G.; Jung, Martin; Kawamiya, Michio; Koven, Charlie; Pongratz, Julia; Raddatz, Thomas; Randerson, James T.; Zaehle, Sönke

    2016-08-01

    Coordinated experimental design and implementation has become a cornerstone of global climate modelling. Model Intercomparison Projects (MIPs) enable systematic and robust analysis of results across many models, by reducing the influence of ad hoc differences in model set-up or experimental boundary conditions. As it enters its 6th phase, the Coupled Model Intercomparison Project (CMIP6) has grown significantly in scope with the design and documentation of individual simulations delegated to individual climate science communities. The Coupled Climate-Carbon Cycle Model Intercomparison Project (C4MIP) takes responsibility for design, documentation, and analysis of carbon cycle feedbacks and interactions in climate simulations. These feedbacks are potentially large and play a leading-order contribution in determining the atmospheric composition in response to human emissions of CO2 and in the setting of emissions targets to stabilize climate or avoid dangerous climate change. For over a decade, C4MIP has coordinated coupled climate-carbon cycle simulations, and in this paper we describe the C4MIP simulations that will be formally part of CMIP6. While the climate-carbon cycle community has created this experimental design, the simulations also fit within the wider CMIP activity, conform to some common standards including documentation and diagnostic requests, and are designed to complement the CMIP core experiments known as the Diagnostic, Evaluation and Characterization of Klima (DECK). C4MIP has three key strands of scientific motivation and the requested simulations are designed to satisfy their needs: (1) pre-industrial and historical simulations (formally part of the common set of CMIP6 experiments) to enable model evaluation, (2) idealized coupled and partially coupled simulations with 1 % per year increases in CO2 to enable diagnosis of feedback strength and its components, (3) future scenario simulations to project how the Earth system will respond to anthropogenic activity over the 21st century and beyond. This paper documents in detail these simulations, explains their rationale and planned analysis, and describes how to set up and run the simulations. Particular attention is paid to boundary conditions, input data, and requested output diagnostics. It is important that modelling groups participating in C4MIP adhere as closely as possible to this experimental design.
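
    The feedback diagnosis that motivates the idealized 1 % per year experiments is usually reduced to two numbers: a carbon-concentration feedback parameter (beta) and a carbon-climate feedback parameter (gamma), obtained by combining the fully coupled and biogeochemically coupled runs. The sketch below shows that arithmetic on synthetic model output; the numbers are invented, not C4MIP results.

      # Sketch of the standard beta/gamma decomposition from fully coupled (COU)
      # and biogeochemically coupled (BGC) 1 %/yr CO2 runs. Output is synthetic.
      import numpy as np

      years = np.arange(140)
      co2 = 285.0 * 1.01 ** years                # 1 % per year CO2 (ppm)
      d_co2 = co2 - co2[0]

      dT_cou = 0.005 * d_co2                     # warming in the coupled run (K)
      beta_true, gamma_true = 1.1, -60.0         # PgC/ppm, PgC/K (made up)
      dC_bgc = beta_true * d_co2                 # BGC run: CO2 invisible to radiation
      dC_cou = beta_true * d_co2 + gamma_true * dT_cou

      beta = dC_bgc[-1] / d_co2[-1]                     # concentration feedback
      gamma = (dC_cou[-1] - dC_bgc[-1]) / dT_cou[-1]    # climate feedback
      print(f"beta  = {beta:.2f} PgC/ppm")
      print(f"gamma = {gamma:.1f} PgC/K")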

  4. C4MIP – The Coupled Climate–Carbon Cycle Model Intercomparison Project: Experimental protocol for CMIP6

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Chris D.; Arora, Vivek; Friedlingstein, Pierre

    Coordinated experimental design and implementation has become a cornerstone of global climate modelling. Model Intercomparison Projects (MIPs) enable systematic and robust analysis of results across many models, by reducing the influence of ad hoc differences in model set-up or experimental boundary conditions. As it enters its 6th phase, the Coupled Model Intercomparison Project (CMIP6) has grown significantly in scope with the design and documentation of individual simulations delegated to individual climate science communities. The Coupled Climate–Carbon Cycle Model Intercomparison Project (C4MIP) takes responsibility for design, documentation, and analysis of carbon cycle feedbacks and interactions in climate simulations. These feedbacks are potentially large and play a leading-order contribution in determining the atmospheric composition in response to human emissions of CO2 and in the setting of emissions targets to stabilize climate or avoid dangerous climate change. For over a decade, C4MIP has coordinated coupled climate–carbon cycle simulations, and in this paper we describe the C4MIP simulations that will be formally part of CMIP6. While the climate–carbon cycle community has created this experimental design, the simulations also fit within the wider CMIP activity, conform to some common standards including documentation and diagnostic requests, and are designed to complement the CMIP core experiments known as the Diagnostic, Evaluation and Characterization of Klima (DECK). C4MIP has three key strands of scientific motivation and the requested simulations are designed to satisfy their needs: (1) pre-industrial and historical simulations (formally part of the common set of CMIP6 experiments) to enable model evaluation, (2) idealized coupled and partially coupled simulations with 1 % per year increases in CO2 to enable diagnosis of feedback strength and its components, (3) future scenario simulations to project how the Earth system will respond to anthropogenic activity over the 21st century and beyond. This study documents in detail these simulations, explains their rationale and planned analysis, and describes how to set up and run the simulations. Particular attention is paid to boundary conditions, input data, and requested output diagnostics. It is important that modelling groups participating in C4MIP adhere as closely as possible to this experimental design.

  5. C4MIP – The Coupled Climate–Carbon Cycle Model Intercomparison Project: Experimental protocol for CMIP6

    DOE PAGES

    Jones, Chris D.; Arora, Vivek; Friedlingstein, Pierre; ...

    2016-08-25

    Coordinated experimental design and implementation has become a cornerstone of global climate modelling. Model Intercomparison Projects (MIPs) enable systematic and robust analysis of results across many models, by reducing the influence of ad hoc differences in model set-up or experimental boundary conditions. As it enters its 6th phase, the Coupled Model Intercomparison Project (CMIP6) has grown significantly in scope with the design and documentation of individual simulations delegated to individual climate science communities. The Coupled Climate–Carbon Cycle Model Intercomparison Project (C4MIP) takes responsibility for design, documentation, and analysis of carbon cycle feedbacks and interactions in climate simulations. These feedbacks are potentially large and play a leading-order contribution in determining the atmospheric composition in response to human emissions of CO2 and in the setting of emissions targets to stabilize climate or avoid dangerous climate change. For over a decade, C4MIP has coordinated coupled climate–carbon cycle simulations, and in this paper we describe the C4MIP simulations that will be formally part of CMIP6. While the climate–carbon cycle community has created this experimental design, the simulations also fit within the wider CMIP activity, conform to some common standards including documentation and diagnostic requests, and are designed to complement the CMIP core experiments known as the Diagnostic, Evaluation and Characterization of Klima (DECK). C4MIP has three key strands of scientific motivation and the requested simulations are designed to satisfy their needs: (1) pre-industrial and historical simulations (formally part of the common set of CMIP6 experiments) to enable model evaluation, (2) idealized coupled and partially coupled simulations with 1 % per year increases in CO2 to enable diagnosis of feedback strength and its components, (3) future scenario simulations to project how the Earth system will respond to anthropogenic activity over the 21st century and beyond. This study documents in detail these simulations, explains their rationale and planned analysis, and describes how to set up and run the simulations. Particular attention is paid to boundary conditions, input data, and requested output diagnostics. It is important that modelling groups participating in C4MIP adhere as closely as possible to this experimental design.

  6. Indices of fiber biopersistence and carcinogen classification for synthetic vitreous fibers (SVFs).

    PubMed

    Maxim, L Daniel; Boymel, Paul; Chase, Gerald R; Bernstein, David M

    2002-06-01

    It is generally accepted that the biopersistence of a synthetic vitreous fiber (SVF) is an important determinant of its biological activity. Experimental protocols have been developed to measure the biopersistence of an SVF from short-term inhalation experiments with rats. Clearance kinetics of long (>20 microm) fibers (those believed to have greatest biological activity) have been approximated by one- or two-pool models. Several measures or indices of biopersistence have been proposed in the literature of which three, the weighted half-time (WT(1/2)), the time required to clear 90% of long fibers (T(0.9)), and the so-called slow-phase half-time (T(2)), have been investigated in some detail. This paper considers both one- and two-pool models for long fiber clearance, characterizes the properties of these candidate indices of fiber biopersistence, identifies measures with potentially superior statistical properties, suggests possible cutoff values based on the relation between biopersistence and the outcome of chronic bioassays, and offers comments on the selection of efficient experimental designs. This analysis concludes that WT(1/2) and T(0.9) are highly correlated, are efficient predictors of the outcome of chronic bioassays, and have reasonable statistical properties. T(2), although perhaps attractive in principle, suffers from some statistical shortcomings when estimated using present experimental protocols. The WT(1/2) is shown to be directly proportional to the cumulative exposure (fiber days) after the cessation of exposure and also the mean residence time of these fibers in the lung. Copyright 2002 Elsevier Science (USA)
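
    The indices discussed above follow directly once a one- or two-pool clearance curve has been fitted; the sketch below fits a two-pool model to synthetic long-fibre counts and reports a weighted half-time (taken here as the pool-fraction-weighted average of the two pool half-times) and T(0.9). The sacrifice times and counts are invented, not data from any bioassay.

      # Fit a two-pool clearance model to synthetic data and derive biopersistence
      # indices. Values are illustrative only.
      import numpy as np
      from scipy.optimize import curve_fit, brentq

      def two_pool(t, frac_fast, t_half_fast, t_half_slow):
          k1, k2 = np.log(2) / t_half_fast, np.log(2) / t_half_slow
          return frac_fast * np.exp(-k1 * t) + (1 - frac_fast) * np.exp(-k2 * t)

      rng = np.random.default_rng(4)
      days = np.array([1, 2, 7, 14, 28, 60, 90])               # sacrifice times
      obs = two_pool(days, 0.7, 5.0, 60.0) * (1 + 0.05 * rng.standard_normal(len(days)))

      popt, _ = curve_fit(two_pool, days, obs, p0=[0.5, 3.0, 40.0],
                          bounds=([0, 0.1, 1], [1, 30, 500]))
      frac, th_fast, th_slow = popt

      wt_half = frac * th_fast + (1 - frac) * th_slow          # weighted half-time
      t_090 = brentq(lambda t: two_pool(t, *popt) - 0.1, 0.1, 2000)  # 90 % cleared
      print(f"WT1/2 ~ {wt_half:.1f} d,  T0.9 ~ {t_090:.1f} d")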

  7. RosettaAntibodyDesign (RAbD): A general framework for computational antibody design

    PubMed Central

    Adolf-Bryfogle, Jared; Kalyuzhniy, Oleks; Kubitz, Michael; Hu, Xiaozhen; Adachi, Yumiko; Schief, William R.

    2018-01-01

    A structural-bioinformatics-based computational methodology and framework have been developed for the design of antibodies to targets of interest. RosettaAntibodyDesign (RAbD) samples the diverse sequence, structure, and binding space of an antibody to an antigen in highly customizable protocols for the design of antibodies in a broad range of applications. The program samples antibody sequences and structures by grafting structures from a widely accepted set of the canonical clusters of CDRs (North et al., J. Mol. Biol., 406:228–256, 2011). It then performs sequence design according to amino acid sequence profiles of each cluster, and samples CDR backbones using a flexible-backbone design protocol incorporating cluster-based CDR constraints. Starting from an existing experimental or computationally modeled antigen-antibody structure, RAbD can be used to redesign a single CDR or multiple CDRs with loops of different length, conformation, and sequence. We rigorously benchmarked RAbD on a set of 60 diverse antibody–antigen complexes, using two design strategies—optimizing total Rosetta energy and optimizing interface energy alone. We utilized two novel metrics for measuring success in computational protein design. The design risk ratio (DRR) is equal to the frequency of recovery of native CDR lengths and clusters divided by the frequency of sampling of those features during the Monte Carlo design procedure. Ratios greater than 1.0 indicate that the design process is picking out the native more frequently than expected from their sampled rate. We achieved DRRs for the non-H3 CDRs of between 2.4 and 4.0. The antigen risk ratio (ARR) is the ratio of frequencies of the native amino acid types, CDR lengths, and clusters in the output decoys for simulations performed in the presence and absence of the antigen. For CDRs, we achieved cluster ARRs as high as 2.5 for L1 and 1.5 for H2. For sequence design simulations without CDR grafting, the overall recovery for the native amino acid types for residues that contact the antigen in the native structures was 72% in simulations performed in the presence of the antigen and 48% in simulations performed without the antigen, for an ARR of 1.5. For the non-contacting residues, the ARR was 1.08. This shows that the sequence profiles are able to maintain the amino acid types of these conserved, buried sites, while recovery of the exposed, contacting residues requires the presence of the antigen-antibody interface. We tested RAbD experimentally on both a lambda and kappa antibody–antigen complex, successfully improving their affinities 10 to 50 fold by replacing individual CDRs of the native antibody with new CDR lengths and clusters. PMID:29702641
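
    Both benchmark metrics are simple ratios of observed frequencies; the sketch below computes them from hypothetical tallies of design decoys (the cluster labels and counts are made up, and this is not RosettaAntibodyDesign code).

      # DRR and ARR as frequency ratios over hypothetical decoy tallies.
      from collections import Counter

      def design_risk_ratio(recovered, sampled, native):
          """DRR = freq(native among accepted designs) / freq(native among sampled)."""
          f_rec = recovered[native] / sum(recovered.values())
          f_samp = sampled[native] / sum(sampled.values())
          return f_rec / f_samp

      def antigen_risk_ratio(with_ag, without_ag, native):
          """ARR = freq(native with antigen present) / freq(native without antigen)."""
          return (with_ag[native] / sum(with_ag.values())) / \
                 (without_ag[native] / sum(without_ag.values()))

      recovered = Counter({"L1-11-1": 60, "L1-12-1": 30, "L1-10-1": 10})
      sampled = Counter({"L1-11-1": 25, "L1-12-1": 50, "L1-10-1": 25})
      print("DRR:", round(design_risk_ratio(recovered, sampled, "L1-11-1"), 2))

      with_ag = Counter({"L1-11-1": 55, "other": 45})
      without_ag = Counter({"L1-11-1": 30, "other": 70})
      print("ARR:", round(antigen_risk_ratio(with_ag, without_ag, "L1-11-1"), 2))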

  8. RosettaAntibodyDesign (RAbD): A general framework for computational antibody design.

    PubMed

    Adolf-Bryfogle, Jared; Kalyuzhniy, Oleks; Kubitz, Michael; Weitzner, Brian D; Hu, Xiaozhen; Adachi, Yumiko; Schief, William R; Dunbrack, Roland L

    2018-04-01

    A structural-bioinformatics-based computational methodology and framework have been developed for the design of antibodies to targets of interest. RosettaAntibodyDesign (RAbD) samples the diverse sequence, structure, and binding space of an antibody to an antigen in highly customizable protocols for the design of antibodies in a broad range of applications. The program samples antibody sequences and structures by grafting structures from a widely accepted set of the canonical clusters of CDRs (North et al., J. Mol. Biol., 406:228-256, 2011). It then performs sequence design according to amino acid sequence profiles of each cluster, and samples CDR backbones using a flexible-backbone design protocol incorporating cluster-based CDR constraints. Starting from an existing experimental or computationally modeled antigen-antibody structure, RAbD can be used to redesign a single CDR or multiple CDRs with loops of different length, conformation, and sequence. We rigorously benchmarked RAbD on a set of 60 diverse antibody-antigen complexes, using two design strategies-optimizing total Rosetta energy and optimizing interface energy alone. We utilized two novel metrics for measuring success in computational protein design. The design risk ratio (DRR) is equal to the frequency of recovery of native CDR lengths and clusters divided by the frequency of sampling of those features during the Monte Carlo design procedure. Ratios greater than 1.0 indicate that the design process is picking out the native more frequently than expected from their sampled rate. We achieved DRRs for the non-H3 CDRs of between 2.4 and 4.0. The antigen risk ratio (ARR) is the ratio of frequencies of the native amino acid types, CDR lengths, and clusters in the output decoys for simulations performed in the presence and absence of the antigen. For CDRs, we achieved cluster ARRs as high as 2.5 for L1 and 1.5 for H2. For sequence design simulations without CDR grafting, the overall recovery for the native amino acid types for residues that contact the antigen in the native structures was 72% in simulations performed in the presence of the antigen and 48% in simulations performed without the antigen, for an ARR of 1.5. For the non-contacting residues, the ARR was 1.08. This shows that the sequence profiles are able to maintain the amino acid types of these conserved, buried sites, while recovery of the exposed, contacting residues requires the presence of the antigen-antibody interface. We tested RAbD experimentally on both a lambda and kappa antibody-antigen complex, successfully improving their affinities 10 to 50 fold by replacing individual CDRs of the native antibody with new CDR lengths and clusters.

  9. Computational Enzyme Design: Advances, hurdles and possible ways forward

    PubMed Central

    Linder, Mats

    2012-01-01

    This mini review addresses recent developments in computational enzyme design. Successful protocols as well as known issues and limitations are discussed from an energetic perspective. It will be argued that improved results can be obtained by including a dynamic treatment in the design protocol. Finally, a molecular dynamics-based approach for evaluating and refining computational designs is presented. PMID:24688650

  10. A technology training protocol for meeting QSEN goals: Focusing on meaningful learning.

    PubMed

    Luo, Shuhong; Kalman, Melanie

    2018-01-01

    The purpose of this paper is to describe and discuss how we designed and developed a 12-step technology training protocol. The protocol is meant to improve meaningful learning in technology education so that nursing students are able to meet the informatics requirements of the Quality and Safety Education in Nursing competencies. When designing and developing the training protocol, we used a simplified experiential learning model that addressed the core features of meaningful learning: to connect new knowledge with students' prior knowledge and real-world workflow. Before training, we identified students' prior knowledge and workflow tasks. During training, students learned by doing, reflected on their prior computer skills and workflow, designed individualized procedures for integration into their workflow, and practiced the self-designed procedures in real-world settings. The trainer was a facilitator who provided a meaningful learning environment, asked the right questions to guide reflective conversation, and offered scaffolding at critical moments. This training protocol could significantly improve nurses' competencies in using technologies and increase their desire to adopt new technologies. © 2017 Wiley Periodicals, Inc.

  11. A Novel Cross-Layer Routing Protocol Based on Network Coding for Underwater Sensor Networks.

    PubMed

    Wang, Hao; Wang, Shilian; Bu, Renfei; Zhang, Eryang

    2017-08-08

    Underwater wireless sensor networks (UWSNs) have attracted increasing attention in recent years because of their numerous applications in ocean monitoring, resource discovery and tactical surveillance. However, the design of reliable and efficient transmission and routing protocols is a challenge due to the low acoustic propagation speed and complex channel environment in UWSNs. In this paper, we propose a novel cross-layer routing protocol based on network coding (NCRP) for UWSNs, which utilizes network coding and cross-layer design to greedily forward data packets to sink nodes efficiently. The proposed NCRP takes full advantage of multicast transmission and decodes packets jointly with encoded packets received from multiple potential nodes in the entire network. The transmission power is optimized in our design to extend the life cycle of the network. Moreover, we design a real-time routing maintenance protocol to update the route when inefficient relay nodes are detected. Substantial simulations in an underwater environment using Network Simulator 3 (NS-3) show that NCRP significantly improves network performance in terms of energy consumption, end-to-end delay and packet delivery ratio compared with other routing protocols for UWSNs.
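
    The network-coding primitive that protocols of this kind build on can be shown in a few lines: a relay broadcasts the XOR of two packets, and each receiver recovers the packet it is missing from the one it already holds. The packet contents below are arbitrary bytes, not the NCRP packet format.

      # Toy illustration of XOR network coding at a relay node.
      def xor_bytes(a: bytes, b: bytes) -> bytes:
          return bytes(x ^ y for x, y in zip(a, b))

      pkt_a = b"sensor-A: temp=7.3C "
      pkt_b = b"sensor-B: depth=48m "        # padded to equal length

      coded = xor_bytes(pkt_a, pkt_b)        # single broadcast from the relay

      # A node that already holds pkt_a decodes pkt_b, and vice versa.
      assert xor_bytes(coded, pkt_a) == pkt_b
      assert xor_bytes(coded, pkt_b) == pkt_a
      print("decoded:", xor_bytes(coded, pkt_a).decode())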

  12. Salivary cortisol and testosterone responses to resistance and plyometric exercise in 12- to 14-year-old boys.

    PubMed

    Klentrou, Panagiota; Giannopoulou, Angeliki; McKinlay, Brandon J; Wallace, Phillip; Muir, Cameron; Falk, Bareket; Mack, Diane

    2016-07-01

    This study examined changes in salivary testosterone and cortisol following resistance and plyometric exercise protocols in active boys. In a crossover experimental design, 26 peri-pubertal (12- to 14-year-old) soccer players performed 2 exercise trials in random order, on separate evenings, 1 week apart. Each trial included a 30 min control session followed by 30 min of either resistance or plyometric exercise. Saliva was collected at baseline, post-control (i.e., pre-exercise), and 5 and 30 min post-exercise. There were no significant differences in the baseline hormone concentrations between trials or between weeks (p > 0.05). A significant effect for time was found for testosterone (p = 0.02, [Formula: see text] = 0.14), which increased from pre-exercise to 5 min post-exercise in both the resistance (27% ± 5%) and plyometric (12% ± 6%) protocols. Cortisol decreased to a similar extent in both trials (p = 0.009, [Formula: see text] = 0.19) from baseline to post-control and then to 5 min post-exercise, following its typical circadian decrease in the evening hours. However, a significant protocol-by-time interaction was observed for cortisol, which increased 30 min after the plyometrics (+31% ± 12%) but continued to decrease following the resistance protocol (-21% ± 5%). Our results suggest that in young male athletes, multiple modes of exercise can lead to a transient anabolic state, thus maximizing the beneficial effects on growth and development, when exercise is performed in the evening hours.

  13. Changes in morphology of long bone marrow tissue of rats submitted to cryotherapy with liquid nitrogen.

    PubMed

    Costa, Fábio Wildson Gurgel; Pessoa, Rosana Maria Andrade; Nogueira, Carlos Bruno Pinheiro; Pereira, Karuza Maria Alves; Brito, Gerly Anne de Castro; Soares, Eduardo Costa Studart

    2012-02-01

    The aim was to study the main effects of local application of liquid nitrogen on bone marrow tissue in rats. The femoral diaphyses of 42 Wistar rats were exposed to three local, sequential applications of liquid nitrogen for one or two minutes each, interspersed with five-minute periods of passive thawing. The animals were sacrificed after one, two, four and 12 weeks, and the specimens obtained were analyzed histomorphologically. In the second experimental week of the one-minute protocol, the histological degree of inflammation had a mean score of one (mild), on a scale ranging from 0 (absent or scarce) to two (moderate) (Kruskal-Wallis test, p=0.01). In the second experimental week of the two-minute protocol, the degree of inflammation of the medullary tissue had a mean score of two (Kruskal-Wallis test, p=0.01). The degree of inflammation of the bone marrow tissue was therefore higher in the protocol of three two-minute applications than in the protocol of three one-minute applications.

  14. Comparison of protocols for measuring and calculating postmortem submersion intervals for human analogs in fresh water.

    PubMed

    Humphreys, Michael K; Panacek, Edward; Green, William; Albers, Elizabeth

    2013-03-01

    Protocols for determining postmortem submersion interval (PMSI) have long been problematic for forensic investigators due to the wide variety of factors affecting the rate of decomposition of submerged carrion. Likewise, it has been equally problematic for researchers to develop standardized experimental protocols to monitor underwater decomposition without artificially affecting the decomposition rate. This study compares two experimental protocols: (i) underwater in situ evaluation with photographic documentation utilizing the Heaton et al. total aquatic decomposition (TAD) score and (ii) weighing the carrion before and after submersion. Complete forensic necropsies were performed as a control. Perinatal piglets were used as human analogs. The results of this study indicate that in order to objectively measure decomposition over time, the human analog should be examined at depth using the TAD scoring system rather than utilizing a carrion weight evaluation. The acquired TAD score can be used to calculate an approximate PMSI. © 2012 American Academy of Forensic Sciences.

  15. Translational microsurgery. A new platform for transplantation research.

    PubMed

    Kobayashi, Eiji; Haga, Junko

    2016-03-01

    Clinical microsurgery has been introduced in many fields, while experimental microsurgery has cross-disciplinary features spanning the sciences and techniques that drive the growth of medicine, pharmacology, veterinary science, engineering, etc. A training protocol, proposed under the new name Translational Microsurgery, is introduced. Reconstructive skills for the hepatic artery in pediatric living donor liver transplantation are summarized, and an ex vivo training protocol using artificial blood vessels is proposed for surgeons. Clinical microsurgery requires anastomosis of delicate arteries within a limited field of view. Our training protocol revealed a relation between score and speed, although not all surgeons with substantial experience achieved high scores. This training helps surgeons master clinical skills and apply them to excellent experimental work. Our microsurgical training protocol has been planned from the standpoint of the clinical setting. Training for vascular anastomosis leads to rodent transplantation models, which are used for immunology and immunosuppressant research, and the same microsurgical techniques support mastery of catheter techniques for injecting various drugs or gene vectors.

  16. Automatized set-up procedure for transcranial magnetic stimulation protocols.

    PubMed

    Harquel, S; Diard, J; Raffin, E; Passera, B; Dall'Igna, G; Marendaz, C; David, O; Chauvin, A

    2017-06-01

    Transcranial Magnetic Stimulation (TMS) established itself as a powerful technique for probing and treating the human brain. Major technological evolutions, such as neuronavigation and robotized systems, have continuously increased the spatial reliability and reproducibility of TMS, by minimizing the influence of human and experimental factors. However, there is still a lack of efficient set-up procedures, which prevents the automation of TMS protocols. For example, the set-up procedure for defining the stimulation intensity specific to each subject is classically done manually by experienced practitioners, by assessing the motor cortical excitability level over the motor hotspot (HS) of a targeted muscle. This is time-consuming and introduces experimental variability. Therefore, we developed a probabilistic Bayesian model (AutoHS) that automatically identifies the HS position. Using virtual and real experiments, we compared the efficacy of the manual and automated procedures. AutoHS appeared to be more reproducible, faster, and at least as reliable as classical manual procedures. By combining AutoHS with robotized TMS and automated motor threshold estimation methods, our approach constitutes the first fully automated set-up procedure for TMS protocols. The use of this procedure decreases inter-experimenter variability while facilitating the handling of TMS protocols used for research and clinical routine. Copyright © 2017 Elsevier Inc. All rights reserved.
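
    The following is a hedged sketch of the general idea of a grid-based Bayesian hotspot search, not the published AutoHS model: motor-evoked-potential (MEP) amplitudes are assumed to fall off as a Gaussian bump around the true hotspot, and each stimulation updates a posterior over candidate positions. The response model, noise level, and one-dimensional grid are all invented.

      # Grid-based Bayesian update of a motor hotspot position (illustrative only).
      import numpy as np

      rng = np.random.default_rng(2)
      grid = np.linspace(-30, 30, 121)            # candidate positions (mm, 1-D)
      true_hs, width, noise = 8.0, 10.0, 0.2      # assumed response model

      def expected_mep(site, hs):
          return np.exp(-((site - hs) ** 2) / (2 * width ** 2))

      posterior = np.ones_like(grid) / grid.size   # flat prior over the grid
      for _ in range(15):
          site = rng.choice(grid)                  # next stimulation site
          mep = expected_mep(site, true_hs) + rng.normal(0, noise)
          likelihood = np.exp(-(mep - expected_mep(site, grid)) ** 2 / (2 * noise ** 2))
          posterior *= likelihood
          posterior /= posterior.sum()

      print("MAP hotspot estimate (mm):", grid[np.argmax(posterior)])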

  17. Software-Defined Architectures for Spectrally Efficient Cognitive Networking in Extreme Environments

    NASA Astrophysics Data System (ADS)

    Sklivanitis, Georgios

    The objective of this dissertation is the design, development, and experimental evaluation of novel algorithms and reconfigurable radio architectures for spectrally efficient cognitive networking in terrestrial, airborne, and underwater environments. Next-generation wireless communication architectures and networking protocols that maximize spectrum utilization efficiency in congested/contested or low-spectral availability (extreme) communication environments can enable a rich body of applications with unprecedented societal impact. In recent years, underwater wireless networks have attracted significant attention for military and commercial applications including oceanographic data collection, disaster prevention, tactical surveillance, offshore exploration, and pollution monitoring. Unmanned aerial systems that are autonomously networked and fully mobile can assist humans in extreme or difficult-to-reach environments and provide cost-effective wireless connectivity for devices without infrastructure coverage. Cognitive radio (CR) has emerged as a promising technology to maximize spectral efficiency in dynamically changing communication environments by adaptively reconfiguring radio communication parameters. At the same time, the fast developing technology of software-defined radio (SDR) platforms has enabled hardware realization of cognitive radio algorithms for opportunistic spectrum access. However, existing algorithmic designs and protocols for shared spectrum access do not effectively capture the interdependencies between radio parameters at the physical (PHY), medium-access control (MAC), and network (NET) layers of the network protocol stack. In addition, existing off-the-shelf radio platforms and SDR programmable architectures are far from fulfilling runtime adaptation and reconfiguration across PHY, MAC, and NET layers. Spectrum allocation in cognitive networks with multi-hop communication requirements depends on the location, network traffic load, and interference profile at each network node. As a result, the development and implementation of algorithms and cross-layer reconfigurable radio platforms that can jointly treat space, time, and frequency as a unified resource to be dynamically optimized according to inter- and intra-network interference constraints is of fundamental importance. In the next chapters, we present novel algorithmic and software/hardware implementation developments toward the deployment of spectrally efficient terrestrial, airborne, and underwater wireless networks. In Chapter 1 we review the state-of-art in commercially available SDR platforms, describe their software and hardware capabilities, and classify them based on their ability to enable rapid prototyping and advance experimental research in wireless networks. Chapter 2 discusses system design and implementation details toward real-time evaluation of a software-radio platform for all-spectrum cognitive channelization in the presence of narrowband or wideband primary stations. All-spectrum channelization is achieved by designing maximum signal-to-interference-plus-noise ratio (SINR) waveforms that span the whole continuum of the device-accessible spectrum, while satisfying peak power and interference temperature (IT) constraints for the secondary and primary users, respectively. 
In Chapter 3, we introduce the concept of all-spectrum channelization based on max-SINR optimized sparse-binary waveforms, we propose optimal and suboptimal waveform design algorithms, and evaluate their SINR and bit-error-rate (BER) performance in an SDR testbed. Chapter 4 considers the problem of channel estimation with minimal pilot signaling in multi-cell multi-user multi-input multi-output (MIMO) systems with very large antenna arrays at the base station, and proposes a least-squares (LS)-type algorithm that iteratively extracts channel and data estimates from a short record of data measurements. Our algorithmic developments toward spectrally-efficient cognitive networking through joint optimization of channel access code-waveforms and routes in a multi-hop network are described in Chapter 5. Algorithmic designs are software optimized on heterogeneous multi-core general-purpose processor (GPP)-based SDR architectures by leveraging a novel software-radio framework that offers self-optimization and real-time adaptation capabilities at the PHY, MAC, and NET layers of the network protocol stack. Our system design approach is experimentally validated under realistic conditions in a large-scale hybrid ground-air testbed deployment. Chapter 6 reviews the state-of-art in software and hardware platforms for underwater wireless networking and proposes a software-defined acoustic modem prototype that enables (i) cognitive reconfiguration of PHY/MAC parameters, and (ii) cross-technology communication adaptation. The proposed modem design is evaluated in terms of effective communication data rate in both water tank and lake testbed setups. In Chapter 7, we present a novel receiver configuration for code-waveform-based multiple-access underwater communications. The proposed receiver is fully reconfigurable and executes (i) all-spectrum cognitive channelization, and (ii) combined synchronization, channel estimation, and demodulation. Experimental evaluation in terms of SINR and BER show that all-spectrum channelization is a powerful proposition for underwater communications. At the same time, the proposed receiver design can significantly enhance bandwidth utilization. Finally, in Chapter 8, we focus on challenging practical issues that arise in underwater acoustic sensor network setups where co-located multi-antenna sensor deployment is not feasible due to power, computation, and hardware limitations, and design, implement, and evaluate an underwater receiver structure that accounts for multiple carrier frequency and timing offsets in virtual (distributed) MIMO underwater systems.
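
    The maximum-SINR waveform and filter designs referred to in Chapters 2, 3, and 7 reduce, in their simplest linear form, to the familiar closed-form receive filter w proportional to R^(-1)s; the sketch below compares it with a plain matched filter on synthetic signatures and interference (nothing here reproduces the dissertation's optimized sparse-binary designs).

      # Max-SINR linear receive filter versus a matched filter (synthetic setup).
      import numpy as np

      rng = np.random.default_rng(3)
      chips = 16                                        # waveform length
      s = rng.choice([-1.0, 1.0], chips)                # desired user's signature
      interf = rng.choice([-1.0, 1.0], (3, chips))      # three interferers

      def sinr(w, s, interf, sigma2=0.1):
          num = (w @ s) ** 2
          den = sum((w @ i) ** 2 for i in interf) + sigma2 * (w @ w)
          return num / den

      R = interf.T @ interf + 0.1 * np.eye(chips)       # interference + noise covariance
      w_maxsinr = np.linalg.solve(R, s)                 # w proportional to inverse(R) @ s
      w_matched = s.copy()

      print("matched-filter SINR :", round(sinr(w_matched, s, interf), 2))
      print("max-SINR filter SINR:", round(sinr(w_maxsinr, s, interf), 2))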

  18. 77 FR 47707 - Public Housing Assessment System (PHAS): Physical Condition Scoring Notice and Revised Dictionary...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-09

    ... Standards (UPCS) inspection protocol was designed to be a uniform inspection process and standard for HUD's... frequency of inspections based on the results the UPCS inspection. UPCS was designed to assess the condition... physical assessment score. HUD Response: The UPCS inspection protocol as designed assesses the physical...

  19. Safe bunker designing for the 18 MV Varian 2100 Clinac: a comparison between Monte Carlo simulation based upon data and new protocol recommendations

    PubMed Central

    Beigi, Manije; Afarande, Fatemeh; Ghiasi, Hosein

    2016-01-01

    Aim: The aim of this study was to compare two bunkers, one designed using only protocol recommendations and one designed using Monte Carlo (MC) data derived for an 18 MV Varian 2100 Clinac accelerator. Background: High-energy radiation therapy is associated with fast and thermal photoneutrons, and adequate shielding against this contaminant neutron dose is recommended by the new IAEA and NCRP protocols. Materials and methods: The latest protocols released by the IAEA (Safety Report No. 47) and NCRP Report No. 151 were used for the bunker design calculations, and MC-derived data for the same accelerator were obtained in parallel. Two bunkers, one based on the protocols and one based on the MC data, were designed and discussed. Results: For the door, the designs produced by the MC simulation and by the Wu–McGinley analytical method were close in both BPE and lead thickness. For the primary barrier, the MC simulation gave 440.11 mm for ordinary concrete and required a total concrete thickness of 1709 mm; calculating the same quantities with the recommended analytical methods, using the recommended TVL of 445 mm for concrete, gave a required thickness of 1762 mm. For the secondary barrier, a thickness of 752.05 mm was obtained. Conclusion: Our results showed that the MC simulation and the protocol recommendations are in good agreement in the contamination dose calculation. The differences between the analytical and MC simulation methods show that relying on only one method for bunker design may lead to underestimation or overestimation of dose and shielding requirements. PMID:26900357
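
    The protocol-based side of such a comparison is, at its core, a tenth-value-layer (TVL) calculation: the workload, use and occupancy factors fix the required barrier transmission, and the TVLs convert that into a thickness. The sketch below uses illustrative NCRP-151-style inputs, not the parameters or results of the 18 MV bunker in this study.

      # NCRP-151-style primary-barrier arithmetic with illustrative numbers.
      import math

      P = 0.02e-3        # design dose limit at the point of interest (Sv/week)
      W = 500.0          # workload at 1 m (Gy/week)
      U, T = 0.25, 1.0   # use factor, occupancy factor
      d = 6.0            # target-to-point distance (m)

      B = P * d**2 / (W * U * T)            # required barrier transmission
      n_tvl = math.log10(1.0 / B)           # number of tenth-value layers

      tvl1, tvl_e = 0.47, 0.43              # first and equilibrium TVL, concrete (m)
      thickness = tvl1 + (n_tvl - 1) * tvl_e
      print(f"B = {B:.2e},  n_TVL = {n_tvl:.2f},  concrete ~ {thickness * 1000:.0f} mm")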

  20. Combined Shuttle-Box Training with Electrophysiological Cortex Recording and Stimulation as a Tool to Study Perception and Learning.

    PubMed

    Happel, Max F K; Deliano, Matthias; Ohl, Frank W

    2015-10-22

    Shuttle-box avoidance learning is a well-established method in behavioral neuroscience; experimental setups were traditionally custom-made, but the necessary equipment is now available from several commercial companies. This protocol provides a detailed description of a two-way shuttle-box avoidance learning paradigm in rodents (here Mongolian gerbils; Meriones unguiculatus) in combination with site-specific electrical intracortical microstimulation (ICMS) and simultaneous chronic electrophysiological in vivo recordings. The detailed protocol is applicable to study multiple aspects of learning behavior and perception in different rodent species. Site-specific ICMS of auditory cortical circuits as conditioned stimuli is used here as a tool to test the perceptual relevance of specific afferent, efferent and intracortical connections. Distinct activation patterns can be evoked by using different stimulation electrode arrays for local, layer-dependent ICMS or distant ICMS sites. Using behavioral signal detection analysis, it can be determined which stimulation strategy is most effective for eliciting a behaviorally detectable and salient signal. Further, parallel multichannel recordings using different electrode designs (surface electrodes, depth electrodes, etc.) allow for investigating neuronal observables over the time course of such learning processes. It is also discussed how changes in the behavioral design can increase the cognitive complexity (e.g. detection, discrimination, reversal learning).
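
    The behavioral signal detection analysis mentioned above comes down to hit and false-alarm rates; the sketch below computes sensitivity (d') and criterion from a go/no-go outcome table with invented trial counts.

      # d' and criterion from shuttle-box go/no-go counts (loglinear correction).
      from scipy.stats import norm

      def dprime(hits, misses, fas, crs, correction=0.5):
          hr = (hits + correction) / (hits + misses + 2 * correction)
          far = (fas + correction) / (fas + crs + 2 * correction)
          z_hr, z_far = norm.ppf(hr), norm.ppf(far)
          return z_hr - z_far, -0.5 * (z_hr + z_far)

      d, c = dprime(hits=42, misses=8, fas=12, crs=38)
      print(f"d' = {d:.2f}, criterion = {c:.2f}")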

  1. A Novel Domain Assembly Routine for Creating Full-Length Models of Membrane Proteins from Known Domain Structures.

    PubMed

    Koehler Leman, Julia; Bonneau, Richard

    2018-04-03

    Membrane proteins composed of soluble and membrane domains are often studied one domain at a time. However, to understand the biological function of entire protein systems and their interactions with each other and drugs, knowledge of full-length structures or models is required. Although few computational methods exist that could potentially be used to model full-length constructs of membrane proteins, none of these methods are perfectly suited for the problem at hand. Existing methods require an interface or knowledge of the relative orientations of the domains or are not designed for domain assembly, and none of them are developed for membrane proteins. Here we describe the first domain assembly protocol specifically designed for membrane proteins that assembles intra- and extracellular soluble domains and the transmembrane domain into models of the full-length membrane protein. Our protocol does not require an interface between the domains and samples possible domain orientations based on backbone dihedrals in the flexible linker regions, created via fragment insertion, while keeping the transmembrane domain fixed in the membrane. For five examples tested, our method mp_domain_assembly, implemented in RosettaMP, samples domain orientations close to the known structure and is best used in conjunction with experimental data to reduce the conformational search space.

  2. A contaminant-free assessment of Endogenous Retroviral RNA in human plasma

    PubMed Central

    Karamitros, Timokratis; Paraskevis, Dimitrios; Hatzakis, Angelos; Psichogiou, Mina; Elefsiniotis, Ioannis; Hurst, Tara; Geretti, Anna-Maria; Beloukas, Apostolos; Frater, John; Klenerman, Paul; Katzourakis, Aris; Magiorkinis, Gkikas

    2016-01-01

    Endogenous retroviruses (ERVs) comprise 6–8% of the human genome. HERVs are silenced in most normal tissues, up-regulated in stem cells and in placenta, but also in cancer and HIV-1 infection. Crucially, there are conflicting reports on the detection of HERV RNA in non-cellular clinical samples such as plasma, which suggest that the study of HERV RNA can be daunting. Indeed, we find that the use of real-time PCR in a quality-assured clinical laboratory setting can be sensitive to low-level proviral contamination. We developed a mathematical model for low-level contamination that allowed us to design a laboratory protocol and standard operating procedures for robust measurement of HERV RNA. We focus on one family, HERV-K HML-2 (HK2), which has been the most recently active even though these elements invaded our ancestral genomes almost 30 million years ago. We extensively validated our experimental design on a model cell culture system, showing high sensitivity and specificity and completely eliminating the proviral contamination. We then tested 236 plasma samples from patients infected with HIV-1, HCV or HBV and found them to be negative. The study of HERV RNA in human translational studies should be performed with extensively validated protocols and standard operating procedures to control for the widespread low-level human DNA contamination. PMID:27640347
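
    As a hedged sketch of the kind of reasoning such a contamination model involves (not the authors' published model): if contaminating proviral templates land in a reaction following a Poisson distribution with mean lambda copies, the number of no-template replicates needed to expose a given contamination level with high probability follows directly.

      # Poisson sketch of low-level contamination detection in no-template controls.
      import math

      def p_detect(lam_per_reaction: float, n_replicates: int) -> float:
          """P(at least one replicate contains >= 1 contaminating template)."""
          p_single = 1.0 - math.exp(-lam_per_reaction)
          return 1.0 - (1.0 - p_single) ** n_replicates

      for lam in (0.05, 0.2, 1.0):
          for n in (3, 8, 24):
              print(f"lambda={lam:4}, replicates={n:2}: P(detect) = {p_detect(lam, n):.2f}")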

  3. Listening with care: Using narrative methods to cultivate nurses’ responsive relationships in a home visiting intervention with teen mothers

    PubMed Central

    SmithBattle, Lee; Lorenz, Rebecca; Leander, Sheila

    2012-01-01

    Effective public health nursing relies on the development of responsive and collaborative relationships with families. While nurse-family relationships are endorsed by home visitation programs, training nurses to follow visit-to-visit protocols may unintentionally undermine these relationships and may also obscure nurses’ clinical understanding and situated knowledge. With these issues in mind, we designed a home visiting intervention, titled Listening with Care, to cultivate nurses’ relationships with teen mothers and nurses’ clinical judgment and reasoning. Rather than using protocols, the training for the intervention introduced nurses to narrative methods and therapeutic tools. This mixed-method pilot study included a quasi-experimental design to examine the effect of the intervention on teen mothers’ depressive symptoms, self-silencing, repeat pregnancy, and educational progress compared to teens who received usual care. Qualitative data was collected from the nurses to evaluate the feasibility and acceptability of the intervention and therapeutic tools. The nurses endorsed the therapeutic tools and expected to continue using them in their practice. Despite the lack of statistically significant differences in outcomes between groups, findings suggest that further study of the intervention is warranted. Future studies may have implications for strengthening hidden aspects of nursing that make a difference in the lives of teen mothers. PMID:22713121

  4. Design and Development of a Technology Platform for DNA-Encoded Library Production and Affinity Selection.

    PubMed

    Castañón, Jesús; Román, José Pablo; Jessop, Theodore C; de Blas, Jesús; Haro, Rubén

    2018-06-01

    DNA-encoded libraries (DELs) have emerged as an efficient and cost-effective drug discovery tool for the exploration and screening of very large chemical space using small-molecule collections of unprecedented size. Herein, we report an integrated automation and informatics system designed to enhance the quality, efficiency, and throughput of the production and affinity selection of these libraries. The platform is governed by software developed according to a database-centric architecture to ensure data consistency, integrity, and availability. Through its versatile protocol management functionalities, this application captures the wide diversity of experimental processes involved with DEL technology, keeps track of working protocols in the database, and uses them to command robotic liquid handlers for the synthesis of libraries. This approach provides full traceability of building-blocks and DNA tags in each split-and-pool cycle. Affinity selection experiments and high-throughput sequencing reads are also captured in the database, and the results are automatically deconvoluted and visualized in customizable representations. Researchers can compare results of different experiments and use machine learning methods to discover patterns in data. As of this writing, the platform has been validated through the generation and affinity selection of various libraries, and it has become the cornerstone of the DEL production effort at Lilly.
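
    As an illustration of the database-centric traceability idea described above, the sketch below records which building block and DNA tag were used in each split-and-pool cycle and well, and queries that record back. The schema, table names and example values are invented for this sketch and do not reflect the platform's actual data model.

    ```python
    import sqlite3

    # Hypothetical minimal schema: each library member is traced by the building block
    # and DNA tag applied at every split-and-pool cycle. Names are illustrative only.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE building_block (id INTEGER PRIMARY KEY, smiles TEXT NOT NULL);
    CREATE TABLE dna_tag        (id INTEGER PRIMARY KEY, sequence TEXT NOT NULL);
    CREATE TABLE cycle_record (
        library_id        INTEGER NOT NULL,
        cycle             INTEGER NOT NULL,   -- split-and-pool cycle number
        well              TEXT    NOT NULL,   -- plate position handled by the liquid handler
        building_block_id INTEGER REFERENCES building_block(id),
        dna_tag_id        INTEGER REFERENCES dna_tag(id),
        PRIMARY KEY (library_id, cycle, well)
    );
    """)
    conn.execute("INSERT INTO building_block VALUES (1, 'c1ccccc1C(=O)O')")
    conn.execute("INSERT INTO dna_tag VALUES (1, 'ACGTACGT')")
    conn.execute("INSERT INTO cycle_record VALUES (42, 1, 'A01', 1, 1)")

    # Reconstruct which building block / tag pair went into well A01 of cycle 1.
    row = conn.execute("""
        SELECT b.smiles, t.sequence
        FROM cycle_record c
        JOIN building_block b ON b.id = c.building_block_id
        JOIN dna_tag t        ON t.id = c.dna_tag_id
        WHERE c.library_id = 42 AND c.cycle = 1 AND c.well = 'A01'
    """).fetchone()
    print(row)
    ```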

  5. The failure to detect drug-induced sensory loss in standard preclinical studies.

    PubMed

    Gauvin, David V; Abernathy, Matthew M; Tapp, Rachel L; Yoder, Joshua D; Dalton, Jill A; Baird, Theodore J

    2015-01-01

    Over the years, a number of drugs have been approved for human use with limited signs of toxicity noted in preclinical risk assessment studies, only to show adverse events in compliant patients taking the drugs as prescribed within the first few years on the market. Losses or impairments of sensory systems such as hearing, vision, taste, and smell have been reported to the FDA or described in peer-reviewed scientific journals within the first five years of widespread use. This review highlights the interactive cross-modal compensation that can occur within sensory systems and that reduces the likelihood of identifying these losses in the less sentient animals used in standard preclinical toxicology and safety protocols. We provide historical and experimental evidence to substantiate these sensory effects and highlight the critical importance of detailed training of technicians in the basic ethological, species-specific behaviors of all purpose-bred laboratory animals used in these study designs. We propose that the time, effort and cost of training technicians to better identify and document very subtle changes in behavior will increase the likelihood of early detection of biomarkers predictive of drug-induced sensory loss within current standard regulatory preclinical research protocols. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Combined Shuttle-Box Training with Electrophysiological Cortex Recording and Stimulation as a Tool to Study Perception and Learning

    PubMed Central

    Happel, Max F.K.

    2015-01-01

    Shuttle-box avoidance learning is a well-established method in behavioral neuroscience; experimental setups were traditionally custom-made, but the necessary equipment is now available from several commercial companies. This protocol provides a detailed description of a two-way shuttle-box avoidance learning paradigm in rodents (here, Mongolian gerbils; Meriones unguiculatus) in combination with site-specific electrical intracortical microstimulation (ICMS) and simultaneous chronic electrophysiological in vivo recordings. The protocol is applicable to the study of multiple aspects of learning behavior and perception in different rodent species. Site-specific ICMS of auditory cortical circuits as the conditioned stimulus is used here as a tool to test the perceptual relevance of specific afferent, efferent and intracortical connections. Distinct activation patterns can be evoked by using different stimulation electrode arrays for local, layer-dependent ICMS or for distant ICMS sites. Behavioral signal detection analysis can then determine which stimulation strategy is most effective at eliciting a behaviorally detectable and salient signal. Further, parallel multichannel recordings using different electrode designs (surface electrodes, depth electrodes, etc.) allow neuronal observables to be investigated over the time course of such learning processes. We also discuss how changes to the behavioral design can increase cognitive complexity (e.g., detection, discrimination, reversal learning). PMID:26556300

  7. Rendezvous Protocols and Dynamic Frequency Hopping Interference Design for Anti-Jamming Satellite Communication

    DTIC Science & Technology

    2013-11-25

    AFRL-RV-PS-TR-2013-0142. ...previously considered this proactive approach to combat unintentional, persistent (non-reactive) interference. In this project, we plan on extending our... channel” (or code) by chance, through public knowledge of the underlying protocol semantics, or by compromising one of the network devices. An alternative...

  8. How We Make DNA Origami.

    PubMed

    Wagenbauer, Klaus F; Engelhardt, Floris A S; Stahl, Evi; Hechtl, Vera K; Stömmer, Pierre; Seebacher, Fabian; Meregalli, Letizia; Ketterer, Philip; Gerling, Thomas; Dietz, Hendrik

    2017-10-05

    DNA origami has attracted substantial attention since its invention ten years ago, due to the seemingly infinite possibilities that it affords for creating customized nanoscale objects. Although the basic concept of DNA origami is easy to understand, using custom DNA origami in practical applications requires detailed know-how for designing and producing the particles with sufficient quality and for preparing them at appropriate concentrations with the necessary degree of purity in custom environments. Such know-how is not readily available for newcomers to the field, thus slowing down the rate at which new applications outside the field of DNA nanotechnology may emerge. To foster faster progress, we share in this article the experience we have accumulated over recent years in making and preparing DNA origami. We discuss design solutions for creating advanced structural motifs, including corners and various types of hinges, that expand the design space for the more rigid multilayer DNA origami, and provide guidelines for preventing undesired aggregation and for inducing specific oligomerization of multiple DNA origami building blocks. In addition, we provide detailed protocols and discuss the expected results for five key methods that allow efficient and damage-free preparation of DNA origami. These methods are agarose-gel purification, filtration through molecular cut-off membranes, PEG precipitation, size-exclusion chromatography, and ultracentrifugation-based sedimentation. The guide for creating advanced design motifs and the detailed protocols with their experimental characterization that we describe here should lower the barrier for researchers to accomplish the full DNA origami production workflow. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. in silico Whole Genome Sequencer & Analyzer (iWGS): A Computational Pipeline to Guide the Design and Analysis of de novo Genome Sequencing Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Xiaofan; Peris, David; Kominek, Jacek

    The availability of genomes across the tree of life is highly biased toward vertebrates, pathogens, human disease models, and organisms with relatively small and simple genomes. Recent progress in genomics has enabled the de novo decoding of the genome of virtually any organism, greatly expanding its potential for understanding the biology and evolution of the full spectrum of biodiversity. The increasing diversity of sequencing technologies, assays, and de novo assembly algorithms has augmented the complexity of de novo genome sequencing projects in nonmodel organisms. To reduce the costs and challenges in de novo genome sequencing projects and streamline their experimental design and analysis, we developed iWGS (in silico Whole Genome Sequencer and Analyzer), an automated pipeline for guiding the choice of appropriate sequencing strategy and assembly protocols. iWGS seamlessly integrates the four key steps of a de novo genome sequencing project: data generation (through simulation), data quality control, de novo assembly, and assembly evaluation and validation. The last three steps can also be applied to the analysis of real data. iWGS is designed to enable the user to have great flexibility in testing the range of experimental designs available for genome sequencing projects, and supports all major sequencing technologies and popular assembly tools. Three case studies illustrate how iWGS can guide the design of de novo genome sequencing projects, and evaluate the performance of a wide variety of user-specified sequencing strategies and assembly protocols on genomes of differing architectures. iWGS, along with detailed documentation, is freely available at https://github.com/zhouxiaofan1983/iWGS.

  10. in silico Whole Genome Sequencer & Analyzer (iWGS): A Computational Pipeline to Guide the Design and Analysis of de novo Genome Sequencing Studies

    DOE PAGES

    Zhou, Xiaofan; Peris, David; Kominek, Jacek; ...

    2016-09-16

    The availability of genomes across the tree of life is highly biased toward vertebrates, pathogens, human disease models, and organisms with relatively small and simple genomes. Recent progress in genomics has enabled the de novo decoding of the genome of virtually any organism, greatly expanding its potential for understanding the biology and evolution of the full spectrum of biodiversity. The increasing diversity of sequencing technologies, assays, and de novo assembly algorithms has augmented the complexity of de novo genome sequencing projects in nonmodel organisms. To reduce the costs and challenges in de novo genome sequencing projects and streamline their experimental design and analysis, we developed iWGS (in silico Whole Genome Sequencer and Analyzer), an automated pipeline for guiding the choice of appropriate sequencing strategy and assembly protocols. iWGS seamlessly integrates the four key steps of a de novo genome sequencing project: data generation (through simulation), data quality control, de novo assembly, and assembly evaluation and validation. The last three steps can also be applied to the analysis of real data. iWGS is designed to enable the user to have great flexibility in testing the range of experimental designs available for genome sequencing projects, and supports all major sequencing technologies and popular assembly tools. Three case studies illustrate how iWGS can guide the design of de novo genome sequencing projects, and evaluate the performance of a wide variety of user-specified sequencing strategies and assembly protocols on genomes of differing architectures. iWGS, along with detailed documentation, is freely available at https://github.com/zhouxiaofan1983/iWGS.

  11. The Land Use Model Intercomparison Project (LUMIP) contribution to CMIP6: rationale and experimental design

    NASA Astrophysics Data System (ADS)

    Lawrence, David M.; Hurtt, George C.; Arneth, Almut; Brovkin, Victor; Calvin, Kate V.; Jones, Andrew D.; Jones, Chris D.; Lawrence, Peter J.; de Noblet-Ducoudré, Nathalie; Pongratz, Julia; Seneviratne, Sonia I.; Shevliakova, Elena

    2016-09-01

    Human land-use activities have resulted in large changes to the Earth's surface, with resulting implications for climate. In the future, land-use activities are likely to expand and intensify further to meet growing demands for food, fiber, and energy. The Land Use Model Intercomparison Project (LUMIP) aims to further advance understanding of the impacts of land-use and land-cover change (LULCC) on climate, specifically addressing the following questions. (1) What are the effects of LULCC on climate and biogeochemical cycling (past-future)? (2) What are the impacts of land management on surface fluxes of carbon, water, and energy, and are there regional land-management strategies with the promise to help mitigate climate change? In addressing these questions, LUMIP will also address a range of more detailed science questions to get at process-level attribution, uncertainty, data requirements, and other related issues in more depth and sophistication than possible in a multi-model context to date. There will be particular focus on the separation and quantification of the effects on climate from LULCC relative to all forcings, separation of biogeochemical from biogeophysical effects of land use, the unique impacts of land-cover change vs. land-management change, modulation of land-use impact on climate by land-atmosphere coupling strength, and the extent to which impacts of enhanced CO2 concentrations on plant photosynthesis are modulated by past and future land use. LUMIP involves three major sets of science activities: (1) development of an updated and expanded historical and future land-use data set, (2) an experimental protocol for specific LUMIP experiments for CMIP6, and (3) definition of metrics and diagnostic protocols that quantify model performance, and related sensitivities, with respect to LULCC. In this paper, we describe LUMIP activity (2), i.e., the LUMIP simulations that will formally be part of CMIP6. These experiments are explicitly designed to be complementary to simulations requested in the CMIP6 DECK and historical simulations and other CMIP6 MIPs including ScenarioMIP, C4MIP, LS3MIP, and DAMIP. LUMIP includes a two-phase experimental design. Phase one features idealized coupled and land-only model simulations designed to advance process-level understanding of LULCC impacts on climate, as well as to quantify model sensitivity to potential land-cover and land-use change. Phase two experiments focus on quantification of the historic impact of land use and the potential for future land management decisions to aid in mitigation of climate change. This paper documents these simulations in detail, explains their rationale, outlines plans for analysis, and describes a new subgrid land-use tile data request for selected variables (reporting model output data separately for primary and secondary land, crops, pasture, and urban land-use types). It is essential that modeling groups participating in LUMIP adhere to the experimental design as closely as possible and clearly report how the model experiments were executed.

  12. The Land Use Model Intercomparison Project (LUMIP) contribution to CMIP6: rationale and experimental design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawrence, David M.; Hurtt, George C.; Arneth, Almut

    Human land-use activities have resulted in large changes to the Earth's surface, with resulting implications for climate. In the future, land-use activities are likely to expand and intensify further to meet growing demands for food, fiber, and energy. The Land Use Model Intercomparison Project (LUMIP) aims to further advance understanding of the impacts of land-use and land-cover change (LULCC) on climate, specifically addressing the following questions. (1) What are the effects of LULCC on climate and biogeochemical cycling (past-future)? (2) What are the impacts of land management on surface fluxes of carbon, water, and energy, and are there regional land-management strategies with the promise to help mitigate climate change? In addressing these questions, LUMIP will also address a range of more detailed science questions to get at process-level attribution, uncertainty, data requirements, and other related issues in more depth and sophistication than possible in a multi-model context to date. There will be particular focus on the separation and quantification of the effects on climate from LULCC relative to all forcings, separation of biogeochemical from biogeophysical effects of land use, the unique impacts of land-cover change vs. land-management change, modulation of land-use impact on climate by land-atmosphere coupling strength, and the extent to which impacts of enhanced CO2 concentrations on plant photosynthesis are modulated by past and future land use. LUMIP involves three major sets of science activities: (1) development of an updated and expanded historical and future land-use data set, (2) an experimental protocol for specific LUMIP experiments for CMIP6, and (3) definition of metrics and diagnostic protocols that quantify model performance, and related sensitivities, with respect to LULCC. In this paper, we describe LUMIP activity (2), i.e., the LUMIP simulations that will formally be part of CMIP6. These experiments are explicitly designed to be complementary to simulations requested in the CMIP6 DECK and historical simulations and other CMIP6 MIPs including ScenarioMIP, C4MIP, LS3MIP, and DAMIP. LUMIP includes a two-phase experimental design. Phase one features idealized coupled and land-only model simulations designed to advance process-level understanding of LULCC impacts on climate, as well as to quantify model sensitivity to potential land-cover and land-use change. Phase two experiments focus on quantification of the historic impact of land use and the potential for future land management decisions to aid in mitigation of climate change. This paper documents these simulations in detail, explains their rationale, outlines plans for analysis, and describes a new subgrid land-use tile data request for selected variables (reporting model output data separately for primary and secondary land, crops, pasture, and urban land-use types). It is essential that modeling groups participating in LUMIP adhere to the experimental design as closely as possible and clearly report how the model experiments were executed.

  13. The Land Use Model Intercomparison Project (LUMIP) contribution to CMIP6: rationale and experimental design

    DOE PAGES

    Lawrence, David M.; Hurtt, George C.; Arneth, Almut; ...

    2016-09-02

    Human land-use activities have resulted in large changes to the Earth's surface, with resulting implications for climate. In the future, land-use activities are likely to expand and intensify further to meet growing demands for food, fiber, and energy. The Land Use Model Intercomparison Project (LUMIP) aims to further advance understanding of the impacts of land-use and land-cover change (LULCC) on climate, specifically addressing the following questions. (1) What are the effects of LULCC on climate and biogeochemical cycling (past-future)? (2) What are the impacts of land management on surface fluxes of carbon, water, and energy, and are there regional land-management strategies with the promise to help mitigate climate change? In addressing these questions, LUMIP will also address a range of more detailed science questions to get at process-level attribution, uncertainty, data requirements, and other related issues in more depth and sophistication than possible in a multi-model context to date. There will be particular focus on the separation and quantification of the effects on climate from LULCC relative to all forcings, separation of biogeochemical from biogeophysical effects of land use, the unique impacts of land-cover change vs. land-management change, modulation of land-use impact on climate by land-atmosphere coupling strength, and the extent to which impacts of enhanced CO2 concentrations on plant photosynthesis are modulated by past and future land use. LUMIP involves three major sets of science activities: (1) development of an updated and expanded historical and future land-use data set, (2) an experimental protocol for specific LUMIP experiments for CMIP6, and (3) definition of metrics and diagnostic protocols that quantify model performance, and related sensitivities, with respect to LULCC. In this paper, we describe LUMIP activity (2), i.e., the LUMIP simulations that will formally be part of CMIP6. These experiments are explicitly designed to be complementary to simulations requested in the CMIP6 DECK and historical simulations and other CMIP6 MIPs including ScenarioMIP, C4MIP, LS3MIP, and DAMIP. LUMIP includes a two-phase experimental design. Phase one features idealized coupled and land-only model simulations designed to advance process-level understanding of LULCC impacts on climate, as well as to quantify model sensitivity to potential land-cover and land-use change. Phase two experiments focus on quantification of the historic impact of land use and the potential for future land management decisions to aid in mitigation of climate change. This paper documents these simulations in detail, explains their rationale, outlines plans for analysis, and describes a new subgrid land-use tile data request for selected variables (reporting model output data separately for primary and secondary land, crops, pasture, and urban land-use types). It is essential that modeling groups participating in LUMIP adhere to the experimental design as closely as possible and clearly report how the model experiments were executed.

  14. A self-paced motor imagery based brain-computer interface for robotic wheelchair control.

    PubMed

    Tsui, Chun Sing Louis; Gan, John Q; Hu, Huosheng

    2011-10-01

    This paper presents a simple self-paced motor imagery based brain-computer interface (BCI) to control a robotic wheelchair. An innovative control protocol is proposed to enable a 2-class self-paced BCI for wheelchair control, in which the user performs path planning and fully controls the wheelchair, except for automatic obstacle avoidance based on a laser range finder when necessary. In order for users to train their motor imagery control online safely and easily, simulated robot navigation in a specially designed environment was developed. This allowed users to practice motor imagery control with the core self-paced BCI system in a simulated scenario before controlling the wheelchair. The self-paced BCI can then be applied to control a real robotic wheelchair using a protocol similar to that controlling the simulated robot. Our emphasis is on allowing more potential users to use the BCI-controlled wheelchair with minimal training; a simple 2-class self-paced system is adequate with the novel control protocol, resulting in a better transition from offline training to online control. Experimental results have demonstrated the usefulness of the online practice under the simulated scenario, and the effectiveness of the proposed self-paced BCI for robotic wheelchair control.
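
    A hedged sketch of the kind of control loop such a protocol implies: a two-class, self-paced classifier output drives turning, the wheelchair otherwise keeps moving forward, and a range finder overrides the user with obstacle avoidance. The thresholds, class labels and the override rule below are assumptions for illustration, not the authors' implementation.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Command:
        linear: float   # forward speed (m/s)
        angular: float  # turn rate (rad/s)

    def control_step(bci_class, bci_confidence, min_range_m,
                     conf_threshold=0.7, stop_range_m=0.5):
        """One step of an assumed self-paced control policy.
        bci_class is 'left', 'right', or None when no intentional command is detected."""
        if min_range_m < stop_range_m:
            # Automatic obstacle avoidance overrides the user (assumed rule).
            return Command(linear=0.0, angular=0.5)
        if bci_class == 'left' and bci_confidence >= conf_threshold:
            return Command(linear=0.2, angular=0.6)
        if bci_class == 'right' and bci_confidence >= conf_threshold:
            return Command(linear=0.2, angular=-0.6)
        # Self-paced: no command detected, keep moving straight ahead.
        return Command(linear=0.3, angular=0.0)

    print(control_step('left', 0.9, min_range_m=2.0))   # user-driven left turn
    print(control_step(None, 0.0, min_range_m=0.3))     # obstacle avoidance takes over
    ```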

  15. Non-Invasive Transcranial Brain Therapy Guided by CT Scans: an In Vivo Monkey Study

    NASA Astrophysics Data System (ADS)

    Marquet, F.; Pernot, M.; Aubry, J.-F.; Montaldo, G.; Tanter, M.; Boch, A.-L.; Kujas, M.; Seilhean, D.; Fink, M.

    2007-05-01

    Brain therapy using focused ultrasound remains very limited due to the strong aberrations induced by the skull. A minimally invasive technique using time reversal was validated recently in vivo in 20 sheep, but it requires a hydrophone at the focal point for the first step of the time-reversal procedure. A completely noninvasive therapy requires a reliable model of the acoustic properties of the skull in order to simulate this first step. 3-D simulations based on high-resolution CT images of a skull have been successfully performed with a finite-difference code developed in our laboratory. Using the skull porosity extracted directly from the CT images, we reconstructed acoustic speed, density and absorption maps and performed the computation. Computed wavefronts are in good agreement with experimental wavefronts acquired through the same part of the skull, and the technique was validated in vitro in the laboratory. A stereotactic frame has been designed and built to perform noninvasive transcranial focusing in vivo. Here we describe all the steps of our new protocol, from the CT scans to the therapeutic treatment, and present the first in vivo results in a monkey. The protocol is based on protocols already established in radiotherapy.

  16. On Increasing Network Lifetime in Body Area Networks Using Global Routing with Energy Consumption Balancing

    PubMed Central

    Tsouri, Gill R.; Prieto, Alvaro; Argade, Nikhil

    2012-01-01

    Global routing protocols in wireless body area networks are considered. Global routing is augmented with a novel link cost function designed to balance energy consumption across the network. The result is a substantial increase in network lifetime at the expense of a marginal increase in energy per bit. Network maintenance requirements are reduced as well, since balancing energy consumption means all batteries need to be serviced at the same time and less frequently. The proposed routing protocol is evaluated using a hardware experimental setup comprising multiple nodes and an access point. The setup is used to assess network architectures, including an on-body access point and an off-body access point with varying number of antennas. Real-time experiments are conducted in indoor environments to assess performance gains. In addition, the setup is used to record channel attenuation data which are then processed in extensive computer simulations providing insight on the effect of protocol parameters on performance. Results demonstrate efficient balancing of energy consumption across all nodes, an average increase of up to 40% in network lifetime corresponding to a modest average increase of 0.4 dB in energy per bit, and a cutoff effect on required transmission power to achieve reliable connectivity. PMID:23201987
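
    The energy-balancing idea, making links from battery-depleted nodes more expensive so global routing steers traffic around them, can be sketched as a simple cost function fed into shortest-path routing. The cost formula, the node names and the network below are illustrative assumptions, not the paper's actual link cost function or testbed.

    ```python
    import heapq

    def link_cost(tx_energy_per_bit, residual_battery_fraction, alpha=1.0):
        """Illustrative energy-balancing cost: transmission energy is scaled up as the
        sending node's battery depletes, so drained nodes are routed around."""
        return tx_energy_per_bit * (1.0 + alpha * (1.0 - residual_battery_fraction))

    def shortest_path(graph, src, dst):
        """Plain Dijkstra over a dict-of-dicts {node: {neighbor: cost}}."""
        dist, prev, heap = {src: 0.0}, {}, [(0.0, src)]
        while heap:
            d, u = heapq.heappop(heap)
            if u == dst:
                break
            if d > dist.get(u, float('inf')):
                continue  # stale heap entry
            for v, w in graph[u].items():
                nd = d + w
                if nd < dist.get(v, float('inf')):
                    dist[v], prev[v] = nd, u
                    heapq.heappush(heap, (nd, v))
        path = [dst]
        while path[-1] != src:
            path.append(prev[path[-1]])
        return list(reversed(path)), dist[dst]

    # Toy body-area network routed to the access point 'AP'; the 'hip' node has a
    # nearly drained battery, so its cheap physical link becomes expensive.
    graph = {
        'chest': {'wrist': link_cost(1.0, 0.9), 'hip': link_cost(0.8, 0.2)},
        'wrist': {'AP': link_cost(1.2, 0.9)},
        'hip':   {'AP': link_cost(0.7, 0.2)},
        'AP':    {},
    }
    print(shortest_path(graph, 'chest', 'AP'))   # routes via the healthier 'wrist' node
    ```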

  17. On increasing network lifetime in body area networks using global routing with energy consumption balancing.

    PubMed

    Tsouri, Gill R; Prieto, Alvaro; Argade, Nikhil

    2012-09-26

    Global routing protocols in wireless body area networks are considered. Global routing is augmented with a novel link cost function designed to balance energy consumption across the network. The result is a substantial increase in network lifetime at the expense of a marginal increase in energy per bit. Network maintenance requirements are reduced as well, since balancing energy consumption means all batteries need to be serviced at the same time and less frequently. The proposed routing protocol is evaluated using a hardware experimental setup comprising multiple nodes and an access point. The setup is used to assess network architectures, including an on-body access point and an off-body access point with varying number of antennas. Real-time experiments are conducted in indoor environments to assess performance gains. In addition, the setup is used to record channel attenuation data which are then processed in extensive computer simulations providing insight on the effect of protocol parameters on performance. Results demonstrate efficient balancing of energy consumption across all nodes, an average increase of up to 40% in network lifetime corresponding to a modest average increase of 0.4 dB in energy per bit, and a cutoff effect on required transmission power to achieve reliable connectivity.

  18. Changes in salivary oxytocin after inhalation of clary sage essential oil scent in term-pregnant women: a feasibility pilot study.

    PubMed

    Tadokoro, Yuriko; Horiuchi, Shigeko; Takahata, Kaori; Shuo, Takuya; Sawano, Erika; Shinohara, Kazuyuki

    2017-12-08

    This pilot study using a quasi-experimental design was conducted to evaluate the feasibility (i.e., limited efficacy, practicality, and acceptability) of our intervention protocol involving inhalation of the scent of clary sage essential oil by pregnant women and measurement of their preinhalation and postinhalation oxytocin levels. Participants were women with singleton pregnancies between 38 and 40 weeks of gestation (N = 11). The experimental group (n = 5) inhaled the scent of clary sage essential oil diluted 50-fold with 10 mL of odorless propylene glycol for 20 min. Regarding limited efficacy, the oxytocin level 15 min postinhalation increased in 3 women and was unmeasurable in 2. The control group (n = 6) inhaled in the same way but without the 50-fold-diluted clary sage essential oil; their oxytocin level increased in 2 women, decreased in 2, and was unmeasurable in 2. Uterine contraction was not observed in either group. Regarding practicality, 3 of the 11 women could not collect sufficient saliva. The cortisol level decreased in both groups postinhalation. The protocol had no negative effects. Regarding acceptability, the protocol was not perceived as burdensome. Trial registration: The Clinical Trials Registry of University Hospital Medical Information Network in Japan, UMIN000017830. Registered: June 8, 2015.

  19. Quantitative metabolomics of the thermophilic methylotroph Bacillus methanolicus.

    PubMed

    Carnicer, Marc; Vieira, Gilles; Brautaset, Trygve; Portais, Jean-Charles; Heux, Stephanie

    2016-06-01

    The gram-positive bacterium Bacillus methanolicus MGA3 is a promising candidate for methanol-based biotechnologies. Accurate determination of intracellular metabolites is crucial for engineering this bacterium into an efficient microbial cell factory. Because of the diversity of chemical and cell properties, an experimental protocol validated on B. methanolicus is needed. Here, a systematic evaluation of different techniques for establishing a reliable basis for metabolome investigations is presented. Metabolome analysis was focused on metabolites closely linked to the central methanol metabolism of B. methanolicus. As an alternative to cold-solvent-based procedures, a solvent-free quenching strategy using stainless steel beads cooled to -20 °C was assessed. The precision, the consistency of the measurements, and the extent of metabolite leakage from quenched cells were evaluated in procedures with and without cell separation. The most accurate and reliable performance was provided by the method without cell separation, as significant metabolite leakage occurred in the procedures based on fast filtration. As a biological test case, the best protocol was used to assess the metabolome of B. methanolicus grown in a chemostat on methanol at two different growth rates, and its validity was demonstrated. The presented protocol is a first, helpful step towards reliable metabolomics data for the thermophilic methylotroph B. methanolicus and will aid in designing an efficient methylotrophic cell factory.

  20. A clinical pathway for the postoperative management of hypocalcemia after pediatric thyroidectomy reduces blood draws.

    PubMed

    Patel, Neha A; Bly, Randall A; Adams, Seth; Carlin, Kristen; Parikh, Sanjay R; Dahl, John P; Manning, Scott

    2018-02-01

    Postoperative calcium management is challenging following pediatric thyroidectomy given potential limitations in self-reporting symptoms and compliance with phlebotomy. A protocol was created at our tertiary children's institution utilizing intraoperative parathyroid hormone (PTH) levels to guide electrolyte management during hospitalization. The objective of this study was to determine the effect of a new thyroidectomy postoperative management protocol on two primary outcomes: (1) the number of postoperative calcium blood draws and (2) the length of hospital stay. Institutional review board-approved retrospective study (2010-2016). Consecutive pediatric total thyroidectomy and completion thyroidectomy ± neck dissection cases from 1/1/2010 through 8/5/2016 at a single tertiary children's institution were retrospectively reviewed before and after initiation of a new management protocol. All cases after 2/1/2014 comprised the experimental group (post-protocol implementation). The pre-protocol control group consisted of cases prior to 2/1/2014. Multivariable linear and Poisson regression models were used to compare the control and experimental groups on the outcome measures of number of calcium lab draws and hospital length of stay. 53 patients were included (n = 23, control group; n = 30, experimental group). The median age was 15 years. 41 patients (77.4%) were female. Postoperative calcium draws decreased from a mean of 5.2 to 3.6 per day post-protocol implementation (Rate Ratio = 0.70, p < .001), adjusting for covariates. The mean number of total inpatient calcium draws before protocol initiation was 13.3 (±13.20) compared to 7.2 (±4.25) in the post-protocol implementation group. Length of stay was 2.1 days in the control group and 1.8 days post-protocol implementation (p = .29). Patients who underwent concurrent neck dissection had a longer mean length of stay of 2.32 days compared to 1.66 days in those patients who did not undergo a neck dissection (p = .02). Hypocalcemia was also associated with a longer mean length of stay of 2.41 days compared to 1.60 days in patients who did not develop hypocalcemia (p < .01). The number of calcium blood draws was significantly reduced after introduction of a standardized protocol based on intraoperative PTH levels. The hospital length of stay did not change. Adoption of a standardized postoperative protocol based on intraoperative PTH levels may reduce the number of blood draws in children undergoing thyroidectomy. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Exploring the Interplay between Rescue Drugs, Data Imputation, and Study Outcomes: Conceptual Review and Qualitative Analysis of an Acute Pain Data Set.

    PubMed

    Singla, Neil K; Meske, Diana S; Desjardins, Paul J

    2017-12-01

    In placebo-controlled acute surgical pain studies, provisions must be made for study subjects to receive adequate analgesic therapy. As such, most protocols allow study subjects to receive a pre-specified regimen of open-label analgesic drugs (rescue drugs) as needed. The selection of an appropriate rescue regimen is a critical experimental design choice. We hypothesized that a rescue regimen that is too liberal could lead to all study arms receiving similar levels of pain relief (thereby confounding experimental results), while a regimen that is too stringent could lead to a high subject dropout rate (giving rise to a preponderance of missing data). Despite the importance of rescue regimen as a study design feature, there exist no published review articles or meta-analysis focusing on the impact of rescue therapy on experimental outcomes. Therefore, when selecting a rescue regimen, researchers must rely on clinical factors (what analgesics do patients usually receive in similar surgical scenarios) and/or anecdotal evidence. In the following article, we attempt to bridge this gap by reviewing and discussing the experimental impacts of rescue therapy on a common acute surgical pain population: first metatarsal bunionectomy. The function of this analysis is to (1) create a framework for discussion and future exploration of rescue as a methodological study design feature, (2) discuss the interplay between data imputation techniques and rescue drugs, and (3) inform the readership regarding the impact of data imputation techniques on the validity of study conclusions. Our findings indicate that liberal rescue may degrade assay sensitivity, while stringent rescue may lead to unacceptably high dropout rates.
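
    To make the interplay between rescue drugs and imputation concrete, the sketch below applies two commonly discussed rules to post-rescue pain scores: last observation carried forward (LOCF) and baseline observation carried forward (BOCF). The toy scores, the rescue time and the convention that post-rescue observations are censored before imputation are assumptions for illustration, not the article's data or recommended method.

    ```python
    import numpy as np
    import pandas as pd

    # Toy single-subject pain scores (0-10) at hourly timepoints; rescue taken at hour 3.
    times = [0, 1, 2, 3, 4, 5, 6]
    scores = [8, 7, 6, 6, 2, 1, 1]          # relief from hour 4 onward reflects the rescue drug
    rescue_hour = 3

    s = pd.Series(scores, index=times, dtype=float)
    censored = s.copy()
    censored[censored.index > rescue_hour] = np.nan   # discard post-rescue observations

    locf = censored.ffill()                            # last observation carried forward
    bocf = censored.fillna(censored.iloc[0])           # baseline observation carried forward

    # LOCF keeps the pre-rescue score of 6; BOCF reverts to the baseline score of 8,
    # which is the more conservative (less favorable) imputation for the active arm.
    print(pd.DataFrame({"observed": s, "LOCF": locf, "BOCF": bocf}))
    ```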

  2. Development of a method for the characterization and operation of UV-LED for water treatment.

    PubMed

    Kheyrandish, Ataollah; Mohseni, Madjid; Taghipour, Fariborz

    2017-10-01

    Tremendous improvements in semiconductor technology have made ultraviolet light-emitting diodes (UV-LEDs) a viable alternative to conventional UV sources for water treatment. A robust and validated experimental protocol for studying the kinetics of microorganism inactivation is key to the further development of UV-LEDs for water treatment. This study proposes a protocol to operate UV-LEDs and control their output as a polychromatic radiation source. In order to systematically develop this protocol, the results of spectral power distribution, radiation profile, and radiant power measurements of a variety of UV-LEDs are presented. A wide range of UV-LEDs was selected for this study, covering various UVA, UVB, and UVC wavelengths, viewing angles from 3.5° to 135°, and a variety of output powers. The effects of operating conditions and measurement techniques on these UV-LEDs were investigated using a specially designed and fabricated setup. Operating conditions, such as the UV-LED electrical current and solder temperature, were found to significantly affect the power and peak wavelength output. The measurement techniques and equipment, including the detector size, detector distance from the UV-LED, and potential reflection from the environment, were shown to influence the results for many of the UV-LEDs. The results obtained from these studies were analyzed and applied to the development of a protocol for UV-LED characterization. This protocol is presented as a guideline that allows the operation and control of UV-LEDs in any structure, as well as accurate measurement of the UV-LED output. Such information is essential for performing a reliable UV-LED assessment for the inactivation of microorganisms and for obtaining precise kinetic data. Copyright © 2017 Elsevier Ltd. All rights reserved.
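
    One generic way to reason about the detector-distance effect mentioned above is the far-field inverse-square law combined with a Lambertian-order approximation of the LED's radiation profile. The sketch below is an illustration of that standard approximation, not the correction prescribed by the protocol, and the viewing angle and radiant power values are arbitrary.

    ```python
    import math

    def lambertian_order(half_angle_deg):
        """Lambertian order m from the half-intensity angle, assuming I(theta) = I0*cos^m(theta)."""
        return -math.log(2.0) / math.log(math.cos(math.radians(half_angle_deg)))

    def on_axis_irradiance(radiant_power_mw, half_angle_deg, distance_m):
        """Far-field on-axis irradiance (mW/m^2) of an ideal Lambertian-like emitter."""
        m = lambertian_order(half_angle_deg)
        # Total power = integral of I0*cos^m(theta) over the forward hemisphere = 2*pi*I0/(m+1)
        i0 = radiant_power_mw * (m + 1) / (2.0 * math.pi)   # radiant intensity, mW/sr
        return i0 / distance_m ** 2

    # Illustrative UV-LED: 10 mW output, 60 degree viewing angle (full width at half maximum).
    for d in (0.05, 0.10, 0.20):
        print(f"{d:.2f} m: {on_axis_irradiance(10.0, 60.0 / 2.0, d):.1f} mW/m^2")
    ```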

  3. Move to improve: the feasibility of using an early mobility protocol to increase ambulation in the intensive and intermediate care settings.

    PubMed

    Drolet, Anne; DeJuilio, Patti; Harkless, Sherri; Henricks, Sherry; Kamin, Elizabeth; Leddy, Elizabeth A; Lloyd, Joanna M; Waters, Carissa; Williams, Sarah

    2013-02-01

    Prolonged bed rest in hospitalized patients leads to deconditioning, impaired mobility, and the potential for longer hospital stays. The purpose of this study was to determine the effectiveness of a nurse-driven mobility protocol to increase the percentage of patients ambulating during the first 72 hours of their hospital stay. A quasi-experimental design was used before and after intervention in a 16-bed adult medical/surgical intensive care unit (ICU) and a 26-bed adult intermediate care unit (IMCU) at a large community hospital. A multidisciplinary team developed and implemented a mobility order set with an embedded algorithm to guide nursing assessment of mobility potential. Based on the assessments, the protocol empowers the nurse to consult physical therapists or occupational therapists when appropriate. Daily ambulation status reports were reviewed each morning to determine each patient's activity level. Retrospective and prospective chart reviews were performed to evaluate the effectiveness of the protocol for patients 18 years of age and older who were hospitalized 72 hours or longer. In the 3 months prior to implementation of the Move to Improve project, 6.2% (12 of 193) of the ICU patients and 15.5% (54 of 349) of the IMCU patients ambulated during the first 72 hours of their hospitalization. During the 6 months following implementation, those rates rose to 20.2% (86 of 426) and 71.8% (257 of 358), respectively. The study was carried out at only one center. The initial experience with a nurse-driven mobility protocol suggests that the rate of patient ambulation in an adult ICU and IMCU during the first 72 hours of a hospital stay can be increased.

  4. PANATIKI: A Network Access Control Implementation Based on PANA for IoT Devices

    PubMed Central

    Sanchez, Pedro Moreno; Lopez, Rafa Marin; Gomez Skarmeta, Antonio F.

    2013-01-01

    Internet of Things (IoT) networks are the pillar of recent novel scenarios, such as smart cities or e-healthcare applications. Among other challenges, these networks cover the deployment and interaction of small devices with constrained capabilities and Internet protocol (IP)-based networking connectivity. These constrained devices usually require connection to the Internet to exchange information (e.g., management or sensing data) or access network services. However, only authenticated and authorized devices can, in general, establish this connection. The so-called authentication, authorization and accounting (AAA) services are in charge of performing these tasks on the Internet. Thus, it is necessary to deploy protocols that allow constrained devices to verify their credentials against AAA infrastructures. The Protocol for Carrying Authentication for Network Access (PANA) has been standardized by the Internet engineering task force (IETF) to carry the Extensible Authentication Protocol (EAP), which provides flexible authentication upon the presence of AAA. To the best of our knowledge, this paper is the first deep study of the feasibility of EAP/PANA for network access control in constrained devices. We provide light-weight versions and implementations of these protocols to fit them into constrained devices. These versions have been designed to reduce the impact in standard specifications. The goal of this work is two-fold: (1) to demonstrate the feasibility of EAP/PANA in IoT devices; (2) to provide the scientific community with the first light-weight interoperable implementation of EAP/PANA for constrained devices in the Contiki operating system (Contiki OS), called PANATIKI. The paper also shows a testbed, simulations and experimental results obtained from real and simulated constrained devices. PMID:24189332

  5. PANATIKI: a network access control implementation based on PANA for IoT devices.

    PubMed

    Moreno Sanchez, Pedro; Marin Lopez, Rafa; Gomez Skarmeta, Antonio F

    2013-11-01

    Internet of Things (IoT) networks are the pillar of recent novel scenarios, such as smart cities or e-healthcare applications. Among other challenges, these networks cover the deployment and interaction of small devices with constrained capabilities and Internet protocol (IP)-based networking connectivity. These constrained devices usually require connection to the Internet to exchange information (e.g., management or sensing data) or access network services. However, only authenticated and authorized devices can, in general, establish this connection. The so-called authentication, authorization and accounting (AAA) services are in charge of performing these tasks on the Internet. Thus, it is necessary to deploy protocols that allow constrained devices to verify their credentials against AAA infrastructures. The Protocol for Carrying Authentication for Network Access (PANA) has been standardized by the Internet engineering task force (IETF) to carry the Extensible Authentication Protocol (EAP), which provides flexible authentication upon the presence of AAA. To the best of our knowledge, this paper is the first deep study of the feasibility of EAP/PANA for network access control in constrained devices. We provide light-weight versions and implementations of these protocols to fit them into constrained devices. These versions have been designed to reduce the impact in standard specifications. The goal of this work is two-fold: (1) to demonstrate the feasibility of EAP/PANA in IoT devices; (2) to provide the scientific community with the first light-weight interoperable implementation of EAP/PANA for constrained devices in the Contiki operating system (Contiki OS), called PANATIKI. The paper also shows a testbed, simulations and experimental results obtained from real and simulated constrained devices.

  6. Rethinking developmental toxicity testing: Evolution or revolution?

    PubMed

    Scialli, Anthony R; Daston, George; Chen, Connie; Coder, Prägati S; Euling, Susan Y; Foreman, Jennifer; Hoberman, Alan M; Hui, Julia; Knudsen, Thomas; Makris, Susan L; Morford, LaRonda; Piersma, Aldert H; Stanislaus, Dinesh; Thompson, Kary E

    2018-06-01

    Current developmental toxicity testing adheres largely to protocols suggested in 1966 involving the administration of test compound to pregnant laboratory animals. After more than 50 years of embryo-fetal development testing, are we ready to consider a different approach to human developmental toxicity testing? A workshop was held under the auspices of the Developmental and Reproductive Toxicology Technical Committee of the ILSI Health and Environmental Sciences Institute to consider how we might design developmental toxicity testing if we started over with 21st century knowledge and techniques (revolution). We first consider what changes to the current protocols might be recommended to make them more predictive for human risk (evolution). The evolutionary approach includes modifications of existing protocols and can include humanized models, disease models, more accurate assessment and testing of metabolites, and informed approaches to dose selection. The revolution could start with hypothesis-driven testing where we take what we know about a compound or close analog and answer specific questions using targeted experimental techniques rather than a one-protocol-fits-all approach. Central to the idea of hypothesis-driven testing is the concept that testing can be done at the level of mode of action. It might be feasible to identify a small number of key events at a molecular or cellular level that predict an adverse outcome and for which testing could be performed in vitro or in silico or, rarely, using limited in vivo models. Techniques for evaluating these key events exist today or are in development. Opportunities exist for refining and then replacing current developmental toxicity testing protocols using techniques that have already been developed or are within reach. © 2018 The Authors. Birth Defects Research Published by Wiley Periodicals, Inc.

  7. Effects of a peer support programme for youth social services employees experiencing potentially traumatic events: a protocol for a prospective cohort study.

    PubMed

    Guay, Stephane; Tremblay, Nicole; Goncalves, Jane; Bilodeau, Henriette; Geoffrion, Steve

    2017-06-24

    The use of peer support programmes to help workers experiencing potentially traumatic events (PTE) has increased in high-risk organisations in recent decades. However, the scientific evidence of their effectiveness is still very limited. This paper describes the protocol of a prospective cohort study that assesses the effects of a peer support programme on the psychological well-being, work functioning and support needs of youth social services employees exposed to a PTE at work. This is a mixed-methods prospective study that will examine workers' evolution four times over a 12-month period in Canada. The study involves (1) quantitative data obtained through self-administered questionnaires among 222 workers, and (2) qualitative in-depth interviews with a subsample of 45 workers. It will compare findings from three groups: a cohort that received the support of a peer following a PTE (peer support-experimental protocol) as part of the experimental protocol of the Montreal Youth Social Services-University Institute (MYSS-UI); a second group of workers who were part of the MYSS-UI but did not ask for peer support (no peer support-experimental protocol); and a third group that received standard organisational support from the Monteregie Youth Social Services (MYSS) (standard organisational protocol). The protocol and informed consent form complied with the ethics guidelines of the MYSS-UI. The Research Ethics Board of MYSS-UI and MYSS reviewed and accepted the protocol as required. The results of the study will be published in peer-reviewed journals, presented at research and general public conferences, and disseminated via a public report for the institute that funded the project and for all workers. Results of this study will inform decision making regarding intervention policies following PTE, and peer support interventions may be expanded throughout youth social services in Canada and worldwide. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  8. Improved Modeling of Side-Chain–Base Interactions and Plasticity in Protein–DNA Interface Design

    PubMed Central

    Thyme, Summer B.; Baker, David; Bradley, Philip

    2012-01-01

    Combinatorial sequence optimization for protein design requires libraries of discrete side-chain conformations. The discreteness of these libraries is problematic, particularly for long, polar side chains, since favorable interactions can be missed. Previously, an approach to loop remodeling where protein backbone movement is directed by side-chain rotamers predicted to form interactions previously observed in native complexes (termed “motifs”) was described. Here, we show how such motif libraries can be incorporated into combinatorial sequence optimization protocols and improve native complex recapitulation. Guided by the motif rotamer searches, we made improvements to the underlying energy function, increasing recapitulation of native interactions. To further test the methods, we carried out a comprehensive experimental scan of amino acid preferences in the I-AniI protein–DNA interface and found that many positions tolerated multiple amino acids. This sequence plasticity is not observed in the computational results because of the fixed-backbone approximation of the model. We improved modeling of this diversity by introducing DNA flexibility and reducing the convergence of the simulated annealing algorithm that drives the design process. In addition to serving as a benchmark, this extensive experimental data set provides insight into the types of interactions essential to maintain the function of this potential gene therapy reagent. PMID:22426128

  9. Improved modeling of side-chain--base interactions and plasticity in protein--DNA interface design.

    PubMed

    Thyme, Summer B; Baker, David; Bradley, Philip

    2012-06-08

    Combinatorial sequence optimization for protein design requires libraries of discrete side-chain conformations. The discreteness of these libraries is problematic, particularly for long, polar side chains, since favorable interactions can be missed. Previously, an approach to loop remodeling where protein backbone movement is directed by side-chain rotamers predicted to form interactions previously observed in native complexes (termed "motifs") was described. Here, we show how such motif libraries can be incorporated into combinatorial sequence optimization protocols and improve native complex recapitulation. Guided by the motif rotamer searches, we made improvements to the underlying energy function, increasing recapitulation of native interactions. To further test the methods, we carried out a comprehensive experimental scan of amino acid preferences in the I-AniI protein-DNA interface and found that many positions tolerated multiple amino acids. This sequence plasticity is not observed in the computational results because of the fixed-backbone approximation of the model. We improved modeling of this diversity by introducing DNA flexibility and reducing the convergence of the simulated annealing algorithm that drives the design process. In addition to serving as a benchmark, this extensive experimental data set provides insight into the types of interactions essential to maintain the function of this potential gene therapy reagent. Published by Elsevier Ltd.

  10. Buprenorphine During Pregnancy Reduces Neonate Distress

    MedlinePlus

    ... supported clinical trial, the Maternal Opioid Treatment: Human Experimental Research (MOTHER) study, has found buprenorphine to be ... to the two medications. They surmise that the experimental treatment protocols may have moved patients from morphine ...

  11. Experimental quantum fingerprinting with weak coherent pulses

    PubMed Central

    Xu, Feihu; Arrazola, Juan Miguel; Wei, Kejin; Wang, Wenyuan; Palacios-Avila, Pablo; Feng, Chen; Sajeed, Shihan; Lütkenhaus, Norbert; Lo, Hoi-Kwong

    2015-01-01

    Quantum communication holds the promise of creating disruptive technologies that will play an essential role in future communication networks. For example, the study of quantum communication complexity has shown that quantum communication allows exponential reductions in the information that must be transmitted to solve distributed computational tasks. Recently, protocols that realize this advantage using optical implementations have been proposed. Here we report a proof-of-concept experimental demonstration of a quantum fingerprinting system that is capable of transmitting less information than the best-known classical protocol. Our implementation is based on a modified version of a commercial quantum key distribution system using off-the-shelf optical components over telecom wavelengths, and is practical for messages as large as 100 Mbits, even in the presence of experimental imperfections. Our results provide a first step in the development of experimental quantum communication complexity. PMID:26515586

  12. Experimental quantum fingerprinting with weak coherent pulses.

    PubMed

    Xu, Feihu; Arrazola, Juan Miguel; Wei, Kejin; Wang, Wenyuan; Palacios-Avila, Pablo; Feng, Chen; Sajeed, Shihan; Lütkenhaus, Norbert; Lo, Hoi-Kwong

    2015-10-30

    Quantum communication holds the promise of creating disruptive technologies that will play an essential role in future communication networks. For example, the study of quantum communication complexity has shown that quantum communication allows exponential reductions in the information that must be transmitted to solve distributed computational tasks. Recently, protocols that realize this advantage using optical implementations have been proposed. Here we report a proof-of-concept experimental demonstration of a quantum fingerprinting system that is capable of transmitting less information than the best-known classical protocol. Our implementation is based on a modified version of a commercial quantum key distribution system using off-the-shelf optical components over telecom wavelengths, and is practical for messages as large as 100 Mbits, even in the presence of experimental imperfections. Our results provide a first step in the development of experimental quantum communication complexity.

  13. Experimental quantum fingerprinting with weak coherent pulses

    NASA Astrophysics Data System (ADS)

    Xu, Feihu; Arrazola, Juan Miguel; Wei, Kejin; Wang, Wenyuan; Palacios-Avila, Pablo; Feng, Chen; Sajeed, Shihan; Lütkenhaus, Norbert; Lo, Hoi-Kwong

    2015-10-01

    Quantum communication holds the promise of creating disruptive technologies that will play an essential role in future communication networks. For example, the study of quantum communication complexity has shown that quantum communication allows exponential reductions in the information that must be transmitted to solve distributed computational tasks. Recently, protocols that realize this advantage using optical implementations have been proposed. Here we report a proof-of-concept experimental demonstration of a quantum fingerprinting system that is capable of transmitting less information than the best-known classical protocol. Our implementation is based on a modified version of a commercial quantum key distribution system using off-the-shelf optical components over telecom wavelengths, and is practical for messages as large as 100 Mbits, even in the presence of experimental imperfections. Our results provide a first step in the development of experimental quantum communication complexity.

  14. The Instrumental Model

    ERIC Educational Resources Information Center

    Yeates, Devin Rodney

    2011-01-01

    The goal of this dissertation is to enable better predictive models by engaging raw experimental data through the Instrumental Model. The Instrumental Model captures the protocols and procedures of experimental data analysis. The approach is formalized by encoding the Instrumental Model in an XML record. Decoupling the raw experimental data from…

  15. The Design of Finite State Machine for Asynchronous Replication Protocol

    NASA Astrophysics Data System (ADS)

    Wang, Yanlong; Li, Zhanhuai; Lin, Wei; Hei, Minglei; Hao, Jianhua

    Data replication is a key way to design a disaster tolerance system and to achieve reliability and availability. It is difficult for a replication protocol to deal with diverse and complex environments, which means that data is often less well replicated than it ought to be. To reduce data loss and to optimize replication protocols, we (1) present a finite state machine, (2) run it to manage an asynchronous replication protocol and (3) report a simple evaluation of the asynchronous replication protocol based on our state machine. We show that the state machine keeps the asynchronous replication protocol in the proper state to the greatest extent possible in the event of various failures, and that it can help build replication-based disaster tolerance systems that ensure business continuity.
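
    The abstract does not reproduce the machine's actual states or transitions, so the following is only a minimal sketch of how a finite state machine might drive an asynchronous replication protocol; the states, events and transition table are illustrative assumptions rather than the paper's design.

```python
"""Minimal sketch of a finite state machine driving asynchronous replication."""
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    REPLICATING = auto()
    DEGRADED = auto()      # replication link lost; writes are queued locally
    RESYNCING = auto()     # catching the replica up after an outage

class Event(Enum):
    WRITE_ARRIVED = auto()
    ACK_RECEIVED = auto()
    LINK_DOWN = auto()
    LINK_UP = auto()
    RESYNC_DONE = auto()

# (current state, event) -> next state
TRANSITIONS = {
    (State.IDLE, Event.WRITE_ARRIVED): State.REPLICATING,
    (State.REPLICATING, Event.ACK_RECEIVED): State.IDLE,
    (State.REPLICATING, Event.LINK_DOWN): State.DEGRADED,
    (State.IDLE, Event.LINK_DOWN): State.DEGRADED,
    (State.DEGRADED, Event.LINK_UP): State.RESYNCING,
    (State.RESYNCING, Event.RESYNC_DONE): State.IDLE,
}

class ReplicationFSM:
    def __init__(self) -> None:
        self.state = State.IDLE

    def handle(self, event: Event) -> State:
        """Apply an event; unknown (state, event) pairs leave the state unchanged."""
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state

if __name__ == "__main__":
    fsm = ReplicationFSM()
    for ev in (Event.WRITE_ARRIVED, Event.LINK_DOWN, Event.LINK_UP, Event.RESYNC_DONE):
        print(ev.name, "->", fsm.handle(ev).name)
```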

  16. Uniform magnetic fields and double-wrapped coil systems: improved techniques for the design of bioelectromagnetic experiments.

    PubMed

    Kirschvink, J L

    1992-01-01

    A common mistake in biomagnetic experimentation is the assumption that Helmholtz coils provide uniform magnetic fields; this is true only for a limited volume at their center. Substantial improvements on this design have been made during the past 140 years with systems of three, four, and five coils. Numerical comparisons of the field uniformity generated by these designs are made here, along with a table of construction details and recommendations for their use in experiments in which large volumes of uniform-intensity magnetic exposure are needed. Double-wrapping, or the use of bifilar windings, can also help control for the non-magnetic effects of the electric coils used in many experiments. In this design, each coil is wrapped in parallel with two separate, adjacent strands of copper wire, rather than the single strand normally used. If the currents flow in antiparallel directions, the magnetic fields generated by the two strands cancel and yield virtually no external magnetic field, whereas parallel currents yield an external field. Both cases produce similar non-magnetic effects of ohmic heating, and simple measures can reduce the small differences in vibration and electric field. Control experiments can then be designed such that the only major difference between treated and untreated groups is the presence or absence of the magnetic field. Double-wrapped coils also facilitate the use of a truly double-blind protocol, as the same apparatus can be used for either the experimental or the control group.
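
    A minimal numerical sketch of the cancellation argument follows, using the standard on-axis field formula for a circular current loop; the coil radius and current are hypothetical values, not parameters from the paper.

```python
"""On-axis field of a double-wrapped circular coil (hypothetical geometry).

Each coil is wound with two adjacent strands. With antiparallel strand
currents the fields cancel (sham exposure); with parallel currents they add
(active exposure). Ohmic heating is the same in both cases.
"""
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, T*m/A

def loop_field(current_a: float, radius_m: float, z_m: float) -> float:
    """On-axis field (tesla) of a single circular loop at axial distance z."""
    return MU0 * current_a * radius_m**2 / (2.0 * (radius_m**2 + z_m**2) ** 1.5)

def double_wrapped_field(current_a: float, radius_m: float, z_m: float,
                         antiparallel: bool) -> float:
    """Field of two co-located strands carrying +I and +/-I."""
    sign = -1.0 if antiparallel else 1.0
    return loop_field(current_a, radius_m, z_m) + sign * loop_field(current_a, radius_m, z_m)

if __name__ == "__main__":
    I, R = 1.0, 0.5   # 1 A in a 0.5 m radius loop (hypothetical values)
    for mode in (False, True):
        b = double_wrapped_field(I, R, z_m=0.0, antiparallel=mode)
        label = "antiparallel (control)" if mode else "parallel (exposure)"
        print(f"{label:>24}: {b * 1e6:8.3f} microtesla at the coil centre")
```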

  17. Evaluation of assertiveness training for psychiatric patients.

    PubMed

    Lin, Yen-Ru; Wu, Mei-Hsuen; Yang, Cheng-I; Chen, Tsai-Hwei; Hsu, Chen-Chuan; Chang, Yue-Cune; Tzeng, Wen-Chii; Chou, Yuan-Hwa; Chou, Kuei-Ru

    2008-11-01

    To investigate the effectiveness of assertiveness training programmes on psychiatric patients' assertiveness, self-esteem and social anxiety. Assertiveness training programmes are designed to improve an individual's assertive beliefs and behaviours, which can help individuals change how they view themselves, establish self-confidence and reduce social anxiety. Such training is useful for patients with depression, the depressive phase of bipolar disorder, anxiety disorder or adjustment disorder. Experimental. There were 68 subjects (28 in the experimental group; 40 in a diagnosis-matched comparison group). Subjects in the experimental group participated in experimenter-designed assertiveness training twice a week (two hours per session) for four weeks. The comparison group participated in the usual activities. Data were collected from both groups at the same time points: before, immediately after and one month after the training programme. Efficacy was measured with assertiveness, self-esteem and social anxiety inventories, and a generalised estimating equation was used for the analysis. Subjects showed a significant increase in assertiveness immediately after the training programme and at the one-month follow-up. There was a significant decrease in social anxiety immediately after training, but the improvement was no longer significant after one month. Self-esteem did not increase significantly after training. With our sample of patients with mixed diagnoses, assertiveness appeared to improve after assertiveness training. Patients may benefit from the programme by changing how they view themselves, improving their assertiveness, expressing their moods and thoughts appropriately and further establishing self-confidence. The assertiveness training protocol could serve as a reference guide for clinical nurses.
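
    The analysis described above is a generalised estimating equation over repeated measures (before, immediately after, one month after). The sketch below shows how such a model could be fitted with statsmodels on simulated data; the simulated scores and effect sizes are arbitrary placeholders, not the study's measurements.

```python
"""Sketch of a GEE analysis for a pre/post/follow-up design (synthetic data)."""
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
subjects = np.arange(68)
group = np.where(subjects < 28, "training", "comparison")

rows = []
for subj, grp in zip(subjects, group):
    base = rng.normal(50, 8)                      # baseline assertiveness score
    for t, time in enumerate(["pre", "post", "follow_up"]):
        effect = 6.0 if (grp == "training" and t > 0) else 0.0   # arbitrary gain
        rows.append({"subject": subj, "group": grp, "time": time,
                     "assertiveness": base + effect + rng.normal(0, 4)})
df = pd.DataFrame(rows)

# Repeated measures on each subject -> exchangeable working correlation.
model = smf.gee("assertiveness ~ group * time", groups="subject", data=df,
                cov_struct=sm.cov_struct.Exchangeable(),
                family=sm.families.Gaussian())
print(model.fit().summary())
```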

  18. Novel Networked Remote Laboratory Architecture for Open Connectivity Based on PLC-OPC-LabVIEW-EJS Integration. Application in Remote Fuzzy Control and Sensors Data Acquisition.

    PubMed

    González, Isaías; Calderón, Antonio José; Mejías, Andrés; Andújar, José Manuel

    2016-10-31

    In this paper, the design and implementation of a network integrating Programmable Logic Controllers (PLC), the Object-Linking and Embedding for Process Control protocol (OPC) and the open-source Easy Java Simulations (EJS) package is presented. A LabVIEW interface and the Java-Internet-LabVIEW (JIL) server complete the scheme for data exchange. This configuration allows the user to interact remotely with the PLC. Such integration can be considered a novelty in the scientific literature on remote control and sensor data acquisition for industrial plants. An experimental application devoted to remote laboratories is developed to demonstrate the feasibility and benefits of the proposed approach. The experiment to be conducted is the parameterization and supervision of a fuzzy controller for a DC servomotor. The graphical user interface has been developed with EJS, and the fuzzy control is carried out by our own PLC. The distinctive features of the proposed network application are the integration of the OPC protocol to share information with the PLC and with the application under control. The user can tune the controller parameters online and observe their effect on the servomotor behavior in real time. The target group is remote engineering users, specifically those working on control- and automation-related tasks. The proposed architecture is described and experimental results are presented.

  19. Novel Networked Remote Laboratory Architecture for Open Connectivity Based on PLC-OPC-LabVIEW-EJS Integration. Application in Remote Fuzzy Control and Sensors Data Acquisition

    PubMed Central

    González, Isaías; Calderón, Antonio José; Mejías, Andrés; Andújar, José Manuel

    2016-01-01

    In this paper, the design and implementation of a network integrating Programmable Logic Controllers (PLC), the Object-Linking and Embedding for Process Control protocol (OPC) and the open-source Easy Java Simulations (EJS) package is presented. A LabVIEW interface and the Java-Internet-LabVIEW (JIL) server complete the scheme for data exchange. This configuration allows the user to interact remotely with the PLC. Such integration can be considered a novelty in the scientific literature on remote control and sensor data acquisition for industrial plants. An experimental application devoted to remote laboratories is developed to demonstrate the feasibility and benefits of the proposed approach. The experiment to be conducted is the parameterization and supervision of a fuzzy controller for a DC servomotor. The graphical user interface has been developed with EJS, and the fuzzy control is carried out by our own PLC. The distinctive features of the proposed network application are the integration of the OPC protocol to share information with the PLC and with the application under control. The user can tune the controller parameters online and observe their effect on the servomotor behavior in real time. The target group is remote engineering users, specifically those working on control- and automation-related tasks. The proposed architecture is described and experimental results are presented. PMID:27809229
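
    The architecture above exchanges data with the PLC through the classic OPC protocol. As a loose illustration of that style of tag-based data exchange, the sketch below uses the third-party OpenOPC package (OPC DA over Windows/DCOM, or its Python 3 fork); the server name and tag names are hypothetical placeholders, not those of the JIL/LabVIEW setup described in the paper.

```python
"""Loose sketch of tag-based data exchange with a PLC over classic OPC."""
import OpenOPC

opc = OpenOPC.client()
opc.connect("Matrikon.OPC.Simulation.1")           # hypothetical OPC DA server name

# Write a fuzzy-controller setpoint tag and read back the measured speed
# (both tag names are hypothetical).
opc.write(("PLC.Servo.Setpoint", 1200.0))
value, quality, timestamp = opc.read("PLC.Servo.Speed")
print(f"speed = {value} (quality: {quality}, at {timestamp})")

opc.close()
```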

  20. Large-Scale Low-Cost NGS Library Preparation Using a Robust Tn5 Purification and Tagmentation Protocol

    PubMed Central

    Hennig, Bianca P.; Velten, Lars; Racke, Ines; Tu, Chelsea Szu; Thoms, Matthias; Rybin, Vladimir; Besir, Hüseyin; Remans, Kim; Steinmetz, Lars M.

    2017-01-01

    Efficient preparation of high-quality sequencing libraries that accurately represent the biological sample is a key step in using next-generation sequencing in research. Tn5 enables fast, robust, and highly efficient processing of limited input material while scaling to the parallel processing of hundreds of samples. Here, we present a robust Tn5 transposase purification strategy based on an N-terminal His6-Sumo3 tag. We demonstrate that libraries prepared with our in-house Tn5 are of the same quality as those processed with a commercially available kit (Nextera XT), while dramatically reducing the cost of large-scale experiments. We introduce improved purification strategies for two versions of the Tn5 enzyme. The first version carries the previously reported point mutations E54K and L372P, and stably produces libraries with a constant fragment size distribution, even if the Tn5-to-input molecule ratio varies. The second Tn5 construct carries an additional point mutation (R27S) in the DNA-binding domain. This construct allows the fragment size distribution to be adjusted through the enzyme concentration during tagmentation, a feature that opens new opportunities for using Tn5 in customized experimental designs. We demonstrate the versatility of our Tn5 enzymes in different experimental settings, including a novel single-cell polyadenylation site mapping protocol as well as ultralow-input DNA sequencing. PMID:29118030
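
    Because the fragment size distribution of the second construct is tuned through the Tn5-to-input molecule ratio, protocols of this kind involve converting input DNA mass and mean fragment length into molecule counts. The sketch below shows a generic version of that conversion, assuming roughly 650 g/mol per base pair of double-stranded DNA; the input amounts and enzyme counts are placeholders, not values from the published protocol.

```python
"""Sketch of a Tn5-to-input molecule ratio calculation (generic numbers)."""
AVOGADRO = 6.022e23
G_PER_MOL_PER_BP = 650.0   # approximate molar mass of dsDNA per base pair

def dna_molecules(mass_ng: float, mean_length_bp: float) -> float:
    """Number of DNA molecules in a sample of given mass and mean fragment length."""
    moles = (mass_ng * 1e-9) / (G_PER_MOL_PER_BP * mean_length_bp)
    return moles * AVOGADRO

def tn5_to_input_ratio(tn5_molecules: float, mass_ng: float, mean_length_bp: float) -> float:
    """Ratio of transposase molecules to input DNA molecules."""
    return tn5_molecules / dna_molecules(mass_ng, mean_length_bp)

if __name__ == "__main__":
    input_molecules = dna_molecules(mass_ng=1.0, mean_length_bp=10_000)
    print(f"1 ng of 10 kb fragments ~ {input_molecules:.2e} molecules")
    print(f"ratio with 1e11 Tn5 molecules ~ {tn5_to_input_ratio(1e11, 1.0, 10_000):.1f}")
```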
