Sample records for individually optimized protocol

  1. Quantum cryptography: individual eavesdropping with the knowledge of the error-correcting protocol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horoshko, D B

    2007-12-31

The quantum key distribution protocol BB84 combined with the repetition protocol for error correction is analysed from the point of view of its security against individual eavesdropping relying on quantum memory. It is shown that the mere knowledge of the error-correcting protocol changes the optimal attack and provides the eavesdropper with additional information on the distributed key. (Fifth Seminar in Memory of D.N. Klyshko)

  2. Individual Optimal Frequency in Whole-Body Vibration: Effect of Protocol, Joint Angle, and Fatiguing Exercise.

    PubMed

    Carlucci, Flaminia; Felici, Francesco; Piccinini, Alberto; Haxhi, Jonida; Sacchetti, Massimo

    2016-12-01

Carlucci, F, Felici, F, Piccinini, A, Haxhi, J, and Sacchetti, M. Individual optimal frequency in whole-body vibration: effect of protocol, joint angle, and fatiguing exercise. J Strength Cond Res 30(12): 3503-3511, 2016-Recent studies have shown the importance of individualizing the vibration intervention to produce greater effects on the neuromuscular system in less time. The purpose of this study was to assess the individual optimal vibration frequency (OVF), corresponding to the highest muscle activation (RMSmax) during vibration at different frequencies, comparing different protocols. Twenty-nine university students underwent 3 continuous (C) and 2 random (R) vibrating protocols, maintaining a squat position on a vibration platform. The C protocol lasted 50 seconds and involved a succession of ascending frequencies from 20 to 55 Hz, changing every 5 seconds. The same protocol was performed with the knee angle at 120° (C) and at 90° (C90), to assess the effect of joint angle, and after a fatiguing squatting exercise (CF), to evaluate the influence of fatigue on OVF assessment. In the random protocols, vibration time was 20 seconds, with 2-minute (R2) and 4-minute (R4) pauses between tested frequencies. Muscle activation and OVF values did not differ significantly among the C, R2, and R4 protocols. RMSmax was higher in C90 (p < 0.001) and in CF (p = 0.04) compared with the C protocol. Joint angle and fatiguing exercise had no effect on OVF. In conclusion, the shorter C protocol produced myoelectrical activity similar to the R2 and R4 protocols and therefore could be equally valid in identifying the OVF with considerable time efficiency. Knee joint angle and fatiguing exercise affected the surface electromyography response during vibration but did not significantly affect OVF identification.
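
    The OVF identification described above amounts to picking, among the tested frequencies, the one whose vibration window yields the highest surface-EMG RMS. The sketch below illustrates that selection step; the sampling rate, window segmentation, and synthetic signal are assumptions for illustration, not the study's actual processing pipeline.

```python
import numpy as np

def optimal_vibration_frequency(emg, fs, freqs=range(20, 60, 5), window_s=5.0):
    """Pick the vibration frequency whose window shows the highest EMG RMS.

    emg   : 1-D surface-EMG signal recorded during the continuous protocol
    fs    : sampling rate in Hz (assumed)
    freqs : tested vibration frequencies (20-55 Hz in 5-Hz steps, as in the study)
    """
    n_win = int(window_s * fs)
    rms_per_freq = {}
    for i, f in enumerate(freqs):
        segment = emg[i * n_win:(i + 1) * n_win]
        rms_per_freq[f] = np.sqrt(np.mean(segment ** 2))
    ovf = max(rms_per_freq, key=rms_per_freq.get)
    return ovf, rms_per_freq

# Synthetic example: 8 windows of 5 s at 1 kHz with different EMG amplitudes
fs = 1000
rng = np.random.default_rng(0)
emg = rng.normal(scale=np.repeat([1, 1.2, 1.5, 2.0, 1.8, 1.4, 1.1, 1.0], 5 * fs))
print(optimal_vibration_frequency(emg, fs))  # highest-RMS window -> 35 Hz here
```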

  3. An intelligent case-adjustment algorithm for the automated design of population-based quality auditing protocols.

    PubMed

    Advani, Aneel; Jones, Neil; Shahar, Yuval; Goldstein, Mary K; Musen, Mark A

    2004-01-01

We develop a method and algorithm for deciding the optimal approach to creating quality-auditing protocols for guideline-based clinical performance measures. An important element of the audit protocol design problem is deciding which guideline elements to audit. Specifically, the problem is how and when to aggregate individual patient case-specific guideline elements into population-based quality measures. The key statistical issue involved is the trade-off between increased reliability with more general population-based quality measures versus increased validity from individually case-adjusted but more restricted measures done at a greater audit cost. Our intelligent algorithm for auditing protocol design is based on hierarchically modeling incrementally case-adjusted quality constraints. We select quality constraints to measure using an optimization criterion based on statistical generalizability coefficients. We present results of the approach from a deployed decision support system for a hypertension guideline.
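
    As a rough illustration of the reliability-versus-cost trade-off described above, the sketch below compares the standard generalizability coefficient, E(rho^2) = var_true / (var_true + var_error / n), for a cheap population-based measure and a costlier case-adjusted measure under a fixed audit budget. The variance components, costs, and budget are invented for illustration and are not taken from the deployed system.

```python
# Hedged sketch: compare a broad population-based measure with a case-adjusted
# measure by their generalizability coefficients under a fixed audit budget.
# All variance components and costs below are made-up illustrations.

def g_coefficient(var_true, var_error, n_cases):
    """Generalizability (reliability) coefficient for a mean over n_cases audits."""
    return var_true / (var_true + var_error / n_cases)

budget = 200.0  # total audit cost available (illustrative)

candidates = {
    # name: (true-score variance, error variance, cost per audited case)
    "population-based": (0.8, 2.0, 1.0),   # cheap, less case-specific validity
    "case-adjusted":    (1.2, 2.0, 4.0),   # costlier per case, more valid
}

for name, (v_true, v_err, cost) in candidates.items():
    n = int(budget // cost)                # cases affordable within the budget
    print(f"{name:17s} n={n:3d}  E(rho^2)={g_coefficient(v_true, v_err, n):.3f}")
```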

  4. On the optimality of individual entangling-probe attacks against BB84 quantum key distribution

    NASA Astrophysics Data System (ADS)

Herbauts, I. M.; Bettelli, S.; Hübel, H.; Peev, M.

    2008-02-01

Some MIT researchers [Phys. Rev. A 75, 042327 (2007)] have recently claimed that their implementation of the Slutsky-Brandt attack [Phys. Rev. A 57, 2383 (1998); Phys. Rev. A 71, 042312 (2005)] on the BB84 quantum-key-distribution (QKD) protocol puts the security of this protocol “to the test” by simulating “the most powerful individual-photon attack” [Phys. Rev. A 73, 012315 (2006)]. A related unfortunate news feature by a scientific journal [G. Brumfiel, Quantum cryptography is hacked, News @ Nature (April 2007); Nature 447, 372 (2007)] has spurred some concern in the QKD community and among the general public by misinterpreting the implications of this work. The present article proves the existence of a stronger individual attack on QKD protocols with encrypted error correction, for which tight bounds are shown, and clarifies why the claims of the news feature incorrectly suggest a contradiction with the established “old-style” theory of BB84 individual attacks. The full implementation of a quantum cryptographic protocol includes a reconciliation and a privacy-amplification stage, whose choice alters in general both the maximum extractable secret and the optimal eavesdropping attack. The authors of [Phys. Rev. A 75, 042327 (2007)] are concerned only with the error-free part of the so-called sifted string, and do not consider faulty bits, which, in the version of their protocol, are discarded. When using the provably superior reconciliation approach of encrypted error correction (instead of error discard), the Slutsky-Brandt attack is no longer optimal and does not “threaten” the security bound derived by Lütkenhaus [Phys. Rev. A 59, 3301 (1999)]. It is shown that the method of Slutsky and collaborators [Phys. Rev. A 57, 2383 (1998)] can be adapted to reconciliation with error correction, and that the optimal entangling probe can be explicitly found. Moreover, this attack saturates the Lütkenhaus bound, proving that it is tight (a fact which was not previously known).

  5. An optimized 13C-urea breath test for the diagnosis of H pylori infection

    PubMed Central

    Campuzano-Maya, Germán

    2007-01-01

    AIM: To validate an optimized 13C-urea breath test (13C-UBT) protocol for the diagnosis of H pylori infection that is cost-efficient and maintains excellent diagnostic accuracy. METHODS: 70 healthy volunteers were tested with two simplified 13C-UBT protocols, with test meal (Protocol 2) and without test meal (Protocol 1). Breath samples were collected at 10, 20 and 30 min after ingestion of 50 mg 13C-urea dissolved in 10 mL of water, taken as a single swallow, followed by 200 mL of water (pH 6.0) and a circular motion around the waistline to homogenize the urea solution. Performance of both protocols was analyzed at various cut-off values. Results were validated against the European protocol. RESULTS: According to the reference protocol, 65.7% individuals were positive for H pylori infection and 34.3% were negative. There were no significant differences in the ability of both protocols to correctly identify positive and negative H pylori individuals. However, only Protocol 1 with no test meal achieved accuracy, sensitivity, specificity, positive and negative predictive values of 100%. The highest values achieved by Protocol 2 were 98.57%, 97.83%, 100%, 100% and 100%, respectively. CONCLUSION: A 10 min, 50 mg 13C-UBT with no test meal using a cut-off value of 2-2.5 is a highly accurate test for the diagnosis of H pylori infection at a reduced cost. PMID:17907288
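
    A minimal sketch of how a breath-test result could be scored against the reference protocol at the reported 2-2.5 cut-off (interpreted here as delta over baseline, the usual UBT readout) to obtain the accuracy metrics listed above; the sample values and variable names are invented for illustration.

```python
# Hedged sketch: classify 13C-UBT results at a DOB cut-off and compare with
# the reference protocol.  The sample data below are invented.

def diagnostic_metrics(dob_values, reference_positive, cutoff=2.5):
    tp = fp = tn = fn = 0
    for dob, ref_pos in zip(dob_values, reference_positive):
        test_pos = dob >= cutoff
        if test_pos and ref_pos:
            tp += 1
        elif test_pos and not ref_pos:
            fp += 1
        elif not test_pos and ref_pos:
            fn += 1
        else:
            tn += 1
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
        "accuracy": (tp + tn) / len(dob_values),
    }

dob = [0.4, 1.1, 3.2, 7.8, 0.9, 12.5, 2.6, 0.3]         # delta over baseline
ref = [False, False, True, True, False, True, True, False]
print(diagnostic_metrics(dob, ref, cutoff=2.5))
```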

  6. Penicillin allergy: optimizing diagnostic protocols, public health implications, and future research needs.

    PubMed

    Macy, Eric

    2015-08-01

    Unverified penicillin allergy is being increasingly recognized as a public health concern. The ideal protocol for verifying true clinically significant IgE-mediated penicillin allergy needs to use only commercially available materials, be well tolerated and easy to perform in both the inpatient and outpatient settings, and minimize false-positive determinations. This review concentrates on articles published in 2013 and 2014 that present new data relating to the diagnosis and management of penicillin allergy. Penicillin allergy can be safely evaluated at this time, in patients with an appropriate clinical history of penicillin allergy, using only penicilloyl-poly-lysine and native penicillin G as skin test reagents, if an oral challenge with amoxicillin 250 mg, followed by 1 h of observation, is given to all skin test negative individuals. Millions of individuals falsely labeled with penicillin allergy need to be evaluated to safely allow them to use penicillin-class antibiotics and avoid morbidity associated with penicillin avoidance. Further research is needed to determine optimal protocol(s). There will still be a 1-2% rate of adverse reactions reported with all future therapeutic penicillin-class antibiotic use, even with optimal methods used to determine acute penicillin tolerance. Only a small minority of these new reactions will be IgE-mediated.

  7. Intravenous Ketamine Infusions for Neuropathic Pain Management: A Promising Therapy in Need of Optimization.

    PubMed

    Maher, Dermot P; Chen, Lucy; Mao, Jianren

    2017-02-01

    Intravenous ketamine infusions have been used extensively to treat often-intractable neuropathic pain conditions. Because there are many widely divergent ketamine infusion protocols described in the literature, the variation in these protocols presents a challenge for direct comparison of one protocol with another and in discerning an optimal protocol. Careful examination of the published literature suggests that ketamine infusions can be useful to treat neuropathic pain and that certain characteristics of ketamine infusions may be associated with better clinical outcomes. Increased duration of relief from neuropathic pain is associated with (1) higher total infused doses of ketamine; (2) prolonged infusion durations, although the rate of infusion does not appear to be a factor; and (3) coadministration of adjunct medications such as midazolam and/or clonidine that mitigate some of the unpleasant psychomimetic side effects. However, there are few studies designed to optimize ketamine infusion protocols by defining what an effective infusion protocol entails with regard to a respective neuropathic pain condition. Therefore, despite common clinical practice, the current state of the literature leaves the use of ketamine infusions without meaningful guidance from high-quality comparative evidence. The objectives of this topical review are to (1) analyze the available clinical evidence related to ketamine infusion protocols and (2) call for clinical studies to identify optimal ketamine infusion protocols tailored for individual neuropathic pain conditions. The Oxford Center for Evidence-Based Medicine classification for levels of evidence was used to stratify the grades of clinical recommendation for each infusion variable studied.

  8. Work extraction and thermodynamics for individual quantum systems

    NASA Astrophysics Data System (ADS)

    Skrzypczyk, Paul; Short, Anthony J.; Popescu, Sandu

    2014-06-01

Thermodynamics is traditionally concerned with systems comprised of a large number of particles. Here we present a framework for extending thermodynamics to individual quantum systems, including explicitly a thermal bath and work-storage device (essentially a ‘weight’ that can be raised or lowered). We prove that the second law of thermodynamics holds in our framework, and give a simple protocol to extract the optimal amount of work from the system, equal to its change in free energy. Our results apply to any quantum system in an arbitrary initial state, in particular including non-equilibrium situations. The optimal protocol is essentially reversible, similar to classical Carnot cycles, and indeed, we show that it can be used to construct a quantum Carnot engine.

  9. Work extraction and thermodynamics for individual quantum systems.

    PubMed

    Skrzypczyk, Paul; Short, Anthony J; Popescu, Sandu

    2014-06-27

Thermodynamics is traditionally concerned with systems comprised of a large number of particles. Here we present a framework for extending thermodynamics to individual quantum systems, including explicitly a thermal bath and work-storage device (essentially a 'weight' that can be raised or lowered). We prove that the second law of thermodynamics holds in our framework, and give a simple protocol to extract the optimal amount of work from the system, equal to its change in free energy. Our results apply to any quantum system in an arbitrary initial state, in particular including non-equilibrium situations. The optimal protocol is essentially reversible, similar to classical Carnot cycles, and indeed, we show that it can be used to construct a quantum Carnot engine.

  10. Optimization of Saanen sperm genes amplification: evaluation of standardized protocols in genetically uncharacterized rural goats reared under a subtropical environment.

    PubMed

    Barbour, Elie K; Saade, Maya F; Sleiman, Fawwak T; Hamadeh, Shady K; Mouneimne, Youssef; Kassaifi, Zeina; Kayali, Ghazi; Harakeh, Steve; Jaber, Lina S; Shaib, Houssam A

    2012-10-01

The purpose of this research was to quantitatively optimize the amplification of specific sperm genes in a reference, genomically characterized Saanen goat and to evaluate the applicability of the standardized protocols to sperm of genomically uncharacterized rural goats reared under a subtropical environment, for inclusion in future selection programs. The optimization of the protocols in Saanen sperm included three production genes (growth hormone (GH) exons 2, 3, and 4, αS1-casein (CSN1S1), and α-lactalbumin) and two health genes (MHC class II DRB and prion (PrP)). The optimization was based on varying the primer concentrations and on the inclusion of a PCR cosolvent (Triton X). A statistically significant increase in amplicon yield attributable to the studied variables was observed in four of the five (80%) optimized protocols, namely those for the GH, CSN1S1, α-lactalbumin, and PrP genes (P < 0.05). There was no significant difference in amplicon yield for the MHC class II DRB gene, regardless of the variables used (P > 0.05). Applying the optimized Saanen sperm-gene protocols to the amplification of uncharacterized rural goat sperm gave a 100% success rate across tested individuals for the GH, CSN1S1, α-lactalbumin, and MHC class II DRB genes and a 75% success rate for the PrP gene. This success in transferring the quantitatively optimized Saanen protocols to the uncharacterized genomes of rural goats allows their inclusion in future selection programs, targeting the sustainability of this farming system in a subtropical environment and the improvement of farmers' livelihoods.

  11. Determination of the exercise intensity that elicits maximal fat oxidation in individuals with obesity.

    PubMed

    Dandanell, Sune; Præst, Charlotte Boslev; Søndergård, Stine Dam; Skovborg, Camilla; Dela, Flemming; Larsen, Steen; Helge, Jørn Wulff

    2017-04-01

Maximal fat oxidation (MFO) and the exercise intensity that elicits MFO (FatMax) are commonly determined by indirect calorimetry during graded exercise tests in both obese and normal-weight individuals. However, no protocol has been validated in individuals with obesity. Thus, the aims were to develop a graded exercise protocol for determination of FatMax in individuals with obesity, and to test validity and inter-method reliability. Fat oxidation was assessed over a range of exercise intensities in 16 individuals (age: 28 (26-29) years; body mass index: 36 (35-38) kg·m-2; 95% confidence interval) on a cycle ergometer. The graded exercise protocol was validated against a short continuous exercise (SCE) protocol, in which FatMax was determined from fat oxidation at rest and during 10 min of continuous exercise at 35%, 50%, and 65% of maximal oxygen uptake. Intraclass and Pearson correlation coefficients between the protocols were 0.75 and 0.72, and the within-subject coefficient of variation (CV) was 5 (3-7)%. A Bland-Altman plot revealed a bias of -3% points of maximal oxygen uptake (limits of agreement: -12 to 7). A tendency towards a systematic difference (p = 0.06) was observed, where FatMax occurred at 42 (40-44)% and 45 (43-47)% of maximal oxygen uptake with the graded and the SCE protocol, respectively. In conclusion, there was a high to excellent correlation and a low CV between the 2 protocols, suggesting that the graded exercise protocol has high inter-method reliability. However, considerable intra-individual variation and a trend towards a systematic difference between the protocols reveal that further optimization of the graded exercise protocol is needed to improve validity.
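
    Two of the computations behind the comparison above, choosing FatMax as the intensity with the highest fat oxidation and summarizing agreement between protocols with a Bland-Altman bias and 95% limits of agreement, are sketched below with invented numbers; this is not the study's analysis code.

```python
import numpy as np

def fat_max(intensities, fat_oxidation):
    """Exercise intensity (% of VO2max) with the highest measured fat oxidation."""
    return intensities[int(np.argmax(fat_oxidation))]

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two methods (paired values)."""
    d = np.asarray(a, float) - np.asarray(b, float)
    bias = d.mean()
    half_width = 1.96 * d.std(ddof=1)
    return bias, (bias - half_width, bias + half_width)

# Invented SCE-style measurement: fat oxidation at three intensities
intensities = np.array([35, 50, 65])          # % of VO2max
fat_ox      = np.array([0.28, 0.35, 0.22])    # g/min, invented
print(fat_max(intensities, fat_ox))           # -> 50

# Invented FatMax (% VO2max) from the graded vs. the SCE protocol in 6 subjects
graded = np.array([40, 44, 38, 45, 41, 43])
sce    = np.array([44, 46, 41, 47, 44, 45])
print(bland_altman(graded, sce))   # negative bias: graded protocol gives lower FatMax
```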

  12. Evaluation of a 15-week CHOP protocol for the treatment of canine multicentric lymphoma.

    PubMed

    Burton, J H; Garrett-Mayer, E; Thamm, D H

    2013-12-01

Dose-intense CHOP protocols have been shown to improve outcome for people with non-Hodgkin's lymphoma, but evaluation of dose-intense CHOP protocols for canine lymphoma is currently limited. The hypothesis of this retrospective study was that a 15-week dose-intense CHOP protocol would have a shorter treatment duration with similar efficacy to other doxorubicin-based multidrug protocols. Thirty-one client-owned dogs with multicentric lymphoma were treated with a 15-week CHOP chemotherapy protocol, with an overall response rate of 100% and a median progression-free interval (PFI) of 140 days [95% confidence interval (CI) 91-335 days]. Dogs that had two or more treatment delays had significantly prolonged PFI and overall survival in multivariate analysis. Dose intensity did not correlate with patient outcome. Dogs experiencing multiple treatment delays secondary to adverse events may receive their individual maximally tolerated dose, while dogs with no adverse events may be underdosed. Future studies should focus on individual patient dose optimization. © 2012 Blackwell Publishing Ltd.

  13. Optimal and secure measurement protocols for quantum sensor networks

    NASA Astrophysics Data System (ADS)

    Eldredge, Zachary; Foss-Feig, Michael; Gross, Jonathan A.; Rolston, S. L.; Gorshkov, Alexey V.

    2018-04-01

    Studies of quantum metrology have shown that the use of many-body entangled states can lead to an enhancement in sensitivity when compared with unentangled states. In this paper, we quantify the metrological advantage of entanglement in a setting where the measured quantity is a linear function of parameters individually coupled to each qubit. We first generalize the Heisenberg limit to the measurement of nonlocal observables in a quantum network, deriving a bound based on the multiparameter quantum Fisher information. We then propose measurement protocols that can make use of Greenberger-Horne-Zeilinger (GHZ) states or spin-squeezed states and show that in the case of GHZ states the protocol is optimal, i.e., it saturates our bound. We also identify nanoscale magnetic resonance imaging as a promising setting for this technology.

  14. Compliance with AAPM Practice Guideline 1.a: CT Protocol Management and Review — from the perspective of a university hospital

    PubMed Central

    Bour, Robert K.; Pozniak, Myron; Ranallo, Frank N.

    2015-01-01

    The purpose of this paper is to describe our experience with the AAPM Medical Physics Practice Guideline 1.a: “CT Protocol Management and Review Practice Guideline”. Specifically, we will share how our institution's quality management system addresses the suggestions within the AAPM practice report. We feel this paper is needed as it was beyond the scope of the AAPM practice guideline to provide specific details on fulfilling individual guidelines. Our hope is that other institutions will be able to emulate some of our practices and that this article would encourage other types of centers (e.g., community hospitals) to share their methodology for approaching CT protocol optimization and quality control. Our institution had a functioning CT protocol optimization process, albeit informal, since we began using CT. Recently, we made our protocol development and validation process compliant with a number of the ISO 9001:2008 clauses and this required us to formalize the roles of the members of our CT protocol optimization team. We rely heavily on PACS‐based IT solutions for acquiring radiologist feedback on the performance of our CT protocols and the performance of our CT scanners in terms of dose (scanner output) and the function of the automatic tube current modulation. Specific details on our quality management system covering both quality control and ongoing optimization have been provided. The roles of each CT protocol team member have been defined, and the critical role that IT solutions provides for the management of files and the monitoring of CT protocols has been reviewed. In addition, the invaluable role management provides by being a champion for the project has been explained; lack of a project champion will mitigate the efforts of a CT protocol optimization team. Meeting the guidelines set forth in the AAPM practice guideline was not inherently difficult, but did, in our case, require the cooperation of radiologists, technologists, physicists, IT, administrative staff, and hospital management. Some of the IT solutions presented in this paper are novel and currently unique to our institution. PACS number: 87.57.Q PMID:26103176

  15. Development of a bedside viable ultrasound protocol to quantify appendicular lean tissue mass.

    PubMed

    Paris, Michael T; Lafleur, Benoit; Dubin, Joel A; Mourtzakis, Marina

    2017-10-01

Ultrasound is a non-invasive and readily available tool that can be prospectively applied at the bedside to assess muscle mass in clinical settings. The four-site protocol, which images two anatomical sites on each quadriceps, may be a viable bedside method, but its ability to predict musculature has not been compared against whole-body reference methods. Our primary objectives were to (i) compare the four-site protocol's ability to predict appendicular lean tissue mass from dual-energy X-ray absorptiometry; (ii) optimize the predictability of the four-site protocol with additional anatomical muscle thicknesses and easily obtained covariates; and (iii) assess the ability of the optimized protocol to identify individuals with low lean tissue mass. This observational cross-sectional study recruited 96 university and community dwelling adults. Participants underwent ultrasound scans for assessment of muscle thickness and whole-body dual-energy X-ray absorptiometry scans for assessment of appendicular lean tissue. Ultrasound protocols included (i) the nine-site protocol, which images nine anterior and posterior muscle groups in supine and prone positions, and (ii) the four-site protocol, which images two anterior sites on each quadriceps muscle group in a supine position. The four-site protocol was strongly associated (R2 = 0.72) with appendicular lean tissue mass, but Bland-Altman analysis displayed wide limits of agreement (-5.67, 5.67 kg). Incorporating the anterior upper arm muscle thickness, and covariates age and sex, alongside the four-site protocol, improved the association (R2 = 0.91) with appendicular lean tissue and displayed narrower limits of agreement (-3.18, 3.18 kg). The optimized protocol demonstrated a strong ability to identify low lean tissue mass (area under the curve = 0.89). The four-site protocol can be improved with the addition of the anterior upper arm muscle thickness, sex, and age when predicting appendicular lean tissue mass. This optimized protocol can accurately identify low lean tissue mass, while still being easily applied at the bedside. © 2017 The Authors. Journal of Cachexia, Sarcopenia and Muscle published by John Wiley & Sons Ltd on behalf of the Society on Sarcopenia, Cachexia and Wasting Disorders.

  16. Development of a bedside viable ultrasound protocol to quantify appendicular lean tissue mass

    PubMed Central

    Paris, Michael T.; Lafleur, Benoit; Dubin, Joel A.

    2017-01-01

Abstract Background Ultrasound is a non‐invasive and readily available tool that can be prospectively applied at the bedside to assess muscle mass in clinical settings. The four‐site protocol, which images two anatomical sites on each quadriceps, may be a viable bedside method, but its ability to predict musculature has not been compared against whole‐body reference methods. Our primary objectives were to (i) compare the four‐site protocol's ability to predict appendicular lean tissue mass from dual‐energy X‐ray absorptiometry; (ii) optimize the predictability of the four‐site protocol with additional anatomical muscle thicknesses and easily obtained covariates; and (iii) assess the ability of the optimized protocol to identify individuals with low lean tissue mass. Methods This observational cross‐sectional study recruited 96 university and community dwelling adults. Participants underwent ultrasound scans for assessment of muscle thickness and whole‐body dual‐energy X‐ray absorptiometry scans for assessment of appendicular lean tissue. Ultrasound protocols included (i) the nine‐site protocol, which images nine anterior and posterior muscle groups in supine and prone positions, and (ii) the four‐site protocol, which images two anterior sites on each quadriceps muscle group in a supine position. Results The four‐site protocol was strongly associated (R2 = 0.72) with appendicular lean tissue mass, but Bland–Altman analysis displayed wide limits of agreement (−5.67, 5.67 kg). Incorporating the anterior upper arm muscle thickness, and covariates age and sex, alongside the four‐site protocol, improved the association (R2 = 0.91) with appendicular lean tissue and displayed narrower limits of agreement (−3.18, 3.18 kg). The optimized protocol demonstrated a strong ability to identify low lean tissue mass (area under the curve = 0.89). Conclusions The four‐site protocol can be improved with the addition of the anterior upper arm muscle thickness, sex, and age when predicting appendicular lean tissue mass. This optimized protocol can accurately identify low lean tissue mass, while still being easily applied at the bedside. PMID:28722298
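
    A minimal sketch of the kind of prediction model evaluated above: muscle thicknesses plus age and sex regressed on DXA-derived appendicular lean tissue mass by ordinary least squares, with R2 and Bland-Altman-style limits of agreement computed on the fit. All data, predictor definitions, and coefficients are simulated assumptions, not the study's dataset or published model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 96                                           # cohort size in the study

# Invented predictors: mean four-site thickness (cm), upper-arm thickness (cm),
# age (years), sex (1 = male, 0 = female)
X = np.column_stack([
    rng.normal(3.0, 0.6, n),
    rng.normal(2.5, 0.5, n),
    rng.uniform(20, 80, n),
    rng.integers(0, 2, n),
])
# Invented "true" relationship to appendicular lean tissue mass (kg) plus noise
y = 2.0 + 4.5 * X[:, 0] + 2.0 * X[:, 1] - 0.02 * X[:, 2] + 5.0 * X[:, 3] \
    + rng.normal(0, 1.5, n)

# Ordinary least-squares fit with an intercept
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef

ss_res = np.sum((y - pred) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
print("R^2 =", round(1 - ss_res / ss_tot, 2))

d = y - pred                                     # Bland-Altman style agreement
print("limits of agreement:",
      round(d.mean() - 1.96 * d.std(ddof=1), 2),
      round(d.mean() + 1.96 * d.std(ddof=1), 2))
```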

  17. Impact of uncertain head tissue conductivity in the optimization of transcranial direct current stimulation for an auditory target

    NASA Astrophysics Data System (ADS)

    Schmidt, Christian; Wagner, Sven; Burger, Martin; van Rienen, Ursula; Wolters, Carsten H.

    2015-08-01

    Objective. Transcranial direct current stimulation (tDCS) is a non-invasive brain stimulation technique to modify neural excitability. Using multi-array tDCS, we investigate the influence of inter-individually varying head tissue conductivity profiles on optimal electrode configurations for an auditory cortex stimulation. Approach. In order to quantify the uncertainty of the optimal electrode configurations, multi-variate generalized polynomial chaos expansions of the model solutions are used based on uncertain conductivity profiles of the compartments skin, skull, gray matter, and white matter. Stochastic measures, probability density functions, and sensitivity of the quantities of interest are investigated for each electrode and the current density at the target with the resulting stimulation protocols visualized on the head surface. Main results. We demonstrate that the optimized stimulation protocols are only comprised of a few active electrodes, with tolerable deviations in the stimulation amplitude of the anode. However, large deviations in the order of the uncertainty in the conductivity profiles could be noted in the stimulation protocol of the compensating cathodes. Regarding these main stimulation electrodes, the stimulation protocol was most sensitive to uncertainty in skull conductivity. Finally, the probability that the current density amplitude in the auditory cortex target region is supra-threshold was below 50%. Significance. The results suggest that an uncertain conductivity profile in computational models of tDCS can have a substantial influence on the prediction of optimal stimulation protocols for stimulation of the auditory cortex. The investigations carried out in this study present a possibility to predict the probability of providing a therapeutic effect with an optimized electrode system for future auditory clinical and experimental procedures of tDCS applications.

  18. Optimization of Delayed Tolerance Induction in Swine: A Clinically-Relevant Protocol for Immunosuppression-Free Vascularized Composite Allotransplantation

    DTIC Science & Technology

    2017-10-01

…been achieved in nonhuman primates (NHPs) using the delayed period protocol, i.e., a combination of post-transplant non-myeloablative conditioning and…

  19. Rapid Design of Knowledge-Based Scoring Potentials for Enrichment of Near-Native Geometries in Protein-Protein Docking.

    PubMed

    Sasse, Alexander; de Vries, Sjoerd J; Schindler, Christina E M; de Beauchêne, Isaure Chauvot; Zacharias, Martin

    2017-01-01

Protein-protein docking protocols aim to predict the structures of protein-protein complexes based on the structure of individual partners. Docking protocols usually include several steps of sampling, clustering, refinement and re-scoring. The scoring step is one of the bottlenecks in the performance of many state-of-the-art protocols. The performance of scoring functions depends on the quality of the generated structures and its coupling to the sampling algorithm. A tool kit, GRADSCOPT (GRid Accelerated Directly SCoring OPTimizing), was designed to allow rapid development and optimization of different knowledge-based scoring potentials for specific objectives in protein-protein docking. Different atomistic and coarse-grained potentials can be created by a grid-accelerated directly scoring dependent Monte-Carlo annealing or by a linear regression optimization. We demonstrate that the scoring functions generated by our approach are similar to or even outperform state-of-the-art scoring functions for predicting near-native solutions. Of additional importance, we find that potentials specifically trained to identify the native bound complex perform rather poorly on identifying acceptable or medium quality (near-native) solutions. In contrast, atomistic long-range contact potentials can increase the average fraction of near-native poses by up to a factor of 2.5 among the best-scored 1% of decoys (compared to existing scoring), emphasizing the need for specific docking potentials for different steps in the docking protocol.
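
    The linear-regression route mentioned above can be pictured as fitting per-contact-type weights so that a weighted sum of contact counts reproduces a target docking-quality score over training decoys. The sketch below is a generic illustration with simulated data; it is not the GRADSCOPT implementation, and the feature definitions are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n_decoys, n_contact_types = 500, 20

# Invented training data: counts of each residue-residue contact type per decoy
contacts = rng.poisson(3.0, size=(n_decoys, n_contact_types)).astype(float)
# Invented target: a docking-quality score the potential should reproduce
true_w = rng.normal(0, 1, n_contact_types)
quality = contacts @ true_w + rng.normal(0, 0.5, n_decoys)

# Fit contact-type weights by least squares -> a linear scoring potential
weights, *_ = np.linalg.lstsq(contacts, quality, rcond=None)

def score(decoy_contacts, w=weights):
    """Score a decoy as the weighted sum of its contact-type counts."""
    return float(decoy_contacts @ w)

print(score(contacts[0]), quality[0])   # fitted score vs. training target
```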

  20. Evaluation of telomere length in human cardiac tissues using cardiac quantitative FISH.

    PubMed

    Sharifi-Sanjani, Maryam; Meeker, Alan K; Mourkioti, Foteini

    2017-09-01

Telomere length has been correlated with various diseases, including cardiovascular disease and cancer. The use of currently available telomere-length measurement techniques is often restricted by the requirement of a large amount of cells (Southern-based techniques) or the lack of information on individual cells or telomeres (PCR-based methods). Although several methods have been used to measure telomere length in tissues as a whole, the assessment of cell-type-specific telomere length provides valuable information on individual cell types. The development of fluorescence in situ hybridization (FISH) technologies enables the quantification of telomeres in individual chromosomes, but the use of these methods is dependent on the availability of isolated cells, which prevents their use with fixed archival samples. Here we describe an optimized quantitative FISH (Q-FISH) protocol for measuring telomere length that bypasses the previous limitations by avoiding contributions from undesired cell types. We have used this protocol on small paraffin-embedded cardiac-tissue samples. This protocol describes step-by-step procedures for tissue preparation, permeabilization, cardiac-tissue pretreatment and hybridization with a Cy3-labeled telomeric repeat-complementary (CCCTAA)3 peptide nucleic acid (PNA) probe coupled with cardiac-specific antibody staining. We also describe how to quantify telomere length by means of the fluorescence intensity and area of each telomere within individual nuclei. This protocol provides comparative cell-type-specific telomere-length measurements in relatively small human cardiac samples and offers an attractive technique to test hypotheses implicating telomere length in various cardiac pathologies. The current protocol (from tissue collection to image procurement) takes ∼28 h along with three overnight incubations. We anticipate that the protocol could be easily adapted for use on different tissue types.
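
    The per-telomere quantification step (fluorescence intensity and area of each spot within a nucleus) could look roughly like the sketch below, which uses scikit-image connected-component labelling on a thresholded telomere channel. The threshold, toy image, and the intensity-times-area summary are assumptions for illustration, not the published image-analysis procedure.

```python
import numpy as np
from skimage import measure

def telomere_signals(telomere_channel, nucleus_mask, threshold):
    """Per-telomere area and mean fluorescence intensity inside one nucleus."""
    spots = (telomere_channel > threshold) & nucleus_mask
    labels = measure.label(spots)
    props = measure.regionprops(labels, intensity_image=telomere_channel)
    return [(p.area, p.mean_intensity) for p in props]

# Invented toy image: two bright "telomere" spots inside a circular "nucleus"
img = np.zeros((64, 64))
img[20:23, 20:23] = 200.0
img[40:44, 30:33] = 150.0
yy, xx = np.mgrid[:64, :64]
nucleus = (yy - 32) ** 2 + (xx - 32) ** 2 < 30 ** 2
for area, mean_int in telomere_signals(img, nucleus, threshold=50):
    print("spot:", area, "px, mean intensity", mean_int, "signal:", area * mean_int)
```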

1. Granulocyte-colony stimulating factor in the prevention of postoperative infectious complications and sub-optimal recovery from operation in patients with colorectal cancer and increased preoperative risk (ASA 3 and 4). Protocol of a controlled clinical trial developed by consensus of an international study group. Part three: individual patient, complication algorithm and quality management.

    PubMed

    Stinner, B; Bauhofer, A; Lorenz, W; Rothmund, M; Plaul, U; Torossian, A; Celik, I; Sitter, H; Koller, M; Black, A; Duda, D; Encke, A; Greger, B; van Goor, H; Hanisch, E; Hesterberg, R; Klose, K J; Lacaine, F; Lorijn, R H; Margolis, C; Neugebauer, E; Nyström, P O; Reemst, P H; Schein, M; Solovera, J

    2001-05-01

Presentation of a new type of study protocol for evaluation of the effectiveness of an immune modifier (rhG-CSF, filgrastim): prevention of postoperative infectious complications and of sub-optimal recovery from operation in patients with colorectal cancer and increased preoperative risk (ASA 3 and 4). A randomised, placebo controlled, double-blinded, single-centre study is performed at a university hospital (n = 40 patients for each group). This part presents the course of the individual patient and a complication algorithm for the management of anastomotic leakage and quality management. In part three of the protocol, the three major sections include: The course of the individual patient using a comprehensive graphic display, including the perioperative period, hospital stay and post discharge outcome. A center based clinical practice guideline for the management of the most important postoperative complication--anastomotic leakage--including evidence based support for each step of the algorithm. Data management, ethics and organisational structure. Future studies with immune modifiers will also fail if they are not better structured (reduction of variance) to achieve uniform patient management in a complex clinical scenario. This new type of single-centre trial aims to reduce the gap between animal experiments and clinical trials or--if it fails--at least to demonstrate new ways of explaining the failures.

  2. Numerical simulation of the optimal two-mode attacks for two-way continuous-variable quantum cryptography in reverse reconciliation

    NASA Astrophysics Data System (ADS)

    Zhang, Yichen; Li, Zhengyu; Zhao, Yijia; Yu, Song; Guo, Hong

    2017-02-01

We analyze the security of the two-way continuous-variable quantum key distribution protocol in reverse reconciliation against general two-mode attacks, which represent all accessible attacks at fixed channel parameters. Rather than considering one specific attack model, expressions for the secret key rate of the two-way protocol are derived against all accessible attack models. It is found that there is an optimal two-mode attack that minimizes the performance of the protocol in terms of both secret key rate and maximal transmission distance. We identify the optimal two-mode attack, give its specific attack model, and show the performance of the two-way protocol against it. Even under the optimal two-mode attack, the performance of the two-way protocol is still better than that of the corresponding one-way protocol, which shows the advantage of making double use of the quantum channel and the potential of long-distance secure communication using a two-way protocol.

  3. Efficient Mobility Management Signalling in Network Mobility Supported PMIPV6

    PubMed Central

    Jebaseeli Samuelraj, Ananthi; Jayapal, Sundararajan

    2015-01-01

Proxy Mobile IPV6 (PMIPV6) is a network based mobility management protocol which supports a node's mobility without the contribution from the respective mobile node. PMIPV6 was initially designed to support individual node mobility and it should be enhanced to support mobile network movement. NEMO-BSP is an existing protocol to support network mobility (NEMO) in PMIPV6 networks. Due to the underlying differences in basic protocols, NEMO-BSP cannot be directly applied to PMIPV6 networks. Mobility management signaling and the data structures used for an individual node's mobility should be modified to support group nodes' mobility management efficiently. Though a lot of research work is in progress to implement mobile network movement in PMIPV6, it is not yet standardized and each approach suffers from different shortcomings. This research work proposes modifications in NEMO-BSP and PMIPV6 to achieve NEMO support in PMIPV6. It mainly concentrates on optimizing the number and size of mobility signaling messages exchanged while a mobile network or mobile network node changes its access point. PMID:26366431

  4. Efficient Mobility Management Signalling in Network Mobility Supported PMIPV6.

    PubMed

    Samuelraj, Ananthi Jebaseeli; Jayapal, Sundararajan

    2015-01-01

Proxy Mobile IPV6 (PMIPV6) is a network based mobility management protocol which supports a node's mobility without the contribution from the respective mobile node. PMIPV6 was initially designed to support individual node mobility and it should be enhanced to support mobile network movement. NEMO-BSP is an existing protocol to support network mobility (NEMO) in PMIPV6 networks. Due to the underlying differences in basic protocols, NEMO-BSP cannot be directly applied to PMIPV6 networks. Mobility management signaling and the data structures used for an individual node's mobility should be modified to support group nodes' mobility management efficiently. Though a lot of research work is in progress to implement mobile network movement in PMIPV6, it is not yet standardized and each approach suffers from different shortcomings. This research work proposes modifications in NEMO-BSP and PMIPV6 to achieve NEMO support in PMIPV6. It mainly concentrates on optimizing the number and size of mobility signaling messages exchanged while a mobile network or mobile network node changes its access point.

5. CRISPR-mediated Gene Targeting of Human Induced Pluripotent Stem Cells.

    PubMed

    Byrne, Susan M; Church, George M

    2015-01-01

CRISPR/Cas9 nuclease systems can create double-stranded DNA breaks at specific sequences to efficiently and precisely disrupt, excise, mutate, insert, or replace genes. However, human embryonic stem or induced pluripotent stem cells (iPSCs) are more difficult to transfect and less resilient to DNA damage than immortalized tumor cell lines. Here, we describe an optimized protocol for genome engineering of human iPSCs using a simple transient transfection of plasmids and/or single-stranded oligonucleotides. With this protocol, we achieve transfection efficiencies greater than 60%, with gene disruption efficiencies of 1-25% and gene insertion/replacement efficiencies of 0.5-10% without any further selection or enrichment steps. We also describe how to design and assess optimal sgRNA target sites and donor targeting vectors, how to clone individual iPSCs by single-cell FACS sorting, and how to genotype successfully edited cells.

  6. Phase Transition in Protocols Minimizing Work Fluctuations

    NASA Astrophysics Data System (ADS)

    Solon, Alexandre P.; Horowitz, Jordan M.

    2018-05-01

    For two canonical examples of driven mesoscopic systems—a harmonically trapped Brownian particle and a quantum dot—we numerically determine the finite-time protocols that optimize the compromise between the standard deviation and the mean of the dissipated work. In the case of the oscillator, we observe a collection of protocols that smoothly trade off between average work and its fluctuations. However, for the quantum dot, we find that as we shift the weight of our optimization objective from average work to work standard deviation, there is an analog of a first-order phase transition in protocol space: two distinct protocols exchange global optimality with mixed protocols akin to phase coexistence. As a result, the two types of protocols possess qualitatively different properties and remain distinct even in the infinite duration limit: optimal-work-fluctuation protocols never coalesce with the minimal-work protocols, which therefore never become quasistatic.

  7. SU-F-18C-01: Minimum Detectability Analysis for Comprehensive Sized Based Optimization of Image Quality and Radiation Dose Across CT Protocols

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smitherman, C; Chen, B; Samei, E

    2014-06-15

Purpose: This work involved a comprehensive modeling of task-based performance of CT across a wide range of protocols. The approach was used for optimization and consistency of dose and image quality within a large multi-vendor clinical facility. Methods: 150 adult protocols from the Duke University Medical Center were grouped into sub-protocols with similar acquisition characteristics. A size based image quality phantom (Duke Mercury Phantom) was imaged using these sub-protocols for a range of clinically relevant doses on two CT manufacturer platforms (Siemens, GE). The images were analyzed to extract task-based image quality metrics such as the Task Transfer Function (TTF), Noise Power Spectrum, and Az based on designer nodule task functions. The data were analyzed in terms of the detectability of a lesion size/contrast as a function of dose, patient size, and protocol. A graphical user interface (GUI) was developed to predict image quality and dose to achieve a minimum level of detectability. Results: Image quality trends with variations in dose, patient size, and lesion contrast/size were evaluated and calculated data behaved as predicted. The GUI proved effective to predict the Az values representing radiologist confidence for a targeted lesion, patient size, and dose. As an example, an abdomen pelvis exam for the GE scanner, with a task size/contrast of 5-mm/50-HU, and an Az of 0.9 requires a dose of 4.0, 8.9, and 16.9 mGy for patient diameters of 25, 30, and 35 cm, respectively. For a constant patient diameter of 30 cm, the minimum detected lesion size at those dose levels would be 8.4, 5, and 3.9 mm, respectively. Conclusion: The designed CT protocol optimization platform can be used to evaluate minimum detectability across dose levels and patient diameters. The method can be used to improve individual protocols as well as to improve protocol consistency across CT scanners.
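
    The dose lookup the GUI performs can be pictured as interpolation over tabulated operating points. The sketch below uses only the example numbers quoted above (GE scanner, 5-mm/50-HU task, Az = 0.9); the linear interpolation and the function name are assumptions, not the actual tool.

```python
import numpy as np

# Example values quoted in the abstract: dose needed for Az = 0.9, 5-mm/50-HU task
diameters_cm = np.array([25.0, 30.0, 35.0])
dose_mGy     = np.array([4.0, 8.9, 16.9])

def required_dose(patient_diameter_cm):
    """Interpolated dose (mGy) to keep Az = 0.9 for this task and patient size."""
    return float(np.interp(patient_diameter_cm, diameters_cm, dose_mGy))

print(required_dose(28.0))   # dose for a 28-cm patient, linearly interpolated
```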

  8. Active SAmpling Protocol (ASAP) to Optimize Individual Neurocognitive Hypothesis Testing: A BCI-Inspired Dynamic Experimental Design.

    PubMed

    Sanchez, Gaëtan; Lecaignard, Françoise; Otman, Anatole; Maby, Emmanuel; Mattout, Jérémie

    2016-01-01

    The relatively young field of Brain-Computer Interfaces has promoted the use of electrophysiology and neuroimaging in real-time. In the meantime, cognitive neuroscience studies, which make extensive use of functional exploration techniques, have evolved toward model-based experiments and fine hypothesis testing protocols. Although these two developments are mostly unrelated, we argue that, brought together, they may trigger an important shift in the way experimental paradigms are being designed, which should prove fruitful to both endeavors. This change simply consists in using real-time neuroimaging in order to optimize advanced neurocognitive hypothesis testing. We refer to this new approach as the instantiation of an Active SAmpling Protocol (ASAP). As opposed to classical (static) experimental protocols, ASAP implements online model comparison, enabling the optimization of design parameters (e.g., stimuli) during the course of data acquisition. This follows the well-known principle of sequential hypothesis testing. What is radically new, however, is our ability to perform online processing of the huge amount of complex data that brain imaging techniques provide. This is all the more relevant at a time when physiological and psychological processes are beginning to be approached using more realistic, generative models which may be difficult to tease apart empirically. Based upon Bayesian inference, ASAP proposes a generic and principled way to optimize experimental design adaptively. In this perspective paper, we summarize the main steps in ASAP. Using synthetic data we illustrate its superiority in selecting the right perceptual model compared to a classical design. Finally, we briefly discuss its future potential for basic and clinical neuroscience as well as some remaining challenges.
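
    At its core, the sequential hypothesis testing that ASAP builds on can be sketched as an online update of each candidate model's log evidence, with acquisition stopping (or the next stimulus being chosen) once the Bayes factor is decisive. The toy likelihoods, stimulus set, and stopping threshold below are illustrative assumptions and do not reproduce the ASAP algorithm or its Bayesian design optimization.

```python
import math
import random

# Two candidate "perceptual models" expressed as response likelihoods per stimulus.
# These numbers are invented for illustration.
def lik_model_a(response, stimulus):
    p = 0.2 + 0.6 * stimulus           # detection probability rises with stimulus
    return p if response else 1.0 - p

def lik_model_b(response, stimulus):
    p = 0.5                            # stimulus-independent guessing model
    return p if response else 1.0 - p

random.seed(0)
log_ev = {"A": 0.0, "B": 0.0}
for trial in range(200):
    stimulus = random.choice([0.1, 0.5, 0.9])          # ASAP would pick this adaptively
    response = random.random() < 0.2 + 0.6 * stimulus  # data generated by model A
    log_ev["A"] += math.log(lik_model_a(response, stimulus))
    log_ev["B"] += math.log(lik_model_b(response, stimulus))
    log_bayes_factor = log_ev["A"] - log_ev["B"]
    if abs(log_bayes_factor) > math.log(100):          # strong evidence -> stop early
        print(f"stopped after {trial + 1} trials, log BF = {log_bayes_factor:.1f}")
        break
```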

  9. Individual, social and environmental determinants of sleep among women: Protocol for a systematic review and meta-analysis

    USDA-ARS?s Scientific Manuscript database

    Sleep is important to promote optimal health and avoid negative health outcomes. Short-duration and low-quality sleep may be more common and more detrimental among women compared with men. Identifying the determinants of behaviour is one of the first steps in designing effective interventions. To ou...

  10. A conceptual model for worksite intelligent physical exercise training--IPET--intervention for decreasing life style health risk indicators among employees: a randomized controlled trial.

    PubMed

    Sjøgaard, Gisela; Justesen, Just Bendix; Murray, Mike; Dalager, Tina; Søgaard, Karen

    2014-06-26

Health promotion at the work site in terms of physical activity has proven positive effects, but optimization of relevant exercise training protocols and implementation for high adherence are still scanty. The aim of this paper is to present a study protocol with a conceptual model for planning the optimal individually tailored physical exercise training for each worker based on individual health check, existing guidelines and state of the art sports science training recommendations in the broad categories of cardiorespiratory fitness, muscle strength in specific body parts, and functional training including balance training. The hypotheses of this research are that individually tailored worksite-based intelligent physical exercise training, IPET, among workers with inactive job categories will: 1) Improve cardiorespiratory fitness and/or individual health risk indicators, 2) Improve muscle strength and decrease musculoskeletal disorders, 3) Succeed in regular adherence to worksite and leisure physical activity training, and 4) Reduce sickness absence and productivity losses (presenteeism) in office workers. The present RCT study enrolled almost 400 employees with sedentary jobs in the private as well as public sectors. The training interventions last 2 years with measures at baseline as well as one and two years follow-up. If proven effective, the intelligent physical exercise training scheduled as well as the information for its practical implementation can provide meaningful scientifically based information for public health policy. ClinicalTrials.gov, number: NCT01366950.

  11. A conceptual model for worksite intelligent physical exercise training - IPET - intervention for decreasing life style health risk indicators among employees: a randomized controlled trial

    PubMed Central

    2014-01-01

Background Health promotion at the work site in terms of physical activity has proven positive effects, but optimization of relevant exercise training protocols and implementation for high adherence are still scanty. Methods/Design The aim of this paper is to present a study protocol with a conceptual model for planning the optimal individually tailored physical exercise training for each worker based on individual health check, existing guidelines and state of the art sports science training recommendations in the broad categories of cardiorespiratory fitness, muscle strength in specific body parts, and functional training including balance training. The hypotheses of this research are that individually tailored worksite-based intelligent physical exercise training, IPET, among workers with inactive job categories will: 1) Improve cardiorespiratory fitness and/or individual health risk indicators, 2) Improve muscle strength and decrease musculoskeletal disorders, 3) Succeed in regular adherence to worksite and leisure physical activity training, and 4) Reduce sickness absence and productivity losses (presenteeism) in office workers. The present RCT study enrolled almost 400 employees with sedentary jobs in the private as well as public sectors. The training interventions last 2 years with measures at baseline as well as one and two years follow-up. Discussion If proven effective, the intelligent physical exercise training scheduled as well as the information for its practical implementation can provide meaningful scientifically based information for public health policy. Trial Registration ClinicalTrials.gov, number: NCT01366950. PMID:24964869

  12. Outcomes of Optimized over Standard Protocol of Rabbit Antithymocyte Globulin for Severe Aplastic Anemia: A Single-Center Experience

    PubMed Central

    Ge, Meili; Shao, Yingqi; Huang, Jinbo; Huang, Zhendong; Zhang, Jing; Nie, Neng; Zheng, Yizhou

    2013-01-01

Background Previous reports showed that the outcome of rabbit antithymocyte globulin (rATG) as first-line therapy for severe aplastic anemia (SAA) was not satisfactory. We explored a modified schedule of administration of rATG. Design and Methods Outcomes of a cohort of 175 SAA patients, including 51 patients administered the standard protocol (3.55 mg/kg/d for 5 days) and 124 cases given the optimized protocol (1.97 mg/kg/d for 9 days) of rATG plus cyclosporine (CSA), were analyzed retrospectively. Results Of all 175 patients, response rates at 3 and 6 months were 36.6% and 56.0%, respectively. The 51 cases that received the standard protocol had poor responses at 3 (25.5%) and 6 months (41.2%). However, the 124 patients who received the optimized protocol had better responses at 3 (41.1%, P = 0.14) and 6 months (62.1%, P = 0.01). Higher incidences of infection (57.1% versus 37.9%, P = 0.02) and early mortality (17.9% versus 0.8%, P<0.001) occurred in patients who received the standard protocol compared with the optimized protocol. The 5-year overall survival favored the optimized over the standard rATG protocol (76.0% versus 50.3%, P<0.001). By multivariate analysis, the optimized protocol (RR = 2.21, P = 0.04), response at 3 months (RR = 10.31, P = 0.03) and a shorter interval (<23 days) between diagnosis and the initial dose of rATG (RR = 5.35, P = 0.002) were independent favorable predictors of overall survival. Conclusions The optimized rather than the standard rATG protocol in combination with CSA remained efficacious as a first-line immunosuppressive regimen for SAA. PMID:23554855
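
    A quick arithmetic check on the two schedules reported above shows that they deliver essentially the same cumulative rATG dose per kilogram, just spread over more days at a lower daily intensity:

```python
standard  = 3.55 * 5   # mg/kg/day x 5 days  = 17.75 mg/kg total
optimized = 1.97 * 9   # mg/kg/day x 9 days  = 17.73 mg/kg total
print(standard, optimized)   # nearly identical cumulative dose
```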

  13. Photonic quantum simulator for unbiased phase covariant cloning

    NASA Astrophysics Data System (ADS)

    Knoll, Laura T.; López Grande, Ignacio H.; Larotonda, Miguel A.

    2018-01-01

    We present the results of a linear optics photonic implementation of a quantum circuit that simulates a phase covariant cloner, using two different degrees of freedom of a single photon. We experimentally simulate the action of two mirrored 1→ 2 cloners, each of them biasing the cloned states into opposite regions of the Bloch sphere. We show that by applying a random sequence of these two cloners, an eavesdropper can mitigate the amount of noise added to the original input state and therefore, prepare clones with no bias, but with the same individual fidelity, masking its presence in a quantum key distribution protocol. Input polarization qubit states are cloned into path qubit states of the same photon, which is identified as a potential eavesdropper in a quantum key distribution protocol. The device has the flexibility to produce mirrored versions that optimally clone states on either the northern or southern hemispheres of the Bloch sphere, as well as to simulate optimal and non-optimal cloning machines by tuning the asymmetry on each of the cloning machines.

  14. Optimization of a sample processing protocol for recovery of Bacillus anthracis spores from soil

    USGS Publications Warehouse

    Silvestri, Erin E.; Feldhake, David; Griffin, Dale; Lisle, John T.; Nichols, Tonya L.; Shah, Sanjiv; Pemberton, A; Schaefer III, Frank W

    2016-01-01

    Following a release of Bacillus anthracis spores into the environment, there is a potential for lasting environmental contamination in soils. There is a need for detection protocols for B. anthracis in environmental matrices. However, identification of B. anthracis within a soil is a difficult task. Processing soil samples helps to remove debris, chemical components, and biological impurities that can interfere with microbiological detection. This study aimed to optimize a previously used indirect processing protocol, which included a series of washing and centrifugation steps. Optimization of the protocol included: identifying an ideal extraction diluent, variation in the number of wash steps, variation in the initial centrifugation speed, sonication and shaking mechanisms. The optimized protocol was demonstrated at two laboratories in order to evaluate the recovery of spores from loamy and sandy soils. The new protocol demonstrated an improved limit of detection for loamy and sandy soils over the non-optimized protocol with an approximate matrix limit of detection at 14 spores/g of soil. There were no significant differences overall between the two laboratories for either soil type, suggesting that the processing protocol will be robust enough to use at multiple laboratories while achieving comparable recoveries.

  15. Optimization of stent implantation using a high pressure inflation protocol.

    PubMed

    Vallurupalli, Srikanth; Bahia, Amit; Ruiz-Rodriguez, Ernesto; Ahmed, Zubair; Hakeem, Abdul; Uretsky, Barry F

    2016-01-01

High-pressure inflation is the universal standard for stent deployment but a specific protocol for its use is lacking. We developed a standardized "pressure optimization protocol" (POP) using time to inflation pressure stability as an endpoint for determining the required duration of stent inflation. The primary study purpose was to determine the stent inflation time (IT) in a large patient cohort using the standardized inflation protocol, to correlate various patient and lesion characteristics with IT, and ascertain in an in vitro study the time for pressure accommodation within an inflation system. Six hundred fifteen stent implants in 435 patients were studied. Multivariate analysis was performed to determine predictors of longer ITs. In an in vitro study, various stents and balloons were inflated in air to determine the pressure accommodation time of the inflation system. The mean stent IT was 104 ± 41 sec (range 30-380 sec). Stent length was the only predictor of prolonged stent inflation. The "accommodation time" in vitro of the stent inflation system itself was 33 ± 24 sec. The protocol was safe requiring premature inflation termination in <3% of stent implants. No serious adverse events occurred. Achieving stable inflation pressure requires on average over 100 sec and may require several minutes in individual cases. Stent length increases IT. These results suggest that the widespread practice of rapid inflation/deflation may not be sufficient to fully expand the stent and that the use of a pressure stability protocol will allow for safe, predictable, and more complete stent deployment. © 2015 Wiley Periodicals, Inc.
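
    One way to operationalize the "time to inflation pressure stability" endpoint is to monitor a sampled indeflator pressure trace and stop once the pressure range over a trailing window falls below a tolerance, as in the hedged sketch below. The sampling rate, window length, and tolerance are assumptions, not the published POP criterion.

```python
import numpy as np

def time_to_pressure_stability(pressure_atm, fs_hz=1.0, window_s=10.0, tol_atm=0.1):
    """Seconds until pressure stays within tol_atm over a trailing window."""
    w = int(window_s * fs_hz)
    for i in range(w, len(pressure_atm)):
        window = pressure_atm[i - w:i]
        if window.max() - window.min() < tol_atm:
            return i / fs_hz
    return None   # never stabilized within the recording

# Invented trace: pressure creeping toward 14 atm as the stent bed accommodates
t = np.arange(0, 180, 1.0)
trace = 14.0 - 4.0 * np.exp(-t / 40.0)
print(time_to_pressure_stability(trace))   # on the order of the ~100 s mean IT
```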

  16. What is the optimal way to prepare a Bell state using measurement and feedback?

    NASA Astrophysics Data System (ADS)

    Martin, Leigh; Sayrafi, Mahrud; Whaley, K. Birgitta

    2017-12-01

    Recent work has shown that the use of quantum feedback can significantly enhance both the speed and success rate of measurement-based remote entanglement generation, but it is generally unknown what feedback protocols are optimal for these tasks. Here we consider two common measurements that are capable of projecting into pairwise entangled states, namely half- and full-parity measurements of two qubits, and determine in each case a globally optimal protocol for generation of entanglement. For the half-parity measurement, we rederive a previously described protocol using more general methods and prove that it is globally optimal for several figures of merit, including maximal concurrence or fidelity and minimal time to reach a specified concurrence or fidelity. For the full-parity measurement, we derive a protocol for rapid entanglement generation related to that of (Hill, Ralph, Phys. Rev. A 77, 014305), and then map the dynamics of the concurrence of the state to the Bloch vector length of an effective qubit. This mapping allows us to prove several optimality results for feedback protocols with full-parity measurements. We further show that our full-parity protocol transfers entanglement optimally from one qubit to the other amongst all measurement-based schemes. The methods developed here will be useful for deriving feedback protocols and determining their optimality properties in many other quantum systems subject to measurement and unitary operations.
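
    A minimal sketch of the concurrence used above to quantify two-qubit entanglement, implemented with the generic Wootters formula for an arbitrary density matrix; this is a textbook computation, not the authors' specific mapping of concurrence to an effective Bloch-vector length.

```python
import numpy as np

def concurrence(rho: np.ndarray) -> float:
    """Wootters concurrence of a two-qubit density matrix rho (4x4)."""
    sy = np.array([[0, -1j], [1j, 0]])
    Y = np.kron(sy, sy)
    rho_tilde = Y @ rho.conj() @ Y
    # Square roots of the eigenvalues of rho * rho_tilde, in decreasing order.
    lam = np.sort(np.sqrt(np.abs(np.linalg.eigvals(rho @ rho_tilde))))[::-1]
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

# Bell state |Phi+> has concurrence 1; a product state has concurrence 0.
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)
print(concurrence(np.outer(phi_plus, phi_plus.conj())))   # ~1.0
prod = np.zeros(4); prod[0] = 1.0
print(concurrence(np.outer(prod, prod.conj())))           # 0.0
```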

  17. Evaluation of commercial DNA and RNA extraction methods for high-throughput sequencing of FFPE samples.

    PubMed

    Kresse, Stine H; Namløs, Heidi M; Lorenz, Susanne; Berner, Jeanne-Marie; Myklebost, Ola; Bjerkehagen, Bodil; Meza-Zepeda, Leonardo A

    2018-01-01

    Nucleic acid material of adequate quality is crucial for successful high-throughput sequencing (HTS) analysis. DNA and RNA isolated from archival FFPE material are frequently degraded and not readily amplifiable due to chemical damage introduced during fixation. To identify optimal nucleic acid extraction kits, DNA and RNA quantity, quality and performance in HTS applications were evaluated. DNA and RNA were isolated from five sarcoma archival FFPE blocks, using eight extraction protocols from seven kits from three different commercial vendors. For DNA extraction, the truXTRAC FFPE DNA kit from Covaris gave higher yields and better amplifiable DNA, but all protocols gave comparable HTS library yields using Agilent SureSelect XT and performed well in downstream variant calling. For RNA extraction, all protocols gave comparable yields and amplifiable RNA. However, for fusion gene detection using the Archer FusionPlex Sarcoma Assay, the truXTRAC FFPE RNA kit from Covaris and Agencourt FormaPure kit from Beckman Coulter showed the highest percentage of unique read-pairs, providing higher complexity of HTS data and more frequent detection of recurrent fusion genes. truXTRAC simultaneous DNA and RNA extraction gave similar outputs as individual protocols. These findings show that although successful HTS libraries could be generated in most cases, the different protocols gave variable quantity and quality for FFPE nucleic acid extraction. Selecting the optimal procedure is highly valuable and may generate results in borderline quality specimens.

  18. Automatic CT simulation optimization for radiation therapy: A general strategy.

    PubMed

    Li, Hua; Yu, Lifeng; Anastasio, Mark A; Chen, Hsin-Chen; Tan, Jun; Gay, Hiram; Michalski, Jeff M; Low, Daniel A; Mutic, Sasa

    2014-03-01

    In radiation therapy, x-ray computed tomography (CT) simulation protocol specifications should be driven by the treatment planning requirements in lieu of duplicating diagnostic CT screening protocols. The purpose of this study was to develop a general strategy that allows for automatically, prospectively, and objectively determining the optimal patient-specific CT simulation protocols based on radiation-therapy goals, namely, maintenance of contouring quality and integrity while minimizing patient CT simulation dose. The authors proposed a general prediction strategy that provides automatic optimal CT simulation protocol selection as a function of patient size and treatment planning task. The optimal protocol is the one that delivers the minimum dose required to provide a CT simulation scan that yields accurate contours. Accurate treatment plans depend on accurate contours in order to conform the dose to actual tumor and normal organ positions. An image quality index, defined to characterize how simulation scan quality affects contour delineation, was developed and used to benchmark the contouring accuracy and treatment plan quality within the prediction strategy. A clinical workflow was developed to select the optimal CT simulation protocols incorporating patient size, target delineation, and radiation dose efficiency. An experimental study using an anthropomorphic pelvis phantom with added-bolus layers was used to demonstrate how the proposed prediction strategy could be implemented and how the optimal CT simulation protocols could be selected for prostate cancer patients based on patient size and treatment planning task. Clinical IMRT prostate treatment plans for seven CT scans with varied image quality indices were separately optimized and compared to verify the trace of target and organ dosimetry coverage. Based on the phantom study, the optimal image quality index for accurate manual prostate contouring was 4.4. The optimal tube potentials for patient sizes of 38, 43, 48, 53, and 58 cm were 120, 140, 140, 140, and 140 kVp, respectively, and the corresponding minimum CTDIvol values for achieving the optimal image quality index of 4.4 were 9.8, 32.2, 100.9, 241.4, and 274.1 mGy, respectively. For patients with lateral sizes of 43-58 cm, 120-kVp scan protocols yielded up to 165% greater radiation dose relative to 140-kVp protocols, and 140-kVp protocols always yielded a greater image quality index compared to the same dose-level 120-kVp protocols. The trace of target and organ dosimetry coverage and the γ passing rates of seven IMRT dose distribution pairs indicated the feasibility of the proposed image quality index for the prediction strategy. A general strategy to predict the optimal CT simulation protocols in a flexible and quantitative way was developed that takes into account patient size, treatment planning task, and radiation dose. The experimental study indicated that the optimal CT simulation protocol and the corresponding radiation dose varied significantly for different patient sizes, contouring accuracy, and radiation treatment planning tasks.
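
    A minimal sketch of protocol selection from the phantom results quoted above (lateral size mapped to tube potential and the minimum CTDIvol that reaches image quality index 4.4). The table simply encodes the reported values; rounding a patient up to the next tabulated size is an illustrative policy choice, not part of the published workflow.

```python
import bisect

# lateral size (cm): (tube potential in kVp, minimum CTDIvol in mGy) from the phantom study
PHANTOM_TABLE = {38: (120, 9.8), 43: (140, 32.2), 48: (140, 100.9),
                 53: (140, 241.4), 58: (140, 274.1)}

def select_protocol(lateral_size_cm: float):
    """Pick the smallest tabulated size that is >= the patient's lateral size."""
    sizes = sorted(PHANTOM_TABLE)
    idx = min(bisect.bisect_left(sizes, lateral_size_cm), len(sizes) - 1)
    return sizes[idx], PHANTOM_TABLE[sizes[idx]]

print(select_protocol(45))   # -> (48, (140, 100.9)) for a 45 cm patient
```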

  19. Optimization of wireless sensor networks based on chicken swarm optimization algorithm

    NASA Astrophysics Data System (ADS)

    Wang, Qingxi; Zhu, Lihua

    2017-05-01

    In order to reduce the energy consumption of wireless sensor networks and improve network survival time, a clustering routing protocol for wireless sensor networks based on the chicken swarm optimization algorithm is proposed. Building on the LEACH protocol, cluster formation and cluster-head selection are improved using the chicken swarm optimization algorithm, and the positions of individuals that fall into local optima are updated by Levy flight, which enhances population diversity and preserves the global search capability of the algorithm. The new protocol makes balanced use of the network nodes, avoiding the premature death of intensively used nodes and improving the survival time of the wireless sensor network. Simulation experiments show that the protocol outperforms the LEACH protocol in energy consumption and also outperforms a clustering routing protocol based on the particle swarm optimization algorithm.
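
    A minimal sketch of the Levy-flight perturbation mentioned above (Mantegna's algorithm), used to kick candidate solutions that are stuck in local optima. This shows only the Levy-flight component, not the full chicken swarm clustering protocol; the exponent beta and step scale are illustrative assumptions.

```python
import numpy as np
from math import gamma, sin, pi

def levy_step(dim: int, beta: float = 1.5, rng=np.random.default_rng()) -> np.ndarray:
    """Heavy-tailed step drawn with Mantegna's algorithm for a Levy flight."""
    sigma_u = (gamma(1 + beta) * sin(pi * beta / 2) /
               (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

def levy_update(position: np.ndarray, best: np.ndarray, scale: float = 0.01) -> np.ndarray:
    # Move a stagnated individual with a heavy-tailed step relative to the best solution.
    return position + scale * levy_step(position.size) * (position - best)

pos = np.array([50.0, 50.0])      # e.g. a candidate cluster-head location
best = np.array([30.0, 70.0])     # current global best
print(levy_update(pos, best))
```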

  20. Contrast Media Administration in Coronary Computed Tomography Angiography - A Systematic Review.

    PubMed

    Mihl, Casper; Maas, Monique; Turek, Jakub; Seehofnerova, Anna; Leijenaar, Ralph T H; Kok, Madeleine; Lobbes, Marc B I; Wildberger, Joachim E; Das, Marco

    2017-04-01

    Background  Various different injection parameters influence enhancement of the coronary arteries. There is no consensus in the literature regarding the optimal contrast media (CM) injection protocol. The aim of this study is to provide an update on the effect of different CM injection parameters on coronary attenuation in coronary computed tomographic angiography (CCTA). Method  Studies published between January 2001 and May 2014 identified through PubMed, Embase and MEDLINE were evaluated. Using predefined inclusion criteria and a data extraction form, the content of each eligible study was assessed. Initially, 2551 potential studies were identified. After applying our criteria, 36 studies were found to be eligible. Studies were systematically assessed for quality based on the validated Quality Assessment of Diagnostic Accuracy Studies (QUADAS)-II checklist. Results  Extracted data proved to be heterogeneous and often incomplete. The injection protocols and outcomes of the included publications were very diverse and the results are difficult to compare. Based on the extracted data, it remains unclear which of the injection parameters is the most important determinant of adequate attenuation. It is likely that one parameter which combines multiple parameters (e.g., IDR) will be the most suitable determinant of coronary attenuation in CCTA protocols. Conclusion  Research should be directed towards determining the influence of different injection parameters and defining individualized optimal IDRs tailored to patient-related factors (ideally in large randomized trials). Key points   · This systematic review provides insight into decisive factors on coronary attenuation. · Different and contradicting outcomes are reported on coronary attenuation in CCTA. · One parameter combining multiple parameters (IDR) is likely decisive in coronary attenuation. · Research should aim at defining individualized optimal IDRs tailored to individual factors. · Future directions should be tailored towards the influence of different injection parameters. Citation Format · Mihl C, Maas M, Turek J et al. Contrast Media Administration in Coronary Computed Tomography Angiography - A Systematic Review. Fortschr Röntgenstr 2017; 189: 312 - 325. © Georg Thieme Verlag KG Stuttgart · New York.

  1. A modular method for the extraction of DNA and RNA, and the separation of DNA pools from diverse environmental sample types

    PubMed Central

    Lever, Mark A.; Torti, Andrea; Eickenbusch, Philip; Michaud, Alexander B.; Šantl-Temkiv, Tina; Jørgensen, Bo Barker

    2015-01-01

    A method for the extraction of nucleic acids from a wide range of environmental samples was developed. This method consists of several modules, which can be individually modified to maximize yields in extractions of DNA and RNA or separations of DNA pools. Modules were designed based on elaborate tests, in which permutations of all nucleic acid extraction steps were compared. The final modular protocol is suitable for extractions from igneous rock, air, water, and sediments. Sediments range from high-biomass, organic rich coastal samples to samples from the most oligotrophic region of the world's oceans and the deepest borehole ever studied by scientific ocean drilling. Extraction yields of DNA and RNA are higher than with widely used commercial kits, indicating an advantage to optimizing extraction procedures to match specific sample characteristics. The ability to separate soluble extracellular DNA pools without cell lysis from intracellular and particle-complexed DNA pools may enable new insights into the cycling and preservation of DNA in environmental samples in the future. A general protocol is outlined, along with recommendations for optimizing this general protocol for specific sample types and research goals. PMID:26042110

  2. An optimized immunohistochemistry protocol for detecting the guidance cue Netrin-1 in neural tissue.

    PubMed

    Salameh, Samer; Nouel, Dominique; Flores, Cecilia; Hoops, Daniel

    2018-01-01

    Netrin-1, an axon guidance protein, is difficult to detect using immunohistochemistry. We performed a multi-step, blinded, and controlled protocol optimization procedure to establish an efficient and effective fluorescent immunohistochemistry protocol for characterizing Netrin-1 expression. Coronal mouse brain sections were used to test numerous antigen retrieval methods and combinations thereof in order to optimize the stain quality of a commercially available Netrin-1 antibody. Stain quality was evaluated by experienced neuroanatomists for two criteria: signal intensity and signal-to-noise ratio. After five rounds of testing protocol variants, we established a modified immunohistochemistry protocol that produced a Netrin-1 signal with good signal intensity and a high signal-to-noise ratio. The key protocol modifications are as follows: (1) use phosphate buffer (PB) as the blocking solution solvent, and (2) use 1% sodium dodecyl sulfate (SDS) treatment for antigen retrieval. The original protocol was optimized for use with the Netrin-1 antibody produced by Novus Biologicals. However, we subsequently further modified the protocol to work with the antibody produced by Abcam. The Abcam protocol uses PBS as the blocking solution solvent and adds a citrate buffer antigen retrieval step.

  3. Effect of rehabilitation length of stay on outcomes in individuals with traumatic brain injury or spinal cord injury: a systematic review protocol.

    PubMed

    Lamontagne, Marie-Eve; Gagnon, Cynthia; Allaire, Anne-Sophie; Noreau, Luc

    2013-07-20

    Rehabilitation interventions are a key component of the services required by individuals with neurotrauma to recover or compensate for altered abilities and achieve optimal social participation. Primary studies have produced evidence of the effect of rehabilitation length of stay on individuals with neurotrauma. However, to date no systematic review of this evidence has been performed. This makes it difficult for managers and clinicians to base their rehabilitation practices upon evidence. Supported by a committee of stakeholders, we will search electronic databases for research articles examining the association between length of stay or intensity of inpatient rehabilitation services and outcomes or the determinants of inpatient rehabilitation length of stay in adults with neurotrauma published after January 1990. Two researchers will independently screen the article titles and abstracts for inclusion. Two reviewers will independently extract the data. Primary outcomes of interest will be level of function, participation and return to work. If the data allow it, a meta-analysis of the studies will be performed. The results of this systematic review will clarify the factors that influence length of stay and intensity of rehabilitation services for individuals with TBI and SCI. They will give clinicians indications for optimal length of stay in these patient populations, contributing to better quality of care and better functional results. This review protocol has been registered on the PROSPERO database (CRD42012003120) and is available at http://www.crd.york.ac.uk/PROSPERO/display_record.asp?ID=CRD42012003120.

  4. Outcome and toxicity associated with a dose-intensified, maintenance-free CHOP-based chemotherapy protocol in canine lymphoma: 130 cases.

    PubMed

    Sorenmo, Karin; Overley, B; Krick, E; Ferrara, T; LaBlanc, A; Shofer, F

    2010-09-01

    A dose-intensified/dose-dense chemotherapy protocol for canine lymphoma was designed and implemented at the Veterinary Hospital of the University of Pennsylvania. In this study, we describe the clinical characteristics, prognostic factors, efficacy and toxicity in 130 dogs treated with this protocol. The majority of the dogs had advanced-stage disease (63.1% stage V) and sub-stage b (58.5%). The median time to progression (TTP) and lymphoma-specific survival were 219 and 323 days, respectively. These results are similar to those of previous, less dose-intense protocols. Sub-stage was a significant negative prognostic factor for survival. The incidence of toxicity was high; 53.9% and 45% of the dogs needed dose reductions and treatment delays, respectively. Dogs that required dose reductions and treatment delays had significantly longer TTP and lymphoma-specific survival times. These results suggest that dose density is important but likely relative, and needs to be adjusted according to the individual patient's toxicity for an optimal outcome.

  5. Enhancement of multimodality texture-based prediction models via optimization of PET and MR image acquisition protocols: a proof of concept

    NASA Astrophysics Data System (ADS)

    Vallières, Martin; Laberge, Sébastien; Diamant, André; El Naqa, Issam

    2017-11-01

    Texture-based radiomic models constructed from medical images have the potential to support cancer treatment management via personalized assessment of tumour aggressiveness. While the identification of stable texture features under varying imaging settings is crucial for the translation of radiomics analysis into routine clinical practice, we hypothesize in this work that a complementary optimization of image acquisition parameters prior to texture feature extraction could enhance the predictive performance of texture-based radiomic models. As a proof of concept, we evaluated the possibility of enhancing a model constructed for the early prediction of lung metastases in soft-tissue sarcomas by optimizing PET and MR image acquisition protocols via computerized simulations of image acquisitions with varying parameters. Simulated PET images from 30 STS patients were acquired by varying the extent of axial data combined per slice ('span'). Simulated T1-weighted and T2-weighted MR images were acquired by varying the repetition time and echo time in a spin-echo pulse sequence, respectively. We analyzed the impact of the variations of PET and MR image acquisition parameters on individual textures, and we investigated how these variations could enhance the global response and the predictive properties of a texture-based model. Our results suggest that it is feasible to identify an optimal set of image acquisition parameters to improve prediction performance. The model constructed with textures extracted from simulated images acquired with a standard clinical set of acquisition parameters reached an average AUC of 0.84 ± 0.01 in bootstrap testing experiments. In comparison, the model performance significantly increased using an optimal set of image acquisition parameters (p = 0.04), with an average AUC of 0.89 ± 0.01. Ultimately, specific acquisition protocols optimized to generate superior radiomics measurements for a given clinical problem could be developed and standardized via dedicated computer simulations and thereafter validated using clinical scanners.

  6. System for verifiable CT radiation dose optimization based on image quality. part II. process control system.

    PubMed

    Larson, David B; Malarik, Remo J; Hall, Seth M; Podberesky, Daniel J

    2013-10-01

    To evaluate the effect of an automated computed tomography (CT) radiation dose optimization and process control system on the consistency of estimated image noise and size-specific dose estimates (SSDEs) of radiation in CT examinations of the chest, abdomen, and pelvis. This quality improvement project was determined not to constitute human subject research. An automated system was developed to analyze each examination immediately after completion, and to report individual axial-image-level and study-level summary data for patient size, image noise, and SSDE. The system acquired data for 4 months beginning October 1, 2011. Protocol changes were made by using parameters recommended by the prediction application, and 3 months of additional data were acquired. Preimplementation and postimplementation mean image noise and SSDE were compared by using unpaired t tests and F tests. Common-cause variation was differentiated from special-cause variation by using a statistical process control individual chart. A total of 817 CT examinations, 490 acquired before and 327 acquired after the initial protocol changes, were included in the study. Mean patient age and water-equivalent diameter were 12.0 years and 23.0 cm, respectively. The difference between actual and target noise increased from -1.4 to 0.3 HU (P < .01) and the standard deviation decreased from 3.9 to 1.6 HU (P < .01). Mean SSDE decreased from 11.9 to 7.5 mGy, a 37% reduction (P < .01). The process control chart identified several special causes of variation. Implementation of an automated CT radiation dose optimization system led to verifiable simultaneous decrease in image noise variation and SSDE. The automated nature of the system provides the opportunity for consistent CT radiation dose optimization on a broad scale. © RSNA, 2013.
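
    A minimal sketch of the statistical process control individuals chart used above to separate common-cause from special-cause variation, here applied to a made-up series of per-study noise deviations. The 2.66 multiplier is the standard individuals-chart (XmR) constant; the data and variable names are illustrative only.

```python
import numpy as np

def individuals_chart_limits(x: np.ndarray):
    """Center line and control limits for an individuals (XmR) chart."""
    center = x.mean()
    mean_moving_range = np.abs(np.diff(x)).mean()
    return center, center - 2.66 * mean_moving_range, center + 2.66 * mean_moving_range

def special_causes(x: np.ndarray):
    center, lcl, ucl = individuals_chart_limits(x)
    return [i for i, v in enumerate(x) if v < lcl or v > ucl]

# Per-examination difference between actual and target noise (HU), invented for illustration.
noise_minus_target = np.array([-0.4, 0.2, -0.1, 0.5, -0.3, 0.1, 0.0, -0.2, 0.3, -0.1,
                               6.5, 0.2, -0.4, 0.1, 0.0])
print(individuals_chart_limits(noise_minus_target))
print(special_causes(noise_minus_target))   # index 10 is flagged as a special cause
```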

  7. Intervention to Match Young Black Men and Transwomen Who Have Sex With Men or Transwomen to HIV Testing Options (All About Me): Protocol for a Randomized Controlled Trial.

    PubMed

    Koblin, Beryl; Hirshfield, Sabina; Chiasson, Mary Ann; Wilton, Leo; Usher, DaShawn; Nandi, Vijay; Hoover, Donald R; Frye, Victoria

    2017-12-19

    HIV testing is a critical component of HIV prevention and care. Interventions to increase HIV testing rates among young black men who have sex with men (MSM) and black transgender women (transwomen) are needed. Personalized recommendations for an individual's optimal HIV testing approach may increase testing. This randomized trial tests the hypothesis that a personalized recommendation of an optimal HIV testing approach will increase HIV testing more than standard HIV testing information. A randomized trial among 236 young black men and transwomen who have sex with men or transwomen is being conducted. Participants complete a computerized baseline assessment and are randomized to electronically receive a personalized HIV testing recommendation or standard HIV testing information. Follow-up surveys are conducted online at 3 and 6 months after baseline. The All About Me randomized trial was launched in June 2016. Enrollment is complete, and 3-month retention is 92.4% (218/236), which has exceeded study target goals. The All About Me intervention is an innovative approach to increase HIV testing by providing a personalized recommendation of a person's optimal HIV testing approach. If successful, optimizing this intervention for mobile devices will widen access to large numbers of individuals. ClinicalTrials.gov NCT02834572; https://clinicaltrials.gov/ct2/show/NCT02834572 (Archived by WebCite at http://www.webcitation.org/6vLJWOS1B). ©Beryl Koblin, Sabina Hirshfield, Mary Ann Chiasson, Leo Wilton, DaShawn Usher, Vijay Nandi, Donald R Hoover, Victoria Frye. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 19.12.2017.

  8. Defining robustness protocols: a method to include and evaluate robustness in clinical plans

    NASA Astrophysics Data System (ADS)

    McGowan, S. E.; Albertini, F.; Thomas, S. J.; Lomax, A. J.

    2015-04-01

    We aim to define a site-specific robustness protocol to be used during the clinical plan evaluation process. Plan robustness of 16 skull base IMPT plans to systematic range and random set-up errors has been retrospectively and systematically analysed. This was determined by calculating the error-bar dose distribution (ebDD) for all the plans and by defining metrics used to establish protocols aiding the plan assessment. Additionally, an example of how to clinically use the defined robustness database is given, whereby a plan with sub-optimal brainstem robustness was identified. The advantage of using different beam arrangements to improve the plan robustness was analysed. Using the ebDD, it was found that range errors had a smaller effect on dose distribution than the corresponding set-up error in a single fraction, and that organs at risk were most robust to the range errors, whereas the target was more robust to set-up errors. A database was created to aid planners in terms of plan robustness aims in these volumes. This resulted in the definition of site-specific robustness protocols. The use of robustness constraints allowed for the identification of a specific patient that may have benefited from a more individualized treatment. A new beam arrangement was shown to be preferable when balancing conformality and robustness for this case. The ebDD and error-bar volume histogram proved effective in analysing plan robustness. The process of retrospective analysis could be used to establish site-specific robustness planning protocols in proton therapy. These protocols allow the planner to determine plans that, although delivering a dosimetrically adequate dose distribution, have resulted in sub-optimal robustness to these uncertainties. For these cases the use of different beam start conditions may improve the plan robustness to set-up and range uncertainties.
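
    A minimal sketch of one plausible reading of the error-bar dose distribution and error-bar volume histogram: a per-voxel half-spread of dose across error scenarios, then the fraction of a structure's voxels whose error bar stays below a threshold. The exact definitions used in the paper may differ; the data here are random and for illustration only.

```python
import numpy as np

def error_bar_dose(scenario_doses: np.ndarray) -> np.ndarray:
    """scenario_doses: (n_scenarios, n_voxels) dose for nominal + error scenarios (Gy)."""
    return 0.5 * (scenario_doses.max(axis=0) - scenario_doses.min(axis=0))

def error_bar_volume_histogram(ebdd: np.ndarray, structure_mask: np.ndarray,
                               thresholds: np.ndarray) -> np.ndarray:
    """Fraction of the structure's voxels with an error bar at or below each threshold."""
    vals = ebdd[structure_mask]
    return np.array([(vals <= t).mean() for t in thresholds])

rng = np.random.default_rng(1)
doses = 60.0 + rng.normal(0, 1.5, size=(9, 1000))   # 9 scenarios, 1000 voxels, invented
mask = np.ones(1000, dtype=bool)                    # toy structure covering all voxels
ebdd = error_bar_dose(doses)
print(error_bar_volume_histogram(ebdd, mask, np.array([1.0, 2.0, 3.0, 4.0])))
```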

  9. One-pot refolding of core histones from bacterial inclusion bodies allows rapid reconstitution of histone octamer.

    PubMed

    Lee, Young-Tae; Gibbons, Garrett; Lee, Shirley Y; Nikolovska-Coleska, Zaneta; Dou, Yali

    2015-06-01

    We report an optimized method to purify and reconstitute histone octamer, which utilizes high expression of histones in inclusion bodies but eliminates the time consuming steps of individual histone purification. In the newly modified protocol, Xenopus laevis H2A, H2B, H3, and H4 are expressed individually into inclusion bodies of bacteria, which are subsequently mixed together and denatured in 8M guanidine hydrochloride. Histones are refolded and reconstituted into soluble octamer by dialysis against 2M NaCl, and metal-affinity purified through an N-terminal polyhistidine-tag added on the H2A. After cleavage of the polyhistidine-tag, histone octamer is further purified by size exclusion chromatography. We show that the nucleosomes reconstituted using the purified histone octamer above are fully functional. They serve as effective substrates for the histone methyltransferases DOT1L and MLL1. Small angle X-ray scattering further confirms that the reconstituted nucleosomes have correct structural integration of histone octamer and DNA as observed in the X-ray crystal structure. Our new protocol enables rapid reconstitution of histone octamer with an optimal yield. We expect this simplified approach to facilitate research using recombinant nucleosomes in vitro. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. Improving the efficiency of single and multiple teleportation protocols based on the direct use of partially entangled states

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fortes, Raphael; Rigolin, Gustavo, E-mail: rigolin@ifi.unicamp.br

    We push the limits of the direct use of partially entangled pure states to perform quantum teleportation by presenting several protocols in many different scenarios that achieve the optimal efficiency possible. We review and put in a single formalism the three major strategies known to date that allow one to use partially entangled states for direct quantum teleportation (no distillation strategies permitted) and compare their efficiencies in real-world implementations. We show how one can improve the efficiency of many direct teleportation protocols by combining these techniques. We then develop new teleportation protocols employing multipartite partially entangled states. The three techniques are also used here in order to achieve the highest efficiency possible. Finally, we prove the upper bound for the optimal success rate for protocols based on partially entangled Bell states and show that some of the protocols developed here achieve such a bound. -- Highlights: •Optimal direct teleportation protocols using partially entangled states directly. •We put in a single formalism all strategies of direct teleportation. •We extend these techniques for multipartite partially entangled states. •We give upper bounds for the optimal efficiency of these protocols.
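
    A minimal sketch using standard textbook results for context, not the specific bounds derived in this work: for a pure partially entangled channel a|00> + b|11> (real amplitudes), the concurrence is 2ab, the best deterministic average teleportation fidelity is (2 + 2ab)/3, and perfect teleportation can succeed probabilistically with probability 2·min(a², b²).

```python
import numpy as np

def channel_figures(a: float, b: float):
    """Standard figures of merit for direct teleportation through a|00> + b|11>."""
    a, b = abs(a), abs(b)
    norm = np.hypot(a, b)
    a, b = a / norm, b / norm                       # ensure normalization
    concurrence = 2 * a * b
    avg_fidelity_deterministic = (2 + concurrence) / 3
    p_success_perfect = 2 * min(a, b) ** 2          # conclusive (probabilistic) protocol
    return concurrence, avg_fidelity_deterministic, p_success_perfect

for a in (1 / np.sqrt(2), 0.9, 0.99):
    b = np.sqrt(1 - a ** 2)
    print(f"a = {a:.3f}: C, F_avg, p_success = {channel_figures(a, b)}")
```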

  11. Suppression of work fluctuations by optimal control: An approach based on Jarzynski's equality

    NASA Astrophysics Data System (ADS)

    Xiao, Gaoyang; Gong, Jiangbin

    2014-11-01

    Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, aspects of work fluctuations will be an important factor in designing nanoscale heat engines. In this work, an optimal control approach directly exploiting Jarzynski's equality is proposed to effectively suppress the fluctuations in the work statistics, for systems (initially at thermal equilibrium) subject to a work protocol but isolated from a bath during the protocol. The control strategy is to minimize the deviations of individual values of e^{-βW} from their ensemble average given by e^{-βΔF}, where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. It is further shown that even when the system Hamiltonian is not fully known, it is still possible to suppress work fluctuations through a feedback loop, by refining the control target function on the fly through Jarzynski's equality itself. Numerical experiments are based on linear and nonlinear parametric oscillators. Optimal control results for linear parametric oscillators are also benchmarked with early results based on shortcuts to adiabaticity.
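
    A minimal numerical sketch of the control target described above: for a Gaussian work distribution, Jarzynski's equality ⟨e^{-βW}⟩ = e^{-βΔF} fixes ΔF = ⟨W⟩ - β·var(W)/2, and suppressing work fluctuations corresponds to shrinking the spread of the individual e^{-βW} values around e^{-βΔF}. The work statistics below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 1.0
mean_W, sigma_W = 2.0, 0.5                     # assumed Gaussian work statistics
W = rng.normal(mean_W, sigma_W, 200_000)

dF = mean_W - beta * sigma_W**2 / 2            # exact for Gaussian work distributions
lhs = np.exp(-beta * W).mean()                 # sample estimate of <exp(-beta W)>
rhs = np.exp(-beta * dF)
spread = np.mean((np.exp(-beta * W) - rhs) ** 2)   # the quantity the control suppresses

print(f"<e^-bW> = {lhs:.4f}, e^-b dF = {rhs:.4f}, mean-square deviation = {spread:.4f}")
```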

  12. Optimizing the design of a reproduction toxicity test with the pond snail Lymnaea stagnalis.

    PubMed

    Charles, Sandrine; Ducrot, Virginie; Azam, Didier; Benstead, Rachel; Brettschneider, Denise; De Schamphelaere, Karel; Filipe Goncalves, Sandra; Green, John W; Holbech, Henrik; Hutchinson, Thomas H; Faber, Daniel; Laranjeiro, Filipe; Matthiessen, Peter; Norrgren, Leif; Oehlmann, Jörg; Reategui-Zirena, Evelyn; Seeland-Fremer, Anne; Teigeler, Matthias; Thome, Jean-Pierre; Tobor Kaplon, Marysia; Weltje, Lennart; Lagadic, Laurent

    2016-11-01

    This paper presents the results from two ring-tests addressing the feasibility, robustness and reproducibility of a reproduction toxicity test with the freshwater gastropod Lymnaea stagnalis (RENILYS strain). Sixteen laboratories (from inexperienced to expert laboratories in mollusc testing) from nine countries participated in these ring-tests. Survival and reproduction were evaluated in L. stagnalis exposed to cadmium, tributyltin, prochloraz and trenbolone according to an OECD draft Test Guideline. In total, 49 datasets were analysed to assess the practicability of the proposed experimental protocol, and to estimate the between-laboratory reproducibility of toxicity endpoint values. The statistical analysis of count data (number of clutches or eggs per individual-day) leading to ECx estimation was specifically developed and automated through a free web-interface. Based on a complementary statistical analysis, the optimal test duration was established and the most sensitive and cost-effective reproduction toxicity endpoint was identified, to be used as the core endpoint. This validation process and the resulting optimized protocol were used to consolidate the OECD Test Guideline for the evaluation of reproductive effects of chemicals in L. stagnalis. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Optimal protocols for slowly driven quantum systems.

    PubMed

    Zulkowski, Patrick R; DeWeese, Michael R

    2015-09-01

    The design of efficient quantum information processing will rely on optimal nonequilibrium transitions of driven quantum systems. Building on a recently developed geometric framework for computing optimal protocols for classical systems driven in finite time, we construct a general framework for optimizing the average information entropy for driven quantum systems. Geodesics on the parameter manifold endowed with a positive semidefinite metric correspond to protocols that minimize the average information entropy production in finite time. We use this framework to explicitly compute the optimal entropy production for a simple two-state quantum system coupled to a heat bath of bosonic oscillators, which has applications to quantum annealing.
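
    A minimal sketch of the geometric idea above for a single control parameter: given a metric g(λ) on the control space, the geodesic (optimal finite-time) protocol moves at constant metric speed, so λ(t) is obtained by inverting the arc length s(λ) = ∫√g dλ. The quadratic metric below is an arbitrary illustrative choice, not the metric derived in the paper.

```python
import numpy as np

def geodesic_protocol(g, lam0, lam1, duration, n=1000):
    """Return times t and the geodesic schedule lambda(t) for a 1-D metric g(lambda)."""
    lam = np.linspace(lam0, lam1, n)
    speed = np.sqrt(g(lam))
    # Cumulative arc length via the trapezoid rule.
    s = np.concatenate([[0.0], np.cumsum(0.5 * (speed[1:] + speed[:-1]) * np.diff(lam))])
    t = np.linspace(0.0, duration, n)
    # Constant metric speed <=> lambda sampled uniformly in arc length s.
    return t, np.interp(np.linspace(0.0, s[-1], n), s, lam)

g = lambda lam: 1.0 + 4.0 * lam**2            # assumed metric on the control parameter
t, lam_t = geodesic_protocol(g, 0.0, 1.0, duration=1.0)
print(lam_t[::200])   # the protocol slows down where the metric g is large
```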

  14. Dynamic Hierarchical Energy-Efficient Method Based on Combinatorial Optimization for Wireless Sensor Networks.

    PubMed

    Chang, Yuchao; Tang, Hongying; Cheng, Yongbo; Zhao, Qin; Li, Baoqing; Yuan, Xiaobing

    2017-07-19

    Routing protocols based on topology control are significantly important for improving network longevity in wireless sensor networks (WSNs). Traditionally, some WSN routing protocols distribute uneven network traffic load to sensor nodes, which is not optimal for improving network longevity. Differently to conventional WSN routing protocols, we propose a dynamic hierarchical protocol based on combinatorial optimization (DHCO) to balance energy consumption of sensor nodes and to improve WSN longevity. For each sensor node, the DHCO algorithm obtains the optimal route by establishing a feasible routing set instead of selecting the cluster head or the next hop node. The process of obtaining the optimal route can be formulated as a combinatorial optimization problem. Specifically, the DHCO algorithm is carried out by the following procedures. It employs a hierarchy-based connection mechanism to construct a hierarchical network structure in which each sensor node is assigned to a special hierarchical subset; it utilizes the combinatorial optimization theory to establish the feasible routing set for each sensor node, and takes advantage of the maximum-minimum criterion to obtain their optimal routes to the base station. Various results of simulation experiments show effectiveness and superiority of the DHCO algorithm in comparison with state-of-the-art WSN routing algorithms, including low-energy adaptive clustering hierarchy (LEACH), hybrid energy-efficient distributed clustering (HEED), genetic protocol-based self-organizing network clustering (GASONeC), and double cost function-based routing (DCFR) algorithms.
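
    A minimal sketch of the maximum-minimum criterion described above: among the feasible routes established for a sensor node, pick the one whose weakest hop (lowest residual energy) is strongest. The feasible-set construction itself, i.e. the combinatorial-optimization core of DHCO, is not reproduced here; nodes and energies are invented.

```python
from typing import Dict, List

def max_min_route(feasible_routes: List[List[str]],
                  residual_energy: Dict[str, float]) -> List[str]:
    """Select the route maximizing the minimum residual energy along its nodes."""
    return max(feasible_routes,
               key=lambda route: min(residual_energy[node] for node in route))

energy = {"A": 0.9, "B": 0.2, "C": 0.7, "D": 0.6, "BS": float("inf")}   # BS = base station
routes = [["A", "B", "BS"], ["A", "C", "D", "BS"]]
print(max_min_route(routes, energy))   # -> ['A', 'C', 'D', 'BS'] (avoids the weak node B)
```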

  15. Bedside diagnosis of dysphagia: a systematic review.

    PubMed

    O'Horo, John C; Rogus-Pulia, Nicole; Garcia-Arguello, Lisbeth; Robbins, JoAnne; Safdar, Nasia

    2015-04-01

    Dysphagia is associated with aspiration, pneumonia, and malnutrition, but remains challenging to identify at the bedside. A variety of exam protocols and maneuvers are commonly used, but the efficacy of these maneuvers is highly variable. We conducted a comprehensive search of 7 databases, including MEDLINE, Embase, and Scopus, from each database's earliest inception through June 9, 2014. Studies reporting diagnostic performance of a bedside examination maneuver compared to a reference gold standard (videofluoroscopic swallow study or flexible endoscopic evaluation of swallowing with sensory testing) were included for analysis. From each study, data were abstracted based on the type of diagnostic method and reference standard study population and inclusion/exclusion characteristics, design, and prediction of aspiration. The search strategy identified 38 articles meeting inclusion criteria. Overall, most bedside examinations lacked sufficient sensitivity to be used for screening purposes across all patient populations examined. Individual studies found dysphonia assessments, abnormal pharyngeal sensation assessments, dual axis accelerometry, and 1 description of water swallow testing to be sensitive tools, but none were reported as consistently sensitive. A preponderance of identified studies was in poststroke adults, limiting the generalizability of results. No bedside screening protocol has been shown to provide adequate predictive value for presence of aspiration. Several individual exam maneuvers demonstrated reasonable sensitivity, but reproducibility and consistency of these protocols was not established. More research is needed to design an optimal protocol for dysphagia detection. © 2015 Society of Hospital Medicine.

  16. The Danish Centre for Strategic Research in Type 2 Diabetes (DD2) study: expected outcome from the DD2 project and two intervention studies

    PubMed Central

    Beck-Nielsen, Henning; Solomon, Thomas PJ; Lauridsen, Jørgen; Karstoft, Kristian; Pedersen, Bente K; Johnsen, Søren P; Nielsen, Jens Steen; Kryger, Tine Bjerregaard; Sortsø, Camilla; Vaag, Allan

    2012-01-01

    The overall aim of the Danish Centre for Strategic Research in Type 2 Diabetes (DD2) is to near-normalize metabolic control in newly diagnosed patients with type 2 diabetes (T2D) using an individualized treatment approach. We hypothesize that this will not only prevent complications and improve quality of life for T2D patients but also result in increased cost efficiency compared with current treatment modalities. This paper provides an overview of the expected outcomes from DD2, focusing on the two main intervention studies. The main data for the DD2 project are collected during patient enrollment and stored using the individual civil registration number. This enables subsequent linking to other national databases where supplemental data can be obtained. All data will be used for designing treatment guidelines and continuously monitoring the development of diabetic complications, thereby obtaining knowledge about predictors for the long-term outcome and identifying targets for new interventions. Further data are being collected from two intervention studies. The aim of the first intervention study is to improve T2D treatment using an individualized treatment modality optimizing medication according to individual metabolic responses and phenotypic characteristics. The aim of the second intervention study is to develop an evidence-based training protocol to be implemented as a treatment modality for T2D and used for initiating lifelong changes in physical activity levels in patients with T2D. An initial pilot study evaluating an interval-based walking protocol is ongoing, and preliminary results indicate that this protocol is an optimal “free-living” training intervention. An initial health-economic analysis will also be performed as a basis for analysis of the data collected during the project. A cost-benefit analysis of the two intervention studies will be conducted. The DD2 project is expected to lead to improved treatment modalities and increased knowledge about existing treatment guidelines, and will also provide a solid base for health-economic decision-making. PMID:23071408

  17. Whole brain inhomogeneous magnetization transfer (ihMT) imaging: Sensitivity enhancement within a steady-state gradient echo sequence.

    PubMed

    Mchinda, Samira; Varma, Gopal; Prevost, Valentin H; Le Troter, Arnaud; Rapacchi, Stanislas; Guye, Maxime; Pelletier, Jean; Ranjeva, Jean-Philippe; Alsop, David C; Duhamel, Guillaume; Girard, Olivier M

    2018-05-01

    To implement, characterize, and optimize an interleaved inhomogeneous magnetization transfer (ihMT) gradient echo sequence allowing for whole-brain imaging within a clinically compatible scan time. A general framework for ihMT modelling was developed based on the Provotorov theory of radiofrequency saturation, which accounts for the dipolar order underpinning the ihMT effect. Experimental studies and numerical simulations were performed to characterize and optimize the ihMT-gradient echo dependency with sequence timings, saturation power, and offset frequency. The protocol was optimized in terms of maximum signal intensity and the reproducibility assessed for a nominal resolution of 1.5 mm isotropic. All experiments were performed on healthy volunteers at 1.5T. An important mechanism driving signal optimization and leading to strong ihMT signal enhancement that relies on the dynamics of radiofrequency energy deposition has been identified. By taking advantage of the delay allowed for readout between ihMT pulse bursts, it was possible to boost the ihMT signal by almost 2-fold compared to previous implementation. Reproducibility of the optimal protocol was very good, with an intra-individual error < 2%. The proposed sensitivity-boosted and time-efficient steady-state ihMT-gradient echo sequence, implemented and optimized at 1.5T, allowed robust high-resolution 3D ihMT imaging of the whole brain within a clinically compatible scan time. Magn Reson Med 79:2607-2619, 2018. © 2017 International Society for Magnetic Resonance in Medicine.

  18. Dosage optimization in positron emission tomography: state-of-the-art methods and future prospects

    PubMed Central

    Karakatsanis, Nicolas A; Fokou, Eleni; Tsoumpas, Charalampos

    2015-01-01

    Positron emission tomography (PET) is widely used nowadays for tumor staging and therapy response in the clinic. However, average PET radiation exposure has increased due to higher PET utilization. This study aims to review state-of-the-art PET tracer dosage optimization methods after accounting for the effects of human body attenuation and scan protocol parameters on the counting rate. In particular, the relationship between the noise equivalent count rate (NECR) and the dosage (NECR-dosage curve) for a range of clinical PET systems and body attenuation sizes will be systematically studied to prospectively estimate the minimum dosage required for sufficiently high NECR. The optimization criterion can be determined either as a function of the peak of the NECR-dosage curve or as a fixed NECR score when NECR uniformity across a patient population is important. In addition, the systematic NECR assessments within a controllable environment of realistic simulations and phantom experiments can lead to a NECR-dosage response model, capable of predicting the optimal dosage for every individual PET scan. Unlike conventional guidelines suggesting considerably large dosage levels for obese patients, NECR-based optimization recommends: i) moderate dosage to achieve 90% of peak NECR for obese patients, ii) considerable dosage reduction for slimmer patients such that uniform NECR is attained across the patient population, and iii) prolongation of scans for PET/MR protocols, where longer PET acquisitions are affordable due to lengthy MR sequences, with motion compensation becoming important then. Finally, the need for continuous adaptation of dosage optimization to emerging technologies will be discussed. PMID:26550543
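
    A minimal sketch of choosing a dosage from an NECR-dosage curve as recommended above, using a toy count-rate model with dead-time losses (trues ~ D·e^{-τD}, scatter ~ D, randoms ~ D²); all coefficients are arbitrary assumptions. The operating point is the lowest dose reaching 90% of the peak NECR.

```python
import numpy as np

def necr(dose, k_t=1.0, k_s=0.3, k_r=0.05, tau=0.02):
    """Toy noise-equivalent count rate as a function of injected dose (arbitrary units)."""
    trues = k_t * dose * np.exp(-tau * dose)   # dead-time losses make the curve peak
    scatter = k_s * dose
    randoms = k_r * dose**2
    return trues**2 / (trues + scatter + randoms)

dose = np.linspace(0.1, 100, 2000)
curve = necr(dose)
peak_dose = dose[np.argmax(curve)]
d_90 = dose[np.argmax(curve >= 0.9 * curve.max())]   # lowest dose reaching 90% of peak
print(f"peak NECR at dose {peak_dose:.1f}; 90%-of-peak operating point at {d_90:.1f}")
```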

  19. Academic consortium for the evaluation of computer-aided diagnosis (CADx) in mammography

    NASA Astrophysics Data System (ADS)

    Mun, Seong K.; Freedman, Matthew T.; Wu, Chris Y.; Lo, Shih-Chung B.; Floyd, Carey E., Jr.; Lo, Joseph Y.; Chan, Heang-Ping; Helvie, Mark A.; Petrick, Nicholas; Sahiner, Berkman; Wei, Datong; Chakraborty, Dev P.; Clarke, Laurence P.; Kallergi, Maria; Clark, Bob; Kim, Yongmin

    1995-04-01

    Computer-aided diagnosis (CADx) is a promising technology for the detection of breast cancer in screening mammography. A number of different approaches have been developed for CADx research that have achieved significant levels of performance. Research teams now recognize the need for a careful and detailed evaluation study of approaches to accelerate the development of CADx, to make CADx more clinically relevant and to optimize the CADx algorithms based on unbiased evaluations. The results of such a comparative study may provide each of the participating teams with new insights into the optimization of their individual CADx algorithms. This consortium of experienced CADx researchers is working as a group to compare results of the algorithms and to optimize the performance of CADx algorithms by learning from each other. Each institution will contribute an equal number of cases that will be collected under a standard protocol for case selection, truth determination, and data acquisition to establish a common and unbiased database for the evaluation study. An evaluation procedure for the comparison studies is being developed to analyze the results of individual algorithms for each of the test cases in the common database. Optimization of individual CADx algorithms can then be made based on the comparison studies. The consortium effort is expected to accelerate the eventual clinical implementation of CADx algorithms at participating institutions.

  20. Broken symmetry in a two-qubit quantum control landscape

    NASA Astrophysics Data System (ADS)

    Bukov, Marin; Day, Alexandre G. R.; Weinberg, Phillip; Polkovnikov, Anatoli; Mehta, Pankaj; Sels, Dries

    2018-05-01

    We analyze the physics of optimal protocols to prepare a target state with high fidelity in a symmetrically coupled two-qubit system. By varying the protocol duration, we find a discontinuous phase transition, which is characterized by a spontaneous breaking of a Z2 symmetry in the functional form of the optimal protocol, and occurs below the quantum speed limit. We study this phase in detail and demonstrate that even though high-fidelity protocols are degenerate with respect to their fidelity, they lead to final states of different entanglement entropy shared between the qubits. Consequently, while globally both optimal protocols are equally far away from the target state, one is locally closer than the other. An approximate variational mean-field theory which captures the physics of the different phases is developed.

  1. Droplet-based pyrosequencing using digital microfluidics.

    PubMed

    Boles, Deborah J; Benton, Jonathan L; Siew, Germaine J; Levy, Miriam H; Thwar, Prasanna K; Sandahl, Melissa A; Rouse, Jeremy L; Perkins, Lisa C; Sudarsan, Arjun P; Jalili, Roxana; Pamula, Vamsee K; Srinivasan, Vijay; Fair, Richard B; Griffin, Peter B; Eckhardt, Allen E; Pollack, Michael G

    2011-11-15

    The feasibility of implementing pyrosequencing chemistry within droplets using electrowetting-based digital microfluidics is reported. An array of electrodes patterned on a printed-circuit board was used to control the formation, transportation, merging, mixing, and splitting of submicroliter-sized droplets contained within an oil-filled chamber. A three-enzyme pyrosequencing protocol was implemented in which individual droplets contained enzymes, deoxyribonucleotide triphosphates (dNTPs), and DNA templates. The DNA templates were anchored to magnetic beads which enabled them to be thoroughly washed between nucleotide additions. Reagents and protocols were optimized to maximize signal over background, linearity of response, cycle efficiency, and wash efficiency. As an initial demonstration of feasibility, a portion of a 229 bp Candida parapsilosis template was sequenced using both a de novo protocol and a resequencing protocol. The resequencing protocol generated over 60 bp of sequence with 100% sequence accuracy based on raw pyrogram levels. Excellent linearity was observed for all of the homopolymers (two, three, or four nucleotides) contained in the C. parapsilosis sequence. With improvements in microfluidic design it is expected that longer reads, higher throughput, and improved process integration (i.e., "sample-to-sequence" capability) could eventually be achieved using this low-cost platform.

  2. Droplet-Based Pyrosequencing Using Digital Microfluidics

    PubMed Central

    Boles, Deborah J.; Benton, Jonathan L.; Siew, Germaine J.; Levy, Miriam H.; Thwar, Prasanna K.; Sandahl, Melissa A.; Rouse, Jeremy L.; Perkins, Lisa C.; Sudarsan, Arjun P.; Jalili, Roxana; Pamula, Vamsee K.; Srinivasan, Vijay; Fair, Richard B.; Griffin, Peter B.; Eckhardt, Allen E.; Pollack, Michael G.

    2013-01-01

    The feasibility of implementing pyrosequencing chemistry within droplets using electrowetting-based digital microfluidics is reported. An array of electrodes patterned on a printed-circuit board was used to control the formation, transportation, merging, mixing, and splitting of submicroliter-sized droplets contained within an oil-filled chamber. A three-enzyme pyrosequencing protocol was implemented in which individual droplets contained enzymes, deoxyribonucleotide triphosphates (dNTPs), and DNA templates. The DNA templates were anchored to magnetic beads which enabled them to be thoroughly washed between nucleotide additions. Reagents and protocols were optimized to maximize signal over background, linearity of response, cycle efficiency, and wash efficiency. As an initial demonstration of feasibility, a portion of a 229 bp Candida parapsilosis template was sequenced using both a de novo protocol and a resequencing protocol. The resequencing protocol generated over 60 bp of sequence with 100% sequence accuracy based on raw pyrogram levels. Excellent linearity was observed for all of the homopolymers (two, three, or four nucleotides) contained in the C. parapsilosis sequence. With improvements in microfluidic design it is expected that longer reads, higher throughput, and improved process integration (i.e., “sample-to-sequence” capability) could eventually be achieved using this low-cost platform. PMID:21932784

  3. Using connectome-based predictive modeling to predict individual behavior from brain connectivity

    PubMed Central

    Shen, Xilin; Finn, Emily S.; Scheinost, Dustin; Rosenberg, Monica D.; Chun, Marvin M.; Papademetris, Xenophon; Constable, R Todd

    2017-01-01

    Neuroimaging is a fast-developing research area in which anatomical and functional images of human brains are collected using techniques such as functional magnetic resonance imaging (fMRI), diffusion tensor imaging (DTI), and electroencephalography (EEG). Technical advances and large-scale datasets have allowed for the development of models capable of predicting individual differences in traits and behavior using brain connectivity measures derived from neuroimaging data. Here, we present connectome-based predictive modeling (CPM), a data-driven protocol for developing predictive models of brain-behavior relationships from connectivity data using cross-validation. This protocol includes the following steps: 1) feature selection, 2) feature summarization, 3) model building, and 4) assessment of prediction significance. We also include suggestions for visualizing the most predictive features (i.e., brain connections). The final result should be a generalizable model that takes brain connectivity data as input and generates predictions of behavioral measures in novel subjects, accounting for a significant amount of the variance in these measures. It has been demonstrated that the CPM protocol performs as well as or better than most of the existing approaches in brain-behavior prediction. However, because CPM focuses on linear modeling and a purely data-driven approach, neuroscientists with limited or no experience in machine learning or optimization would find it easy to implement the protocol. Depending on the volume of data to be processed, the protocol can take 10–100 minutes for model building, 1–48 hours for permutation testing, and 10–20 minutes for visualization of results. PMID:28182017
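
    A minimal sketch of the four CPM steps listed above (correlation-based feature selection, summarization by summing selected edges, linear model building, and assessment of prediction significance) with leave-one-out cross-validation. The p-value threshold and the toy data are illustrative assumptions; this is not the authors' released implementation.

```python
import numpy as np
from scipy import stats

def cpm_loocv(edges, behavior, p_thresh=0.05):
    """edges: (n_subjects, n_edges) connectivity matrix; behavior: (n_subjects,) scores."""
    n = len(behavior)
    predictions = np.zeros(n)
    for i in range(n):
        train = np.ones(n, dtype=bool)
        train[i] = False
        rp = [stats.pearsonr(edges[train, j], behavior[train]) for j in range(edges.shape[1])]
        r = np.array([v[0] for v in rp])
        p = np.array([v[1] for v in rp])
        selected = (p < p_thresh) & (r > 0)                     # 1) feature selection
        if not selected.any():
            predictions[i] = behavior[train].mean()
            continue
        summary = edges[:, selected].sum(axis=1)                # 2) feature summarization
        slope, intercept = np.polyfit(summary[train], behavior[train], 1)  # 3) model building
        predictions[i] = slope * summary[i] + intercept
    r_pred, p_pred = stats.pearsonr(predictions, behavior)      # 4) prediction significance
    return predictions, r_pred, p_pred

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 200))                                # toy "connectomes"
y = X[:, :3].sum(axis=1) + rng.normal(scale=0.3, size=40)     # behavior driven by 3 edges
_, r, p = cpm_loocv(X, y)
print(f"cross-validated prediction: r = {r:.2f}, p = {p:.3g}")
```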

  4. Identifying Balance Measures Most Likely to Identify Recent Falls.

    PubMed

    Criter, Robin E; Honaker, Julie A

    2016-01-01

    Falls sustained by older adults are an increasing health care issue. Early identification of those at risk for falling can lead to successful prevention of falls. Balance complaints are common among individuals who fall or are at risk for falling. The purpose of this study was to evaluate the clinical utility of a multifaceted balance protocol used for fall risk screening, with the hypothesis that this protocol would successfully identify individuals who had a recent fall (within the previous 12 months). This is a retrospective review of 30 individuals who self-referred for a free fall risk screening. Measures included case history, Activities-Specific Balance Confidence Scale, modified Clinical Test of Sensory Interaction on Balance, Timed Up and Go test, and Dynamic Visual Acuity. Statistical analyses were focused on the ability of the test protocol to identify a fall within the past 12 months and included descriptive statistics, clinical utility indices, logistic regression, receiver operating characteristic curve, area under the curve analysis, effect size (Cohen d), and Spearman correlation coefficients. All individuals who self-referred for this free screening had current imbalance complaints, and were typically women (70%), had a mean age of 77.2 years, and had a fear of falling (70%). Almost half (46.7%) reported at least 1 lifetime fall and 40.0% within the past 12 months. Regression analysis suggested that the Timed Up and Go test was the most important indicator of a recent fall. A cutoff score of 12 or more seconds was optimal (sensitivity: 83.3%; specificity: 61.1%). Older adults with current complaints of imbalance have a higher rate of falls, fall-related injury, and fear of falling than the general community-dwelling public. The Timed Up and Go test is useful for determining recent fall history in individuals with imbalance.
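
    A minimal sketch of evaluating a Timed Up and Go cutoff against recent-fall history: sensitivity and specificity for the 12-second rule, plus a simple sweep over candidate cutoffs. The data are randomly generated for illustration and do not reproduce the study cohort.

```python
import numpy as np

def sens_spec(tug_seconds: np.ndarray, fell: np.ndarray, cutoff: float = 12.0):
    """Sensitivity and specificity of screening positive when TUG >= cutoff seconds."""
    positive = tug_seconds >= cutoff
    sensitivity = (positive & fell).sum() / fell.sum()
    specificity = (~positive & ~fell).sum() / (~fell).sum()
    return sensitivity, specificity

rng = np.random.default_rng(3)
fell = rng.random(30) < 0.4                                   # invented recent-fall status
tug = np.where(fell, rng.normal(14, 3, 30), rng.normal(10, 2, 30))   # invented TUG times (s)

print(sens_spec(tug, fell, 12.0))
for cutoff in (10, 11, 12, 13, 14):
    print(cutoff, sens_spec(tug, fell, cutoff))
```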

  5. Dynamic Hierarchical Energy-Efficient Method Based on Combinatorial Optimization for Wireless Sensor Networks

    PubMed Central

    Tang, Hongying; Cheng, Yongbo; Zhao, Qin; Li, Baoqing; Yuan, Xiaobing

    2017-01-01

    Routing protocols based on topology control are significantly important for improving network longevity in wireless sensor networks (WSNs). Traditionally, some WSN routing protocols distribute uneven network traffic load to sensor nodes, which is not optimal for improving network longevity. Differently to conventional WSN routing protocols, we propose a dynamic hierarchical protocol based on combinatorial optimization (DHCO) to balance energy consumption of sensor nodes and to improve WSN longevity. For each sensor node, the DHCO algorithm obtains the optimal route by establishing a feasible routing set instead of selecting the cluster head or the next hop node. The process of obtaining the optimal route can be formulated as a combinatorial optimization problem. Specifically, the DHCO algorithm is carried out by the following procedures. It employs a hierarchy-based connection mechanism to construct a hierarchical network structure in which each sensor node is assigned to a special hierarchical subset; it utilizes the combinatorial optimization theory to establish the feasible routing set for each sensor node, and takes advantage of the maximum–minimum criterion to obtain their optimal routes to the base station. Various results of simulation experiments show effectiveness and superiority of the DHCO algorithm in comparison with state-of-the-art WSN routing algorithms, including low-energy adaptive clustering hierarchy (LEACH), hybrid energy-efficient distributed clustering (HEED), genetic protocol-based self-organizing network clustering (GASONeC), and double cost function-based routing (DCFR) algorithms. PMID:28753962

  6. TH-A-BRF-05: MRI of Individual Lymph Nodes to Guide Regional Breast Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heijst, T van; Asselen, B van; Lagendijk, J

    2014-06-15

    Purpose: In regional radiotherapy (RT) for breast-cancer patients, direct visualization of individual lymph nodes (LNs) may reduce target volumes and result in lower toxicity (i.e. reduced radiation pneumonitis, arm edema, arm morbidity), relative to standard CT-based delineations. To this end, newly designed magnetic resonance imaging (MRI) sequences were optimized and assessed qualitatively and quantitatively. Methods: In ten healthy female volunteers, a scanning protocol was developed and optimized. Coronal images were acquired in the supine RT position on a wedge board on a 1.5 T Ingenia (Philips) wide-bore MRI. In four volunteers the optimized MRI protocol was applied, including a 3-dimensional (3D) T1-weighted (T1w) fast-field-echo (FFE). T2w sequences, including 3D FFE, 3D and 2D fast spin echo (FSE), and diffusion-weighted single-shot echo-planar imaging (DWI), were also performed. Several fat-suppression techniques were used. Qualitative evaluation parameters included LN contrast, motion susceptibility, visibility of anatomical structures, and fat suppression. The number of visible axillary and supraclavicular LNs was also determined. Results: T1 FFE, insensitive to motion, lacked contrast of LNs, which often blended in with soft tissue and blood. T2 FFE showed high contrast, but some LNs were obscured due to motion. Both 2D and 3D FSE were motion-insensitive with high contrast, although some blood remained visible. 2D FSE showed more anatomical details, while in 3D FSE some blurring occurred. DWI showed high LN contrast, but suffered from geometric distortions and low resolution. Fat suppression by mDixon was the most reliable in regions with magnetic-field inhomogeneities. The FSE sequences showed the highest sensitivity for LN detection. Conclusion: MRI of regional LNs was achieved in volunteers. The FSE techniques were robust and the most sensitive. Our optimized MRI sequences can facilitate direct delineation of individual LNs. This can result in smaller target volumes and reduced toxicity in regional RT compared to standard CT planning.

  7. Quantitative evaluation of multi-parametric MR imaging marker changes post-laser interstitial ablation therapy (LITT) for epilepsy

    NASA Astrophysics Data System (ADS)

    Tiwari, Pallavi; Danish, Shabbar; Wong, Stephen; Madabhushi, Anant

    2013-03-01

    Laser-induced interstitial thermal therapy (LITT) has recently emerged as a new, less invasive alternative to craniotomy for treating epilepsy, allowing focused delivery of laser energy, monitored in real time by MRI, for precise ablation of the epileptogenic focus. Despite being minimally invasive, the effects of laser ablation on the epileptogenic foci (reflected by changes in MR imaging markers post-LITT) are currently unknown. In this work, we present a quantitative framework for evaluating LITT-related changes by quantifying per-voxel changes in MR imaging markers, which may be more reflective of local treatment-related changes (TRC) that occur post-LITT than the standard volumetric analysis, which monitors a more global volume change across pre- and post-LITT MRI. Our framework focuses on three objectives: (a) development of temporal MRI signatures that characterize TRC corresponding to patients with seizure freedom, by comparing differences in MR imaging markers and monitoring them over time; (b) identification of the optimal time point when early LITT-induced effects (such as edema and mass effect) subside, by monitoring TRC at subsequent time points post-LITT; and (c) identification of the contributions of individual MRI protocols towards characterizing LITT TRC for epilepsy, by identifying MR markers that change most dramatically over time and using their individual contributions to create a weighted MP-MRI temporal profile that characterizes TRC better than any individual imaging marker. A cohort of patients was monitored at different time points post-LITT via MP-MRI involving T1-w, T2-w, T2-GRE, T2-FLAIR, and apparent diffusion coefficient (ADC) protocols. After affine registration of individual MRI protocols to a reference MRI protocol pre-LITT, differences in individual MR markers are computed on a per-voxel basis at different time points, with respect to the baseline (pre-LITT) MRI as well as across subsequent time points. A time-dependent MRI profile corresponding to successful (seizure-free) outcomes is then created that captures changes in individual MR imaging markers over time. Our preliminary analysis on two patient studies suggests that (a) LITT-related changes (attributed to swelling and edema) appear to subside within 4 weeks post-LITT; (b) ADC may be more sensitive for evaluating early TRC (up to 3 months), T1-w may be more sensitive in evaluating early delayed TRC (1 month, 3 months), while T2-w and T2-FLAIR appeared to be more sensitive in identifying late TRC (around 6 months post-LITT) compared to the other MRI protocols under evaluation. T2-GRE was found to be only nominally sensitive in identifying TRC at any follow-up time point post-LITT. The framework presented in this work thus serves as an important precursor to a comprehensive treatment evaluation framework that can be used to identify sensitive MR markers corresponding to patient response (seizure freedom or seizure recurrence), with the ultimate objective of making prognostic predictions about patient outcome post-LITT.
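
    A minimal sketch of the per-voxel TRC computation described above, assuming the follow-up volumes have already been affinely registered and intensity-standardized to the pre-LITT reference. The array shapes, protocol names, and time-point labels are illustrative placeholders, not the authors' pipeline.

    ```python
    import numpy as np

    # Per-voxel treatment-related-change (TRC) maps: for each MR protocol and
    # each follow-up time point, subtract the baseline (pre-LITT) volume.

    def per_voxel_trc(baseline, followups):
        """baseline: dict protocol name -> 3D numpy array (registered, standardized).
        followups: dict time point -> dict protocol name -> 3D numpy array.
        Returns dict time point -> protocol -> per-voxel difference map."""
        return {
            t: {proto: followups[t][proto] - baseline[proto] for proto in baseline}
            for t in followups
        }

    # Toy example with random volumes standing in for T2-FLAIR and ADC maps.
    rng = np.random.default_rng(0)
    base = {"T2-FLAIR": rng.normal(size=(4, 4, 4)), "ADC": rng.normal(size=(4, 4, 4))}
    post = {"1mo": {p: v + 0.1 for p, v in base.items()}}
    diff = per_voxel_trc(base, post)
    print(diff["1mo"]["ADC"].mean())  # ~0.1 for this toy shift
    ```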

  8. Optimization of coronary attenuation in coronary computed tomography angiography using diluted contrast material.

    PubMed

    Kawaguchi, Naoto; Kurata, Akira; Kido, Teruhito; Nishiyama, Yoshiko; Kido, Tomoyuki; Miyagawa, Masao; Ogimoto, Akiyoshi; Mochizuki, Teruhito

    2014-01-01

    The purpose of this study was to evaluate a personalized protocol with diluted contrast material (CM) for coronary computed tomography angiography (CTA). One hundred patients with suspected coronary artery disease underwent retrospective electrocardiogram-gated coronary CTA on a 256-slice multidetector-row CT scanner. In the diluted CM protocol (n=50), the optimal scan timing and CM dilution rate were determined by the timing bolus scan, with 20% CM dilution (5 ml/s for 10 s) being considered suitable to achieve the target arterial attenuation of 350 Hounsfield units (HU). In the body weight (BW)-adjusted protocol (n=50, 222 mg iodine/kg), only the optimal scan timing was determined by the timing bolus scan. The injection rate and volume in the timing bolus scan and real scan were identical between the 2 protocols. We compared the means and variations in coronary attenuation between the 2 protocols. Coronary attenuation (mean±SD) in the diluted CM and BW-adjusted protocols was 346.1±23.9 HU and 298.8±45.2 HU, respectively. The diluted CM protocol provided significantly higher coronary attenuation and lower variance than did the BW-adjusted protocol (P<0.05 for each). The diluted CM protocol facilitates more uniform attenuation on coronary CTA in comparison with the BW-adjusted protocol.
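
    An illustrative sketch only: one plausible way a per-patient dilution rate could be chosen from a timing-bolus measurement, assuming coronary enhancement scales roughly linearly with iodine concentration. The function, its inputs, and the example numbers are assumptions; the exact rule used in the study is not reproduced here.

    ```python
    # Estimate the CM fraction needed to hit a target attenuation, given the
    # enhancement measured with a diluted timing bolus (linear-scaling assumption).

    def choose_dilution(test_bolus_hu, baseline_hu, test_dilution, target_hu=350.0):
        """test_bolus_hu: peak coronary attenuation measured with the timing bolus (HU).
        baseline_hu: unenhanced attenuation (HU).
        test_dilution: CM fraction used for the timing bolus (e.g. 0.20 for 20%).
        Returns the CM fraction estimated to reach the target attenuation."""
        enhancement_per_unit = (test_bolus_hu - baseline_hu) / test_dilution
        return (target_hu - baseline_hu) / enhancement_per_unit

    print(round(choose_dilution(test_bolus_hu=180.0, baseline_hu=40.0, test_dilution=0.20), 3))
    # -> 0.443 (i.e. ~44% CM estimated to reach 350 HU for this hypothetical patient)
    ```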

  9. A Power-Optimized Cooperative MAC Protocol for Lifetime Extension in Wireless Sensor Networks.

    PubMed

    Liu, Kai; Wu, Shan; Huang, Bo; Liu, Feng; Xu, Zhen

    2016-10-01

    In wireless sensor networks, in order to satisfy the requirement of long working time for energy-limited nodes, we need to design an energy-efficient and lifetime-extending medium access control (MAC) protocol. In this paper, a node cooperation mechanism, in which one or more nodes with higher channel gain and sufficient residual energy help a sender relay its data packets to its recipient, is employed to achieve this objective. We first propose a transmission power optimization algorithm that prolongs network lifetime by optimizing the transmission powers of the sender and its cooperative nodes to maximize their minimum residual energy after data packet transmission. Based on this algorithm, we propose a corresponding power-optimized cooperative MAC protocol. A cooperative node contention mechanism is designed to ensure that the sender can effectively select a group of cooperative nodes with the lowest energy consumption and the best channel quality for cooperative transmissions, further improving energy efficiency. Simulation results show that, compared to a typical MAC protocol with direct transmission and an energy-efficient cooperative MAC protocol, the proposed cooperative MAC protocol efficiently improves energy efficiency and extends network lifetime.
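
    A hypothetical sketch of the max-min residual-energy idea behind the power optimization step: split the required transmit energy between the sender and one cooperative relay so that the smaller of the two residual energies after transmission is as large as possible. The real protocol optimizes over channel gains and multiple relays; this only illustrates the criterion, with made-up numbers.

    ```python
    # Grid search over the sender's share of the total transmit energy,
    # maximizing the minimum residual energy of sender and relay.

    def split_energy_max_min(e_sender, e_relay, total_energy_needed, steps=1000):
        best = None
        for i in range(steps + 1):
            share = i / steps                      # fraction carried by the sender
            r_sender = e_sender - share * total_energy_needed
            r_relay = e_relay - (1.0 - share) * total_energy_needed
            if r_sender < 0 or r_relay < 0:
                continue                           # infeasible split
            worst = min(r_sender, r_relay)
            if best is None or worst > best[0]:
                best = (worst, share)
        return best  # (minimum residual energy, sender's share)

    print(split_energy_max_min(e_sender=2.0, e_relay=5.0, total_energy_needed=3.0))
    # -> (2.0, 0.0): the better-endowed relay should carry the load here
    ```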

  10. A Power-Optimized Cooperative MAC Protocol for Lifetime Extension in Wireless Sensor Networks

    PubMed Central

    Liu, Kai; Wu, Shan; Huang, Bo; Liu, Feng; Xu, Zhen

    2016-01-01

    In wireless sensor networks, in order to satisfy the requirement of long working time for energy-limited nodes, we need to design an energy-efficient and lifetime-extending medium access control (MAC) protocol. In this paper, a node cooperation mechanism, in which one or more nodes with higher channel gain and sufficient residual energy help a sender relay its data packets to its recipient, is employed to achieve this objective. We first propose a transmission power optimization algorithm that prolongs network lifetime by optimizing the transmission powers of the sender and its cooperative nodes to maximize their minimum residual energy after data packet transmission. Based on this algorithm, we propose a corresponding power-optimized cooperative MAC protocol. A cooperative node contention mechanism is designed to ensure that the sender can effectively select a group of cooperative nodes with the lowest energy consumption and the best channel quality for cooperative transmissions, further improving energy efficiency. Simulation results show that, compared to a typical MAC protocol with direct transmission and an energy-efficient cooperative MAC protocol, the proposed cooperative MAC protocol efficiently improves energy efficiency and extends network lifetime. PMID:27706079

  11. Optimization of oligonucleotide arrays and RNA amplification protocols for analysis of transcript structure and alternative splicing.

    PubMed

    Castle, John; Garrett-Engele, Phil; Armour, Christopher D; Duenwald, Sven J; Loerch, Patrick M; Meyer, Michael R; Schadt, Eric E; Stoughton, Roland; Parrish, Mark L; Shoemaker, Daniel D; Johnson, Jason M

    2003-01-01

    Microarrays offer a high-resolution means for monitoring pre-mRNA splicing on a genomic scale. We have developed a novel, unbiased amplification protocol that permits labeling of entire transcripts. Also, hybridization conditions, probe characteristics, and analysis algorithms were optimized for detection of exons, exon-intron edges, and exon junctions. These optimized protocols can be used to detect small variations and isoform mixtures, map the tissue specificity of known human alternative isoforms, and provide a robust, scalable platform for high-throughput discovery of alternative splicing.

  12. Optimization of oligonucleotide arrays and RNA amplification protocols for analysis of transcript structure and alternative splicing

    PubMed Central

    Castle, John; Garrett-Engele, Phil; Armour, Christopher D; Duenwald, Sven J; Loerch, Patrick M; Meyer, Michael R; Schadt, Eric E; Stoughton, Roland; Parrish, Mark L; Shoemaker, Daniel D; Johnson, Jason M

    2003-01-01

    Microarrays offer a high-resolution means for monitoring pre-mRNA splicing on a genomic scale. We have developed a novel, unbiased amplification protocol that permits labeling of entire transcripts. Also, hybridization conditions, probe characteristics, and analysis algorithms were optimized for detection of exons, exon-intron edges, and exon junctions. These optimized protocols can be used to detect small variations and isoform mixtures, map the tissue specificity of known human alternative isoforms, and provide a robust, scalable platform for high-throughput discovery of alternative splicing. PMID:14519201

  13. A DNA fingerprinting procedure for ultra high-throughput genetic analysis of insects.

    PubMed

    Schlipalius, D I; Waldron, J; Carroll, B J; Collins, P J; Ebert, P R

    2001-12-01

    Existing procedures for the generation of polymorphic DNA markers are not optimal for insect studies, in which the organisms are often tiny and background molecular information is often non-existent. We have used a new high-throughput DNA marker generation protocol called randomly amplified DNA fingerprints (RAF) to analyse the genetic variability in three separate strains of the stored grain pest, Rhyzopertha dominica. This protocol is quick, robust and reliable even though it requires minimal sample preparation, minute amounts of DNA and no prior molecular analysis of the organism. Arbitrarily selected oligonucleotide primers routinely produced approximately 50 scoreable polymorphic DNA markers between individuals of three independent field isolates of R. dominica. Multivariate cluster analysis using forty-nine arbitrarily selected polymorphisms generated from a single primer reliably separated individuals into three clades corresponding to their geographical origin. The resulting clades were quite distinct, with an average genetic difference of 37.5 ± 6.0% between clades and of 21.0 ± 7.1% between individuals within clades. As a prelude to future gene mapping efforts, we have also assessed the performance of RAF under conditions commonly used in gene mapping. In this analysis, fingerprints from pooled DNA samples accurately and reproducibly reflected RAF profiles obtained from individual DNA samples that had been combined to create the bulked samples.

  14. FDA approved drugs complexed to their targets: evaluating pose prediction accuracy of docking protocols.

    PubMed

    Bohari, Mohammed H; Sastry, G Narahari

    2012-09-01

    Efficient drug discovery programs can be designed by utilizing the existing pool of knowledge from already approved drugs. One way to achieve this is by repositioning drugs approved for one indication to newer indications. The complex of a drug with its target gives fundamental insight into molecular recognition and a clear understanding of the putative binding site. Five popular docking protocols (Glide, Gold, FlexX, Cdocker and LigandFit) have been evaluated on a dataset of 199 FDA-approved drug-target complexes for their accuracy in predicting the experimental pose. Performance for all the protocols is assessed at default settings, with a root mean square deviation (RMSD) between the experimental ligand pose and the docked pose of less than 2.0 Å as the success criterion for pose prediction. Glide (38.7 %) is found to be the most accurate for the top-ranked pose and Cdocker (58.8 %) for the top-RMSD pose. Ligand flexibility is a major bottleneck causing docking protocols to fail to correctly predict the pose. The resolution of the crystal structure shows an inverse relationship with docking protocol performance. All the protocols perform optimally when a balance of hydrophilic and hydrophobic interactions, or dominant hydrophilic interactions, exists. Overall, across 16 different target classes, hydrophobic interactions dominate in the binding site; maximum success is achieved by all the docking protocols in the nuclear hormone receptor class, while performance for the rest of the classes varies by individual protocol.
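
    A minimal sketch of the success criterion used above (not the evaluated docking software): a docked pose counts as a success if its RMSD to the experimental ligand pose is below 2.0 Å, and the success rate is the fraction of complexes meeting that cutoff. Atom matching and symmetry handling are omitted for brevity.

    ```python
    import numpy as np

    def pose_rmsd(coords_pred, coords_ref):
        """coords_*: (N, 3) arrays of matched ligand atom coordinates (Å)."""
        diff = np.asarray(coords_pred) - np.asarray(coords_ref)
        return float(np.sqrt((diff ** 2).sum(axis=1).mean()))

    def success_rate(rmsd_values, cutoff=2.0):
        rmsd_values = np.asarray(rmsd_values)
        return float((rmsd_values < cutoff).mean())

    # Toy two-atom ligand shifted by 0.5 Å along x.
    ref = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
    docked = ref + np.array([0.5, 0.0, 0.0])
    print(round(pose_rmsd(docked, ref), 2))      # 0.5

    # Toy example: three docked complexes with RMSDs of 0.8, 1.9 and 3.4 Å.
    print(success_rate([0.8, 1.9, 3.4]))         # 0.666...
    ```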

  15. Self-Configuration and Self-Optimization Process in Heterogeneous Wireless Networks

    PubMed Central

    Guardalben, Lucas; Villalba, Luis Javier García; Buiati, Fábio; Sobral, João Bosco Mangueira; Camponogara, Eduardo

    2011-01-01

    Self-organization in Wireless Mesh Networks (WMN) is an emerging research area, which is becoming important due to the increasing number of nodes in a network. Consequently, the manual configuration of nodes is either impossible or highly costly, so it is desirable for the nodes to be able to configure themselves. In this paper, we propose an alternative architecture for self-organization of WMN based on the Optimized Link State Routing Protocol (OLSR) and the Ad hoc On-demand Distance Vector (AODV) routing protocols, as well as on software-agent technology. We argue that the proposed self-optimization and self-configuration modules increase network throughput, reduce transmission delay and network load, and decrease HELLO-message traffic as the network scales. By simulation analysis, we conclude that the self-optimization and self-configuration mechanisms can significantly improve the performance of the OLSR and AODV protocols in comparison to the baseline protocols analyzed. PMID:22346584

  16. Self-configuration and self-optimization process in heterogeneous wireless networks.

    PubMed

    Guardalben, Lucas; Villalba, Luis Javier García; Buiati, Fábio; Sobral, João Bosco Mangueira; Camponogara, Eduardo

    2011-01-01

    Self-organization in Wireless Mesh Networks (WMN) is an emerging research area, which is becoming important due to the increasing number of nodes in a network. Consequently, the manual configuration of nodes is either impossible or highly costly, so it is desirable for the nodes to be able to configure themselves. In this paper, we propose an alternative architecture for self-organization of WMN based on the Optimized Link State Routing Protocol (OLSR) and the Ad hoc On-demand Distance Vector (AODV) routing protocols, as well as on software-agent technology. We argue that the proposed self-optimization and self-configuration modules increase network throughput, reduce transmission delay and network load, and decrease HELLO-message traffic as the network scales. By simulation analysis, we conclude that the self-optimization and self-configuration mechanisms can significantly improve the performance of the OLSR and AODV protocols in comparison to the baseline protocols analyzed.

  17. Bedside Diagnosis of Dysphagia: A Systematic Review

    PubMed Central

    O’Horo, John C.; Rogus-Pulia, Nicole; Garcia-Arguello, Lisbeth; Robbins, JoAnne; Safdar, Nasia

    2015-01-01

    Background Dysphagia is associated with aspiration, pneumonia and malnutrition, but remains challenging to identify at the bedside. A variety of exam protocols and maneuvers are commonly used, but the efficacy of these maneuvers is highly variable. Methods We conducted a comprehensive search of seven databases, including MEDLINE, EMBASE and Scopus, from each database’s inception through June 5th, 2013. Studies reporting the diagnostic performance of a bedside examination maneuver compared to a reference gold standard (videofluoroscopic swallow study [VFSS] or flexible endoscopic evaluation of swallowing with sensory testing [FEEST]) were included for analysis. From each study, data were abstracted on the type of diagnostic method and reference standard, study population and inclusion/exclusion characteristics, study design, and prediction of aspiration. Results The search strategy identified 38 articles meeting inclusion criteria. Overall, most bedside examinations lacked sufficient sensitivity to be used for screening purposes across all patient populations examined. Individual studies found dysphonia assessments, abnormal pharyngeal sensation assessments, dual-axis accelerometry, and one description of water swallow testing to be sensitive tools, but none were reported as consistently sensitive. A preponderance of the identified studies was in post-stroke adults, limiting the generalizability of results. Conclusions No bedside screening protocol has been shown to provide adequate predictive value for the presence of aspiration. Several individual exam maneuvers demonstrated reasonable sensitivity, but the reproducibility and consistency of these protocols have not been established. More research is needed to design an optimal protocol for dysphagia detection. PMID:25581840

  18. Intelligent QoS routing algorithm based on improved AODV protocol for Ad Hoc networks

    NASA Astrophysics Data System (ADS)

    Huibin, Liu; Jun, Zhang

    2016-04-01

    Mobile ad hoc networks play an increasingly important part in disaster relief, military battlefields and scientific exploration. However, routing in such networks is increasingly difficult due to their inherent structure. This paper proposes an improved cuckoo-search-based Ad hoc On-Demand Distance Vector routing protocol (CSAODV). It carefully designs the optimal-route calculation method used by the protocol and the transmission mechanism for communication packets. In the calculation of the optimal route by the cuckoo search (CS) algorithm, a QoS constraint is added so that the resulting route conforms to specified bandwidth and delay requirements, and a balance is obtained among computation cost, bandwidth and delay. NS2 simulation software is used to test the protocol's performance in three scenarios and to validate the feasibility and validity of the CSAODV protocol. The results show that the CSAODV routing protocol adapts better to changes in network topology than the AODV protocol: it effectively improves the packet delivery fraction, reduces network transmission delay, reduces the extra burden that control information places on the network, and improves routing efficiency.
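
    The QoS-constrained objective can be illustrated with a simplified route-fitness function in Python: a route is feasible only if its bottleneck bandwidth and end-to-end delay meet the requested constraints, and among feasible routes lower delay is preferred. The penalty values and link metrics are illustrative assumptions, not the paper's formulation, and the cuckoo search itself is omitted.

    ```python
    # QoS-aware route fitness of the kind a cuckoo-search-based AODV variant
    # could minimize; infeasible routes receive a large constraint penalty.

    def route_fitness(route_links, min_bandwidth, max_delay, penalty=1e6):
        """route_links: list of (bandwidth_Mbps, delay_ms) tuples, one per hop."""
        bottleneck = min(bw for bw, _ in route_links)
        total_delay = sum(d for _, d in route_links)
        cost = total_delay
        if bottleneck < min_bandwidth:
            cost += penalty          # violates the bandwidth constraint
        if total_delay > max_delay:
            cost += penalty          # violates the delay constraint
        return cost

    candidates = {
        "r1": [(10, 20), (8, 15)],           # 35 ms, bottleneck 8 Mbps
        "r2": [(20, 10), (2, 5), (20, 10)],  # 25 ms, but bottleneck 2 Mbps
    }
    best = min(candidates, key=lambda r: route_fitness(candidates[r], min_bandwidth=5, max_delay=60))
    print(best)  # r1: r2 is faster but fails the bandwidth constraint
    ```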

  19. Individualized drug dosing using RBF-Galerkin method: Case of anemia management in chronic kidney disease.

    PubMed

    Mirinejad, Hossein; Gaweda, Adam E; Brier, Michael E; Zurada, Jacek M; Inanc, Tamer

    2017-09-01

    Anemia is a common comorbidity in patients with chronic kidney disease (CKD) and is frequently associated with a decreased physical component of quality of life, as well as adverse cardiovascular events. Current treatment methods for renal anemia are mostly population-based approaches treating individual patients with a one-size-fits-all model. However, FDA recommendations stipulate individualized anemia treatment with precise control of the hemoglobin concentration and minimal drug utilization. In accordance with these recommendations, this work presents an individualized drug dosing approach to anemia management by leveraging the theory of optimal control. A Multiple Receding Horizon Control (MRHC) approach based on the RBF-Galerkin optimization method is proposed for individualized anemia management in CKD patients. Recently developed by the authors, the RBF-Galerkin method uses radial basis function approximation along with Galerkin error projection to solve constrained optimal control problems numerically. The proposed approach is applied to generate optimal dosing recommendations for individual patients. Performance of the proposed approach (MRHC) is compared in silico to that of a population-based anemia management protocol and an individualized multiple model predictive control method for two case scenarios: hemoglobin measurement with and without observational errors. The in silico comparison indicates that the hemoglobin concentration under MRHC shows the least variation among the methods, especially in the presence of measurement errors. In addition, the average achieved hemoglobin level from MRHC is significantly closer to the target hemoglobin than that of the other two methods, according to an analysis of variance (ANOVA) statistical test. Furthermore, drug dosages recommended by MRHC are more stable and accurate and reach the steady-state value notably faster than those generated by the other two methods. The proposed method is highly efficient for the control of hemoglobin level while providing accurate dosage adjustments in the treatment of CKD anemia. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. PFIM 4.0, an extended R program for design evaluation and optimization in nonlinear mixed-effect models.

    PubMed

    Dumont, Cyrielle; Lestini, Giulia; Le Nagard, Hervé; Mentré, France; Comets, Emmanuelle; Nguyen, Thu Thuy; Group, For The Pfim

    2018-03-01

    Nonlinear mixed-effect models (NLMEMs) are increasingly used for the analysis of longitudinal studies during drug development. When designing these studies, the expected Fisher information matrix (FIM) can be used instead of performing time-consuming clinical trial simulations. The function PFIM is the first tool for design evaluation and optimization that has been developed in R. In this article, we present an extended version, PFIM 4.0, which includes several new features. Compared with version 3.0, PFIM 4.0 includes a more complete pharmacokinetic/pharmacodynamic library of models and accommodates models including additional random effects for inter-occasion variability as well as discrete covariates. A new input method has been added to specify user-defined models through an R function. Optimization can be performed assuming some fixed parameters or some fixed sampling times. New outputs have been added regarding the FIM such as eigenvalues, conditional numbers, and the option of saving the matrix obtained after evaluation or optimization. Previously obtained results, which are summarized in a FIM, can be taken into account in evaluation or optimization of one-group protocols. This feature enables the use of PFIM for adaptive designs. The Bayesian individual FIM has been implemented, taking into account a priori distribution of random effects. Designs for maximum a posteriori Bayesian estimation of individual parameters can now be evaluated or optimized and the predicted shrinkage is also reported. It is also possible to visualize the graphs of the model and the sensitivity functions without performing evaluation or optimization. The usefulness of these approaches and the simplicity of use of PFIM 4.0 are illustrated by two examples: (i) an example of designing a population pharmacokinetic study accounting for previous results, which highlights the advantage of adaptive designs; (ii) an example of Bayesian individual design optimization for a pharmacodynamic study, showing that the Bayesian individual FIM can be a useful tool in therapeutic drug monitoring, allowing efficient prediction of estimation precision and shrinkage for individual parameters. PFIM 4.0 is a useful tool for design evaluation and optimization of longitudinal studies in pharmacometrics and is freely available at http://www.pfim.biostat.fr. Copyright © 2018 Elsevier B.V. All rights reserved.
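
    PFIM itself is an R package; as a language-neutral illustration of the expected Fisher information idea it builds on, the Python sketch below approximates an individual FIM for a toy one-compartment pharmacokinetic model with finite-difference sensitivities. The model, parameter values, noise level, and sampling times are all hypothetical and unrelated to PFIM's internals.

    ```python
    import numpy as np

    # Individual FIM for a nonlinear model f(t, theta) with additive noise of
    # variance sigma^2: FIM = sum_t (1/sigma^2) * grad_theta f * grad_theta f^T,
    # with gradients approximated by central finite differences.

    def fisher_information(f, theta, times, sigma, eps=1e-6):
        theta = np.asarray(theta, dtype=float)
        p = theta.size
        fim = np.zeros((p, p))
        for t in times:
            grad = np.zeros(p)
            for i in range(p):
                step = np.zeros(p)
                step[i] = eps
                grad[i] = (f(t, theta + step) - f(t, theta - step)) / (2 * eps)
            fim += np.outer(grad, grad) / sigma**2
        return fim

    # Toy one-compartment oral absorption model (theta = [ka, V, CL], fixed dose).
    def pk_model(t, th, dose=100.0):
        ka, v, cl = th
        ke = cl / v
        return dose * ka / (v * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

    fim = fisher_information(pk_model, theta=[1.0, 8.0, 0.5], times=[0.5, 2, 6, 12, 24], sigma=0.5)
    print(np.round(np.sqrt(np.diag(np.linalg.inv(fim))), 3))  # expected SEs for ka, V, CL
    ```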

  1. Optimization of intra-voxel incoherent motion imaging at 3.0 Tesla for fast liver examination.

    PubMed

    Leporq, Benjamin; Saint-Jalmes, Hervé; Rabrait, Cecile; Pilleul, Frank; Guillaud, Olivier; Dumortier, Jérôme; Scoazec, Jean-Yves; Beuf, Olivier

    2015-05-01

    To optimize a multi-b-value MR protocol for fast intra-voxel incoherent motion (IVIM) imaging of the liver at 3.0 Tesla. A comparison of four different acquisition protocols was carried out based on estimated IVIM parameters (DSlow, DFast, and f) and the ADC in 25 healthy volunteers. The effects of respiratory gating compared with free-breathing acquisition, the diffusion gradient scheme (simultaneous or sequential), and the use of weighted averaging for different b-values were assessed. An optimization study based on Cramer-Rao lower bound theory was then performed to minimize the number of b-values required for suitable quantification. The duration-optimized protocol was evaluated in 12 patients with chronic liver diseases. No significant differences in IVIM parameters were observed between the assessed protocols. Only four b-values (0, 12, 82, and 1310 s/mm²) were found necessary to perform a suitable quantification of IVIM parameters. DSlow and DFast significantly decreased between nonadvanced and advanced fibrosis (P < 0.05 and P < 0.01) whereas perfusion fraction and ADC variations were not found to be significant. Results showed that IVIM could be performed in free breathing, with a weighted-averaging procedure, a simultaneous diffusion gradient scheme and only four optimized b-values (0, 10, 80, and 800), reducing scan duration by a factor of nine compared with a nonoptimized protocol. Preliminary results have shown that parameters such as DSlow and DFast based on the optimized IVIM protocol can be relevant biomarkers to distinguish between nonadvanced and advanced fibrosis. © 2014 Wiley Periodicals, Inc.
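
    A minimal sketch of the bi-exponential IVIM signal model underlying the protocol above (perfusion fraction f, pseudo-diffusion DFast, tissue diffusion DSlow), fitted to signals normalized by the b=0 image at the four optimized b-values. This is a toy least-squares fit, not the authors' estimation pipeline; the parameter values are merely plausible liver numbers.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Normalized IVIM model: S(b)/S(0) = f*exp(-b*DFast) + (1-f)*exp(-b*DSlow)
    def ivim(b, f, d_fast, d_slow):
        return f * np.exp(-b * d_fast) + (1.0 - f) * np.exp(-b * d_slow)

    b_values = np.array([0.0, 10.0, 80.0, 800.0])  # s/mm^2
    true_f, true_dfast, true_dslow = 0.25, 0.05, 0.0011
    signal = ivim(b_values, true_f, true_dfast, true_dslow)
    signal += np.random.default_rng(1).normal(0.0, 0.003, b_values.size)  # mild noise

    popt, _ = curve_fit(
        ivim, b_values, signal,
        p0=[0.2, 0.02, 0.001],
        bounds=([0.0, 0.003, 0.0001], [1.0, 0.5, 0.003]),
    )
    print(dict(zip(["f", "DFast", "DSlow"], np.round(popt, 4))))
    ```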

  2. Transparent DNA/RNA Co-extraction Workflow Protocol Suitable for Inhibitor-Rich Environmental Samples That Focuses on Complete DNA Removal for Transcriptomic Analyses

    PubMed Central

    Lim, Natalie Y. N.; Roco, Constance A.; Frostegård, Åsa

    2016-01-01

    Adequate comparisons of DNA and cDNA libraries from complex environments require methods for co-extraction of DNA and RNA due to the inherent heterogeneity of such samples, or risk bias caused by variations in lysis and extraction efficiencies. Still, there are few methods and kits allowing simultaneous extraction of DNA and RNA from the same sample, and the existing ones generally require optimization. The proprietary nature of kit components, however, makes modifications of individual steps in the manufacturer’s recommended procedure difficult. Surprisingly, enzymatic treatments are often performed before purification procedures are complete, which we have identified here as a major problem when seeking efficient genomic DNA removal from RNA extracts. Here, we tested several DNA/RNA co-extraction commercial kits on inhibitor-rich soils, and compared them to a commonly used phenol-chloroform co-extraction method. Since none of the kits/methods co-extracted high-quality nucleic acid material, we optimized the extraction workflow by introducing small but important improvements. In particular, we illustrate the need for extensive purification prior to all enzymatic procedures, with special focus on the DNase digestion step in RNA extraction. These adjustments led to the removal of enzymatic inhibition in RNA extracts and made it possible to reduce genomic DNA to below detectable levels as determined by quantitative PCR. Notably, we confirmed that DNase digestion may not be uniform in replicate extraction reactions, thus the analysis of “representative samples” is insufficient. The modular nature of our workflow protocol allows optimization of individual steps. It also increases focus on additional purification procedures prior to enzymatic processes, in particular DNases, yielding genomic DNA-free RNA extracts suitable for metatranscriptomic analysis. PMID:27803690

  3. Understanding protocol performance: impact of test performance.

    PubMed

    Turner, Robert G

    2013-01-01

    This is the second of two articles that examine the factors that determine protocol performance. The objective of these articles is to provide a general understanding of protocol performance that can be used to estimate performance, establish limits on performance, decide if a protocol is justified, and ultimately select a protocol. The first article was concerned with protocol criterion and test correlation. It demonstrated the advantages and disadvantages of different criteria when all tests had the same performance. It also examined the impact of increasing test correlation on protocol performance and the characteristics of the different criteria. The purpose here is to examine the impact on protocol performance when the individual tests in a protocol differ in performance. This is evaluated for different criteria and test correlations, and the results of the two articles are combined and summarized. A mathematical model is used to calculate protocol performance for different protocol criteria and test correlations when there are small to large variations in the performance of the individual tests in the protocol. The performance of the individual tests that make up a protocol has a significant impact on the performance of the protocol. As expected, the better the performance of the individual tests, the better the performance of the protocol. Many of the characteristics of the different criteria are relatively independent of the variation in the performance of the individual tests. However, increasing test variation degrades some of the criteria's advantages and causes a new disadvantage to appear. This negative impact grows as test variation increases and as more tests are added to the protocol. The best protocol performance is obtained when individual tests are uncorrelated and have the same performance. In general, the greater the variation in the performance of tests in the protocol, the more detrimental this variation is to protocol performance. Since this negative impact increases as more tests are added to the protocol, greater test variation argues for using fewer tests in the protocol. American Academy of Audiology.
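
    To make the criterion trade-off concrete, the sketch below computes protocol sensitivity and specificity for two common criteria under the simplifying assumption of statistically independent tests. It is an illustration of the general idea, not the article's mathematical model (which also handles test correlation).

    ```python
    from math import prod

    # "strict" criterion: the protocol is positive only if every test is positive.
    # "loose" criterion: the protocol is positive if any test is positive.

    def protocol_performance(sensitivities, specificities, criterion="strict"):
        if criterion == "strict":
            sens = prod(sensitivities)
            spec = 1.0 - prod(1.0 - sp for sp in specificities)
        elif criterion == "loose":
            sens = 1.0 - prod(1.0 - se for se in sensitivities)
            spec = prod(specificities)
        else:
            raise ValueError("criterion must be 'strict' or 'loose'")
        return sens, spec

    # Two tests with unequal performance: variation drags the strict criterion's
    # sensitivity down and the loose criterion's specificity down.
    print(protocol_performance([0.95, 0.70], [0.90, 0.80], "strict"))  # (0.665, 0.98)
    print(protocol_performance([0.95, 0.70], [0.90, 0.80], "loose"))   # (0.985, 0.72)
    ```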

  4. Optimal approach to quantum communication using dynamic programming.

    PubMed

    Jiang, Liang; Taylor, Jacob M; Khaneja, Navin; Lukin, Mikhail D

    2007-10-30

    Reliable preparation of entanglement between distant systems is an outstanding problem in quantum information science and quantum communication. In practice, this has to be accomplished by noisy channels (such as optical fibers) that generally result in exponential attenuation of quantum signals at large distances. A special class of quantum error correction protocols, quantum repeater protocols, can be used to overcome such losses. In this work, we introduce a method for systematically optimizing existing protocols and developing more efficient protocols. Our approach makes use of a dynamic programming-based searching algorithm, the complexity of which scales only polynomially with the communication distance, letting us efficiently determine near-optimal solutions. We find significant improvements in both the speed and the final-state fidelity for preparing long-distance entangled states.

  5. Cryopreservation of sperm in Grey mullet Mugil cephalus (Linnaeus, 1758).

    PubMed

    Balamurugan, Ramachandran; Munuswamy, Natesan

    2017-10-01

    The aim of this study was to document the effects of cryopreservation on sperm motility and viability in the grey mullet Mugil cephalus. Cryopreservation of sperm was attempted using two extenders, Ringer solution for marine fish (RSMF) and V2 extender (V2E), and the cryoprotectants dimethylacetamide (DMA), dimethylsulfoxide (DMSO), ethylene glycol (EG), glycerol (GLY), propylene glycol (PG) and methanol (MeOH). Cryoprotectants were assessed at different concentrations, individually as well as in combination, with varying equilibration times (10 and 30 min). For optimization of the freezing rate, four freezing protocols (-5, -10, -20 and -30°C/min) were evaluated. After reaching the final temperature, samples were plunged into liquid nitrogen (-196°C) and stored for a week. Samples were subsequently thawed in a water bath at 30°C for assessment of sperm motility and viability. Results indicated that a cryomedium consisting of V2E extender + 10% glycerol, with a dilution ratio of 1:1 (sperm:cryomedium), an equilibration time of 5 to 10 min and a freezing rate of -20°C/min was more desirable than the other factor combinations assessed. Use of this protocol resulted in retaining the greatest sperm motility grade, 3.0±0.0 (50%-80% sperm movement, fast swimming), and 48.19±3.12% sperm viability. The results of the present study therefore provide baseline data for establishing a protocol for sperm cryopreservation in M. cephalus. Further studies are, however, required to optimize the most suitable sperm cryopreservation protocol. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. How to make optimal use of maximal multipartite entanglement in clock synchronization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Changliang; Hofmann, Holger F.

    2014-12-04

    We introduce a multi-party quantum clock synchronization protocol that makes optimal use of the maximal multipartite entanglement of GHZ-type states. The measurement statistics of the protocol are analyzed and the efficiency is evaluated.

  7. SPOT: Optimization Tool for Network Adaptable Security

    NASA Astrophysics Data System (ADS)

    Ksiezopolski, Bogdan; Szalachowski, Pawel; Kotulski, Zbigniew

    Recently we have observed the growth of intelligent applications, especially mobile ones, collectively called e-anything. Implementing these applications requires guaranteeing the security requirements of the cryptographic protocols they use. Traditionally, the protocols have been configured with the strongest possible security mechanisms. Unfortunately, when the application runs on mobile devices, the strongest protection can lead to denial of service for them. The solution to this problem is to introduce quality-of-protection models that scale the protection level to the actual threat level. In this article we introduce an application that manages the protection level of processes in the mobile environment. The Security Protocol Optimizing Tool (SPOT) optimizes the cryptographic protocol and defines the protocol version appropriate to the actual threat level. The architecture of SPOT is presented with a detailed description of the included modules.

  8. Sensitivity regularization of the Cramér-Rao lower bound to minimize B1 nonuniformity effects in quantitative magnetization transfer imaging.

    PubMed

    Boudreau, Mathieu; Pike, G Bruce

    2018-05-07

    To develop and validate a regularization approach for optimizing the B1 insensitivity of the quantitative magnetization transfer (qMT) pool-size ratio (F). An expression describing the impact of B1 inaccuracies on qMT fitting parameters was derived using a sensitivity analysis. To simultaneously optimize for robustness against noise and B1 inaccuracies, the optimization condition was defined as the Cramér-Rao lower bound (CRLB) regularized by the B1-sensitivity expression for the parameter of interest (F). The qMT protocols were iteratively optimized from an initial search space, with and without B1 regularization. Three 10-point qMT protocols (Uniform, CRLB, CRLB+B1 regularization) were compared using Monte Carlo simulations for a wide range of conditions (e.g., SNR, B1 inaccuracies, tissues). The B1-regularized CRLB optimization protocol resulted in the best robustness of F against B1 errors, for a wide range of SNRs and for both white matter and gray matter tissues. For SNR = 100, this protocol resulted in errors of less than 1% in mean F values for B1 errors ranging between -10 and 20%, the range of B1 values typically observed in vivo in the human head at field strengths of 3 T and less. Both CRLB-optimized protocols resulted in the lowest σF values for all SNRs and did not increase in the presence of B1 inaccuracies. This work demonstrates a regularized optimization approach for reducing the sensitivity of qMT parameters, particularly the pool-size ratio (F), to errors in auxiliary measurements (e.g., B1). With substantially less B1 sensitivity predicted for protocols optimized with this method, B1 mapping could even be omitted for qMT studies primarily interested in F. © 2018 International Society for Magnetic Resonance in Medicine.
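
    A schematic sketch of the regularized design criterion described above: each candidate protocol is scored by the CRLB of the pool-size ratio F plus a penalty proportional to the sensitivity of the F estimate to B1 errors. The qMT signal model, its Fisher information, and the B1-sensitivity expression are not reproduced here, so the inputs below are placeholders.

    ```python
    import numpy as np

    def regularized_score(fim, f_index, b1_sensitivity, lam=1.0):
        """fim: Fisher information matrix for one candidate protocol.
        f_index: position of F in the parameter vector.
        b1_sensitivity: model-derived |dF_hat/dB1| for that protocol.
        lam: weight trading noise robustness against B1 robustness."""
        crlb_f = np.linalg.inv(fim)[f_index, f_index]
        return crlb_f + lam * abs(b1_sensitivity)

    def pick_protocol(candidates, f_index=0, lam=1.0):
        """candidates: dict name -> (fim, b1_sensitivity)."""
        return min(
            candidates,
            key=lambda name: regularized_score(candidates[name][0], f_index,
                                               candidates[name][1], lam=lam),
        )

    # Toy numbers: protocol "B" has a slightly larger CRLB for F but is far less
    # sensitive to B1 errors, so the regularized criterion prefers it.
    cands = {
        "A": (np.diag([50.0, 200.0]), 0.30),
        "B": (np.diag([45.0, 180.0]), 0.05),
    }
    print(pick_protocol(cands))  # B
    ```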

  9. Potential Projective Material on the Rorschach: Comparing Comprehensive System Protocols to Their Modeled R-Optimized Administration Counterparts.

    PubMed

    Pianowski, Giselle; Meyer, Gregory J; Villemor-Amaral, Anna Elisa de

    2016-01-01

    Exner (1989) and Weiner (2003) identified 3 types of Rorschach codes that are most likely to contain personally relevant projective material: Distortions, Movement, and Embellishments. We examine how often these types of codes occur in normative data and whether their frequency changes for the 1st, 2nd, 3rd, 4th, or last response to a card. We also examine the impact on these variables of the Rorschach Performance Assessment System's (R-PAS) statistical modeling procedures that convert the distribution of responses (R) from Comprehensive System (CS) administered protocols to match the distribution of R found in protocols obtained using R-optimized administration guidelines. In 2 normative reference databases, the results indicated that about 40% of responses (M = 39.25) have 1 type of code, 15% have 2 types, and 1.5% have all 3 types, with frequencies not changing by response number. In addition, there were no mean differences between the original CS and R-optimized modeled records (M Cohen's d = -0.04 in both databases). When considered alongside findings showing minimal differences between the protocols of people randomly assigned to CS or R-optimized administration, the data suggest R-optimized administration should not alter the extent to which potential projective material is present in a Rorschach protocol.

  10. Optimizing Variational Quantum Algorithms Using Pontryagin’s Minimum Principle

    DOE PAGES

    Yang, Zhi -Cheng; Rahmani, Armin; Shabani, Alireza; ...

    2017-05-18

    We use Pontryagin’s minimum principle to optimize variational quantum algorithms. We show that for a fixed computation time, the optimal evolution has a bang-bang (square pulse) form, both for closed and open quantum systems with Markovian decoherence. Our findings support the choice of evolution ansatz in the recently proposed quantum approximate optimization algorithm. Focusing on the Sherrington-Kirkpatrick spin glass as an example, we find a system-size independent distribution of the duration of pulses, with characteristic time scale set by the inverse of the coupling constants in the Hamiltonian. The optimality of the bang-bang protocols and the characteristic time scale of the pulses provide an efficient parametrization of the protocol and inform the search for effective hybrid (classical and quantum) schemes for tackling combinatorial optimization problems. Moreover, we find that the success rates of our optimal bang-bang protocols remain high even in the presence of weak external noise and coupling to a thermal bath.

  11. Optimizing Variational Quantum Algorithms Using Pontryagin’s Minimum Principle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Zhi -Cheng; Rahmani, Armin; Shabani, Alireza

    We use Pontryagin’s minimum principle to optimize variational quantum algorithms. We show that for a fixed computation time, the optimal evolution has a bang-bang (square pulse) form, both for closed and open quantum systems with Markovian decoherence. Our findings support the choice of evolution ansatz in the recently proposed quantum approximate optimization algorithm. Focusing on the Sherrington-Kirkpatrick spin glass as an example, we find a system-size independent distribution of the duration of pulses, with characteristic time scale set by the inverse of the coupling constants in the Hamiltonian. The optimality of the bang-bang protocols and the characteristic time scale of the pulses provide an efficient parametrization of the protocol and inform the search for effective hybrid (classical and quantum) schemes for tackling combinatorial optimization problems. Moreover, we find that the success rates of our optimal bang-bang protocols remain high even in the presence of weak external noise and coupling to a thermal bath.

  12. Adaptive hybrid optimal quantum control for imprecisely characterized systems.

    PubMed

    Egger, D J; Wilhelm, F K

    2014-06-20

    Optimal quantum control theory carries a huge promise for quantum technology. Its experimental application, however, is often hindered by imprecise knowledge of the input variables, the quantum system's parameters. We show how to overcome this by adaptive hybrid optimal control, using a protocol named Ad-HOC. This protocol combines open- and closed-loop optimal control by first performing a gradient search towards a near-optimal control pulse and then an experimental fidelity estimation with a gradient-free method. For typical settings in solid-state quantum information processing, adaptive hybrid optimal control enhances gate fidelities by an order of magnitude, making optimal control theory applicable and useful.

  13. Use of C-Arm Cone Beam CT During Hepatic Radioembolization: Protocol Optimization for Extrahepatic Shunting and Parenchymal Enhancement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoven, Andor F. van den, E-mail: a.f.vandenhoven@umcutrecht.nl; Prince, Jip F.; Keizer, Bart de

    Purpose: To optimize a C-arm computed tomography (CT) protocol for radioembolization (RE), specifically for extrahepatic shunting and parenchymal enhancement. Materials and Methods: A prospective development study was performed per IDEAL recommendations. A literature-based protocol was applied in patients with unresectable and chemorefractory liver malignancies undergoing angiography before radioembolization. Contrast and scan settings were adjusted stepwise and repeatedly reviewed in a consensus meeting. Afterwards, two independent raters analyzed all scans. A third rater evaluated the SPECT/CT scans as a reference standard for extrahepatic shunting and lack of target segment perfusion. Results: Fifty scans were obtained in 29 procedures. The first protocol, using a 6 s delay and 10 s scan, showed insufficient parenchymal enhancement. In the second protocol, the delay was determined by timing parenchymal enhancement on DSA power injection (median 8 s, range 4–10 s): enhancement improved, but breathing artifacts increased (from 0 to 27 %). Since the third protocol with a 5 s scan decremented subjective image quality, the second protocol was deemed optimal. Median CNR (range) was 1.7 (0.6–3.2), 2.2 (−1.4–4.0), and 2.1 (−0.3–3.0) for protocols 1, 2, and 3 (p = 0.80). Delineation of perfused segments was possible in 57, 73, and 44 % of scans (p = 0.13). In all C-arm CTs combined, the negative predictive value was 95 % for extrahepatic shunting and 83 % for lack of target segment perfusion. Conclusion: An optimized C-arm CT protocol was developed that can be used to detect extrahepatic shunts and non-perfusion of target segments during RE.

  14. Effects of different re-warm up activities in football players' performance.

    PubMed

    Abade, Eduardo; Sampaio, Jaime; Gonçalves, Bruno; Baptista, Jorge; Alves, Alberto; Viana, João

    2017-01-01

    Warm-up routines are commonly used to optimize football performance and prevent injuries. Yet official pre-match protocols may require players to rest passively for approximately 10 to 15 minutes between the warm-up and the beginning of the match. Therefore, the aim of this study was to explore the effect of different re-warm-up activities on the physical performance of football players. Twenty-two Portuguese elite under-19 football players participated in the study, conducted during the competitive season. Different re-warm-up protocols were performed 6 minutes after the same standardized warm-up on 4 consecutive days in a crossover controlled approach: without re-warm-up, eccentric, plyometric, and repeated changes of direction. Vertical jump and sprint performance were tested immediately after the warm-up and 12 minutes after the warm-up. Results showed that repeated changes of direction and plyometrics had beneficial effects on jump and sprint performance. Different practical implications may be drawn from the eccentric protocol, since a vertical jump impairment was observed, suggesting a possibly harmful effect. The absence of re-warm-up activities may be detrimental to players' physical performance. However, the inclusion of a re-warm-up prior to the match is a complex issue, since the manipulation of volume, intensity and recovery may positively or negatively affect subsequent performance. In fact, this exploratory study shows that eccentric exercise may be harmful to physical performance when performed prior to a football match. However, plyometric and repeated changes-of-direction exercises seem to be simple, quick and efficient activities to attenuate losses in vertical jump and sprint capacity after the warm-up. Coaches should aim to develop individually optimal exercise modes in order to optimize physical performance after re-warm-up activities.

  15. Optimized magnetic resonance diffusion protocol for ex-vivo whole human brain imaging with a clinical scanner

    NASA Astrophysics Data System (ADS)

    Scherrer, Benoit; Afacan, Onur; Stamm, Aymeric; Singh, Jolene; Warfield, Simon K.

    2015-03-01

    Diffusion-weighted magnetic resonance imaging (DW-MRI) provides novel insight into the brain to facilitate our understanding of brain connectivity and microstructure. While in-vivo DW-MRI enables imaging of living patients and longitudinal studies of brain changes, post-mortem ex-vivo DW-MRI has numerous advantages. Ex-vivo imaging benefits from greater resolution and sensitivity due to the lack of imaging time constraints, the use of tighter-fitting coils, and the lack of movement artifacts. This allows characterization of normal and abnormal tissues with unprecedented resolution and sensitivity, facilitating our ability to investigate anatomical structures that are inaccessible in-vivo. It also offers the opportunity to develop novel imaging biomarkers today that will, with tomorrow's MR technology, enable improved in-vivo assessment of the risk of disease in an individual. Post-mortem studies, however, generally rely on fixation of the specimen to inhibit tissue decay, which starts as soon as tissue is deprived of its blood supply. Unfortunately, fixation substantially alters tissue diffusivity profiles. In addition, ex-vivo DW-MRI requires particular care when packaging the specimen because the presence of microscopic air bubbles gives rise to geometric and intensity image distortion. In this work, we considered the specific requirements of post-mortem imaging and designed an optimized protocol for ex-vivo whole-brain DW-MRI using a human clinical 3T scanner. Human clinical 3T scanners are available to a large number of researchers and, unlike most animal scanners, have a bore diameter large enough to image a whole human brain. Our optimized protocol will facilitate widespread ex-vivo investigations of large specimens.

  16. Monte Carlo calculations of electron beam quality conversion factors for several ion chamber types.

    PubMed

    Muir, B R; Rogers, D W O

    2014-11-01

    To provide a comprehensive investigation of electron beam reference dosimetry using Monte Carlo simulations of the response of 10 plane-parallel and 18 cylindrical ion chamber types. Specific emphasis is placed on the determination of the optimal shift of the chambers' effective point of measurement (EPOM) and beam quality conversion factors. The EGSnrc system is used for calculations of the absorbed dose to gas in ion chamber models and the absorbed dose to water as a function of depth in a water phantom on which cobalt-60 and several electron beam source models are incident. The optimal EPOM shifts of the ion chambers are determined by comparing calculations of R50 converted from I50 (calculated using ion chamber simulations in phantom) to R50 calculated using simulations of the absorbed dose to water vs depth in water. Beam quality conversion factors are determined as the calculated ratio of the absorbed dose to water to the absorbed dose to air in the ion chamber at the reference depth in a cobalt-60 beam to that in electron beams. For most plane-parallel chambers, the optimal EPOM shift is inside of the active cavity but different from the shift determined with water-equivalent scaling of the front window of the chamber. These optimal shifts for plane-parallel chambers also reduce the scatter of beam quality conversion factors, kQ, as a function of R50. The optimal shift of cylindrical chambers is found to be less than the 0.5 rcav recommended by current dosimetry protocols. In most cases, the values of the optimal shift are close to 0.3 rcav. Values of kecal are calculated and compared to those from the TG-51 protocol and differences are explained using accurate individual correction factors for a subset of ion chambers investigated. High-precision fits to beam quality conversion factors normalized to unity in a beam with R50 = 7.5 cm (kQ′) are provided. These factors avoid the use of gradient correction factors as used in the TG-51 protocol, although a chamber-dependent optimal shift in the EPOM is required when using plane-parallel chambers while no shift is needed with cylindrical chambers. The sensitivity of these results to parameters used to model the ion chambers is discussed and the uncertainty related to the practical use of these results is evaluated. These results will prove useful as electron beam reference dosimetry protocols are being updated. The analysis of this work indicates that cylindrical ion chambers may be appropriate for use in low-energy electron beams but measurements are required to characterize their use in these beams.
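
    A schematic sketch, not taken from this paper's code: under the usual TG-51 convention (D_w = M · kQ · N_D,w for cobalt-60), kQ can be assembled from Monte Carlo dose-to-water and dose-to-air values at the reference depth, with the beam-quality-Q ratio divided by the cobalt-60 ratio. The numbers below are placeholders, not simulation results.

    ```python
    # kQ from the ratio of (dose to water / dose to air in the chamber) at the
    # electron beam quality Q to the same ratio in cobalt-60 (TG-51 convention).

    def k_q(dw_q, dair_q, dw_co, dair_co):
        return (dw_q / dair_q) / (dw_co / dair_co)

    def k_q_prime(k_q_value, k_q_at_r50_7p5cm):
        """kQ normalized to unity in a beam with R50 = 7.5 cm, as in the fits above."""
        return k_q_value / k_q_at_r50_7p5cm

    kq = k_q(dw_q=1.02e-16, dair_q=1.13e-16, dw_co=1.00e-16, dair_co=1.08e-16)
    print(round(kq, 4))  # 0.9749 with these placeholder doses (Gy per history)
    ```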

  17. Pupil Diameter Tracks the Exploration-Exploitation Trade-off during Analogical Reasoning and Explains Individual Differences in Fluid Intelligence.

    PubMed

    Hayes, Taylor R; Petrov, Alexander A

    2016-02-01

    The ability to adaptively shift between exploration and exploitation control states is critical for optimizing behavioral performance. Converging evidence from primate electrophysiology and computational neural modeling has suggested that this ability may be mediated by the broad norepinephrine projections emanating from the locus coeruleus (LC) [Aston-Jones, G., & Cohen, J. D. An integrative theory of locus coeruleus-norepinephrine function: Adaptive gain and optimal performance. Annual Review of Neuroscience, 28, 403-450, 2005]. There is also evidence that pupil diameter covaries systematically with LC activity. Although imperfect and indirect, this link makes pupillometry a useful tool for studying the locus coeruleus norepinephrine system in humans and in high-level tasks. Here, we present a novel paradigm that examines how the pupillary response during exploration and exploitation covaries with individual differences in fluid intelligence during analogical reasoning on Raven's Advanced Progressive Matrices. Pupillometry was used as a noninvasive proxy for LC activity, and concurrent think-aloud verbal protocols were used to identify exploratory and exploitative solution periods. This novel combination of pupillometry and verbal protocols from 40 participants revealed a decrease in pupil diameter during exploitation and an increase during exploration. The temporal dynamics of the pupillary response was characterized by a steep increase during the transition to exploratory periods, sustained dilation for many seconds afterward, and followed by gradual return to baseline. Moreover, the individual differences in the relative magnitude of pupillary dilation accounted for 16% of the variance in Advanced Progressive Matrices scores. Assuming that pupil diameter is a valid index of LC activity, these results establish promising preliminary connections between the literature on locus coeruleus norepinephrine-mediated cognitive control and the literature on analogical reasoning and fluid intelligence.

  18. Protocol for a prospective collaborative systematic review and meta-analysis of individual patient data from randomized controlled trials of vasoactive drugs in acute stroke: The Blood pressure in Acute Stroke Collaboration, stage-3.

    PubMed

    Sandset, Else Charlotte; Sanossian, Nerses; Woodhouse, Lisa J; Anderson, Craig; Berge, Eivind; Lees, Kennedy R; Potter, John F; Robinson, Thompson G; Sprigg, Nikola; Wardlaw, Joanna M; Bath, Philip M

    2018-01-01

    Rationale Despite several large clinical trials assessing blood pressure lowering in acute stroke, equipoise remains particularly for ischemic stroke. The "Blood pressure in Acute Stroke Collaboration" commenced in the mid-1990s focussing on systematic reviews and meta-analysis of blood pressure lowering in acute stroke. From the start, Blood pressure in Acute Stroke Collaboration planned to assess safety and efficacy of blood pressure lowering in acute stroke using individual patient data. Aims To determine the optimal management of blood pressure in patients with acute stroke, including both intracerebral hemorrhage and ischemic stroke. Secondary aims are to assess which clinical and therapeutic factors may alter the optimal management of high blood pressure in patients with acute stroke and to assess the effect of vasoactive treatments on hemodynamic variables. Methods and design Individual patient data from randomized controlled trials of blood pressure management in participants with ischemic stroke and/or intracerebral hemorrhage enrolled during the ultra-acute (pre-hospital), hyper-acute (<6 h), acute (<48 h), and sub-acute (<168 h) phases of stroke. Study outcomes The primary effect variable will be functional outcome defined by the ordinal distribution of the modified Rankin Scale; analyses will also be carried out in pre-specified subgroups to assess the modifying effects of stroke-related and pre-stroke patient characteristics. Key secondary variables will include clinical, hemodynamic and neuroradiological variables; safety variables will comprise death and serious adverse events. Discussion Study questions will be addressed in stages, according to the protocol, before integrating these into a final overreaching analysis. We invite eligible trials to join the collaboration.

  19. Carb-3 is the superior anti-CD15 monoclonal antibody for immunohistochemistry.

    PubMed

    Røge, Rasmus; Nielsen, Søren; Vyberg, Mogens

    2014-07-01

    Immunohistochemical detection of CD15 is important in the diagnosis of Hodgkin lymphoma and may play a role in the classification of renal cell tumors (RCTs). In the NordiQC external quality assessment scheme, 4 CD15 tests, each with 71 to 121 participating laboratories, showed that 24% to 50% of the stains were insufficient. This was mainly because of very low primary antibody (Ab) concentration and insufficient heat-induced epitope retrieval, whereas Ab clone performance seemed of little importance. The purpose of this study was to evaluate the performance of the most commonly used CD15 Abs on the basis of vendor-recommended and in-house optimized protocols. Multitissue blocks with 199 specimens, including various malignant lymphomas, RCTs, and normal tissues, were stained with 3 different concentrated (conc) CD15 Ab clones, Carb-3, MMA, and BY87, according to predetermined in-house optimized protocols on 2 automated immunostaining platforms. Carb-3 and MMA were also applied in ready-to-use (RTU) formats utilized according to vendor protocols. The extent and intensity of staining were determined using the H-score method. With an in-house optimized protocol, clone Carb-3-conc gave the highest H-scores in Hodgkin lymphoma, RCTs, and normal kidney tissue. Clones Carb-3-RTU and MMA-conc gave slightly lower scores, whereas clones MMA-RTU and BY87-conc gave the lowest scores and a large proportion of false-negative reactions. For all concentrated Abs, in-house optimized protocols resulted in increased sensitivity and improved overall staining results compared with vendor-recommended protocols. The importance of Ab selection and protocol optimization in immunohistochemical laboratories is emphasized.
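
    A small sketch of the H-score computation referenced above, using the standard formulation (percentage of cells at each staining intensity from 0 to 3, weighted by intensity, giving a 0-300 score); it is not taken from this paper and the example percentages are made up.

    ```python
    def h_score(pct_by_intensity):
        """pct_by_intensity: dict intensity level (0, 1, 2, 3) -> percent of cells."""
        assert abs(sum(pct_by_intensity.values()) - 100.0) < 1e-6, "percentages must sum to 100"
        return sum(level * pct for level, pct in pct_by_intensity.items())

    # Example: 20% unstained, 30% weak, 40% moderate, 10% strong staining.
    print(h_score({0: 20.0, 1: 30.0, 2: 40.0, 3: 10.0}))  # 140.0
    ```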

  20. Analytical approach to cross-layer protocol optimization in wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Hortos, William S.

    2008-04-01

    In the distributed operations of route discovery and maintenance, strong interaction occurs across mobile ad hoc network (MANET) protocol layers. Quality of service (QoS) requirements of multimedia service classes must be satisfied by the cross-layer protocol, along with minimization of the distributed power consumption at nodes and along routes, subject to battery-limited energy constraints. In previous work by the author, cross-layer interactions in the MANET protocol are modeled in terms of a set of concatenated design parameters and associated resource levels by multivariate point processes (MVPPs). Determination of the "best" cross-layer design is carried out using the optimal control of martingale representations of the MVPPs. In contrast to the competitive interaction among nodes in a MANET for multimedia services using limited resources, the interaction among the nodes of a wireless sensor network (WSN) is distributed and collaborative, based on the processing of data from a variety of sensors at nodes to satisfy common mission objectives. Sensor data originates at the nodes at the periphery of the WSN, is successively transported to other nodes for aggregation based on information-theoretic measures of correlation, and is ultimately sent as information to one or more destination (decision) nodes. The "multimedia services" in the MANET model are replaced by multiple types of sensors, e.g., audio, seismic, imaging, thermal, etc., at the nodes; the QoS metrics associated with MANETs become those associated with the quality of fused information flow, i.e., throughput, delay, packet error rate, data correlation, etc. Significantly, the essential analytical approach to MANET cross-layer optimization, now based on the MVPPs for discrete random events occurring in the WSN, can be applied to develop the stochastic characteristics and optimality conditions for cross-layer designs of sensor network protocols. Functional dependencies of WSN performance metrics are described in terms of the concatenated protocol parameters. New source-to-destination routes are sought that optimize cross-layer interdependencies to achieve the "best available" performance in the WSN. The protocol design, modified from a known reactive protocol, adapts the achievable performance to the transient network conditions and resource levels. Control of network behavior is realized through the conditional rates of the MVPPs. Optimal cross-layer protocol parameters are determined by stochastic dynamic programming conditions derived from models of transient packetized sensor data flows. Moreover, the defining conditions for WSN configurations, grouping sensor nodes into clusters and establishing data aggregation at processing nodes within those clusters, lead to computationally tractable solutions to the stochastic differential equations that describe network dynamics. Closed-form solution characteristics provide an alternative to the "directed diffusion" methods for resource-efficient WSN protocols published previously by other researchers. Performance verification of the resulting cross-layer designs is found by embedding the optimality conditions for the protocols in actual WSN scenarios replicated in a wireless network simulation environment. Performance tradeoffs among protocol parameters are left for a sequel to this paper.

  1. Analysis of power management and system latency in wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Oswald, Matthew T.; Rohwer, Judd A.; Forman, Michael A.

    2004-08-01

    Successful power management in a wireless sensor network requires optimization of the protocols that affect energy consumption on each node and of the aggregate effects across the larger network. System optimization for a given deployment scenario requires an analysis and trade-off of desired node and network features against their associated costs. The sleep protocol for an energy-efficient wireless sensor network for event detection, target classification, and target tracking developed at Sandia National Laboratories is presented. The dynamic source routing (DSR) algorithm is chosen to reduce network maintenance overhead, while providing a self-configuring and self-healing network architecture. A method for determining the optimal sleep time is developed and presented, providing reference data which spans several orders of magnitude. Message timing diagrams show that a node in a five-node cluster, employing an optimal cyclic single-radio sleep protocol, consumes 3% more energy and incurs 16 s more latency than nodes employing the more complex dual-radio STEM protocol.
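
    The energy/latency trade-off quantified above (a few percent more energy and seconds of added latency for the simpler single-radio sleep protocol) can be illustrated with a toy duty-cycle model; all power figures and intervals below are hypothetical placeholders, not Sandia's measured values or their sleep-time optimization.

    ```python
    # Toy duty-cycle model for a single-radio sleep protocol: the node wakes every
    # `sleep_interval_s` seconds to listen briefly. Parameter values are illustrative.
    def average_power(p_active_w, p_sleep_w, listen_s, sleep_interval_s):
        """Time-averaged power for a periodic listen/sleep cycle."""
        cycle = listen_s + sleep_interval_s
        return (p_active_w * listen_s + p_sleep_w * sleep_interval_s) / cycle

    def worst_case_latency(sleep_interval_s):
        """A message arriving just after a wake-up waits one full sleep interval."""
        return sleep_interval_s

    for interval in (1, 5, 10, 30):  # candidate sleep intervals in seconds
        mw = 1e3 * average_power(0.050, 0.0001, 0.05, interval)
        print(f"interval {interval:>2} s: {mw:6.3f} mW average, "
              f"{worst_case_latency(interval)} s worst-case latency")
    ```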

  2. Continuous-variable quantum key distribution with a leakage from state preparation

    NASA Astrophysics Data System (ADS)

    Derkach, Ivan; Usenko, Vladyslav C.; Filip, Radim

    2017-12-01

    We address side-channel leakage in a trusted preparation station of continuous-variable quantum key distribution with coherent and squeezed states. We consider two different scenarios: multimode Gaussian modulation, directly accessible to an eavesdropper, or side-channel loss of the signal states prior to the modulation stage. We show the negative impact of excessive modulation on both the coherent- and squeezed-state protocols. The impact is more pronounced for squeezed-state protocols and may require optimization of squeezing in the case of noisy quantum channels. Further, we demonstrate that the coherent-state protocol is immune to side-channel signal state leakage prior to modulation, while the squeezed-state protocol is vulnerable to such attacks, becoming more sensitive to the noise in the channel. In the general case of noisy quantum channels, the signal squeezing can be optimized to provide the best performance of the protocol in the presence of side-channel leakage prior to modulation. Our results demonstrate that leakage from the trusted source in continuous-variable quantum key distribution should not be underestimated and that squeezing must be optimized if squeezed-state protocols are to outperform coherent-state protocols.

  3. TH-C-18A-08: A Management Tool for CT Dose Monitoring, Analysis, and Protocol Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, J; Chan, F; Newman, B

    2014-06-15

    Purpose: To develop a customizable tool for enterprise-wide management of CT protocols and analysis of radiation dose information from CT exams for a variety of quality control applications. Methods: All clinical CT protocols implemented on the 11 CT scanners at our institution were extracted in digital format. The original protocols had been preset by our CT management team. A commercial CT dose tracking software (DoseWatch, GE Healthcare, WI) was used to collect exam information (exam date, patient age, etc.), scanning parameters, and radiation doses for all CT exams. We developed a Matlab-based program (MathWorks, MA) with a graphical user interface that allows the user to analyze the scanning protocols together with the actual dose estimates, and to compare the data to national (ACR, AAPM) and internal reference values for CT quality control. Results: The CT protocol review portion of our tool allows the user to look up the scanning and image reconstruction parameters of any protocol on any of the installed CT systems, among about 120 protocols per scanner. In the dose analysis tool, dose information for all CT exams (from 05/2013 to 02/2014) was stratified at the protocol level, and within a protocol down to the series level, i.e., each individual exposure event. This allows numerical and graphical review of dose information for any combination of scanner models, protocols, and series. The key functions of the tool include: statistics of CTDI, DLP, and SSDE; dose monitoring using user-set CTDI/DLP/SSDE thresholds; look-up of any CT exam dose data; and CT protocol review. Conclusion: Our in-house CT management tool provides radiologists, technologists, and administration first-hand, near real-time, enterprise-wide knowledge of CT dose levels for different exam types. Medical physicists use this tool to manage CT protocols and to compare and optimize dose levels across different scanner models. It provides technologists feedback on CT scanning operation, and knowledge of important dose baselines and thresholds.
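
    As a rough illustration of the per-protocol stratification and user-set threshold monitoring described above, the following pandas sketch groups exam-level dose records and flags outliers. The column names ("scanner", "protocol", "ctdi_vol", "dlp", "ssde") and the threshold dictionary are assumptions for illustration, not the tool's actual schema or Matlab implementation.

    ```python
    import pandas as pd

    def summarize(exams: pd.DataFrame) -> pd.DataFrame:
        """Median and 95th-percentile CTDIvol/DLP/SSDE per scanner and protocol."""
        return (exams.groupby(["scanner", "protocol"])[["ctdi_vol", "dlp", "ssde"]]
                     .agg(["median", lambda s: s.quantile(0.95)]))

    def flag_exams(exams: pd.DataFrame, ctdi_thresholds: dict) -> pd.DataFrame:
        """Return exams whose CTDIvol exceeds the user-set threshold for their protocol."""
        limits = exams["protocol"].map(ctdi_thresholds)  # NaN (never flagged) if protocol unlisted
        return exams[exams["ctdi_vol"] > limits]
    ```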

  4. Optimization of Native and Formaldehyde iPOND Techniques for Use in Suspension Cells.

    PubMed

    Wiest, Nathaniel E; Tomkinson, Alan E

    2017-01-01

    The isolation of proteins on nascent DNA (iPOND) technique developed by the Cortez laboratory allows a previously unparalleled ability to examine proteins associated with replicating and newly synthesized DNA in mammalian cells. Both the original, formaldehyde-based iPOND technique and a more recent derivative, accelerated native iPOND (aniPOND), have mostly been performed in adherent cell lines. Here, we describe modifications to both protocols for use with suspension cell lines. These include cell culture, pulse, and chase conditions that optimize sample recovery in both protocols using suspension cells and several key improvements to the published aniPOND technique that reduce sample loss, increase signal to noise, and maximize sample recovery. Additionally, we directly and quantitatively compare the iPOND and aniPOND protocols to test the strengths and limitations of both. Finally, we present a detailed protocol to perform the optimized aniPOND protocol in suspension cell lines. © 2017 Elsevier Inc. All rights reserved.

  5. Healthy individuals' perspectives on clinical research protocols and influences on enrollment decisions.

    PubMed

    Roberts, Laura Weiss; Kim, Jane Paik

    2017-01-01

    Understanding the perspectives of healthy individuals is important ethically and for the advancement of science. We assessed perceptions of risk associated with research procedures, comparing views of healthy individuals with and without experience in clinical research, and the respondents' reported willingness to volunteer. Semistructured interviews and written surveys were conducted. Study participants were healthy individuals, half of whom were currently enrolled in clinical research and half of whom had no prior experience in clinical research. Participants were queried regarding seven "minimal risk" or "greater than minimal risk" protocol vignettes with procedures of three types: routine diagnostic tests, more burdensome (i.e., more effort or potential harm) diagnostic tests, and pharmacologic interventions. Views of influences on enrollment decisions were also assessed. Most healthy individuals indicated that protocols with more burdensome or pharmacologic interventions were very risky (59%, 58%), as opposed to routine diagnostic test procedures (32%). Respondents' willingness to enroll in protocols varied by type of protocol (p value < .001) and was inversely correlated with risk assessments (regression coefficients from GEE = -0.4; -0.5; -0.7). The odds of healthy individuals with research experience expressing strong willingness to enroll in the depicted protocols were twice the odds of healthy individuals without research experience expressing the same level of willingness (OR = 2.0 95% CI: [1.1, 3.9]). Respondents did not assign risk categories as institutional review boards (IRBs) would, as indicated by low agreement (26%) between respondent and expert opinion on minimal risk protocols. Perceptions of procedure risk appear to influence healthy individuals' willingness to enroll in protocols. Participants with experience in clinical research were far more likely to express willingness to enroll, a finding with important scientific and ethical implications. The lack of alignment between healthy individuals' views of protocol risk and IRB categorization warrants further study.

  6. Nondestructive Methods for Monitoring Cell Removal During Rat Liver Decellularization.

    PubMed

    Geerts, Sharon; Ozer, Sinan; Jaramillo, Maria; Yarmush, Martin L; Uygun, Basak E

    2016-07-01

    Whole liver engineering holds the promise to create transplantable liver grafts that may serve as substitutes for donor organs, addressing the donor shortage in liver transplantation. While decellularization and recellularization of livers in animal models have been successfully achieved, scale up to human livers has been slow. There are a number of donor human livers that are discarded because they are not found suitable for transplantation, but are available for engineering liver grafts. These livers are rejected due to a variety of reasons, which in turn may affect the decellularization outcome. Hence, a one-size-fits-all decellularization protocol may not result in scaffolds with consistent matrix quality, subsequently influencing downstream recellularization and transplantation outcomes. There is a need for a noninvasive monitoring method to evaluate the extent of cell removal, while ensuring preservation of matrix components during decellularization. In this study, we decellularized rat livers using a protocol previously established by our group, and we monitored decellularization through traditional destructive techniques, including evaluation of DNA, collagen, and glycosaminoglycan (GAG) content in decellularized scaffolds, as well as histology. In addition, we used computed tomography and perfusate analysis as alternative nondestructive decellularization monitoring methods. We found that DNA removal correlates well with the Hounsfield unit of the liver, and perfusate analysis revealed that a significant amount of GAG is removed during perfusion with 0.1% sodium dodecyl sulfate. This allowed for optimization of our decellularization protocol leading to scaffolds that have significantly higher GAG content, while maintaining appropriate removal of cellular contents. The significance of this is the creation of a nondestructive monitoring strategy that can be used for optimization of decellularization protocols for individual human livers available for liver engineering.

  7. The Effectiveness of Transcranial Brain Stimulation in Improving Clinical Signs of Hyperkinetic Movement Disorders.

    PubMed

    Obeso, Ignacio; Cerasa, Antonio; Quattrone, Aldo

    2015-01-01

    Repetitive transcranial magnetic stimulation (rTMS) is a safe and painless method for stimulating cortical neurons. In the neurological realm, rTMS has prevalently been applied to understand pathophysiological mechanisms underlying movement disorders. However, this tool also has the potential to be translated into a clinically applicable therapeutic use. Several available studies support this hypothesis, but differences in protocols, clinical enrollment, and variability of rTMS effects across individuals complicate a better understanding of efficient clinical protocols. The aim of the present review is to discuss to what extent the evidence provided by the therapeutic use of rTMS may be generalized. In particular, we attempted to define optimal cortical regions and stimulation protocols that have been demonstrated to maximize the effectiveness seen in the current literature for the three most prevalent hyperkinetic movement disorders: Parkinson's disease (PD) with levodopa-induced dyskinesias (LIDs), essential tremor (ET), and dystonia. A total of 28 rTMS studies met our search criteria. Despite clinical and methodological differences, overall these studies demonstrated that therapeutic applications of rTMS to "normalize" pathologically decreased or increased levels of cortical activity have produced moderate improvements in patients' quality of life. Moreover, the present literature suggests that the altered pathophysiology in hyperkinetic movement disorders establishes motor, premotor, or cerebellar structures as candidate regions for resetting cortico-subcortical pathways back to normal. Although rTMS has the potential to become a powerful tool for ameliorating the clinical outcome of hyperkinetic neurological patients, there is as yet no clear consensus on optimal protocols for these motor disorders. Well-controlled multicenter randomized clinical trials with high numbers of patients are urgently required.

  8. Nondestructive Methods for Monitoring Cell Removal During Rat Liver Decellularization

    PubMed Central

    Geerts, Sharon; Ozer, Sinan; Jaramillo, Maria; Yarmush, Martin L.

    2016-01-01

    Whole liver engineering holds the promise to create transplantable liver grafts that may serve as substitutes for donor organs, addressing the donor shortage in liver transplantation. While decellularization and recellularization of livers in animal models have been successfully achieved, scale up to human livers has been slow. There are a number of donor human livers that are discarded because they are not found suitable for transplantation, but are available for engineering liver grafts. These livers are rejected due to a variety of reasons, which in turn may affect the decellularization outcome. Hence, a one-size-fits-all decellularization protocol may not result in scaffolds with consistent matrix quality, subsequently influencing downstream recellularization and transplantation outcomes. There is a need for a noninvasive monitoring method to evaluate the extent of cell removal, while ensuring preservation of matrix components during decellularization. In this study, we decellularized rat livers using a protocol previously established by our group, and we monitored decellularization through traditional destructive techniques, including evaluation of DNA, collagen, and glycosaminoglycan (GAG) content in decellularized scaffolds, as well as histology. In addition, we used computed tomography and perfusate analysis as alternative nondestructive decellularization monitoring methods. We found that DNA removal correlates well with the Hounsfield unit of the liver, and perfusate analysis revealed that a significant amount of GAG is removed during perfusion with 0.1% sodium dodecyl sulfate. This allowed for optimization of our decellularization protocol leading to scaffolds that have significantly higher GAG content, while maintaining appropriate removal of cellular contents. The significance of this is the creation of a nondestructive monitoring strategy that can be used for optimization of decellularization protocols for individual human livers available for liver engineering. PMID:27169332

  9. Management Strategies to Facilitate Optimal Outcomes for Patients Treated with Delayed-release Dimethyl Fumarate.

    PubMed

    Mayer, Lori; Fink, Mary Kay; Sammarco, Carrie; Laing, Lisa

    2018-04-01

    Delayed-release dimethyl fumarate is an oral disease-modifying therapy that has demonstrated significant efficacy in adults with relapsing-remitting multiple sclerosis. Incidences of flushing and gastrointestinal adverse events are common in the first month after delayed-release dimethyl fumarate initiation. Our objective was to propose mitigation strategies for adverse events related to initiation of delayed-release dimethyl fumarate in the treatment of patients with multiple sclerosis. Studies of individually developed mitigation strategies and chart reviews were evaluated. Those results, as well as mitigation protocols developed at multiple sclerosis care centers, are summarized. Key steps to optimize the effectiveness of delayed-release dimethyl fumarate treatment include education prior to and at the time of delayed-release dimethyl fumarate initiation, initiation dose protocol gradually increasing to maintenance dose, dietary suggestions for co-administration with food, gastrointestinal symptom management with over-the-counter medications, flushing symptom management with aspirin, and temporary dose reduction. Using the available evidence from clinical trials and evaluations of post-marketing studies, these strategies to manage gastrointestinal and flushing symptoms can be effective and helpful to the patient when initiating delayed-release dimethyl fumarate.

  10. Optimized Negative Staining: a High-throughput Protocol for Examining Small and Asymmetric Protein Structure by Electron Microscopy

    DOE PAGES

    Rames, Matthew; Yu, Yadong; Ren, Gang

    2014-08-15

    Structural determination of proteins is rather challenging for proteins with molecular masses between 40 and 200 kDa. Considering that more than half of natural proteins have a molecular mass between 40 and 200 kDa, a robust and high-throughput method with nanometer resolution capability is needed. Negative staining (NS) electron microscopy (EM) is an easy, rapid, and qualitative approach which has frequently been used in research laboratories to examine protein structure and protein-protein interactions. Unfortunately, conventional NS protocols often generate structural artifacts on proteins, especially with lipoproteins, which usually present rouleaux artifacts. By using images of lipoproteins from cryo-electron microscopy (cryo-EM) as a standard, the key parameters in NS specimen preparation conditions were recently screened and reported as the optimized NS protocol (OpNS), a modified conventional NS protocol. Artifacts like rouleaux can be greatly limited by OpNS, which additionally provides high contrast along with reasonably high-resolution (near 1 nm) images of small and asymmetric proteins. These high-resolution, high-contrast images are even suitable for 3D reconstruction of an individual protein (a single object, no averaging), such as a 160 kDa antibody, through the method of electron tomography. Moreover, OpNS can be a high-throughput tool to examine hundreds of samples of small proteins. For example, the previously published mechanism of the 53 kDa cholesteryl ester transfer protein (CETP) involved the screening and imaging of hundreds of samples. Considering that cryo-EM rarely succeeds in imaging proteins smaller than 200 kDa, and that no cryo-EM study screening over one hundred sample conditions has yet been published, it is fair to call OpNS a high-throughput method for studying small proteins. Hopefully the OpNS protocol presented here can be a useful tool to push the boundaries of EM and accelerate EM studies into small protein structure, dynamics, and mechanisms.

  11. Three-input majority function as the unique optimal function for the bias amplification using nonlocal boxes

    NASA Astrophysics Data System (ADS)

    Mori, Ryuhei

    2016-11-01

    Brassard et al. [Phys. Rev. Lett. 96, 250401 (2006), 10.1103/PhysRevLett.96.250401] showed that shared nonlocal boxes with a CHSH (Clauser, Horne, Shimony, and Holt) probability greater than (3+√6)/6 yield trivial communication complexity. There still exists a gap with the maximum CHSH probability (2+√2)/4 achievable by quantum mechanics. It is an interesting open question to determine the exact threshold for the trivial communication complexity. Brassard et al.'s idea is based on recursive bias amplification by the three-input majority function. It was not obvious if another choice of function exhibits stronger bias amplification. We show that the three-input majority function is the unique optimal function, so that one cannot improve the threshold (3+√6)/6 by Brassard et al.'s bias amplification. In this work, protocols for computing the function used for the bias amplification are restricted to be nonadaptive protocols or a particular adaptive protocol inspired by Pawłowski et al.'s protocol for information causality [Nature (London) 461, 1101 (2009), 10.1038/nature08400]. We first show an adaptive protocol inspired by Pawłowski et al.'s protocol, and then show that the adaptive protocol improves upon nonadaptive protocols. Finally, we show that the three-input majority function is the unique optimal function for the bias amplification if we apply the adaptive protocol to each step of the bias amplification.
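
    The combinatorial core of the recursive bias amplification discussed above is the success probability of a three-input majority vote over independent inputs; a small sketch follows. The full threshold analysis, which yields (3+√6)/6, also accounts for the noise introduced when the majority itself is evaluated with imperfect nonlocal boxes and is not reproduced here.

    ```python
    import math

    # Success probability of a three-input majority vote when each input is
    # independently correct with probability p: p^3 + 3 p^2 (1 - p).
    def majority3(p: float) -> float:
        return p**3 + 3 * p**2 * (1 - p)

    # Iterating the vote drives any p > 1/2 toward 1; the threshold quoted in the
    # abstract, (3 + sqrt(6))/6 ~ 0.908, arises only in the full analysis where the
    # majority is itself computed with noisy nonlocal boxes.
    threshold = (3 + math.sqrt(6)) / 6
    p = 0.6
    for _ in range(5):
        p = majority3(p)
        print(round(p, 4))
    print("quoted threshold:", round(threshold, 4))
    ```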

  12. Distributed Cooperative Optimal Control for Multiagent Systems on Directed Graphs: An Inverse Optimal Approach.

    PubMed

    Zhang, Huaguang; Feng, Tao; Yang, Guang-Hong; Liang, Hongjing

    2015-07-01

    In this paper, the inverse optimal approach is employed to design distributed consensus protocols that guarantee consensus and global optimality with respect to some quadratic performance indexes for identical linear systems on a directed graph. The inverse optimal theory is developed by introducing the notion of partial stability. As a result, the necessary and sufficient conditions for inverse optimality are proposed. By means of the developed inverse optimal theory, the necessary and sufficient conditions are established for globally optimal cooperative control problems on directed graphs. Basic optimal cooperative design procedures are given based on asymptotic properties of the resulting optimal distributed consensus protocols, and the multiagent systems can reach desired consensus performance (convergence rate and damping rate) asymptotically. Finally, two examples are given to illustrate the effectiveness of the proposed methods.
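
    For orientation, the sketch below shows the standard discrete-time linear consensus protocol on a directed graph that such inverse-optimal designs build on; the graph, gain, and initial states are illustrative, and the sketch does not derive the paper's optimal gains or performance indexes.

    ```python
    import numpy as np

    # Linear consensus on a directed graph: x_i <- x_i + eps * sum_j a_ij (x_j - x_i),
    # where a_ij = 1 if agent i receives information from agent j.
    A = np.array([[0, 1, 0, 0],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1],
                  [1, 0, 0, 0]], dtype=float)   # illustrative directed cycle
    eps = 0.3                                    # step size below 1/max in-degree
    x = np.array([1.0, 3.0, -2.0, 5.0])          # illustrative initial states

    for _ in range(100):
        x = x + eps * (A @ x - A.sum(axis=1) * x)

    print(x)  # states converge to a common value on this strongly connected graph
    ```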

  13. Optimization of a Sample Processing Protocol for Recovery of ...

    EPA Pesticide Factsheets

    Following a release of Bacillus anthracis spores into the environment, there is a potential for lasting environmental contamination in soils. There is a need for detection protocols for B. anthracis in environmental matrices. However, identification of B. anthracis within a soil is a difficult task. Processing soil samples helps to remove debris, chemical components, and biological impurities that can interfere with microbiological detection. This study aimed to optimize a previously used indirect processing protocol, which included a series of washing and centrifugation steps.

  14. Pilonidal Sinus Disease: 10 Steps to Optimize Care.

    PubMed

    Harris, Connie; Sibbald, R Gary; Mufti, Asfandyar; Somayaji, Ranjani

    2016-10-01

    To present a 10-step approach to the assessment and treatment of pilonidal sinus disease (PSD) and related wounds based on the Harris protocol, expert opinion, and a current literature review. This continuing education activity is intended for physicians and nurses with an interest in skin and wound care. Pilonidal sinus disease (PSD) is a common problem in young adults and particularly in males with a deep natal or intergluteal cleft and coarse body hair. An approach to an individual with PSD includes the assessment of pain, activities of daily living, the pilonidal sinus, and the natal cleft. Local wound care includes the management of infection (if present), along with appropriate debridement and moisture management. Treatment is optimized with patient empowerment to manage the wound and periwound environment (cleansing, dressing changes, decontamination, hair removal, minimizing friction). Self-care education includes the recognition of recurrences or infection. Early surgical intervention of these wounds is often necessary for successful outcomes. Pilonidal sinus healing by secondary intention often takes weeks to months; however, the use of the Harris protocol may decrease healing times. A number of new surgical approaches may accelerate healing. Surgical closure by primary intention is often associated with higher recurrence rates. Expert opinion in this article is combined with an evidence-based literature review. The authors have tabulated 10 key steps from the Harris protocol, including a review of the surgical techniques to improve PSD patient outcomes.

  15. Establishment and optimization of NMR-based cell metabonomics study protocols for neonatal Sprague-Dawley rat cardiomyocytes.

    PubMed

    Zhang, Ming; Sun, Bo; Zhang, Qi; Gao, Rong; Liu, Qiao; Dong, Fangting; Fang, Haiqin; Peng, Shuangqing; Li, Famei; Yan, Xianzhong

    2017-01-15

    A quenching, harvesting, and extraction protocol was optimized for cardiomyocyte NMR metabonomics analysis in this study. Trypsin treatment and direct scraping of cells into acetonitrile were compared for sample harvesting. The results showed that trypsin treatment caused an increase in the normalized concentration of phosphocholine and leakage of metabolites, owing to trypsin-induced membrane damage and the longer harvesting procedure. The intracellular metabolite extraction efficiencies of methanol and acetonitrile were then compared. As a result, washing twice with phosphate buffer, scraping cells directly, and extracting with acetonitrile were chosen to prepare cardiomyocyte extract samples for metabonomics studies. This optimized protocol is rapid, effective, and exhibits greater metabolite retention. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Relationship between dysfunctional breathing patterns and ability to achieve target heart rate variability with features of "coherence" during biofeedback.

    PubMed

    Courtney, Rosalba; Cohen, Marc; van Dixhoorn, Jan

    2011-01-01

    Heart rate variability (HRV) biofeedback is a self-regulation strategy used to improve conditions including asthma, stress, hypertension, and chronic obstructive pulmonary disease. Respiratory muscle function affects hemodynamic influences on respiratory sinus arrhythmia (RSA) and HRV, and HRV-biofeedback protocols often include slow abdominal breathing to achieve physiologically optimal patterns of HRV with power spectral distribution concentrated around the 0.1-Hz frequency and large amplitude. It is likely that optimal, balanced breathing patterns and the ability to entrain heart rhythms to breathing reflect physiological efficiency and resilience, and that individuals with dysfunctional breathing patterns may have difficulty voluntarily modulating HRV and RSA. The relationship between breathing movement patterns and HRV, however, has not been investigated. This study examines how individuals' habitual breathing patterns correspond with their ability to optimize HRV and RSA. Breathing pattern was assessed using the Manual Assessment of Respiratory Motion (MARM) and the Hi Lo manual palpation techniques in 83 people with possible dysfunctional breathing before they attempted HRV biofeedback. Mean respiratory rate was also assessed. Subsequently, participants applied a brief 5-minute biofeedback protocol, involving breathing and positive emotional focus, to achieve HRV patterns proposed to reflect physiological "coherence" and entrainment of heart rhythm oscillations to other oscillating body systems. Thoracic-dominant breathing was associated with decreased coherence of HRV (r = -.463, P = .0001). Individuals with paradoxical breathing had the lowest HRV coherence (t(8) = 10.7, P = .001), and the negative relationship between coherence of HRV and extent of thoracic breathing was strongest in this group (r = -.768, P = .03). Dysfunctional breathing patterns are associated with decreased ability to achieve HRV patterns that reflect cardiorespiratory efficiency and autonomic nervous system balance. This suggests that dysfunctional breathing patterns are not only biomechanically inefficient but also reflect decreased physiological resilience. Breathing assessment using simple manual techniques such as the MARM and Hi Lo may be useful in HRV biofeedback to identify whether poor responders require more emphasis on correction of dysfunctional breathing.
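
    A rough proxy for the HRV "coherence" described above is the share of spectral power concentrated in a narrow band around the 0.1-Hz target frequency; the sketch below uses assumed band edges and is not the proprietary scoring algorithm used in the study.

    ```python
    import numpy as np
    from scipy.signal import welch

    def coherence_ratio(rr_intervals_s, fs=4.0):
        """Fraction of HRV spectral power near 0.1 Hz (assumed 0.08-0.12 Hz band)
        relative to total power in 0.0033-0.4 Hz; a crude stand-in for a
        'coherence' score."""
        t = np.cumsum(rr_intervals_s)                    # beat times from RR intervals
        grid = np.arange(t[0], t[-1], 1.0 / fs)          # evenly sampled tachogram
        hr = np.interp(grid, t, rr_intervals_s)
        f, pxx = welch(hr - hr.mean(), fs=fs, nperseg=min(256, len(hr)))
        peak_band = (f >= 0.08) & (f <= 0.12)
        total_band = (f >= 0.0033) & (f <= 0.4)
        return pxx[peak_band].sum() / pxx[total_band].sum()
    ```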

  17. Role of Nutritional Supplements Complementing Nutrient-Dense Diets: General Versus Sport/Exercise-Specific Dietary Guidelines Related to Energy Expenditure

    NASA Astrophysics Data System (ADS)

    Kleiner, Susan; Greenwood, Mike

    A nutrient-dense diet is a critical aspect of attaining optimal exercise training and athletic performance outcomes. Although including safe and effective nutritional supplements in the dietary design can help promote adequate caloric ingestion, supplements alone cannot meet individualized caloric expenditure needs without a proper diet. Specifically, a strategic and scientifically based nutrient-dense dietary profile should be created by qualified professionals to meet the sport/exercise-specific energy demands of any individual involved in select training intensity protocols. Finally, ingesting the right quantity and quality of nutrient-dense calories at precise windows of opportunity becomes vital in attaining desired training and/or competitive performance outcomes.

  18. Use of a channelized Hotelling observer to assess CT image quality and optimize dose reduction for iteratively reconstructed images.

    PubMed

    Favazza, Christopher P; Ferrero, Andrea; Yu, Lifeng; Leng, Shuai; McMillan, Kyle L; McCollough, Cynthia H

    2017-07-01

    The use of iterative reconstruction (IR) algorithms in CT generally decreases image noise and enables dose reduction. However, the amount of dose reduction possible using IR without sacrificing diagnostic performance is difficult to assess with conventional image quality metrics. Through this investigation, achievable dose reduction using a commercially available IR algorithm without loss of low-contrast spatial resolution was determined with a channelized Hotelling observer (CHO) model and used to optimize a clinical abdomen/pelvis exam protocol. A phantom containing 21 low-contrast disks (three different contrast levels and seven different diameters) was imaged at different dose levels. Images were created with filtered backprojection (FBP) and IR. The CHO was tasked with detecting the low-contrast disks. CHO performance indicated dose could be reduced by 22% to 25% without compromising low-contrast detectability (as compared to full-dose FBP images), whereas 50% or more dose reduction significantly reduced detection performance. Importantly, default settings for the scanner and protocol investigated reduced dose by upward of 75%. Subsequently, CHO-based changes to the default protocol yielded images of higher quality and doses more consistent with values from a larger, dose-optimized scanner fleet. CHO assessment provided objective data to successfully optimize a clinical CT acquisition protocol.
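
    A compact numpy sketch of a channelized Hotelling observer operating on precomputed channel outputs is given below; the channel set (e.g., Gabor or Laguerre-Gauss channels in practice) and the image data are assumed to be supplied by the user, and this is not the authors' implementation.

    ```python
    import numpy as np

    def cho_detectability(sig: np.ndarray, bkg: np.ndarray) -> float:
        """CHO detectability index d' from channelized data.

        Each row of `sig` / `bkg` holds the channel outputs for one
        signal-present / signal-absent image (channels assumed precomputed).
        """
        mean_diff = sig.mean(axis=0) - bkg.mean(axis=0)
        # Average the two class covariance matrices, as is conventional for the CHO.
        cov = 0.5 * (np.cov(sig, rowvar=False) + np.cov(bkg, rowvar=False))
        template = np.linalg.solve(cov, mean_diff)        # Hotelling template
        t_sig, t_bkg = sig @ template, bkg @ template     # decision statistics
        return (t_sig.mean() - t_bkg.mean()) / np.sqrt(0.5 * (t_sig.var() + t_bkg.var()))
    ```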

  19. Design and Analysis of Optimization Algorithms to Minimize Cryptographic Processing in BGP Security Protocols.

    PubMed

    Sriram, Vinay K; Montgomery, Doug

    2017-07-01

    The Internet is subject to attacks due to vulnerabilities in its routing protocols. One proposed approach to attain greater security is to cryptographically protect network reachability announcements exchanged between Border Gateway Protocol (BGP) routers. This study proposes and evaluates the performance and efficiency of various optimization algorithms for validation of digitally signed BGP updates. In particular, this investigation focuses on the BGPSEC (BGP with SECurity extensions) protocol, currently under consideration for standardization in the Internet Engineering Task Force. We analyze three basic BGPSEC update processing algorithms: Unoptimized, Cache Common Segments (CCS) optimization, and Best Path Only (BPO) optimization. We further propose and study cache management schemes to be used in conjunction with the CCS and BPO algorithms. The performance metrics used in the analyses are: (1) routing table convergence time after BGPSEC peering reset or router reboot events and (2) peak-second signature verification workload. Both analytical modeling and detailed trace-driven simulation were performed. Results show that the BPO algorithm is 330% to 628% faster than the unoptimized algorithm for routing table convergence in a typical Internet core-facing provider edge router.
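
    To illustrate the caching idea behind the Cache Common Segments (CCS) optimization described above (reusing verification results for path segments that recur across updates), here is a toy Python sketch; the `verify_signature` stub stands in for the real ECDSA check, and the cache policy is illustrative rather than the paper's algorithm.

    ```python
    import hashlib
    from functools import lru_cache

    # Stand-in for the expensive cryptographic verification in BGPSEC; a real
    # implementation would verify an ECDSA signature against the router's key.
    def verify_signature(segment_bytes: bytes, signature: bytes, key_id: str) -> bool:
        return hashlib.sha256(segment_bytes + key_id.encode()).digest()[:8] == signature

    @lru_cache(maxsize=100_000)
    def verify_segment(segment_bytes: bytes, signature: bytes, key_id: str) -> bool:
        """Cached verification: segments common to many updates are checked once."""
        return verify_signature(segment_bytes, signature, key_id)

    def validate_update(signed_segments) -> bool:
        """A BGPSEC-style update is valid only if every signed path segment verifies."""
        return all(verify_segment(seg, sig, key) for seg, sig, key in signed_segments)
    ```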

  20. College of American Pathologists Cancer Protocols: Optimizing Format for Accuracy and Efficiency.

    PubMed

    Strickland-Marmol, Leah B; Muro-Cacho, Carlos A; Barnett, Scott D; Banas, Matthew R; Foulis, Philip R

    2016-06-01

    -The data in College of American Pathologists cancer protocols have to be presented effectively to health care providers. There is no consensus on the format of those protocols, resulting in various designs among pathologists. Cancer protocols are independently created by site-specific experts, so there is inconsistent wording and repetition of data. This lack of standardization can be confusing and may lead to interpretation errors. -To define a synopsis format that is effective in delivering essential pathologic information and to evaluate the aesthetic appeal and the impact of varying format styles on the speed and accuracy of data extraction. -We queried individuals from several health care backgrounds using varying formats of the fallopian tube protocol of the College of American Pathologists without content modification to investigate their aesthetic appeal, accuracy, efficiency, and readability/complexity. Descriptive statistics, an item difficulty index, and 3 tests of readability were used. -Columned formats were aesthetically more appealing than justified formats (P < .001) and were associated with greater accuracy and efficiency. Incorrect assumptions were made about items not included in the protocol. Uniform wording and short sentences were associated with better performance by participants. -Based on these data, we propose standardized protocol formats for cancer resections of the fallopian tube and the more-familiar colon, employing headers, short phrases, and uniform terminology. This template can be easily and minimally modified for other sites, standardizing format and verbiage and increasing user accuracy and efficiency. Principles of human factors engineering should be considered in the display of patient data.

  1. DNA Extraction Protocols for Whole-Genome Sequencing in Marine Organisms.

    PubMed

    Panova, Marina; Aronsson, Henrik; Cameron, R Andrew; Dahl, Peter; Godhe, Anna; Lind, Ulrika; Ortega-Martinez, Olga; Pereyra, Ricardo; Tesson, Sylvie V M; Wrange, Anna-Lisa; Blomberg, Anders; Johannesson, Kerstin

    2016-01-01

    The marine environment harbors a large proportion of the total biodiversity on this planet, including the majority of the earths' different phyla and classes. Studying the genomes of marine organisms can bring interesting insights into genome evolution. Today, almost all marine organismal groups are understudied with respect to their genomes. One potential reason is that extraction of high-quality DNA in sufficient amounts is challenging for many marine species. This is due to high polysaccharide content, polyphenols and other secondary metabolites that will inhibit downstream DNA library preparations. Consequently, protocols developed for vertebrates and plants do not always perform well for invertebrates and algae. In addition, many marine species have large population sizes and, as a consequence, highly variable genomes. Thus, to facilitate the sequence read assembly process during genome sequencing, it is desirable to obtain enough DNA from a single individual, which is a challenge in many species of invertebrates and algae. Here, we present DNA extraction protocols for seven marine species (four invertebrates, two algae, and a marine yeast), optimized to provide sufficient DNA quality and yield for de novo genome sequencing projects.

  2. Optimizing Equivalence-Based Instruction: Effects of Training Protocols on Equivalence Class Formation

    ERIC Educational Resources Information Center

    Fienup, Daniel M.; Wright, Nicole A.; Fields, Lanny

    2015-01-01

    Two experiments evaluated the effects of the simple-to-complex and simultaneous training protocols on the formation of academically relevant equivalence classes. The simple-to-complex protocol intersperses derived relations probes with training baseline relations. The simultaneous protocol conducts all training trials and test trials in separate…

  3. Brazilian Samba Protocol for Individuals With Parkinson's Disease: A Clinical Non-Randomized Study.

    PubMed

    Tillmann, Ana Cristina; Andrade, Alexandro; Swarowsky, Alessandra; Guimarães, Adriana Coutinho De Azevedo

    2017-07-04

    In the 10 most populated countries in the world, Parkinson's disease (PD) affects more than 5 million individuals. Despite optimal treatment options already developed for the disease, concomitant involvement of other areas of health care plays an important role in complementing the treatment. From this perspective, dancing can be viewed as a non-drug alternative that can reduce falls by improving some motor skills, such as mobility, balance, gait, and posture, and can also improve the overall quality of life. Brazilian samba promotes improvement in motor and non-motor symptoms in individuals with PD, providing a new treatment option for this population. The main objective of this quasi-experimental study is to provide a 12-week samba protocol (2x/week) for individuals with PD and to compare its effects with the group without intervention. The hypothesis is that the Brazilian samba protocol will promote improvement in primary (motor) and secondary (non-motor) outcomes in individuals with PD. The sample will be selected at random from individuals diagnosed with PD in the city of Florianopolis (SC, Brazil). Sample size calculation was performed with the G*Power 3.1.9.2 software, with 0.447 effect size, at 5% significance level, power of 0.9, and test and sample loss of 20%. This yielded 60 individuals divided between the intervention and control groups. The questionnaires will be filled out before and after the dance intervention. The data collection for the control group will be held simultaneously to the intervention group. The classes will last for 1 hour, twice a week in the evening for 12 weeks, and all classes will be divided into warm-up, main part, and relaxation. Two-way analysis of variance with repeated measures and Sidak post-hoc comparison test will be used for a comparative analysis of the final results of the control group with the experimental group and of the within-group changes between pre- and postintervention period. We expect to complete follow-up in September 2017. The major inspiration for this study was to encourage the creation of new rehabilitation programs that do not emphasize doctor involvement. This is a unique protocol for PD and we believe it can be an important tool to alleviate the motor and non-motor symptoms of individuals with PD. Dance is a simple activity depending on little equipment and few financial resources, facilitating its implementation and improving the cost-benefit relationship. In addition, activities that have a cultural aspect for the population in question, and which are pleasant, enable the participants to commit long term. This can enhance patient's compliance with the therapy, which is often a problem for many rehabilitation programs. ©Ana Cristina Tillmann, Alexandro Andrade, Alessandra Swarowsky, Adriana Coutinho De Azevedo Guimarães. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 04.07.2017.

  4. Near-optimal protocols in complex nonequilibrium transformations

    DOE PAGES

    Gingrich, Todd R.; Rotskoff, Grant M.; Crooks, Gavin E.; ...

    2016-08-29

    The development of sophisticated experimental means to control nanoscale systems has motivated efforts to design driving protocols that minimize the energy dissipated to the environment. Computational models are a crucial tool in this practical challenge. In this paper, we describe a general method for sampling an ensemble of finite-time, nonequilibrium protocols biased toward a low average dissipation. In addition, we show that this scheme can be carried out very efficiently in several limiting cases. As an application, we sample the ensemble of low-dissipation protocols that invert the magnetization of a 2D Ising model and explore how the diversity of the protocols varies in response to constraints on the average dissipation. In this example, we find that there is a large set of protocols with average dissipation close to the optimal value, which we argue is a general phenomenon.
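
    The general idea of a protocol ensemble biased toward low dissipation can be caricatured with a path-space Metropolis sampler weighting discretized protocols by exp(-W/θ); the quadratic "dissipation" functional below is a toy stand-in, not the Ising-model work functional used in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    K, theta, n_steps = 20, 0.05, 20000
    protocol = np.linspace(0.0, 1.0, K + 1)       # control values; endpoints held fixed

    def dissipation(lam):
        # Toy functional penalizing large jumps between successive control values.
        return np.sum(np.diff(lam) ** 2)

    w = dissipation(protocol)
    samples = []
    for _ in range(n_steps):
        trial = protocol.copy()
        i = rng.integers(1, K)                    # perturb one interior control point
        trial[i] += 0.1 * rng.normal()
        w_trial = dissipation(trial)
        if rng.random() < np.exp(-(w_trial - w) / theta):   # Metropolis acceptance
            protocol, w = trial, w_trial
        samples.append(w)

    print(np.mean(samples[n_steps // 2:]))        # average dissipation of the biased ensemble
    ```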

  5. Memory-built-in quantum cloning in a hybrid solid-state spin register

    NASA Astrophysics Data System (ADS)

    Wang, Weibin; Zu, Chong; He, Li; Zhang, Wengang; Duan, Luming

    2015-05-01

    As a way to circumvent the quantum no-cloning theorem, approximate quantum cloning protocols have received wide attention with remarkable applications. Copying of quantum states to memory qubits provides an important strategy for eavesdropping in quantum cryptography. We report an experiment that realizes cloning of quantum states from an electron spin to a nuclear spin in a hybrid solid-state spin register with near-optimal fidelity. The nuclear spin provides an ideal memory qubit at room temperature, storing the cloned quantum states for a millisecond under ambient conditions and exceeding the lifetime of the original quantum state carried by the electron spin by orders of magnitude. Our experiment is based on control of an individual nitrogen vacancy (NV) center in diamond, a defect that has attracted strong interest in recent years for its great potential in implementing quantum information protocols.

  6. Apricot (Prunus armeniaca L.).

    PubMed

    Petri, César; Alburquerque, Nuria; Burgos, Lorenzo

    2015-01-01

    A protocol for Agrobacterium-mediated stable transformation of whole leaf explants of the apricot (Prunus armeniaca) cultivars 'Helena' and 'Canino' is described. Regenerated buds were selected using a two-step selection strategy with paromomycin sulfate and transferred to bud multiplication medium 1 week after they were detected, for optimal survival. After buds were transferred to bud multiplication medium, the antibiotic was changed to kanamycin and its concentration increased gradually at each transfer to fresh medium in order to eliminate possible escapes and chimeras. Transformation efficiency, based on PCR analysis of individual putative transformed shoots from independent lines, was 5.6%. Green and healthy buds surviving the high kanamycin concentration were transferred to shoot multiplication medium, where they elongated into shoots and proliferated. Elongated transgenic shoots were rooted in a medium containing 70 μM kanamycin. Rooted plants were acclimatized following standard procedures. This constitutes the only transformation protocol described for clonal apricot tissues and one of the few for Prunus.

  7. Comparing the force ripple during asynchronous and conventional stimulation.

    PubMed

    Downey, Ryan J; Tate, Mark; Kawai, Hiroyuki; Dixon, Warren E

    2014-10-01

    Asynchronous stimulation has been shown to reduce fatigue during electrical stimulation; however, it may also exhibit a force ripple. We quantified the ripple during asynchronous and conventional single-channel transcutaneous stimulation across a range of stimulation frequencies. The ripple was measured during 5 asynchronous stimulation protocols, 2 conventional stimulation protocols, and 3 volitional contractions in 12 healthy individuals. Conventional 40 Hz and asynchronous 16 Hz stimulation were found to induce contractions that were as smooth as volitional contractions. Asynchronous 8, 10, and 12 Hz stimulation induced contractions with significant ripple. Lower stimulation frequencies can reduce fatigue; however, they may also lead to increased ripple. Future efforts should study the relationship between force ripple and the smoothness of the evoked movements in addition to the relationship between stimulation frequency and NMES-induced fatigue to elucidate an optimal stimulation frequency for asynchronous stimulation. © 2014 Wiley Periodicals, Inc.
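
    One plausible way to quantify force ripple over a steady-state window is the peak-to-peak force variation expressed as a percentage of the mean force; the sketch below uses that assumed definition and an illustrative analysis window, which is not necessarily the metric used in the study.

    ```python
    import numpy as np

    def force_ripple_percent(force, fs, window_s=(1.0, 4.0)):
        """Peak-to-peak force variation over a steady-state window, as % of mean force.

        `force` is a sampled force trace, `fs` its sampling rate in Hz, and
        `window_s` the (start, stop) of the assumed plateau in seconds.
        """
        start, stop = int(window_s[0] * fs), int(window_s[1] * fs)
        plateau = np.asarray(force[start:stop], dtype=float)
        return 100.0 * (plateau.max() - plateau.min()) / plateau.mean()
    ```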

  8. WE-A-BRD-01: Innovation in Radiation Therapy Planning I: Knowledge Guided Treatment Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Q; Olsen, L

    2014-06-15

    Intensity modulated radiation therapy (IMRT) and Volumetric Modulated Arc Therapy (VMAT) offer the capability of sparing normal tissues and organs. However, the exact amount of sparing is often unknown until the plan is complete. This lack of prior guidance has led to the iterative, trial-and-error approach in current planning practice. Even with this effort, the search for patient-specific optimal organ sparing is still strongly influenced by the planner's experience. While experience generally helps in maximizing the dosimetric advantages of IMRT/VMAT, there have been several reports showing an unnecessarily high degree of plan quality variability at individual institutions and amongst different institutions, even with a large amount of experience and the best available tools. Further, when physicians and physicists evaluate a plan, the dosimetric quality of the plan is often compared with a standard protocol that ignores individual patient anatomy and tumor characteristic variations. In recent years, developments of knowledge models for clinical IMRT/VMAT planning guidance have shown promising clinical potential. These knowledge models extract past expert clinical experience into mathematical models that predict dose sparing references at the patient-specific level. For physicians and planners, these references provide objective values that reflect the best achievable dosimetric constraints. For quality assurance, applying patient-specific dosimetry requirements will enable more quantitative and objective assessment of protocol compliance for complex IMRT planning. Learning Objectives: Modeling and representation of knowledge for knowledge-guided treatment planning. Demonstrations of knowledge-guided treatment planning with a few clinical anatomical sites. Validation and evaluation of knowledge models for cost- and quality-effective standardization of plan optimization.

  9. Heart-Rate Recovery After Warm-up in Swimming: A Useful Predictor of Training Heart-Rate Response?

    PubMed

    Ganzevles, Sander P M; de Haan, Arnold; Beek, Peter J; Daanen, Hein A M; Truijens, Martin J

    2017-07-01

    For training to be optimal, daily training load has to be adapted to the momentary status of the individual athlete, which is often difficult to establish. Therefore, the current study investigated the predictive value of heart-rate recovery (HRR) during a standardized warm-up for training load. Training load was quantified by the variation in heart rate during standardized training in competitive swimmers. Eight female and 5 male Dutch national-level swimmers participated in the study. They all performed 3 sessions consisting of a 300-m warm-up test and a 10 × 100-m training protocol. Both protocols were swum in front crawl at individually standardized velocities derived from an incremental step test. Velocity was related to 75% and 85% heart-rate reserve (%HRres) for the warm-up and training, respectively. Relative HRR during the first 60 s after the warm-up (HRRw-up) and the differences between the actual and intended heart rate for the warm-up and the training (ΔHRw-up and ΔHRtr) were determined. No significant relationship between HRRw-up and ΔHRtr was found (F1,37 = 2.96, P = .09, R² = .07, SEE = 4.65). There was considerable daily variation in ΔHRtr at a given swimming velocity (73-93% HRres). ΔHRw-up and ΔHRtr were clearly related (F1,37 = 74.31, P < .001, R² = .67, SEE = 2.78). HRR after a standardized warm-up does not predict heart rate during a directly subsequent and standardized training session. Instead, heart rate during the warm-up protocol seems a promising alternative for coaches to make daily individual-specific adjustments to training programs.
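
    The heart-rate-reserve percentages used above to standardize warm-up and training intensities follow the standard Karvonen relation; a one-line sketch with hypothetical resting and maximal heart rates:

    ```python
    # Karvonen relation: target HR = HRrest + fraction * (HRmax - HRrest),
    # used here to set the 75% and 85% %HRres intensities.
    def target_heart_rate(hr_rest: float, hr_max: float, fraction: float) -> float:
        return hr_rest + fraction * (hr_max - hr_rest)

    print(target_heart_rate(55, 195, 0.75))  # e.g. 160 bpm for a hypothetical swimmer
    ```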

  10. Repopulation of interacting tumor cells during fractionated radiotherapy: stochastic modeling of the tumor control probability.

    PubMed

    Fakir, Hatim; Hlatky, Lynn; Li, Huamin; Sachs, Rainer

    2013-12-01

    Optimal treatment planning for fractionated external beam radiation therapy requires inputs from radiobiology based on recent thinking about the "five Rs" (repopulation, radiosensitivity, reoxygenation, redistribution, and repair). The need is especially acute for the newer, often individualized, protocols made feasible by progress in image guided radiation therapy and dose conformity. Current stochastic tumor control probability (TCP) models incorporating tumor repopulation effects consider "stem-like cancer cells" (SLCC) to be independent, but the authors here propose that SLCC-SLCC interactions may be significant. The authors present a new stochastic TCP model for repopulating SLCC interacting within microenvironmental niches. Our approach is meant mainly for comparing similar protocols. It aims at practical generalizations of previous mathematical models. The authors consider protocols with complete sublethal damage repair between fractions. The authors use customized open-source software and recent mathematical approaches from stochastic process theory for calculating the time-dependent SLCC number and thereby estimating SLCC eradication probabilities. As specific numerical examples, the authors consider predicted TCP results for a 2 Gy per fraction, 60 Gy protocol compared to 64 Gy protocols involving early or late boosts in a limited volume to some fractions. In sample calculations with linear quadratic parameters α = 0.3 per Gy, α∕β = 10 Gy, boosting is predicted to raise TCP from a dismal 14.5% observed in some older protocols for advanced NSCLC to above 70%. This prediction is robust as regards: (a) the assumed values of parameters other than α and (b) the choice of models for intraniche SLCC-SLCC interactions. However, α = 0.03 per Gy leads to a prediction of almost no improvement when boosting. The predicted efficacy of moderate boosts depends sensitively on α. Presumably, the larger values of α are the ones appropriate for individualized treatment protocols, with the smaller values relevant only to protocols for a heterogeneous patient population. On that assumption, boosting is predicted to be highly effective. Front boosting, apart from practical advantages and a possible advantage as regards iatrogenic second cancers, also probably gives a slightly higher TCP than back boosting. If the total number of SLCC at the start of treatment can be measured even roughly, it will provide a highly sensitive way of discriminating between various models and parameter choices. Updated mathematical methods for calculating repopulation allow credible generalizations of earlier results.
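
    For context, the baseline linear-quadratic/Poisson relation underlying such TCP models can be written down directly; the sketch below deliberately omits the repopulation and SLCC-SLCC niche interactions that are the paper's contribution, and the initial stem-like cell number is a hypothetical illustration.

    ```python
    import math

    # Baseline LQ/Poisson TCP: per-fraction survival exp(-(alpha*d + beta*d^2)),
    # TCP = exp(-N0 * SF^n). Repopulation and niche interactions are omitted.
    def tcp_lq_poisson(n_fractions, dose_per_fraction, alpha, alpha_beta, n0):
        beta = alpha / alpha_beta
        sf_per_fraction = math.exp(-(alpha * dose_per_fraction + beta * dose_per_fraction**2))
        return math.exp(-n0 * sf_per_fraction**n_fractions)

    # 30 x 2 Gy with alpha = 0.3 /Gy and alpha/beta = 10 Gy, assuming 1e7 initial SLCC:
    print(tcp_lq_poisson(30, 2.0, 0.3, 10.0, 1e7))
    ```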

  11. SU-F-R-11: Designing Quality and Safety Informatics Through Implementation of a CT Radiation Dose Monitoring Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, JM; Samei, E; Departments of Physics, Electrical and Computer Engineering, and Biomedical Engineering, and Medical Physics Graduate Program, Duke University, Durham, NC

    2016-06-15

    Purpose: Recent legislative and accreditation requirements have driven rapid development and implementation of CT radiation dose monitoring solutions. Institutions must determine how to improve quality, safety, and consistency of their clinical performance. The purpose of this work was to design a strategy and meaningful characterization of results from an in-house, clinically-deployed dose monitoring solution. Methods: A dose monitoring platform was designed by our imaging physics group that focused on extracting protocol parameters, dose metrics, and patient demographics and size. Compared to most commercial solutions, which focus on individual exam alerts and global thresholds, the program sought to characterize overall consistency and targeted thresholds based on eight analytic interrogations. Those were based on explicit questions related to protocol application, national benchmarks, protocol- and size-specific dose targets, operational consistency, outliers, temporal trends, intra-system variability, and consistent use of electronic protocols. Using historical data since the start of 2013, 95% and 99% intervals were used to establish yellow and amber parameterized dose alert thresholds, respectively, as a function of protocol, scanner, and size. Results: Quarterly reports have been generated for three hospitals for 3 quarters of 2015, totaling 27,880, 28,502, and 30,631 exams, respectively. Four adult and two pediatric protocols were higher than external institutional benchmarks. Four protocol dose levels were being inconsistently applied as a function of patient size. For the three hospitals, the minimum and maximum amber outlier percentages were [1.53%, 2.28%], [0.76%, 1.8%], and [0.94%, 1.17%], respectively. Compared with the electronic protocols, 10 protocols were found to be used with some inconsistency. Conclusion: Dose monitoring can satisfy requirements with global alert thresholds and patient dose records, but the real value is in optimizing patient-specific protocols, balancing the image quality trade-offs that dose-reduction strategies promise, and improving the performance and consistency of a clinical operation. Data plots that capture patient demographics and scanner performance demonstrate that value.
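
    As a rough sketch of deriving the 95%/99% (yellow/amber) alert thresholds from historical data per protocol, scanner, and size bin as described above, the pandas snippet below could be used; the column names are assumptions about the underlying schema, not the platform's actual implementation.

    ```python
    import pandas as pd

    def alert_thresholds(history: pd.DataFrame) -> pd.DataFrame:
        """Yellow/amber CTDIvol thresholds as 95th/99th percentiles of historical exams,
        stratified by protocol, scanner, and size bin."""
        grouped = history.groupby(["protocol", "scanner", "size_bin"])["ctdi_vol"]
        return grouped.quantile([0.95, 0.99]).unstack().rename(
            columns={0.95: "yellow", 0.99: "amber"})

    def classify(exam_row, thresholds: pd.DataFrame) -> str:
        """Classify a single exam against the thresholds for its stratum."""
        limits = thresholds.loc[(exam_row["protocol"], exam_row["scanner"], exam_row["size_bin"])]
        if exam_row["ctdi_vol"] > limits["amber"]:
            return "amber"
        return "yellow" if exam_row["ctdi_vol"] > limits["yellow"] else "ok"
    ```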

  12. Proposal of a method for the evaluation of inaccuracy of home sphygmomanometers.

    PubMed

    Akpolat, Tekin

    2009-10-01

    There is no formal protocol for evaluating the individual accuracy of home sphygmomanometers. The aims of this study were to propose a method for achieving accuracy in automated home sphygmomanometers and to test the applicability of the defined method. The purposes of this method were to avoid major inaccuracies and to estimate the optimal circumstance for individual accuracy. The method has three stages and sequential measurement of blood pressure is used. The tested devices were categorized into four groups: accurate, acceptable, inaccurate and very inaccurate (major inaccuracy). The defined method takes approximately 10 min (excluding relaxation time) and was tested on three different occasions. The application of the method has shown that inaccuracy is a common problem among non-tested devices, that validated devices are superior to those that are non-validated or whose validation status is unknown, that major inaccuracy is common, especially in non-tested devices and that validation does not guarantee individual accuracy. A protocol addressing the accuracy of a particular sphygmomanometer in an individual patient is required, and a practical method has been suggested to achieve this. This method can be modified, but the main idea and approach should be preserved unless a better method is proposed. The purchase of validated devices and evaluation of accuracy for the purchased device in an individual patient will improve the monitoring of self-measurement of blood pressure at home. This study addresses device inaccuracy, but errors related to the patient, observer or blood pressure measurement technique should not be underestimated, and strict adherence to the manufacturer's instructions is essential.
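
    As a concrete, if simplified, illustration of how such a categorization could be applied, the sketch below classifies a device from the differences between its readings and reference readings taken sequentially. The 5/10/15 mmHg cut-offs are purely illustrative assumptions; the paper defines a three-stage procedure and four categories, but the abstract does not give numeric limits.

      def classify_device(differences_mmHg):
          """Classify a home sphygmomanometer from the device-minus-reference differences
          of sequential readings. Cut-offs are illustrative, not taken from the paper."""
          mean_abs_diff = sum(abs(d) for d in differences_mmHg) / len(differences_mmHg)
          if mean_abs_diff <= 5:
              return "accurate"
          if mean_abs_diff <= 10:
              return "acceptable"
          if mean_abs_diff <= 15:
              return "inaccurate"
          return "very inaccurate (major inaccuracy)"

      print(classify_device([3, -4, 6, 2]))  # "accurate" under these illustrative cut-offs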

  13. Development of a protocol to optimize electric power consumption and life cycle environmental impacts for operation of wastewater treatment plant.

    PubMed

    Piao, Wenhua; Kim, Changwon; Cho, Sunja; Kim, Hyosoo; Kim, Minsoo; Kim, Yejin

    2016-12-01

    In wastewater treatment plants (WWTPs), the portion of operating costs related to electric power consumption is increasing. If the electric power consumption decreased, however, it would be difficult to comply with the effluent water quality requirements. A protocol was proposed in this study to minimize the environmental impacts as well as to optimize the electric power consumption under the conditions needed to meet the effluent water quality standards. This protocol comprised six procedural phases and was tested using operating data from S-WWTP to prove its applicability. The 11 major operating variables were categorized into three groups using principal component analysis and K-means cluster analysis. Life cycle assessment (LCA) was conducted for each group to deduce the optimal operating conditions for each operating state. Then, employing mathematical modeling, six improvement plans to reduce electric power consumption were deduced. The electric power consumptions for the suggested plans were estimated using an artificial neural network. This was followed by a second round of LCA conducted on the plans. As a result, a set of optimized improvement plans was derived for each group, able to optimize the electric power consumption and the life cycle environmental impact at the same time. Based on these test results, the WWTP operating management protocol presented in this study is deemed able to suggest optimal operating conditions under which power consumption can be optimized with minimal life cycle environmental impact, while allowing the plant to meet water quality requirements.
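
    The grouping step in the early phases of the protocol (principal component analysis followed by K-means clustering) can be sketched in a few lines. The snippet below groups plant operating records into three operating states; reading the abstract literally, it is the 11 variables that are grouped, so treating the records as the clustering unit is an interpretation made here for illustration, and scikit-learn, the retained component count, and the random seed are all assumptions.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans

      def group_operating_states(X, n_components=3, n_groups=3, seed=0):
          """PCA on the operating variables followed by K-means clustering into
          operating-state groups. X has shape (n_records, 11)."""
          scores = PCA(n_components=n_components).fit_transform(X)
          return KMeans(n_clusters=n_groups, n_init=10, random_state=seed).fit_predict(scores)

      # Synthetic data standing in for operating records from the plant.
      labels = group_operating_states(np.random.default_rng(0).normal(size=(500, 11)))
      print(np.bincount(labels))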

  14. Optimized protocol for quantitative multiple reaction monitoring-based proteomic analysis of formalin-fixed, paraffin embedded tissues

    PubMed Central

    Kennedy, Jacob J.; Whiteaker, Jeffrey R.; Schoenherr, Regine M.; Yan, Ping; Allison, Kimberly; Shipley, Melissa; Lerch, Melissa; Hoofnagle, Andrew N.; Baird, Geoffrey Stuart; Paulovich, Amanda G.

    2016-01-01

    Despite a clinical, economic, and regulatory imperative to develop companion diagnostics, precious few new biomarkers have been successfully translated into clinical use, due in part to inadequate protein assay technologies to support large-scale testing of hundreds of candidate biomarkers in formalin-fixed paraffin embedded (FFPE) tissues. While the feasibility of using targeted, multiple reaction monitoring-mass spectrometry (MRM-MS) for quantitative analyses of FFPE tissues has been demonstrated, protocols have not been systematically optimized for robust quantification across a large number of analytes, nor has the performance of peptide immuno-MRM been evaluated. To address this gap, we used a test battery approach coupled to MRM-MS with the addition of stable isotope labeled standard peptides (targeting 512 analytes) to quantitatively evaluate the performance of three extraction protocols in combination with three trypsin digestion protocols (i.e. 9 processes). A process based on RapiGest buffer extraction and urea-based digestion was identified to enable similar quantitation results from FFPE and frozen tissues. Using the optimized protocols for MRM-based analysis of FFPE tissues, median precision was 11.4% (across 249 analytes). There was excellent correlation between measurements made on matched FFPE and frozen tissues, both for direct MRM analysis (R2 = 0.94) and immuno-MRM (R2 = 0.89). The optimized process enables highly reproducible, multiplex, standardizable, quantitative MRM in archival tissue specimens. PMID:27462933
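
    The two summary statistics quoted above (median precision across analytes and the FFPE-versus-frozen correlation) are straightforward to reproduce from raw measurements. The sketch below shows one way to compute them; the data layout is an assumption, and this is not the authors' statistical pipeline.

      import numpy as np

      def median_cv_percent(replicate_matrix):
          """Median coefficient of variation (%) across analytes, given an array of
          measured peak-area ratios with shape (n_analytes, n_replicates)."""
          m = np.asarray(replicate_matrix, dtype=float)
          cv = 100.0 * m.std(axis=1, ddof=1) / m.mean(axis=1)
          return float(np.median(cv))

      def r_squared(ffpe, frozen):
          """Squared Pearson correlation between matched FFPE and frozen measurements."""
          r = np.corrcoef(ffpe, frozen)[0, 1]
          return r * r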

  15. Deterministic generation of remote entanglement with active quantum feedback

    DOE PAGES

    Martin, Leigh; Motzoi, Felix; Li, Hanhan; ...

    2015-12-10

    We develop and study protocols for deterministic remote entanglement generation using quantum feedback, without relying on an entangling Hamiltonian. In order to formulate the most effective experimentally feasible protocol, we introduce the notion of average-sense locally optimal feedback protocols, which do not require real-time quantum state estimation, a difficult component of real-time quantum feedback control. We use this notion of optimality to construct two protocols that can deterministically create maximal entanglement: a semiclassical feedback protocol for low-efficiency measurements and a quantum feedback protocol for high-efficiency measurements. The latter reduces to direct feedback in the continuous-time limit, whose dynamics can be modeled by a Wiseman-Milburn feedback master equation, which yields an analytic solution in the limit of unit measurement efficiency. Our formalism can smoothly interpolate between continuous-time and discrete-time descriptions of feedback dynamics and we exploit this feature to derive a superior hybrid protocol for arbitrary nonunit measurement efficiency that switches between quantum and semiclassical protocols. Lastly, we show using simulations incorporating experimental imperfections that deterministic entanglement of remote superconducting qubits may be achieved with current technology using the continuous-time feedback protocol alone.

  16. Whole-body computed tomography in trauma patients: optimization of the patient scanning position significantly shortens examination time while maintaining diagnostic image quality.

    PubMed

    Hickethier, Tilman; Mammadov, Kamal; Baeßler, Bettina; Lichtenstein, Thorsten; Hinkelbein, Jochen; Smith, Lucy; Plum, Patrick Sven; Chon, Seung-Hun; Maintz, David; Chang, De-Hua

    2018-01-01

    The study was conducted to compare examination time and artifact vulnerability of whole-body computed tomographies (wbCTs) for trauma patients using conventional or optimized patient positioning. Examination time was measured in 100 patients scanned with conventional protocol (Group A: arms positioned alongside the body for head and neck imaging and over the head for trunk imaging) and 100 patients scanned with optimized protocol (Group B: arms flexed on a chest pillow without repositioning). Additionally, influence of two different scanning protocols on image quality in the most relevant body regions was assessed by two blinded readers. Total wbCT duration was about 35% or 3:46 min shorter in B than in A. Artifacts in aorta (27 vs 6%), liver (40 vs 8%) and spleen (27 vs 5%) occurred significantly more often in B than in A. No incident of non-diagnostic image quality was reported, and no significant differences for lungs and spine were found. An optimized wbCT positioning protocol for trauma patients allows a significant reduction of examination time while still maintaining diagnostic image quality.

  17. Developing an Anti-Xa-Based Anticoagulation Protocol for Patients with Percutaneous Ventricular Assist Devices.

    PubMed

    Sieg, Adam; Mardis, B Andrew; Mardis, Caitlin R; Huber, Michelle R; New, James P; Meadows, Holly B; Cook, Jennifer L; Toole, J Matthew; Uber, Walter E

    2015-01-01

    Because of the complexities associated with anticoagulation in temporary percutaneous ventricular assist device (pVAD) recipients, a lack of standardization exists in their management. This retrospective analysis evaluates current anticoagulation practices at a single center with the aim of identifying an optimal anticoagulation strategy and protocol. Patients were divided into two cohorts based on pVAD implanted (CentriMag (Thoratec; Pleasanton, CA) / TandemHeart (CardiacAssist; Pittsburgh, PA) or Impella (Abiomed, Danvers, MA)), with each group individually analyzed for bleeding and thrombotic complications. Patients in the CentriMag/TandemHeart cohort were subdivided based on the anticoagulation monitoring strategy (activated partial thromboplastin time (aPTT) or antifactor Xa unfractionated heparin (anti-Xa) values). In the CentriMag/TandemHeart cohort, there were five patients with anticoagulation titrated based on anti-Xa values; one patient developed a device thrombosis and a major bleed, whereas another patient experienced major bleeding. Eight patients received an Impella pVAD. Seven total major bleeds in three patients and no thrombotic events were detected. Based on distinct differences between the devices, anti-Xa values, and outcomes, two protocols were created to guide anticoagulation adjustments. However, anticoagulation in patients who require pVAD support is complex with constantly evolving anticoagulation goals. The ideal level of anticoagulation should be individually determined using several coagulation laboratory parameters in concert with hemodynamic changes in the patient's clinical status, the device, and the device cannulation.

  18. Rationally optimized cryopreservation of multiple mouse embryonic stem cell lines: I--Comparative fundamental cryobiology of multiple mouse embryonic stem cell lines and the implications for embryonic stem cell cryopreservation protocols.

    PubMed

    Kashuba, Corinna M; Benson, James D; Critser, John K

    2014-04-01

    The post-thaw recovery of mouse embryonic stem cells (mESCs) is often assumed to be adequate with current methods. However as this publication will show, this recovery of viable cells actually varies significantly by genetic background. Therefore there is a need to improve the efficiency and reduce the variability of current mESC cryopreservation methods. To address this need, we employed the principles of fundamental cryobiology to improve the cryopreservation protocol of four mESC lines from different genetic backgrounds (BALB/c, CBA, FVB, and 129R1 mESCs) through a comparative study characterizing the membrane permeability characteristics and membrane integrity osmotic tolerance limits of each cell line. In the companion paper, these values were used to predict optimal cryoprotectants, cooling rates, warming rates, and plunge temperatures, and then these predicted optimal protocols were validated against standard freezing protocols. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. Social support systems as determinants of self-management and quality of life of people with diabetes across Europe: study protocol for an observational study.

    PubMed

    Koetsenruijter, Jan; van Lieshout, Jan; Vassilev, Ivaylo; Portillo, Mari Carmen; Serrano, Manuel; Knutsen, Ingrid; Roukova, Poli; Lionis, Christos; Todorova, Elka; Foss, Christina; Rogers, Anne; Wensing, Michel

    2014-03-04

    Long-term conditions pose major challenges for healthcare systems. Optimizing self-management of people with long-term conditions is an important strategy to improve quality of life, health outcomes, patient experiences in healthcare, and the sustainability of healthcare systems. Much research on self-management focuses on individual competencies, while the social systems of support that facilitate self-management are underexplored. The presented study aims to explore the role of social systems of support for self-management and quality of life, focusing on the social networks of people with diabetes and community organisations that serve them. The protocol concerns a cross-sectional study in 18 geographic areas in six European countries, involving a total of 1800 individuals with diabetes and 900 representatives of community organisations. In each country, we include a deprived rural area, a deprived urban area, and an affluent urban area. Individuals are recruited through healthcare practices in the targeted areas. A patient questionnaire comprises measures for quality of life, self-management behaviours, social network and social support, as well as individual characteristics. A community organisations' survey maps out interconnections between community and voluntary organisations that support patients with chronic illness and documents the scope of work of the different types of organisations. We first explore the structure of social networks of individuals and of community organisations. Then linkages between these social networks, self-management and quality of life will be examined, taking deprivation and other factors into account. This study will provide insight into determinants of self-management and quality of life in individuals with diabetes, focusing on the role of social networks and community organisations.

  20. Sequence optimization to reduce velocity offsets in cardiovascular magnetic resonance volume flow quantification - A multi-vendor study

    PubMed Central

    2011-01-01

    Purpose Eddy current induced velocity offsets are of concern for accuracy in cardiovascular magnetic resonance (CMR) volume flow quantification. However, currently known theoretical aspects of eddy current behavior have not led to effective guidelines for the optimization of flow quantification sequences. This study is aimed at identifying correlations between protocol parameters and the resulting velocity error in clinical CMR flow measurements in a multi-vendor study. Methods Nine 1.5T scanners of three different types/vendors were studied. Measurements were performed on a large stationary phantom. Starting from a clinical breath-hold flow protocol, several protocol parameters were varied. Acquisitions were made in three clinically relevant orientations. Additionally, a time delay between the bipolar gradient and read-out, asymmetric versus symmetric velocity encoding, and gradient amplitude and slew rate were studied in adapted sequences as exploratory measurements beyond the protocol. Image analysis determined the worst-case offset for a typical great-vessel flow measurement. Results The results showed a great variation in offset behavior among scanners (standard deviation among samples of 0.3, 0.4, and 0.9 cm/s for the three different scanner types), even for small changes in the protocol. Considering the absolute values, none of the tested protocol settings consistently reduced the velocity offsets below the critical level of 0.6 cm/s, either for all three orientations or for all three scanner types. Using multilevel linear model analysis, oblique aortic and pulmonary slices showed systematically higher offsets than the transverse aortic slices (oblique aortic 0.6 cm/s, and pulmonary 1.8 cm/s higher than transverse aortic). The exploratory measurements beyond the protocol yielded some new leads for further sequence development towards reduction of velocity offsets; however, those protocols were not always compatible with the time-constraints of breath-hold imaging and flow-related artefacts. Conclusions This study showed that with current systems there was no generic protocol which resulted in acceptable flow offset values. Protocol optimization would have to be performed on a per-scanner and per-protocol basis. Proper optimization might make accurate (transverse) aortic flow quantification possible for most scanners. Pulmonary flow quantification would still need further (offline) correction. PMID:21388521

  1. Shuffle Optimizer: A Program to Optimize DNA Shuffling for Protein Engineering.

    PubMed

    Milligan, John N; Garry, Daniel J

    2017-01-01

    DNA shuffling is a powerful tool to develop libraries of variants for protein engineering. Here, we present a protocol to use our freely available and easy-to-use computer program, Shuffle Optimizer. Shuffle Optimizer is written in the Python computer language and increases the nucleotide homology between two pieces of DNA desired to be shuffled together without changing the amino acid sequence. In addition we also include sections on optimal primer design for DNA shuffling and library construction, a small-volume ultrasonicator method to create sheared DNA, and finally a method to reassemble the sheared fragments and recover and clone the library. The Shuffle Optimizer program and these protocols will be useful to anyone desiring to perform any of the nucleotide homology-dependent shuffling methods.
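
    To make the idea concrete, the sketch below recodes one coding sequence with synonymous codons so that it matches an alignment partner at as many nucleotide positions as possible while leaving the protein sequence untouched. It is a simplified illustration of the principle behind Shuffle Optimizer, not the program's actual algorithm; the standard genetic code and equal-length, in-frame inputs are assumptions.

      from itertools import product

      BASES = "TCAG"
      AMINO = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
      CODON_TABLE = {a + b + c: AMINO[i] for i, (a, b, c) in enumerate(product(BASES, repeat=3))}
      SYNONYMS = {}
      for codon, aa in CODON_TABLE.items():
          SYNONYMS.setdefault(aa, []).append(codon)

      def recode_for_homology(target_dna, template_dna):
          """Recode target_dna codon by codon, choosing the synonymous codon that agrees
          with the aligned template_dna codon at the most positions; the encoded amino
          acid sequence is unchanged while nucleotide identity to the template increases."""
          out = []
          for i in range(0, len(target_dna), 3):
              codon, ref = target_dna[i:i + 3], template_dna[i:i + 3]
              best = max(SYNONYMS[CODON_TABLE[codon]],
                         key=lambda c: sum(x == y for x, y in zip(c, ref)))
              out.append(best)
          return "".join(out)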

  2. A core-shell column approach to a comprehensive high-performance liquid chromatography phenolic analysis of Vitis vinifera L. and interspecific hybrid grape juices, wines, and other matrices following either solid phase extraction or direct injection.

    PubMed

    Manns, David C; Mansfield, Anna Katharine

    2012-08-17

    Four high-throughput reverse-phase chromatographic protocols utilizing two different core-shell column chemistries have been developed to analyze the phenolic profiles of complex matrices, specifically targeting juices and wines produced from interspecific hybrid grape cultivars. Following pre-fractionation via solid-phase extraction or direct injection, individual protocols were designed to resolve, identify and quantify specific chemical classes of compounds including non-anthocyanin monomeric phenolics, condensed tannins following acid hydrolysis, and anthocyanins. Detection levels ranging from 1.2 ppb to 27.5 ppb, analyte %RSDs ranging from 0.04 to 0.38, and linear ranges of quantitation approaching five orders of magnitude were achieved using conventional HPLC instrumentation. Using C(18) column chemistry, the non-anthocyanin monomeric protocol effectively separated a set of 16 relevant phenolic compounds comprising flavan-3-ols, hydroxycinnamic acids, and flavonols in under 14 min. The same column was used to develop a 15-min protocol for hydrolyzed condensed tannin analysis. Two anthocyanin protocols are presented, one utilizing the same C(18) column, best suited for anthocyanidin and monoglucoside analysis, the other utilizing a pentafluorophenyl chemistry optimized to effectively separate complex mixtures of coexisting mono- and diglucoside anthocyanins. These protocols and column chemistries have been used initially to explore a wide variety of complex phenolic matrices, including red and white juices and wines produced from Vitis vinifera and interspecific hybrid grape cultivars, juices, teas, and plant extracts. Each protocol displayed robust matrix responses as written, yet is flexible enough to be easily modified to suit specifically tailored analytical requirements. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. Brazilian Samba Protocol for Individuals With Parkinson’s Disease: A Clinical Non-Randomized Study

    PubMed Central

    2017-01-01

    Background In the 10 most populated countries in the world, Parkinson's disease (PD) affects more than 5 million individuals. Despite optimal treatment options already developed for the disease, concomitant involvement of other areas of health care plays an important role in complementing the treatment. From this perspective, dancing can be viewed as a non-drug alternative that can reduce falls by improving some motor skills, such as mobility, balance, gait, and posture, and can also improve the overall quality of life. Brazilian samba promotes improvement in motor and non-motor symptoms in individuals with PD, providing a new treatment option for this population. Objective The main objective of this quasi-experimental study is to provide a 12-week samba protocol (2x/week) for individuals with PD and to compare its effects with the group without intervention. The hypothesis is that the Brazilian samba protocol will promote improvement in primary (motor) and secondary (non-motor) outcomes in individuals with PD. Methods The sample will be selected at random from individuals diagnosed with PD in the city of Florianopolis (SC, Brazil). Sample size calculation was performed with the G*Power 3.1.9.2 software, with 0.447 effect size, at 5% significance level, power of 0.9, and test and sample loss of 20%. This yielded 60 individuals divided between the intervention and control groups. The questionnaires will be filled out before and after the dance intervention. The data collection for the control group will be held simultaneously to the intervention group. The classes will last for 1 hour, twice a week in the evening for 12 weeks, and all classes will be divided into warm-up, main part, and relaxation. Two-way analysis of variance with repeated measures and Sidak post-hoc comparison test will be used for a comparative analysis of the final results of the control group with the experimental group and of the within-group changes between pre- and postintervention period. Results We expect to complete follow-up in September 2017. Conclusions The major inspiration for this study was to encourage the creation of new rehabilitation programs that do not emphasize doctor involvement. This is a unique protocol for PD and we believe it can be an important tool to alleviate the motor and non-motor symptoms of individuals with PD. Dance is a simple activity depending on little equipment and few financial resources, facilitating its implementation and improving the cost-benefit relationship. In addition, activities that have a cultural aspect for the population in question, and which are pleasant, enable the participants to commit long term. This can enhance patient’s compliance with the therapy, which is often a problem for many rehabilitation programs. PMID:28676466

  4. Improvement of electroporation to deliver plasmid DNA into dental follicle cells

    PubMed Central

    Yao, Shaomian; Rana, Samir; Liu, Dawen; Wise, Gary E.

    2010-01-01

    Electroporation DNA transfer is a simple and versatile approach to deliver genes. To develop an optimal electroporation protocol to deliver DNA into cells, we conducted square-wave electroporation experiments using rat dental follicle cells as follows: 1) the cells were electroporated at different electric field strengths with lac Z plasmid; 2) plasmid concentrations were tested to determine the optimal doses; 3) various concentrations of bovine serum albumin or fetal bovine serum were added to the pulsing buffer; and 4) the pulsing durations were studied to determine the optimal duration. These experiments indicated that the optimal electroporation electric field strength was 375 V/cm, and that plasmid concentrations greater than 0.18 μg/μl were required to achieve high transfection efficiency. BSA or FBS in the pulsing buffer significantly improved cell survival and increased the number of transfected cells. The optimal pulsing duration was in the range of 45 to 120 milliseconds (ms) at 375 V/cm. Thus, an improved electroporation protocol was established by optimizing the above parameters. In turn, this electroporation protocol can be used to deliver DNA into dental follicle cells to study the roles of candidate genes in regulating tooth eruption. PMID:19830717

  5. Automation of sample preparation for mass cytometry barcoding in support of clinical research: protocol optimization.

    PubMed

    Nassar, Ala F; Wisnewski, Adam V; Raddassi, Khadir

    2017-03-01

    Analysis of multiplexed assays is highly important for clinical diagnostics and other analytical applications. Mass cytometry enables multi-dimensional, single-cell analysis of cell type and state. In mass cytometry, the rare earth metals used as reporters on antibodies allow determination of marker expression in individual cells. Barcode-based bioassays for CyTOF are able to encode and decode for different experimental conditions or samples within the same experiment, facilitating progress in producing straightforward and consistent results. Herein, an integrated protocol for automated sample preparation for barcoding used in conjunction with mass cytometry for clinical bioanalysis samples is described; we offer results of our work with barcoding protocol optimization. In addition, we present some points to be considered in order to minimize the variability of quantitative mass cytometry measurements. For example, we discuss the importance of having multiple populations during titration of the antibodies and effect of storage and shipping of labelled samples on the stability of staining for purposes of CyTOF analysis. Data quality is not affected when labelled samples are stored either frozen or at 4 °C and used within 10 days; we observed that cell loss is greater if cells are washed with deionized water prior to shipment or are shipped in lower concentration. Once the labelled samples for CyTOF are suspended in deionized water, the analysis should be performed expeditiously, preferably within the first hour. Damage can be minimized if the cells are resuspended in phosphate-buffered saline (PBS) rather than deionized water while waiting for data acquisition.

  6. SU-F-J-16: Planar KV Imaging Dose Reduction Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gershkevitsh, E; Zolotuhhin, D

    Purpose: IGRT has become an indispensable tool in modern radiotherapy, with kV imaging used in many departments due to superior image quality and lower dose when compared to MV imaging. Many departments use manufacturer-supplied protocols for imaging which are not always optimised between image quality and radiation dose (ALARA). Methods: The whole-body phantom PBU-50 (Kyoto Kagaku Ltd., Japan) for imaging in radiology has been imaged on a Varian iX accelerator (Varian Medical Systems, USA) with the OBI 1.5 system. Manufacturer’s default protocols were adapted by modifying kV and mAs values when imaging different anatomical regions of the phantom (head, thorax, abdomen, pelvis, extremities). Images with different settings were independently reviewed by two persons and their suitability for IGRT set-up correction protocols was evaluated. The suitable images with the lowest mAs were then selected. The entrance surface dose (ESD) for the manufacturer’s default protocols and the modified protocols was measured with an RTI Black Piranha (RTI Group, Sweden) and compared. Image quality was also measured with a kVQC phantom (Standard Imaging, USA) for different protocols. The modified protocols have been applied for clinical work. Results: For most cases, optimized protocols reduced the ESD on average by a factor of 3 (range 0.9–8.5). Further reduction in ESD has been observed by applying a bow-tie filter designed for CBCT. The largest reduction in dose (12.2 times) was observed for the Thorax lateral protocol. The dose was slightly increased (by 10%) for the large pelvis AP protocol. Conclusion: Manufacturer’s default IGRT protocols could be optimised to reduce the ESD to the patient without losing the necessary image quality for patient set-up correction. For patient set-up with planar kV imaging the bony anatomy is mostly used, and optimization should focus on this aspect. Therefore, the current approach with an anthropomorphic phantom is more advantageous for optimization than standard kV quality control phantoms and SNR metrics.

  7. High-resolution Modeling Assisted Design of Customized and Individualized Transcranial Direct Current Stimulation Protocols

    PubMed Central

    Bikson, Marom; Rahman, Asif; Datta, Abhishek; Fregni, Felipe; Merabet, Lotfi

    2012-01-01

    Objectives Transcranial direct current stimulation (tDCS) is a neuromodulatory technique that delivers low-intensity currents facilitating or inhibiting spontaneous neuronal activity. tDCS is attractive since dose is readily adjustable by simply changing electrode number, position, size, shape, and current. In the recent past, computational models have been developed with increased precision with the goal to help customize tDCS dose. The aim of this review is to discuss the incorporation of high-resolution patient-specific computer modeling to guide and optimize tDCS. Methods In this review, we discuss the following topics: (i) The clinical motivation and rationale for models of transcranial stimulation is considered pivotal in order to leverage the flexibility of neuromodulation; (ii) The protocols and the workflow for developing high-resolution models; (iii) The technical challenges and limitations of interpreting modeling predictions, and (iv) Real cases merging modeling and clinical data illustrating the impact of computational models on the rational design of rehabilitative electrotherapy. Conclusions Though modeling for non-invasive brain stimulation is still in its development phase, it is predicted that with increased validation, dissemination, simplification and democratization of modeling tools, computational forward models of neuromodulation will become useful tools to guide the optimization of clinical electrotherapy. PMID:22780230

  8. High-throughput transformation of Saccharomyces cerevisiae using liquid handling robots.

    PubMed

    Liu, Guangbo; Lanham, Clayton; Buchan, J Ross; Kaplan, Matthew E

    2017-01-01

    Saccharomyces cerevisiae (budding yeast) is a powerful eukaryotic model organism ideally suited to high-throughput genetic analyses, which time and again has yielded insights that further our understanding of cell biology processes conserved in humans. Lithium Acetate (LiAc) transformation of yeast with DNA for the purposes of exogenous protein expression (e.g., plasmids) or genome mutation (e.g., gene mutation, deletion, epitope tagging) is a useful and long established method. However, a reliable and optimized high throughput transformation protocol that runs almost no risk of human error has not been described in the literature. Here, we describe such a method that is broadly transferable to most liquid handling high-throughput robotic platforms, which are now commonplace in academic and industry settings. Using our optimized method, we are able to comfortably transform approximately 1200 individual strains per day, allowing complete transformation of typical genomic yeast libraries within 6 days. In addition, use of our protocol for gene knockout purposes also provides a potentially quicker, easier and more cost-effective approach to generating collections of double mutants than the popular and elegant synthetic genetic array methodology. In summary, our methodology will be of significant use to anyone interested in high throughput molecular and/or genetic analysis of yeast.

  9. Thermodynamic metrics and optimal paths.

    PubMed

    Sivak, David A; Crooks, Gavin E

    2012-05-11

    A fundamental problem in modern thermodynamics is how a molecular-scale machine performs useful work, while operating away from thermal equilibrium without excessive dissipation. To this end, we derive a friction tensor that induces a Riemannian manifold on the space of thermodynamic states. Within the linear-response regime, this metric structure controls the dissipation of finite-time transformations, and bestows optimal protocols with many useful properties. We discuss the connection to the existing thermodynamic length formalism, and demonstrate the utility of this metric by solving for optimal control parameter protocols in a simple nonequilibrium model.
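
    For reference, the standard linear-response expressions behind this construction, in the notation commonly used in the thermodynamic-length literature (a reader's aide, not a reproduction of the paper's derivation), are:

      \zeta_{ij}(\boldsymbol{\lambda}) = \beta \int_0^{\infty} \mathrm{d}t'\,
          \langle \delta X_i(t')\, \delta X_j(0) \rangle_{\boldsymbol{\lambda}},
          \qquad X_i \equiv -\partial H / \partial \lambda^i ,

      \langle W_{\mathrm{ex}} \rangle \approx \int_0^{\tau} \mathrm{d}t\;
          \dot{\lambda}^i\, \zeta_{ij}(\boldsymbol{\lambda}(t))\, \dot{\lambda}^j ,
          \qquad
      \mathcal{L} = \int_0^{\tau} \mathrm{d}t\, \sqrt{\dot{\lambda}^i\, \zeta_{ij}\, \dot{\lambda}^j},
          \qquad
      \langle W_{\mathrm{ex}} \rangle \ge \frac{\mathcal{L}^2}{\tau} .

    Under these expressions, minimum-dissipation protocols follow geodesics of the friction-tensor metric traversed at constant thermodynamic speed, which is the sense in which the metric endows optimal protocols with useful properties.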

  10. High-throughput crystallization screening.

    PubMed

    Skarina, Tatiana; Xu, Xiaohui; Evdokimova, Elena; Savchenko, Alexei

    2014-01-01

    Protein structure determination by X-ray crystallography is dependent on obtaining a single protein crystal suitable for diffraction data collection. Due to this requirement, protein crystallization represents a key step in protein structure determination. The conditions for protein crystallization have to be determined empirically for each protein, making this step also a bottleneck in the structure determination process. Typical protein crystallization practice involves parallel setup and monitoring of a considerable number of individual protein crystallization experiments (also called crystallization trials). In these trials the aliquots of purified protein are mixed with a range of solutions composed of a precipitating agent, buffer, and sometimes an additive that have been previously successful in prompting protein crystallization. The individual chemical conditions in which a particular protein shows signs of crystallization are used as a starting point for further crystallization experiments. The goal is optimizing the formation of individual protein crystals of sufficient size and quality to make them suitable for diffraction data collection. Thus the composition of the primary crystallization screen is critical for successful crystallization. Systematic analysis of crystallization experiments carried out on several hundred proteins as part of large-scale structural genomics efforts allowed the optimization of the protein crystallization protocol and identification of a minimal set of 96 crystallization solutions (the "TRAP" screen) that, in our experience, led to crystallization of the maximum number of proteins.

  11. Performance Analysis of TCP Enhancements in Satellite Data Networks

    NASA Technical Reports Server (NTRS)

    Broyles, Ren H.

    1999-01-01

    This research examines two proposed enhancements to the well-known Transmission Control Protocol (TCP) in the presence of noisy communication links. The Multiple Pipes protocol is an application-level adaptation of the standard TCP protocol, where several TCP links cooperate to transfer data. The Space Communication Protocol Standard - Transport Protocol (SCPS-TP) modifies TCP to optimize performance in a satellite environment. While SCPS-TP has inherent advantages that allow it to deliver data more rapidly than Multiple Pipes, the protocol, when optimized for operation in a high-error environment, is not compatible with legacy TCP systems, and requires changes to the TCP specification. This investigation determines the level of improvement offered by SCPS-TP's Corruption Mode, which will help determine if migration to the protocol is appropriate in different environments. As the percentage of corrupted packets approaches 5%, Multiple Pipes can take over five times longer than SCPS-TP to deliver data. At high error rates, SCPS-TP's advantage is primarily caused by Multiple Pipes' use of congestion control algorithms. The lack of congestion control, however, limits the systems in which SCPS-TP can be effectively used.

  12. Relating quantum privacy and quantum coherence: an operational approach.

    PubMed

    Devetak, I; Winter, A

    2004-08-20

    Given many realizations of a state or a channel as a resource, two parties can generate a secret key as well as entanglement. We describe protocols to perform the secret key distillation (as it turns out, with optimal rate). Then we show how to achieve optimal entanglement generation rates by "coherent" implementation of a class of secret key agreement protocols, proving the long-conjectured "hashing inequality."

  13. Rethinking Traffic Management: Design of Optimizable Networks

    DTIC Science & Technology

    2008-06-01

    Though this paper used optimization theory to design and analyze DaVinci, optimization theory is one of many possible tools to enable a grounded...dynamically allocate bandwidth shares. The distributed protocols can be implemented using DaVinci: Dynamically Adaptive VIrtual Networks for a Customized...Internet. In DaVinci, each virtual network runs traffic-management protocols optimized for a traffic class, and link bandwidth is dynamically allocated

  14. A Family of Quantum Protocols

    NASA Astrophysics Data System (ADS)

    Devetak, Igor; Harrow, Aram W.; Winter, Andreas

    2004-12-01

    We introduce three new quantum protocols involving noisy quantum channels and entangled states, and relate them operationally and conceptually with four well-known old protocols. Two of the new protocols (the mother and father) can generate the other five “child” protocols by direct application of teleportation and superdense coding, and can be derived in turn by making the old protocols “coherent.” This gives very simple proofs for two famous old protocols (the hashing inequality and quantum channel capacity) and provides the basis for optimal trade-off curves in several quantum information processing tasks.

  15. Unification of quantum information theory

    NASA Astrophysics Data System (ADS)

    Abeyesinghe, Anura

    We present the unification of many previously disparate results in noisy quantum Shannon theory and the unification of all of noiseless quantum Shannon theory. More specifically we deal here with bipartite, unidirectional, and memoryless quantum Shannon theory. We find all the optimal protocols and quantify the relationship between the resources used, both for the one-shot and for the ensemble case, for what is arguably the most fundamental task in quantum information theory: sharing entangled states between a sender and a receiver. We find that all of these protocols are derived from our one-shot superdense coding protocol and relate nicely to each other. We then move on to noisy quantum information theory and give a simple, direct proof of the "mother" protocol, or rather her generalization to the Fully Quantum Slepian-Wolf protocol (FQSW). FQSW simultaneously accomplishes two goals: quantum communication-assisted entanglement distillation, and state transfer from the sender to the receiver. As a result, in addition to her other "children," the mother protocol generates the state merging primitive of Horodecki, Oppenheim, and Winter as well as a new class of distributed compression protocols for correlated quantum sources, which are optimal for sources described by separable density operators. Moreover, the mother protocol described here is easily transformed into the so-called "father" protocol, demonstrating that the division of single-sender/single-receiver protocols into two families was unnecessary: all protocols in the family are children of the mother.

  16. Assessing and Managing Risk with Suicidal Individuals

    ERIC Educational Resources Information Center

    Linehan, Marsh M.; Comtois, Katherine A.; Ward-Ciesielski, Erin F.

    2012-01-01

    The University of Washington Risk Assessment Protocol (UWRAP) and Risk Assessment and Management Protocol (UWRAMP) have been used in numerous clinical trials treating high-risk suicidal individuals over several years. These protocols structure assessors and treatment providers to provide a thorough suicide risk assessment, review standards of care…

  17. Bulk Data Dissemination in Low Power Sensor Networks: Present and Future Directions

    PubMed Central

    Xu, Zhirong; Hu, Tianlei; Song, Qianshu

    2017-01-01

    Wireless sensor network-based (WSN-based) applications need an efficient and reliable data dissemination service to facilitate maintenance, management and data distribution tasks. As WSNs are becoming pervasive and data intensive, bulk data dissemination protocols have been studied extensively in recent years. This paper provides a comprehensive survey of the state-of-the-art bulk data dissemination protocols. A large number of papers in the literature propose various techniques to optimize dissemination protocols. Unlike existing surveys, which explore the building blocks of dissemination separately, our work categorizes the literature according to the optimization purposes: Reliability, Scalability and Transmission/Energy efficiency. By summarizing and reviewing the key insights and techniques, we further discuss the future directions for each category. Our survey helps unveil three key findings for future directions: (1) The recent advances in wireless communications (e.g., study on cross-technology interference, error estimating codes, constructive interference, capture effect) can be potentially exploited to support further optimization on the reliability and energy efficiency of dissemination protocols; (2) Dissemination in multi-channel, multi-task and opportunistic networks requires more effort to fully exploit the spatial-temporal network resources to enhance the data propagation; (3) Since many designs incur changes to MAC-layer protocols, the co-existence of dissemination with other network protocols is another problem left to be addressed. PMID:28098830

  18. Implementing a Pro-forma for Multidisciplinary Management of an Enterocutaneous Fistula: A Case Study.

    PubMed

    Samad, Sohel; Anele, Chukwuemeka; Akhtar, Mansoor; Doughan, Samer

    2015-06-01

    Optimal management of patients with an enterocutaneous fistula (ECF) requires utilization of the sepsis, nutrition, anatomy, and surgical procedure (SNAP) protocol. The protocol includes early detection and treatment of sepsis, optimizing patient nutrition through oral and parenteral routes, identifying the fistula anatomy, optimal fistula management, and proceeding to corrective surgery when appropriate. The protocol requires multidisciplinary team (MDT) coordination among surgeons, nurses, dietitians, stoma nurses, and physiotherapists. This case study describes a 70-year-old man who developed an ECF subsequent to a laparotomy for a small bowel obstruction. Following a period of ileus, 16 days post laparotomy the patient developed a high-output (2,000 mL per day) fistula. The patient also became pyrexial with raised inflammatory markers, requiring antibiotic treatment. Following development of his ECF, he was managed using the SNAP protocol for the duration of his admission; however, in implementing this protocol with this patient, clinicians noted fluid charts were inadequate to allow effective management of the variables. Thus, a new pro-forma was created that encompassed fluid balance, nutritional status, and pertinent blood test results, as well as perifistular skin condition, medication, and documentation of management plans from the MDT. The pro-forma was recorded daily in the patient notes. Following implementation of the pro-forma and the SNAP protocol, the patient recovered well clinically over a period of 4 weeks with a decrease in his fistula output to 300-500 mL per day, and he was discharged with plans for further corrective surgery to resect the fistula and for bowel re-anastomoses. Although fluid charts are readily available, they do not include all pertinent variables for optimal management of patients with an ECF. Further research is needed to validate the pro-forma and evaluate its effect on patient outcomes.

  19. Single-source dual-energy spectral multidetector CT of pancreatic adenocarcinoma: optimization of energy level viewing significantly increases lesion contrast.

    PubMed

    Patel, B N; Thomas, J V; Lockhart, M E; Berland, L L; Morgan, D E

    2013-02-01

    To evaluate lesion contrast in pancreatic adenocarcinoma patients using spectral multidetector computed tomography (MDCT) analysis. The present institutional review board-approved, Health Insurance Portability and Accountability Act of 1996 (HIPAA)-compliant retrospective study evaluated 64 consecutive adults with pancreatic adenocarcinoma examined using a standardized, multiphasic protocol on a single-source, dual-energy MDCT system. Pancreatic phase images (35 s) were acquired in dual-energy mode; unenhanced and portal venous phases used standard MDCT. Lesion contrast was evaluated on an independent workstation using dual-energy analysis software, comparing tumour to non-tumoural pancreas attenuation (HU) differences and tumour diameter at three energy levels: 70 keV; individual subject-optimized viewing energy level (based on the maximum contrast-to-noise ratio, CNR); and 45 keV. The image noise was measured for the same three energies. Differences in lesion contrast, diameter, and noise between the different energy levels were analysed using analysis of variance (ANOVA). Quantitative differences in contrast gain between 70 keV and CNR-optimized viewing energies, and between CNR-optimized and 45 keV were compared using the paired t-test. Thirty-four women and 30 men (mean age 68 years) had a mean tumour diameter of 3.6 cm. The median optimized energy level was 50 keV (range 40-77). The mean ± SD lesion contrast values (non-tumoural pancreas - tumour attenuation) were: 57 ± 29, 115 ± 70, and 146 ± 74 HU (p = 0.0005); the lengths of the tumours were: 3.6, 3.3, and 3.1 cm, respectively (p = 0.026); and the contrast to noise ratios were: 24 ± 7, 39 ± 12, and 59 ± 17 (p = 0.0005) for 70 keV, the optimized energy level, and 45 keV, respectively. For individuals, the mean ± SD contrast gain from 70 keV to the optimized energy level was 59 ± 45 HU; and the mean ± SD contrast gain from the optimized energy level to 45 keV was 31 ± 25 HU (p = 0.007). Significantly increased pancreatic lesion contrast was noted at lower viewing energies using spectral MDCT. Individual patient CNR-optimized energy level images have the potential to improve lesion conspicuity. Copyright © 2012 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
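
    The per-patient energy optimisation described above amounts to scanning the reconstructed virtual monochromatic levels for the one that maximises CNR. The helper below illustrates that selection; the function, input layout, and example numbers are illustrative assumptions, not the vendor's dual-energy analysis software.

      import numpy as np

      def optimal_energy_level(energies_keV, hu_pancreas, hu_tumour, noise):
          """Return the energy (keV) maximising lesion contrast-to-noise ratio,
          CNR = |HU_pancreas - HU_tumour| / noise, over the reconstructed levels."""
          cnr = np.abs(np.asarray(hu_pancreas) - np.asarray(hu_tumour)) / np.asarray(noise)
          best = int(np.argmax(cnr))
          return energies_keV[best], float(cnr[best])

      # Hypothetical single-patient measurements at three virtual monochromatic levels.
      print(optimal_energy_level([45, 50, 70],
                                 hu_pancreas=[190, 160, 110],
                                 hu_tumour=[44, 45, 53],
                                 noise=[2.5, 2.2, 2.4]))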

  20. Optimization of three FISH procedures for in situ detection of anaerobic ammonium oxidizing bacteria in biological wastewater treatment.

    PubMed

    Pavlekovic, Marko; Schmid, Markus C; Schmider-Poignee, Nadja; Spring, Stefan; Pilhofer, Martin; Gaul, Tobias; Fiandaca, Mark; Löffler, Frank E; Jetten, Mike; Schleifer, K-H; Lee, Natuschka M

    2009-08-01

    Fluorescence in situ hybridization (FISH) using fluorochrome-labeled DNA oligonucleotide probes has been successfully applied for in situ detection of anaerobic ammonium oxidizing (anammox) bacteria. However, application of the standard FISH protocols to visualize anammox bacteria in biofilms from a laboratory-scale wastewater reactor produced only weak signals. Increased signal intensity was achieved by modifying the standard FISH protocol, by using peptide nucleic acid probes (PNA FISH), or by applying horseradish peroxidase (HRP)-labeled probes and subsequent catalyzed reporter deposition (CARD-FISH). A comparative analysis using anammox biofilm samples and suspended anammox biomass from different laboratory wastewater bioreactors revealed that the modified standard FISH protocol and the PNA FISH probes produced equally strong fluorescence signals on suspended biomass, but only weak signals were obtained with the biofilm samples. The probe signal intensities in the biofilm samples could be enhanced by enzymatic pre-treatment of fixed cells, and by increasing the hybridization time of the PNA FISH protocol. CARD-FISH always produced up to four-fold stronger fluorescent signals but unspecific fluorescence signals, likely caused by endogenous peroxidases as reported in several previous studies, compromised the results. Interference of the development of fluorescence intensity with endogenous peroxidases was also observed in cells of aerobic ammonium oxidizers like Nitrosomonas europaea, and sulfate-reducers like Desulfobacter postgatei. Interestingly, no interference was observed with other peroxidase-positive microorganisms, suggesting that CARD-FISH is not only compromised by the mere presence of peroxidases. Pre-treatment of cells with HCl or autoclaving/pasteurization failed to inactivate endogenous peroxidases, but H2O2 significantly reduced endogenous peroxidase activity. However, for optimal inactivation, different H2O2 concentrations and incubation times may be needed, depending on the nature of the sample, and should therefore always be individually determined for each study.

  1. Simple proof that Gaussian attacks are optimal among collective attacks against continuous-variable quantum key distribution with a Gaussian modulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leverrier, Anthony; Grangier, Philippe; Laboratoire Charles Fabry, Institut d'Optique, CNRS, University Paris-Sud, Campus Polytechnique, RD 128, F-91127 Palaiseau Cedex

    2010-06-15

    In this article, we give a simple proof of the fact that the optimal collective attacks against continuous-variable quantum key distribution with a Gaussian modulation are Gaussian attacks. Our proof, which makes use of symmetry properties of the protocol in phase space, is particularly relevant for the finite-key analysis of the protocol and therefore for practical applications.

  2. Evaluating optimal therapy robustness by virtual expansion of a sample population, with a case study in cancer immunotherapy

    PubMed Central

    Barish, Syndi; Ochs, Michael F.; Sontag, Eduardo D.; Gevertz, Jana L.

    2017-01-01

    Cancer is a highly heterogeneous disease, exhibiting spatial and temporal variations that pose challenges for designing robust therapies. Here, we propose the VEPART (Virtual Expansion of Populations for Analyzing Robustness of Therapies) technique as a platform that integrates experimental data, mathematical modeling, and statistical analyses for identifying robust optimal treatment protocols. VEPART begins with time course experimental data for a sample population, and a mathematical model fit to aggregate data from that sample population. Using nonparametric statistics, the sample population is amplified and used to create a large number of virtual populations. At the final step of VEPART, robustness is assessed by identifying and analyzing the optimal therapy (perhaps restricted to a set of clinically realizable protocols) across each virtual population. As proof of concept, we have applied the VEPART method to study the robustness of treatment response in a mouse model of melanoma subject to treatment with immunostimulatory oncolytic viruses and dendritic cell vaccines. Our analysis (i) showed that every scheduling variant of the experimentally used treatment protocol is fragile (nonrobust) and (ii) discovered an alternative region of dosing space (lower oncolytic virus dose, higher dendritic cell dose) for which a robust optimal protocol exists. PMID:28716945
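
    The virtual-expansion step of VEPART is, at its core, a nonparametric bootstrap of the sample population. The sketch below generates virtual populations by resampling subjects with replacement; the model fitting and optimal-protocol search that VEPART performs on each virtual population are omitted, and the data shapes are assumptions.

      import numpy as np

      def virtual_populations(sample_curves, n_virtual=500, seed=1):
          """Yield virtual populations built by resampling subjects (time-course curves)
          with replacement from the original sample population."""
          rng = np.random.default_rng(seed)
          n = len(sample_curves)
          for _ in range(n_virtual):
              idx = rng.integers(0, n, size=n)
              yield [sample_curves[i] for i in idx]

      # Hypothetical sample of 10 tumour-volume time courses, 8 time points each.
      sample = [np.cumsum(np.random.default_rng(i).uniform(0, 1, 8)) for i in range(10)]
      mean_curves = [np.mean(np.stack(vp), axis=0) for vp in virtual_populations(sample, 100)]
      print(len(mean_curves), mean_curves[0].shape)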

  3. Prevention of Osmotic Injury to Human Umbilical Vein Endothelial Cells for Biopreservation: A First Step Toward Biobanking of Endothelial Cells for Vascular Tissue Engineering.

    PubMed

    Niu, Dan; Zhao, Gang; Liu, Xiaoli; Zhou, Ping; Cao, Yunxia

    2016-03-01

    High-survival-rate cryopreservation of endothelial cells plays a critical role in vascular tissue engineering, while optimization of osmotic injuries is the first step toward successful cryopreservation. We designed a low-cost, easy-to-use, microfluidics-based microperfusion chamber to investigate the osmotic responses of human umbilical vein endothelial cells (HUVECs) at different temperatures, and then optimized the protocols for using cryoprotective agents (CPAs) to minimize osmotic injuries and improve processes before freezing and after thawing. The fundamental cryobiological parameters were measured using the microperfusion chamber, and then, the optimized protocols using these parameters were confirmed by survival evaluation and cell proliferation experiments. It was revealed for the first time that HUVECs have an unusually small permeability coefficient for Me2SO. Even at the concentrations well established for slow freezing of cells (1.5 M), one-step removal of CPAs for HUVECs might result in inevitable osmotic injuries, indicating that multiple-step removal is essential. Further experiments revealed that multistep removal of 1.5 M Me2SO at 25°C was the best protocol investigated, in good agreement with theory. These results should prove invaluable for optimization of cryopreservation protocols of HUVECs.

  4. CHARMM-GUI Input Generator for NAMD, GROMACS, AMBER, OpenMM, and CHARMM/OpenMM Simulations Using the CHARMM36 Additive Force Field

    DOE PAGES

    Lee, Jumin; Cheng, Xi; Swails, Jason M.; ...

    2015-11-12

    Here we report that proper treatment of nonbonded interactions is essential for the accuracy of molecular dynamics (MD) simulations, especially in studies of lipid bilayers. The use of the CHARMM36 force field (C36 FF) in different MD simulation programs can result in disagreements with published simulations performed with CHARMM due to differences in the protocols used to treat the long-range and 1-4 nonbonded interactions. In this study, we systematically test the use of the C36 lipid FF in NAMD, GROMACS, AMBER, OpenMM, and CHARMM/OpenMM. A wide range of Lennard-Jones (LJ) cutoff schemes and integrator algorithms were tested to find the optimal simulation protocol to best match bilayer properties of six lipids with varying acyl chain saturation and head groups. MD simulations of a 1,2-dipalmitoyl-sn-phosphatidylcholine (DPPC) bilayer were used to obtain the optimal protocol for each program. MD simulations with all programs were found to reasonably match the DPPC bilayer properties (surface area per lipid, chain order parameters, and area compressibility modulus) obtained using the standard protocol used in CHARMM as well as from experiments. The optimal simulation protocol was then applied to the other five lipid simulations and resulted in excellent agreement between results from most simulation programs as well as with experimental data. AMBER compared least favorably with the expected membrane properties, which appears to be due to its use of the hard-truncation in the LJ potential versus a force-based switching function used to smooth the LJ potential as it approaches the cutoff distance. The optimal simulation protocol for each program has been implemented in CHARMM-GUI. This protocol is expected to be applicable to the remainder of the additive C36 FF including the proteins, nucleic acids, carbohydrates, and small molecules.

  5. SU-E-I-57: Evaluation and Optimization of Effective-Dose Using Different Beam-Hardening Filters in Clinical Pediatric Shunt CT Protocol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gill, K; Aldoohan, S; Collier, J

    Purpose: Study image optimization and radiation dose reduction in the pediatric shunt CT scanning protocol through the use of different beam-hardening filters. Methods: A 64-slice CT scanner at OU Children's Hospital has been used to evaluate CT image contrast-to-noise ratio (CNR) and measure effective-doses based on the concept of the CT dose index (CTDIvol) using the pediatric head shunt scanning protocol. The routine axial pediatric head shunt scanning protocol that has been optimized for the intrinsic x-ray tube filter has been used to evaluate CNR by acquiring images using the ACR-approved CT phantom and radiation dose CT phantom, which was used to measure CTDIvol. These results were set as reference points to study and evaluate the effects of adding different filtering materials (i.e., Tungsten, Tantalum, Titanium, Nickel and Copper filters) to the existing filter on image quality and radiation dose. To ensure optimal image quality, the scanner routine air calibration was run for each added filter. The image CNR was evaluated for different kVps and a wide range of mAs values using the above-mentioned beam-hardening filters. These scanning protocols were run under axial as well as under helical techniques. The CTDIvol and the effective-dose were measured and calculated for all scanning protocols and added filtration, including the intrinsic x-ray tube filter. Results: The beam-hardening filter shapes the energy spectrum, which reduces the dose by 27%. No noticeable changes in image low-contrast detectability were observed. Conclusion: Effective-dose is very much dependent on the CTDIvol, which in turn is very much dependent on beam-hardening filters. Substantial reduction in effective-dose is realized using beam-hardening filters as compared to the intrinsic filter. This phantom study showed that significant radiation dose reduction could be achieved in CT pediatric shunt scanning protocols without compromising the diagnostic value of image quality.

  6. CHARMM-GUI Input Generator for NAMD, GROMACS, AMBER, OpenMM, and CHARMM/OpenMM Simulations Using the CHARMM36 Additive Force Field.

    PubMed

    Lee, Jumin; Cheng, Xi; Swails, Jason M; Yeom, Min Sun; Eastman, Peter K; Lemkul, Justin A; Wei, Shuai; Buckner, Joshua; Jeong, Jong Cheol; Qi, Yifei; Jo, Sunhwan; Pande, Vijay S; Case, David A; Brooks, Charles L; MacKerell, Alexander D; Klauda, Jeffery B; Im, Wonpil

    2016-01-12

    Proper treatment of nonbonded interactions is essential for the accuracy of molecular dynamics (MD) simulations, especially in studies of lipid bilayers. The use of the CHARMM36 force field (C36 FF) in different MD simulation programs can result in disagreements with published simulations performed with CHARMM due to differences in the protocols used to treat the long-range and 1-4 nonbonded interactions. In this study, we systematically test the use of the C36 lipid FF in NAMD, GROMACS, AMBER, OpenMM, and CHARMM/OpenMM. A wide range of Lennard-Jones (LJ) cutoff schemes and integrator algorithms were tested to find the optimal simulation protocol to best match bilayer properties of six lipids with varying acyl chain saturation and head groups. MD simulations of a 1,2-dipalmitoyl-sn-phosphatidylcholine (DPPC) bilayer were used to obtain the optimal protocol for each program. MD simulations with all programs were found to reasonably match the DPPC bilayer properties (surface area per lipid, chain order parameters, and area compressibility modulus) obtained using the standard protocol used in CHARMM as well as from experiments. The optimal simulation protocol was then applied to the other five lipid simulations and resulted in excellent agreement between results from most simulation programs as well as with experimental data. AMBER compared least favorably with the expected membrane properties, which appears to be due to its use of the hard-truncation in the LJ potential versus a force-based switching function used to smooth the LJ potential as it approaches the cutoff distance. The optimal simulation protocol for each program has been implemented in CHARMM-GUI. This protocol is expected to be applicable to the remainder of the additive C36 FF including the proteins, nucleic acids, carbohydrates, and small molecules.
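    A minimal sketch of what the "LJ cutoff scheme" amounts to in one of these engines is given below, assuming OpenMM and an already-built System (for example from CHARMM-GUI inputs). The 1.0/1.2 nm switching and cutoff distances are typical C36 lipid values, but OpenMM's built-in switching function acts on the potential rather than the force, whereas CHARMM uses force-based switching; CHARMM-GUI-generated scripts handle that difference explicitly, so treat this only as an illustration of the relevant settings, not the validated protocol.

```python
# Hedged sketch: adjusting the Lennard-Jones cutoff scheme on an existing OpenMM
# System. Distances are typical C36 lipid values; OpenMM's switch is applied to
# the potential, so this does not reproduce CHARMM's force-based switching exactly.
import openmm as mm
import openmm.unit as unit

def apply_lj_cutoff_scheme(system, r_switch_nm=1.0, r_cutoff_nm=1.2):
    """Turn on a smooth LJ switching function between r_switch and r_cutoff."""
    for force in system.getForces():
        if isinstance(force, mm.NonbondedForce):
            force.setCutoffDistance(r_cutoff_nm * unit.nanometer)
            force.setUseSwitchingFunction(True)
            force.setSwitchingDistance(r_switch_nm * unit.nanometer)
            # Long-range dispersion correction is typically disabled for
            # C36 lipid bilayers when a switching function is used (assumption).
            force.setUseDispersionCorrection(False)
    return system
```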

  7. Cross-layer protocol design for QoS optimization in real-time wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Hortos, William S.

    2010-04-01

    The metrics of quality of service (QoS) for each sensor type in a wireless sensor network can be associated with metrics for multimedia that describe the quality of fused information, e.g., throughput, delay, jitter, packet error rate, information correlation, etc. These QoS metrics are typically set at the highest, or application, layer of the protocol stack to ensure that performance requirements for each type of sensor data are satisfied. Application-layer metrics, in turn, depend on the support of the lower protocol layers: session, transport, network, data link (MAC), and physical. The dependencies of the QoS metrics on the performance of the higher layers of the Open System Interconnection (OSI) reference model of the WSN protocol, together with that of the lower three layers, are the basis for a comprehensive approach to QoS optimization for multiple sensor types in a general WSN model. The cross-layer design accounts for the distributed power consumption along energy-constrained routes and their constituent nodes. Following the author's previous work, the cross-layer interactions in the WSN protocol are represented by a set of concatenated protocol parameters and enabling resource levels. The "best" cross-layer designs to achieve optimal QoS are established by applying the general theory of martingale representations to the parameterized multivariate point processes (MVPPs) for discrete random events occurring in the WSN. Adaptive control of network behavior through the cross-layer design is realized through the parametric factorization of the stochastic conditional rates of the MVPPs. The cross-layer protocol parameters for optimal QoS are determined in terms of solutions to stochastic dynamic programming conditions derived from models of transient flows for heterogeneous sensor data and aggregate information over a finite time horizon. Markov state processes, embedded within the complex combinatorial history of WSN events, are more computationally tractable and lead to simplifications for any simulated or analytical performance evaluations of the cross-layer designs.
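    As a much-simplified illustration of the "concatenated protocol parameters" idea (and emphatically not the paper's martingale-based stochastic control), the toy below enumerates one setting per layer and keeps the combination that maximizes an additive QoS surrogate under an energy budget; all names and numbers are hypothetical.

```python
# Toy cross-layer parameter selection: pick one setting per layer to maximize a
# QoS surrogate under an energy budget. Brute force only illustrates the idea of
# a concatenated parameter vector; the paper's method uses martingale
# representations and stochastic dynamic programming, not enumeration.
from itertools import product

# Hypothetical per-layer options: (name, qos_gain, energy_cost)
phy = [("bpsk", 1.0, 1.0), ("qpsk", 1.6, 1.8)]
mac = [("short-preamble", 0.8, 0.5), ("long-preamble", 1.0, 0.9)]
net = [("min-hop", 0.9, 0.7), ("energy-aware", 1.1, 0.6)]

def best_design(energy_budget):
    best, best_utility = None, float("-inf")
    for combo in product(phy, mac, net):
        energy = sum(c[2] for c in combo)
        if energy > energy_budget:
            continue
        utility = sum(c[1] for c in combo)   # additive QoS surrogate
        if utility > best_utility:
            best, best_utility = tuple(c[0] for c in combo), utility
    return best, best_utility

print(best_design(energy_budget=3.0))
```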

  8. Optimal molecular profiling of tissue and tissue components: defining the best processing and microdissection methods for biomedical applications.

    PubMed

    Bova, G Steven; Eltoum, Isam A; Kiernan, John A; Siegal, Gene P; Frost, Andra R; Best, Carolyn J M; Gillespie, John W; Su, Gloria H; Emmert-Buck, Michael R

    2005-02-01

    Isolation of well-preserved pure cell populations is a prerequisite for sound studies of the molecular basis of any tissue-based biological phenomenon. This article reviews current methods for obtaining anatomically specific signals from molecules isolated from tissues, a basic requirement for productive linking of phenotype and genotype. The quality of samples isolated from tissue and used for molecular analysis is often glossed over or omitted from publications, making interpretation and replication of data difficult or impossible. Fortunately, recently developed techniques allow life scientists to better document and control the quality of samples used for a given assay, creating a foundation for improvement in this area. Tissue processing for molecular studies usually involves some or all of the following steps: tissue collection, gross dissection/identification, fixation, processing/embedding, storage/archiving, sectioning, staining, microdissection/annotation, and pure analyte labeling/identification and quantification. We provide a detailed comparison of some current tissue microdissection technologies, and provide detailed example protocols for tissue component handling upstream and downstream from microdissection. We also discuss some of the physical and chemical issues related to optimal tissue processing, and include methods specific to cytology specimens. We encourage each laboratory to use these as a starting point for optimization of their overall process of moving from collected tissue to high quality, appropriately anatomically tagged scientific results. A lack of optimized protocols is a source of inefficiency in current life science research. Improvement in this area will significantly increase life science quality and productivity. The article is divided into introduction, materials, protocols, and notes sections. Because many protocols are covered in each of these sections, information relating to a single protocol is not contiguous. To get the greatest benefit from this article, readers are advised to read through the entire article first, identify protocols appropriate to their laboratory for each step in their workflow, and then reread entries in each section pertaining to each of these single protocols.

  9. Design and Methodological Considerations of the Centers for Disease Control and Prevention Urologic and Renal Protocol for the Newborn and Young Child with Spina Bifida

    PubMed Central

    Routh, Jonathan C.; Cheng, Earl Y.; Austin, J. Christopher; Baum, Michelle A.; Gargollo, Patricio C.; Grady, Richard W.; Herron, Adrienne R.; Kim, Steven S.; King, Shelly J.; Koh, Chester J.; Paramsothy, Pangaja; Raman, Lisa; Schechter, Michael S.; Smith, Kathryn A.; Tanaka, Stacy T.; Thibadeau, Judy K.; Walker, William O.; Wallis, M. Chad; Wiener, John S.; Joseph, David B.

    2016-01-01

    Purpose Care of children with spina bifida has significantly advanced in the last half century, resulting in gains in longevity and quality of life for affected children and caregivers. Bladder dysfunction is the norm in patients with spina bifida and may result in infection, renal scarring and chronic kidney disease. However, the optimal urological management for spina bifida related bladder dysfunction is unknown. Materials and Methods In 2012 the Centers for Disease Control and Prevention convened a working group composed of pediatric urologists, nephrologists, epidemiologists, methodologists, community advocates and Centers for Disease Control and Prevention personnel to develop a protocol to optimize urological care of children with spina bifida from the newborn period through age 5 years. Results An iterative quality improvement protocol was selected. In this model participating institutions agree to prospectively treat all newborns with spina bifida using a single consensus based protocol. During the 5-year study period outcomes will be routinely assessed and the protocol adjusted as needed to optimize patient and process outcomes. Primary study outcomes include urinary tract infections, renal scarring, renal function and bladder characteristics. The protocol specifies the timing and use of testing (eg ultrasonography, urodynamics) and interventions (eg intermittent catheterization, prophylactic antibiotics, antimuscarinic medications). Starting in 2014 the Centers for Disease Control and Prevention began funding 9 study sites to implement and evaluate the protocol. Conclusions The Centers for Disease Control and Prevention Urologic and Renal Protocol for the Newborn and Young Child with Spina Bifida began accruing patients in 2015. Assessment in the first 5 years will focus on urinary tract infections, renal function, renal scarring and clinical process improvements. PMID:27475969

  10. Design and Methodological Considerations of the Centers for Disease Control and Prevention Urologic and Renal Protocol for the Newborn and Young Child with Spina Bifida.

    PubMed

    Routh, Jonathan C; Cheng, Earl Y; Austin, J Christopher; Baum, Michelle A; Gargollo, Patricio C; Grady, Richard W; Herron, Adrienne R; Kim, Steven S; King, Shelly J; Koh, Chester J; Paramsothy, Pangaja; Raman, Lisa; Schechter, Michael S; Smith, Kathryn A; Tanaka, Stacy T; Thibadeau, Judy K; Walker, William O; Wallis, M Chad; Wiener, John S; Joseph, David B

    2016-12-01

    Care of children with spina bifida has significantly advanced in the last half century, resulting in gains in longevity and quality of life for affected children and caregivers. Bladder dysfunction is the norm in patients with spina bifida and may result in infection, renal scarring and chronic kidney disease. However, the optimal urological management for spina bifida related bladder dysfunction is unknown. In 2012 the Centers for Disease Control and Prevention convened a working group composed of pediatric urologists, nephrologists, epidemiologists, methodologists, community advocates and Centers for Disease Control and Prevention personnel to develop a protocol to optimize urological care of children with spina bifida from the newborn period through age 5 years. An iterative quality improvement protocol was selected. In this model participating institutions agree to prospectively treat all newborns with spina bifida using a single consensus based protocol. During the 5-year study period outcomes will be routinely assessed and the protocol adjusted as needed to optimize patient and process outcomes. Primary study outcomes include urinary tract infections, renal scarring, renal function and bladder characteristics. The protocol specifies the timing and use of testing (eg ultrasonography, urodynamics) and interventions (eg intermittent catheterization, prophylactic antibiotics, antimuscarinic medications). Starting in 2014 the Centers for Disease Control and Prevention began funding 9 study sites to implement and evaluate the protocol. The Centers for Disease Control and Prevention Urologic and Renal Protocol for the Newborn and Young Child with Spina Bifida began accruing patients in 2015. Assessment in the first 5 years will focus on urinary tract infections, renal function, renal scarring and clinical process improvements. Copyright © 2016 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  11. A real-time polymerase chain reaction-based protocol for low/medium-throughput Y-chromosome microdeletions analysis.

    PubMed

    Segat, Ludovica; Padovan, Lara; Doc, Darja; Petix, Vincenzo; Morgutti, Marcello; Crovella, Sergio; Ricci, Giuseppe

    2012-12-01

    We describe a real-time polymerase chain reaction (PCR) protocol based on the fluorescent molecule SYBR Green chemistry, for a low- to medium-throughput analysis of Y-chromosome microdeletions, optimized according to the European guidelines and aimed at making the protocol faster, avoiding post-PCR processing, and simplifying the results interpretation. We screened 156 men from the Assisted Reproduction Unit, Department of Obstetrics and Gynecology, Institute for Maternal and Child Health IRCCS Burlo Garofolo (Trieste, Italy), 150 not presenting Y-chromosome microdeletion, and 6 with microdeletions in different azoospermic factor (AZF) regions. For each sample, the Zinc finger Y-chromosomal protein (ZFY), sex-determining region Y (SRY), sY84, sY86, sY127, sY134, sY254, and sY255 loci were analyzed by performing one reaction for each locus. AZF microdeletions were successfully detected in six individuals, confirming the results obtained with commercial kits. Our real-time PCR protocol proved to be a rapid, safe, and relatively cheap method that was suitable for a low- to medium-throughput diagnosis of Y-chromosome microdeletion, which allows an analysis of approximately 10 samples (with the addition of positive and negative controls) in a 96-well plate format, or approximately 46 samples in a 384-well plate for all markers simultaneously, in less than 2 h without the need of post-PCR manipulation.

  12. VERSATILE, HIGH-RESOLUTION ANTEROGRADE LABELING OF VAGAL EFFERENT PROJECTIONS WITH DEXTRAN AMINES

    PubMed Central

    Walter, Gary C.; Phillips, Robert J.; Baronowsky, Elizabeth A.; Powley, Terry L.

    2009-01-01

    None of the anterograde tracers used to label and investigate vagal preganglionic neurons projecting to the viscera has proved optimal for routine and extensive labeling of autonomic terminal fields. To identify an alternative tracer protocol, the present experiment evaluated whether dextran conjugates, which have produced superior results in the CNS, might yield widespread and effective labeling of long, fine-caliber vagal efferents in the peripheral nervous system. The dextran conjugates that were evaluated proved reliable and versatile for labeling the motor neuron pool in its entirety, for single- and multiple-labeling protocols, for both conventional and confocal fluorescence microscopy, and for permanent labeling protocols for brightfield microscopy of the projections to the gastrointestinal (GI) tract. Using a standard ABC kit followed by visualization with DAB as the chromagen, Golgi-like labeling of the vagal efferent terminal fields in the GI wall was achieved with the biotinylated dextrans. The definition of individual terminal varicosities was so sharp and detailed that it was routinely practical to examine the relationship of putative vagal efferent contacts (by the criteria of high magnification light microscopy) with the dendritic and somatic architecture of counterstained neurons in the myenteric plexus. Overall, dextran conjugates provide high-definition labeling of an extensive vagal motor pool in the GI tract, and offer considerable versatility when multiple-staining protocols are needed to elucidate the complexities of the innervation of the gut. PMID:19056424

  13. MAC Protocol for Ad Hoc Networks Using a Genetic Algorithm

    PubMed Central

    Elizarraras, Omar; Panduro, Marco; Méndez, Aldo L.

    2014-01-01

    The problem of obtaining the transmission rate in an ad hoc network consists of adjusting the power of each node so that the signal to interference ratio (SIR) requirement is met while the energy required to transmit from one node to another is obtained at the same time. Therefore, an optimal transmission rate for each node in a medium access control (MAC) protocol based on CSMA-CDMA (carrier sense multiple access-code division multiple access) for ad hoc networks can be obtained using evolutionary optimization. This work proposes a genetic algorithm for transmission rate selection assuming perfect power control, and our proposal achieves an improvement of 10% compared with the scheme that uses the handshaking phase to adjust the transmission rate. Furthermore, this paper proposes a genetic algorithm that jointly addresses power, interference, data rate, and energy while ensuring the signal to interference ratio in an ad hoc network. The proposed genetic algorithm achieves better performance (15%) than the CSMA-CDMA protocol without optimization. Therefore, we show by simulation the effectiveness of the proposed protocol in terms of throughput. PMID:25140339
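    A toy genetic algorithm in the same spirit is sketched below: it evolves per-node transmit powers so that every link meets a target SIR at low total power. The channel gains, thresholds, and GA settings are hypothetical, and the fitness function is only a stand-in for the paper's rate-election objective.

```python
# Toy GA: choose per-node transmit powers so that each node meets a target SIR
# while total power stays low. All values below are hypothetical placeholders.
import random

N = 6                                   # number of nodes
random.seed(1)
gain = [[1.0 if i == j else random.uniform(0.05, 0.3) for j in range(N)] for i in range(N)]
noise, sir_target = 0.05, 2.0

def sir(p, i):
    interference = sum(gain[j][i] * p[j] for j in range(N) if j != i) + noise
    return gain[i][i] * p[i] / interference

def fitness(p):
    penalty = sum(max(0.0, sir_target - sir(p, i)) for i in range(N))
    return -(sum(p) + 10.0 * penalty)   # prefer low power, punish SIR violations

def evolve(pop_size=40, generations=200):
    pop = [[random.uniform(0.1, 2.0) for _ in range(N)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N)
            child = a[:cut] + b[cut:]           # one-point crossover
            if random.random() < 0.3:           # mutation
                k = random.randrange(N)
                child[k] = max(0.05, child[k] + random.gauss(0, 0.2))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print([round(x, 2) for x in best], all(sir(best, i) >= sir_target for i in range(N)))
```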

  14. Optimizing a dynamical decoupling protocol for solid-state electronic spin ensembles in diamond

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farfurnik, D.; Jarmola, A.; Pham, L. M.

    2015-08-24

    In this study, we demonstrate significant improvements of the spin coherence time of a dense ensemble of nitrogen-vacancy (NV) centers in diamond through optimized dynamical decoupling (DD). Cooling the sample down to 77 K suppresses longitudinal spin relaxation T1 effects and DD microwave pulses are used to increase the transverse coherence time T2 from ~0.7 ms up to ~30 ms. Furthermore, we extend previous work of single-axis (Carr-Purcell-Meiboom-Gill) DD towards the preservation of arbitrary spin states. Following a theoretical and experimental characterization of pulse and detuning errors, we compare the performance of various DD protocols. We also identify that the optimal control scheme for preserving an arbitrary spin state is a recursive protocol, the concatenated version of the XY8 pulse sequence. The improved spin coherence might have an immediate impact on improvements of the sensitivities of ac magnetometry. Moreover, the protocol can be used on denser diamond samples to increase coherence times up to NV-NV interaction time scales, a major step towards the creation of quantum collective NV spin states.
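    The sketch below spells out what a repeated versus concatenated XY8 sequence looks like as a list of pulse axes, using one common XY8 ordering and one common concatenation convention; the actual pulse spacings, phases, and the exact recursive construction used in the paper may differ.

```python
# Sketch of building dynamical-decoupling pulse sequences as lists of pulse axes.
# The XY8 ordering below (X-Y-X-Y-Y-X-Y-X) is one common convention; the exact
# concatenation scheme used in the paper may differ, so treat this as illustrative.

XY8 = ["X", "Y", "X", "Y", "Y", "X", "Y", "X"]

def repeated_xy8(n_blocks):
    """Plain XY8-N: the eight-pulse block repeated n_blocks times."""
    return XY8 * n_blocks

def concatenated_xy8(level):
    """Recursive concatenation: each pulse of XY8 is preceded by the
    previous-level sequence (level 0 is bare free evolution)."""
    if level == 0:
        return []
    inner = concatenated_xy8(level - 1)
    seq = []
    for pulse in XY8:
        seq.extend(inner)
        seq.append(pulse)
    return seq

print(len(repeated_xy8(4)), len(concatenated_xy8(2)))   # 32 and 72 pulses
```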

  15. Magnetic resonance imaging protocols for examination of the neurocranium at 3 T.

    PubMed

    Schwindt, W; Kugel, H; Bachmann, R; Kloska, S; Allkemper, T; Maintz, D; Pfleiderer, B; Tombach, B; Heindel, W

    2003-09-01

    The increasing availability of high-field (3 T) MR scanners requires adapting and optimizing clinical imaging protocols to exploit the theoretically higher signal-to-noise ratio (SNR) of the higher field strength. Our aim was to establish reliable and stable protocols meeting the clinical demands for imaging the neurocranium at 3 T. Two hundred patients with a broad range of indications received an examination of the neurocranium with an appropriate assortment of imaging techniques at 3 T. Several imaging parameters were optimized. Keeping scan times comparable to those at 1.5 T we increased spatial resolution. Contrast-enhanced and non-enhanced T1-weighted imaging was best applying gradient-echo and inversion recovery (rather than spin-echo) techniques, respectively. For fluid-attenuated inversion recovery (FLAIR) imaging a TE of 120 ms yielded optimum contrast-to-noise ratio (CNR). High-resolution isotropic 3D data sets were acquired within reasonable scan times. Some artifacts were pronounced, but generally imaging profited from the higher SNR. We present a set of optimized examination protocols for neuroimaging at 3 T, which proved to be reliable in a clinical routine setting.

  16. Optimization and validation of a fast amplification protocol for AmpFlSTR® Profiler Plus® for rapid forensic human identification.

    PubMed

    Laurin, Nancy; Frégeau, Chantal

    2012-01-01

    The goal of this work was to optimize and validate a fast amplification protocol for the multiplex amplification of the STR loci included in AmpFlSTR(®) Profiler Plus(®) to expedite human DNA identification. By modifying the cycling conditions and by combining the use of a DNA polymerase optimized for high speed PCR (SpeedSTAR™ HS) and a more efficient thermal cycler instrument (Bio-RAD C1000™), we were able to reduce the amplification process from 4h to 26 min. No modification to the commercial AmpFlSTR(®) Profiler Plus(®) primer mix was required. When compared to the current Royal Canadian Mounted Police (RCMP) amplification protocol, no differences with regards to specificity, sensitivity, heterozygote peak height ratios and overall profile balance were noted. Moreover, complete concordance was obtained with profiles previously generated with the standard amplification protocol and minor alleles in mixture samples were reliably typed. An increase in n-4 stutter ratios (2.2% on average for all loci) was observed for profiles amplified with the fast protocol compared to the current procedure. Our results document the robustness of this rapid amplification protocol for STR profiling using the AmpFlSTR(®) Profiler Plus(®) primer set and demonstrate that comparable data can be obtained in substantially less time. This new approach could provide an alternative option to current multiplex STR typing amplification protocols in order to increase throughput or expedite time-sensitive cases. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  17. Optimizing MRI for imaging peripheral arthritis.

    PubMed

    Hodgson, Richard J; O'Connor, Philip J; Ridgway, John P

    2012-11-01

    MRI is increasingly used for the assessment of both inflammatory arthritis and osteoarthritis. The wide variety of MRI systems in use ranges from low-field, low-cost extremity units to whole-body high-field 7-T systems, each with different strengths for specific applications. The availability of dedicated radiofrequency phased-array coils allows the rapid acquisition of high-resolution images of one or more peripheral joints. MRI is uniquely flexible in its ability to manipulate image contrast, and individual MR sequences may be combined into protocols to sensitively visualize multiple features of arthritis including synovitis, bone marrow lesions, erosions, cartilage changes, and tendinopathy. Careful choice of the imaging parameters allows images to be generated with optimal quality while minimizing unwanted artifacts. Finally, there are many novel MRI techniques that can quantify disease levels in arthritis in tissues including the synovium and cartilage.

  18. Validation of a reaction volume reduction protocol for analysis of Y chromosome haplotypes targeting DNA databases.

    PubMed

    Souza, C A; Oliveira, T C; Crovella, S; Santos, S M; Rabêlo, K C N; Soriano, E P; Carvalho, M V D; Junior, A F Caldas; Porto, G G; Campello, R I C; Antunes, A A; Queiroz, R A; Souza, S M

    2017-04-28

    The use of Y chromosome haplotypes, important for the detection of sexual crimes in forensics, has gained prominence with the use of databases that incorporate these genetic profiles in their system. Here, we optimized and validated an amplification protocol for Y chromosome profile retrieval in reference samples using lesser materials than those in commercial kits. FTA ® cards (Flinders Technology Associates) were used to support the oral cells of male individuals, which were amplified directly using the SwabSolution reagent (Promega). First, we optimized and validated the process to define the volume and cycling conditions. Three reference samples and nineteen 1.2 mm-diameter perforated discs were used per sample. Amplification of one or two discs (samples) with the PowerPlex ® Y23 kit (Promega) was performed using 25, 26, and 27 thermal cycles. Twenty percent, 32%, and 100% reagent volumes, one disc, and 26 cycles were used for the control per sample. Thereafter, all samples (N = 270) were amplified using 27 cycles, one disc, and 32% reagents (optimized conditions). Data was analyzed using a study of equilibrium values between fluorophore colors. In the samples analyzed with 20% volume, an imbalance was observed in peak heights, both inside and in-between each dye. In samples amplified with 32% reagents, the values obtained for the intra-color and inter-color standard balance calculations for verification of the quality of the analyzed peaks were similar to those of samples amplified with 100% of the recommended volume. The quality of the profiles obtained with 32% reagents was suitable for insertion into databases.

  19. Looking for new biomarkers of skin wound vitality with a cytokine-based multiplex assay: preliminary study.

    PubMed

    Peyron, Pierre-Antoine; Baccino, Éric; Nagot, Nicolas; Lehmann, Sylvain; Delaby, Constance

    2017-02-01

    Determination of skin wound vitality is an important issue in forensic practice. No reliable biomarker currently exists. Quantification of inflammatory cytokines in injured skin with MSD® technology is an innovative and promising approach. This preliminary study aims to develop a protocol for the preparation and analysis of skin samples. Samples from ante mortem wounds, post mortem wounds, and intact skin ("control samples") were taken from corpses at autopsy. After the pre-analytical protocol had been optimized in terms of skin homogenization and protein extraction, the concentration of TNF-α was measured in each sample with the MSD® approach. Five other cytokines of interest (IL-1β, IL-6, IL-10, IL-12p70 and IFN-γ) were then simultaneously quantified with an MSD® multiplex assay. The optimal pre-analytical conditions consist of protein extraction from a 6 mm diameter skin sample in PBS buffer with 0.05% Triton. Our results show the linearity and reproducibility of the TNF-α quantification with MSD®, as well as inter- and intra-individual variability in protein concentrations. The MSD® multiplex assay is likely to detect differential skin concentrations for each cytokine of interest. This preliminary study was used to develop and optimize the pre-analytical and analytical conditions of the MSD® method using injured and healthy skin samples, for the purpose of identifying the cytokine, or set of cytokines, that may serve as biomarkers of skin wound vitality.

  20. NREL, Mercedes-Benz Optimizing Refueling Experience for Fuel Cell Electric

    Science.gov Websites

    NREL and Mercedes-Benz are working together to optimize the customer refueling experience for fuel cell electric vehicles, evaluating fueling protocols with an eye toward optimizing the refueling station's customer interface.

  1. Development of a fast PCR protocol enabling rapid generation of AmpFℓSTR® Identifiler® profiles for genotyping of human DNA

    PubMed Central

    2012-01-01

    Background Traditional PCR methods for forensic STR genotyping require approximately 2.5 to 4 hours to complete, contributing a significant portion of the time required to process forensic DNA samples. The purpose of this study was to develop and validate a fast PCR protocol that enabled amplification of the 16 loci targeted by the AmpFℓSTR® Identifiler® primer set, allowing decreased cycling times. Methods Fast PCR conditions were achieved by substituting the traditional Taq polymerase for SpeedSTAR™ HS DNA polymerase which is designed for fast PCR, by upgrading to a thermal cycler with faster temperature ramping rates and by modifying cycling parameters (less time at each temperature) and adopting a two-step PCR approach. Results The total time required for the optimized protocol is 26 min. A total of 147 forensically relevant DNA samples were amplified using the fast PCR protocol for Identifiler. Heterozygote peak height ratios were not affected by fast PCR conditions, and full profiles were generated for single-source DNA amounts between 0.125 ng and 2.0 ng. Individual loci in profiles produced with the fast PCR protocol exhibited average n-4 stutter percentages ranging from 2.5 ± 0.9% (THO1) to 9.9 ± 2.7% (D2S1338). No increase in non-adenylation or other amplification artefacts was observed. Minor contributor alleles in two-person DNA mixtures were reliably discerned. Low level cross-reactivity (monomorphic peaks) was observed with some domestic animal DNA. Conclusions The fast PCR protocol presented offers a feasible alternative to current amplification methods and could aid in reducing the overall time in STR profile production or could be incorporated into a fast STR genotyping procedure for time-sensitive situations. PMID:22394458

  2. Development of a fast PCR protocol enabling rapid generation of AmpFℓSTR® Identifiler® profiles for genotyping of human DNA.

    PubMed

    Foster, Amanda; Laurin, Nancy

    2012-03-06

    Traditional PCR methods for forensic STR genotyping require approximately 2.5 to 4 hours to complete, contributing a significant portion of the time required to process forensic DNA samples. The purpose of this study was to develop and validate a fast PCR protocol that enabled amplification of the 16 loci targeted by the AmpFℓSTR® Identifiler® primer set, allowing decreased cycling times. Fast PCR conditions were achieved by substituting the traditional Taq polymerase for SpeedSTAR™ HS DNA polymerase which is designed for fast PCR, by upgrading to a thermal cycler with faster temperature ramping rates and by modifying cycling parameters (less time at each temperature) and adopting a two-step PCR approach. The total time required for the optimized protocol is 26 min. A total of 147 forensically relevant DNA samples were amplified using the fast PCR protocol for Identifiler. Heterozygote peak height ratios were not affected by fast PCR conditions, and full profiles were generated for single-source DNA amounts between 0.125 ng and 2.0 ng. Individual loci in profiles produced with the fast PCR protocol exhibited average n-4 stutter percentages ranging from 2.5 ± 0.9% (THO1) to 9.9 ± 2.7% (D2S1338). No increase in non-adenylation or other amplification artefacts was observed. Minor contributor alleles in two-person DNA mixtures were reliably discerned. Low level cross-reactivity (monomorphic peaks) was observed with some domestic animal DNA. The fast PCR protocol presented offers a feasible alternative to current amplification methods and could aid in reducing the overall time in STR profile production or could be incorporated into a fast STR genotyping procedure for time-sensitive situations.

  3. Optimal continuous variable quantum teleportation protocol for realistic settings

    NASA Astrophysics Data System (ADS)

    Luiz, F. S.; Rigolin, Gustavo

    2015-03-01

    We show the optimal setup that allows Alice to teleport coherent states |α⟩ to Bob with the greatest fidelity (efficiency) when one takes into account two realistic assumptions. The first is the fact that in any actual implementation of the continuous variable teleportation protocol (CVTP) Alice and Bob necessarily share non-maximally entangled states (two-mode finitely squeezed states). The second assumes that Alice's pool of possible coherent states to be teleported to Bob does not cover the whole complex plane (|α| < ∞). The optimal strategy is achieved by tuning three parameters in the original CVTP, namely, Alice's beam splitter transmittance and Bob's displacements in position and momentum implemented on the teleported state. These slight changes in the protocol are easy to implement with current technology and, as we show, give a considerable gain in performance for a variety of possible pools of input states available to Alice.
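    For context, the usual benchmark against which such optimizations are measured is the unity-gain fidelity of teleporting a coherent state through a two-mode squeezed vacuum with squeezing parameter r; this baseline is a textbook result, not a formula taken from the paper itself:

```latex
F_{\text{unity gain}} = \frac{1}{1 + e^{-2r}}
```

    This equals the classical benchmark of 1/2 at r = 0 and approaches 1 only in the limit of infinite squeezing; tuning the gain (via the beam-splitter transmittance and displacement scalings) can improve on it when the input alphabet is bounded, which is the setting the paper considers.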

  4. Using Green Star Metrics to Optimize the Greenness of Literature Protocols for Syntheses

    ERIC Educational Resources Information Center

    Duarte, Rita C. C.; Ribeiro, M. Gabriela T. C.; Machado, Adélio A. S. C.

    2015-01-01

    A procedure to improve the greenness of a synthesis, without performing laboratory work, using alternative protocols available in the literature is presented. The greenness evaluation involves the separate assessment of the different steps described in the available protocols--reaction, isolation, and purification--as well as the global process,…

  5. An Overview and Analysis of Mobile Internet Protocols in Cellular Environments.

    ERIC Educational Resources Information Center

    Chao, Han-Chieh

    2001-01-01

    Notes that cellular is the inevitable future architecture for the personal communication service system. Discusses the current cellular support based on Mobile Internet Protocol version 6 (Ipv6) and points out the shortfalls of using Mobile IP. Highlights protocols especially for mobile management schemes which can optimize a high-speed mobile…

  6. Pregnancy Research on Osteopathic Manipulation Optimizing Treatment Effects: The PROMOTE Study Protocol.

    PubMed

    Hensel, Kendi L; Carnes, Michael S; Stoll, Scott T

    2016-11-01

    The structural and physiologic changes in a woman's body during pregnancy can predispose pregnant women to low back pain and its associated disability, as well as to complications of pregnancy, labor, and delivery. Anecdotal and empirical evidence has indicated that osteopathic manipulative treatment (OMT) may be efficacious in improving pain and functionality in women who are pregnant. Based on that premise, the Pregnancy Research on Osteopathic Manipulation Optimizing Treatment Effects (PROMOTE) study was designed as a prospective, randomized, placebo-controlled, and blinded clinical trial to evaluate the efficacy of an OMT protocol for pain during third-trimester pregnancy. The OMT protocol developed for the PROMOTE study was based on physiologic theory and the concept of the interrelationship of structure and function. The 12 well-defined, standardized OMT techniques used in the protocol are commonly taught at osteopathic medical schools in the United States. These techniques can be easily replicated as a 20-minute protocol applied in conjunction with usual prenatal care, thus making it feasible to implement into clinical practice. This article presents an overview of the study design and treatment protocols used in the PROMOTE study.

  7. A neural networks-based hybrid routing protocol for wireless mesh networks.

    PubMed

    Kojić, Nenad; Reljin, Irini; Reljin, Branimir

    2012-01-01

    The networking infrastructure of wireless mesh networks (WMNs) is decentralized and relatively simple, but they can display reliable functioning performance while having good redundancy. WMNs provide Internet access for fixed and mobile wireless devices. Both in urban and rural areas they provide users with high-bandwidth networks over a specific coverage area. The main problems affecting these networks are changes in network topology and link quality. In order to provide regular functioning, the routing protocol has the main influence in WMN implementations. In this paper we suggest a new routing protocol for WMN, based on good results of a proactive and reactive routing protocol, and for that reason it can be classified as a hybrid routing protocol. The proposed solution should avoid flooding and creating the new routing metric. We suggest the use of artificial logic-i.e., neural networks (NNs). This protocol is based on mobile agent technologies controlled by a Hopfield neural network. In addition to this, our new routing metric is based on multicriteria optimization in order to minimize delay and blocking probability (rejected packets or their retransmission). The routing protocol observes real network parameters and real network environments. As a result of artificial logic intelligence, the proposed routing protocol should maximize usage of network resources and optimize network performance.

  8. A Neural Networks-Based Hybrid Routing Protocol for Wireless Mesh Networks

    PubMed Central

    Kojić, Nenad; Reljin, Irini; Reljin, Branimir

    2012-01-01

    The networking infrastructure of wireless mesh networks (WMNs) is decentralized and relatively simple, but they can display reliable functioning performance while having good redundancy. WMNs provide Internet access for fixed and mobile wireless devices. Both in urban and rural areas they provide users with high-bandwidth networks over a specific coverage area. The main problems affecting these networks are changes in network topology and link quality. In order to provide regular functioning, the routing protocol has the main influence in WMN implementations. In this paper we suggest a new routing protocol for WMN, based on good results of a proactive and reactive routing protocol, and for that reason it can be classified as a hybrid routing protocol. The proposed solution should avoid flooding and creating the new routing metric. We suggest the use of artificial logic—i.e., neural networks (NNs). This protocol is based on mobile agent technologies controlled by a Hopfield neural network. In addition to this, our new routing metric is based on multicriteria optimization in order to minimize delay and blocking probability (rejected packets or their retransmission). The routing protocol observes real network parameters and real network environments. As a result of artificial logic intelligence, the proposed routing protocol should maximize usage of network resources and optimize network performance. PMID:22969360
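    To illustrate just the multicriteria-metric part of this design (not the Hopfield-network or mobile-agent machinery), the sketch below folds link delay and blocking probability into a single weighted cost and picks routes with Dijkstra's algorithm; the weights and link values are hypothetical.

```python
# Toy multicriteria routing: combine link delay and blocking probability into a
# single cost and run Dijkstra. The paper solves this with a Hopfield neural
# network and mobile agents; this sketch only illustrates the metric idea.
import heapq, math

# (u, v): (delay_ms, blocking_probability) -- hypothetical mesh links
links = {
    ("A", "B"): (5, 0.01), ("B", "C"): (4, 0.20),
    ("A", "D"): (9, 0.02), ("D", "C"): (3, 0.03),
    ("B", "D"): (2, 0.05),
}

def cost(delay, p_block, w_delay=1.0, w_block=50.0):
    return w_delay * delay + w_block * p_block      # weighted multicriteria cost

graph = {}
for (u, v), (d, p) in links.items():
    graph.setdefault(u, []).append((v, cost(d, p)))
    graph.setdefault(v, []).append((u, cost(d, p)))

def dijkstra(src, dst):
    dist, prev, heap = {src: 0.0}, {}, [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, math.inf):
            continue
        for v, c in graph.get(u, []):
            if d + c < dist.get(v, math.inf):
                dist[v], prev[v] = d + c, u
                heapq.heappush(heap, (d + c, v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[dst]

print(dijkstra("A", "C"))
```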

  9. The effect of stimulus strength on binocular rivalry rate in healthy individuals: Implications for genetic, clinical and individual differences studies.

    PubMed

    Law, Phillip C F; Miller, Steven M; Ngo, Trung T

    2017-11-01

    Binocular rivalry (BR) occurs when conflicting images concurrently presented to corresponding retinal locations of each eye stochastically alternate in perception. Anomalies of BR rate have been examined in a range of clinical psychiatric conditions. In particular, slow BR rate has been proposed as an endophenotype for bipolar disorder (BD) to improve power in large-scale genome-wide association studies. Examining the validity of BR rate as a BD endophenotype however requires large-scale datasets (n=1000s to 10,000s), a standardized testing protocol, and optimization of stimulus parameters to maximize separation between BD and healthy groups. Such requirements are indeed relevant to all clinical psychiatric BR studies. Here we address the issue of stimulus optimization by examining the effect of stimulus parameter variation on BR rate and mixed-percept duration (MPD) in healthy individuals. We aimed to identify the stimulus parameters that induced the fastest BR rates with the least MPD. Employing a repeated-measures within-subjects design, 40 healthy adults completed four BR tasks using orthogonally drifting grating stimuli that varied in drift speed and aperture size. Pairwise comparisons were performed to determine modulation of BR rate and MPD by these stimulus parameters, and individual variation of such modulation was also assessed. From amongst the stimulus parameters examined, we found that 8cycles/s drift speed in a 1.5° aperture induced the fastest BR rate without increasing MPD, but that BR rate with this stimulus configuration was not substantially different to BR rate with stimulus parameters we have used in previous studies (i.e., 4cycles/s drift speed in a 1.5° aperture). In addition to contributing to stimulus optimization issues, the findings have implications for Levelt's Proposition IV of binocular rivalry dynamics and individual differences in such dynamics. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Five years' experience of the modified Meek technique in the management of extensive burns.

    PubMed

    Hsieh, Chun-Sheng; Schuong, Jen-Yu; Huang, W S; Huang, Ted T

    2008-05-01

    The Meek technique of skin expansion is useful for covering a large open wound with a small piece of skin graft, but requires a carefully followed protocol. Over the past 5 years, a skin graft expansion technique following the Meek principle was used to treat 37 individuals who had sustained third degree burns involving more than 40% of the body surface. A scheme was devised whereby the body was divided into six areas, in order to clarify the optimal order of wound debridements and skin grafting procedures as well as the regimen of aftercare. The mean body surface involvement was 72.9% and the mean area of third degree burns was 41%. The average number of operations required was 1.84. There were four deaths in this group of patients. The Meek technique of skin expansion and the suggested protocol are together efficient and effective in covering an open wound, particularly where there is a paucity of skin graft donor sites.

  11. Chromoendoscopy in magnetically guided capsule endoscopy

    PubMed Central

    2013-01-01

    Background Diagnosis of intestinal metaplasia and dysplasia via conventional endoscopy is characterized by low interobserver agreement and poor correlation with histopathologic findings. Chromoendoscopy significantly enhances the visibility of mucosa irregularities, like metaplasia and dysplasia mucosa. Magnetically guided capsule endoscopy (MGCE) offers an alternative technology for upper GI examination. We expect the difficulties of diagnosis of neoplasm in conventional endoscopy to transfer to MGCE. Thus, we aim to chart a path for the application of chromoendoscopy on MGCE via an ex-vivo animal study. Methods We propose a modified preparation protocol which adds a staining step to the existing MGCE preparation protocol. An optimal staining concentration is quantitatively determined for different stain types and pathologies. To that end 190 pig stomach tissue samples with and without lesion imitations were stained with different dye concentrations. Quantitative visual criteria are introduced to measure the quality of the staining with respect to mucosa and lesion visibility. Thusly determined optimal concentrations are tested in an ex-vivo pig stomach experiment under magnetic guidance of an endoscopic capsule with the modified protocol. Results We found that the proposed protocol modification does not impact the visibility in the stomach or steerability of the endoscopy capsule. An average optimal staining concentration for the proposed protocol was found at 0.4% for Methylene blue and Indigo carmine. The lesion visibility is improved using the previously obtained optimal dye concentration. Conclusions We conclude that chromoendoscopy may be applied in MGCE and improves mucosa and lesion visibility. Systematic evaluation provides important information on appropriate staining concentration. However, further animal and human in-vivo studies are necessary. PMID:23758801

  12. High-Throughput Screening Assay for Embryoid Body Differentiation of Human Embryonic Stem Cells

    PubMed Central

    Outten, Joel T.; Gadue, Paul; French, Deborah L.; Diamond, Scott L.

    2012-01-01

    Serum-free human pluripotent stem cell media offer the potential to develop reproducible clinically applicable differentiation strategies and protocols. The vast array of possible growth factor and cytokine combinations for media formulations makes differentiation protocol optimization both labor and cost-intensive. This unit describes a 96-well plate, 4-color flow cytometry-based screening assay to optimize pluripotent stem cell differentiation protocols. We provide conditions both to differentiate human embryonic stem cells (hESCs) to the three primary germ layers, ectoderm, endoderm, and mesoderm, and to utilize flow cytometry to distinguish between them. This assay exhibits low inter-well variability and can be utilized to efficiently screen a variety of media formulations, reducing cost, incubator space, and labor. Protocols can be adapted to a variety of differentiation stages and lineages. PMID:22415836

  13. Quantitative Assessment of In-solution Digestion Efficiency Identifies Optimal Protocols for Unbiased Protein Analysis*

    PubMed Central

    León, Ileana R.; Schwämmle, Veit; Jensen, Ole N.; Sprenger, Richard R.

    2013-01-01

    The majority of mass spectrometry-based protein quantification studies uses peptide-centric analytical methods and thus strongly relies on efficient and unbiased protein digestion protocols for sample preparation. We present a novel objective approach to assess protein digestion efficiency using a combination of qualitative and quantitative liquid chromatography-tandem MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein fractions. We evaluated nine trypsin-based digestion protocols, based on standard in-solution or on spin filter-aided digestion, including new optimized protocols. We investigated various reagents for protein solubilization and denaturation (dodecyl sulfate, deoxycholate, urea), several trypsin digestion conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents before analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative liquid chromatography-tandem MS workflow quantified over 3700 distinct peptides with 96% completeness between all protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows for efficient, unbiased generation and recovery of peptides from all protein classes, including membrane proteins. This deoxycholate-assisted protocol was also optimal for spin filter-aided digestions as compared with existing methods. PMID:23792921

  14. Detection of minor and major satellite DNA in cytokinesis-blocked mouse splenocytes by a PRINS tandem labelling approach.

    PubMed

    Russo, A; Tommasi, A M; Renzi, L

    1996-11-01

    A protocol for the simultaneous visualization of minor and major satellite DNA by primed in situ DNA synthesis (PRINS) was developed in cytokinesis-blocked murine splenocytes. After identification of the optimal experimental conditions, a micronucleus (MN) test was carried out by treating splenocytes in vitro with the clastogenic agent mitomycin C and the aneugenic compound Colcemid. It was found that PRINS gives highly reproducible results, which are also comparable with literature MN results obtained by fluorescence in situ hybridization (FISH). Therefore, the PRINS methodology may be proposed as a fast alternative to FISH for the characterization of induced MN.

  15. Triage and optimization: A new paradigm in the treatment of massive pulmonary embolism.

    PubMed

    Pasrija, Chetan; Shah, Aakash; George, Praveen; Kronfli, Anthony; Raithel, Maxwell; Boulos, Francesca; Ghoreishi, Mehrdad; Bittle, Gregory J; Mazzeffi, Michael A; Rubinson, Lewis; Gammie, James S; Griffith, Bartley P; Kon, Zachary N

    2018-04-07

    Massive pulmonary embolism (PE) remains a highly fatal condition. Although venoarterial extracorporeal membrane oxygenation (VA-ECMO) and surgical pulmonary embolectomy in the management of massive PE have been reported previously, the outcomes remain less than ideal. We hypothesized that the institution of a protocolized approach of triage and optimization using VA-ECMO would result in improved outcomes compared with historical surgical management. All patients with a massive PE referred to the cardiac surgery service between 2010 and 2017 were retrospectively reviewed. Patients were stratified by treatment strategy: historical control versus the protocolized approach. In the historical control group, the primary intervention was surgical pulmonary embolectomy. In the protocol approach group, patients were treated based on an algorithmic approach using VA-ECMO. The primary outcome was 1-year survival. A total of 56 patients (control, n = 27; protocol, n = 29) were identified. All 27 patients in the historical control group underwent surgical pulmonary embolectomy, whereas 2 of 29 patients in the protocol approach group were deemed appropriate for direct surgical pulmonary embolectomy. The remaining 27 patients were placed on VA-ECMO. In the protocol approach group, 15 of 29 patients were treated with anticoagulation alone and 14 patients ultimately required surgical pulmonary embolectomy. One-year survival was significantly lower in the historical control group compared with the protocol approach group (73% vs 96%; P = .02), with no deaths occurring after surgical pulmonary embolectomy in the protocol approach group. A protocolized strategy involving the aggressive institution of VA-ECMO appears to be an effective method to triage and optimize patients with massive PE to recovery or intervention. Implementation of this strategy rather than an aggressive surgical approach may reduce the mortality associated with massive PE. Copyright © 2018 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.

  16. Optimized tomography of continuous variable systems using excitation counting

    NASA Astrophysics Data System (ADS)

    Shen, Chao; Heeres, Reinier W.; Reinhold, Philip; Jiang, Luyao; Liu, Yi-Kai; Schoelkopf, Robert J.; Jiang, Liang

    2016-11-01

    We propose a systematic procedure to optimize quantum state tomography protocols for continuous variable systems based on excitation counting preceded by a displacement operation. Compared with conventional tomography based on Husimi or Wigner function measurement, the excitation counting approach can significantly reduce the number of measurement settings. We investigate both informational completeness and robustness, and provide a bound of reconstruction error involving the condition number of the sensing map. We also identify the measurement settings that optimize this error bound, and demonstrate that the improved reconstruction robustness can lead to an order-of-magnitude reduction of estimation error with given resources. This optimization procedure is general and can incorporate prior information of the unknown state to further simplify the protocol.
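    The role of the condition number in the reconstruction-error bound can be seen with a purely numerical toy, sketched below: for a linear sensing map A taking state parameters to ideal outcome probabilities, a least-squares inversion of noisy data inherits an error amplification of roughly cond(A). The random matrices stand in for real displaced-excitation-counting sensing maps and are not derived from the paper.

```python
# Illustration of the condition-number error bound for a linear sensing map.
# A maps unknown state parameters x to ideal measurement outcomes p = A x;
# a least-squares reconstruction from noisy data p + dp satisfies (roughly)
#   ||dx|| / ||x||  <=  cond(A) * ||dp|| / ||p||.
# The random matrices below are placeholders, not real tomography sensing maps.
import numpy as np

rng = np.random.default_rng(0)
x_true = rng.normal(size=16)

def reconstruction_error(A, noise=1e-3):
    p = A @ x_true
    p_noisy = p + noise * rng.normal(size=p.shape)
    x_hat, *_ = np.linalg.lstsq(A, p_noisy, rcond=None)
    return np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true), np.linalg.cond(A)

A_good = rng.normal(size=(40, 16))                                    # well-conditioned
A_bad = rng.normal(size=(40, 16)) @ np.diag(np.logspace(0, -4, 16))   # ill-conditioned
for name, A in [("well-conditioned", A_good), ("ill-conditioned", A_bad)]:
    err, cond = reconstruction_error(A)
    print(f"{name}: cond(A) = {cond:.1e}, relative error = {err:.2e}")
```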

  17. Understanding patient perspectives on management of their chronic pain - online survey protocol.

    PubMed

    Gaikwad, Manasi; Vanlint, Simon; Moseley, G Lorimer; Mittinty, Murthy N; Stocks, Nigel

    2017-01-01

    It is widely recognized that both doctors and patients report discontent regarding pain management provided and received. The impact of chronic pain on an individual's life resonates beyond physical and mental suffering; equal or at times even greater impact is observed on an individual's personal relationships, ability to work, and social interactions. The degree of these effects in each individual varies, mainly because of differences in biological factors, social environment, past experiences, support, and belief systems. Therefore, it is equally possible that these individual patient characteristics could influence their treatment outcome. Research shows that meeting patient expectations is a major challenge for health care systems attempting to provide optimal treatment strategies. However, patient perspectives and expectations in chronic pain management have not been studied extensively. The aim of this study is to investigate the views, perceptions, beliefs, and expectations of individuals who experience chronic pain on a daily basis, and the strategies used by them in managing chronic pain. This paper describes the study protocol to be used in a cross sectional survey of chronic pain patients. The study population will comprise of individuals aged ≥18 years, who have experienced pain for ≥3 months with no restrictions of sex, ethnicity, or region of residence. Ethics approval for our study was obtained from Humans research ethics committees, University of Adelaide and University of South Australia. Multinomial logistic regression will be used to estimate the effect of duration and character of pain, on patient's perception of time to recovery and supplement intake. Logistic regression will also be used for estimating the effect of patient-provider relationship and pain education on patient-reported recovery and pain intensity. Knowledge about the perceptions and beliefs of patients with chronic pain could inform future policies, research, health care professional education, and development of individualized treatment strategies.
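    A hedged sketch of the planned multinomial logistic regression is shown below, using synthetic data and made-up variable names (pain duration in months and a binary pain-character flag predicting a three-level perception-of-recovery outcome); it only illustrates the model class named in the protocol, not the study's actual variables or dataset.

```python
# Hedged sketch of a multinomial logistic regression of the kind planned in the
# protocol. Variable names and data are synthetic placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 300
duration = rng.integers(3, 120, size=n)            # months of chronic pain (hypothetical)
neuropathic = rng.integers(0, 2, size=n)           # 0/1 pain character (hypothetical)
# Hypothetical data-generating process for a 3-level perceived-recovery outcome.
score = 0.02 * duration + 0.8 * neuropathic + rng.normal(0, 1, size=n)
outcome = np.digitize(score, [0.8, 1.8])           # categories 0, 1, 2

X = sm.add_constant(pd.DataFrame({"duration": duration, "neuropathic": neuropathic}))
fit = sm.MNLogit(outcome, X).fit(disp=False)
print(fit.params)
```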

  18. Efficiency in nonequilibrium molecular dynamics Monte Carlo simulations

    DOE PAGES

    Radak, Brian K.; Roux, Benoît

    2016-10-07

    Hybrid algorithms combining nonequilibrium molecular dynamics and Monte Carlo (neMD/MC) offer a powerful avenue for improving the sampling efficiency of computer simulations of complex systems. These neMD/MC algorithms are also increasingly finding use in applications where conventional approaches are impractical, such as constant-pH simulations with explicit solvent. However, selecting an optimal nonequilibrium protocol for maximum efficiency often represents a non-trivial challenge. This work evaluates the efficiency of a broad class of neMD/MC algorithms and protocols within the theoretical framework of linear response theory. The approximations are validated against constant-pH MD simulations and shown to provide accurate predictions of neMD/MC performance. An assessment of a large set of protocols confirms (both theoretically and empirically) that a linear work protocol gives the best neMD/MC performance. Lastly, a well-defined criterion for optimizing the time parameters of the protocol is proposed and demonstrated with an adaptive algorithm that improves the performance on-the-fly with minimal cost.
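    The core accept/reject step of such a hybrid move can be illustrated on a one-dimensional toy, sketched below: a linear switching protocol drives a coupling parameter from 0 to 1 during a short nonequilibrium trajectory, the switching work is accumulated, and the move is accepted with the Metropolis factor min(1, exp(-beta*W)). This is only a schematic of the general neMD/MC idea under simplifying assumptions (overdamped dynamics, no momentum bookkeeping), not the constant-pH scheme analyzed in the paper.

```python
# Toy 1D illustration of a neMD/MC move with a linear switching protocol.
# Schematic only: a rigorous scheme needs symmetric protocols or momentum
# reversal to satisfy detailed balance.
import math, random

random.seed(0)
beta, dt, gamma = 1.0, 0.01, 1.0

def U(x, lam):
    # Interpolate between two harmonic wells centered at 0 and 2.
    return (1 - lam) * 0.5 * x**2 + lam * 0.5 * (x - 2.0)**2

def dUdx(x, lam):
    return (1 - lam) * x + lam * (x - 2.0)

def nemd_switch(x, n_steps=200):
    """Overdamped dynamics while lambda ramps linearly; returns (x_new, work)."""
    work = 0.0
    for k in range(n_steps):
        lam_old, lam_new = k / n_steps, (k + 1) / n_steps
        work += U(x, lam_new) - U(x, lam_old)          # work from switching lambda
        noise = math.sqrt(2 * dt / (beta * gamma)) * random.gauss(0, 1)
        x += -dt * dUdx(x, lam_new) / gamma + noise    # relax at the new lambda
    return x, work

x, accepted = 0.0, 0
for _ in range(100):
    x_new, W = nemd_switch(x)
    if random.random() < min(1.0, math.exp(-beta * W)):   # work-based Metropolis test
        x, accepted = x_new, accepted + 1
print("acceptance fraction:", accepted / 100)
```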

  19. Social-aware data dissemination in opportunistic mobile social networks

    NASA Astrophysics Data System (ADS)

    Yang, Yibo; Zhao, Honglin; Ma, Jinlong; Han, Xiaowei

    Opportunistic Mobile Social Networks (OMSNs), formed by mobile users with social relationships and characteristics, enhance spontaneous communication among users that opportunistically encounter each other. Such networks can be exploited to improve the performance of data forwarding. Discovering optimal relay nodes is one of the important issues for efficient data propagation in OMSNs. Although traditional centrality definitions can identify node features in a network, they cannot effectively identify the influential nodes for data dissemination in OMSNs. Existing protocols take advantage of spatial contact frequency and social characteristics to enhance transmission performance. However, existing protocols have not fully exploited the benefits of the relations and interactions among geographical information, social features, and user interests. In this paper, we first evaluate these three characteristics of users and design a routing protocol, called the Geo-Social-Interest (GSI) protocol, to select optimal relay nodes. We compare the performance of GSI using the real INFOCOM06 data sets. The experimental results demonstrate that GSI outperforms the other protocols, with the highest data delivery ratio and low communication overhead.
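    A toy version of the relay-selection idea is sketched below: each candidate relay is scored by a weighted mix of geographic progress toward the destination, social tie strength, and interest similarity. The weights, fields, and scoring form are illustrative placeholders rather than the GSI protocol's actual definitions.

```python
# Toy relay scoring combining the three ingredients named in the abstract:
# geographic progress, social tie strength, and interest similarity.
import math

def interest_similarity(a, b):
    """Jaccard similarity of interest keyword sets."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

def geo_progress(relay_pos, current_pos, dest_pos):
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    return max(0.0, dist(current_pos, dest_pos) - dist(relay_pos, dest_pos))

def score(relay, current_pos, dest, w_geo=0.4, w_soc=0.3, w_int=0.3):
    return (w_geo * geo_progress(relay["pos"], current_pos, dest["pos"])
            + w_soc * relay["tie_strength"][dest["id"]]
            + w_int * interest_similarity(relay["interests"], dest["interests"]))

dest = {"id": "D", "pos": (10, 0), "interests": {"music", "sports"}}
candidates = [
    {"id": "R1", "pos": (4, 1), "tie_strength": {"D": 0.7}, "interests": {"music"}},
    {"id": "R2", "pos": (7, 3), "tie_strength": {"D": 0.2}, "interests": {"news"}},
]
current_pos = (0, 0)
best = max(candidates, key=lambda r: score(r, current_pos, dest))
print("chosen relay:", best["id"])
```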

  20. Solving iTOUGH2 simulation and optimization problems using the PEST protocol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finsterle, S.A.; Zhang, Y.

    2011-02-01

    The PEST protocol has been implemented into the iTOUGH2 code, allowing the user to link any simulation program (with ASCII-based inputs and outputs) to iTOUGH2's sensitivity analysis, inverse modeling, and uncertainty quantification capabilities. These application models can be pre- or post-processors of the TOUGH2 non-isothermal multiphase flow and transport simulator, or programs that are unrelated to the TOUGH suite of codes. PEST-style template and instruction files are used, respectively, to pass input parameters updated by the iTOUGH2 optimization routines to the model, and to retrieve the model-calculated values that correspond to observable variables. We summarize the iTOUGH2 capabilities and demonstrate the flexibility added by the PEST protocol for the solution of a variety of simulation-optimization problems. In particular, the combination of loosely coupled and tightly integrated simulation and optimization routines provides both the flexibility and control needed to solve challenging inversion problems for the analysis of multiphase subsurface flow and transport systems.

  1. The Aeronautical Data Link: Taxonomy, Architectural Analysis, and Optimization

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Goode, Plesent W.

    2002-01-01

    The future Communication, Navigation, and Surveillance/Air Traffic Management (CNS/ATM) System will rely on global satellite navigation, and ground-based and satellite-based communications via Multi-Protocol Networks (e.g. combined Aeronautical Telecommunications Network (ATN)/Internet Protocol (IP)) to bring about needed improvements in efficiency and safety of operations to meet increasing levels of air traffic. This paper will discuss the development of an approach that completely describes optimal data link architecture configuration and behavior to meet the multiple conflicting objectives of concurrent and different operations functions. The practical application of the approach enables the design and assessment of configurations relative to airspace operations phases. The approach includes a formal taxonomic classification, an architectural analysis methodology, and optimization techniques. The formal taxonomic classification provides a multidimensional correlation of data link performance with data link service, information protocol, spectrum, and technology mode, and with flight operations phase and environment. The architectural analysis methodology assesses the impact of a specific architecture configuration and behavior on the local ATM system performance. Deterministic and stochastic optimization techniques maximize architectural design effectiveness while addressing operational, technology, and policy constraints.

  2. Single-photon quantum key distribution in the presence of loss

    NASA Astrophysics Data System (ADS)

    Curty, Marcos; Moroder, Tobias

    2007-05-01

    We investigate two-way and one-way single-photon quantum key distribution (QKD) protocols in the presence of loss introduced by the quantum channel. Our analysis is based on a simple precondition for secure QKD in each case. In particular, the legitimate users need to prove that there exists no separable state (in the case of two-way QKD), or that there exists no quantum state having a symmetric extension (one-way QKD), that is compatible with the available measurement results. We show that both criteria can be formulated as a convex optimization problem known as a semidefinite program, which can be efficiently solved. Moreover, we prove that the solution to the dual optimization corresponds to the evaluation of an optimal witness operator that belongs to the minimal verification set for the given two-way (or one-way) QKD protocol. A positive expectation value of this optimal witness operator states that no secret key can be distilled from the available measurement results. We apply this analysis to several well-known single-photon QKD protocols under losses.

  3. Optimization and comparison of simultaneous and separate acquisition protocols for dual isotope myocardial perfusion SPECT.

    PubMed

    Ghaly, Michael; Links, Jonathan M; Frey, Eric C

    2015-07-07

    Dual-isotope simultaneous-acquisition (DISA) rest-stress myocardial perfusion SPECT (MPS) protocols offer a number of advantages over separate acquisition. However, crosstalk contamination due to scatter in the patient and interactions in the collimator degrade image quality. Compensation can reduce the effects of crosstalk, but does not entirely eliminate image degradations. Optimizing acquisition parameters could further reduce the impact of crosstalk. In this paper we investigate the optimization of the rest Tl-201 energy window width and relative injected activities using the ideal observer (IO), a realistic digital phantom population and Monte Carlo (MC) simulated Tc-99m and Tl-201 projections as a means to improve image quality. We compared performance on a perfusion defect detection task for Tl-201 acquisition energy window widths varying from 4 to 40 keV centered at 72 keV for a camera with a 9% energy resolution. We also investigated 7 different relative injected activities, defined as the ratio of Tc-99m and Tl-201 activities, while keeping the total effective dose constant at 13.5 mSv. For each energy window and relative injected activity, we computed the IO test statistics using a Markov chain Monte Carlo (MCMC) method for an ensemble of 1,620 triplets of fixed and reversible defect-present, and defect-absent noisy images modeling realistic background variations. The volume under the 3-class receiver operating characteristic (ROC) surface (VUS) was estimated and served as the figure of merit. For simultaneous acquisition, the IO suggested that relative Tc-to-Tl injected activity ratios of 2.6-5 and acquisition energy window widths of 16-22% were optimal. For separate acquisition, we observed a broad range of optimal relative injected activities from 2.6 to 12.1 and acquisition energy window widths of 16-22%. A negative correlation between Tl-201 injected activity and the width of the Tl-201 energy window was observed in these ranges. The results also suggested that DISA methods could potentially provide image quality as good as that obtained with separate acquisition protocols. We compared observer performance for the optimized protocols and the current clinical protocol using separate acquisition. The current clinical protocols provided better performance at the cost of approximately double the injected activity of Tc-99m and Tl-201, resulting in a substantially increased radiation dose.

  4. Robust DNA Isolation and High-throughput Sequencing Library Construction for Herbarium Specimens.

    PubMed

    Saeidi, Saman; McKain, Michael R; Kellogg, Elizabeth A

    2018-03-08

    Herbaria are an invaluable source of plant material that can be used in a variety of biological studies. The use of herbarium specimens is associated with a number of challenges including sample preservation quality, degraded DNA, and destructive sampling of rare specimens. In order to more effectively use herbarium material in large sequencing projects, a dependable and scalable method of DNA isolation and library preparation is needed. This paper demonstrates a robust, beginning-to-end protocol for DNA isolation and high-throughput library construction from herbarium specimens that does not require modification for individual samples. This protocol is tailored for low quality dried plant material and takes advantage of existing methods by optimizing tissue grinding, modifying library size selection, and introducing an optional reamplification step for low yield libraries. Reamplification of low yield DNA libraries can rescue samples derived from irreplaceable and potentially valuable herbarium specimens, negating the need for additional destructive sampling without introducing discernible sequencing bias for common phylogenetic applications. The protocol has been tested on hundreds of grass species, but is expected to be adaptable for use in other plant lineages after verification. This protocol can be limited by extremely degraded DNA, where fragments do not exist in the desired size range, and by secondary metabolites present in some plant material that inhibit clean DNA isolation. Overall, this protocol introduces a fast and comprehensive method that allows for DNA isolation and library preparation of 24 samples in less than 13 h, with only 8 h of active hands-on time and minimal modifications.

  5. Enabling Next-Generation Multicore Platforms in Embedded Applications

    DTIC Science & Technology

    2014-04-01

    ... mapping to sets 129-256) to the second page in memory, color 2 (sets 257-384) to the third page, and so on. Then, after the 32nd page, all 2^12 sets ... the Real-Time Nested Locking Protocol (RNLP) [56], a recently developed multiprocessor real-time locking protocol that optimally supports ... In general, the problems of optimally assigning tasks to processors and colors to tasks are both NP-hard ...

  6. Development of a manualized protocol of massage therapy for clinical trials in osteoarthritis.

    PubMed

    Ali, Ather; Kahn, Janet; Rosenberger, Lisa; Perlman, Adam I

    2012-10-04

    Clinical trial design of manual therapies may be especially challenging as techniques are often individualized and practitioner-dependent. This paper describes our methods in creating a standardized Swedish massage protocol tailored to subjects with osteoarthritis of the knee while respectful of the individualized nature of massage therapy, as well as implementation of this protocol in two randomized clinical trials. The manualization process involved collaboration between methodologic and clinical experts, with the explicit goals of creating a reproducible semi-structured protocol for massage therapy, while allowing some latitude for therapists' clinical judgment and maintaining consistency with a prior pilot study. The manualized protocol addressed identical specified body regions with distinct 30- and 60-min protocols, using standard Swedish strokes. Each protocol specifies the time allocated to each body region. The manualized 30- and 60-min protocols were implemented in a dual-site 24-week randomized dose-finding trial in patients with osteoarthritis of the knee, and are currently being implemented in a three-site 52-week efficacy trial of manualized Swedish massage therapy. In the dose-finding study, therapists adhered to the protocols and significant treatment effects were demonstrated. The massage protocol was manualized, using standard techniques, and made flexible for individual practitioner and subject needs. The protocol has been applied in two randomized clinical trials. This manualized Swedish massage protocol has real-world utility and can be readily utilized both in the research and clinical settings. Clinicaltrials.gov NCT00970008 (18 August 2009).

  7. Suicide Risk Protocols: Addressing the Needs of High Risk Youths Identified through Suicide Prevention Efforts and in Clinical Settings

    ERIC Educational Resources Information Center

    Heilbron, Nicole; Goldston, David; Walrath, Christine; Rodi, Michael; McKeon, Richard

    2013-01-01

    Several agencies have emphasized the importance of establishing clear protocols or procedures to address the needs of youths who are identified as suicidal through suicide prevention programs or in emergency department settings. What constitutes optimal guidelines for developing and implementing such protocols, however, is unclear. At the request…

  8. In vitro selection and amplification protocols for isolation of aptameric sensors for small molecules

    PubMed Central

    Yang, Kyung-Ae; Pei, Renjun; Stojanovic, Milan N.

    2016-01-01

    We recently optimized a procedure that directly yields aptameric sensors for small molecules in so-called structure-switching format. The protocol has a high success rate, short time, and is sufficiently simple to be readily implemented in a non-specialist laboratory. We provide a stepwise guide to this selection protocol. PMID:27155227

  9. Time-to-isolation guided titration of freeze duration in 3rd generation short-tip cryoballoon pulmonary vein isolation - Comparable clinical outcome and shorter procedure duration.

    PubMed

    Pott, Alexander; Kraft, Christoph; Stephan, Tilman; Petscher, Kerstin; Rottbauer, Wolfgang; Dahme, Tillman

    2018-03-15

    The optimal freeze duration in cryoballoon pulmonary vein isolation (PVI) is unknown. The 3rd generation cryoballoon facilitates observation of the time-to-isolation (TTI) and thereby enables individualized cryoenergy titration. To evaluate the efficacy of an individualized freeze duration, we compared the clinical outcome of patients treated with a TTI-guided ablation protocol to the outcome of patients treated with a fixed ablation protocol. We compared 100 patients treated with the 3rd generation cryoballoon applying a TTI-based protocol (TTI group) to 100 patients treated by a fixed freeze protocol (fixed group). In the fixed group, a 240s freeze cycle was followed by a 240s bonus freeze after acute PV isolation. In the TTI group, freeze duration was 180s if TTI was ≥30s and was reduced to only 120s if TTI was <30s. In case of a TTI >60s a 180s bonus freeze was applied. Freedom from atrial arrhythmia recurrence off class I/III antiarrhythmic drugs after one year was not different between the TTI group (73.6%) and the fixed group (75.7%; p=0.75). Mean procedure duration was 85.8±27.3min in the TTI group compared to 115.7±27.1min in the fixed group (p<0.001). Mean fluoroscopy time was 17.5±6.6min in the TTI group and 22.5±9.8min in the fixed group (p<0.001). TTI-guided cryoenergy titration leads to reduced procedure duration and fluoroscopy time and appears to be as effective as a fixed ablation strategy. A single 2-minute freeze seems to be sufficient in case of short TTI. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  10. Low-dose computed tomography scans with automatic exposure control for patients of different ages undergoing cardiac PET/CT and SPECT/CT.

    PubMed

    Yang, Ching-Ching; Yang, Bang-Hung; Tu, Chun-Yuan; Wu, Tung-Hsin; Liu, Shu-Hsin

    2017-06-01

    This study aimed to evaluate the efficacy of automatic exposure control (AEC) in order to optimize low-dose computed tomography (CT) protocols for patients of different ages undergoing cardiac PET/CT and single-photon emission computed tomography/computed tomography (SPECT/CT). One PET/CT and one SPECT/CT were used to acquire CT images for four anthropomorphic phantoms representative of 1-year-old, 5-year-old and 10-year-old children and an adult. For the hybrid systems investigated in this study, the radiation dose and image quality of cardiac CT scans performed with AEC activated depend mainly on the selection of a predefined image quality index. Multiple linear regression methods were used to analyse image data from anthropomorphic phantom studies to investigate the effects of body size and predefined image quality index on CT radiation dose in cardiac PET/CT and SPECT/CT scans. The regression relationships have a coefficient of determination larger than 0.9, indicating a good fit to the data. According to the regression models, low-dose protocols using the AEC technique were optimized for patients of different ages. In comparison with the standard protocol with AEC activated for adult cardiac examinations used in our clinical routine practice, the optimized paediatric protocols in PET/CT allow 32.2, 63.7 and 79.2% CT dose reductions for anthropomorphic phantoms simulating 10-year-old, 5-year-old and 1-year-old children, respectively. The corresponding results for cardiac SPECT/CT are 8.4, 51.5 and 72.7%. AEC is a practical way to reduce CT radiation dose in cardiac PET/CT and SPECT/CT, but the AEC settings should be determined properly for optimal effect. Our results show that AEC does not eliminate the need for paediatric protocols and CT examinations using the AEC technique should be optimized for paediatric patients to reduce the radiation dose as low as reasonably achievable.
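
    As an illustration of the kind of regression model described above, the sketch below fits a log-dose model against body size and a quality-index setting using ordinary least squares. The data points and the specific functional form are invented placeholders, not the study's measurements.

        import numpy as np

        # hypothetical calibration points: effective diameter (cm), quality index, CTDIvol (mGy)
        diam = np.array([12.0, 16.0, 20.0, 28.0, 12.0, 16.0, 20.0, 28.0])
        qref = np.array([20.0, 20.0, 20.0, 20.0, 35.0, 35.0, 35.0, 35.0])
        ctdi = np.array([0.30, 0.55, 1.00, 2.60, 0.55, 1.00, 1.90, 4.80])

        # fit ln(CTDIvol) = b0 + b1 * diameter + b2 * quality_index
        X = np.column_stack([np.ones_like(diam), diam, qref])
        coef, *_ = np.linalg.lstsq(X, np.log(ctdi), rcond=None)
        pred = X @ coef
        r2 = 1 - np.sum((np.log(ctdi) - pred) ** 2) / np.sum((np.log(ctdi) - np.log(ctdi).mean()) ** 2)
        print("coefficients:", coef, " R^2:", round(float(r2), 3))

        # dose predicted for a child-sized phantom at a lower quality-index setting
        print("predicted CTDIvol:", round(float(np.exp(coef @ [1.0, 16.0, 25.0])), 3), "mGy")

    A fit of this kind is what allows the quality-index setting to be dialled back for smaller patients while predicting the resulting dose, which is the essence of the protocol optimization described in the record.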

  11. A rapid and efficient SDS-based RNA isolation protocol from different tissues of coffee.

    PubMed

    Huded, Arun Kumar C; Jingade, Pavankumar; Mishra, Manoj Kumar

    2018-03-01

    Isolation of high-quality RNA from coffee is challenging because of the high levels of polysaccharides, polyphenols and other secondary metabolites. In the present study, a rapid and efficient RNA extraction protocol from different tissues of coffee was optimized. Sufficiently high quality and quantity (225.6-454.8 µg/g) of RNA was obtained by using the optimized protocol. The presence of two distinct bands of 28S rRNA and 18S rRNA in agarose gel proved the intactness of the RNA samples. The average spectrophotometric values of the isolated RNA ranged from 1.96 to 2.02 (A260/280) and 1.95 to 2.14 (A260/230), indicating the high quality of RNA devoid of polyphenols, polysaccharides and protein contamination. In the optimized protocol, addition of PVPP to the extraction buffer and a brief incubation of samples at 65 °C and subsequent purification with potassium acetate resulted in good-quality RNA isolation. The suitability of RNA for downstream processing was confirmed by PCR amplification with cytochrome c oxidase gene-specific primers. The amplification of a single 392 bp fragment using cDNA and a 1.5 kb fragment using genomic DNA samples confirmed the absence of DNA contamination. The present protocol is rapid and yielded good quality and quantity of RNA suitable for functional genomics studies.

  12. Novel Multiplex Fluorescent PCR-Based Method for HLA Typing and Preimplantational Genetic Diagnosis of β-Thalassemia.

    PubMed

    Khosravi, Sharifeh; Salehi, Mansour; Ramezanzadeh, Mahboobeh; Mirzaei, Hamed; Salehi, Rasoul

    2016-05-01

    Thalassemia is curable by bone marrow transplantation; however, finding suitable donors with a defined HLA combination remains a major challenge. Cord blood stem cells with a preselected HLA system, chosen through preimplantation genetic diagnosis (PGD), have proved very useful for overcoming the scarcity of HLA-matched bone marrow donors. A thalassemia trait couple with an affected child was included in this study. We used informative STR markers at the HLA and beta globin loci to develop a single cell multiplex fluorescent PCR protocol. The protocol was extensively optimized on single lymphocytes isolated from the couple's peripheral blood. The optimized protocol was applied on single blastomeres biopsied from day 3 cleavage stage IVF embryos of the couple. Four IVF embryos biopsied on day 3 and a single blastomere of each were provided for genetic diagnosis of combined β-thalassemia mutations and HLA typing. Of these, one embryo was diagnosed as homozygous normal for the thalassemia mutation and HLA matched with the existing affected sibling. The optimized protocol worked well in the PGD clinical cycle for selection of thalassemia-unaffected embryos with the desired HLA system. Copyright © 2016 IMSS. Published by Elsevier Inc. All rights reserved.

  13. Dendritic Immunotherapy Improvement for an Optimal Control Murine Model

    PubMed Central

    Chimal-Eguía, J. C.; Castillo-Montiel, E.

    2017-01-01

    Therapeutic protocols in immunotherapy are usually proposed following the intuition and experience of the therapist. In order to deduce such protocols, mathematical modeling, optimal control and simulations are used here in place of relying on the therapist's experience alone. Clinical efficacy of dendritic cell (DC) vaccines for cancer treatment is still unclear, since dendritic cells face several obstacles in the host environment, such as immunosuppression and poor transference to the lymph nodes, reducing the vaccine effect. In view of that, we have created a mathematical murine model to measure the effects of dendritic cell injections admitting such obstacles. In addition, the model considers a therapy given by bolus injections of small duration as opposed to a continual dose. Dose timing defines the therapeutic protocols, which in turn are improved to minimize the tumor mass by an optimal control algorithm. We intend to supplement the therapist's experience and intuition in the protocol's implementation. Experimental results from mice with melanoma, with and without therapy, agree with the model. It is shown that the percentage of dendritic cells that reach the lymph nodes has a crucial impact on the therapy outcome. This suggests that efforts in finding better methods to deliver DC vaccines should be pursued. PMID:28912828
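
    To make the protocol-comparison idea concrete, the sketch below integrates a toy tumour–effector system with bolus dendritic-cell injections and compares two hypothetical injection schedules with the same total dose. The equations, parameters, and lymph-node delivery fraction are illustrative assumptions, not the authors' murine model.

        import numpy as np
        from scipy.integrate import solve_ivp

        # toy tumour (T) / effector-cell (E) dynamics; parameter values are illustrative only
        r, K, kill, death, act = 0.3, 5e9, 1e-9, 0.1, 0.02

        def rhs(t, y):
            T, E = y
            dT = r * T * (1 - T / K) - kill * T * E
            dE = act * T * 1e-6 - death * E          # weak baseline activation
            return [dT, dE]

        def simulate(injection_times, dose, t_end=60.0):
            """Integrate between bolus injections; each bolus adds `dose` effector
            cells scaled by the fraction assumed to reach the lymph nodes."""
            reach_fraction = 0.05                    # assumed DC delivery fraction
            y, t0, ts, ys = [1e7, 1e5], 0.0, [], []
            for t_inj in list(injection_times) + [t_end]:
                sol = solve_ivp(rhs, (t0, t_inj), y, max_step=0.1)
                ts.append(sol.t)
                ys.append(sol.y)
                y = [sol.y[0, -1], sol.y[1, -1] + dose * reach_fraction]
                t0 = t_inj
            return np.concatenate(ts), np.concatenate(ys, axis=1)

        # compare two hypothetical protocols carrying the same total dose
        for schedule in ([7, 14, 21], [7, 10, 13, 16, 19, 22]):
            dose = 3e6 * 3 / len(schedule)
            t, y = simulate(schedule, dose)
            print(len(schedule), "injections -> final tumour burden:", f"{y[0, -1]:.3e}")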

  14. Step length and individual anaerobic threshold assessment in swimming.

    PubMed

    Fernandes, R J; Sousa, M; Machado, L; Vilas-Boas, J P

    2011-12-01

    Anaerobic threshold is widely used for diagnosis of swimming aerobic endurance, but the precise step duration of the incremental protocols used for its assessment is controversial. A physiological and biomechanical comparison between intermittent incremental protocols with different step lengths and a maximal lactate steady state (MLSS) test was conducted. 17 swimmers performed 7×200, 300 and 400 m (30 s and 24 h rest between steps and protocols) in front crawl until exhaustion and an MLSS test. The blood lactate concentration values ([La-]) at individual anaerobic threshold were 2.1±0.1, 2.2±0.2 and 1.8±0.1 mmol·l⁻¹ in the 200, 300 and 400 m protocols (with significant differences between the 300 and 400 m tests), and 2.9±1.2 mmol·l⁻¹ at MLSS (higher than in the incremental protocols); all these values are much lower than the traditional 4 mmol·l⁻¹ value. The velocities at individual anaerobic threshold obtained in the incremental protocols were similar (and highly related) to the MLSS, being considerably lower than the velocity at 4 mmol·l⁻¹. Stroke rate increased and stroke length decreased throughout the different incremental protocols. It was concluded that it is valid to use intermittent incremental protocols of 200 and 300 m lengths to assess the swimming velocity corresponding to individual anaerobic threshold, that the progressive protocols tend to underestimate the [La-] at anaerobic threshold assessed by the MLSS test, and that swimmers increase velocity through stroke rate increases. © Georg Thieme Verlag KG Stuttgart · New York.

  15. Analytical Models of Cross-Layer Protocol Optimization in Real-Time Wireless Sensor Ad Hoc Networks

    NASA Astrophysics Data System (ADS)

    Hortos, William S.

    The real-time interactions among the nodes of a wireless sensor network (WSN) to cooperatively process data from multiple sensors are modeled. Quality-of-service (QoS) metrics are associated with the quality of fused information: throughput, delay, packet error rate, etc. Multivariate point process (MVPP) models of discrete random events in WSNs establish stochastic characteristics of optimal cross-layer protocols. Discrete-event, cross-layer interactions in mobile ad hoc network (MANET) protocols have been modeled using a set of concatenated design parameters and associated resource levels by the MVPPs. Characterization of the "best" cross-layer designs for a MANET is formulated by applying the general theory of martingale representations to controlled MVPPs. Performance is described in terms of concatenated protocol parameters and controlled through conditional rates of the MVPPs. Modeling limitations to determination of closed-form solutions versus explicit iterative solutions for ad hoc WSN controls are examined.

  16. j5 DNA assembly design automation.

    PubMed

    Hillson, Nathan J

    2014-01-01

    Modern standardized methodologies, described in detail in the previous chapters of this book, have enabled the software-automated design of optimized DNA construction protocols. This chapter describes how to design (combinatorial) scar-less DNA assembly protocols using the web-based software j5. j5 assists biomedical and biotechnological researchers in constructing DNA by automating the design of optimized protocols for flanking homology sequence as well as type IIS endonuclease-mediated DNA assembly methodologies. Unlike any other software tool available today, j5 designs scar-less combinatorial DNA assembly protocols, performs a cost-benefit analysis to identify which portions of an assembly process would be less expensive to outsource to a DNA synthesis service provider, and designs hierarchical DNA assembly strategies to mitigate anticipated poor assembly junction sequence performance. Software tools integrated with j5 add significant value to the j5 design process through graphical user-interface enhancement and downstream liquid-handling robotic laboratory automation.

  17. Bio-mimic optimization strategies in wireless sensor networks: a survey.

    PubMed

    Adnan, Md Akhtaruzzaman; Abdur Razzaque, Mohammd; Ahmed, Ishtiaque; Isnin, Ismail Fauzi

    2013-12-24

    For the past 20 years, many authors have focused their investigations on wireless sensor networks. Various issues related to wireless sensor networks such as energy minimization (optimization), compression schemes, self-organizing network algorithms, routing protocols, quality of service management, security, energy harvesting, etc., have been extensively explored. The three most important issues among these are energy efficiency, quality of service and security management. To get the best possible results for one or more of these issues in wireless sensor networks, optimization is necessary. Furthermore, in a number of applications (e.g., body area sensor networks, vehicular ad hoc networks) these issues may conflict and require a trade-off amongst them. Due to the high energy consumption and data processing requirements, the use of classical algorithms has historically been disregarded. In this context, contemporary researchers have started using bio-mimetic strategy-based optimization techniques in the field of wireless sensor networks. These techniques are diverse and involve many different optimization algorithms. As far as we know, most existing works tend to focus only on optimization of one specific issue of the three mentioned above. It is high time that these individual efforts are put into perspective and a more holistic view is taken. In this paper we take a step in that direction by presenting a survey of the literature in the area of wireless sensor network optimization, concentrating especially on the three most widely used bio-mimetic algorithms, namely particle swarm optimization, ant colony optimization and the genetic algorithm. In addition, to stimulate new research and development interests in this field, open research issues, challenges and future research directions are highlighted.
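
    For readers unfamiliar with the first of the three algorithms surveyed, a minimal, generic particle swarm optimization loop is sketched below against a made-up cluster-head placement cost; it is not tied to any specific protocol from the surveyed literature, and the swarm parameters are conventional defaults.

        import numpy as np

        rng = np.random.default_rng(1)
        sensors = rng.uniform(0, 100, size=(60, 2))      # random sensor field (metres)

        def energy_cost(head_xy):
            """Toy objective: total squared distance from sensors to one cluster head."""
            return np.sum((sensors - head_xy) ** 2)

        # standard PSO with inertia, cognitive and social terms
        n_particles, iters, w, c1, c2 = 20, 100, 0.7, 1.5, 1.5
        x = rng.uniform(0, 100, size=(n_particles, 2))
        v = np.zeros_like(x)
        pbest, pbest_val = x.copy(), np.array([energy_cost(p) for p in x])
        gbest = pbest[np.argmin(pbest_val)].copy()

        for _ in range(iters):
            r1, r2 = rng.random((n_particles, 1)), rng.random((n_particles, 1))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = np.clip(x + v, 0, 100)
            vals = np.array([energy_cost(p) for p in x])
            improved = vals < pbest_val
            pbest[improved], pbest_val[improved] = x[improved], vals[improved]
            gbest = pbest[np.argmin(pbest_val)].copy()

        print("best cluster-head position:", gbest, "cost:", energy_cost(gbest))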

  18. EFFECTS OF RELIGIOUS VERSUS STANDARD COGNITIVE-BEHAVIORAL THERAPY ON OPTIMISM IN PERSONS WITH MAJOR DEPRESSION AND CHRONIC MEDICAL ILLNESS.

    PubMed

    Koenig, Harold G; Pearce, Michelle J; Nelson, Bruce; Daher, Noha

    2015-11-01

    We compared the effectiveness of religiously integrated cognitive behavioral therapy (RCBT) versus standard CBT (SCBT) on increasing optimism in persons with major depressive disorder (MDD) and chronic medical illness. Participants aged 18-85 were randomized to either RCBT (n = 65) or SCBT (n = 67) to receive ten 50-min sessions remotely (94% by telephone) over 12 weeks. Optimism was assessed at baseline, 12 and 24 weeks by the Life Orientation Test-Revised. Religiosity was assessed at baseline using a 29-item scale composed of religious importance, individual religious practices, intrinsic religiosity, and daily spiritual experiences. Mixed effects growth curve models were used to compare the effects of treatment group on trajectory of change in optimism. In the intention-to-treat analysis, both RCBT and SCBT increased optimism over time, although there was no significant difference between treatment groups (B = -0.75, SE = 0.57, t = -1.33, P = .185). Analyses in the highly religious subgroup and in the per-protocol population indicated similar results. Higher baseline religiosity predicted an increase in optimism over time (B = 0.07, SE = 0.02, t = 4.12, P < .0001), and higher baseline optimism predicted a faster decline in depressive symptoms over time (B = -0.61, SE = 0.10, t = -6.30, P < .0001), both independent of treatment group. RCBT and SCBT are equally effective in increasing optimism in persons with MDD and chronic medical illness. While baseline religiosity does not moderate this effect, religiosity predicts increases in optimism over time independent of treatment group. © 2015 Wiley Periodicals, Inc.

  19. Practical quantum appointment scheduling

    NASA Astrophysics Data System (ADS)

    Touchette, Dave; Lovitz, Benjamin; Lütkenhaus, Norbert

    2018-04-01

    We propose a protocol based on coherent states and linear optics operations for solving the appointment-scheduling problem. Our main protocol leaks strictly less information about each party's input than the optimal classical protocol, even when considering experimental errors. Along with the ability to generate constant-amplitude coherent states over two modes, this protocol requires the ability to transfer these modes back-and-forth between the two parties multiple times with very low losses. The implementation requirements are thus still challenging. Along the way, we develop tools to study quantum information cost of interactive protocols in the finite regime.

  20. Optimization of a secondary VOI protocol for lung imaging in a clinical CT scanner.

    PubMed

    Larsen, Thomas C; Gopalakrishnan, Vissagan; Yao, Jianhua; Nguyen, Catherine P; Chen, Marcus Y; Moss, Joel; Wen, Han

    2018-05-21

    We present a solution to an unmet clinical need for an in-situ "close look" at a pulmonary nodule or at the margins of a pulmonary cyst revealed by a primary (screening) chest CT while the patient is still in the scanner. We first evaluated options available on current whole-body CT scanners for high resolution screening scans, including ROI reconstruction of the primary scan data and HRCT, but found them to have insufficient SNR in lung tissue or discontinuous slice coverage. Within the capabilities of current clinical CT systems, we opted for the solution of a secondary, volume-of-interest (VOI) protocol where the radiation dose is focused into a short-beam axial scan at the z position of interest, combined with a small-FOV reconstruction at the xy position of interest. The objective of this work was to design a VOI protocol that is optimized for targeted lung imaging in a clinical whole-body CT system. Using a chest phantom containing a lung-mimicking foam insert with a simulated cyst, we identified the appropriate scan mode and optimized both the scan and recon parameters. The VOI protocol yielded 3.2 times the texture amplitude-to-noise ratio in the lung-mimicking foam when compared to the standard chest CT, and 8.4 times the texture difference between the lung-mimicking and reference foams. It improved the depiction of the wall of the simulated cyst and gave better resolution in a line-pair insert. The effective dose of the secondary VOI protocol was on average 42% of that of the standard chest CT, and up to 100% in the worst-case scenario of VOI positioning. The optimized protocol will be used to obtain detailed CT textures of pulmonary lesions, which are biomarkers for the type and stage of lung diseases. Published 2018. This article is a U.S. Government work and is in the public domain in the USA.

  1. Optimization of a human IgG B-cell ELISpot assay for the analysis of vaccine-induced B-cell responses.

    PubMed

    Jahnmatz, Maja; Kesa, Gun; Netterlid, Eva; Buisman, Anne-Marie; Thorstensson, Rigmor; Ahlborg, Niklas

    2013-05-31

    B-cell responses after infection or vaccination are often measured as serum titers of antigen-specific antibodies. Since this does not address the aspect of memory B-cell activity, it may not give a complete picture of the B-cell response. Analysis of memory B cells by ELISpot is therefore an important complement to conventional serology. B-cell ELISpot was developed more than 25 years ago and many assay protocols/reagents would benefit from optimization. We therefore aimed at developing an optimized B-cell ELISpot for the analysis of vaccine-induced human IgG-secreting memory B cells. A protocol was developed based on new monoclonal antibodies to human IgG and biotin-avidin amplification to increase the sensitivity. After comparison of various compounds commonly used to activate memory B cells in vitro for ELISpot analysis, the TLR agonist R848 plus interleukin (IL)-2 was selected as the most efficient activator combination. The new protocol was subsequently compared to an established protocol, previously used in vaccine studies, based on polyclonal antibodies without biotin-avidin amplification and activation of memory B-cells using a mix of antigen, CpG, IL-2 and IL-10. The new protocol displayed significantly better detection sensitivity, shortened the incubation time needed for the activation of memory B cells and reduced the amount of antigen required for the assay. The functionality of the new protocol was confirmed by analyzing specific memory B cells to five different antigens, induced in a limited number of subjects vaccinated against tetanus, diphtheria and pertussis. The limited number of subjects did not allow for a direct comparison with other vaccine studies. Optimization of the B-cell ELISpot will facilitate an improved analysis of IgG-secreting B cells in vaccine studies. Copyright © 2013 Elsevier B.V. All rights reserved.

  2. Quantum teleportation scheme by selecting one of multiple output ports

    NASA Astrophysics Data System (ADS)

    Ishizaka, Satoshi; Hiroshima, Tohya

    2009-04-01

    The scheme of quantum teleportation, where Bob has multiple (N) output ports and obtains the teleported state by simply selecting one of the N ports, is thoroughly studied. We consider both the deterministic version and probabilistic version of the teleportation scheme aiming to teleport an unknown state of a qubit. Moreover, we consider two cases for each version: (i) the state employed for the teleportation is fixed to a maximally entangled state and (ii) the state is also optimized as well as Alice’s measurement. We analytically determine the optimal protocols for all the four cases and show the corresponding optimal fidelity or optimal success probability. All these protocols can achieve the perfect teleportation in the asymptotic limit of N→∞ . The entanglement properties of the teleportation scheme are also discussed.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sivak, David; Crooks, Gavin

    A fundamental problem in modern thermodynamics is how a molecular-scale machine performs useful work, while operating away from thermal equilibrium without excessive dissipation. To this end, we derive a friction tensor that induces a Riemannian manifold on the space of thermodynamic states. Within the linear-response regime, this metric structure controls the dissipation of finite-time transformations, and bestows optimal protocols with many useful properties. We discuss the connection to the existing thermodynamic length formalism, and demonstrate the utility of this metric by solving for optimal control parameter protocols in a simple nonequilibrium model.
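
    For orientation, the friction tensor referred to above is usually written as a time-integrated covariance of the forces conjugate to the control parameters; the notation below follows the standard thermodynamic-length literature and is supplied as background rather than quoted from the record.

        \[
          \zeta_{ij}(\boldsymbol{\lambda}) \;=\; \beta \int_0^{\infty} \mathrm{d}t \,
          \bigl\langle \delta X_i(t)\, \delta X_j(0) \bigr\rangle_{\boldsymbol{\lambda}},
          \qquad \delta X_i \equiv X_i - \langle X_i \rangle_{\boldsymbol{\lambda}},
        \]
        \[
          P_{\mathrm{ex}}(t) \;\approx\; \sum_{i,j} \dot{\lambda}_i \,
          \zeta_{ij}\bigl(\boldsymbol{\lambda}(t)\bigr)\, \dot{\lambda}_j ,
        \]

    where the X_i are the forces conjugate to the control parameters λ_i. Within this linear-response picture, minimum-dissipation protocols of fixed duration follow geodesics of the metric ζ and traverse them at constant excess power.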

  4. Optimizing the NASA Technical Report Server

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Maa, Ming-Hokng

    1996-01-01

    The NASA Technical Report Server (NTRS), a World Wide Web distribution service for NASA technical publications, is modified for performance enhancement, greater protocol support, and human interface optimization. Results include: parallel database queries, significantly decreasing user access times by an average factor of 2.3; access from clients behind firewalls and/or proxies which truncate excessively long Uniform Resource Locators (URLs); access to non-Wide Area Information Server (WAIS) databases and compatibility with the Z39.50 protocol; and a streamlined user interface.

  5. Modeling and Simulation of a Novel Relay Node Based Secure Routing Protocol Using Multiple Mobile Sink for Wireless Sensor Networks.

    PubMed

    Perumal, Madhumathy; Dhandapani, Sivakumar

    2015-01-01

    Data gathering and optimal path selection for wireless sensor networks (WSN) using existing protocols results in collisions. An increase in collisions further increases the possibility of packet drops. Thus there is a need to eliminate collisions during data aggregation. Increasing efficiency while maintaining maximum security is the need of the hour. This paper is an effort to develop a reliable, energy-efficient and secure WSN routing protocol with minimum delay. This technique is named the relay node based secure routing protocol for multiple mobile sinks (RSRPMS). This protocol finds the rendezvous point for optimal data transmission using a "splitting tree" technique in a tree-shaped network topology; the "Biased Random Walk" model is then used to determine all subsequent positions of a sink. In case of an event, the sink gathers the data from all sources when they are in the sensing range of the rendezvous point; otherwise, a relay node is selected from its neighbors to transfer packets from the rendezvous point to the sink. Symmetric key cryptography is used for secure transmission. The proposed RSRPMS protocol is evaluated in simulation and the results are compared with the Intelligent Agent-Based Routing (IAR) protocol, showing an increase in network lifetime compared with other routing protocols.

  6. Assisted closed-loop optimization of SSVEP-BCI efficiency

    PubMed Central

    Fernandez-Vargas, Jacobo; Pfaff, Hanns U.; Rodríguez, Francisco B.; Varona, Pablo

    2012-01-01

    We designed a novel assisted closed-loop optimization protocol to improve the efficiency of brain-computer interfaces (BCI) based on steady state visually evoked potentials (SSVEP). In traditional paradigms, the control over the BCI-performance completely depends on the subjects' ability to learn from the given feedback cues. By contrast, in the proposed protocol both the subject and the machine share information and control over the BCI goal. Generally, the innovative assistance consists in the delivery of online information together with the online adaptation of BCI stimuli properties. In our case, this adaptive optimization process is realized by (1) a closed-loop search for the best set of SSVEP flicker frequencies and (2) feedback of actual SSVEP magnitudes to both the subject and the machine. These closed-loop interactions between subject and machine are evaluated in real-time by continuous measurement of their efficiencies, which are used as online criteria to adapt the BCI control parameters. The proposed protocol aims to compensate for variability in possibly unknown subjects' state and trait dimensions. In a study with N = 18 subjects, we found significant evidence that our protocol outperformed classic SSVEP-BCI control paradigms. Evidence is presented that it indeed takes into account interindividual variabilities: e.g., under the new protocol, baseline resting state EEG measures predict subjects' BCI performances. This paper illustrates the promising potential of assisted closed-loop protocols in BCI systems. Their applicability might be expanded to innovative uses, e.g., as possible new diagnostic/therapeutic tools for clinical contexts and as new paradigms for basic research. PMID:23443214

  7. Assisted closed-loop optimization of SSVEP-BCI efficiency.

    PubMed

    Fernandez-Vargas, Jacobo; Pfaff, Hanns U; Rodríguez, Francisco B; Varona, Pablo

    2013-01-01

    We designed a novel assisted closed-loop optimization protocol to improve the efficiency of brain-computer interfaces (BCI) based on steady state visually evoked potentials (SSVEP). In traditional paradigms, the control over the BCI-performance completely depends on the subjects' ability to learn from the given feedback cues. By contrast, in the proposed protocol both the subject and the machine share information and control over the BCI goal. Generally, the innovative assistance consists in the delivery of online information together with the online adaptation of BCI stimuli properties. In our case, this adaptive optimization process is realized by (1) a closed-loop search for the best set of SSVEP flicker frequencies and (2) feedback of actual SSVEP magnitudes to both the subject and the machine. These closed-loop interactions between subject and machine are evaluated in real-time by continuous measurement of their efficiencies, which are used as online criteria to adapt the BCI control parameters. The proposed protocol aims to compensate for variability in possibly unknown subjects' state and trait dimensions. In a study with N = 18 subjects, we found significant evidence that our protocol outperformed classic SSVEP-BCI control paradigms. Evidence is presented that it indeed takes into account interindividual variabilities: e.g., under the new protocol, baseline resting state EEG measures predict subjects' BCI performances. This paper illustrates the promising potential of assisted closed-loop protocols in BCI systems. Their applicability might be expanded to innovative uses, e.g., as possible new diagnostic/therapeutic tools for clinical contexts and as new paradigms for basic research.
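
    A schematic of the closed-loop adaptation described above is sketched below: candidate flicker frequencies are scored by their online SSVEP magnitude and the weakest are dropped. The mocked EEG signal, the scoring rule, and the stopping criterion are assumptions for illustration only, not the study's implementation.

        import numpy as np

        rng = np.random.default_rng(2)
        fs, trial_len = 250, 4.0                       # sampling rate (Hz), seconds per trial
        candidates = [8.0, 10.0, 12.0, 15.0]           # candidate flicker frequencies (Hz)

        def record_eeg(freq):
            """Mock acquisition: SSVEP amplitude depends on an unknown subject profile."""
            subject_gain = {8.0: 0.4, 10.0: 1.0, 12.0: 0.7, 15.0: 0.3}[freq]
            t = np.arange(int(fs * trial_len)) / fs
            return subject_gain * np.sin(2 * np.pi * freq * t) + rng.normal(0, 1.0, t.size)

        def ssvep_magnitude(signal, freq):
            """Spectral amplitude at the FFT bin closest to the stimulation frequency."""
            spec = np.abs(np.fft.rfft(signal)) / signal.size
            freqs = np.fft.rfftfreq(signal.size, 1 / fs)
            return spec[np.argmin(np.abs(freqs - freq))]

        # closed loop: after each block, drop the weakest frequency until two remain
        active = list(candidates)
        while len(active) > 2:
            scores = {f: np.mean([ssvep_magnitude(record_eeg(f), f) for _ in range(5)])
                      for f in active}
            weakest = min(scores, key=scores.get)
            print("block scores:", {f: round(s, 3) for f, s in scores.items()},
                  "-> dropping", weakest, "Hz")
            active.remove(weakest)
        print("frequencies kept for BCI control:", active)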

  8. Protein analysis through Western blot of cells excised individually from human brain and muscle tissue

    PubMed Central

    Koob, A.O.; Bruns, L.; Prassler, C.; Masliah, E.; Klopstock, T.; Bender, A.

    2016-01-01

    Comparing protein levels from single cells in tissue has not been achieved through Western blot. Laser capture microdissection makes it possible to excise single cells from sectioned tissue and compile an aggregate of cells in lysis buffer. In this study we analyzed proteins from cells excised individually from brain and muscle tissue through Western blot. After we excised individual neurons from the substantia nigra of the brain, the accumulated surface area of the individual cells was 120,000, 24,000, 360,000, 480,000, 600,000 μm². We used an optimized Western blot protocol to probe for tyrosine hydroxylase in this cell pool. We also took 360,000 μm² of astrocytes (1700 cells) and analyzed the specificity of the method. In muscle we were able to analyze the proteins of the five complexes of the electron transport chain through Western blot from 200 human cells. With this method, we demonstrate the ability to compare cell-specific protein levels in the brain and muscle and describe for the first time how to visualize proteins through Western blot from cells captured individually. PMID:22402104

  9. Protein analysis through Western blot of cells excised individually from human brain and muscle tissue.

    PubMed

    Koob, A O; Bruns, L; Prassler, C; Masliah, E; Klopstock, T; Bender, A

    2012-06-15

    Comparing protein levels from single cells in tissue has not been achieved through Western blot. Laser capture microdissection makes it possible to excise single cells from sectioned tissue and compile an aggregate of cells in lysis buffer. In this study we analyzed proteins from cells excised individually from brain and muscle tissue through Western blot. After we excised individual neurons from the substantia nigra of the brain, the accumulated surface area of the individual cells was 120,000, 24,000, 360,000, 480,000, 600,000 μm². We used an optimized Western blot protocol to probe for tyrosine hydroxylase in this cell pool. We also took 360,000 μm² of astrocytes (1700 cells) and analyzed the specificity of the method. In muscle we were able to analyze the proteins of the five complexes of the electron transport chain through Western blot from 200 human cells. With this method, we demonstrate the ability to compare cell-specific protein levels in the brain and muscle and describe for the first time how to visualize proteins through Western blot from cells captured individually. Copyright © 2012 Elsevier Inc. All rights reserved.

  10. Continuous-variable measurement-device-independent quantum key distribution with virtual photon subtraction

    NASA Astrophysics Data System (ADS)

    Zhao, Yijia; Zhang, Yichen; Xu, Bingjie; Yu, Song; Guo, Hong

    2018-04-01

    The method of improving the performance of continuous-variable quantum key distribution protocols by postselection has been recently proposed and verified. In continuous-variable measurement-device-independent quantum key distribution (CV-MDI QKD) protocols, the measurement results are obtained from the untrusted third party Charlie. There is still no effective method of improving CV-MDI QKD by postselection with untrusted measurement. We propose a method to improve the performance of the coherent-state CV-MDI QKD protocol by virtual photon subtraction via non-Gaussian postselection. The non-Gaussian postselection of transmitted data is equivalent to an ideal photon subtraction on the two-mode squeezed vacuum state, which is favorable for enhancing the performance of CV-MDI QKD. In the CV-MDI QKD protocol with non-Gaussian postselection, the two users select their own data independently. We demonstrate that the optimal performance of the renovated CV-MDI QKD protocol is obtained with the transmitted data selected only by Alice. By setting appropriate parameters of the virtual photon subtraction, the secret key rate and tolerable excess noise are both improved at long transmission distances. The method provides an effective optimization scheme for the application of CV-MDI QKD protocols.

  11. Cryopreservation of GABAergic Neuronal Precursors for Cell-Based Therapy

    PubMed Central

    2017-01-01

    Cryopreservation protocols are essential for stem cell storage in order to apply them in the clinic. Here we describe a new standardized cryopreservation protocol for GABAergic neural precursors derived from the medial ganglionic eminence (MGE), a promising source of GABAergic neuronal progenitors for cell therapy against interneuron-related pathologies. We used 10% Me2SO as cryoprotectant and assessed the effects of cell culture amplification and cellular organization, as in toto explants, neurospheres, or individualized cells, on post-thaw cell viability and retrieval. We confirmed that in toto cryopreservation of MGE explants is an optimal preservation system to keep intact the interneuron precursor properties for cell transplantation, together with a high cell viability (>80%) and yield (>70%). Post-thaw proliferation and self-renewal of the cryopreserved precursors were tested in vitro. In addition, their migration capacity, acquisition of mature neuronal morphology, and potency to differentiate into multiple interneuron subtypes were also confirmed in vivo after transplantation. The results show that the cryopreserved precursor features remained intact and were similar to those immediately transplanted after their dissection from the MGE. We hope this protocol will facilitate the generation of biobanks to obtain a permanent and reliable source of GABAergic precursors for clinical application in cell-based therapies against interneuronopathies. PMID:28122047

  12. A practical guide for the identification of membrane and plasma membrane proteins in human embryonic stem cells and human embryonal carcinoma cells.

    PubMed

    Dormeyer, Wilma; van Hoof, Dennis; Mummery, Christine L; Krijgsveld, Jeroen; Heck, Albert J R

    2008-10-01

    The identification of (plasma) membrane proteins in cells can provide valuable insights into the regulation of their biological processes. Pluripotent cells such as human embryonic stem cells and embryonal carcinoma cells are capable of unlimited self-renewal and share many of the biological mechanisms that regulate proliferation and differentiation. The comparison of their membrane proteomes will help unravel the biological principles of pluripotency, and the identification of biomarker proteins in their plasma membranes is considered a crucial step to fully exploit pluripotent cells for therapeutic purposes. For these tasks, membrane proteomics is the method of choice, but as indicated by the scarce identification of membrane and plasma membrane proteins in global proteomic surveys it is not an easy task. In this minireview, we first describe the general challenges of membrane proteomics. We then review current sample preparation steps and discuss protocols that we found particularly beneficial for the identification of large numbers of (plasma) membrane proteins in human tumour- and embryo-derived stem cells. Our optimized assembled protocol led to the identification of a large number of membrane proteins. However, as the composition of cells and membranes is highly variable we still recommend adapting the sample preparation protocol for each individual system.

  13. CPAC: Energy-Efficient Data Collection through Adaptive Selection of Compression Algorithms for Sensor Networks

    PubMed Central

    Lee, HyungJune; Kim, HyunSeok; Chang, Ik Joon

    2014-01-01

    We propose a technique to optimize the energy efficiency of data collection in sensor networks by exploiting selective data compression. To achieve such an aim, we need to make optimal decisions regarding two aspects: (1) which sensor nodes should execute compression; and (2) which compression algorithm should be used by the selected sensor nodes. We formulate this problem as binary integer programs, which provide an energy-optimal solution under the given latency constraint. Our simulation results show that the optimization algorithm significantly reduces the overall network-wide energy consumption for data collection. In an environment with a stationary sink and stationary sensor nodes, the optimized data collection shows 47% energy savings compared to the state-of-the-art collection protocol (CTP). More importantly, we demonstrate that our optimized data collection provides the best performance in an intermittent network under high interference. In such networks, we found that the selective compression for frequent packet retransmissions saves up to 55% energy compared to the best known protocol. PMID:24721763
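
    The binary-decision structure of the formulation (which nodes compress, and with which algorithm, subject to a latency constraint) can be illustrated by exhaustively enumerating assignments for a tiny network, as sketched below. The energy and latency numbers are placeholders rather than CPAC's cost model, and a brute-force search stands in for the integer-programming solver.

        from itertools import product

        # per-node payload sizes in bytes (placeholder values)
        payload = {"n1": 4000, "n2": 1500, "n3": 6000}

        # candidate choices per node: no compression, or one of two algorithms
        # (ratio = compressed fraction, cpu_j = compression energy, lat_ms = added latency)
        choices = {
            "none": {"ratio": 1.00, "cpu_j": 0.0, "lat_ms": 0.0},
            "lzA":  {"ratio": 0.55, "cpu_j": 0.8, "lat_ms": 40.0},
            "lzB":  {"ratio": 0.40, "cpu_j": 2.0, "lat_ms": 110.0},
        }
        TX_J_PER_BYTE = 0.0008          # radio transmit energy per byte (placeholder)
        LATENCY_BUDGET_MS = 200.0       # end-to-end latency constraint

        def cost(assignment):
            energy = latency = 0.0
            for node, alg in assignment.items():
                c = choices[alg]
                energy += c["cpu_j"] + TX_J_PER_BYTE * payload[node] * c["ratio"]
                latency = max(latency, c["lat_ms"])   # nodes compress in parallel
            return energy, latency

        best = None
        for combo in product(choices, repeat=len(payload)):
            assignment = dict(zip(payload, combo))
            energy, latency = cost(assignment)
            if latency <= LATENCY_BUDGET_MS and (best is None or energy < best[0]):
                best = (energy, assignment)

        print("energy-optimal assignment under the latency budget:", best)

    Even in this toy instance the optimum is selective: the node with the small payload transmits uncompressed, while the larger payloads are compressed, mirroring the adaptive selection the record describes.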

  14. Sharing Service Resource Information for Application Integration in a Virtual Enterprise - Modeling the Communication Protocol for Exchanging Service Resource Information

    NASA Astrophysics Data System (ADS)

    Yamada, Hiroshi; Kawaguchi, Akira

    Grid computing and web service technologies enable us to use networked resources in a coordinated manner. An integrated service is made of individual services running on coordinated resources. In order to achieve such coordinated services autonomously, the initiator of a coordinated service needs to know detailed service resource information. This information ranges from static attributes like the IP address of the application server to highly dynamic ones like the CPU load. The most famous wide-area service discovery mechanism based on names is DNS. Its hierarchical tree organization and caching methods take advantage of the static information managed. However, in order to integrate business applications in a virtual enterprise, we need a discovery mechanism to search for the optimal resources based on a given set of criteria (search keys). In this paper, we propose a communication protocol for exchanging service resource information among wide-area systems. We introduce the concept of the service domain that consists of service providers managed under the same management policy. This concept of the service domain is similar to that for autonomous systems (ASs). In each service domain, the service resource information provider manages the service resource information of the service providers that exist in that domain. The service resource information provider exchanges this information with other service resource information providers that belong to different service domains. We also verified the protocol's behavior and effectiveness using a simulation model developed for the proposed protocol.

  15. Development of a pharmacokinetic-guided dose individualization strategy for hydroxyurea treatment in children with sickle cell anaemia.

    PubMed

    Dong, Min; McGann, Patrick T; Mizuno, Tomoyuki; Ware, Russell E; Vinks, Alexander A

    2016-04-01

    Hydroxyurea has emerged as the primary disease-modifying therapy for patients with sickle cell anaemia (SCA). The laboratory and clinical benefits of hydroxyurea are optimal at maximum tolerated dose (MTD), but the current empirical dose escalation process often takes up to 12 months. The purpose of this study was to develop a pharmacokinetic-guided dosing strategy to reduce the time required to reach hydroxyurea MTD in children with SCA. Pharmacokinetic (PK) data from the HUSTLE trial (NCT00305175) were used to develop a population PK model using non-linear mixed effects modelling (nonmem 7.2). A D-optimal sampling strategy was developed to estimate individual PK and hydroxyurea exposure (area under the concentration-time curve (AUC)). The initial AUC target was derived from HUSTLE clinical data and defined as the mean AUC at MTD. PK profiles were best described by a one-compartment model with Michaelis-Menten elimination and transit absorption. Body weight and cystatin C were identified as significant predictors of hydroxyurea clearance. The following clinically feasible sampling times are included in a new prospective protocol: pre-dose (baseline), 15-20 min, 50-60 min and 3 h after an initial 20 mg kg(-1) oral dose. The mean target AUC(0,∞) for initial dose titration was 115 mg l(-1) h. We developed a PK model-based individualized dosing strategy for the prospective Therapeutic Response Evaluation and Adherence Trial (TREAT, ClinicalTrials.gov NCT02286154). This approach has the potential to optimize the dose titration of hydroxyurea therapy for children with SCA, such that the clinical benefits at MTD are achieved more quickly. © 2015 The British Pharmacological Society.
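
    The structural model named above (transit-compartment absorption into a one-compartment model with Michaelis–Menten elimination) can be written down directly as a small ODE system. In the sketch below every parameter value is invented for illustration and none corresponds to the HUSTLE population estimates.

        import numpy as np
        from scipy.integrate import solve_ivp, trapezoid

        # structural model: dose -> chain of transit compartments -> central compartment
        # with Michaelis-Menten elimination; all parameter values are illustrative only
        n_transit, ktr = 3, 4.0          # number of transit compartments, transit rate (1/h)
        V, Vmax, Km = 20.0, 25.0, 15.0   # volume (L), max elimination (mg/h), Km (mg/L)

        def rhs(t, y):
            a = y[:n_transit]            # amounts in transit compartments (mg)
            C = y[n_transit]             # central concentration (mg/L)
            da = np.empty(n_transit)
            da[0] = -ktr * a[0]
            for i in range(1, n_transit):
                da[i] = ktr * (a[i - 1] - a[i])
            dC = ktr * a[-1] / V - Vmax * C / (Km + C) / V
            return np.append(da, dC)

        dose_mg = 20.0 * 30.0            # 20 mg/kg for a 30 kg child (hypothetical)
        y0 = np.zeros(n_transit + 1)
        y0[0] = dose_mg
        sol = solve_ivp(rhs, (0.0, 12.0), y0, max_step=0.05)

        C = sol.y[-1]
        auc = trapezoid(C, sol.t)        # AUC(0-12 h) by the trapezoidal rule, mg*h/L
        print(f"Cmax ~ {C.max():.1f} mg/L at t = {sol.t[np.argmax(C)]:.2f} h, "
              f"AUC(0-12h) ~ {auc:.0f} mg*h/L")

    Fitting such a structural model to sparse concentration data at the clinically feasible sampling times is what allows an individual AUC to be estimated and compared against the dosing target.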

  16. Development of a pharmacokinetic‐guided dose individualization strategy for hydroxyurea treatment in children with sickle cell anaemia

    PubMed Central

    Dong, Min; McGann, Patrick T.; Mizuno, Tomoyuki; Ware, Russell E.

    2016-01-01

    AIMS Hydroxyurea has emerged as the primary disease‐modifying therapy for patients with sickle cell anaemia (SCA). The laboratory and clinical benefits of hydroxyurea are optimal at maximum tolerated dose (MTD), but the current empirical dose escalation process often takes up to 12 months. The purpose of this study was to develop a pharmacokinetic‐guided dosing strategy to reduce the time required to reach hydroxyurea MTD in children with SCA. Methods Pharmacokinetic (PK) data from the HUSTLE trial (NCT00305175) were used to develop a population PK model using non‐linear mixed effects modelling (nonmem 7.2). A D‐optimal sampling strategy was developed to estimate individual PK and hydroxyurea exposure (area under the concentration–time curve (AUC)). The initial AUC target was derived from HUSTLE clinical data and defined as the mean AUC at MTD. Results PK profiles were best described by a one‐compartment model with Michaelis–Menten elimination and transit absorption. Body weight and cystatin C were identified as significant predictors of hydroxyurea clearance. The following clinically feasible sampling times are included in a new prospective protocol: pre‐dose (baseline), 15–20 min, 50–60 min and 3 h after an initial 20 mg kg–1 oral dose. The mean target AUC(0,∞) for initial dose titration was 115 mg l–1 h. Conclusion We developed a PK model‐based individualized dosing strategy for the prospective Therapeutic Response Evaluation and Adherence Trial (TREAT, ClinicalTrials.gov NCT02286154). This approach has the potential to optimize the dose titration of hydroxyurea therapy for children with SCA, such that the clinical benefits at MTD are achieved more quickly. PMID:26615061

  17. Optimal protocol for maximum work extraction in a feedback process with a time-varying potential

    NASA Astrophysics Data System (ADS)

    Kwon, Chulan

    2017-12-01

    The nonequilibrium nature of information thermodynamics is characterized by the inequality or non-negativity of the total entropy change of the system, memory, and reservoir. Mutual information change plays a crucial role in the inequality, in particular if work is extracted and the paradox of Maxwell's demon is raised. We consider the Brownian information engine where the protocol (the parameter set of the harmonic potential) is initially chosen by the measurement and varies in time. We confirm the inequality of the total entropy change by calculating, in detail, the entropic terms including the mutual information change. We rigorously find the optimal values of the time-dependent protocol for maximum extraction of work for both the finite-time and the quasi-static process.
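
    As background to the inequality discussed above (standard stochastic-thermodynamics notation; not an excerpt from the record), the non-negativity of the total entropy change including the mutual-information term, and the resulting bound on the work extractable in an isothermal feedback process, are commonly written as

        \[
          \Delta S_{\mathrm{tot}} \;=\; \Delta S_{\mathrm{sys}} + \Delta S_{\mathrm{mem}}
          + \Delta S_{\mathrm{res}} - \Delta I \;\ge\; 0 ,
        \]
        \[
          \langle W_{\mathrm{ext}} \rangle \;\le\; -\Delta F + k_{\mathrm{B}} T \, \langle I \rangle ,
        \]

    where I is the mutual information between the system and the memory established by the measurement and ΔF is the free-energy change of the system.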

  18. Control systems and coordination protocols of the secretory pathway.

    PubMed

    Luini, Alberto; Mavelli, Gabriella; Jung, Juan; Cancino, Jorge

    2014-01-01

    Like other cellular modules, the secretory pathway and the Golgi complex are likely to be supervised by control systems that support homeostasis and optimal functionality under all conditions, including external and internal perturbations. Moreover, the secretory apparatus must be functionally connected with other cellular modules, such as energy metabolism and protein degradation, via specific rules of interaction, or "coordination protocols". These regulatory devices are of fundamental importance for optimal function; however, they are generally "hidden" at steady state. The molecular components and the architecture of the control systems and coordination protocols of the secretory pathway are beginning to emerge through studies based on the use of controlled transport-specific perturbations aimed specifically at the detection and analysis of these internal regulatory devices.

  19. Retrospective MicroRNA Sequencing: Complementary DNA Library Preparation Protocol Using Formalin-fixed Paraffin-embedded RNA Specimens.

    PubMed

    Loudig, Olivier; Liu, Christina; Rohan, Thomas; Ben-Dov, Iddo Z

    2018-05-05

    Archived, clinically classified formalin-fixed paraffin-embedded (FFPE) tissues can provide nucleic acids for retrospective molecular studies of cancer development. By using non-invasive or pre-malignant lesions from patients who later develop invasive disease, gene expression analyses may help identify early molecular alterations that predispose to cancer risk. It is well described that nucleic acids recovered from FFPE tissues have undergone severe physical damage and chemical modifications, which make their analysis difficult and generally require adapted assays. MicroRNAs (miRNAs), however, which represent a small class of RNA molecules spanning only ~18-24 nucleotides, have been shown to withstand long-term storage and have been successfully analyzed in FFPE samples. Here we present a 3' barcoded complementary DNA (cDNA) library preparation protocol specifically optimized for the analysis of small RNAs extracted from archived tissues, which was recently demonstrated to be robust and highly reproducible with archived clinical specimens stored for up to 35 years. This library preparation is well adapted to the multiplex analysis of compromised/degraded material: RNA samples (up to 18) are ligated with individual 3' barcoded adapters and then pooled together for subsequent enzymatic and biochemical preparations prior to analysis. All purifications are performed by polyacrylamide gel electrophoresis (PAGE), which allows size-specific selection and enrichment of barcoded small RNA species. This cDNA library preparation is well adapted to minute RNA inputs, as a pilot polymerase chain reaction (PCR) allows determination of the specific amplification cycle needed to produce optimal amounts of material for next-generation sequencing (NGS). This approach was optimized for the use of degraded FFPE RNA from specimens archived for up to 35 years and provides highly reproducible NGS data.

  20. Optimization of Rb-82 PET acquisition and reconstruction protocols for myocardial perfusion defect detection

    NASA Astrophysics Data System (ADS)

    Tang, Jing; Rahmim, Arman; Lautamäki, Riikka; Lodge, Martin A.; Bengel, Frank M.; Tsui, Benjamin M. W.

    2009-05-01

    The purpose of this study is to optimize the dynamic Rb-82 cardiac PET acquisition and reconstruction protocols for maximum myocardial perfusion defect detection using realistic simulation data and task-based evaluation. Time activity curves (TACs) of different organs under both rest and stress conditions were extracted from dynamic Rb-82 PET images of five normal patients. Combined SimSET-GATE Monte Carlo simulation was used to generate nearly noise-free cardiac PET data from a time series of 3D NCAT phantoms with organ activities modeling different pre-scan delay times (PDTs) and total acquisition times (TATs). Poisson noise was added to the nearly noise-free projections and the OS-EM algorithm was applied to generate noisy reconstructed images. The channelized Hotelling observer (CHO) with 32 × 32 spatial templates corresponding to four octave-wide frequency channels was used to evaluate the images. The area under the ROC curve (AUC) was calculated from the CHO rating data as an index of image quality in terms of myocardial perfusion defect detection. Butterworth post-filtering with a 0.5 cycle cm-1 cutoff applied to OS-EM (21 subsets) reconstructed images generates the highest AUC values, while iteration numbers 1 to 4 show no difference in AUC. The optimized PDTs for both rest and stress conditions are found to be close to the crossing points of the left ventricular chamber and myocardium TACs, which may motivate an individualized PDT for patient data processing and image reconstruction. Shortening the TATs by <~3 min from the clinically employed acquisition time does not significantly affect myocardial perfusion defect detection for either rest or stress studies.
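
    For readers unfamiliar with the evaluation methodology, the sketch below builds a channelized Hotelling observer with four octave-wide frequency channels on 32 x 32 images and estimates an AUC on a synthetic detection task (a Gaussian defect in white noise). The simulated data are purely illustrative and unrelated to the Rb-82 study images.

```python
# Channelized Hotelling observer (CHO) sketch on a synthetic detection task.
import numpy as np

rng = np.random.default_rng(1)
N = 32
fx = np.fft.fftfreq(N)
FX, FY = np.meshgrid(fx, fx, indexing="ij")
rho = np.sqrt(FX**2 + FY**2)

# Four octave-wide radial passbands: [1/64,1/32), [1/32,1/16), [1/16,1/8), [1/8,1/4)
edges = [1/64, 1/32, 1/16, 1/8, 1/4]
channels = []
for lo, hi in zip(edges[:-1], edges[1:]):
    mask = (rho >= lo) & (rho < hi)
    tmpl = np.real(np.fft.ifft2(mask.astype(float)))   # spatial-domain channel template
    channels.append(tmpl.ravel())
U = np.array(channels)                                  # 4 x 1024 channel matrix

# Synthetic images: Gaussian "defect" (amplitude 0.5) in unit-variance white noise
yy, xx = np.mgrid[0:N, 0:N]
signal = 0.5 * np.exp(-((xx - N / 2)**2 + (yy - N / 2)**2) / (2 * 3.0**2))

def images(n, with_signal):
    imgs = rng.normal(size=(n, N * N))
    return imgs + signal.ravel() if with_signal else imgs

# Train the Hotelling template on channel outputs
v_sp, v_sa = images(500, True) @ U.T, images(500, False) @ U.T
S = 0.5 * (np.cov(v_sp.T) + np.cov(v_sa.T))             # pooled channel covariance
w = np.linalg.solve(S, v_sp.mean(0) - v_sa.mean(0))     # Hotelling discriminant

# Apply to an independent test set and estimate AUC (Mann-Whitney statistic)
t_sp = images(500, True) @ U.T @ w
t_sa = images(500, False) @ U.T @ w
diff = t_sp[:, None] - t_sa[None, :]
auc = (diff > 0).mean() + 0.5 * (diff == 0).mean()
print(f"CHO AUC on the synthetic task: {auc:.3f}")
```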

  1. Interactions of cortisol, testosterone, and resistance training: influence of circadian rhythms.

    PubMed

    Hayes, Lawrence D; Bickerstaff, Gordon F; Baker, Julien S

    2010-06-01

    Diurnal variation of sports performance usually peaks in the late afternoon, coinciding with increased body temperature. This circadian pattern of performance may be explained by the effect of increased core temperature on peripheral mechanisms, as neural drive does not appear to exhibit nycthemeral variation. This typical diurnal regularity has been reported in a variety of physical activities spanning the energy systems, from adenosine triphosphate-phosphocreatine (ATP-PC) to anaerobic and aerobic metabolism, and is evident across all muscle contractions (eccentric, isometric, concentric) in a large number of muscle groups. Increased nerve conduction velocity, joint suppleness, increased muscular blood flow, improvements of glycogenolysis and glycolysis, increased environmental temperature, and preferential meteorological conditions may all contribute to diurnal variation in physical performance. However, the diurnal variation in strength performance can be blunted by a repeated-morning resistance training protocol. Optimal adaptations to resistance training (muscle hypertrophy and strength increases) also seem to occur in the late afternoon, which is interesting, since cortisol and, particularly, testosterone (T) concentrations are higher in the morning. T has repeatedly been linked with resistance training adaptation, and higher concentrations appear preferential. This has been determined by suppression of endogenous production and exogenous supplementation. However, the cortisol (C)/T ratio may indicate the catabolic/anabolic environment of an organism due to their roles in protein degradation and protein synthesis, respectively. The morning elevated T level (seen as beneficial to achieve muscle hypertrophy) may be counteracted by the morning elevated C level and, therefore, protein degradation. Although T levels are higher in the morning, an increased resistance exercise-induced T response has been found in the late afternoon, suggesting greater responsiveness of the hypothalamo-pituitary-testicular axis at that time of day. Individual responsiveness has also been observed, with some participants experiencing greater hypertrophy and strength increases in response to strength protocols, whereas others respond preferentially to power, hypertrophy, or strength endurance protocols dependent on which protocol elicited the greatest T response. It appears that physical performance is dependent on a number of endogenous time-dependent factors, which may be masked or confounded by exogenous circadian factors. Strength performance without time-of-day-specific training seems to elicit the typical diurnal pattern, as do resistance training adaptations. The implications for this are (a) athletes are advised to coincide training times with performance times, and (b) individuals may experience greater hypertrophy and strength gains when resistance training protocols are designed dependent on individual T response.

  2. Optimized quantum sensing with a single electron spin using real-time adaptive measurements.

    PubMed

    Bonato, C; Blok, M S; Dinani, H T; Berry, D W; Markham, M L; Twitchen, D J; Hanson, R

    2016-03-01

    Quantum sensors based on single solid-state spins promise a unique combination of sensitivity and spatial resolution. The key challenge in sensing is to achieve minimum estimation uncertainty within a given time and with high dynamic range. Adaptive strategies have been proposed to achieve optimal performance, but their implementation in solid-state systems has been hindered by the demanding experimental requirements. Here, we realize adaptive d.c. sensing by combining single-shot readout of an electron spin in diamond with fast feedback. By adapting the spin readout basis in real time based on previous outcomes, we demonstrate a sensitivity in Ramsey interferometry surpassing the standard measurement limit. Furthermore, we find by simulations and experiments that adaptive protocols offer a distinctive advantage over the best known non-adaptive protocols when overhead and limited estimation time are taken into account. Using an optimized adaptive protocol we achieve a magnetic field sensitivity of 6.1 ± 1.7 nT Hz(-1/2) over a wide range of 1.78 mT. These results open up a new class of experiments for solid-state sensors in which real-time knowledge of the measurement history is exploited to obtain optimal performance.

  3. Optimized quantum sensing with a single electron spin using real-time adaptive measurements

    NASA Astrophysics Data System (ADS)

    Bonato, C.; Blok, M. S.; Dinani, H. T.; Berry, D. W.; Markham, M. L.; Twitchen, D. J.; Hanson, R.

    2016-03-01

    Quantum sensors based on single solid-state spins promise a unique combination of sensitivity and spatial resolution. The key challenge in sensing is to achieve minimum estimation uncertainty within a given time and with high dynamic range. Adaptive strategies have been proposed to achieve optimal performance, but their implementation in solid-state systems has been hindered by the demanding experimental requirements. Here, we realize adaptive d.c. sensing by combining single-shot readout of an electron spin in diamond with fast feedback. By adapting the spin readout basis in real time based on previous outcomes, we demonstrate a sensitivity in Ramsey interferometry surpassing the standard measurement limit. Furthermore, we find by simulations and experiments that adaptive protocols offer a distinctive advantage over the best known non-adaptive protocols when overhead and limited estimation time are taken into account. Using an optimized adaptive protocol we achieve a magnetic field sensitivity of 6.1 ± 1.7 nT Hz-1/2 over a wide range of 1.78 mT. These results open up a new class of experiments for solid-state sensors in which real-time knowledge of the measurement history is exploited to obtain optimal performance.

  4. Modelling optimal location for pre-hospital helicopter emergency medical services.

    PubMed

    Schuurman, Nadine; Bell, Nathaniel J; L'Heureux, Randy; Hameed, Syed M

    2009-05-09

    Increasing the range and scope of early activation/auto launch helicopter emergency medical services (HEMS) may alleviate unnecessary injury mortality that disproportionately affects rural populations. To date, attempts to develop a quantitative framework for the optimal location of HEMS facilities have been absent. Our analysis used five years of critical care data from tertiary health care facilities, spatial data on origin of transport and accurate road travel time catchments for tertiary centres. A location optimization model was developed to identify where the expansion of HEMS would cover the greatest population among those currently underserved. The protocol was developed using geographic information systems (GIS) to measure populations, distances and accessibility to services. Our model determined that Royal Inland Hospital (RIH) was the optimal site for an expanded HEMS, based on denominator population, distance to services and historical usage patterns. GIS-based protocols for locating emergency medical resources can provide supportive evidence for allocation decisions, especially when resources are limited. In this study, we were able to demonstrate conclusively that a logical choice exists for the location of additional HEMS. This protocol could be extended to location analysis for other emergency and health services.
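
    A generic greedy maximal-coverage sketch in the spirit of the location model described above: pick the candidate base that covers the largest currently underserved population within a travel-time threshold. Site names, travel times and populations are invented placeholders, not the study's GIS inputs.

```python
# Greedy selection of the candidate site covering the most underserved population.
from typing import Dict

THRESHOLD_MIN = 60.0    # travel-time threshold, minutes (hypothetical)

# travel_time[site][community] in minutes; populations per community (hypothetical)
travel_time: Dict[str, Dict[str, float]] = {
    "SiteA": {"c1": 35.0, "c2": 80.0, "c3": 55.0},
    "SiteB": {"c1": 90.0, "c2": 40.0, "c3": 50.0},
}
population = {"c1": 12000, "c2": 30000, "c3": 8000}
already_served = {"c1"}          # communities already within reach of existing services

def newly_covered(site: str) -> int:
    """Population newly covered if a base is opened at `site`."""
    return sum(population[c] for c, t in travel_time[site].items()
               if t <= THRESHOLD_MIN and c not in already_served)

best_site = max(travel_time, key=newly_covered)
print(f"best candidate: {best_site}, newly covered population: {newly_covered(best_site)}")
```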

  5. Optimal eavesdropping in cryptography with three-dimensional quantum states.

    PubMed

    Bruss, D; Macchiavello, C

    2002-03-25

    We study optimal eavesdropping in quantum cryptography with three-dimensional systems, and show that this scheme is more secure against symmetric attacks than protocols using two-dimensional states. We generalize the corresponding eavesdropping transformation to arbitrary dimensions, and discuss the connection with optimal quantum cloning.

  6. Rationally optimized cryopreservation of multiple mouse embryonic stem cell lines: II—Mathematical prediction and experimental validation of optimal cryopreservation protocols☆

    PubMed Central

    Kashuba, Corinna M.; Benson, James D.; Critser, John K.

    2014-01-01

    In Part I, we documented differences in cryopreservation success measured by membrane integrity in four mouse embryonic stem cell (mESC) lines from different genetic backgrounds (BALB/c, CBA, FVB, and 129R1), and we demonstrated a potential biophysical basis for these differences through a comparative study characterizing the membrane permeability characteristics and osmotic tolerance limits of each cell line. Here we use these values to predict optimal cryoprotectants, cooling rates, warming rates, and plunge temperatures. We subsequently verified these predictions experimentally for their effects on post-thaw recovery. From this study, we determined that a cryopreservation protocol utilizing 1 M propylene glycol, a cooling rate of 1 °C/minute, and plunging into liquid nitrogen at −41 °C, combined with subsequent warming in a 22 °C water bath with agitation, significantly improved post-thaw recovery for three of the four mESC lines, and did not diminish post-thaw recovery for our single exception. It is proposed that this protocol can be successfully applied to most mESC lines beyond those included within this study once the effect of propylene glycol on mESC gene expression, growth characteristics, and germ-line transmission has been determined. Mouse ESC lines with poor survival using current standard cryopreservation protocols or our proposed protocol can be optimized on a case-by-case basis using the method we have outlined over two papers. For our single exception, the CBA cell line, a cooling rate of 5 °C/minute in the presence of 1.0 M dimethyl sulfoxide or 1.0 M propylene glycol, combined with plunge temperature of −80 °C was optimal. PMID:24560712

  7. Current Perspectives on Profiling and Enhancing Wheelchair Court Sport Performance.

    PubMed

    Paulson, Thomas; Goosey-Tolfrey, Victoria

    2017-03-01

    Despite the growing interest in Paralympic sport, the evidence base for supporting elite wheelchair sport performance remains in its infancy when compared with able-bodied (AB) sport. Subsequently, current practice is often based on theory adapted from AB guidelines, with a heavy reliance on anecdotal evidence and practitioner experience. Many principles in training prescription and performance monitoring with wheelchair athletes are directly transferable from AB practice, including the periodization and tapering of athlete loads around competition, yet considerations for the physiological consequences of an athlete's impairment and the interface between athlete and equipment are vital when targeting interventions to optimize in-competition performance. Researchers and practitioners are faced with the challenge of identifying and implementing reliable protocols that detect small but meaningful changes in impairment-specific physical capacities and on-court performance. Technologies to profile both linear and rotational on-court performance are an essential component of sport-science support to understand sport-specific movement profiles and prescribe training intensities. In addition, an individualized approach to the prescription of athlete training and optimization of the "wheelchair-user interface" is required, accounting for an athlete's anthropometrics, sports classification, and positional role on court. In addition to enhancing physical capacities, interventions must focus on the integration of the athlete and his or her equipment, as well as techniques for limiting environmental influence on performance. Taken together, the optimization of wheelchair sport performance requires a multidisciplinary approach based on the individual requirements of each athlete.

  8. Optimization of OSPF Routing in IP Networks

    NASA Astrophysics Data System (ADS)

    Bley, Andreas; Fortz, Bernard; Gourdin, Eric; Holmberg, Kaj; Klopfenstein, Olivier; Pióro, Michał; Tomaszewski, Artur; Ümit, Hakan

    The Internet is a huge world-wide packet switching network comprised of more than 13,000 distinct subnetworks, referred to as Autonomous Systems (ASs). They all rely on the Internet Protocol (IP) for transport of packets across the network, and most of them use shortest path routing protocols, such as OSPF or IS-IS, to control the routing of IP packets within an AS. The idea of the routing is extremely simple: every packet is forwarded on IP links along the shortest route between its source and destination nodes of the AS. The AS network administrator can manage the routing of packets in the AS by supplying the so-called administrative weights of IP links, which specify the link lengths that are used by the routing protocols for their shortest path computations. The main advantage of the shortest path routing policy is its simplicity, allowing for little administrative overhead. From the network engineering perspective, however, shortest path routing can pose problems in achieving satisfactory traffic handling efficiency. As all routing paths depend on the same routing metric, it is not possible to configure the routing paths for the communication demands between different pairs of nodes explicitly or individually; the routing can be controlled only indirectly and only as a whole by modifying the routing metric. Thus, one of the main tasks when planning such networks is to find administrative link weights that induce a globally efficient traffic routing configuration of an AS. It turns out that this task leads to very difficult mathematical optimization problems. In this chapter, we discuss and describe exact integer programming models and solution approaches as well as practically efficient smart heuristics for such shortest path routing problems.
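
    The evaluation step at the heart of such weight-setting heuristics can be sketched as follows: given one administrative weight setting, route every demand along a shortest path and report the maximum link utilization; a search heuristic would repeat this evaluation while perturbing the weights. The toy topology and demands below are invented, and ECMP load splitting is ignored for brevity.

```python
# Evaluate one OSPF-style administrative weight setting on a toy network.
import networkx as nx

G = nx.Graph()
# (node, node, administrative weight, capacity)
links = [("A", "B", 1, 10.0), ("B", "C", 1, 10.0), ("A", "C", 3, 10.0), ("C", "D", 1, 10.0)]
for u, v, w, cap in links:
    G.add_edge(u, v, weight=w, capacity=cap, load=0.0)

demands = {("A", "C"): 6.0, ("A", "D"): 4.0, ("B", "D"): 3.0}
for (s, t), volume in demands.items():
    path = nx.shortest_path(G, s, t, weight="weight")   # single shortest path (no ECMP)
    for u, v in zip(path[:-1], path[1:]):
        G[u][v]["load"] += volume

max_util = max(d["load"] / d["capacity"] for _, _, d in G.edges(data=True))
print(f"max link utilization for this weight setting: {max_util:.2f}")
```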

  9. iCLIP: Protein–RNA interactions at nucleotide resolution

    PubMed Central

    Huppertz, Ina; Attig, Jan; D’Ambrogio, Andrea; Easton, Laura E.; Sibley, Christopher R.; Sugimoto, Yoichiro; Tajnik, Mojca; König, Julian; Ule, Jernej

    2014-01-01

    RNA-binding proteins (RBPs) are key players in the post-transcriptional regulation of gene expression. Precise knowledge about their binding sites is therefore critical to unravel their molecular function and to understand their role in development and disease. Individual-nucleotide resolution UV crosslinking and immunoprecipitation (iCLIP) identifies protein–RNA crosslink sites on a genome-wide scale. The high resolution and specificity of this method are achieved by an intramolecular cDNA circularization step that enables analysis of cDNAs that truncate at the protein–RNA crosslink sites. Here, we describe the improved iCLIP protocol and discuss critical optimization and control experiments that are required when applying the method to new RBPs. PMID:24184352

  10. Genetic Interaction Mapping in Schizosaccharomyces pombe Using the Pombe Epistasis Mapper (PEM) System and a ROTOR HDA Colony Replicating Robot in a 1536 Array Format.

    PubMed

    Roguev, Assen; Xu, Jiewei; Krogan, Nevan

    2018-02-01

    This protocol describes an optimized high-throughput procedure for generating double deletion mutants in Schizosaccharomyces pombe using the colony replicating robot ROTOR HDA and the PEM (pombe epistasis mapper) system. The method is based on generating high-density colony arrays (1536 colonies per agar plate) and passaging them through a series of antidiploid and mating-type selection (ADS-MTS) and double-mutant selection (DMS) steps. Detailed program parameters for each individual replication step are provided. Using this procedure, batches of 25 or more screens can be routinely performed. © 2018 Cold Spring Harbor Laboratory Press.

  11. Deep Vein Thrombosis Prophylaxis: State of the Art.

    PubMed

    Lieberman, Jay R

    2018-03-21

    The selection of a prophylaxis regimen to prevent symptomatic pulmonary embolism and deep vein thrombosis is a balance between efficacy and safety. The latest American Academy of Orthopaedic Surgeons guideline recommended that either chemoprophylaxis or mechanical prophylaxis be used after total joint arthroplasty but did not recommend specific agents. However, the latest evidence-based American College of Chest Physicians guideline recommended a variety of chemoprophylaxis and mechanical agents for a minimum of 10 to 14 days after total joint arthroplasty. Risk stratification is the key to the selection of the appropriate prophylaxis regimen for the individual patient, but the optimal risk stratification protocol still needs to be developed. Copyright © 2018. Published by Elsevier Inc.

  12. Continuous-variable quantum key distribution in non-Markovian channels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vasile, Ruggero; Olivares, Stefano; CNISM, Unita di Ricerca di Milano Universita, I-20133 Milano

    2011-04-15

    We address continuous-variable quantum key distribution (QKD) in non-Markovian lossy channels and show how the non-Markovian features may be exploited to enhance security and/or to detect the presence and the position of an eavesdropper along the transmission line. In particular, we suggest a coherent-state QKD protocol which is secure against Gaussian individual attacks based on optimal 1→2 asymmetric cloning machines for arbitrarily low values of the overall transmission line. The scheme relies on specific non-Markovian properties, and cannot be implemented in ordinary Markovian channels characterized by uniform losses. Our results give a clear indication of the potential impact of non-Markovian effects in QKD.

  13. Optimization and comparison of simultaneous and separate acquisition protocols for dual isotope myocardial perfusion SPECT

    PubMed Central

    Ghaly, Michael; Links, Jonathan M; Frey, Eric C

    2015-01-01

    Dual-isotope simultaneous-acquisition (DISA) rest-stress myocardial perfusion SPECT (MPS) protocols offer a number of advantages over separate acquisition. However, crosstalk contamination due to scatter in the patient and interactions in the collimator degrade image quality. Compensation can reduce the effects of crosstalk, but does not entirely eliminate image degradations. Optimizing acquisition parameters could further reduce the impact of crosstalk. In this paper we investigate the optimization of the rest Tl-201 energy window width and relative injected activities using the ideal observer (IO), a realistic digital phantom population and Monte Carlo (MC) simulated Tc-99m and Tl-201 projections as a means to improve image quality. We compared performance on a perfusion defect detection task for Tl-201 acquisition energy window widths varying from 4 to 40 keV centered at 72 keV for a camera with a 9% energy resolution. We also investigated 7 different relative injected activities, defined as the ratio of Tc-99m and Tl-201 activities, while keeping the total effective dose constant at 13.5 mSv. For each energy window and relative injected activity, we computed the IO test statistics using a Markov chain Monte Carlo (MCMC) method for an ensemble of 1,620 triplets of fixed defect-present, reversible defect-present, and defect-absent noisy images modeling realistic background variations. The volume under the 3-class receiver operating characteristic (ROC) surface (VUS) was estimated and served as the figure of merit. For simultaneous acquisition, the IO suggested that relative Tc-to-Tl injected activity ratios of 2.6–5 and acquisition energy window widths of 16–22% were optimal. For separate acquisition, we observed a broad range of optimal relative injected activities from 2.6 to 12.1 and acquisition energy window widths of 16–22%. A negative correlation between Tl-201 injected activity and the width of the Tl-201 energy window was observed in these ranges. The results also suggested that DISA methods could potentially provide image quality as good as that obtained with separate acquisition protocols. We compared observer performance for the optimized protocols and the current clinical protocol using separate acquisition. The current clinical protocols provided better performance at a cost of injecting the patient with approximately double the injected activity of Tc-99m and Tl-201, resulting in substantially increased radiation dose. PMID:26083239
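
    As a simplified illustration of the figure of merit, the snippet below estimates a volume under the ROC surface for a scalar rating and three ordered classes by counting correctly ordered cross-class triplets. The study's ideal observer uses a vector-valued test statistic, so this is only a scalar-rating analogue with invented rating distributions.

```python
# Triplet-counting estimate of the volume under the 3-class ROC surface (VUS)
# for a scalar rating; the rating distributions are hypothetical.
import numpy as np

rng = np.random.default_rng(2)
r_absent = rng.normal(0.0, 1.0, 300)    # hypothetical ratings, defect-absent
r_fixed = rng.normal(0.8, 1.0, 300)     # fixed-defect images
r_revers = rng.normal(1.6, 1.0, 300)    # reversible-defect images

# A triplet is correctly ordered when absent < fixed < reversible
correct = (r_absent[:, None, None] < r_fixed[None, :, None]) & \
          (r_fixed[None, :, None] < r_revers[None, None, :])
vus = correct.mean()                     # chance level for three classes is 1/6
print(f"estimated VUS = {vus:.3f}")
```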

  14. The Interface of Clinical Decision-Making With Study Protocols for Knowledge Translation From a Walking Recovery Trial.

    PubMed

    Hershberg, Julie A; Rose, Dorian K; Tilson, Julie K; Brutsch, Bettina; Correa, Anita; Gallichio, Joann; McLeod, Molly; Moore, Craig; Wu, Sam; Duncan, Pamela W; Behrman, Andrea L

    2017-01-01

    Despite efforts to translate knowledge into clinical practice, barriers often arise in adapting the strict protocols of a randomized, controlled trial (RCT) to the individual patient. The Locomotor Experience Applied Post-Stroke (LEAPS) RCT demonstrated equal effectiveness of 2 intervention protocols for walking recovery poststroke; both protocols were more effective than usual care physical therapy. The purpose of this article was to provide knowledge-translation tools to facilitate implementation of the LEAPS RCT protocols into clinical practice. Participants from 2 of the trial's intervention arms, (1) the early Locomotor Training Program (LTP) and (2) the Home Exercise Program (HEP), were chosen for case presentation. The two cases illustrate how the protocols are used in synergy with individual patient presentations and clinical expertise. Decision algorithms and guidelines for progression represent the interface between implementation of an RCT standardized intervention protocol and clinical decision-making. In each case, the participant presents with a distinct clinical challenge that the therapist addresses by integrating the participant's unique presentation with the therapist's expertise while maintaining fidelity to the LEAPS protocol. Both participants progressed through an increasingly challenging intervention despite their own unique presentations. Decision algorithms and exercise progression for the LTP and HEP protocols facilitate translation of the RCT protocol to the real world of clinical practice. The two case examples facilitate translation of the LEAPS RCT into clinical practice by enhancing understanding of the protocols, their progression, and their application to individual participants. Video Abstract available for more insights from the authors (see Supplemental Digital Content 1, available at: http://links.lww.com/JNPT/A147).

  15. The international experience of bacterial screen testing of platelet components with an automated microbial detection system: a need for consensus testing and reporting guidelines.

    PubMed

    Benjamin, Richard J; McDonald, Carl P

    2014-04-01

    The BacT/ALERT microbial detection system (bioMerieux, Inc, Durham, NC) is in routine use in many blood centers as a prerelease test for platelet collections. Published reports document wide variation in practices and outcomes. A systematic review of the English literature was performed to describe publications assessing the use of the BacT/ALERT culture system on platelet collections as a routine screen test of more than 10000 platelet components. Sixteen publications report the use of confirmatory testing to substantiate initial positive culture results but use varying nomenclature to classify the results. Preanalytical and analytical variables that may affect the outcomes differ widely between centers. Incomplete description of protocol details complicates comparison between sites. Initial positive culture results range from 539 to 10606 per million (0.054%-1.061%) and confirmed positive from 127 to 1035 per million (0.013%-0.104%) donations. False-negative results determined by outdate culture range from 662 to 2173 per million (0.066%-0.217%) and by septic reactions from 0 to 66 per million (0%-0.007%) collections. Current culture protocols represent pragmatic compromises between optimizing analytical sensitivity and ensuring the timely availability of platelets for clinical needs. Insights into the effect of protocol variations on outcomes are generally restricted to individual sites that implement limited changes to their protocols over time. Platelet manufacturers should reassess the adequacy of their BacT/ALERT screening protocols in light of the growing international experience and provide detailed documentation of all variables that may affect culture outcomes when reporting results. We propose a framework for a standardized nomenclature for reporting of the results of BacT/ALERT screening. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. An Adaptive Cultural Algorithm with Improved Quantum-behaved Particle Swarm Optimization for Sonar Image Detection.

    PubMed

    Wang, Xingmei; Hao, Wenqian; Li, Qiming

    2017-12-18

    This paper proposes an adaptive cultural algorithm with improved quantum-behaved particle swarm optimization (ACA-IQPSO) for underwater sonar image detection. In the population space, to improve the searching ability of particles, the iteration count and the fitness values of particles are used as factors to adaptively adjust the contraction-expansion coefficient of the quantum-behaved particle swarm optimization (QPSO) algorithm. The improved quantum-behaved particle swarm optimization algorithm (IQPSO) can make particles adjust their behaviours according to their quality. In the belief space, a new update strategy is adopted to update cultural individuals according to the idea of the update strategy in the shuffled frog leaping algorithm (SFLA). Moreover, to enhance the utilization of information in the population space and belief space, the acceptance function and influence function are redesigned in the new communication protocol. The experimental results show that ACA-IQPSO can obtain good clustering centres according to the grey-level distribution information of underwater sonar images, and accurately complete underwater object detection. Compared with other algorithms, the proposed ACA-IQPSO has good effectiveness, excellent adaptability, a powerful searching ability and high convergence efficiency. Meanwhile, the experimental results on the benchmark functions further demonstrate that the proposed ACA-IQPSO has better searching ability, convergence efficiency and stability.
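
    A minimal QPSO sketch on the sphere function is shown below, with the contraction-expansion coefficient decreased linearly over iterations; the paper's additional fitness-based adaptation, cultural belief space and sonar-image clustering objective are not reproduced here.

```python
# Basic quantum-behaved PSO (QPSO) with a linearly decreasing
# contraction-expansion coefficient, applied to the sphere function.
import numpy as np

rng = np.random.default_rng(3)
dim, n_particles, n_iter = 5, 30, 200
f = lambda x: np.sum(x**2, axis=-1)            # objective (sphere function)

x = rng.uniform(-5.0, 5.0, (n_particles, dim))
pbest, pbest_val = x.copy(), f(x)

for it in range(n_iter):
    gbest = pbest[np.argmin(pbest_val)]
    beta = 1.0 - 0.5 * it / n_iter             # contraction-expansion schedule
    mbest = pbest.mean(axis=0)                  # mean of personal best positions
    phi = rng.random((n_particles, dim))
    p = phi * pbest + (1.0 - phi) * gbest       # local attractor per particle
    u = 1.0 - rng.random((n_particles, dim))    # uniform in (0, 1]
    sign = np.where(rng.random((n_particles, dim)) < 0.5, -1.0, 1.0)
    x = p + sign * beta * np.abs(mbest - x) * np.log(1.0 / u)
    val = f(x)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], val[improved]

print(f"best value after {n_iter} iterations: {pbest_val.min():.2e}")
```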

  17. Cancer treatment as a game: integrating evolutionary game theory into the optimal control of chemotherapy

    NASA Astrophysics Data System (ADS)

    Orlando, Paul A.; Gatenby, Robert A.; Brown, Joel S.

    2012-12-01

    Chemotherapy for metastatic cancer commonly fails due to evolution of drug resistance in tumor cells. Here, we view cancer treatment as a game in which the oncologists choose a therapy and tumors ‘choose’ an adaptive strategy. We propose that the oncologist can gain an upper hand in the game by choosing treatment strategies that anticipate the adaptations of the tumor. In particular, we examine the potential benefit of exploiting evolutionary tradeoffs in tumor adaptations to therapy. We analyze a mathematical model in which cancer cells face tradeoffs in the allocation of resistance to two drugs. The tumor ‘chooses’ its strategy by natural selection and the oncologist chooses her strategy by solving a control problem. We find that when tumor cells perform best by investing resources to maximize response to one drug, the optimal therapy is a time-invariant delivery of both drugs simultaneously. However, if cancer cells perform better using a generalist strategy allowing resistance to both drugs simultaneously, then the optimal protocol is a time-varying solution in which the two drug concentrations negatively covary. Drug interactions, however, can significantly alter these results. We conclude that knowledge of both evolutionary tradeoffs and drug interactions is crucial in planning optimal chemotherapy schedules for individual patients.

  18. E-novo: an automated workflow for efficient structure-based lead optimization.

    PubMed

    Pearce, Bradley C; Langley, David R; Kang, Jia; Huang, Hongwei; Kulkarni, Amit

    2009-07-01

    An automated E-Novo protocol designed as a structure-based lead optimization tool was prepared through Pipeline Pilot with existing CHARMm components in Discovery Studio. A scaffold core having 3D binding coordinates of interest is generated from a ligand-bound protein structural model. Ligands of interest are generated from the scaffold using an R-group fragmentation/enumeration tool within E-Novo, with their cores aligned. The ligand side chains are conformationally sampled and are subjected to core-constrained protein docking, using a modified CHARMm-based CDOCKER method to generate top poses along with CDOCKER energies. In the final stage of E-Novo, a physics-based binding energy scoring function ranks the top ligand CDOCKER poses using a more accurate Molecular Mechanics-Generalized Born with Surface Area method. Correlation of the calculated ligand binding energies with experimental binding affinities was used to validate protocol performance. Inhibitors of Src tyrosine kinase, CDK2 kinase, beta-secretase, factor Xa, HIV protease, and thrombin were used to test the protocol using published ligand crystal structure data within reasonably defined binding sites. In-house Respiratory Syncytial Virus inhibitor data were used as a more challenging test set using a hand-built binding model. Least squares fits for all data sets suggested reasonable validation of the protocol within the context of observed ligand binding poses. The E-Novo protocol provides a convenient all-in-one structure-based design process for rapid assessment and scoring of lead optimization libraries.
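
    The validation step described in the last few sentences amounts to a least-squares correlation between computed binding energies and experimental affinities; a sketch with invented numbers is shown below (the values are placeholders, not E-Novo output).

```python
# Least-squares fit of hypothetical computed binding energies against
# hypothetical experimental affinities, reporting slope, intercept and R^2.
import numpy as np

calc_dG = np.array([-9.1, -8.4, -7.9, -10.2, -6.8, -8.8])   # kcal/mol (hypothetical)
exp_pIC50 = np.array([7.9, 7.2, 6.8, 8.6, 5.9, 7.5])         # hypothetical affinities

slope, intercept = np.polyfit(calc_dG, exp_pIC50, 1)
pred = slope * calc_dG + intercept
ss_res = np.sum((exp_pIC50 - pred)**2)
ss_tot = np.sum((exp_pIC50 - exp_pIC50.mean())**2)
r2 = 1.0 - ss_res / ss_tot
print(f"fit: pIC50 = {slope:.2f} * dG + {intercept:.2f}, R^2 = {r2:.2f}")
```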

  19. Addressing practical challenges in utility optimization of mobile wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Eswaran, Sharanya; Misra, Archan; La Porta, Thomas; Leung, Kin

    2008-04-01

    This paper examines the practical challenges in the application of the distributed network utility maximization (NUM) framework to the problem of resource allocation and sensor device adaptation in a mission-centric wireless sensor network (WSN) environment. By providing rich (multi-modal), real-time information about a variety of (often inaccessible or hostile) operating environments, sensors such as video, acoustic and short-aperture radar enhance the situational awareness of many battlefield missions. Prior work on the applicability of the NUM framework to mission-centric WSNs has focused on tackling the challenges introduced by i) the definition of an individual mission's utility as a collective function of multiple sensor flows and ii) the dissemination of an individual sensor's data via a multicast tree to multiple consuming missions. However, the practical application and performance of this framework is influenced by several parameters internal to the framework and also by implementation-specific decisions, and this is made more complex by node mobility. In this paper, we use discrete-event simulations to study the effects of these parameters on the performance of the protocol in terms of speed of convergence, packet loss, and signaling overhead, thereby addressing the challenges posed by wireless interference and node mobility in ad-hoc battlefield scenarios. This study provides a better understanding of the issues involved in the practical adaptation of the NUM framework. It also helps identify potential avenues of improvement within the framework and protocol.
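
    For orientation, the classic dual-decomposition form of NUM can be sketched in a few lines: sources set their rates from the sum of link prices on their path, and links update prices by projected gradient ascent. The topology and capacities below are invented, and the paper's mission utilities, multicast trees and mobility are not modeled.

```python
# Dual-decomposition NUM sketch: maximize sum_s log(x_s) subject to link capacities.
import numpy as np

# routing matrix R[l, s] = 1 if source s uses link l; link capacities c
R = np.array([[1, 1, 0],
              [1, 0, 1]], dtype=float)
c = np.array([1.0, 2.0])
lam = np.ones(2)                      # link prices (dual variables)
step = 0.05

for _ in range(2000):
    price_per_source = R.T @ lam      # total price along each source's path
    x = 1.0 / price_per_source        # maximizer of log(x) - price * x
    lam = np.maximum(lam + step * (R @ x - c), 1e-6)   # projected price update

print("rates:", np.round(x, 3), "link loads:", np.round(R @ x, 3))
```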

  20. Near-optimality of special periodic protocols for fluid models of single server switched networks with switchover times

    NASA Astrophysics Data System (ADS)

    Matveev, A. S.; Ishchenko, R.

    2017-11-01

    We consider a generic deterministic time-invariant fluid model of a single server switched network, which consists of finitely many infinite-size buffers (queues) and receives constant-rate inflows of jobs from the outside. Any flow undergoes a multi-phase service, entering a specific buffer after every phase, and ultimately leaves the network; the route of the flow over the buffers is pre-specified, and flows may merge inside the network. They share a common source of service, which can serve at most one buffer at a time and has to switch among buffers from time to time; any switch consumes a nonzero switchover period. With respect to the long-run maximal scaled wip (work in progress) performance metric, near-optimality of periodic scheduling and service protocols is established: the deepest optimum (that is, over all feasible processes in the network, irrespective of the initial state) is furnished by such a protocol up to an arbitrarily small error. Moreover, this can be achieved with a special periodic protocol introduced in the paper. It is also shown that the exhaustive policy is optimal for any buffer whose service at the maximal rate does not cause growth of the scaled wip.

  1. Efficient protocols for Stirling heat engines at the micro-scale

    NASA Astrophysics Data System (ADS)

    Muratore-Ginanneschi, Paolo; Schwieger, Kay

    2015-10-01

    We investigate the thermodynamic efficiency of sub-micro-scale Stirling heat engines operating under the conditions described by overdamped stochastic thermodynamics. We show how to construct optimal protocols such that, at maximum power and for constant isotropic mobility, the efficiency attains the universal law η = 2ηC/(4 - ηC), where ηC is the efficiency of an ideal Carnot cycle. We show that these protocols are specified by the solution of an optimal mass transport problem. Such a solution can be determined explicitly using well-known Monge-Ampère-Kantorovich reconstruction algorithms. Furthermore, we show that the same law describes the efficiency of heat engines operating at maximum work over short time periods. Finally, we illustrate the straightforward extension of these results to cases when the mobility is anisotropic and temperature dependent.
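
    A quick numerical reading of the quoted efficiency-at-maximum-power law for a few hypothetical reservoir temperatures:

```python
# Evaluate eta = 2*eta_C / (4 - eta_C) for hypothetical hot/cold reservoir temperatures.
for T_hot, T_cold in [(400.0, 300.0), (600.0, 300.0), (1200.0, 300.0)]:
    eta_C = 1.0 - T_cold / T_hot                 # Carnot efficiency
    eta_max_power = 2.0 * eta_C / (4.0 - eta_C)  # efficiency at maximum power
    print(f"T_hot = {T_hot:.0f} K: eta_C = {eta_C:.3f}, eta at maximum power = {eta_max_power:.3f}")
```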

  2. Feeding Protocols for Neonates With Hypoplastic Left Heart Syndrome: A Review.

    PubMed

    Jenkins, Erin

    2015-01-01

    Optimizing nutrition in neonates with hypoplastic left heart syndrome is essential, given the high rate of growth failure in this population. Infants with hypoplastic left heart syndrome are predisposed to nutritional deficiency as a result of their increased metabolic demand; however, early enteral feeding also increases the risk of serious gastrointestinal morbidity and mortality caused by poor intestinal perfusion. Consequently, providers have difficulty deciding when and how to safely feed these patients. A review of the literature found that implementation of a structured enteral feeding protocol may decrease the risk of gastrointestinal complications while also minimizing dependence on parenteral nutrition and decreasing length of hospital stay. As these studies were limited, further research is warranted to establish a best practice feeding protocol to decrease risk and optimize nutrition in this fragile population.

  3. Global Profiling of Various Metabolites in Platycodon grandiflorum by UPLC-QTOF/MS.

    PubMed

    Lee, Jae Won; Ji, Seung-Heon; Kim, Geum-Soog; Song, Kyung-Sik; Um, Yurry; Kim, Ok Tae; Lee, Yi; Hong, Chang Pyo; Shin, Dong-Ho; Kim, Chang-Kug; Lee, Seung-Eun; Ahn, Young-Sup; Lee, Dae-Young

    2015-11-09

    In this study, a method of metabolite profiling based on UPLC-QTOF/MS was developed to analyze Platycodon grandiflorum. In the optimal UPLC, various metabolites, including major platycosides, were separated well in 15 min. The metabolite extraction protocols were also optimized by selecting a solvent for use in the study, the ratio of solvent to sample and sonication time. This method was used to profile two different parts of P. grandiflorum, i.e., the roots of P. grandiflorum (PR) and the stems and leaves of P. grandiflorum (PS), in the positive and negative ion modes. As a result, PR and PS showed qualitatively and quantitatively different metabolite profiles. Furthermore, their metabolite compositions differed according to individual plant samples. These results indicate that the UPLC-QTOF/MS-based profiling method is a good tool to analyze various metabolites in P. grandiflorum. This metabolomics approach can also be applied to evaluate the overall quality of P. grandiflorum, as well as to discriminate the cultivars for the medicinal plant industry.

  4. Experimental Eavesdropping Based on Optimal Quantum Cloning

    NASA Astrophysics Data System (ADS)

    Bartkiewicz, Karol; Lemr, Karel; Černoch, Antonín; Soubusta, Jan; Miranowicz, Adam

    2013-04-01

    The security of quantum cryptography is guaranteed by the no-cloning theorem, which implies that an eavesdropper copying transmitted qubits in unknown states causes their disturbance. Nevertheless, in real cryptographic systems some level of disturbance has to be allowed to cover, e.g., transmission losses. An eavesdropper can attack such systems by replacing a noisy channel by a better one and by performing approximate cloning of transmitted qubits which disturb them but below the noise level assumed by legitimate users. We experimentally demonstrate such symmetric individual eavesdropping on the quantum key distribution protocols of Bennett and Brassard (BB84) and the trine-state spherical code of Renes (R04) with two-level probes prepared using a recently developed photonic multifunctional quantum cloner [Lemr et al., Phys. Rev. A 85, 050307(R) (2012)PLRAAN1050-2947]. We demonstrated that our optimal cloning device with high-success rate makes the eavesdropping possible by hiding it in usual transmission losses. We believe that this experiment can stimulate the quest for other operational applications of quantum cloning.

  5. Global Profiling of Various Metabolites in Platycodon grandiflorum by UPLC-QTOF/MS

    PubMed Central

    Lee, Jae Won; Ji, Seung-Heon; Kim, Geum-Soog; Song, Kyung-Sik; Um, Yurry; Kim, Ok Tae; Lee, Yi; Hong, Chang Pyo; Shin, Dong-Ho; Kim, Chang-Kug; Lee, Seung-Eun; Ahn, Young-Sup; Lee, Dae-Young

    2015-01-01

    In this study, a method of metabolite profiling based on UPLC-QTOF/MS was developed to analyze Platycodon grandiflorum. In the optimal UPLC, various metabolites, including major platycosides, were separated well in 15 min. The metabolite extraction protocols were also optimized by selecting a solvent for use in the study, the ratio of solvent to sample and sonication time. This method was used to profile two different parts of P. grandiflorum, i.e., the roots of P. grandiflorum (PR) and the stems and leaves of P. grandiflorum (PS), in the positive and negative ion modes. As a result, PR and PS showed qualitatively and quantitatively different metabolite profiles. Furthermore, their metabolite compositions differed according to individual plant samples. These results indicate that the UPLC-QTOF/MS-based profiling method is a good tool to analyze various metabolites in P. grandiflorum. This metabolomics approach can also be applied to evaluate the overall quality of P. grandiflorum, as well as to discriminate the cultivars for the medicinal plant industry. PMID:26569219

  6. Optimizing the flow of care for prevention and treatment of deep vein thrombosis and pulmonary embolism.

    PubMed

    Ecklund, M M

    1995-11-01

    Critically ill patients have multiple risk factors for deep vein thrombosis and pulmonary embolism. The majority of patients with pulmonary embolism have a lower extremity deep vein thrombosis as a source of origin. Pulmonary embolism causes a high mortality rate in the hemodynamically compromised individual. Awareness of risk factors relative to the development of deep vein thrombosis and pulmonary embolism is important for the critical care nurse. Understanding the pathophysiology can help guide prophylaxis and treatment plans. The therapies, from invasive to mechanical, all carry risks and benefits, and are weighed for each patient. The advanced practice nurse, whether in the direct or indirect role, has an opportunity to impact the care of the high risk patient. Options range from teaching the nurse who is new to critical care, to teaching patients and families. Development of multidisciplinary protocols and clinical pathways are ways to impact the standard of care. Improved delivery of care methods can optimize the care rendered in an ever changing field of critical care.

  7. Game-theoretic perspective of Ping-Pong protocol

    NASA Astrophysics Data System (ADS)

    Kaur, Hargeet; Kumar, Atul

    2018-01-01

    We analyse the Ping-Pong (PP) protocol from the point of view of a game. The analysis helps us understand the different strategies of a sender and an eavesdropper for gaining the maximum payoff in the game. The study presented here characterizes strategies that lead to different Nash equilibria. We further demonstrate the condition for Pareto optimality depending on the parameters used in the game. Moreover, we also analyse the LM05 protocol and compare it with the PP protocol from the point of view of a generic two-way QKD game with or without entanglement. Our results provide a deeper understanding of general two-way QKD protocols in terms of the security and payoffs of the different stakeholders in the protocol.
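
    Purely as an illustration of the game-theoretic vocabulary used above, the snippet below finds pure-strategy Nash equilibria of a 2 x 2 sender-eavesdropper bimatrix game by exhaustive best-response checking; the payoff numbers are invented and do not reproduce the Ping-Pong game analysed in the paper.

```python
# Exhaustive search for pure-strategy Nash equilibria in a hypothetical
# 2x2 bimatrix game (rows: sender strategy, columns: eavesdropper strategy).
import numpy as np

sender_payoff = np.array([[3.0, 1.0],
                          [2.0, 2.5]])
eve_payoff = np.array([[2.0, 0.5],
                       [1.0, 1.5]])

equilibria = []
for i in range(2):
    for j in range(2):
        sender_best = sender_payoff[i, j] >= sender_payoff[:, j].max()  # best reply in column j
        eve_best = eve_payoff[i, j] >= eve_payoff[i, :].max()           # best reply in row i
        if sender_best and eve_best:
            equilibria.append((i, j))

print("pure-strategy Nash equilibria (sender, eavesdropper):", equilibria)
```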

  8. Optimizing the high-resolution manometry (HRM) study protocol.

    PubMed

    Patel, A; Ding, A; Mirza, F; Gyawali, C P

    2015-02-01

    Intolerance of the esophageal manometry catheter may prolong high-resolution manometry (HRM) studies and increase patient distress. We assessed the impact of obtaining the landmark phase at the end of the study, when the patient has acclimatized to the HRM catheter. 366 patients (mean age 55.4 ± 0.8 years, 62.0% female) undergoing esophageal HRM over a 1-year period were studied. The standard protocol consisted of the landmark phase, ten 5-mL water swallows 20-30 s apart, and multiple rapid swallows in which four to six 2-mL swallows were administered in rapid succession. The modified protocol consisted of the landmark phase at the end of the study, after the test swallows. Study duration, technical characteristics, indications, and motor findings were compared between standard and modified protocols. Of the 366 patients, 89.6% underwent the standard protocol (study duration 12.9 ± 0.3 min). In the 10.4% with poor catheter tolerance undergoing the modified protocol, study duration was significantly longer (15.6 ± 1.0 min, p = 0.004) despite a similar duration of study maneuvers. Only elevated upper esophageal sphincter basal pressures at the beginning of the study segregated modified protocol patients. The 95th percentile time to the landmark phase in the standard protocol patients was 6.1 min; as many as 31.4% of modified protocol patients could not obtain their first study maneuver within this period (p = 0.0003). Interpretation was not impacted by shifting the landmark phase to the end of the study. Modification of the HRM study protocol with the landmark phase obtained at the end of the study optimizes study duration without compromising quality. © 2014 John Wiley & Sons Ltd.
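
    A back-of-the-envelope check of the reported duration difference can be made from the published summary statistics alone, assuming the quoted uncertainties are standard errors and using the 89.6%/10.4% split of 366 patients; the authors' p value was computed on the raw data, so this approximation need not match it.

```python
# Welch-style comparison of two group means from summary statistics
# (assumed mean +/- SEM and approximate group sizes).
import math
from scipy import stats

m1, se1, n1 = 12.9, 0.3, round(0.896 * 366)   # standard protocol
m2, se2, n2 = 15.6, 1.0, round(0.104 * 366)   # modified protocol

t_stat = (m2 - m1) / math.sqrt(se1**2 + se2**2)
# Welch-Satterthwaite degrees of freedom expressed with standard errors
df = (se1**2 + se2**2)**2 / (se1**4 / (n1 - 1) + se2**4 / (n2 - 1))
p = 2 * stats.t.sf(abs(t_stat), df)
print(f"t = {t_stat:.2f}, df = {df:.1f}, two-sided p = {p:.3f}")
```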

  9. Bio-Mimic Optimization Strategies in Wireless Sensor Networks: A Survey

    PubMed Central

    Adnan, Md. Akhtaruzzaman; Razzaque, Mohammd Abdur; Ahmed, Ishtiaque; Isnin, Ismail Fauzi

    2014-01-01

    For the past 20 years, many authors have focused their investigations on wireless sensor networks. Various issues related to wireless sensor networks such as energy minimization (optimization), compression schemes, self-organizing network algorithms, routing protocols, quality of service management, security, energy harvesting, etc., have been extensively explored. The three most important issues among these are energy efficiency, quality of service and security management. To get the best possible results for one or more of these issues in wireless sensor networks, optimization is necessary. Furthermore, in a number of applications (e.g., body area sensor networks, vehicular ad hoc networks) these issues might conflict and require a trade-off amongst them. Due to the high energy consumption and data processing requirements, the use of classical algorithms has historically been disregarded. In this context, contemporary researchers started using bio-mimetic strategy-based optimization techniques in the field of wireless sensor networks. These techniques are diverse and involve many different optimization algorithms. As far as we know, most existing works tend to focus only on optimization of one specific issue of the three mentioned above. It is high time that these individual efforts are put into perspective and a more holistic view is taken. In this paper we take a step in that direction by presenting a survey of the literature in the area of wireless sensor network optimization concentrating especially on the three most widely used bio-mimetic algorithms, namely, particle swarm optimization, ant colony optimization and genetic algorithms. In addition, to stimulate new research and development interests in this field, open research issues, challenges and future research directions are highlighted. PMID:24368702

  10. Method optimization for fathead minnow (Pimephales promelas) liver S9 isolation

    EPA Science Inventory

    Standard protocols have been proposed to assess metabolic stability in rainbow trout liver S9 fractions. Using in vitro substrate depletion assays, in vitro intrinsic clearance rates can be calculated for a variety of study compounds. Existing protocols suggest potential adaptati...

  11. Practical considerations for optimizing cardiac computed tomography protocols for comprehensive acquisition prior to transcatheter aortic valve replacement.

    PubMed

    Khalique, Omar K; Pulerwitz, Todd C; Halliburton, Sandra S; Kodali, Susheel K; Hahn, Rebecca T; Nazif, Tamim M; Vahl, Torsten P; George, Isaac; Leon, Martin B; D'Souza, Belinda; Einstein, Andrew J

    2016-01-01

    Transcatheter aortic valve replacement (TAVR) is performed frequently in patients with severe, symptomatic aortic stenosis who are at high risk or inoperable for open surgical aortic valve replacement. Computed tomography angiography (CTA) has become the gold standard imaging modality for pre-TAVR cardiac anatomic and vascular access assessment. Traditionally, cardiac CTA has been most frequently used for assessment of coronary artery stenosis, and scanning protocols have generally been tailored for this purpose. Pre-TAVR CTA has different goals than coronary CTA and the high prevalence of chronic kidney disease in the TAVR patient population creates a particular need to optimize protocols for a reduction in iodinated contrast volume. This document reviews details which allow the physician to tailor CTA examinations to maximize image quality and minimize harm, while factoring in multiple patient and scanner variables which must be considered in customizing a pre-TAVR protocol. Copyright © 2016 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.

  12. Experimental Optimal Single Qubit Purification in an NMR Quantum Information Processor

    PubMed Central

    Hou, Shi-Yao; Sheng, Yu-Bo; Feng, Guan-Ru; Long, Gui-Lu

    2014-01-01

    High-quality single qubits are the building blocks of quantum information processing, but they are vulnerable to environmental noise. To overcome noise, purification techniques, which generate qubits with higher purities from qubits with lower purities, have been proposed. Purification has attracted much interest and been widely studied. However, the full experimental demonstration of an optimal single qubit purification protocol proposed by Cirac, Ekert and Macchiavello [Phys. Rev. Lett. 82, 4344 (1999), the CEM protocol] more than one and a half decades ago still remains an experimental challenge, as it requires more complicated networks and a higher level of precision control. In this work, we design an experiment scheme that realizes the CEM protocol with explicit symmetrization of the wave functions. The purification scheme was successfully implemented in a nuclear magnetic resonance quantum information processor. The experiment fully demonstrated the purification protocol, and showed that it is an effective way of protecting qubits against errors and decoherence. PMID:25358758

  13. Optimization of image quality in pulmonary CT angiography with low dose of contrast material

    NASA Astrophysics Data System (ADS)

    Assi, Abed Al Nasser; Abu Arra, Ali

    2017-06-01

    Aim: The aim of this study was to compare objective image quality data for pulmonary embolism patients between a conventional pulmonary CTA protocol and a novel acquisition protocol performed with an optimized radiation dose and a smaller amount of iodinated contrast medium injected during PE scanning. Materials and Methods: Sixty-four patients with suspected pulmonary embolism (PE) were examined using an angio-CT protocol. Patients were randomly assigned to two groups: group A (16 women and 16 men; mean age, 62 years; standard deviation, 16; range, 19-89 years) received 35-40 ml of contrast agent, and group B (16 women and 16 men; age range, 28-86 years) received 70-80 ml. Other scanning parameters were kept constant. Pulmonary vessel enhancement and image noise were quantified; signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) were calculated. Subjective vessel contrast was assessed by two radiologists in consensus. Results: A total of 14 cases of PE (22%) were found in the evaluated subjects (nine in group A, and five in group B). All PE cases were detected by the two readers. There was no significant difference in the size or location of the PEs between the two groups; the average image noise was 14 HU for group A and 19 HU for group B, a difference that was not statistically significant (p = 0.09). Overall, the SNR and CNR were slightly higher in group B (24.4 and 22.5, respectively) compared with group A (19.4 and 16.4, respectively), but these differences were not statistically significant (p = 0.71 and p = 0.35, respectively). Conclusion and Discussion: Both pulmonary CTA protocols achieved similar image quality, while the new protocol used an optimized care dose and reduced the contrast volume by 50% compared with the conventional protocol.

  14. Achievable rate maximization for decode-and-forward MIMO-OFDM networks with an energy harvesting relay.

    PubMed

    Du, Guanyao; Yu, Jianjun

    2016-01-01

    This paper investigates the system achievable rate for the multiple-input multiple-output orthogonal frequency division multiplexing (MIMO-OFDM) system with an energy harvesting (EH) relay. First, we propose two protocols, time switching-based decode-and-forward relaying (TSDFR) and a flexible power splitting-based DF relaying (PSDFR) protocol, by considering two practical receiver architectures, to enable simultaneous information processing and energy harvesting at the relay. In the PSDFR protocol, we introduce a temporal parameter to describe the time division pattern between the two phases, which makes the protocol more flexible and general. In order to explore the system performance limit, we discuss the system achievable rate theoretically and formulate two optimization problems for the proposed protocols to maximize the system achievable rate. Since the problems are non-convex and difficult to solve, we first analyze them theoretically and obtain some explicit results, then design an augmented Lagrangian penalty function (ALPF) based algorithm for them. Numerical results are provided to validate the accuracy of our analytical results and the effectiveness of the proposed ALPF algorithm. It is shown that PSDFR outperforms TSDFR in achieving a higher achievable rate in such a MIMO-OFDM relaying system. We also investigate the impacts of the relay location, the number of antennas and the number of subcarriers on the system performance. Specifically, it is shown that the relay position greatly affects the system performance of both protocols, and a relatively worse achievable rate is obtained when the relay is placed midway between the source and the destination. This is different from the MIMO-OFDM DF relaying system without EH. Moreover, the optimal factor which indicates the time division pattern between the two phases in the PSDFR protocol is always above 0.8, which means that the common division of the total transmission time into two equal phases in previous work applying the PS-based receiver is not optimal.
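
    A heavily simplified, single-antenna abstraction of the power-splitting idea is sketched below: a grid search over the time-division parameter alpha and the power-splitting ratio rho maximizes a two-hop decode-and-forward rate. Channel gains, noise level and harvesting efficiency are invented, and the paper's MIMO-OFDM formulation and ALPF algorithm are not reproduced.

```python
# Grid search over (alpha, rho) for a toy power-splitting DF relay link.
import numpy as np

P, h1, h2, N0, eh_eff = 1.0, 0.8, 0.6, 0.01, 0.7   # hypothetical link parameters

def df_rate(alpha, rho):
    """End-to-end DF rate: min of source->relay and relay->destination rates."""
    r1 = alpha * np.log2(1 + (1 - rho) * P * h1 / N0)        # decoding branch, phase 1
    p_relay = eh_eff * rho * P * h1 * alpha / (1 - alpha)    # harvested energy spread over phase 2
    r2 = (1 - alpha) * np.log2(1 + p_relay * h2 / N0)        # relay transmission, phase 2
    return min(r1, r2)

alphas = np.linspace(0.05, 0.95, 91)
rhos = np.linspace(0.01, 0.99, 99)
best = max(((df_rate(a, r), a, r) for a in alphas for r in rhos))
print(f"best rate {best[0]:.3f} bit/s/Hz at alpha = {best[1]:.2f}, rho = {best[2]:.2f}")
```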

  15. Architectural and engineering issues for building an optical Internet

    NASA Astrophysics Data System (ADS)

    St. Arnaud, Bill

    1998-10-01

    Recent developments in high density Wave Division Multiplexing (WDM) fiber systems allow for the deployment of a dedicated optical Internet network for large volume backbone pipes that does not require an underlying multi-service SONET/SDH and ATM transport protocol. Some intrinsic characteristics of Internet traffic, such as its self-similar nature, server-bound congestion, and routing and data asymmetry, allow for highly optimized traffic-engineered networks using individual wavelengths. By transmitting Gigabit Ethernet or SONET/SDH frames natively over WDM wavelengths that directly interconnect high performance routers, the original concept of the Internet as an intrinsically survivable datagram network becomes possible. Traffic engineering, restoral, protection and bandwidth management of the network must now be carried out at the IP layer, so new routing or switching protocols such as MPLS that allow for uni-directional paths with fast restoral and protection at the IP layer become essential for a reliable production network. The deployment of high density WDM municipal and campus networks also gives carriers and ISPs the flexibility to offer customers an integrated and seamless set of optical Internet services.

  16. Thermally coupled moving boundary model for charge-discharge of LiFePO4/C cells

    NASA Astrophysics Data System (ADS)

    Khandelwal, Ashish; Hariharan, Krishnan S.; Gambhire, Priya; Kolake, Subramanya Mayya; Yeo, Taejung; Doo, Seokgwang

    2015-04-01

    Optimal thermal management is a key requirement for the commercial utilization of lithium-ion batteries comprising phase change electrodes. To facilitate the design of battery packs, thermal management systems and fast charging profiles, a thermally coupled electrochemical model that takes the phase change phenomenon into account is required. In the present work, an electrochemical thermal model is proposed which includes the biphasic nature of phase change electrodes, such as lithium iron phosphate (LFP), via a generalized moving boundary model. The contribution of phase change to the heat released during cell operation is modeled using an equivalent enthalpy approach. The heat released due to phase transformation is analyzed in comparison with other sources of heat, such as reversible, irreversible and ohmic heat. A detailed study of the thermal behavior of the individual cell components with changing ambient temperature, rate of operation and heat transfer coefficient is carried out. Analysis of heat generation in the various regimes is used to develop cell design and operating guidelines. Further, different charging protocols are analyzed and a model-based methodology is suggested to design an efficient quick charging protocol.
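
    Illustrative note (not from the paper): the lumped-parameter sketch below evaluates the individual heat-source terms discussed above, namely irreversible, ohmic, reversible (entropic), and an equivalent-enthalpy phase-change term. The expressions follow the common Bernardi-type decomposition; sign conventions may differ from the paper's model, and all parameter values are hypothetical.

      # Lumped heat-generation terms for a phase-change (LFP) cell.
      # Textbook Bernardi-type decomposition with an equivalent-enthalpy term for
      # the phase transformation; values and signs are illustrative only.

      def heat_terms(I, V, U_ocv, dUdT, T, R_ohm, dx_dt, dH_phase):
          """Return a dict of heat sources in watts (discharge current I > 0)."""
          q_irr = I * (U_ocv - V) - I**2 * R_ohm   # kinetic/concentration overpotential losses
          q_ohm = I**2 * R_ohm                     # ohmic (contact + electrolyte) heating
          q_rev = -I * T * dUdT                    # reversible (entropic) heat
          q_phase = dx_dt * dH_phase               # equivalent enthalpy of the two-phase front
          return {"irreversible": q_irr, "ohmic": q_ohm,
                  "reversible": q_rev, "phase_change": q_phase,
                  "total": q_irr + q_ohm + q_rev + q_phase}

      print(heat_terms(I=2.0, V=3.20, U_ocv=3.30, dUdT=-1e-4, T=298.0,
                       R_ohm=0.02, dx_dt=1.5e-4, dH_phase=50.0))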

  17. Feasibility of Providing Safe Mouth Care and Collecting Oral and Fecal Microbiome Samples from Nursing Home Residents with Dysphagia: Proof of Concept Study.

    PubMed

    Jablonski, Rita A; Winstead, Vicki; Azuero, Andres; Ptacek, Travis; Jones-Townsend, Corteza; Byrd, Elizabeth; Geisinger, Maria L; Morrow, Casey

    2017-09-01

    Individuals with dysphagia who reside in nursing homes often receive inadequate mouth care and experience poor oral health. From a policy perspective, the combination of absent evidence-based mouth care protocols and insufficient dental coverage creates a pool of individuals at great risk for preventable infectious illnesses that contribute to high health care costs. The purpose of the current study was to determine (a) the safety of a mouth care protocol tailored for individuals with dysphagia residing in nursing homes without access to suction equipment, and (b) the feasibility of collecting oral and fecal samples for microbiota analyses. The mouth care protocol resulted in improved oral hygiene without aspiration, and oral and fecal samples were safely collected from participants. Policies supporting ongoing testing of evidence-based mouth care protocols for individuals with dysphagia are important to improve quality, demonstrate efficacy, and save health care costs. [Journal of Gerontological Nursing, 43(9), 9-15.]. Copyright 2017, SLACK Incorporated.

  18. Robot-assisted upper extremity rehabilitation for cervical spinal cord injuries: a systematic scoping review.

    PubMed

    Singh, Hardeep; Unger, Janelle; Zariffa, José; Pakosh, Maureen; Jaglal, Susan; Craven, B Catharine; Musselman, Kristin E

    2018-01-15

    Purpose: To provide an overview of the feasibility and outcomes of robot-assisted upper extremity training for individuals with cervical spinal cord injury (SCI), and to identify gaps in current research and articulate future research directions. A systematic search was conducted using Medline, Embase, PsycINFO, CCTR, CDSR, CINAHL and PubMed on June 7, 2017. Search terms covered three themes: (1) robotics; (2) SCI; (3) upper extremity. Studies using robots for upper extremity rehabilitation among individuals with cervical SCI were included. Identified articles were independently reviewed by two researchers and compared to pre-specified criteria. Disagreements regarding article inclusion were resolved through discussion. The modified Downs and Black checklist was used to assess article quality. Participant characteristics, study and intervention details, training outcomes, robot features, study limitations and recommendations for future studies were abstracted from the included articles. Twelve articles (one randomized clinical trial, six case series, five case studies) met the inclusion criteria. Five robots were exoskeletons and three were end-effectors. Sample sizes ranged from 1 to 17 subjects. Articles had variable quality, with quality scores ranging from 8 to 20. Studies had low internal validity, primarily owing to the lack of blinding or a control group. Individuals with mild-to-moderate impairments showed the greatest improvements on body structure/function and performance-level measures. This review is limited by the small number of articles, low sample sizes, and the diversity of devices, their associated training protocols, and outcome measures. Preliminary evidence suggests robot-assisted interventions are safe, feasible and can reduce the active assistance provided by therapists. Implications for rehabilitation: Robot-assisted upper extremity training for individuals with cervical spinal cord injury is safe, feasible and can reduce hands-on assistance provided by therapists. Future research in robotics rehabilitation with individuals with spinal cord injury is needed to determine the optimal device and training protocol as well as effectiveness.

  19. Frozen embryo transfer: a review on the optimal endometrial preparation and timing.

    PubMed

    Mackens, S; Santos-Ribeiro, S; van de Vijver, A; Racca, A; Van Landuyt, L; Tournaye, H; Blockeel, C

    2017-11-01

    What is the optimal endometrial preparation protocol for a frozen embryo transfer (FET)? Although the optimal endometrial preparation protocol for FET needs further research and is yet to be determined, we propose a standardized timing strategy based on the currently available evidence which could assist in the harmonization and comparability of clinical practice and future trials. Amid a continuous increase in the number of FET cycles, determining the optimal endometrial preparation protocol has become paramount to maximize ART success. In current daily practice, different FET preparation methods and timing strategies are used. This is a review of the current literature on FET preparation methods, with special attention to the timing of the embryo transfer. Literature on the topic was retrieved in PubMed and references from relevant articles were investigated until June 2017. The number of high quality randomized controlled trials (RCTs) is scarce and, hence, the evidence for the best protocol for FET is poor. Future research should compare both the pregnancy and neonatal outcomes between HRT and true natural cycle (NC) FET. In terms of embryo transfer timing, we propose to start progesterone intake on the theoretical day of oocyte retrieval in HRT and to perform blastocyst transfer at hCG + 7 or LH + 6 in modified or true NC, respectively. As only a few high quality RCTs on the optimal preparation for FET are available in the existing literature, no definitive conclusion for benefit of one protocol over the other can be drawn so far. Caution when using HRT for FET is warranted since the rate of early pregnancy loss is alarmingly high in some reports. S.M. is funded by the Research Fund of Flanders (FWO). H.T. and C.B. report grants from Merck, Goodlife, Besins and Abbott during the conduct of the study. Not applicable. © The Author 2017. Published by Oxford University Press on behalf of the European Society of Human Reproduction and Embryology. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  20. Optimizing Urine Processing Protocols for Protein and Metabolite Detection.

    PubMed

    Siddiqui, Nazema Y; DuBois, Laura G; St John-Williams, Lisa; Thompson, J Will; Grenier, Carole; Burke, Emily; Fraser, Matthew O; Amundsen, Cindy L; Murphy, Susan K

    In urine, factors such as timing of voids and duration at room temperature (RT) may affect the quality of recovered protein and metabolite data. Additives may aid with detection, but can add more complexity in sample collection or analysis. We aimed to identify the optimal urine processing protocol for clinically obtained urine samples that allows for the highest protein and metabolite yields with minimal degradation. Healthy women provided multiple urine samples during the same day. Women collected their first morning (1st AM) void and another "random void". Random voids were aliquoted with: 1) no additive; 2) boric acid (BA); 3) protease inhibitor (PI); or 4) both BA + PI. Of these aliquots, some were immediately stored at 4°C, and some were left at RT for 4 hours. Proteins and individual metabolites were quantified, normalized to creatinine concentrations, and compared across processing conditions. Sample pools corresponding to each processing condition were analyzed using mass spectrometry to assess protein degradation. Ten Caucasian women between 35-65 years of age provided paired 1st morning and random voided urine samples. Normalized protein concentrations were slightly higher in 1st AM voids compared to random "spot" voids. The addition of BA did not significantly change protein yields, while PI significantly improved normalized protein concentrations, regardless of whether samples were immediately cooled or left at RT for 4 hours. In pooled samples, there were minimal differences in protein degradation under the various conditions we tested. In metabolite analyses, there were significant differences in individual amino acids based on the timing of the void. For comparative translational research using urine, information about void timing should be collected and standardized. For urine samples processed in the same day, BA does not appear to be necessary, while the addition of PI enhances protein yields, regardless of 4°C or RT storage temperature.

  1. High-Performance CCSDS AOS Protocol Implementation in FPGA

    NASA Technical Reports Server (NTRS)

    Clare, Loren P.; Torgerson, Jordan L.; Pang, Jackson

    2010-01-01

    The Consultative Committee for Space Data Systems (CCSDS) Advanced Orbiting Systems (AOS) space data link protocol provides a framing layer between channel coding such as LDPC (low-density parity-check) and higher-layer link multiplexing protocols such as CCSDS Encapsulation Service, which is described in the following article. Recent advancement in RF modem technology has allowed multi-megabit transmission over space links. With this increase in data rate, the CCSDS AOS protocol implementation needs to be optimized to both reduce energy consumption and operate at a high rate.
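
    Illustrative note (not from the report): to make the framing-layer role concrete, the sketch below packs a simplified AOS transfer-frame primary header (version number, spacecraft ID, virtual channel ID, 24-bit frame count, signaling octet). The field layout follows a reading of CCSDS 732.0 and should be verified against the Blue Book; it is unrelated to the FPGA implementation described in the record.

      # Simplified packing of a CCSDS AOS transfer-frame primary header (6 octets).
      # Field layout per a reading of CCSDS 732.0; verify against the Blue Book.

      def aos_primary_header(scid, vcid, frame_count, replay=False,
                             vc_count_usage=False, count_cycle=0):
          """Return the 6-byte AOS primary header as bytes."""
          assert 0 <= scid < 256 and 0 <= vcid < 64 and 0 <= frame_count < 2**24
          version = 0b01                                   # AOS transfer frame version number
          first16 = (version << 14) | (scid << 6) | vcid   # master channel ID + VCID
          signaling = (int(replay) << 7) | (int(vc_count_usage) << 6) | (count_cycle & 0x0F)
          return (first16.to_bytes(2, "big")
                  + frame_count.to_bytes(3, "big")
                  + bytes([signaling]))

      hdr = aos_primary_header(scid=0x42, vcid=5, frame_count=1234)
      print(hdr.hex())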

  2. Gaussian error correction of quantum states in a correlated noisy channel.

    PubMed

    Lassen, Mikael; Berni, Adriano; Madsen, Lars S; Filip, Radim; Andersen, Ulrik L

    2013-11-01

    Noise is the main obstacle for the realization of fault-tolerant quantum information processing and secure communication over long distances. In this work, we propose a communication protocol relying on simple linear optics that optimally protects quantum states from non-Markovian or correlated noise. We implement the protocol experimentally and demonstrate the near-ideal protection of coherent and entangled states in an extremely noisy channel. Since all real-life channels are exhibiting pronounced non-Markovian behavior, the proposed protocol will have immediate implications in improving the performance of various quantum information protocols.

  3. Determining the optimal load for jump squats: a review of methods and calculations.

    PubMed

    Dugan, Eric L; Doyle, Tim L A; Humphries, Brendan; Hasson, Christopher J; Newton, Robert U

    2004-08-01

    There has been an increasing volume of research focused on the load that elicits maximum power output during jump squats. Because of a lack of standardization for data collection and analysis protocols, results of much of this research are contradictory. The purpose of this paper is to examine why differing methods of data collection and analysis can lead to conflicting results for maximum power and associated optimal load. Six topics relevant to measurement and reporting of maximum power and optimal load are addressed: (a) data collection equipment, (b) inclusion or exclusion of body weight force in calculations of power, (c) free weight versus Smith machine jump squats, (d) reporting of average versus peak power, (e) reporting of load intensity, and (f) instructions given to athletes/participants. Based on this information, a standardized protocol for data collection and reporting of jump squat power and optimal load is presented.
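
    Illustrative note (not from the review): item (b) above, whether body weight is included in the power calculation, can be made concrete with a short sketch that computes peak and average power from a hypothetical force-plate trace both ways, using the usual impulse-momentum integration of vertical force. The force profile and masses are invented for illustration and this is not the standardized protocol the authors propose.

      import numpy as np

      # Peak/average power from vertical ground reaction force during a jump squat.
      # Hypothetical 1 kHz force trace; illustrates the "body weight in or out" issue.

      def jump_power(f_grf, mass_body, mass_bar, dt=0.001, g=9.81):
          m_sys = mass_body + mass_bar
          a = f_grf / m_sys - g                 # system acceleration from GRF
          v = np.cumsum(a) * dt                 # velocity by numerical integration (starts at rest)
          p_system = f_grf * v                  # power applied to body + bar (body weight included)
          p_bar_only = mass_bar * (a + g) * v   # power attributed to the bar alone
          return {"peak_system": p_system.max(), "mean_system": p_system.mean(),
                  "peak_bar_only": p_bar_only.max(), "mean_bar_only": p_bar_only.mean()}

      t = np.arange(0, 0.5, 0.001)
      f = (80 + 40) * 9.81 + 900 * np.sin(np.pi * t / 0.5)   # crude propulsive force profile (N)
      print(jump_power(f, mass_body=80, mass_bar=40))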

  4. Design and testing of regulatory cassettes for optimal activity in skeletal and cardiac muscles.

    PubMed

    Himeda, Charis L; Chen, Xiaolan; Hauschka, Stephen D

    2011-01-01

    Gene therapy for muscular dystrophies requires efficient gene delivery to the striated musculature and specific, high-level expression of the therapeutic gene in a physiologically diverse array of muscles. This can be achieved by the use of recombinant adeno-associated virus vectors in conjunction with muscle-specific regulatory cassettes. We have constructed several generations of regulatory cassettes based on the enhancer and promoter of the muscle creatine kinase gene, some of which include heterologous enhancers and individual elements from other muscle genes. Since the relative importance of many control elements varies among different anatomical muscles, we are aiming to tailor these cassettes for high-level expression in cardiac muscle, and in fast and slow skeletal muscles. With the achievement of efficient intravascular gene delivery to isolated limbs, selected muscle groups, and heart in large animal models, the design of cassettes optimized for activity in different muscle types is now a practical goal. In this protocol, we outline the key steps involved in the design of regulatory cassettes for optimal activity in skeletal and cardiac muscle, and testing in mature muscle fiber cultures. The basic principles described here can also be applied to engineering tissue-specific regulatory cassettes for other cell types.

  5. Security of two-state and four-state practical quantum bit-commitment protocols

    NASA Astrophysics Data System (ADS)

    Loura, Ricardo; Arsenović, Dušan; Paunković, Nikola; Popović, Duška B.; Prvanović, Slobodan

    2016-12-01

    We study cheating strategies against a practical four-state quantum bit-commitment protocol [A. Danan and L. Vaidman, Quant. Info. Proc. 11, 769 (2012)], 10.1007/s11128-011-0284-4 and its two-state variant [R. Loura et al., Phys. Rev. A 89, 052336 (2014)], 10.1103/PhysRevA.89.052336 when the underlying quantum channels are noisy and the cheating party is constrained to using single-qubit measurements only. We show that simply inferring the transmitted photons' states by using the Breidbart basis, optimal for ambiguous (minimum-error) state discrimination, does not directly produce an optimal cheating strategy for this bit-commitment protocol. We introduce a strategy, based on certain postmeasurement processes and show it to have better chances at cheating than the direct approach. We also study to what extent sending forged geographical coordinates helps a dishonest party in breaking the binding security requirement. Finally, we investigate the impact of imperfect single-photon sources in the protocols. Our study shows that, in terms of the resources used, the four-state protocol is advantageous over the two-state version. The analysis performed can be straightforwardly generalized to any finite-qubit measurement, with the same qualitative results.

  6. An Energy Balanced and Lifetime Extended Routing Protocol for Underwater Sensor Networks.

    PubMed

    Wang, Hao; Wang, Shilian; Zhang, Eryang; Lu, Luxi

    2018-05-17

    Energy limitation is a critical constraint in designing routing protocols for underwater sensor networks (UWSNs). To prolong the network lifetime with limited battery power, an energy balanced and efficient routing protocol, called the energy balanced and lifetime extended routing protocol (EBLE), is proposed in this paper. The proposed EBLE not only balances traffic loads according to residual energy, but also optimizes data transmissions by selecting low-cost paths. The EBLE data transmission process operates in two phases: (1) a candidate forwarding set selection phase and (2) a data transmission phase. In the candidate forwarding set selection phase, nodes update candidate forwarding nodes by broadcasting their position and residual energy level information. The cost value of available nodes is calculated and stored in each sensor node. Then, in the data transmission phase, paths with high residual energy and relatively low cost are selected based on the cost function and residual energy level information. We also present a detailed analysis of optimal energy consumption in UWSNs. Numerical simulation results on a variety of node distributions and data load distributions show that EBLE outperforms other routing protocols (BTM, BEAR and direct transmission) in terms of network lifetime and energy efficiency.
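
    Illustrative note (not from the paper): the abstract does not give EBLE's cost function explicitly, so the sketch below only illustrates the general idea of energy-balanced, cost-based next-hop selection: each candidate neighbour is scored by a weighted combination of residual energy and distance progress toward the sink, and the lowest-cost neighbour is chosen. The cost form and weights are assumptions, not the paper's formula.

      import math

      # Illustrative energy-balanced next-hop selection for an underwater sensor node.
      # The cost form and weights are assumptions; EBLE's actual cost function is
      # not reproduced here.

      def next_hop(node_pos, sink_pos, candidates, w_energy=0.6, w_dist=0.4):
          """candidates: list of dicts with 'id', 'pos' (x, y, z), 'residual', 'capacity'."""
          d_self = math.dist(node_pos, sink_pos)
          best, best_cost = None, float("inf")
          for c in candidates:
              progress = d_self - math.dist(c["pos"], sink_pos)   # advance toward the sink
              if progress <= 0:
                  continue                                        # never route away from the sink
              energy_term = 1.0 - c["residual"] / c["capacity"]   # prefer high residual energy
              dist_term = 1.0 - progress / d_self                 # prefer large progress
              cost = w_energy * energy_term + w_dist * dist_term
              if cost < best_cost:
                  best, best_cost = c["id"], cost
          return best

      cands = [{"id": "n1", "pos": (50, 0, 40), "residual": 30, "capacity": 100},
               {"id": "n2", "pos": (60, 10, 30), "residual": 80, "capacity": 100}]
      print(next_hop((40, 0, 80), (100, 0, 0), cands))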

  7. SU-F-I-46: Optimizing Dose Reduction in Adult Head CT Protocols While Maintaining Image Quality in Postmortem Head Scans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lipnharski, I; Carranza, C; Quails, N

    Purpose: To optimize the adult head CT protocol by reducing dose to an appropriate level while providing CT images of diagnostic quality. Methods: Five cadavers were scanned from the skull base to the vertex using a routine adult head CT protocol (120 kVp, 270 mA, 0.75 s rotation, 0.5 mm × 32 detectors, 70.8 mGy CTDIvol) followed by seven reduced-dose protocols with varying combinations of reduced tube current, reduced rotation time, and increased detectors, with CTDIvol ranging from 38.2 to 65.6 mGy. Organ doses were directly measured with 21 OSL dosimeters placed on the surface and implanted in the head by a neurosurgeon. Two neuroradiologists assessed grey-white matter differentiation, fluid space, ventricular size, midline shift, brain mass, edema, ischemia, and skull fractures on a three point scale: (1) Unacceptable, (2) Borderline Acceptable, and (3) Acceptable. Results: For the standard scan, doses to the skin, lens of the eye, salivary glands, thyroid, and brain were 37.55 mGy, 49.65 mGy, 40.67 mGy, 4.63 mGy, and 27.33 mGy, respectively. Two cadavers had cerebral edema due to changing dynamics of postmortem effects, causing the grey-white matter differentiation to appear less distinct. Two cadavers with preserved grey-white matter received acceptable scores for all image quality features for the protocol with a CTDIvol of 57.3 mGy, allowing organ dose savings ranging from 34% to 45%. One cadaver allowed for greater dose reduction, for the protocol with a CTDIvol of 42 mGy. Conclusion: Efforts to optimize scan protocols should consider both dose and clinical image quality. This is made possible with postmortem subjects, whose brains are similar to those of patients, allowing for an investigation of ideal scan parameters. Radiologists at our institution accepted scan protocols acquired with lower scan parameters, with CTDIvol values closer to the American College of Radiology’s (ACR) Achievable Dose level of 57 mGy.

  8. A standard protocol for describing individual-based and agent-based models

    USGS Publications Warehouse

    Grimm, Volker; Berger, Uta; Bastiansen, Finn; Eliassen, Sigrunn; Ginot, Vincent; Giske, Jarl; Goss-Custard, John; Grand, Tamara; Heinz, Simone K.; Huse, Geir; Huth, Andreas; Jepsen, Jane U.; Jorgensen, Christian; Mooij, Wolf M.; Muller, Birgit; Pe'er, Guy; Piou, Cyril; Railsback, Steven F.; Robbins, Andrew M.; Robbins, Martha M.; Rossmanith, Eva; Ruger, Nadja; Strand, Espen; Souissi, Sami; Stillman, Richard A.; Vabo, Rune; Visser, Ute; DeAngelis, Donald L.

    2006-01-01

    Simulation models that describe autonomous individual organisms (individual based models, IBM) or agents (agent-based models, ABM) have become a widely used tool, not only in ecology, but also in many other disciplines dealing with complex systems made up of autonomous entities. However, there is no standard protocol for describing such simulation models, which can make them difficult to understand and to duplicate. This paper presents a proposed standard protocol, ODD, for describing IBMs and ABMs, developed and tested by 28 modellers who cover a wide range of fields within ecology. This protocol consists of three blocks (Overview, Design concepts, and Details), which are subdivided into seven elements: Purpose, State variables and scales, Process overview and scheduling, Design concepts, Initialization, Input, and Submodels. We explain which aspects of a model should be described in each element, and we present an example to illustrate the protocol in use. In addition, 19 examples are available in an Online Appendix. We consider ODD as a first step for establishing a more detailed common format of the description of IBMs and ABMs. Once initiated, the protocol will hopefully evolve as it becomes used by a sufficiently large proportion of modellers.

  9. A Spectrophotometric Assay Optimizing Conditions for Pepsin Activity.

    ERIC Educational Resources Information Center

    Harding, Ethelynda E.; Kimsey, R. Scott

    1998-01-01

    Describes a laboratory protocol optimizing the conditions for the assay of pepsin activity using the Coomassie Blue dye-binding assay of protein concentration. The dye binds through strong, noncovalent interactions to basic and aromatic amino acid residues. (DDR)

  10. An efficient protocol for the synthesis of highly sensitive indole imines utilizing green chemistry: optimization of reaction conditions.

    PubMed

    Nisar, Bushra; Rubab, Syeda Laila; Raza, Abdul Rauf; Tariq, Sobia; Sultan, Ayesha; Tahir, Muhammad Nawaz

    2018-04-11

    Novel and highly sensitive indole-based imines have been synthesized, and their synthesis has been compared across a variety of protocols. Ultimately, a convenient, economical and high-yielding set of conditions employing green chemistry has been designed for their synthesis.

  11. Security Protocol Verification and Optimization by Epistemic Model Checking

    DTIC Science & Technology

    2010-11-05

    Three cryptographers are sitting down to dinner at their favourite restaurant. Their waiter informs them that arrangements have been made with the... Unfortunately, the protocol cannot be expected to satisfy this: suppose that all agents manage to broadcast their message and all messages have the

  12. Cluster Size Optimization in Sensor Networks with Decentralized Cluster-Based Protocols

    PubMed Central

    Amini, Navid; Vahdatpour, Alireza; Xu, Wenyao; Gerla, Mario; Sarrafzadeh, Majid

    2011-01-01

    Network lifetime and energy-efficiency are viewed as the dominating considerations in designing cluster-based communication protocols for wireless sensor networks. This paper analytically provides the optimal cluster size that minimizes the total energy expenditure in such networks, where all sensors communicate data through their elected cluster heads to the base station in a decentralized fashion. LEACH, LEACH-Coverage, and DBS comprise the three cluster-based protocols investigated in this paper; none of them requires centralized support from any particular node. The analytical outcomes are given in the form of closed-form expressions for various widely-used network configurations. Extensive simulations on different networks are used to confirm the expectations based on the analytical results. To obtain a thorough understanding of the results, the cluster number variability problem is identified and inspected from the energy consumption point of view. PMID:22267882
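
    Illustrative note (not from the paper): the paper's closed-form expressions are not given in the abstract; for comparison, the sketch below evaluates the widely cited optimum number of clusters for a LEACH-style network under the first-order radio model (Heinzelman et al.), which is the same kind of result derived here for the decentralized protocols.

      import math

      # Widely cited optimum number of clusters for a LEACH-style WSN
      # (Heinzelman et al., first-order radio model); shown for comparison only,
      # not the exact closed-form expressions derived in this paper.

      def optimal_cluster_count(n_nodes, area_side_m, d_to_bs_m,
                                eps_fs=10e-12, eps_mp=0.0013e-12):
          """k_opt = sqrt(N / (2*pi)) * sqrt(eps_fs / eps_mp) * M / d_BS**2."""
          return (math.sqrt(n_nodes / (2 * math.pi))
                  * math.sqrt(eps_fs / eps_mp)
                  * area_side_m / d_to_bs_m**2)

      k = optimal_cluster_count(n_nodes=100, area_side_m=100, d_to_bs_m=100)
      print(round(k, 1), "clusters ->", round(100 / k, 1), "nodes per cluster on average")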

  13. Optimization of immunostaining on flat-mounted human corneas.

    PubMed

    Forest, Fabien; Thuret, Gilles; Gain, Philippe; Dumollard, Jean-Marc; Peoc'h, Michel; Perrache, Chantal; He, Zhiguo

    2015-01-01

    In the literature, immunohistochemistry on cross sections is the main technique used to study protein expression in corneal endothelial cells (ECs), even though this method allows visualization of few ECs, without clear subcellular localization, and is subject to the staining artifacts frequently encountered at tissue borders. We previously proposed several protocols, using fixation in 0.5% paraformaldehyde (PFA) or in methanol, allowing immunostaining on flatmounted corneas for proteins of different cell compartments. In the present study, we further refined the technique by systematically assessing the effect of fixative temperature. Last, we used optimized protocols to further demonstrate the considerable advantages of immunostaining on flatmounted intact corneas: detection of rare cells in large fields of thousands of ECs and epithelial cells, and accurate subcellular localization of given proteins. The staining of four ubiquitous proteins, ZO-1, hnRNP L, actin, and histone H3, with clearly different subcellular localizations, was analyzed in ECs of organ-cultured corneas. Whole intact human corneas were fixed for 30 min in 0.5% paraformaldehyde or pure methanol at four temperatures (4 °C for PFA, -20 °C for methanol, and 23, 37, and 50 °C for both). Experiments were performed in duplicate and repeated on three corneas. Standardized pictures were analyzed independently by two experts. Second, optimized immunostaining protocols were applied to fresh corneas for three applications: identification of rare cells that express KI67 in the endothelium of specimens with Fuchs' endothelial corneal dystrophy (FECD), the precise localization of neural cell adhesion molecules (NCAMs) in normal ECs and of the cytokeratin pair K3/12 and CD44 in normal epithelial cells, and the identification of cells that express S100b in the normal epithelium. Temperature strongly influenced immunostaining quality. No single protocol was universally optimal; nevertheless, room temperature may be recommended as the first-line fixation temperature, instead of the conventional -20 °C for methanol and 4 °C for PFA. Further optimization may be required for certain target proteins. Optimized protocols allowed description of two previously unknown findings: the presence of a few proliferating ECs in FECD specimens, suggesting ineffective compensatory mechanisms against premature EC death, and the localization of NCAMs exclusively in the lateral membranes of ECs, showing hexagonal organization at the apical pole and an irregular shape with increasing complexity toward the basal pole. Optimized protocols were also effective for the epithelium, allowing clear localization of cytokeratin 3/12 and CD44 in superficial and basal epithelial cells, respectively. Finally, S100b allowed identification of clusters of epithelial Langerhans cells near the limbus and more centrally. Fixative temperature is a crucial parameter in optimizing immunostaining on flatmounted intact corneas. Whole-tissue overview and precise subcellular staining are significant advantages over conventional immunohistochemistry (IHC) on cross sections. This technique, initially developed for the corneal endothelium, proved equally suitable for the corneal epithelium and could be used for other superficial mono- and multilayered epithelia.

  14. Fundamental Lifetime Mechanisms in Routing Protocols for Wireless Sensor Networks: A Survey and Open Issues

    PubMed Central

    Eslaminejad, Mohammadreza; Razak, Shukor Abd

    2012-01-01

    Wireless sensor networks basically consist of low cost sensor nodes which collect data from environment and relay them to a sink, where they will be subsequently processed. Since wireless nodes are severely power-constrained, the major concern is how to conserve the nodes' energy so that network lifetime can be extended significantly. Employing one static sink can rapidly exhaust the energy of sink neighbors. Furthermore, using a non-optimal single path together with a maximum transmission power level may quickly deplete the energy of individual nodes on the route. This all results in unbalanced energy consumption through the sensor field, and hence a negative effect on the network lifetime. In this paper, we present a comprehensive taxonomy of the various mechanisms applied for increasing the network lifetime. These techniques, whether in the routing or cross-layer area, fall within the following types: multi-sink, mobile sink, multi-path, power control and bio-inspired algorithms, depending on the protocol operation. In this taxonomy, special attention has been devoted to the multi-sink, power control and bio-inspired algorithms, which have not yet received much consideration in the literature. Moreover, each class covers a variety of the state-of-the-art protocols, which should provide ideas for potential future works. Finally, we compare these mechanisms and discuss open research issues. PMID:23202008

  15. Fundamental lifetime mechanisms in routing protocols for wireless sensor networks: a survey and open issues.

    PubMed

    Eslaminejad, Mohammadreza; Razak, Shukor Abd

    2012-10-09

    Wireless sensor networks basically consist of low cost sensor nodes which collect data from environment and relay them to a sink, where they will be subsequently processed. Since wireless nodes are severely power-constrained, the major concern is how to conserve the nodes' energy so that network lifetime can be extended significantly. Employing one static sink can rapidly exhaust the energy of sink neighbors. Furthermore, using a non-optimal single path together with a maximum transmission power level may quickly deplete the energy of individual nodes on the route. This all results in unbalanced energy consumption through the sensor field, and hence a negative effect on the network lifetime. In this paper, we present a comprehensive taxonomy of the various mechanisms applied for increasing the network lifetime. These techniques, whether in the routing or cross-layer area, fall within the following types: multi-sink, mobile sink, multi-path, power control and bio-inspired algorithms, depending on the protocol operation. In this taxonomy, special attention has been devoted to the multi-sink, power control and bio-inspired algorithms, which have not yet received much consideration in the literature. Moreover, each class covers a variety of the state-of-the-art protocols, which should provide ideas for potential future works. Finally, we compare these mechanisms and discuss open research issues.

  16. Design of freeze-drying processes for pharmaceuticals: practical advice.

    PubMed

    Tang, Xiaolin; Pikal, Michael J

    2004-02-01

    Design of freeze-drying processes is often approached with a "trial and error" experimental plan or, worse yet, the protocol used in the first laboratory run is adopted without further attempts at optimization. Consequently, commercial freeze-drying processes are often neither robust nor efficient. It is our thesis that design of an "optimized" freeze-drying process is not particularly difficult for most products, as long as some simple rules based on well-accepted scientific principles are followed. It is the purpose of this review to discuss the scientific foundations of the freeze-drying process design and then to consolidate these principles into a set of guidelines for rational process design and optimization. General advice is given concerning common stability issues with proteins, but unusual and difficult stability issues are beyond the scope of this review. Control of ice nucleation and crystallization during the freezing step is discussed, and the impact of freezing on the rest of the process and final product quality is reviewed. Representative freezing protocols are presented. The significance of the collapse temperature and the thermal transition, denoted Tg', are discussed, and procedures for the selection of the "target product temperature" for primary drying are presented. Furthermore, guidelines are given for selection of the optimal shelf temperature and chamber pressure settings required to achieve the target product temperature without thermal and/or mass transfer overload of the freeze dryer. Finally, guidelines and "rules" for optimization of secondary drying and representative secondary drying protocols are presented.

  17. Taking a holistic approach to managing difficult stress fractures.

    PubMed

    Miller, Timothy L; Best, Thomas M

    2016-09-09

    Stress fractures and other bony stress injuries occur along a spectrum of severity which can impact treatment and prognosis. When treating these injuries, it should be borne in mind that no two stress fractures behave exactly alike. Given that they are not a consistent injury, standardized treatment protocols can be challenging to develop. Treatment should be individualized to the patient or athlete, the causative activity, the anatomical site, and the severity of the injury. A holistic approach to the treatment of the most difficult stress fractures should be taken by orthopedists and sports medicine specialists. This approach is necessary to obtain optimal outcomes, minimize loss of fitness and time away from sports participation, and decrease the risk of recurrence.

  18. Experimental Demonstration of Polarization Encoding Measurement-Device-Independent Quantum Key Distribution

    NASA Astrophysics Data System (ADS)

    Tang, Zhiyuan; Liao, Zhongfa; Xu, Feihu; Qi, Bing; Qian, Li; Lo, Hoi-Kwong

    2014-05-01

    We demonstrate the first implementation of polarization encoding measurement-device-independent quantum key distribution (MDI-QKD), which is immune to all detector side-channel attacks. Active phase randomization of each individual pulse is implemented to protect against attacks on imperfect sources. By optimizing the parameters in the decoy state protocol, we show that it is feasible to implement polarization encoding MDI-QKD with commercial off-the-shelf devices. A rigorous finite key analysis is applied to estimate the secure key rate. Our work paves the way for the realization of a MDI-QKD network, in which the users only need compact and low-cost state-preparation devices and can share complicated and expensive detectors provided by an untrusted network server.

  19. iCLIP: protein-RNA interactions at nucleotide resolution.

    PubMed

    Huppertz, Ina; Attig, Jan; D'Ambrogio, Andrea; Easton, Laura E; Sibley, Christopher R; Sugimoto, Yoichiro; Tajnik, Mojca; König, Julian; Ule, Jernej

    2014-02-01

    RNA-binding proteins (RBPs) are key players in the post-transcriptional regulation of gene expression. Precise knowledge about their binding sites is therefore critical to unravel their molecular function and to understand their role in development and disease. Individual-nucleotide resolution UV crosslinking and immunoprecipitation (iCLIP) identifies protein-RNA crosslink sites on a genome-wide scale. The high resolution and specificity of this method are achieved by an intramolecular cDNA circularization step that enables analysis of cDNAs that truncate at the protein-RNA crosslink sites. Here, we describe the improved iCLIP protocol and discuss critical optimization and control experiments that are required when applying the method to new RBPs. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.

  20. Sport-Related Concussion: Optimizing Treatment Through Evidence-Informed Practice.

    PubMed

    Schneider, Kathryn J

    2016-08-01

    Concussion is one of the most common injuries in sport and recreation today. Reports of concussion have increased in recent years, likely due to increased societal awareness and the risk of longer-term sequelae. Presently, treatment includes a period of prescribed rest in the acute period following injury, followed by a protocol of graded exertion. Despite an initial period of rest and attempts at a gradual return to play, up to 30% of individuals may have ongoing symptoms past the acute period. The goal of this viewpoint is to introduce the reader to the most common symptoms of concussion and the need for a new, more active paradigm during treatment. J Orthop Sports Phys Ther 2016;46(8):613-616. doi:10.2519/jospt.2016.0607.

  1. Experimental demonstration of polarization encoding measurement-device-independent quantum key distribution.

    PubMed

    Tang, Zhiyuan; Liao, Zhongfa; Xu, Feihu; Qi, Bing; Qian, Li; Lo, Hoi-Kwong

    2014-05-16

    We demonstrate the first implementation of polarization encoding measurement-device-independent quantum key distribution (MDI-QKD), which is immune to all detector side-channel attacks. Active phase randomization of each individual pulse is implemented to protect against attacks on imperfect sources. By optimizing the parameters in the decoy state protocol, we show that it is feasible to implement polarization encoding MDI-QKD with commercial off-the-shelf devices. A rigorous finite key analysis is applied to estimate the secure key rate. Our work paves the way for the realization of a MDI-QKD network, in which the users only need compact and low-cost state-preparation devices and can share complicated and expensive detectors provided by an untrusted network server.

  2. Optimization and experimental validation of a thermal cycle that maximizes entropy coefficient fisher identifiability for lithium iron phosphate cells

    NASA Astrophysics Data System (ADS)

    Mendoza, Sergio; Rothenberger, Michael; Hake, Alison; Fathy, Hosam

    2016-03-01

    This article presents a framework for optimizing the thermal cycle used to estimate a battery cell's entropy coefficient at 20% state of charge (SOC). Our goal is to maximize Fisher identifiability: a measure of the accuracy with which a parameter can be estimated. Existing protocols in the literature for estimating entropy coefficients demand excessive laboratory time. Identifiability optimization makes it possible to achieve comparable accuracy levels in a fraction of the time. This article demonstrates this result for a set of lithium iron phosphate (LFP) cells. We conduct a 24-h experiment to obtain benchmark measurements of their entropy coefficients. We then optimize a thermal cycle to maximize parameter identifiability for these cells. This optimization proceeds with respect to the coefficients of a Fourier discretization of the thermal cycle. Finally, we compare the parameters estimated using (i) the benchmark test, (ii) the optimized protocol, and (iii) a 15-h test from the literature (by Forgez et al.). The results are encouraging for two reasons. First, they confirm the simulation-based prediction that the optimized experiment can produce accurate parameter estimates in 2 h, compared with 15-24 h. Second, the optimized experiment also estimates a thermal time constant representing the effects of thermal capacitance and convection heat transfer.
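
    Illustrative note (not from the article): the sketch below shows what maximizing Fisher identifiability of the entropy coefficient dU/dT means in a toy setting. The cell OCV is modeled as U_ref + dUdT*(T_cell - T_ref), the cell temperature follows the chamber set-point through a first-order lag, and the scalar Fisher information is the sum of squared voltage sensitivities divided by the measurement variance. All parameters are hypothetical and the model is far simpler than the article's.

      import numpy as np

      # Fisher identifiability of the entropy coefficient dU/dT at a fixed SOC.
      # OCV model: U(T_cell) = U_ref + dUdT * (T_cell - T_ref); the cell temperature
      # tracks the chamber set-point through a first-order lag. Hypothetical
      # parameters and a much simpler model than the article's.

      def cell_temperature(chamber_profile, dt=60.0, tau=1800.0, T0=293.0):
          """First-order thermal lag of the cell behind the chamber temperature (K)."""
          T = np.empty_like(chamber_profile)
          T_k = T0
          for k, T_set in enumerate(chamber_profile):
              T_k += dt * (T_set - T_k) / tau
              T[k] = T_k
          return T

      def fisher_info(chamber_profile, T_ref=293.0, sigma_v=1e-4):
          """For the linear OCV model, dU/d(dUdT) = T_cell - T_ref at every sample."""
          sens = cell_temperature(chamber_profile) - T_ref
          return np.sum(sens**2) / sigma_v**2

      t = np.arange(0, 2 * 3600, 60.0)                       # a 2-hour test, 1-min samples
      ramp = 293.0 + 10.0 * t / t[-1]                        # slow 10 K ramp
      cycle = 293.0 + 10.0 * np.sin(2 * np.pi * t / 3600.0)  # +/-10 K sinusoidal cycle
      print("FI ramp :", fisher_info(ramp))
      print("FI cycle:", fisher_info(cycle))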

  3. Headspace Analysis of Volatile Compounds Using Segmented Chirped-Pulse Fourier Transform Mm-Wave Spectroscopy

    NASA Astrophysics Data System (ADS)

    Harris, Brent; Steber, Amanda; Pate, Brooks

    2014-06-01

    A chirped-pulse Fourier transform mm-wave spectrometer has been tested in analytical chemistry applications of headspace analysis of volatile species. A solid-state mm-wave light source (260-290 GHz) provides 30-50 mW of power. This power is sufficient to achieve optimal excitation of individual transitions of molecules with dipole moments larger than about 0.1 D. The chirped-pulse spectrometer has a near 100% measurement duty cycle using a high-speed digitizer (4 GS/s) with signal accumulation in an FPGA. The combination of optimal pulse excitation and near 100% measurement duty cycle gives a spectrometer that is fully optimized for trace detection. The performance of the instrument is tested using an EPA sample (EPA VOC Mix 6 - Supelco) that contains a set of molecules that are fast eluting on gas chromatographs and, as a result, present analysis challenges for mass spectrometry. The ability to directly analyze the VOC mixture is tested by acquiring the full bandwidth (260-290 GHz) spectrum in a "high dynamic range" measurement mode that minimizes spurious spectrometer responses. The high resolution of molecular rotational spectroscopy makes it easy to analyze this mixture without the need for chemical separation. The sensitivity of the instrument for individual molecule detection, where a single transition is polarized by the excitation pulse, is also tested. Detection limits in water will be reported. In the case of chloromethane, the detection limit (0.1 microgram/L) matches the sensitivity reported in the EPA measurement protocol (EPA Method 524) for GC/MS.

  4. Generation and customization of biosynthetic excitable tissues for electrophysiological studies and cell-based therapies.

    PubMed

    Nguyen, Hung X; Kirkton, Robert D; Bursac, Nenad

    2018-05-01

    We describe a two-stage protocol to generate electrically excitable and actively conducting cell networks with stable and customizable electrophysiological phenotypes. Using this method, we have engineered monoclonally derived excitable tissues as a robust and reproducible platform to investigate how specific ion channels and mutations affect action potential (AP) shape and conduction. In the first stage of the protocol, we combine computational modeling, site-directed mutagenesis, and electrophysiological techniques to derive optimal sets of mammalian and/or prokaryotic ion channels that produce specific AP shape and conduction characteristics. In the second stage of the protocol, selected ion channels are stably expressed in unexcitable human cells by means of viral or nonviral delivery, followed by flow cytometry or antibiotic selection to purify the desired phenotype. This protocol can be used with traditional heterologous expression systems or primary excitable cells, and application of this method to primary fibroblasts may enable an alternative approach to cardiac cell therapy. Compared with existing methods, this protocol generates a well-defined, relatively homogeneous electrophysiological phenotype of excitable cells that facilitates experimental and computational studies of AP conduction and can decrease arrhythmogenic risk upon cell transplantation. Although basic cell culture and molecular biology techniques are sufficient to generate excitable tissues using the described protocol, experience with patch-clamp techniques is required to characterize and optimize derived cell populations.

  5. A novel tool for continuous fracture aftercare - Clinical feasibility and first results of a new telemetric gait analysis insole.

    PubMed

    Braun, Benedikt J; Bushuven, Eva; Hell, Rebecca; Veith, Nils T; Buschbaum, Jan; Holstein, Joerg H; Pohlemann, Tim

    2016-02-01

    Weight bearing after lower extremity fractures remains a highly controversial issue. Even in ankle fractures, the most common lower extremity injury, no standard aftercare protocol has been established. Average non-weight-bearing times range from 0 to 7 weeks, with standardised, radiological healing controls at fixed time intervals. Recent literature calls for patient-adapted aftercare protocols based on individual fracture and load scenarios. We show the clinical feasibility and first results of a new, insole-embedded gait analysis tool for continuous monitoring of gait, load and activity. Ten patients were monitored with a new, independent gait analysis insole for up to 3 months postoperatively. Strict 20 kg partial weight bearing was ordered for 6 weeks. Overall activity, load spectrum, ground reaction forces, clinical scoring and general health data were recorded and correlated. Statistical analysis with power analysis, t-test and Spearman correlation was performed. Only one patient completely adhered to the set weight bearing limit. The average time over the limit was 374 min. Based on the parameters load, activity, gait time over 20 kg weight bearing and maximum ground reaction force, high and low performers were defined after 3 weeks. A significant difference in time to painless full weight bearing between high and low performers was shown. Correlation analysis revealed a significant correlation between weight bearing and clinical scoring as well as pain (American Orthopaedic Foot and Ankle Society (AOFAS) Score rs=0.74; Olerud-Molander Score rs=0.93; VAS pain rs=-0.95). Early, continuous gait analysis is able to identify aftercare performers with significant differences in time to full painless weight bearing where clinical or radiographic controls could not. Patient compliance with standardised weight bearing limits and protocols is low. Highly individual rehabilitation patterns were seen in all patients. Aftercare protocols should be adjusted to real-time patient conditions, rather than fixed intervals and limits. With a real-time measuring device, high performers could be identified and guided towards optimal healing conditions early, while low performers are recognised and missing healing influences can be corrected according to patient condition. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Calculating an optimal box size for ligand docking and virtual screening against experimental and predicted binding pockets.

    PubMed

    Feinstein, Wei P; Brylinski, Michal

    2015-01-01

    Computational approaches have emerged as an instrumental methodology in modern research. For example, virtual screening by molecular docking is routinely used in computer-aided drug discovery. One of the critical parameters for ligand docking is the size of the search space used to identify low-energy binding poses of drug candidates. Currently available docking packages often come with a default protocol for calculating the box size; however, many of these procedures have not been systematically evaluated. In this study, we investigate how the docking accuracy of AutoDock Vina is affected by the selection of a search space. We propose a new procedure for calculating the optimal docking box size that maximizes the accuracy of binding pose prediction against a non-redundant and representative dataset of 3,659 protein-ligand complexes selected from the Protein Data Bank. Subsequently, we use the Directory of Useful Decoys, Enhanced to demonstrate that the optimized docking box size also yields an improved ranking in virtual screening. Binding pockets in both datasets are derived from the experimental complex structures and, additionally, predicted by eFindSite. A systematic analysis of ligand binding poses generated by AutoDock Vina shows that the highest accuracy is achieved when the dimensions of the search space are 2.9 times larger than the radius of gyration of a docking compound. Subsequent virtual screening benchmarks demonstrate that this optimized docking box size also improves compound ranking. For instance, using predicted ligand binding sites, the average enrichment factor calculated for the top 1% (10%) of the screening library is 8.20 (3.28) for the optimized protocol, compared to 7.67 (3.19) for the default procedure. Depending on the evaluation metric, the optimal docking box size gives better ranking in virtual screening for about two-thirds of target proteins. This fully automated procedure can be used to optimize docking protocols in order to improve the ranking accuracy in production virtual screening simulations. Importantly, the optimized search space systematically yields better results than the default method not only for experimental pockets, but also for those predicted from protein structures. A script for calculating the optimal docking box size is freely available at www.brylinski.org/content/docking-box-size. Graphical Abstract: We developed a procedure to optimize the box size in molecular docking calculations. The left panel shows the predicted binding pose of NADP (green sticks) compared to the experimental complex structure of human aldose reductase (blue sticks) using a default protocol. The right panel shows the docking accuracy using an optimized box size.
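
    Illustrative note (not the authors' script): the central rule of thumb, a cubic search-space edge of about 2.9 times the ligand's radius of gyration, is straightforward to apply. The sketch below computes the radius of gyration from heavy-atom coordinates and returns the corresponding box edge; coordinate handling is simplified and the example ligand is hypothetical.

      import numpy as np

      # Optimal cubic docking-box edge from the paper's rule of thumb:
      # edge ~= 2.9 x radius of gyration of the docked compound.

      def radius_of_gyration(coords):
          """Unweighted radius of gyration of an (N, 3) array of heavy-atom coordinates (A)."""
          coords = np.asarray(coords, dtype=float)
          centered = coords - coords.mean(axis=0)
          return np.sqrt((centered**2).sum(axis=1).mean())

      def box_edge(coords, scale=2.9):
          return scale * radius_of_gyration(coords)

      # Hypothetical small ligand (heavy-atom coordinates in Angstrom):
      lig = [[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [2.3, 1.2, 0.0],
             [3.8, 1.3, 0.4], [4.5, 2.6, 0.1]]
      print("Rg = %.2f A, box edge = %.2f A" % (radius_of_gyration(lig), box_edge(lig)))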

  7. Toward Fairness in Assessing Student Groupwork: A Protocol for Peer Evaluation of Individual Contributions

    ERIC Educational Resources Information Center

    Fellenz, Martin R.

    2006-01-01

    A key challenge for management instructors using graded groupwork with students is to find ways to maximize student learning from group projects while ensuring fair and accurate assessment methods. This article presents the Groupwork Peer-Evaluation Protocol (GPEP) that enables the assessment of individual contributions to graded student…

  8. Multiplex Droplet Digital PCR Protocols for Quantification of GM Maize Events.

    PubMed

    Dobnik, David; Spilsberg, Bjørn; Bogožalec Košir, Alexandra; Štebih, Dejan; Morisset, Dany; Holst-Jensen, Arne; Žel, Jana

    2018-01-01

    The standard-curve based simplex quantitative polymerase chain reaction (qPCR) has been the gold standard for DNA target quantification for more than a decade. The large and growing number of individual analyses needed to test for genetically modified organisms (GMOs) is reducing the cost-effectiveness of qPCR. Droplet digital PCR (ddPCR) enables absolute quantification without standard curves, avoids the amplification efficiency bias observed with qPCR, allows more accurate estimations at low target copy numbers and, in combination with multiplexing, significantly improves cost efficiency. Here we describe two protocols for multiplex quantification of GM maize events: (1) nondiscriminating, with multiplex quantification of targets as a group (12 GM maize lines) and (2) discriminating, with multiplex quantification of individual targets (events). The first enables the quantification of twelve European Union authorized GM maize events as a group with only two assays, but does not permit determination of the individual events present. The second protocol enables the quantification of four individual targets (three GM events and one endogene) in a single reaction. Both protocols can be modified for quantification of any other DNA target.
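
    Illustrative note (not from the protocols): both protocols ultimately convert droplet counts into copy numbers through the standard Poisson correction, and GM content is commonly reported as a copy-number ratio to the endogenous gene. The sketch below shows that generic calculation; the droplet counts are hypothetical and the droplet volume is a commonly assumed nominal value, not an assay-specific parameter.

      import math

      # Standard Poisson-corrected ddPCR quantification (generic, not assay-specific).

      def copies_per_ul(positive, total, droplet_volume_nl=0.85):
          """Target copies per microlitre of reaction from droplet counts."""
          if positive >= total:
              raise ValueError("saturated: all droplets positive")
          lam = -math.log(1 - positive / total)      # mean copies per droplet (Poisson)
          return lam * 1000.0 / droplet_volume_nl    # copies per uL

      gm_event = copies_per_ul(positive=412, total=15000)
      endogene = copies_per_ul(positive=3890, total=15000)
      print("GM event : %.1f copies/uL" % gm_event)
      print("endogene : %.1f copies/uL" % endogene)
      print("GM content: %.2f %% (copy-number ratio)" % (100 * gm_event / endogene))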

  9. Multi-site assessment of the precision and reproducibility of multiple reaction monitoring–based measurements of proteins in plasma

    PubMed Central

    Addona, Terri A; Abbatiello, Susan E; Schilling, Birgit; Skates, Steven J; Mani, D R; Bunk, David M; Spiegelman, Clifford H; Zimmerman, Lisa J; Ham, Amy-Joan L; Keshishian, Hasmik; Hall, Steven C; Allen, Simon; Blackman, Ronald K; Borchers, Christoph H; Buck, Charles; Cardasis, Helene L; Cusack, Michael P; Dodder, Nathan G; Gibson, Bradford W; Held, Jason M; Hiltke, Tara; Jackson, Angela; Johansen, Eric B; Kinsinger, Christopher R; Li, Jing; Mesri, Mehdi; Neubert, Thomas A; Niles, Richard K; Pulsipher, Trenton C; Ransohoff, David; Rodriguez, Henry; Rudnick, Paul A; Smith, Derek; Tabb, David L; Tegeler, Tony J; Variyath, Asokan M; Vega-Montoto, Lorenzo J; Wahlander, Åsa; Waldemarson, Sofia; Wang, Mu; Whiteaker, Jeffrey R; Zhao, Lei; Anderson, N Leigh; Fisher, Susan J; Liebler, Daniel C; Paulovich, Amanda G; Regnier, Fred E; Tempst, Paul; Carr, Steven A

    2010-01-01

    Verification of candidate biomarkers relies upon specific, quantitative assays optimized for selective detection of target proteins, and is increasingly viewed as a critical step in the discovery pipeline that bridges unbiased biomarker discovery to preclinical validation. Although individual laboratories have demonstrated that multiple reaction monitoring (MRM) coupled with isotope dilution mass spectrometry can quantify candidate protein biomarkers in plasma, reproducibility and transferability of these assays between laboratories have not been demonstrated. We describe a multilaboratory study to assess reproducibility, recovery, linear dynamic range and limits of detection and quantification of multiplexed, MRM-based assays, conducted by NCI-CPTAC. Using common materials and standardized protocols, we demonstrate that these assays can be highly reproducible within and across laboratories and instrument platforms, and are sensitive to low µg/ml protein concentrations in unfractionated plasma. We provide data and benchmarks against which individual laboratories can compare their performance and evaluate new technologies for biomarker verification in plasma. PMID:19561596

  10. Young investigator challenge: Validation and optimization of immunohistochemistry protocols for use on cellient cell block specimens.

    PubMed

    Sauter, Jennifer L; Grogg, Karen L; Vrana, Julie A; Law, Mark E; Halvorson, Jennifer L; Henry, Michael R

    2016-02-01

    The objective of the current study was to establish a process for validating immunohistochemistry (IHC) protocols for use on the Cellient cell block (CCB) system. Thirty antibodies were initially tested on CCBs using IHC protocols previously validated on formalin-fixed, paraffin-embedded tissue (FFPE). Cytology samples were split to generate thrombin cell blocks (TCB) and CCBs. IHC was performed in parallel. Antibody immunoreactivity was scored, and concordance or discordance in immunoreactivity between the TCBs and CCBs for each sample was determined. Criteria for validation of an antibody were defined as concordant staining in expected positive and negative cells, in at least 5 samples each, and concordance in at least 90% of the samples total. Antibodies that failed initial validation were retested after alterations in IHC conditions. Thirteen of the 30 antibodies (43%) did not meet initial validation criteria. Of those, 8 antibodies (calretinin, clusters of differentiation [CD] 3, CD20, CDX2, cytokeratin 20, estrogen receptor, MOC-31, and p16) were optimized for CCBs and subsequently validated. Despite several alterations in conditions, 3 antibodies (Ber-EP4, D2-40, and paired box gene 8 [PAX8]) were not successfully validated. Nearly one-half of the antibodies tested in the current study failed initial validation using IHC conditions that were established in the study laboratory for FFPE material. Although some antibodies subsequently met validation criteria after optimization of conditions, a few continued to demonstrate inadequate immunoreactivity. These findings emphasize the importance of validating IHC protocols for methanol-fixed tissue before clinical use and suggest that optimization for alcohol fixation may be needed to obtain adequate immunoreactivity on CCBs. © 2016 American Cancer Society.

  11. Optimized Setup and Protocol for Magnetic Domain Imaging with In Situ Hysteresis Measurement.

    PubMed

    Liu, Jun; Wilson, John; Davis, Claire; Peyton, Anthony

    2017-11-07

    This paper elaborates the sample preparation protocols required to obtain optimal domain patterns using the Bitter method, focusing on the extra steps compared to standard metallographic sample preparation procedures. The paper proposes a novel bespoke rig for dynamic domain imaging with in situ BH (magnetic hysteresis) measurements and elaborates the protocols for the sensor preparation and the use of the rig to ensure accurate BH measurement. The protocols for static and ordinary dynamic domain imaging (without in situ BH measurements) are also presented. The reported method takes advantage of the convenience and high sensitivity of the traditional Bitter method and enables in situ BH measurement without interrupting or interfering with the domain wall movement processes. This facilitates establishing a direct and quantitative link between the domain wall movement processes-microstructural feature interactions in ferritic steels with their BH loops. This method is anticipated to become a useful tool for the fundamental study of microstructure-magnetic property relationships in steels and to help interpret the electromagnetic sensor signals for non-destructive evaluation of steel microstructures.

  12. Optimized Setup and Protocol for Magnetic Domain Imaging with In Situ Hysteresis Measurement

    PubMed Central

    Liu, Jun; Wilson, John; Davis, Claire; Peyton, Anthony

    2017-01-01

    This paper elaborates the sample preparation protocols required to obtain optimal domain patterns using the Bitter method, focusing on the extra steps compared to standard metallographic sample preparation procedures. The paper proposes a novel bespoke rig for dynamic domain imaging with in situ BH (magnetic hysteresis) measurements and elaborates the protocols for the sensor preparation and the use of the rig to ensure accurate BH measurement. The protocols for static and ordinary dynamic domain imaging (without in situ BH measurements) are also presented. The reported method takes advantage of the convenience and high sensitivity of the traditional Bitter method and enables in situ BH measurement without interrupting or interfering with the domain wall movement processes. This facilitates establishing a direct and quantitative link between the interactions of domain wall movement processes with microstructural features in ferritic steels and their BH loops. This method is anticipated to become a useful tool for the fundamental study of microstructure–magnetic property relationships in steels and to help interpret the electromagnetic sensor signals for non-destructive evaluation of steel microstructures. PMID:29155796

  13. Cure Cycle Design Methodology for Fabricating Reactive Resin Matrix Fiber Reinforced Composites: A Protocol for Producing Void-free Quality Laminates

    NASA Technical Reports Server (NTRS)

    Hou, Tan-Hung

    2014-01-01

    For the fabrication of resin matrix fiber reinforced composite laminates, a workable cure cycle (i.e., temperature and pressure profiles as a function of processing time) is needed and is critical for achieving void-free laminate consolidation. Design of such a cure cycle is not trivial, especially when dealing with reactive matrix resins. An empirical "trial and error" approach has been used as common practice in the composite industry. Such an approach is not only costly, but also ineffective at establishing the optimal processing conditions for a specific resin/fiber composite system. In this report, a rational "processing science" based approach is established, and a universal cure cycle design protocol is proposed. Following this protocol, a workable and optimal cure cycle can be readily and rationally designed for most reactive resin systems in a cost effective way. This design protocol has been validated through experimental studies of several reactive polyimide composites over a wide spectrum of uses, as documented in previous publications.

  14. Robust optimal design of diffusion-weighted magnetic resonance experiments for skin microcirculation

    NASA Astrophysics Data System (ADS)

    Choi, J.; Raguin, L. G.

    2010-10-01

    Skin microcirculation plays an important role in several diseases including chronic venous insufficiency and diabetes. Magnetic resonance (MR) has the potential to provide quantitative information and a better penetration depth compared with other non-invasive methods such as laser Doppler flowmetry or optical coherence tomography. The continuous progress in hardware resulting in higher sensitivity must be coupled with advances in data acquisition schemes. In this article, we first introduce a physical model for quantifying skin microcirculation using diffusion-weighted MR (DWMR) based on an effective dispersion model for skin leading to a q-space model of the DWMR complex signal, and then design the corresponding robust optimal experiments. The resulting robust optimal DWMR protocols improve the worst-case quality of parameter estimates using nonlinear least squares optimization by exploiting available a priori knowledge of model parameters. Hence, our approach optimizes the gradient strengths and directions used in DWMR experiments to robustly minimize the size of the parameter estimation error with respect to model parameter uncertainty. Numerical evaluations are presented to demonstrate the effectiveness of our approach as compared to conventional DWMR protocols.
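
    One way to read the "robust optimal design" idea above is as a max–min problem: choose the design (here, a set of b-values standing in for gradient settings) that maximizes the worst-case information over an a priori range of model parameters. The Python sketch below does this by brute force for a simple mono-exponential decay model; the model, parameter range and candidate designs are illustrative assumptions, not the dispersion model used in the paper.

      import itertools
      import numpy as np

      def fisher_info(bvals, D):
          # Mono-exponential signal S(b) = exp(-b*D); the Fisher information for D
          # under unit-variance Gaussian noise is the sum of squared sensitivities.
          sens = -bvals * np.exp(-bvals * D)
          return np.sum(sens ** 2)

      candidate_b = np.linspace(0.0, 3.0, 13)          # candidate b-values (a.u.)
      prior_D = np.linspace(0.5, 2.0, 16)              # a priori range for D

      best_design, best_worst_case = None, -np.inf
      for design in itertools.combinations(candidate_b, 3):   # pick 3 b-values
          bvals = np.array(design)
          worst_case = min(fisher_info(bvals, D) for D in prior_D)
          if worst_case > best_worst_case:
              best_design, best_worst_case = design, worst_case

      print("robust design:", best_design, "worst-case information:", best_worst_case)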

  15. Incentive-compatible demand-side management for smart grids based on review strategies

    NASA Astrophysics Data System (ADS)

    Xu, Jie; van der Schaar, Mihaela

    2015-12-01

    Demand-side load management is able to significantly improve the energy efficiency of smart grids. Since the electricity production cost depends on the aggregate energy usage of multiple consumers, an important incentive problem emerges: self-interested consumers want to increase their own utilities by consuming more than the socially optimal amount of energy during peak hours since the increased cost is shared among the entire set of consumers. To incentivize self-interested consumers to take the socially optimal scheduling actions, we design a new class of protocols based on review strategies. These strategies work as follows: first, a review stage takes place in which a statistical test is performed based on the daily prices of the previous billing cycle to determine whether or not the other consumers schedule their electricity loads in a socially optimal way. If the test fails, the consumers trigger a punishment phase in which, for a certain time, they adjust their energy scheduling in such a way that everybody in the consumer set is punished due to an increased price. Using a carefully designed protocol based on such review strategies, consumers then have incentives to take the socially optimal load scheduling to avoid entering this punishment phase. We rigorously characterize the impact of deploying protocols based on review strategies on the system's as well as the users' performance and determine the optimal design (optimal billing cycle, punishment length, etc.) for various smart grid deployment scenarios. Even though this paper considers a simplified smart grid model, our analysis provides important and useful insights for designing incentive-compatible demand-side management schemes based on aggregate energy usage information in a variety of practical scenarios.
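
    A minimal sketch of the review-strategy logic described above: at the end of each billing cycle a statistical test on the observed daily prices decides whether to enter a punishment phase of fixed length. The threshold test, prices and phase lengths below are illustrative assumptions, not the optimized design derived in the paper.

      import statistics

      def review_decision(daily_prices, expected_price, threshold):
          """Trigger punishment if the average daily price over the billing cycle
          deviates from the socially optimal expected price by more than the
          test threshold."""
          return abs(statistics.mean(daily_prices) - expected_price) > threshold

      def next_phase(daily_prices, in_punishment, punishment_left, punishment_len=5):
          if in_punishment:                         # serve out the punishment phase
              punishment_left -= 1
              return punishment_left > 0, punishment_left
          if review_decision(daily_prices, expected_price=1.0, threshold=0.15):
              return True, punishment_len           # failed review: start punishing
          return False, 0                           # review passed: keep cooperating

      state = (False, 0)
      for cycle_prices in [[1.02, 0.97, 1.05], [1.30, 1.28, 1.25], [1.01, 0.99, 1.0]]:
          state = next_phase(cycle_prices, *state)
          print("punishment phase:", state[0])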

  16. Lighting Automation - Flying an Earthlike Habitat Project

    NASA Technical Reports Server (NTRS)

    Falker, Jay; Howard, Ricky; Culbert, Christopher; Clark, Toni Anne; Kolomenski, Andrei

    2017-01-01

    Our proposal will enable the development of automated spacecraft habitats for long-duration missions. The majority of spacecraft lighting systems employ lamps or zone-specific switches and dimmers; automation is not in the picture. If we are to build long-duration environments that provide earth-like habitats, minimize crew time, and optimize spacecraft power reserves, innovation in lighting automation is a must. To transform how spacecraft lighting environments are automated, we will provide performance data on a standard lighting communication protocol. We will investigate the utilization and application of an industry-accepted lighting control protocol, DMX512. We will demonstrate how lighting automation can conserve power, assist with lighting countermeasures, and utilize spatial body tracking. By using DMX512 we will show that the "wheel" does not need to be reinvented in terms of smart lighting, and that future spacecraft can use a standard lighting protocol to produce an effective, optimized and potentially earthlike habitat.
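
    At the data level, DMX512 is simple: each packet is a start code (0x00 for dimmer data) followed by up to 512 one-byte channel levels. The Python sketch below merely assembles such a frame for a handful of fixtures; the channel map is a made-up assumption, and actually transmitting the frame would additionally require the serial break and mark-after-break signaling, which are not shown.

      def build_dmx_frame(channel_levels):
          """Assemble a DMX512 data packet: start code 0x00 followed by up to 512
          channel bytes (0-255). Unused channels are padded with zeros."""
          if len(channel_levels) > 512:
              raise ValueError("DMX512 carries at most 512 channels per universe")
          frame = bytearray([0x00])                        # start code for dimmer data
          frame.extend(max(0, min(255, lvl)) for lvl in channel_levels)
          frame.extend([0] * (512 - len(channel_levels)))  # pad the rest of the universe
          return bytes(frame)

      # Hypothetical cabin zones: channels 1-3 = RGB of zone A, channel 4 = zone B dimmer.
      frame = build_dmx_frame([255, 180, 120, 64])
      print(len(frame), frame[:8].hex())                   # 513 bytes, first few shown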

  17. Discrete Particle Swarm Optimization Routing Protocol for Wireless Sensor Networks with Multiple Mobile Sinks.

    PubMed

    Yang, Jin; Liu, Fagui; Cao, Jianneng; Wang, Liangming

    2016-07-14

    Mobile sinks can achieve load balancing and energy-consumption balancing across wireless sensor networks (WSNs). However, the frequent change of the paths between source nodes and the sinks caused by sink mobility introduces significant overhead in terms of energy and packet delays. To enhance the network performance of WSNs with mobile sinks (MWSNs), we present an efficient routing strategy, which is formulated as an optimization problem and employs the particle swarm optimization (PSO) algorithm to build the optimal routing paths. However, conventional PSO is insufficient for solving discrete routing optimization problems. Therefore, a novel greedy discrete particle swarm optimization with memory (GMDPSO) is put forward to address this problem. In the GMDPSO, the particle position and velocity of traditional PSO are redefined for the discrete MWSN scenario, and the particle updating rule is reconsidered based on the subnetwork topology of MWSNs. In addition, by improving on greedy forwarding routing, a greedy search strategy is designed to drive particles to better positions quickly. Furthermore, the search history is memorized to accelerate convergence. Simulation results demonstrate that our new protocol significantly improves robustness and adapts to rapid topological changes with multiple mobile sinks, while efficiently reducing the communication overhead and the energy consumption.
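
    For context on the greedy element mentioned above, the baseline greedy geographic forwarding rule that such protocols typically build on can be sketched in a few lines of Python: each node hands the packet to the neighbor closest to the (possibly mobile) sink. This is a generic illustration under assumed coordinates and adjacency, not the GMDPSO algorithm itself.

      import math

      def dist(a, b):
          return math.hypot(a[0] - b[0], a[1] - b[1])

      def greedy_route(source, sink, neighbors, positions):
          """Greedy geographic forwarding: at each hop, pass the packet to the
          neighbor closest to the sink; stop if no neighbor improves on the
          current node (a routing void) or the sink is reached."""
          path, current = [source], source
          while current != sink:
              candidates = neighbors.get(current, [])
              if not candidates:
                  break
              best = min(candidates, key=lambda n: dist(positions[n], positions[sink]))
              if dist(positions[best], positions[sink]) >= dist(positions[current], positions[sink]):
                  break
              path.append(best)
              current = best
          return path

      # Hypothetical static snapshot of a small topology (node -> coordinates, adjacency).
      positions = {"s": (0, 0), "a": (1, 1), "b": (2, 0), "sink": (3, 1)}
      neighbors = {"s": ["a", "b"], "a": ["b", "sink"], "b": ["a", "sink"]}
      print(greedy_route("s", "sink", neighbors, positions))   # ['s', 'b', 'sink']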

  18. Dynamic whole-body PET parametric imaging: I. Concept, acquisition protocol optimization and clinical application.

    PubMed

    Karakatsanis, Nicolas A; Lodge, Martin A; Tahari, Abdel K; Zhou, Y; Wahl, Richard L; Rahmim, Arman

    2013-10-21

    Static whole-body PET/CT, employing the standardized uptake value (SUV), is considered the standard clinical approach to diagnosis and treatment response monitoring for a wide range of oncologic malignancies. Alternative PET protocols involving dynamic acquisition of temporal images have been implemented in the research setting, allowing quantification of tracer dynamics, an important capability for tumor characterization and treatment response monitoring. Nonetheless, dynamic protocols have been confined to single-bed-coverage limiting the axial field-of-view to ~15-20 cm, and have not been translated to the routine clinical context of whole-body PET imaging for the inspection of disseminated disease. Here, we pursue a transition to dynamic whole-body PET parametric imaging, by presenting, within a unified framework, clinically feasible multi-bed dynamic PET acquisition protocols and parametric imaging methods. We investigate solutions to address the challenges of: (i) long acquisitions, (ii) small number of dynamic frames per bed, and (iii) non-invasive quantification of kinetics in the plasma. In the present study, a novel dynamic (4D) whole-body PET acquisition protocol of ~45 min total length is presented, composed of (i) an initial 6 min dynamic PET scan (24 frames) over the heart, followed by (ii) a sequence of multi-pass multi-bed PET scans (six passes × seven bed positions, each scanned for 45 s). Standard Patlak linear graphical analysis modeling was employed, coupled with image-derived plasma input function measurements. Ordinary least squares Patlak estimation was used as the baseline regression method to quantify the physiological parameters of tracer uptake rate Ki and total blood distribution volume V on an individual voxel basis. Extensive Monte Carlo simulation studies, using a wide set of published kinetic FDG parameters and GATE and XCAT platforms, were conducted to optimize the acquisition protocol from a range of ten different clinically acceptable sampling schedules examined. The framework was also applied to six FDG PET patient studies, demonstrating clinical feasibility. Both simulated and clinical results indicated enhanced contrast-to-noise ratios (CNRs) for Ki images in tumor regions with notable background FDG concentration, such as the liver, where SUV performed relatively poorly. Overall, the proposed framework enables enhanced quantification of physiological parameters across the whole body. In addition, the total acquisition length can be reduced from 45 to ~35 min and still achieve improved or equivalent CNR compared to SUV, provided the true Ki contrast is sufficiently high. In the follow-up companion paper, a set of advanced linear regression schemes is presented to particularly address the presence of noise, and attempt to achieve a better trade-off between the mean-squared error and the CNR metrics, resulting in enhanced task-based imaging.
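
    The Patlak step described above reduces to a straight-line fit: after an equilibration time t*, the ratio C_T(t)/C_p(t) plotted against the normalized time ∫C_p dτ / C_p(t) has slope Ki and intercept V. The Python sketch below performs that ordinary least squares fit for a single voxel's time-activity curve; the frame times and curves are synthetic placeholders, not patient data.

      import numpy as np

      def patlak_ols(t, cp, ct, t_star=10.0):
          """Ordinary least squares Patlak fit for a single voxel.
          t : frame mid-times (min), cp : plasma input function, ct : tissue TAC.
          Returns (Ki, V) estimated from frames with t >= t_star."""
          cp_int = np.concatenate(([0.0], np.cumsum(np.diff(t) * 0.5 * (cp[1:] + cp[:-1]))))
          mask = t >= t_star
          x = cp_int[mask] / cp[mask]          # "normalized time" (Patlak abscissa)
          y = ct[mask] / cp[mask]              # normalized tissue activity
          Ki, V = np.polyfit(x, y, 1)          # slope = Ki, intercept = V
          return Ki, V

      # Synthetic example: decaying plasma input and a tissue curve with Ki=0.02, V=0.5.
      t = np.linspace(0.5, 45, 40)
      cp = 10.0 * np.exp(-0.1 * t) + 1.0
      cp_int = np.concatenate(([0.0], np.cumsum(np.diff(t) * 0.5 * (cp[1:] + cp[:-1]))))
      ct = 0.02 * cp_int + 0.5 * cp
      print(patlak_ols(t, cp, ct))             # approximately (0.02, 0.5)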

  19. Dynamic whole body PET parametric imaging: I. Concept, acquisition protocol optimization and clinical application

    PubMed Central

    Karakatsanis, Nicolas A.; Lodge, Martin A.; Tahari, Abdel K.; Zhou, Y.; Wahl, Richard L.; Rahmim, Arman

    2013-01-01

    Static whole body PET/CT, employing the standardized uptake value (SUV), is considered the standard clinical approach to diagnosis and treatment response monitoring for a wide range of oncologic malignancies. Alternative PET protocols involving dynamic acquisition of temporal images have been implemented in the research setting, allowing quantification of tracer dynamics, an important capability for tumor characterization and treatment response monitoring. Nonetheless, dynamic protocols have been confined to single bed-coverage limiting the axial field-of-view to ~15–20 cm, and have not been translated to the routine clinical context of whole-body PET imaging for the inspection of disseminated disease. Here, we pursue a transition to dynamic whole body PET parametric imaging, by presenting, within a unified framework, clinically feasible multi-bed dynamic PET acquisition protocols and parametric imaging methods. We investigate solutions to address the challenges of: (i) long acquisitions, (ii) small number of dynamic frames per bed, and (iii) non-invasive quantification of kinetics in the plasma. In the present study, a novel dynamic (4D) whole body PET acquisition protocol of ~45min total length is presented, composed of (i) an initial 6-min dynamic PET scan (24 frames) over the heart, followed by (ii) a sequence of multi-pass multi-bed PET scans (6 passes x 7 bed positions, each scanned for 45sec). Standard Patlak linear graphical analysis modeling was employed, coupled with image-derived plasma input function measurements. Ordinary least squares (OLS) Patlak estimation was used as the baseline regression method to quantify the physiological parameters of tracer uptake rate Ki and total blood distribution volume V on an individual voxel basis. Extensive Monte Carlo simulation studies, using a wide set of published kinetic FDG parameters and GATE and XCAT platforms, were conducted to optimize the acquisition protocol from a range of 10 different clinically acceptable sampling schedules examined. The framework was also applied to six FDG PET patient studies, demonstrating clinical feasibility. Both simulated and clinical results indicated enhanced contrast-to-noise ratios (CNRs) for Ki images in tumor regions with notable background FDG concentration, such as the liver, where SUV performed relatively poorly. Overall, the proposed framework enables enhanced quantification of physiological parameters across the whole-body. In addition, the total acquisition length can be reduced from 45min to ~35min and still achieve improved or equivalent CNR compared to SUV, provided the true Ki contrast is sufficiently high. In the follow-up companion paper, a set of advanced linear regression schemes is presented to particularly address the presence of noise, and attempt to achieve a better trade-off between the mean-squared error (MSE) and the CNR metrics, resulting in enhanced task-based imaging. PMID:24080962

  20. Dynamic whole-body PET parametric imaging: I. Concept, acquisition protocol optimization and clinical application

    NASA Astrophysics Data System (ADS)

    Karakatsanis, Nicolas A.; Lodge, Martin A.; Tahari, Abdel K.; Zhou, Y.; Wahl, Richard L.; Rahmim, Arman

    2013-10-01

    Static whole-body PET/CT, employing the standardized uptake value (SUV), is considered the standard clinical approach to diagnosis and treatment response monitoring for a wide range of oncologic malignancies. Alternative PET protocols involving dynamic acquisition of temporal images have been implemented in the research setting, allowing quantification of tracer dynamics, an important capability for tumor characterization and treatment response monitoring. Nonetheless, dynamic protocols have been confined to single-bed-coverage limiting the axial field-of-view to ˜15-20 cm, and have not been translated to the routine clinical context of whole-body PET imaging for the inspection of disseminated disease. Here, we pursue a transition to dynamic whole-body PET parametric imaging, by presenting, within a unified framework, clinically feasible multi-bed dynamic PET acquisition protocols and parametric imaging methods. We investigate solutions to address the challenges of: (i) long acquisitions, (ii) small number of dynamic frames per bed, and (iii) non-invasive quantification of kinetics in the plasma. In the present study, a novel dynamic (4D) whole-body PET acquisition protocol of ˜45 min total length is presented, composed of (i) an initial 6 min dynamic PET scan (24 frames) over the heart, followed by (ii) a sequence of multi-pass multi-bed PET scans (six passes × seven bed positions, each scanned for 45 s). Standard Patlak linear graphical analysis modeling was employed, coupled with image-derived plasma input function measurements. Ordinary least squares Patlak estimation was used as the baseline regression method to quantify the physiological parameters of tracer uptake rate Ki and total blood distribution volume V on an individual voxel basis. Extensive Monte Carlo simulation studies, using a wide set of published kinetic FDG parameters and GATE and XCAT platforms, were conducted to optimize the acquisition protocol from a range of ten different clinically acceptable sampling schedules examined. The framework was also applied to six FDG PET patient studies, demonstrating clinical feasibility. Both simulated and clinical results indicated enhanced contrast-to-noise ratios (CNRs) for Ki images in tumor regions with notable background FDG concentration, such as the liver, where SUV performed relatively poorly. Overall, the proposed framework enables enhanced quantification of physiological parameters across the whole body. In addition, the total acquisition length can be reduced from 45 to ˜35 min and still achieve improved or equivalent CNR compared to SUV, provided the true Ki contrast is sufficiently high. In the follow-up companion paper, a set of advanced linear regression schemes is presented to particularly address the presence of noise, and attempt to achieve a better trade-off between the mean-squared error and the CNR metrics, resulting in enhanced task-based imaging.

  1. A Secure Routing Protocol for Wireless Sensor Networks Considering Secure Data Aggregation.

    PubMed

    Rahayu, Triana Mugia; Lee, Sang-Gon; Lee, Hoon-Jae

    2015-06-26

    The commonly unattended and hostile deployments of WSNs and their resource-constrained sensor devices have led to an increasing demand for secure energy-efficient protocols. Routing and data aggregation receive the most attention since they are among the daily network routines. With the awareness of such demand, we found that so far there has been no work that lays out a secure routing protocol as the foundation for a secure data aggregation protocol. We argue that the secure routing role would be rendered useless if the data aggregation scheme built on it is not secure. Conversely, the secure data aggregation protocol needs a secure underlying routing protocol as its foundation in order to be effectively optimal. As an attempt at a solution, we devise an energy-aware protocol based on LEACH and ESPDA that combines a secure routing protocol and a secure data aggregation protocol. We then evaluate its security effectiveness and its energy-efficiency aspects, knowing that there is always a trade-off between the two.
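
    Since the protocol above builds on LEACH, a minimal sketch of LEACH's randomized cluster-head election may be helpful: in each round, a node that has not recently served as cluster head becomes one with probability given by the standard LEACH threshold T(n) = p / (1 − p·(r mod 1/p)). The parameter values and node bookkeeping below are illustrative, and none of the ESPDA security machinery is represented.

      import random

      def leach_threshold(p, r):
          """Standard LEACH threshold T(n) for election probability p and round r."""
          return p / (1.0 - p * (r % int(round(1.0 / p))))

      def elect_cluster_heads(nodes, p, r, recently_head):
          heads = []
          for n in nodes:
              if n in recently_head:            # nodes that served in the last 1/p
                  continue                      # rounds are not eligible
              if random.random() < leach_threshold(p, r):
                  heads.append(n)
          return heads

      random.seed(1)
      nodes = [f"n{i}" for i in range(20)]
      print(elect_cluster_heads(nodes, p=0.1, r=3, recently_head={"n2", "n7"}))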

  2. A Secure Routing Protocol for Wireless Sensor Networks Considering Secure Data Aggregation

    PubMed Central

    Rahayu, Triana Mugia; Lee, Sang-Gon; Lee, Hoon-Jae

    2015-01-01

    The commonly unattended and hostile deployments of WSNs and their resource-constrained sensor devices have led to an increasing demand for secure energy-efficient protocols. Routing and data aggregation receive the most attention since they are among the daily network routines. With the awareness of such demand, we found that so far there has been no work that lays out a secure routing protocol as the foundation for a secure data aggregation protocol. We argue that the secure routing role would be rendered useless if the data aggregation scheme built on it is not secure. Conversely, the secure data aggregation protocol needs a secure underlying routing protocol as its foundation in order to be effectively optimal. As an attempt at a solution, we devise an energy-aware protocol based on LEACH and ESPDA that combines a secure routing protocol and a secure data aggregation protocol. We then evaluate its security effectiveness and its energy-efficiency aspects, knowing that there is always a trade-off between the two. PMID:26131669

  3. An optimized IFN-γ ELISpot assay for the sensitive and standardized monitoring of CMV protein-reactive effector cells of cell-mediated immunity.

    PubMed

    Barabas, Sascha; Spindler, Theresa; Kiener, Richard; Tonar, Charlotte; Lugner, Tamara; Batzilla, Julia; Bendfeldt, Hanna; Rascle, Anne; Asbach, Benedikt; Wagner, Ralf; Deml, Ludwig

    2017-03-07

    In healthy individuals, Cytomegalovirus (CMV) infection is efficiently controlled by CMV-specific cell-mediated immunity (CMI). Functional impairment of CMI in immunocompromised individuals, however, can lead to uncontrolled CMV replication and severe clinical complications. Close monitoring of CMV-specific CMI is therefore clinically relevant and might allow a reliable prognosis of CMV disease as well as assist personalized therapeutic decisions. The objective of this work was the optimization and technical validation of an IFN-γ ELISpot assay for a standardized, sensitive and reliable quantification of CMV-reactive effector cells. T-activated® immunodominant CMV IE-1 and pp65 proteins were used as stimulants. All basic assay parameters and reagents were tested and optimized to establish a user-friendly protocol and maximize the signal-to-noise ratio of the ELISpot assay. The optimized and standardized ELISpot assay showed low intra-assay, inter-assay and inter-operator variability (coefficient of variation, CV, below 22%), and the inter-site CV was lower than 40%. Good assay linearity was obtained between 6 × 10⁴ and 2 × 10⁵ PBMC per well upon stimulation with T-activated® IE-1 (R² = 0.97) and pp65 (R² = 0.99) antigens. Remarkably, stimulation of peripheral blood mononuclear cells (PBMC) with T-activated® IE-1 and pp65 proteins resulted in the activation of a broad range of CMV-reactive effector cells, including CD3+CD4+ (Th), CD3+CD8+ (CTL), CD3−CD56+ (NK) and CD3+CD56+ (NKT-like) cells. Accordingly, the optimized IFN-γ ELISpot assay revealed very high sensitivity (97%) in a cohort of 45 healthy donors, of which 32 were CMV IgG-seropositive. The combined use of T-activated® IE-1 and pp65 proteins for the stimulation of PBMC with the optimized IFN-γ ELISpot assay represents a highly standardized, valuable tool to monitor the functionality of CMV-specific CMI with great sensitivity and reliability.
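
    The precision and linearity figures quoted above boil down to two routine calculations: the coefficient of variation of replicate spot counts, and the R² of spot counts versus cell input. The snippet below computes both for made-up replicate data; the numbers are placeholders, not results from the study.

      import numpy as np

      def percent_cv(replicates):
          r = np.asarray(replicates, dtype=float)
          return 100.0 * r.std(ddof=1) / r.mean()

      # Hypothetical triplicate spot counts for one donor/antigen across one run.
      intra_assay = [142, 155, 149]
      print(f"intra-assay CV: {percent_cv(intra_assay):.1f}%")

      # Hypothetical linearity check: spot counts versus PBMC input per well.
      cells = np.array([6e4, 1e5, 1.5e5, 2e5])
      spots = np.array([48, 81, 118, 160])
      slope, intercept = np.polyfit(cells, spots, 1)
      predicted = slope * cells + intercept
      r_squared = 1 - np.sum((spots - predicted) ** 2) / np.sum((spots - spots.mean()) ** 2)
      print(f"linearity R^2: {r_squared:.3f}")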

  4. SU-F-207-02: Use of Postmortem Subjects for Subjective Image Quality Assessment in Abdominal CT Protocols with Iterative Reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mench, A; Lipnharski, I; Carranza, C

    Purpose: New radiation dose reduction technologies are emerging constantly in the medical imaging field. The latest of these technologies, iterative reconstruction (IR) in CT, presents the ability to reduce dose significantly and hence provides great opportunity for CT protocol optimization. However, without effective analysis of image quality, the reduction in radiation exposure becomes irrelevant. This work explores the use of postmortem subjects as an image quality assessment medium for protocol optimizations in abdominal CT. Methods: Three female postmortem subjects were scanned using the Abdomen-Pelvis (AP) protocol at reduced minimum tube current and target noise index (SD) settings of 12.5, 17.5, 20.0, and 25.0. Images were reconstructed using two strengths of iterative reconstruction. Radiologists and radiology residents from several subspecialties were asked to evaluate 8 AP image sets including the current facility default scan protocol and 7 scans with the parameters varied as listed above. Images were viewed in the soft tissue window and scored on a 3-point scale as acceptable, borderline acceptable, and unacceptable for diagnosis. The facility default AP scan was identified to the reviewer while the 7 remaining AP scans were randomized and de-identified of acquisition and reconstruction details. The observers were also asked to comment on the subjective image quality criteria they used for scoring images. This included visibility of specific anatomical structures and tissue textures. Results: Radiologists scored images as acceptable or borderline acceptable for target noise index settings of up to 20. Due to the postmortem subjects’ close representation of living human anatomy, readers were able to evaluate images as they would those of actual patients. Conclusion: Postmortem subjects have already been proven useful for direct CT organ dose measurements. This work illustrates the validity of their use for the crucial evaluation of image quality during CT protocol optimization, especially when investigating the effects of new technologies.

  5. Optimal molecular profiling of tissue and tissue components: defining the best processing and microdissection methods for biomedical applications.

    PubMed

    Rodriguez-Canales, Jaime; Hanson, Jeffrey C; Hipp, Jason D; Balis, Ulysses J; Tangrea, Michael A; Emmert-Buck, Michael R; Bova, G Steven

    2013-01-01

    Isolation of well-preserved pure cell populations is a prerequisite for sound studies of the molecular basis of any tissue-based biological phenomenon. This updated chapter reviews current methods for obtaining anatomically specific signals from molecules isolated from tissues, a basic requirement for productive linking of phenotype and genotype. The quality of samples isolated from tissue and used for molecular analysis is often glossed over or omitted from publications, making interpretation and replication of data difficult or impossible. Fortunately, recently developed techniques allow life scientists to better document and control the quality of samples used for a given assay, creating a foundation for improvement in this area. Tissue processing for molecular studies usually involves some or all of the following steps: tissue collection, gross dissection/identification, fixation, processing/embedding, storage/archiving, sectioning, staining, microdissection/annotation, and pure analyte labeling/identification and quantification. We provide a detailed comparison of some current tissue microdissection technologies and provide detailed example protocols for tissue component handling upstream and downstream from microdissection. We also discuss some of the physical and chemical issues related to optimal tissue processing and include methods specific to cytology specimens. We encourage each laboratory to use these as a starting point for optimization of their overall process of moving from collected tissue to high-quality, appropriately anatomically tagged scientific results. Improvement in this area will significantly increase life science quality and productivity. The chapter is divided into introduction, materials, protocols, and notes subheadings. Because many protocols are covered in each of these sections, information relating to a single protocol is not contiguous. To get the greatest benefit from this chapter, readers are advised to read through the entire chapter first, identify protocols appropriate to their laboratory for each step in their workflow, and then reread entries in each section pertaining to each of these single protocols.

  6. The cryopreservation protocol optimal for progenitor recovery is not optimal for preservation of marrow repopulating ability.

    PubMed

    Balint, B; Ivanović, Z; Petakov, M; Taseski, J; Jovcić, G; Stojanović, N; Milenković, P

    1999-03-01

    The efficiency of five different cryopreservation protocols (our original controlled-rate and noncontrolled-rate protocols) was evaluated on the basis of the recovery after thawing of very primitive pluripotent hemopoietic stem cells (MRA(CFU-GM), pluripotent progenitors (CFU-Sd12) and committed granulocyte-monocyte progenitors (CFU-GM) in mouse bone marrow. Although the nucleated cell recovery and viability determined immediately after the thawing and washing of the cells were found to be similar, whether controlled-rate or noncontrolled-rate cryopreservation protocols were used, the recovery of MRA(CFU-GM), CFU-Sd12 and CFU-GM varied depending on the type of protocol and the cryoprotector (DMSO) concentrations used. It was shown that the controlled-rate protocol was more efficient, enabling better MRA(CFU-GM), CFU-Sd12 and CFU-GM recovery from frozen samples. The most efficient was the controlled-rate protocol of cryopreservation designed to compensate for the release of fusion heat, which enabled a better survival of CFU-Sd12 and CFU-GM when combined with a lower (5%) DMSO concentration. On the contrary, a satisfactory survival rate of very primitive stem cells (MRA(CFU-GM)) was achieved only when 10% DMSO was included with a five-step protocol of cryopreservation. These results point to adequately used controlled-rate freezing as essential for a highly efficient cryopreservation of some of the categories of hematopoietic stem and progenitor cells. At the same time, it was obvious that a higher DMSO concentration was necessary for the cryopreservation of very primitive stem cells, but not, however, for more mature progenitor cells (CFU-S, CFU-GM). These results imply the existence of a mechanism that decreases the intracellular concentration of DMSO in primitive MRA cells, which is not the case for less primitive progenitors.

  7. Peptide-MHC Class I Tetramers Can Fail To Detect Relevant Functional T Cell Clonotypes and Underestimate Antigen-Reactive T Cell Populations.

    PubMed

    Rius, Cristina; Attaf, Meriem; Tungatt, Katie; Bianchi, Valentina; Legut, Mateusz; Bovay, Amandine; Donia, Marco; Thor Straten, Per; Peakman, Mark; Svane, Inge Marie; Ott, Sascha; Connor, Tom; Szomolay, Barbara; Dolton, Garry; Sewell, Andrew K

    2018-04-01

    Peptide-MHC (pMHC) multimers, usually used as streptavidin-based tetramers, have transformed the study of Ag-specific T cells by allowing direct detection, phenotyping, and enumeration within polyclonal T cell populations. These reagents are now a standard part of the immunology toolkit and have been used in many thousands of published studies. Unfortunately, the TCR-affinity threshold required for staining with standard pMHC multimer protocols is higher than that required for efficient T cell activation. This discrepancy makes it possible for pMHC multimer staining to miss fully functional T cells, especially where low-affinity TCRs predominate, such as in MHC class II-restricted responses or those directed against self-antigens. Several recent, somewhat alarming, reports indicate that pMHC staining might fail to detect the majority of functional T cells and have prompted suggestions that T cell immunology has become biased toward the type of cells amenable to detection with multimeric pMHC. We use several viral- and tumor-specific pMHC reagents to compare populations of human T cells stained by standard pMHC protocols and optimized protocols that we have developed. Our results confirm that optimized protocols recover greater populations of T cells that include fully functional T cell clonotypes that cannot be stained by regular pMHC-staining protocols. These results highlight the importance of using optimized procedures that include the use of protein kinase inhibitor and Ab cross-linking during staining to maximize the recovery of Ag-specific T cells and serve to further highlight that many previous quantifications of T cell responses with pMHC reagents are likely to have considerably underestimated the size of the relevant populations. Copyright © 2018 The Authors.

  8. Economic comparison of common treatment protocols and J5 vaccination for clinical mastitis in dairy herds using optimized culling decisions.

    PubMed

    Kessels, J A; Cha, E; Johnson, S K; Welcome, F L; Kristensen, A R; Gröhn, Y T

    2016-05-01

    This study used an existing dynamic optimization model to compare costs of common treatment protocols and J5 vaccination for clinical mastitis in US dairy herds. Clinical mastitis is an infection of the mammary gland causing major economic losses in dairy herds due to reduced milk production, reduced conception, and increased risk of mortality and culling for infected cows. Treatment protocols were developed to reflect common practices in dairy herds. These included targeted therapy following pathogen identification, and therapy without pathogen identification using a broad-spectrum antimicrobial or treating with the cheapest treatment option. The cost-benefit of J5 vaccination was also estimated. Effects of treatment were accounted for as changes in treatment costs, milk loss due to mastitis, milk discarded due to treatment, and mortality. Following ineffective treatments, secondary decisions included extending the current treatment, alternative treatment, discontinuing treatment, and pathogen identification followed by recommended treatment. Average net returns for treatment protocols and vaccination were generated using an existing dynamic programming model. This model incorporates cow and pathogen characteristics to optimize management decisions to treat, inseminate, or cull cows. Of the treatment protocols where 100% of cows received recommended treatment, pathogen-specific identification followed by recommended therapy yielded the highest average net returns per cow per year. Out of all treatment scenarios, the highest net returns were achieved with selecting the cheapest treatment option and discontinuing treatment, or alternate treatment with a similar spectrum therapy; however, this may not account for the full consequences of giving nonrecommended therapies to cows with clinical mastitis. Vaccination increased average net returns in all scenarios. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  9. An Individualized, Perception-Based Protocol to Investigate Human Physiological Responses to Cooling

    PubMed Central

    Coolbaugh, Crystal L.; Bush, Emily C.; Galenti, Elizabeth S.; Welch, E. Brian; Towse, Theodore F.

    2018-01-01

    Cold exposure, a known stimulant of the thermogenic effects of brown adipose tissue (BAT), is the most widely used method to study BAT physiology in adult humans. Recently, individualized cooling has been recommended to standardize the physiological cold stress applied across participants, but critical experimental details remain unclear. The purpose of this work was to develop a detailed methodology for an individualized, perception-based protocol to investigate human physiological responses to cooling. Participants were wrapped in two water-circulating blankets and fitted with skin temperature probes to estimate BAT activity and peripheral vasoconstriction. We created a thermoesthesia graphical user interface (tGUI) to continuously record the subject's perception of cooling and shivering status during the cooling protocol. The protocol began with a 15 min thermoneutral phase followed by a series of 10 min cooling phases and concluded when sustained shivering (>1 min duration) occurred. Researchers used perception of cooling feedback (tGUI ratings) to manually adjust and personalize the water temperature at each cooling phase. Blanket water temperatures were recorded continuously during the protocol. Twelve volunteers (ages: 26.2 ± 1.4 years; 25% female) completed a feasibility study to evaluate the proposed protocol. Water temperature, perception of cooling, and shivering varied considerably across participants in response to cooling. Mean clavicle skin temperature, a surrogate measure of BAT activity, decreased (−0.99°C, 95% CI: −1.7 to −0.25°C, P = 0.16) after the cooling protocol, but an increase in supraclavicular skin temperature was observed in 4 participants. A strong positive correlation was also found between thermoesthesia and peripheral vasoconstriction (ρ = 0.84, P < 0.001). The proposed individualized, perception-based protocol therefore has potential to investigate the physiological responses to cold stress applied across populations with varying age, sex, body composition, and cold sensitivity characteristics. PMID:29593558
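
    The phase logic of the protocol described above can be sketched as a simple loop: after a thermoneutral phase, repeated 10 min cooling phases adjust the water temperature from the participant's perception-of-cooling rating and the run ends once sustained shivering is reported. The temperature steps and rating scale in this Python sketch are illustrative assumptions, not the values used with the tGUI.

      def next_water_temp(current_temp, cooling_rating, min_temp=10.0):
          """Lower the blanket water temperature more aggressively when the
          participant reports little perceived cooling (rating on a 0-10 scale)."""
          step = 4.0 if cooling_rating < 3 else 2.0 if cooling_rating < 7 else 1.0
          return max(min_temp, current_temp - step)

      def run_protocol(ratings_per_phase, shiver_flags, start_temp=32.0):
          temp, log = start_temp, []
          for rating, sustained_shiver in zip(ratings_per_phase, shiver_flags):
              log.append(temp)
              if sustained_shiver:              # >1 min of shivering ends the protocol
                  break
              temp = next_water_temp(temp, rating)
          return log

      # Hypothetical session: ratings rise as the water cools; shivering in phase 5.
      print(run_protocol([1, 3, 5, 7, 8], [False, False, False, False, True]))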

  10. Experimental high-speed network

    NASA Astrophysics Data System (ADS)

    McNeill, Kevin M.; Klein, William P.; Vercillo, Richard; Alsafadi, Yasser H.; Parra, Miguel V.; Dallas, William J.

    1993-09-01

    Many existing local area networking protocols currently applied in medical imaging were originally designed for relatively low-speed, low-volume networking. These protocols utilize small packet sizes appropriate for text based communication. Local area networks of this type typically provide raw bandwidth under 125 MHz. These older network technologies are not optimized for the low delay, high data traffic environment of a totally digital radiology department. Some current implementations use point-to-point links when greater bandwidth is required. However, the use of point-to-point communications for a total digital radiology department network presents many disadvantages. This paper describes work on an experimental multi-access local area network called XFT. The work includes the protocol specification, and the design and implementation of network interface hardware and software. The protocol specifies the Physical and Data Link layers (OSI layers 1 & 2) for a fiber-optic based token ring providing a raw bandwidth of 500 MHz. The protocol design and implementation of the XFT interface hardware includes many features to optimize image transfer and provide flexibility for additional future enhancements which include: a modular hardware design supporting easy portability to a variety of host system buses, a versatile message buffer design providing 16 MB of memory, and the capability to extend the raw bandwidth of the network to 3.0 GHz.

  11. Cross-layer protocols optimized for real-time multimedia services in energy-constrained mobile ad hoc networks

    NASA Astrophysics Data System (ADS)

    Hortos, William S.

    2003-07-01

    Mobile ad hoc networking (MANET) supports self-organizing, mobile infrastructures and enables an autonomous network of mobile nodes that can operate without a wired backbone. Ad hoc networks are characterized by multihop, wireless connectivity via packet radios and by the need for efficient dynamic protocols. All routers are mobile and can establish connectivity with other nodes only when they are within transmission range. Importantly, ad hoc wireless nodes are resource-constrained, having limited processing, memory, and battery capacity. Delivery of high quality-of-service (QoS), real-time multimedia services from Internet-based applications over a MANET is a challenge not yet met by proposed Internet Engineering Task Force (IETF) ad hoc network protocols in terms of standard performance metrics such as end-to-end throughput, packet error rate, and delay. In the distributed operations of route discovery and maintenance, strong interaction occurs across MANET protocol layers, in particular, the physical, media access control (MAC), network, and application layers. The QoS requirements are specified for the service classes by the application layer. The cross-layer design must also satisfy the battery-limited energy constraints, by minimizing the distributed power consumption at the nodes and of selected routes. Interactions across the layers are modeled in terms of the set of concatenated design parameters including associated energy costs. Functional dependencies of the QoS metrics are described in terms of the concatenated control parameters. New cross-layer designs are sought that optimize layer interdependencies to achieve the "best" QoS available in an energy-constrained, time-varying network. The protocol design, based on a reactive MANET protocol, adapts the provisioned QoS to dynamic network conditions and residual energy capacities. The cross-layer optimization is based on stochastic dynamic programming conditions derived from time-dependent models of MANET packet flows. Regulation of network behavior is modeled by the optimal control of the conditional rates of multivariate point processes (MVPPs); these rates depend on the concatenated control parameters through a change of probability measure. The MVPP models capture the behavior of many service applications, e.g., voice, video and the self-similar behavior of Internet data sessions. Performance verification of the cross-layer protocols, derived from the dynamic programming conditions, can be achieved by embedding the conditions in a reactive routing protocol for MANETs, in a simulation environment such as the wireless extension of ns-2. A canonical MANET scenario consists of a distributed collection of battery-powered laptops or hand-held terminals, capable of hosting multimedia applications. Simulation details and performance tradeoffs, not presented here, remain for a sequel to this paper.

  12. Efficient Online Optimized Quantum Control for Adiabatic Quantum Computation

    NASA Astrophysics Data System (ADS)

    Quiroz, Gregory

    Adiabatic quantum computation (AQC) relies on controlled adiabatic evolution to implement a quantum algorithm. While control evolution can take many forms, properly designed time-optimal control has been shown to be particularly advantageous for AQC. Grover's search algorithm is one such example where analytically-derived time-optimal control leads to improved scaling of the minimum energy gap between the ground state and first excited state and thus, the well-known quadratic quantum speedup. Analytical extensions beyond Grover's search algorithm present a daunting task that requires potentially intractable calculations of energy gaps and a significant degree of model certainty. Here, an in situ quantum control protocol is developed for AQC. The approach is shown to yield controls that approach the analytically-derived time-optimal controls for Grover's search algorithm. In addition, the protocol's convergence rate as a function of iteration number is shown to be essentially independent of system size. Thus, the approach is potentially scalable to many-qubit systems.

  13. Optimal control of complex atomic quantum systems

    PubMed Central

    van Frank, S.; Bonneau, M.; Schmiedmayer, J.; Hild, S.; Gross, C.; Cheneau, M.; Bloch, I.; Pichler, T.; Negretti, A.; Calarco, T.; Montangero, S.

    2016-01-01

    Quantum technologies will ultimately require manipulating many-body quantum systems with high precision. Cold atom experiments represent a stepping stone in that direction: a high degree of control has been achieved on systems of increasing complexity. However, this control is still sub-optimal. In many scenarios, achieving a fast transformation is crucial to fight against decoherence and imperfection effects. Optimal control theory is believed to be the ideal candidate to bridge the gap between early stage proof-of-principle demonstrations and experimental protocols suitable for practical applications. Indeed, it can engineer protocols at the quantum speed limit – the fastest achievable timescale of the transformation. Here, we demonstrate such potential by computing theoretically and verifying experimentally the optimal transformations in two very different interacting systems: the coherent manipulation of motional states of an atomic Bose-Einstein condensate and the crossing of a quantum phase transition in small systems of cold atoms in optical lattices. We also show that such processes are robust with respect to perturbations, including temperature and atom number fluctuations. PMID:27725688

  14. Optimal control of complex atomic quantum systems.

    PubMed

    van Frank, S; Bonneau, M; Schmiedmayer, J; Hild, S; Gross, C; Cheneau, M; Bloch, I; Pichler, T; Negretti, A; Calarco, T; Montangero, S

    2016-10-11

    Quantum technologies will ultimately require manipulating many-body quantum systems with high precision. Cold atom experiments represent a stepping stone in that direction: a high degree of control has been achieved on systems of increasing complexity. However, this control is still sub-optimal. In many scenarios, achieving a fast transformation is crucial to fight against decoherence and imperfection effects. Optimal control theory is believed to be the ideal candidate to bridge the gap between early stage proof-of-principle demonstrations and experimental protocols suitable for practical applications. Indeed, it can engineer protocols at the quantum speed limit - the fastest achievable timescale of the transformation. Here, we demonstrate such potential by computing theoretically and verifying experimentally the optimal transformations in two very different interacting systems: the coherent manipulation of motional states of an atomic Bose-Einstein condensate and the crossing of a quantum phase transition in small systems of cold atoms in optical lattices. We also show that such processes are robust with respect to perturbations, including temperature and atom number fluctuations.

  15. Optimization of the primary recovery of human interferon alpha2b from Escherichia coli inclusion bodies.

    PubMed

    Valente, C A; Monteiro, G A; Cabral, J M S; Fevereiro, M; Prazeres, D M F

    2006-01-01

    The human interferon alpha2b (hu-IFNalpha2b) gene was cloned in Escherichia coli JM109(DE3) and the recombinant protein was expressed as cytoplasmic inclusion bodies (IB). The present work discusses the recovery of hu-IFNalpha2b IB from the E. coli cells. An optimized protocol is proposed based on the sequential evaluation of recovery steps and parameters: (i) cell disruption, (ii) IB recovery and separation from cell debris, (iii) IB washing, and (iv) IB solubilization. Parameters such as hu-IFNalpha2b purity and recovery yield were measured after each step. The optimized recovery protocol yielded 60% of hu-IFNalpha2b with a purity of up to 80%. The protein was renatured at high concentration after recovery and it was found to display biological activity.

  16. Optimal Verification of Entangled States with Local Measurements

    NASA Astrophysics Data System (ADS)

    Pallister, Sam; Linden, Noah; Montanaro, Ashley

    2018-04-01

    Consider the task of verifying that a given quantum device, designed to produce a particular entangled state, does indeed produce that state. One natural approach would be to characterize the output state by quantum state tomography, or alternatively, to perform some kind of Bell test, tailored to the state of interest. We show here that neither approach is optimal among local verification strategies for 2-qubit states. We find the optimal strategy in this case and show that quadratically fewer total measurements are needed to verify to within a given fidelity than in published results for quantum state tomography, Bell test, or fidelity estimation protocols. We also give efficient verification protocols for any stabilizer state. Additionally, we show that requiring that the strategy be constructed from local, nonadaptive, and noncollective measurements only incurs a constant-factor penalty over a strategy without these restrictions.

  17. [Testicular cancer: a model to optimize the radiological follow-up].

    PubMed

    Stebler, V; Pauchard, B; Schmidt, S; Valerio, M; De Bari, B; Berthold, D

    2015-05-20

    Despite being rare cancers, testicular seminoma and non-seminoma play an important role in oncology: they represent a model of how to optimize radiological follow-up, aiming at the lowest possible radiation exposure and secondary cancer risk. Males diagnosed with testicular cancer frequently undergo prolonged follow-up with CT scans, with potential toxic side effects, in particular secondary cancers. To reduce the risks linked to ionizing radiation, precise follow-up protocols have been developed, and the number of recommended CT scans has been significantly reduced over the last 10 years. CT scanners have also evolved technically, and new acquisition protocols have the potential to further reduce radiation exposure.

  18. MANEMO Routing in Practice: Protocol Selection, Expected Performance, and Experimental Evaluation

    NASA Astrophysics Data System (ADS)

    Tazaki, Hajime; van Meter, Rodney; Wakikawa, Ryuji; Wongsaardsakul, Thirapon; Kanchanasut, Kanchana; Dias de Amorim, Marcelo; Murai, Jun

    Motivated by the deployment of post-disaster MANEMO (MANET for NEMO) composed of mobile routers and stations, we evaluate two candidate routing protocols through network simulation, theoretical performance analysis, and field experiments. The first protocol is the widely adopted Optimized Link State Routing protocol (OLSR) and the second is the combination of the Tree Discovery Protocol (TDP) with Network In Node Advertisement (NINA). To the best of our knowledge, this is the first time that these two protocols are compared in both theoretical and practical terms. We focus on the control overhead generated when mobile routers perform a handover. Our results confirm the correctness and operational robustness of both protocols. More interestingly, although in the general case OLSR leads to better results, TDP/NINA outperforms OLSR both in the case of sparse networks and in highly mobile networks, which correspond to the operation point of a large set of post-disaster scenarios.

  19. Comparison of different tissue clearing methods and 3D imaging techniques for visualization of GFP-expressing mouse embryos and embryonic hearts.

    PubMed

    Kolesová, Hana; Čapek, Martin; Radochová, Barbora; Janáček, Jiří; Sedmera, David

    2016-08-01

    Our goal was to find an optimal tissue clearing protocol for whole-mount imaging of embryonic and adult hearts and whole embryos of transgenic mice that would preserve green fluorescent protein (GFP) fluorescence and permit comparison of different currently available 3D imaging modalities. We tested various published organic solvent- or water-based clearing protocols intended to preserve GFP fluorescence in the central nervous system: tetrahydrofuran dehydration and dibenzylether protocol (DBE), SCALE, CLARITY, and CUBIC, and evaluated their ability to render hearts and whole embryos transparent. The DBE clearing protocol did not preserve GFP fluorescence; in addition, DBE caused considerable tissue-shrinking artifacts compared to the gold standard BABB protocol. The CLARITY method considerably improved tissue transparency at later stages, but also decreased GFP fluorescence intensity. The SCALE clearing resulted in sufficient tissue transparency up to ED12.5; at later stages the useful depth of imaging was limited by tissue light scattering. The best method for the cardiac specimens proved to be the CUBIC protocol, which preserved GFP fluorescence well and cleared the specimens sufficiently even at the adult stages. In addition, CUBIC decolorized the blood and myocardium by removing tissue iron. Good 3D renderings of whole fetal hearts and embryos were obtained with optical projection tomography and selective plane illumination microscopy, although at resolutions lower than with a confocal microscope. Comparison of five tissue clearing protocols and three imaging methods for the study of GFP mouse embryos and hearts shows that the optimal method depends on the stage and the level of detail required.

  20. Mathematical Model Formulation And Validation Of Water And Solute Transport In Whole Hamster Pancreatic Islets

    PubMed Central

    Benson, Charles T.; Critser, John K.

    2014-01-01

    Optimization of cryopreservation protocols for cells and tissues requires accurate models of heat and mass transport. Model selection often depends on the configuration of the tissue. Here, a mathematical and conceptual model of water and solute transport for whole hamster pancreatic islets has been developed and experimentally validated, incorporating fundamental biophysical data from previous studies on individual hamster islet cells while retaining whole-islet structural information. It describes coupled transport of water and solutes through the islet by three methods: intracellularly, intercellularly, and in combination. In particular, we use domain decomposition techniques to couple a transmembrane flux model with an interstitial mass transfer model. The only significant undetermined variable is the cellular surface area that is in contact with the intercellularly transported solutes, Ais. The model was validated and Ais determined using a 3 × 3 × 3 factorial experimental design blocked for experimental day. Whole-islet physical experiments were compared with model predictions at three temperatures, three perfusing solutions, and three islet size groups. A mean of 4.4 islets was compared at each of the 27 experimental conditions and found to correlate with a coefficient of determination of 0.87 ± 0.06 (mean ± S.D.). Only the treatment variable of perfusing solution was found to be significant (p < 0.05). We have devised a model that retains much of the intrinsic geometric configuration of the system, so that fewer laboratory experiments are needed to determine model parameters and thus to develop new optimized cryopreservation protocols. Additionally, extensions to ovarian follicles and other concentric tissue structures may be made. PMID:24950195
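
    The transmembrane part of a model like the one above is commonly written with the two-parameter formalism: cell water volume changes in proportion to the osmolality difference (hydraulic conductivity Lp), while a permeating solute crosses at rate Ps. The Python sketch below integrates that pair of equations for a single cell exposed to a DMSO-like solute; the parameter values, lumped constants and units are illustrative assumptions, not the whole-islet model of the paper.

      def two_parameter_model(t_end=10.0, dt=0.001):
          # Illustrative parameters (arbitrary but plausible units, lumped constants).
          Lp, Ps, A = 0.2, 0.05, 1.0        # water permeability, solute permeability, area
          Vw, Ns = 1.0, 0.0                 # osmotically active water volume, moles of CPA
          n_salt = 0.3                      # impermeant intracellular osmoles
          Me_salt, Me_cpa = 0.3, 1.0        # extracellular osmolalities (isotonic salt + CPA)

          history = []
          for step in range(int(t_end / dt)):
              Mi_cpa = Ns / Vw
              Mi_total = (n_salt / Vw) + Mi_cpa
              Me_total = Me_salt + Me_cpa
              dVw = Lp * A * (Mi_total - Me_total) * dt      # water follows the osmotic gradient
              dNs = Ps * A * (Me_cpa - Mi_cpa) * dt          # CPA follows its own gradient
              Vw, Ns = Vw + dVw, Ns + dNs
              history.append((step * dt, Vw))
          return history

      trace = two_parameter_model()
      print("minimum relative water volume: %.3f" % min(v for _, v in trace))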

  1. Short Term Reproducibility of a High Contrast 3-D Isotropic Optic Nerve Imaging Sequence in Healthy Controls.

    PubMed

    Harrigan, Robert L; Smith, Alex K; Mawn, Louise A; Smith, Seth A; Landman, Bennett A

    2016-02-27

    The optic nerve (ON) plays a crucial role in human vision transporting all visual information from the retina to the brain for higher order processing. There are many diseases that affect the ON structure such as optic neuritis, anterior ischemic optic neuropathy and multiple sclerosis. Because the ON is the sole pathway for visual information from the retina to areas of higher level processing, measures of ON damage have been shown to correlate well with visual deficits. Increased intracranial pressure has been shown to correlate with the size of the cerebrospinal fluid (CSF) surrounding the ON. These measures are generally taken at an arbitrary point along the nerve and do not account for changes along the length of the ON. We propose a high contrast and high-resolution 3-D acquired isotropic imaging sequence optimized for ON imaging. We have acquired scan-rescan data using the optimized sequence and a current standard of care protocol for 10 subjects. We show that this sequence has superior contrast-to-noise ratio to the current standard of care while achieving a factor of 11 higher resolution. We apply a previously published automatic pipeline to segment the ON and CSF sheath and measure the size of each individually. We show that these measures of ON size have lower short-term reproducibility than the population variance and the variability along the length of the nerve. We find that the proposed imaging protocol is (1) useful in detecting population differences and local changes and (2) a promising tool for investigating biomarkers related to structural changes of the ON.

  2. Optimal setups for forced-choice staircases with fixed step sizes.

    PubMed

    García-Pérez, M A

    2000-01-01

    Forced-choice staircases with fixed step sizes are used in a variety of formats whose relative merits have never been studied. This paper presents a comparative study aimed at determining their optimal format. Factors included in the study were the up/down rule, the length (number of reversals), and the size of the steps. The study also addressed the issue of whether a protocol involving three staircases running for N reversals each (with a subsequent average of the estimates provided by each individual staircase) has better statistical properties than an alternative protocol involving a single staircase running for 3N reversals. In all cases the size of a step up was different from that of a step down, in the appropriate ratio determined by García-Pérez (Vision Research, 1998, 38, 1861-1881). The results of a simulation study indicate that a) there are no conditions in which the 1-down/1-up rule is advisable; b) different combinations of up/down rule and number of reversals appear equivalent in terms of precision and cost; c) using a single long staircase with 3N reversals is more efficient than running three staircases with N reversals each; d) to avoid bias and attain sufficient accuracy, threshold estimates should be based on at least 30 reversals; and e) to avoid excessive cost and imprecision, the size of the step up should be between 2/3 and 3/3 the (known or presumed) spread of the psychometric function. An empirical study with human subjects confirmed the major characteristics revealed by the simulations.
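
    To make the design factors above concrete, the sketch below simulates a fixed-step-size, 2-down/1-up forced-choice staircase against a logistic psychometric function, with unequal up and down steps and termination after a fixed number of reversals; the psychometric parameters and step sizes are illustrative assumptions, not the specific values recommended in the paper.

      import math
      import random

      def p_correct(x, threshold=0.0, slope=2.0, guess=0.5):
          """2AFC logistic psychometric function with a lower asymptote at chance."""
          return guess + (1.0 - guess) / (1.0 + math.exp(-slope * (x - threshold)))

      def run_staircase(step_up=0.4, step_down=0.3, n_reversals=30, start=2.0, rule_down=2):
          random.seed(0)
          x, streak, last_dir = start, 0, 0
          reversals, trials = [], 0
          while len(reversals) < n_reversals:
              trials += 1
              correct = random.random() < p_correct(x)
              if correct:
                  streak += 1
                  if streak < rule_down:
                      continue                  # wait for two consecutive correct trials
                  direction, streak = -1, 0
                  x -= step_down                # 2-down: make the task harder
              else:
                  direction, streak = +1, 0
                  x += step_up                  # 1-up: make the task easier
              if last_dir and direction != last_dir:
                  reversals.append(x)           # record stimulus level at each reversal
              last_dir = direction
          return sum(reversals) / len(reversals), trials

      estimate, n_trials = run_staircase()
      print(f"threshold estimate ~ {estimate:.2f} after {n_trials} trials")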

  3. Immunohistochemistry for the detection of neural and inflammatory cells in equine brain tissue

    PubMed Central

    Liu, Junjie; Herrington, Jenna M.; Vallario, Kelsey

    2016-01-01

    Phenotypic characterization of cellular responses in equine infectious encephalitides has included only limited description of both peripheral and resident cell populations in central nervous system (CNS) tissues, owing to the scarcity of species-specific reagents that react with formalin-fixed, paraffin-embedded (FFPE) tissue. This study identified a set of antibodies for investigating the immunopathology of infectious CNS diseases in horses. Multiple commercially available staining reagents and antibodies derived from antigens of various species for manual immunohistochemistry (IHC) were screened. Several techniques and reagents for heat-induced antigen retrieval, non-specific protein blocking, endogenous peroxidase blocking, and visualization-detection systems were tested during IHC protocol development. Boiling of slides in a low pH, citrate-based buffer solution in a double-boiler system was most consistent for epitope retrieval. Pressure-cooking, microwaving, high pH buffers, and proteinase K solutions often resulted in tissue disruption or no reactivity. Optimal blocking reagents and concentrations of each working antibody were determined. Ultimately, a set of monoclonal (mAb) and polyclonal antibodies (pAb) were identified for CD3+ (pAb A0452, Dako) T-lymphocytes, CD79αcy+ B-lymphocytes (mAb HM57, Dako), macrophages (mAb MAC387, Leica), NF-H+ neurons (mAb NAP4, EnCor Biotechnology), microglia/macrophage (pAb Iba-1, Wako), and GFAP+ astrocytes (mAb 5C10, EnCor Biotechnology). In paraffin-embedded tissues, mAbs and pAbs derived from human and swine antigens were very successful at binding equine tissue targets. Individual, optimized protocols are provided for each positively reactive antibody for analyzing equine neuroinflammatory disease histopathology. PMID:26855862

  4. Short term reproducibility of a high contrast 3-D isotropic optic nerve imaging sequence in healthy controls

    NASA Astrophysics Data System (ADS)

    Harrigan, Robert L.; Smith, Alex K.; Mawn, Louise A.; Smith, Seth A.; Landman, Bennett A.

    2016-03-01

    The optic nerve (ON) plays a crucial role in human vision transporting all visual information from the retina to the brain for higher order processing. There are many diseases that affect the ON structure such as optic neuritis, anterior ischemic optic neuropathy and multiple sclerosis. Because the ON is the sole pathway for visual information from the retina to areas of higher level processing, measures of ON damage have been shown to correlate well with visual deficits. Increased intracranial pressure has been shown to correlate with the size of the cerebrospinal fluid (CSF) surrounding the ON. These measures are generally taken at an arbitrary point along the nerve and do not account for changes along the length of the ON. We propose a high contrast and high-resolution 3-D acquired isotropic imaging sequence optimized for ON imaging. We have acquired scan-rescan data using the optimized sequence and a current standard of care protocol for 10 subjects. We show that this sequence has superior contrast-to-noise ratio to the current standard of care while achieving a factor of 11 higher resolution. We apply a previously published automatic pipeline to segment the ON and CSF sheath and measure the size of each individually. We show that these measures of ON size have lower short-term reproducibility than the population variance and the variability along the length of the nerve. We find that the proposed imaging protocol is (1) useful in detecting population differences and local changes and (2) a promising tool for investigating biomarkers related to structural changes of the ON.

  5. [Biobanking requirements from the perspective of the clinician : Experiences in hematology and oncology].

    PubMed

    Koschmieder, S; Brümmendorf, T H

    2018-04-05

    The requirements for optimal biobanking from the point of view of the clinical partner can be highly variable. Depending on the material, processing, storage conditions, clinical data, and involvement of external partners, there will be special requirements for the participating clinician and specialist areas. What they all have in common is that the goal of any biobanking must be to improve clinical, translational, and basic research. While in the past biomaterials often had to be individually stored for each research project, modern biobanking offers decisive advantages: a comprehensive ethics vote fulfilling state-of-the-art data safety requirements, standardized processing and storage protocols, specialized biobank software for pseudonymization and localization, protection against power failures and defects of the equipment, centralized and sustainable storage, easy localization and return of samples, and their destruction or anonymization after completion of an individual project. In addition to this important pure storage function, central biobanking can provide a link to clinical data as well as the anonymous use of samples for project-independent research. Both biobank functions serve different purposes, are associated with specific requirements, and should be pursued in parallel. If successful, central biomaterial management can achieve a sustainable improvement of academic and non-academic biomedical research and the optimal use of resources. The close collaboration between clinicians and non-clinicians is a crucial prerequisite for this.

  6. The Times, They are a-Changing: HOPE for HIV-to-HIV Organ Transplantation.

    PubMed

    Haidar, Ghady; Singh, Nina

    2017-09-01

    HIV-infected persons who achieve undetectable viral loads on antiretroviral therapy currently have near-normal lifespans. Liver disease is a major cause of non-AIDS-related deaths, and as a result of longer survival, the prevalence of end-stage renal disease in HIV is increasing. HIV-infected persons undergoing organ transplantation generally achieve comparable patient and graft survival rates compared to their HIV-uninfected counterparts, despite a nearly threefold increased risk of acute rejection. However, the ongoing shortage of suitable organs can limit transplantation as an option, and patients with HIV have higher waitlist mortality than others. One way to solve this problem would be to expand the donor pool to include HIV-infected individuals. The results of a South Africa study involving 27 HIV-to-HIV kidney transplants showed promise, with 3- and 5-year patient and graft survival rates similar to those of their HIV-uninfected counterparts. Similarly, individual cases of HIV-to-HIV liver transplantation from the United Kingdom and Switzerland have also shown good results. In the United States, HIV-to-HIV kidney and liver transplants are currently permitted only under a research protocol. Nevertheless, areas of ambiguity exist, including streamlining organ allocation practices, optimizing HIV-infected donor and recipient selection, managing donor-derived transmission of a resistant HIV strain, determining optimal immunosuppressive and antiretroviral regimens, and elucidating the incidence of rejection in HIV-to-HIV solid organ transplant recipients.

  7. Screening for hearing loss in the elderly using distortion product otoacoustic emissions, pure tones, and a self-assessment tool.

    PubMed

    Jupiter, Tina

    2009-12-01

    To determine whether distortion product otoacoustic emissions (DPOAEs) could be used as a hearing screening tool with elderly individuals living independently, and to compare the utility of different screening protocols: (a) 3 pure-tone screening protocols consisting of 30 dB HL at 1, 2, and 3 kHz; 40 dB HL at 1, 2, and 3 kHz; or 40 dB HL at 1 and 2 kHz; (b) the Hearing Handicap Inventory for the Elderly-Screening version (HHIE-S); (c) pure tones at 40 dB HL at 1 and 2 kHz plus the HHIE-S; and (d) DPOAEs. A total of 106 elderly individuals age 65-91 years were screened using the above protocols. Pass/fail results showed that most individuals failed at 30 dB HL, followed by DPOAEs, the 40-dB HL protocols, the HHIE-S alone, and the combined pure-tone/HHIE-S protocol. All screening results were associated except the HHIE-S and 30 dB HL and the HHIE-S and DPOAEs. A McNemar analysis revealed that the differences between the correlated pass/fail results were significant except for the HHIE-S and 40 dB at 1 and 2 kHz. DPOAEs can be used to screen the elderly, with the advantage that individuals do not have to voluntarily respond to the test.
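
    The McNemar comparisons reported above operate on the discordant cells of a paired 2 × 2 table; a minimal sketch with made-up counts (not the study's data) follows.

      from scipy.stats import chi2

      # Hedged sketch: McNemar's test on paired pass/fail outcomes from two
      # screening protocols.  The discordant counts below are illustrative only.
      b, c = 18, 7                                     # b: fail A / pass B, c: pass A / fail B
      statistic = (abs(b - c) - 1) ** 2 / (b + c)      # continuity-corrected McNemar chi-square
      p_value = chi2.sf(statistic, df=1)
      print(f"McNemar chi2 = {statistic:.2f}, p = {p_value:.3f}")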

  8. Quality and Dose Optimized CT Trauma Protocol - Recommendation from a University Level-I Trauma Center.

    PubMed

    Kahn, Johannes; Kaul, David; Böning, Georg; Rotzinger, Roman; Freyhardt, Patrick; Schwabe, Philipp; Maurer, Martin H; Renz, Diane Miriam; Streitparth, Florian

    2017-09-01

    Purpose: As a supra-regional level-I trauma center, we evaluated computed tomography (CT) acquisitions of polytraumatized patients for quality and dose optimization purposes. Adaptive statistical iterative reconstruction [(AS)IR] levels, tube voltage reduction as well as a split-bolus contrast agent (CA) protocol were applied. Materials and Methods: 61 patients were split into 3 different groups that differed with respect to tube voltage (120-140 kVp) and level of applied ASIR reconstruction (ASIR 20-50 %). The CT protocol included a native acquisition of the head followed by a single contrast-enhanced acquisition of the whole body (64-MSCT). CA (350 mg/ml iodine) was administered as a split-bolus injection of 100 ml (2 ml/s), 20 ml NaCl (1 ml/s), 60 ml (4 ml/s), and 40 ml NaCl (4 ml/s) with a scan delay of 85 s to detect injuries of both the arterial system and parenchymal organs in a single acquisition. Both the quantitative (SNR/CNR) and qualitative (5-point Likert scale) image quality were evaluated in parenchymal organs that are often injured in trauma patients. Radiation exposure was assessed. Results: The use of IR combined with a reduction of tube voltage resulted in good qualitative and quantitative image quality and a significant reduction in radiation exposure of more than 40 % (DLP 1087 vs. 647 mGy·cm). Image quality could be improved due to a dedicated protocol that included different levels of IR adapted to different slice thicknesses, kernels and the examined area for the evaluation of head, lung, body and bone injury patterns. Based on our results, we recommend the implementation of a polytrauma protocol with a tube voltage of 120 kVp and the following IR levels: cCT 5 mm: ASIR 20; cCT 0.625 mm: ASIR 40; lung 2.5 mm: ASIR 30; body 5 mm: ASIR 40; body 1.25 mm: ASIR 50; body 0.625 mm: ASIR 0. Conclusion: A dedicated adaptation of the CT trauma protocol (level of reduction of tube voltage and of IR) according to the examined body region (head, lung, body, bone) combined with a split-bolus CA injection protocol allows for a high-quality CT examination and a relevant reduction of radiation exposure in the examination of polytraumatized patients. Key Points: · Dedicated adaptation of the CT trauma protocol allows for an optimized examination. · Different levels of iterative reconstruction, tube voltage and the CA injection protocol are crucial. · A reduction of radiation exposure of more than 40 % with good image quality is possible. Citation Format: Kahn J, Kaul D, Böning G et al. Quality and Dose Optimized CT Trauma Protocol - Recommendation from a University Level-I Trauma Center. Fortschr Röntgenstr 2017; 189: 844-854. © Georg Thieme Verlag KG Stuttgart · New York.
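
    The reported dose saving follows directly from the DLP values quoted in the abstract, as the short check below shows.

      # Relative change in dose-length product (DLP) between the reference and
      # optimized protocols, using the values quoted in the abstract.
      dlp_reference, dlp_optimized = 1087.0, 647.0     # mGy*cm
      reduction = (dlp_reference - dlp_optimized) / dlp_reference
      print(f"DLP reduction = {reduction:.1%}")        # about 40.5 %, consistent with "> 40 %"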

  9. Routing Protocols in Wireless Sensor Networks

    PubMed Central

    Villalba, Luis Javier García; Orozco, Ana Lucila Sandoval; Cabrera, Alicia Triviño; Abbas, Cláudia Jacy Barenco

    2009-01-01

    The applications of wireless sensor networks comprise a wide variety of scenarios. In most of them, the network is composed of a significant number of nodes deployed in an extensive area in which not all nodes are directly connected. Then, the data exchange is supported by multihop communications. Routing protocols are in charge of discovering and maintaining the routes in the network. However, the appropriateness of a particular routing protocol mainly depends on the capabilities of the nodes and on the application requirements. This paper presents a review of the main routing protocols proposed for wireless sensor networks. Additionally, the paper includes the efforts carried out by Spanish universities on developing optimization techniques in the area of routing protocols for wireless sensor networks. PMID:22291515

  10. Routing protocols in wireless sensor networks.

    PubMed

    Villalba, Luis Javier García; Orozco, Ana Lucila Sandoval; Cabrera, Alicia Triviño; Abbas, Cláudia Jacy Barenco

    2009-01-01

    The applications of wireless sensor networks comprise a wide variety of scenarios. In most of them, the network is composed of a significant number of nodes deployed in an extensive area in which not all nodes are directly connected. Then, the data exchange is supported by multihop communications. Routing protocols are in charge of discovering and maintaining the routes in the network. However, the appropriateness of a particular routing protocol mainly depends on the capabilities of the nodes and on the application requirements. This paper presents a review of the main routing protocols proposed for wireless sensor networks. Additionally, the paper includes the efforts carried out by Spanish universities on developing optimization techniques in the area of routing protocols for wireless sensor networks.

  11. Microprocessor-based integration of microfluidic control for the implementation of automated sensor monitoring and multithreaded optimization algorithms.

    PubMed

    Ezra, Elishai; Maor, Idan; Bavli, Danny; Shalom, Itai; Levy, Gahl; Prill, Sebastian; Jaeger, Magnus S; Nahmias, Yaakov

    2015-08-01

    Microfluidic applications range from combinatorial synthesis to high throughput screening, with platforms integrating analog perfusion components, digitally controlled micro-valves and a range of sensors that demand a variety of communication protocols. Currently, discrete control units are used to regulate and monitor each component, resulting in scattered control interfaces that limit data integration and synchronization. Here, we present a microprocessor-based control unit, utilizing the MS Gadgeteer open framework that integrates all aspects of microfluidics through a high-current electronic circuit that supports and synchronizes digital and analog signals for perfusion components, pressure elements, and arbitrary sensor communication protocols using a plug-and-play interface. The control unit supports an integrated touch screen and TCP/IP interface that provides local and remote control of flow and data acquisition. To establish the ability of our control unit to integrate and synchronize complex microfluidic circuits, we developed an equi-pressure combinatorial mixer. We demonstrate the generation of complex perfusion sequences, allowing the automated sampling, washing, and calibrating of an electrochemical lactate sensor continuously monitoring hepatocyte viability following exposure to the pesticide rotenone. Importantly, integration of an optical sensor allowed us to implement automated optimization protocols that pose different computational challenges, including prioritized data structures in a genetic algorithm, distributed computational effort in multiple hill-climbing searches, and real-time realization of probabilistic models in simulated annealing. Our system offers a comprehensive solution for establishing optimization protocols and perfusion sequences in complex microfluidic circuits.
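
    As a flavor of the optimization routines mentioned (here, simulated annealing driven by a sensor readout), the following is a hedged sketch; the measure() objective and all parameters are hypothetical stand-ins for a real sensor/actuator loop, not the system's actual interface.

      import math, random

      random.seed(0)

      def measure(flow_ratio: float) -> float:
          """Hypothetical objective: sensor error as a function of a mixing ratio."""
          return (flow_ratio - 0.63) ** 2 + random.gauss(0, 0.001)

      def simulated_annealing(x=0.5, temp=1.0, cooling=0.95, steps=200):
          best_x, best_err = x, measure(x)
          err = best_err
          for _ in range(steps):
              candidate = min(1.0, max(0.0, x + random.gauss(0, 0.05)))
              cand_err = measure(candidate)
              # accept better candidates always, worse ones with Boltzmann probability
              if cand_err < err or random.random() < math.exp((err - cand_err) / temp):
                  x, err = candidate, cand_err
                  if err < best_err:
                      best_x, best_err = x, err
              temp *= cooling
          return best_x

      print(f"optimized flow ratio = {simulated_annealing():.2f}")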

  12. Multiple enface image averaging for enhanced optical coherence tomography angiography imaging.

    PubMed

    Uji, Akihito; Balasubramanian, Siva; Lei, Jianqin; Baghdasaryan, Elmira; Al-Sheikh, Mayss; Borrelli, Enrico; Sadda, SriniVas R

    2018-05-31

    To investigate the effect of multiple enface image averaging on image quality in optical coherence tomography angiography (OCTA). Twenty-one normal volunteers were enrolled in this study. For each subject, one eye was imaged with the 3 × 3 mm scan protocol, and the other eye was imaged with the 6 × 6 mm scan protocol centred on the fovea using the ZEISS Angioplex™ spectral-domain OCTA device. Eyes were repeatedly imaged to obtain nine OCTA cube scan sets, and nine superficial capillary plexus (SCP) and deep capillary plexus (DCP) enface images were individually averaged after registration. Eighteen eyes with a 3 × 3 mm scan field and 14 eyes with a 6 × 6 mm scan field were studied. Averaged images showed more continuous vessels and less background noise in both the SCP and the DCP as the number of frames used for averaging increased, with both 3 × 3 and 6 × 6 mm scan protocols. The intensity histogram of the vessels dramatically changed after averaging. Contrast-to-noise ratio (CNR) and subjectively assessed image quality scores also increased as the number of frames used for averaging increased in all image types. However, the additional benefit in quality diminished when averaging more than five frames. Averaging only three frames achieved significant improvement in CNR and the scores assigned by certified graders. Use of multiple image averaging in OCTA enface images was found to be both objectively and subjectively effective for enhancing image quality. These findings may be of value for developing optimal OCTA imaging protocols for future studies. © 2018 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
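
    The diminishing returns of frame averaging described above can be illustrated on synthetic frames, as in the sketch below; it is illustrative only and does not reproduce the registration or grading steps.

      import numpy as np

      # Averaging N noisy enface frames and tracking how the contrast-to-noise
      # ratio grows with N (roughly sqrt(N) for uncorrelated noise).
      rng = np.random.default_rng(2)
      truth = np.zeros((64, 64))
      truth[:, 30:34] = 1.0                                   # a synthetic "vessel"
      frames = [truth + rng.normal(0, 0.8, truth.shape) for _ in range(9)]

      def cnr(img):
          vessel, background = img[:, 30:34], img[:, :20]
          return abs(vessel.mean() - background.mean()) / background.std(ddof=1)

      for n in (1, 3, 5, 9):
          avg = np.mean(frames[:n], axis=0)
          print(f"{n} frame(s): CNR = {cnr(avg):.2f}")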

  13. Purifying, Separating, and Concentrating Cells From a Sample Low in Biomass

    NASA Technical Reports Server (NTRS)

    Benardini, James N.; LaDuc, Myron T.; Diamond, Rochelle

    2012-01-01

    Frequently there is an inability to process and analyze samples of low biomass due to limiting amounts of relevant biomaterial in the sample. Furthermore, molecular biological protocols geared towards increasing the density of recovered cells and biomolecules of interest, by their very nature, also concentrate unwanted inhibitory humic acids and other particulates that have an adversarial effect on downstream analysis. A novel and robust fluorescence-activated cell-sorting (FACS)-based technology has been developed for purifying (removing cells from sampling matrices), separating (based on size, density, morphology), and concentrating cells (spores, prokaryotic, eukaryotic) from a sample low in biomass. The technology capitalizes on fluorescent cell-sorting technologies to purify and concentrate bacterial cells from a low-biomass, high-volume sample. Over the past decade, cell-sorting detection systems have undergone enhancements and increased sensitivity, making bacterial cell sorting a feasible concept. Although there are many unknown limitations with regard to the applicability of this technology to environmental samples (smaller cells, few cells, mixed populations), dogmatic principles support the theoretical effectiveness of this technique upon thorough testing and proper optimization. Furthermore, the pilot study from which this report is based proved effective and demonstrated this technology capable of sorting and concentrating bacterial endospore and bacterial cells of varying size and morphology. Two commercial off-the-shelf bacterial counting kits were used to optimize a bacterial stain/dye FACS protocol. A LIVE/DEAD BacLight Viability and Counting Kit was used to distinguish between the live and dead cells. A Bacterial Counting Kit comprising SYTO BC (mixture of SYTO dyes) was employed as a broad-spectrum bacterial counting agent. Optimization using epifluorescence microscopy was performed with these two dye/stains. This refined protocol was further validated using varying ratios and mixtures of cells to ensure homogenous staining compared to that of individual cells, and were utilized for flow analyzer and FACS labeling. This technology focuses on the purification and concentration of cells from low-biomass spacecraft assembly facility samples. Currently, purification and concentration of low-biomass samples plague planetary protection downstream analyses. Having a capability to use flow cytometry to concentrate cells out of low-biomass, high-volume spacecraft/ facility sample extracts will be of extreme benefit to the fields of planetary protection and astrobiology. Successful research and development of this novel methodology will significantly increase the knowledge base for designing more effective cleaning protocols, and ultimately lead to a more empirical and true account of the microbial diversity present on spacecraft surfaces. Refined cleaning and an enhanced ability to resolve microbial diversity may decrease the overall cost of spacecraft assembly and/or provide a means to begin to assess challenging planetary protection missions.

  14. Treatment algorithms and protocolized care.

    PubMed

    Morris, Alan H

    2003-06-01

    Excess information in complex ICU environments exceeds human decision-making limits and likely contributes to unnecessary variation in clinical care, increasing the likelihood of clinical errors. I reviewed recent critical care clinical trials searching for information about the impact of protocol use on clinically pertinent outcomes. Several recently published clinical trials illustrate the importance of distinguishing efficacy and effectiveness trials. One of these trials illustrates the danger of conducting effectiveness trials before the efficacy of an intervention is established. The trials also illustrate the importance of distinguishing guidelines and inadequately explicit protocols from adequately explicit protocols. Only adequately explicit protocols contain enough detail to lead different clinicians to the same decision when faced with the same clinical scenario. Differences between guidelines and protocols are important. Guidelines lack detail and provide general guidance that requires clinicians to fill in many gaps. Computerized or paper-based protocols are detailed and, when used for complex clinical ICU problems, can generate patient-specific, evidence-based therapy instructions that can be carried out by different clinicians with almost no interclinician variability. Individualization of patient therapy can be preserved by these protocols when they are driven by individual patient data. Explicit decision-support tools (eg, guidelines and protocols) have favorable effects on clinician and patient outcomes and can reduce the variation in clinical practice. Guidelines and protocols that aid ICU decision makers should be more widely distributed.

  15. Finite-key security analyses on passive decoy-state QKD protocols with different unstable sources.

    PubMed

    Song, Ting-Ting; Qin, Su-Juan; Wen, Qiao-Yan; Wang, Yu-Kun; Jia, Heng-Yue

    2015-10-16

    In quantum communication, passive decoy-state QKD protocols can eliminate many side channels, but protocols without any finite-key analysis are not suitable in practice. The finite-key security of passive decoy-state (PDS) QKD protocols with two different unstable sources, type-II parametric down-conversion (PDC) and phase-randomized weak coherent pulses (WCPs), is analyzed in our paper. For each PDS QKD protocol, we establish an optimization program and obtain lower bounds on the finite-key rates. Under reasonable values of the quantum setup parameters, the lower bounds on the finite-key rates are simulated. The simulation results show that, at different transmission distances, different fluctuations affect the key rates differently. Moreover, the PDS QKD protocol with an unstable PDC source can tolerate larger intensity fluctuations and statistical fluctuations.
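
    For orientation only, the sketch below evaluates the textbook asymptotic key-rate bound r = 1 - 2h(e), where h is the binary entropy; it merely illustrates how the key rate collapses as the error rate grows and is not the paper's finite-key, passive decoy-state optimization.

      import math

      def h(p: float) -> float:
          """Binary Shannon entropy."""
          return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

      # Simplified asymptotic Shor-Preskill bound per sifted bit, for illustration only.
      for qber in (0.01, 0.05, 0.11):
          print(f"QBER = {qber:.2f}: r >= {max(0.0, 1 - 2 * h(qber)):.3f}")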

  16. The Design of Finite State Machine for Asynchronous Replication Protocol

    NASA Astrophysics Data System (ADS)

    Wang, Yanlong; Li, Zhanhuai; Lin, Wei; Hei, Minglei; Hao, Jianhua

    Data replication is a key way to design a disaster tolerance system and to achieve reliability and availability. It is difficult for a replication protocol to deal with diverse and complex environments, which means that data is often less well replicated than it ought to be. To reduce data loss and to optimize replication protocols, we (1) present a finite state machine, (2) run it to manage an asynchronous replication protocol, and (3) report a simple evaluation of the asynchronous replication protocol based on our state machine. Our results show that the state machine keeps the asynchronous replication protocol running in the proper state, to the largest extent possible, in the face of various possible events. It can also help in building replication-based disaster tolerance systems that ensure business continuity.
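
    A minimal sketch of the idea, an explicit transition table driving a replication protocol through events, is given below; the states, events, and transitions are illustrative and do not reproduce the machine defined in the paper.

      from enum import Enum, auto

      class State(Enum):
          IDLE = auto()
          REPLICATING = auto()
          DEGRADED = auto()
          RECOVERING = auto()

      # Illustrative transition table for an asynchronous replication protocol.
      TRANSITIONS = {
          (State.IDLE,        "write_received"):  State.REPLICATING,
          (State.REPLICATING, "ack_received"):    State.IDLE,
          (State.REPLICATING, "link_failure"):    State.DEGRADED,
          (State.DEGRADED,    "link_restored"):   State.RECOVERING,
          (State.RECOVERING,  "resync_complete"): State.IDLE,
      }

      class ReplicationFSM:
          def __init__(self):
              self.state = State.IDLE

          def on_event(self, event: str) -> State:
              # Unknown events leave the machine in its current (safe) state.
              self.state = TRANSITIONS.get((self.state, event), self.state)
              return self.state

      fsm = ReplicationFSM()
      for event in ("write_received", "link_failure", "link_restored", "resync_complete"):
          print(event, "->", fsm.on_event(event).name)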

  17. A Self-Paced, Web-Based, Positive Emotion Skills Intervention for Reducing Symptoms of Depression: Protocol for Development and Pilot Testing of MARIGOLD.

    PubMed

    Cheung, Elaine O; Addington, Elizabeth L; Bassett, Sarah M; Schuette, Stephanie A; Shiu, Eva W; Cohn, Michael A; Leykin, Yan; Saslow, Laura R; Moskowitz, Judith T

    2018-06-05

    Living with elevated symptoms of depression can have debilitating consequences for an individual's psychosocial and physical functioning, quality of life, and health care utilization. A growing body of evidence demonstrates that skills for increasing positive emotion can be helpful to individuals with depression. Although Web-based interventions to reduce negative emotion in individuals with depression are available, these interventions frequently suffer from poor retention and adherence and do not capitalize on the potential benefits of increasing positive emotion. The aim of this study was to develop and test a Web-based positive emotion skills intervention tailored for individuals living with elevated depressive symptoms, as well as to develop and test enhancement strategies for increasing retention and adherence to that intervention. This study protocol describes the development and testing for Mobile Affect Regulation Intervention with the Goal of Lowering Depression (MARIGOLD), a Web-based positive emotion skills intervention, adapted for individuals with elevated depressive symptomatology. The intervention development is taking place in three phases. In phase 1, we are tailoring an existing positive emotion skills intervention for individuals with elevated symptoms of depression and are pilot testing the tailored version of the intervention in a randomized controlled trial with two control conditions (N=60). In phase 2, we are developing and testing three enhancements aimed at boosting retention and adherence to the Web-based intervention (N=75): facilitator contact, an online discussion board, and virtual badges. In phase 3, we are conducting a multifactorial, nine-arm pilot trial (N=600) to systematically test these enhancement strategies, individually and in combination. The primary outcome is depressive symptom severity. Secondary outcomes include positive and negative emotion, psychological well-being, and coping resources. The project was funded in August 2014, and data collection was completed in May 2018. Data analysis is currently under way, and the first results are expected to be submitted for publication in 2018. Findings from this investigation will enable us to develop an optimal package of intervention content and enhancement strategies for individuals with elevated symptoms of depression. If this intervention proves to be effective, it will provide a cost-effective, anonymous, appealing, and flexible approach for reducing symptoms of depression and improving psychological adjustment through increasing positive emotion. ClinicalTrials.gov NCT01964820 (Phase 1); https://clinicaltrials.gov/ct2/show/NCT01964820 (Archived by WebCite at http://www.webcitation.org/6zpmKBcyX). ClinicalTrials.gov NCT02861755 (Phase 2); https://clinicaltrials.gov/ct2/show/NCT02861755 (Archived by WebCite at http://www.webcitation.org/6zpmLmy8k). RR1-10.2196/10494. ©Elaine O Cheung, Elizabeth L Addington, Sarah M Bassett, Stephanie A Schuette, Eva W Shiu, Michael A Cohn, Yan Leykin, Laura R Saslow, Judith T Moskowitz. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 05.06.2018.

  18. Cow's Milk Desensitization in Anaphylactic Patients: A New Personalized-dose Method.

    PubMed

    Babaie, Delara; Nabavi, Mohammad; Arshi, Saba; Mesdaghi, Mehrnaz; Chavoshzadeh, Zahra; Bemanian, Mohammad Hasan; Tafakori, Mitra; Amirmoini, Mehrdad; Esmailzadeh, Hosein; Molatefi, Rasoul; Rekabi, Mahsa; Akbarpour, Nadieh; Masoumi, Farimah; Fallahpour, Morteza

    2017-02-01

    Cow's milk allergy (CMA) is the most frequent food allergy in children, and oral immunotherapy (OIT) is a promising approach for treatment of patients. The most challenging cases are anaphylactic patients with coexisting asthma, and proposing safe protocols is crucial, especially in high-risk groups. Considering that CMA varies among patients, an individualized OIT protocol would be beneficial to achieve a safer and more efficient method of desensitization. 18 children more than 3 years of age with IgE-mediated CMA were enrolled. CMA was confirmed by positive skin prick test (SPT) and positive oral food challenge (OFC), and 60% of individuals had a convincing history of persistent asthma. SPT with milk extracts, whole fresh milk and serially diluted milk concentrations was performed. The dilution of milk that induced 3-5 mm of wheal in each individual was selected as the starting dilution for OIT. Desensitization began with 1 drop of the defined dilution and continued with increasing doses. Overall, 16 out of 18 children (88.8%) achieved the daily intake of 120 mL of milk. Four of these 16 children completed the protocol without any adverse allergic reactions; the other 12 patients experienced mild to severe reactions. Wheal and erythema in SPT (p≤0.001) and sIgE (p≤0.003) to most milk allergens were significantly decreased following desensitization. We successfully desensitized 16 of 18 children with IgE-mediated CMA by an individualized desensitization protocol. Individualizing the OIT protocol would be helpful to save time and perhaps to relieve allergic symptoms after cow's milk ingestion.

  19. Analysis of 213 currently used rehabilitation protocols in foot and ankle fractures.

    PubMed

    Pfeifer, Christian G; Grechenig, Stephan; Frankewycz, Borys; Ernstberger, Antonio; Nerlich, Michael; Krutsch, Werner

    2015-10-01

    Fractures of the ankle, hind- and midfoot are amongst the five most common fractures. Besides initial operative or non-operative treatment, rehabilitation of the patients plays a crucial role for fracture union and long-term functional outcome. Limited evidence is available with regard to what a rehabilitation regimen should include and what guidelines should be in place for the initial clinical course of these patients. This study therefore investigated the current rehabilitation concepts after fractures of the ankle, hind- and midfoot. Written rehabilitation protocols provided by orthopedic and trauma surgery institutions in terms of recommendations for weight bearing, range of motion (ROM), physiotherapy and choice of orthosis were screened and analysed. All protocols for lateral ankle fractures type AO 44A1, AO 44B1 and AO 44C1, for calcaneal fractures and fractures of the metatarsal, as well as other non-specified fracture types, were included. Descriptive analysis was carried out and statistical analysis applied where appropriate. 209 rehabilitation protocols for ankle fractures type AO 44B1 and AO 44C1, 98 for AO 44A1, 193 for metatarsal fractures, 142 for calcaneal fractures, 107 for 5th metatarsal base fractures and 70 for 5th metatarsal Jones fractures were evaluated. The mean time recommended for orthosis treatment was 6.04 (SD 0.04) weeks. While the majority of protocols showed a trend towards increased weight bearing and increased ROM over time, the best consensus was noted for weight bearing recommendations. Our study shows that there is huge variability in the rehabilitation of fractures of the ankle, hind- and midfoot. This may be attributed to a lack of consensus (e.g. missing publication of guidelines), individualized patient care (e.g. in fragility fractures) or lack of specialization. This study might serve as a basis for prospective randomized controlled trials in order to optimize rehabilitation for these common fractures. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. [Sampling optimization for tropical invertebrates: an example using dung beetles (Coleoptera: Scarabaeinae) in Venezuela].

    PubMed

    Ferrer-Paris, José Rafael; Sánchez-Mercado, Ada; Rodríguez, Jon Paul

    2013-03-01

    The development of efficient sampling protocols is an essential prerequisite to evaluate and identify priority conservation areas. There are few protocols for fauna inventory and monitoring at wide geographical scales in the tropics, where the complexity of communities and high biodiversity levels make the implementation of efficient protocols more difficult. We propose here a simple strategy to optimize the capture of dung beetles, applied to sampling with baited traps and generalizable to other sampling methods. We analyzed data from eight transects sampled between 2006 and 2008 with the aim of developing a uniform sampling design that allows species richness, abundance and composition to be estimated confidently at wide geographical scales. We examined four characteristics of any sampling design that affect the effectiveness of the sampling effort: the number of traps, sampling duration, type and proportion of bait, and spatial arrangement of the traps along transects. We used species accumulation curves, rank-abundance plots, indicator species analysis, and multivariate correlograms. We captured 40 337 individuals (115 species/morphospecies of 23 genera). Most species were attracted by both dung and carrion, but two thirds had greater relative abundance in traps baited with human dung. Different aspects of the sampling design influenced each diversity attribute in different ways. To obtain reliable richness estimates, the number of traps was the most important aspect. Accurate abundance estimates were obtained when the sampling period was increased, while the spatial arrangement of traps was determinant for capturing the species composition pattern. An optimum sampling strategy for accurate estimates of richness, abundance and diversity should: (1) set 50-70 traps to maximize the number of species detected, (2) get samples during 48-72 hours and set trap groups along the transect to reliably estimate species abundance, (3) set traps in groups of at least 10 traps to suitably record the local species composition, and (4) separate trap groups by a distance greater than 5-10 km to avoid spatial autocorrelation. For the evaluation of other sampling protocols we recommend first identifying the elements of the sampling design that could affect sampling effort (the number of traps, sampling duration, type and proportion of bait) and their spatial distribution (spatial arrangement of the traps), and then evaluating how they affect richness, abundance and species composition estimates.
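
    The trap-number recommendation rests on sample-based species accumulation curves; the sketch below computes such a curve on a randomly generated trap-by-species incidence matrix (not the study's data).

      import numpy as np

      # Sample-based species accumulation curve on a synthetic incidence matrix.
      rng = np.random.default_rng(3)
      n_traps, n_species = 70, 115
      incidence = rng.random((n_traps, n_species)) < rng.uniform(0.01, 0.2, n_species)

      def accumulation(incidence, permutations=100):
          n = incidence.shape[0]
          curves = np.empty((permutations, n))
          for p in range(permutations):
              order = rng.permutation(n)
              seen = np.cumsum(incidence[order], axis=0) > 0   # species seen after k traps
              curves[p] = seen.sum(axis=1)
          return curves.mean(axis=0)

      curve = accumulation(incidence)
      for k in (10, 30, 50, 70):
          print(f"{k} traps: ~{curve[k - 1]:.0f} species")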

  1. Unique nucleotide sequence-guided assembly of repetitive DNA parts for synthetic biology applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Torella, JP; Lienert, F; Boehm, CR

    2014-08-07

    Recombination-based DNA construction methods, such as Gibson assembly, have made it possible to easily and simultaneously assemble multiple DNA parts, and they hold promise for the development and optimization of metabolic pathways and functional genetic circuits. Over time, however, these pathways and circuits have become more complex, and the increasing need for standardization and insulation of genetic parts has resulted in sequence redundancies (for example, repeated terminator and insulator sequences) that complicate recombination-based assembly. We and others have recently developed DNA assembly methods, which we refer to collectively as unique nucleotide sequence (UNS)-guided assembly, in which individual DNA parts are flanked with UNSs to facilitate the ordered, recombination-based assembly of repetitive sequences. Here we present a detailed protocol for UNS-guided assembly that enables researchers to convert multiple DNA parts into sequenced, correctly assembled constructs, or into high-quality combinatorial libraries in only 2-3 d. If the DNA parts must be generated from scratch, an additional 2-5 d are necessary. This protocol requires no specialized equipment and can easily be implemented by a student with experience in basic cloning techniques.

  2. Surface cell immobilization within perfluoroalkoxy microchannels

    NASA Astrophysics Data System (ADS)

    Stojkovič, Gorazd; Krivec, Matic; Vesel, Alenka; Marinšek, Marjan; Žnidaršič-Plazl, Polona

    2014-11-01

    Perfluoroalkoxy (PFA) is one of the most promising materials for the fabrication of cheap, solvent resistant and reusable microfluidic chips, which have been recently recognized as effective tools for biocatalytic process development. The application of biocatalysts significantly depends on efficient immobilization of enzymes or cells within the reactor enabling long-term biocatalyst use. Functionalization of PFA microchannels by 3-aminopropyltriethoxysilane (APTES) and glutaraldehyde was used for rapid preparation of microbioreactors with surface-immobilized cells. X-ray photoelectron spectroscopy and scanning electron microscopy were used to accurately monitor individual treatment steps and to select conditions for cell immobilization. The optimized protocol for Saccharomyces cerevisiae immobilization on PFA microchannel walls comprised ethanol surface pretreatment, 4 h contacting with 10% APTES aqueous solution, 10 min treatment with 1% glutaraldehyde and 20 min contacting with cells in deionized water. The same protocol also enabled immobilization of Escherichia coli, Pseudomonas putida and Bacillus subtilis cells on the PFA surface at high densities. Furthermore, the developed procedure also proved very efficient for surface immobilization of the tested cells on other materials that are used for microreactor fabrication, including glass, polystyrene, poly (methyl methacrylate), polycarbonate, and two olefin-based polymers, namely Zeonor® and Topas®.

  3. Oral hygiene is an important factor for prevention of ventilator-associated pneumonia.

    PubMed

    Par, Matej; Badovinac, Ana; Plancak, Darije

    2014-03-01

    Inadequate oral hygiene in intensive care units (ICUs) has been recognized as a critical issue, for it is an important risk factor for ventilator associated pneumonia (VAP). VAP is an aspiration pneumonia that occurs in mechanically ventilated patients, mostly caused by bacteria colonizing the oral cavity and dental plaque. It is the second most common nosocomial infection and the leading cause of complications and death in mechanically ventilated patients. It has been suggested that improvement of oral hygiene in ICU patients could lead to a reduced incidence of VAP. Although diverse oral care measures for ICU patients have been proposed in the literature, there is no evidence that could identify the most efficient ones. Although there are several evidence-based protocols, oral care measures are still performed inconsistently and differ greatly between individual ICUs. This paper lists the oral care measures most commonly performed in ICUs, indicating their advantages and disadvantages. Brushing with regular toothbrush and rinsing with chlorhexidine are considered optimal measures of oral hygiene in critically ill patients. To date, there is no definitive agreement about the most effective oral care protocol, but evidence demonstrates that consistent performance of oral care may lower the incidence of VAP in critically ill patients.

  4. A Software Tool for the Annotation of Embolic Events in Echo Doppler Audio Signals

    PubMed Central

    Pierleoni, Paola; Maurizi, Lorenzo; Palma, Lorenzo; Belli, Alberto; Valenti, Simone; Marroni, Alessandro

    2017-01-01

    The use of precordial Doppler monitoring to prevent decompression sickness (DS) is well known by the scientific community as an important instrument for early diagnosis of DS. However, the timely and correct diagnosis of DS without assistance from diving medical specialists is unreliable. Thus, a common protocol for the manual annotation of echo Doppler signals and a tool for their automated recording and annotation are necessary. We have implemented original software for efficient bubble appearance annotation and proposed a unified annotation protocol. The tool auto-sets the response time of human “bubble examiners,” performs playback of the Doppler file by rendering it independent of the specific audio player, and enables the annotation of individual bubbles or multiple bubbles known as “showers.” The tool provides a report with an optimized data structure and estimates the embolic risk level according to the Extended Spencer Scale. The tool is built in accordance with ISO/IEC 9126 on software quality and has been projected and tested with assistance from the Divers Alert Network (DAN) Europe Foundation, which employs this tool for its diving data acquisition campaigns. PMID:29242701

  5. Unique nucleotide sequence (UNS)-guided assembly of repetitive DNA parts for synthetic biology applications

    PubMed Central

    Torella, Joseph P.; Lienert, Florian; Boehm, Christian R.; Chen, Jan-Hung; Way, Jeffrey C.; Silver, Pamela A.

    2016-01-01

    Recombination-based DNA construction methods, such as Gibson assembly, have made it possible to easily and simultaneously assemble multiple DNA parts and hold promise for the development and optimization of metabolic pathways and functional genetic circuits. Over time, however, these pathways and circuits have become more complex, and the increasing need for standardization and insulation of genetic parts has resulted in sequence redundancies — for example repeated terminator and insulator sequences — that complicate recombination-based assembly. We and others have recently developed DNA assembly methods that we refer to collectively as unique nucleotide sequence (UNS)-guided assembly, in which individual DNA parts are flanked with UNSs to facilitate the ordered, recombination-based assembly of repetitive sequences. Here we present a detailed protocol for UNS-guided assembly that enables researchers to convert multiple DNA parts into sequenced, correctly-assembled constructs, or into high-quality combinatorial libraries in only 2–3 days. If the DNA parts must be generated from scratch, an additional 2–5 days are necessary. This protocol requires no specialized equipment and can easily be implemented by a student with experience in basic cloning techniques. PMID:25101822

  6. Therapeutic drug monitoring of antimetabolic cytotoxic drugs

    PubMed Central

    Lennard, L

    1999-01-01

    Therapeutic drug monitoring is not routinely used for cytotoxic agents. There are several reasons, but one major drawback is the lack of established therapeutic concentration ranges. Combination chemotherapy makes the establishment of therapeutic ranges for individual drugs difficult, the concentration-effect relationship for a single drug may not be the same as that when the drug is used in a drug combination. Pharmacokinetic optimization protocols for many classes of cytotoxic compounds exist in specialized centres, and some of these protocols are now part of large multicentre trials. Nonetheless, methotrexate is the only agent which is routinely monitored in most treatment centres. An additional factor, especially in antimetabolite therapy, is the existence of pharmacogenetic enzymes which play a major role in drug metabolism. Monitoring of therapy could include assay of phenotypic enzyme activities or genotype in addition to, or instead of, the more traditional measurement of parent drug or drug metabolites. The cytotoxic activities of mercaptopurine and fluorouracil are regulated by thiopurine methyltransferase (TPMT) and dihydropyrimidine dehydrogenase (DPD), respectively. Lack of TPMT functional activity produces life-threatening mercaptopurine myelotoxicity. Very low DPD activity reduces fluorouracil breakdown producing severe cytotoxicity. These pharmacogenetic enzymes can influence the bioavailability, pharmacokinetics, toxicity and efficacy of their substrate drugs. PMID:10190647

  7. EuroFlow standardization of flow cytometer instrument settings and immunophenotyping protocols

    PubMed Central

    Kalina, T; Flores-Montero, J; van der Velden, V H J; Martin-Ayuso, M; Böttcher, S; Ritgen, M; Almeida, J; Lhermitte, L; Asnafi, V; Mendonça, A; de Tute, R; Cullen, M; Sedek, L; Vidriales, M B; Pérez, J J; te Marvelde, J G; Mejstrikova, E; Hrusak, O; Szczepański, T; van Dongen, J J M; Orfao, A

    2012-01-01

    The EU-supported EuroFlow Consortium aimed at innovation and standardization of immunophenotyping for diagnosis and classification of hematological malignancies by introducing 8-color flow cytometry with fully standardized laboratory procedures and antibody panels in order to achieve maximally comparable results among different laboratories. This required the selection of optimal combinations of compatible fluorochromes and the design and evaluation of adequate standard operating procedures (SOPs) for instrument setup, fluorescence compensation and sample preparation. Additionally, we developed software tools for the evaluation of individual antibody reagents and antibody panels. Each section describes what has been evaluated experimentally versus adopted based on existing data and experience. Multicentric evaluation demonstrated high levels of reproducibility based on strict implementation of the EuroFlow SOPs and antibody panels. Overall, the 6 years of extensive collaborative experiments and the analysis of hundreds of cell samples of patients and healthy controls in the EuroFlow centers have provided for the first time laboratory protocols and software tools for fully standardized 8-color flow cytometric immunophenotyping of normal and malignant leukocytes in bone marrow and blood; this has yielded highly comparable data sets, which can be integrated in a single database. PMID:22948490

  8. Isolation of site-specific anharmonicities of individual water molecules in the I-·(H2O)2 complex using tag-free, isotopomer selective IR-IR double resonance

    NASA Astrophysics Data System (ADS)

    Yang, Nan; Duong, Chinh H.; Kelleher, Patrick J.; Johnson, Mark A.; McCoy, Anne B.

    2017-12-01

    We reveal the microscopic mechanics of iodide ion microhydration by recording the isotopomer-selective vibrational spectra of the I-·(H2O)·(D2O), I-·(HOD)·(D2O), and I-·(DOH)·(H2O) isotopologues using a new class of ion spectrometer that is optimized to carry out two-color, IR-IR photodissociation in a variety of pump-probe schemes. Using one of these, we record the linear absorption spectrum of a cryogenically cooled cluster without the use of a messenger “tag”. In another protocol, we reveal the spectra of individual H2O and D2O molecules embedded in each of the two possible binding sites in the iodide dihydrate, as well as the bands due to individual OH and OD groups in each of the four local binding environments. Finally, we demonstrate how temperature dependent isotopic scrambling among the spectral features can be used to monitor the onset of large amplitude motion, heretofore inferred from changes in the envelope of the OH stretching vibrational manifold.

  9. Sperm Cell Population Dynamics in Ram Semen during the Cryopreservation Process

    PubMed Central

    Ramón, Manuel; Pérez-Guzmán, M. Dolores; Jiménez-Rabadán, Pilar; Esteso, Milagros C.; García-Álvarez, Olga; Maroto-Morales, Alejandro; Anel-López, Luis; Soler, Ana J.; Fernández-Santos, M. Rocío; Garde, J. Julián

    2013-01-01

    Background Sperm cryopreservation has become an indispensable tool in biology. Initially, studies were aimed towards the development of efficient freezing protocols in different species that would allow for efficient storage of semen samples for long periods of time while ensuring their viability. Nowadays, it is widely known that an important individual component exists in the cryoresistance of semen, and efforts are aimed at identifying those sperm characteristics that may allow us to predict this cryoresistance. This knowledge would lead, ultimately, to the design of freezing protocols optimized for the sperm characteristics of each male. Methodology/Principal Findings We have evaluated the changes that occur in the sperm head dimensions throughout the cryopreservation process. We have found three different patterns of response, each of them related to a different sperm quality at thawing. We have been able to characterize males based on these patterns. For each male, its pattern remained constant among different ejaculates. The latter implies that males always respond in the same way to freezing, giving even more importance to this sperm feature. Conclusions/Significance Changes in the sperm head during the cryopreservation process proved useful for identifying how well each male's semen tolerates freezing. We suggest that analyses of these response patterns would represent an important tool to characterize the cryoresistance of males when implemented within breeding programs. We also propose follow-up experiments to examine the outcomes of the use of different freezing protocols depending on the pattern of response of males. PMID:23544054

  10. International guideline for the delineation of the clinical target volumes (CTV) for nasopharyngeal carcinoma.

    PubMed

    Lee, Anne W; Ng, Wai Tong; Pan, Jian Ji; Poh, Sharon S; Ahn, Yong Chan; AlHussain, Hussain; Corry, June; Grau, Cai; Grégoire, Vincent; Harrington, Kevin J; Hu, Chao Su; Kwong, Dora L; Langendijk, Johannes A; Le, Quynh Thu; Lee, Nancy Y; Lin, Jin Ching; Lu, Tai Xiang; Mendenhall, William M; O'Sullivan, Brian; Ozyar, Enis; Peters, Lester J; Rosenthal, David I; Soong, Yoke Lim; Tao, Yungan; Yom, Sue S; Wee, Joseph T

    2018-01-01

    Target delineation in nasopharyngeal carcinoma (NPC) often proves challenging because of the notoriously narrow therapeutic margin. High doses are needed to achieve optimal levels of tumour control, and dosimetric inadequacy remains one of the most important independent factors affecting treatment outcome. A review of the available literature addressing the natural behaviour of NPC and correlation between clinical and pathological aspects of the disease was conducted. Existing international guidelines as well as published protocols specified by clinical trials on contouring of clinical target volumes (CTV) were compared. This information was then summarized into a preliminary draft guideline which was then circulated to international experts in the field for exchange of opinions and subsequent voting on areas with the greatest controversies. Common areas of uncertainty and variation in practices among experts experienced in radiation therapy for NPC were elucidated. Iterative revisions were made based on extensive discussion and final voting on controversial areas by the expert panel, to formulate the recommendations on contouring of CTV based on optimal geometric expansion and anatomical editing for those structures with substantial risk of microscopic infiltration. Through this comprehensive review of available evidence and best practices at major institutions, as well as interactive exchange of vast experience by international experts, this set of consensus guidelines has been developed to provide a practical reference for appropriate contouring to ensure optimal target coverage. However, the final decision on the treatment volumes should be based on full consideration of individual patients' factors and facilities of an individual centre (including the quality of imaging methods and the precision of treatment delivery). Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Performance characteristics of magnetic resonance imaging without contrast agents or sedation in pediatric appendicitis.

    PubMed

    Didier, Ryne A; Hopkins, Katharine L; Coakley, Fergus V; Krishnaswami, Sanjay; Spiro, David M; Foster, Bryan R

    2017-09-01

    Magnetic resonance imaging (MRI) has emerged as a promising modality for evaluating pediatric appendicitis. However, optimal imaging protocols, including roles of contrast agents and sedation, have not been established, and diagnostic criteria have not been fully evaluated. To investigate performance characteristics of rapid MRI without contrast agents or sedation in the diagnosis of pediatric appendicitis. We included patients ages 4-18 years with suspicion of appendicitis who underwent rapid MRI between October 2013 and March 2015 without contrast agent or sedation. After two-radiologist review, we determined performance characteristics of individual diagnostic criteria and aggregate diagnostic criteria by comparing MRI results to clinical outcomes. We used receiver operating characteristic (ROC) curves to determine cut-points for appendiceal diameter and wall thickness for optimization of predictive power, and we calculated area under the curve (AUC) as a measure of test accuracy. Ninety-eight MRI examinations were performed in 97 subjects. Overall, MRI had a 94% sensitivity, 95% specificity, 91% positive predictive value and 97% negative predictive value. Optimal cut-points for appendiceal diameter and wall thickness were ≥7 mm and ≥2 mm, respectively. Independently, those cut-points produced sensitivities of 91% and 84% and specificities of 84% and 43%. Presence of intraluminal fluid (30/33) or localized periappendiceal fluid (32/33) showed a significant association with acute appendicitis (P<0.01), with sensitivities of 91% and 97% and specificities of 60% and 50%. For examinations in which the appendix was not identified by one or both reviewers (23/98), the clinical outcome was negative. Rapid MRI without contrast agents or sedation is accurate for diagnosis of pediatric appendicitis when multiple diagnostic criteria are considered in aggregate. Individual diagnostic criteria including optimized cut-points of ≥7 mm for diameter and ≥2 mm for wall thickness demonstrate high sensitivities but relatively low specificities. Nonvisualization of the appendix favors a negative diagnosis.
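
    The reported performance characteristics follow from a standard 2 × 2 confusion matrix; the counts below are illustrative values chosen only to be roughly consistent with the abstract's percentages, not the study's actual tabulation.

      # Deriving sensitivity, specificity, PPV and NPV from a 2x2 confusion matrix.
      # Counts are illustrative placeholders.
      tp, fp, fn, tn = 30, 3, 2, 63

      sensitivity = tp / (tp + fn)
      specificity = tn / (tn + fp)
      ppv = tp / (tp + fp)
      npv = tn / (tn + fn)
      print(f"sens {sensitivity:.0%}, spec {specificity:.0%}, PPV {ppv:.0%}, NPV {npv:.0%}")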

  12. Effects of Data Replication on Data Exfiltration in Mobile Ad Hoc Networks Utilizing Reactive Protocols

    DTIC Science & Technology

    2015-03-01

    [Only fragments of the report text were extracted: table-of-contents entries for sections on availability schemes and simulation environments; a discussion of two prominent proactive routing protocols, Destination-Sequenced Distance-Vector (DSDV) and Optimized Link State Routing; a mention of distributed file management systems such as Tahoe-LAFS used as part of a replication scheme; and a truncated reference to Altman and De Pellegrini [4] examining the impact of FEC.]

  13. Generation of Oligodendrogenic Spinal Neural Progenitor Cells From Human Induced Pluripotent Stem Cells.

    PubMed

    Khazaei, Mohamad; Ahuja, Christopher S; Fehlings, Michael G

    2017-08-14

    This unit describes protocols for the efficient generation of oligodendrogenic neural progenitor cells (o-NPCs) from human induced pluripotent stem cells (hiPSCs). Specifically, detailed methods are provided for the maintenance and differentiation of hiPSCs, human induced pluripotent stem cell-derived neural progenitor cells (hiPS-NPCs), and human induced pluripotent stem cell-oligodendrogenic neural progenitor cells (hiPSC-o-NPCs) with the final products being suitable for in vitro experimentation or in vivo transplantation. Throughout, cell exposure to growth factors and patterning morphogens has been optimized for both concentration and timing, based on the literature and empirical experience, resulting in a robust and highly efficient protocol. Using this derivation procedure, it is possible to obtain millions of oligodendrogenic-NPCs within 40 days of initial cell plating which is substantially shorter than other protocols for similar cell types. This protocol has also been optimized to use translationally relevant human iPSCs as the parent cell line. The resultant cells have been extensively characterized both in vitro and in vivo and express key markers of an oligodendrogenic lineage. © 2017 by John Wiley & Sons, Inc. Copyright © 2017 John Wiley and Sons, Inc.

  14. The introduction of a protocol for the use of biobrane for facial burns in children.

    PubMed

    Rogers, A D; Adams, S; Rode, H

    2011-01-01

    Biobrane has become an indispensable dressing with three established indications in acute burns care at our institution: (1) as the definitive dressing for superficial partial thickness facial burns, (2) after tangential excision of deep burns when autograft or cadaver skin is unavailable, and (3) for graft reduction. This paper details our initial experience with Biobrane for the management of superficial partial thickness facial burns in children and the protocol that was compiled for its optimal use. A retrospective analysis of theatre records, case notes and photographs was performed to evaluate our experience with Biobrane over a one-year period. Endpoints included length of stay, analgesic requirements, time to application of Biobrane, healing times, and aesthetic results. Historical controls were used to compare the results with our previous standard of care. 87 patients with superficial partial thickness burns of the face had Biobrane applied during this period. By adhering to the protocol we were able to demonstrate significant reductions in hospital stay, healing time, analgesic requirements and nursing care, with excellent cosmetic results. The protocol is widely accepted by all involved in the optimal management of these patients, including parents, anaesthetists, and nursing staff.

  15. Novel functional hepatitis C virus glycoprotein isolates identified using an optimized viral pseudotype entry assay.

    PubMed

    Urbanowicz, Richard A; McClure, C Patrick; King, Barnabas; Mason, Christopher P; Ball, Jonathan K; Tarr, Alexander W

    2016-09-01

    Retrovirus pseudotypes are a highly tractable model used to study the entry pathways of enveloped viruses. This model has been extensively applied to the study of the hepatitis C virus (HCV) entry pathway, preclinical screening of antiviral antibodies and for assessing the phenotype of patient-derived viruses using HCV pseudoparticles (HCVpp) possessing the HCV E1 and E2 glycoproteins. However, not all patient-isolated clones produce particles that are infectious in this model. This study investigated factors that might limit phenotyping of patient-isolated HCV glycoproteins. Genetically related HCV glycoproteins from quasispecies in individual patients were discovered to behave very differently in this entry model. Empirical optimization of the ratio of packaging construct and glycoprotein-encoding plasmid was required for successful HCVpp genesis for different clones. The selection of retroviral packaging construct also influenced the function of HCV pseudoparticles. Some glycoprotein constructs tolerated a wide range of assay parameters, while others were much more sensitive to alterations. Furthermore, glycoproteins previously characterized as unable to mediate entry were found to be functional. These findings were validated using chimeric cell-cultured HCV bearing these glycoproteins. Using the same empirical approach we demonstrated that generation of infectious ebolavirus pseudoviruses (EBOVpv) was also sensitive to the amount and ratio of plasmids used, and that protocols for optimal production of these pseudoviruses are dependent on the exact virus glycoprotein construct. These findings demonstrate that it is crucial for studies utilizing pseudoviruses to conduct empirical optimization of pseudotype production for each specific glycoprotein sequence to achieve optimal titres and facilitate accurate phenotyping.

  16. Discrete Particle Swarm Optimization Routing Protocol for Wireless Sensor Networks with Multiple Mobile Sinks

    PubMed Central

    Yang, Jin; Liu, Fagui; Cao, Jianneng; Wang, Liangming

    2016-01-01

    Mobile sinks can achieve load balancing and energy-consumption balancing across wireless sensor networks (WSNs). However, the frequent change of the paths between source nodes and the sinks caused by sink mobility introduces significant overhead in terms of energy and packet delays. To enhance the network performance of WSNs with mobile sinks (MWSNs), we present an efficient routing strategy, which is formulated as an optimization problem and employs the particle swarm optimization (PSO) algorithm to build the optimal routing paths. However, conventional PSO is insufficient for solving discrete routing optimization problems. Therefore, a novel greedy discrete particle swarm optimization with memory (GMDPSO) is put forward to address this problem. In the GMDPSO, the particle position and velocity of traditional PSO are redefined for the discrete MWSN scenario, and the particle updating rule is reformulated based on the subnetwork topology of MWSNs. Besides, by improving greedy forwarding routing, a greedy search strategy is designed to drive particles to find a better position quickly. Furthermore, the search history is memorized to accelerate convergence. Simulation results demonstrate that our new protocol significantly improves robustness and adapts to rapid topological changes with multiple mobile sinks, while efficiently reducing the communication overhead and the energy consumption. PMID:27428971
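
    As a rough illustration of particle-swarm path search with memory, the sketch below encodes a particle's position as a candidate route on a toy topology, uses a crude path-splicing step in place of a velocity update, and keeps personal and global bests. It is only a schematic of the general idea under these simplifying assumptions; it does not reproduce the paper's GMDPSO position/velocity definitions, greedy forwarding improvement, or mobile-sink handling, and the graph and costs are invented.

    ```python
    import random

    # Toy WSN: adjacency list with per-link energy cost (illustrative values).
    GRAPH = {
        'S': {'A': 2, 'B': 4},
        'A': {'C': 3, 'D': 6, 'S': 2},
        'B': {'D': 1, 'S': 4},
        'C': {'SINK': 5, 'A': 3},
        'D': {'SINK': 2, 'A': 6, 'B': 1},
        'SINK': {},
    }

    def random_path(src='S', dst='SINK', max_hops=8):
        """Random-walk path generator used to initialize particles."""
        path = [src]
        while path[-1] != dst and len(path) <= max_hops:
            nxt = [n for n in GRAPH[path[-1]] if n not in path]
            if not nxt:
                return None
            path.append(random.choice(nxt))
        return path if path[-1] == dst else None

    def cost(path):
        return sum(GRAPH[a][b] for a, b in zip(path, path[1:]))

    def combine(path, guide):
        """'Velocity' step: if the particle's path meets the guide path after
        the source, adopt the guide's suffix from the shared node onward."""
        for i, node in enumerate(path[1:], start=1):
            if node in guide:
                return path[:i] + guide[guide.index(node):]
        return path

    def discrete_pso(n_particles=8, iters=20):
        swarm = [p for p in (random_path() for _ in range(n_particles * 3)) if p][:n_particles]
        pbest = list(swarm)                       # memory of personal bests
        gbest = min(pbest, key=cost)              # memory of the global best
        for _ in range(iters):
            for i, p in enumerate(swarm):
                cand = combine(p, pbest[i]) if random.random() < 0.5 else combine(p, gbest)
                mutant = random_path()            # exploration / restart
                if mutant and cost(mutant) < cost(cand):
                    cand = mutant
                swarm[i] = cand
                if cost(cand) < cost(pbest[i]):
                    pbest[i] = cand
            gbest = min(pbest + [gbest], key=cost)
        return gbest, cost(gbest)

    print(discrete_pso())
    ```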

  17. On the MTD paradigm and optimal control for multi-drug cancer chemotherapy.

    PubMed

    Ledzewicz, Urszula; Schättler, Heinz; Gahrooi, Mostafa Reisi; Dehkordi, Siamak Mahmoudian

    2013-06-01

    In standard chemotherapy protocols, drugs are given at maximum tolerated doses (MTD) with rest periods in between. In this paper, we briefly discuss the rationale behind this therapy approach and, using multidrug cancer chemotherapy with a cytotoxic and a cytostatic agent as an example, show that these types of protocols are optimal in the sense of minimizing a weighted average of the number of tumor cells (taken both at the end of therapy and at intermediate times) and the total dose given, provided the tumor consists of a homogeneous population of chemotherapeutically sensitive cells. A 2-compartment linear model is used for the pharmacokinetic equations of the drugs.
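
    For readers unfamiliar with the modeling setup, a generic linear two-compartment pharmacokinetic model takes the form below. This is a standard textbook formulation given only as a point of reference; the rate constants and dosing term shown here are assumptions and may differ from the specific equations used in the paper.

    ```latex
    % Illustrative linear two-compartment pharmacokinetic model (standard
    % textbook form; not necessarily the paper's exact equations).
    \[
    \begin{aligned}
    \dot{c}_1(t) &= -(k_{10} + k_{12})\,c_1(t) + k_{21}\,c_2(t) + u(t),\\
    \dot{c}_2(t) &= k_{12}\,c_1(t) - k_{21}\,c_2(t),
    \end{aligned}
    \]
    ```

    Here $c_1$ is the drug concentration in the central (plasma) compartment, $c_2$ the concentration in the peripheral compartment, the $k_{ij}$ are transfer and elimination rate constants, and $u(t)$ is the dosing rate that the optimal control problem selects subject to the total-dose term in the objective.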

  18. A standardized staining protocol for L1CAM on formalin-fixed, paraffin-embedded tissues using automated platforms.

    PubMed

    Fogel, Mina; Harari, Ayelet; Müller-Holzner, Elisabeth; Zeimet, Alain G; Moldenhauer, Gerhard; Altevogt, Peter

    2014-06-25

    The L1 cell adhesion molecule (L1CAM) is overexpressed in many human cancers and can serve as a biomarker for prognosis in most of these cancers (including type I endometrial carcinomas). Here we provide an optimized immunohistochemical staining procedure for a widely used automated platform (VENTANA™), which uses commercially available primary antibody and detection reagents. In parallel, we optimized the staining on a semi-automated BioGenix (i6000) immunostainer. These protocols yield good staining results and should represent the basis for a reliable and standardized immunohistochemical detection of L1CAM in a variety of malignancies in different laboratories.

  19. Optimizing radiation exposure in screening of body packing: image quality and diagnostic acceptability of an 80 kVp protocol with automated tube current modulation.

    PubMed

    Aissa, Joel; Boos, Johannes; Rubbert, Christian; Caspers, Julian; Schleich, Christoph; Thomas, Christoph; Kröpil, Patric; Antoch, Gerald; Miese, Falk

    2017-06-01

    The aim of this study was to evaluate the objective and subjective image quality of a novel computed tomography (CT) protocol with reduced radiation dose for body packing with 80 kVp and automated tube current modulation (ATCM) compared to a standard body packing CT protocol. Eighty individuals examined between March 2012 and July 2015 on suspicion of ingested drug packets were retrospectively included in this study. Thirty-one CT examinations were performed using ATCM and a fixed tube voltage of 80 kVp (group A). Forty-nine CT examinations were performed using a standard protocol with a tube voltage of 120 kVp and a fixed tube current time product of 40 mAs (group B). Subjective and objective image quality and visibility of drug packets were assessed. Radiation exposure of both protocols was compared. The contrast-to-noise ratio (group A: 0.56 ± 0.36; group B: 1.13 ± 0.91) and the signal-to-noise ratio (group A: 3.69 ± 0.98; group B: 7.08 ± 2.67) were significantly lower for group A compared to group B (p < 0.001). Subjectively, image quality was decreased for group A compared to group B (2.5 ± 0.8 vs. 1.2 ± 0.4; p < 0.001). Attenuation of body packets was higher with the new protocol (group A: 362.2 ± 70.3 Hounsfield Units (HU); group B: 210.6 ± 60.2 HU; p = 0.005). Volumetric Computed Tomography Dose Index (CTDIvol) and Dose Length Product (DLP) were significantly lower in group A (CTDIvol 2.2 ± 0.9 mGy, DLP 105.7 ± 52.3 mGycm) as compared to group B (CTDIvol 2.7 ± 0.1 mGy, DLP 126.0 ± 9.7 mGycm, p = 0.002 and p = 0.01). The novel 80 kVp CT protocol with ATCM leads to a significant dose reduction compared to a standard CT body packing protocol. The novel protocol provided diagnostic image quality, and cocaine body packets were reliably detected owing to their high attenuation.

  20. Utilization of paramagnetic microparticles for automated isolation of free circulating mRNA as a new tool in prostate cancer diagnostics.

    PubMed

    Fojtu, Michaela; Gumulec, Jaromir; Balvan, Jan; Raudenska, Martina; Sztalmachova, Marketa; Polanska, Hana; Smerkova, Kristyna; Adam, Vojtech; Kizek, Rene; Masarik, Michal

    2014-02-01

    Determination of serum mRNA has gained a lot of attention in recent years, particularly from the perspective of disease markers. Streptavidin-modified paramagnetic particles (SMPs) are an interesting technique, mainly due to the possibility of automated isolation and high efficiency. The aim of this study was to optimize the serum isolation protocol to reduce the consumption of chemicals and sample volume. The following factors were optimized: the amounts of (i) paramagnetic particles, (ii) oligo(dT)20 probe, and (iii) serum, and (iv) the binding sequence (SMPs, oligo(dT)20, serum vs. oligo(dT)20, serum and SMPs). RNA content was measured, and the expression of metallothionein-2A as a possible prostate cancer marker was analyzed to demonstrate measurable RNA content suitable for RT-PCR detection. Isolation is possible over a serum volume range of 10-200 μL without altering efficiency or purity. The amount of SMPs can be reduced to as little as 5 μL, with optimal results within 10-30 μL of SMPs. The volume of oligo(dT)20 does not affect efficiency when used within 0.1-0.4 μL. This optimized protocol was also modified to fit the needs of automated one-step single-tube analysis with identical efficiency compared to the conventional setup. The one-step analysis protocol is a promising simplification, making RNA isolation suitable for an automated process. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Distributed Wireless Power Transfer With Energy Feedback

    NASA Astrophysics Data System (ADS)

    Lee, Seunghyun; Zhang, Rui

    2017-04-01

    Energy beamforming (EB) is a key technique for achieving efficient radio-frequency (RF) transmission enabled wireless energy transfer (WET). By optimally designing the waveforms transmitted by multiple energy transmitters (ETs) over the wireless channels, the signals can be constructively combined at the energy receiver (ER) to achieve an EB gain that scales with the number of ETs. However, the optimal design of EB waveforms requires accurate channel state information (CSI) at the ETs, which is challenging to obtain in practice, especially in a distributed system with ETs at separate locations. In this paper, we study practical and efficient channel training methods to achieve optimal EB in a distributed WET system. We propose two protocols, with and without centralized coordination, respectively, where distributed ETs either sequentially or in parallel adapt their transmit phases based on a low-complexity energy feedback from the ER. The energy feedback depends only on the received power level at the ER, where each feedback indicates the particular transmit phase that results in the maximum harvested power over a set of previously used phases. Simulation results show that the two proposed training protocols converge very fast in practical WET systems even with a large number of distributed ETs, while the protocol with sequential ET phase adaptation is also analytically shown to converge, as the training time increases, to the optimal EB design with perfect CSI. Numerical results are also provided to evaluate the performance of the proposed distributed EB and training designs as compared to other benchmark schemes.
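
    A minimal simulation sketch of the sequential phase-adaptation idea is shown below, assuming flat Rayleigh channels and power-only feedback of the best candidate phase per ET. The function names, candidate-phase grid and channel model are illustrative assumptions, not the paper's exact training protocol.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def received_power(phases, h):
        """Power harvested at the ER when ET k transmits exp(j*phase_k) over channel h_k."""
        return abs(np.sum(h * np.exp(1j * phases))) ** 2

    def sequential_training(h, n_candidates=8, rounds=2):
        """Each ET in turn sweeps a small set of phases; the ER feeds back only
        the index of the phase that maximized received power (energy feedback)."""
        K = len(h)
        phases = np.zeros(K)
        candidates = 2 * np.pi * np.arange(n_candidates) / n_candidates
        for _ in range(rounds):
            for k in range(K):
                powers = []
                for c in candidates:                 # training slots for ET k
                    trial = phases.copy()
                    trial[k] = c
                    powers.append(received_power(trial, h))
                phases[k] = candidates[int(np.argmax(powers))]   # power-only feedback
        return phases

    K = 6
    h = (rng.normal(size=K) + 1j * rng.normal(size=K)) / np.sqrt(2)   # i.i.d. Rayleigh channels
    trained = sequential_training(h)
    optimal = -np.angle(h)                           # ideal co-phasing with perfect CSI
    print(received_power(trained, h), received_power(optimal, h))
    ```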

  2. A two-hop based adaptive routing protocol for real-time wireless sensor networks.

    PubMed

    Rachamalla, Sandhya; Kancherla, Anitha Sheela

    2016-01-01

    One of the most important and challenging issues in wireless sensor networks (WSNs) is to optimally manage the limited energy of nodes without degrading routing efficiency. In this paper, we propose an energy-efficient adaptive routing mechanism for WSNs, which saves node energy by dropping excessively delayed packets without degrading the real-time performance of the underlying routing protocol. It uses an adaptive transmission power algorithm, based on the attenuation of the wireless link, to improve energy efficiency. The proposed routing mechanism can be combined with any geographic routing protocol; its performance is evaluated by integrating it with the well-known two-hop based real-time routing protocol PATH, and the resulting protocol is the energy-efficient adaptive routing protocol (EE-ARP). The EE-ARP performs well in terms of energy consumption, deadline miss ratio, packet drop and end-to-end delay.

  3. Finite-key security analyses on passive decoy-state QKD protocols with different unstable sources

    PubMed Central

    Song, Ting-Ting; Qin, Su-Juan; Wen, Qiao-Yan; Wang, Yu-Kun; Jia, Heng-Yue

    2015-01-01

    In quantum communication, passive decoy-state QKD protocols can eliminate many side channels, but protocols without finite-key analyses are not suitable for use in practice. In this paper we analyze the finite-key security of passive decoy-state (PDS) QKD protocols with two different unstable sources, type-II parametric down-conversion (PDC) and phase-randomized weak coherent pulses (WCPs). For each PDS QKD protocol, we establish an optimization program and obtain lower bounds on the finite-key rate. The lower bounds are simulated under reasonable values of the quantum setup parameters. The simulation results show that, at different transmission distances, the effects of the different fluctuations on the key rates differ. Moreover, the PDS QKD protocol with an unstable PDC source is more robust against intensity fluctuations and statistical fluctuations. PMID:26471947

  4. Packet-Based Protocol Efficiency for Aeronautical and Satellite Communications

    NASA Technical Reports Server (NTRS)

    Carek, David A.

    2005-01-01

    This paper examines the relation between bit error ratios and the effective link efficiency when transporting data with a packet-based protocol. Relations are developed to quantify the impact of a protocol's packet size and header size relative to the bit error ratio of the underlying link. These relations are examined in the context of radio transmissions that exhibit variable error conditions, such as those used in satellite, aeronautical, and other wireless networks. A comparison of two packet sizing methodologies is presented. From these relations, the true ability of a link to deliver user data, or information, is determined. Relations are developed to calculate the optimal protocol packet size for given link error characteristics. These relations could be useful in future research for developing an adaptive protocol layer. They can also be used for sizing protocols in the design of static links, where bit error ratios have small variability.
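
    A common closed-form model for this trade-off treats a packet as lost whenever any of its bits is in error, giving efficiency ((L - H)/L)(1 - BER)^L for packet size L and header size H; setting the derivative with respect to L to zero yields an optimal packet size. The sketch below implements that textbook-style model, which is assumed to be representative of, but not identical to, the specific relations developed in the paper; the header size and BER values are illustrative.

    ```python
    import math

    def link_efficiency(packet_bits, header_bits, ber):
        """Fraction of link capacity delivering user data, assuming a packet is
        lost if any of its bits is in error (memoryless bit-error model)."""
        payload = packet_bits - header_bits
        return (payload / packet_bits) * (1.0 - ber) ** packet_bits

    def optimal_packet_bits(header_bits, ber):
        """Packet size maximizing the efficiency above (stationary point of the
        closed-form model, obtained by setting d(efficiency)/d(packet) = 0)."""
        a = -math.log(1.0 - ber)
        return header_bits / 2.0 + math.sqrt(header_bits ** 2 / 4.0 + header_bits / a)

    H = 48 * 8                       # e.g. a 48-byte header (illustrative)
    for ber in (1e-6, 1e-5, 1e-4):
        L = optimal_packet_bits(H, ber)
        print(f"BER={ber:g}  optimal packet ≈ {L/8:.0f} bytes, "
              f"efficiency ≈ {link_efficiency(L, H, ber):.3f}")
    ```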

  5. Numerical approach for unstructured quantum key distribution

    PubMed Central

    Coles, Patrick J.; Metodiev, Eric M.; Lütkenhaus, Norbert

    2016-01-01

    Quantum key distribution (QKD) allows for communication with security guaranteed by quantum theory. The main theoretical problem in QKD is to calculate the secret key rate for a given protocol. Analytical formulas are known for protocols with symmetries, since symmetry simplifies the analysis. However, experimental imperfections break symmetries, hence the effect of imperfections on key rates is difficult to estimate. Furthermore, it is an interesting question whether (intentionally) asymmetric protocols could outperform symmetric ones. Here we develop a robust numerical approach for calculating the key rate for arbitrary discrete-variable QKD protocols. Ultimately this will allow researchers to study ‘unstructured' protocols, that is, those that lack symmetry. Our approach relies on transforming the key rate calculation to the dual optimization problem, which markedly reduces the number of parameters and hence the calculation time. We illustrate our method by investigating some unstructured protocols for which the key rate was previously unknown. PMID:27198739

  6. Optimizing Filter-Probe Diffusion Weighting in the Rat Spinal Cord for Human Translation

    PubMed Central

    Budde, Matthew D.; Skinner, Nathan P.; Muftuler, L. Tugan; Schmit, Brian D.; Kurpad, Shekar N.

    2017-01-01

    Diffusion tensor imaging (DTI) is a promising biomarker of spinal cord injury (SCI). In the acute aftermath, DTI in SCI animal models consistently demonstrates high sensitivity and prognostic performance, yet translation of DTI to acute human SCI has been limited. In addition to technical challenges, interpretation of the resulting metrics is ambiguous, with contributions in the acute setting from both axonal injury and edema. Novel diffusion MRI acquisition strategies such as double diffusion encoding (DDE) have recently enabled detection of features not available with DTI or similar methods. In this work, we perform a systematic optimization of DDE using simulations and an in vivo rat model of SCI and subsequently implement the protocol in the healthy human spinal cord. First, two complementary DDE approaches were evaluated using an orientationally invariant or a filter-probe diffusion encoding approach. While the two methods were similar in their ability to detect acute SCI, the filter-probe DDE approach had greater predictive power for functional outcomes. Next, the filter-probe DDE was compared to an analogous single diffusion encoding (SDE) approach, with the results indicating that in the spinal cord, SDE provides similar contrast with improved signal to noise. In the SCI rat model, the filter-probe SDE scheme was coupled with a reduced field of view (rFOV) excitation, and the results demonstrate high-quality maps of the spinal cord without contamination from edema and cerebrospinal fluid, thereby providing high sensitivity to injury severity. The optimized protocol was demonstrated in the healthy human spinal cord using a commercially available diffusion MRI sequence with modifications only to the diffusion encoding directions. Maps of axial diffusivity devoid of CSF partial volume effects were obtained in a clinically feasible imaging time with a straightforward analysis and variability comparable to axial diffusivity derived from DTI. Overall, the results and optimizations describe a protocol that mitigates several difficulties with DTI of the spinal cord. Detection of acute axonal damage in the injured or diseased spinal cord will benefit from the optimized filter-probe diffusion MRI protocol outlined here. PMID:29311786

  7. Template-based de novo design for type II kinase inhibitors and its extended application to acetylcholinesterase inhibitors.

    PubMed

    Su, Bo-Han; Huang, Yi-Syuan; Chang, Chia-Yun; Tu, Yi-Shu; Tseng, Yufeng J

    2013-10-31

    There is a compelling need to discover type II inhibitors targeting the unique DFG-out inactive kinase conformation, since they are likely to possess greater potency and selectivity relative to traditional type I inhibitors. Using a known inhibitor, such as a currently available and approved drug, as a template to design new drugs via computational de novo design is helpful when working with known ligand-receptor interactions. This study proposes a new template-based de novo design protocol to discover new inhibitors that preserve and also optimize the binding interactions of the type II kinase template. First, sorafenib (Nexavar) and nilotinib (Tasigna), two type II inhibitors with different ligand-receptor interactions, were selected as the template compounds. The five-step protocol can reassemble each drug from a large fragment library. Our procedure demonstrates that the selected template compounds can be successfully reassembled while the key ligand-receptor interactions are preserved. Furthermore, to demonstrate that the algorithm is able to construct more potent compounds, we considered kinase inhibitors as well as a dataset for another protein, acetylcholinesterase (AChE). The de novo optimization was initiated from template compounds with less than optimal activity: compounds from an aminoisoquinoline series and TAK-285, which inhibit type II kinases, and E2020 derivatives, which inhibit AChE, respectively. Three compounds with greater potency than the template compound were discovered that were also included in the original congeneric series. This template-based lead optimization protocol with the fragment library can help to automatically design compounds that preserve the binding interactions of known inhibitors and to further optimize the compounds in their binding pockets.

  8. Community Pharmacists Assisting in Total Cardiovascular Health (CPATCH): A Cluster-Randomized, Controlled Trial Testing a Focused Adherence Strategy Involving Community Pharmacies.

    PubMed

    Blackburn, David F; Evans, Charity D; Eurich, Dean T; Mansell, Kerry D; Jorgenson, Derek J; Taylor, Jeff G; Semchuk, William M; Shevchuk, Yvonne M; Remillard, Alfred J; Tran, David A; Champagne, Anne P

    2016-10-01

    To test a brief intervention for preventing statin nonadherence among community pharmacy patrons. Prospective, cluster-randomized, controlled trial (the Community Pharmacists Assisting in Total Cardiovascular Health [CPATCH] trial). Thirty community pharmacies in Saskatchewan, Canada, participated. Pharmacies were randomized to 15 intervention pharmacies, where pharmacists delivered a brief statin adherence intervention to new users of statins (defined as less than 1 year of statin therapy; intervention group [907 patients]), or to 15 usual care pharmacies, where no statin adherence intervention was delivered (usual care group [999 patients]). Staff (pharmacy managers, staff pharmacists, and technicians) from intervention pharmacies attended a 2.5-hour workshop on the CPATCH program that prepared pharmacists to deal with the adherence barriers most likely associated with statin use (e.g., safety, cost, patient-provider relationship, and tolerability). Intervention pharmacists screened for new statin users and assessed these adherence barriers. Pharmacists were then instructed to tailor their follow-up plan based on the individual patient's situation. Investigators contacted the intervention pharmacies monthly to assess their compliance with the protocol and to offer additional support to motivate ongoing participation. The primary outcome was the mean difference in statin adherence between the intervention and usual care groups. Adherence was measured by the proportion of days covered (PDC) between 6 and 12 months following the original prescription fill date. Generalized estimating equations were used to evaluate the difference in mean adherence between groups. Secondary outcomes included the percentage of new statin users exhibiting optimal adherence (defined as PDC of 80% or higher) and the percentage exhibiting nonpersistence (defined as the cessation of all statin dispensations within 3 mo of the first dispensation). Among 1906 eligible patients, no significant differences were observed between the intervention and usual care groups in mean adherence (71.6% vs 70.9%, p=0.64), the percentage of patients achieving optimal adherence (57.3% vs 55.9%, p=0.51), or the percentage exhibiting nonpersistence (9.4% vs 8.3%, p=0.41). However, compliance with the study protocol was extremely low in several intervention pharmacies. In a post hoc analysis, a higher level of protocol compliance among intervention pharmacies was significantly associated with higher adherence (p<0.01 for trend). Pharmacies falling in the highest tertile of compliance with the study protocol exhibited higher mean adherence among their patients compared with those in the usual care group (β = 0.056, 95% confidence interval [CI] 0.010-0.101, p=0.01), and a significantly higher percentage of patients achieving optimal adherence (odds ratio 1.32, 95% CI 1.08-1.61; p<0.01); however, nonpersistence did not significantly differ between the two groups (5.5% vs 8.3%, p=0.27). The CPATCH intervention was ineffective for improving patient adherence to statin therapy in community pharmacies. However, poor effectiveness may have resulted from a failure to deliver the protocol consistently in several intervention pharmacies. © 2016 Pharmacotherapy Publications, Inc.
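
    For reference, the proportion of days covered can be computed directly from dispensation records, as in the minimal sketch below. The dates, days-supply values and helper name are illustrative assumptions; the 80% optimal-adherence threshold matches the definition given in the abstract, but the study's exact handling of overlapping fills and the 6-12 month window may differ.

    ```python
    from datetime import date

    def proportion_of_days_covered(fills, window_start, window_end):
        """fills: list of (dispense_date, days_supply). Returns the fraction of
        days in [window_start, window_end] covered by at least one dispensation."""
        window_days = (window_end - window_start).days + 1
        covered = set()
        for dispensed, days_supply in fills:
            for offset in range(days_supply):
                day = dispensed.toordinal() + offset
                if window_start.toordinal() <= day <= window_end.toordinal():
                    covered.add(day)
        return len(covered) / window_days

    # Three 90-day statin fills over one calendar year (invented example).
    fills = [(date(2015, 1, 10), 90), (date(2015, 4, 20), 90), (date(2015, 8, 1), 90)]
    pdc = proportion_of_days_covered(fills, date(2015, 1, 1), date(2015, 12, 31))
    print(f"PDC = {pdc:.2f}, optimal adherence: {pdc >= 0.80}")
    ```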

  9. Facilitatory non-invasive brain stimulation in older adults: the effect of stimulation type and duration on the induction of motor cortex plasticity.

    PubMed

    Puri, Rohan; Hinder, Mark R; Canty, Alison J; Summers, Jeffery J

    2016-12-01

    Despite holding significant promise for counteracting the deleterious effects of ageing on cognitive and motor function, little is known of the effects of facilitatory non-invasive brain stimulation (NBS) techniques on corticospinal excitability (CSE) in older adults. Thirty-three older adults (≥60 years) participated in four NBS sessions on separate days, receiving 10- and 20-min anodal transcranial direct current stimulation (atDCS), and 300 and 600 pulses of intermittent theta burst stimulation (iTBS) over the left M1. Motor-evoked potentials measured in the contralateral hand served as a measure of CSE before and for 30 min following each NBS intervention. At the group level, generalized post-stimulation CSE increases were observed (p < 0.001) with no significant differences between the two durations of each stimulation type (atDCS: p = 0.5; iTBS: p = 0.9). For individuals exhibiting overall facilitatory change to atDCS ('responders', n = 10), 20-min atDCS resulted in longer-lasting CSE facilitation than 10 min. No such difference was observed between the two iTBS protocols. Considerable variability was observed inter-individually, where 52-58% of the cohort exhibited the expected facilitation after each of the NBS protocols, as well as intra-individually, where 45-48% of the cohort maintained consistent post-stimulation responses across the varying durations and types of stimulation. In conclusion, as shown previously in young adults, older adults demonstrate substantial variability in response to different facilitatory NBS protocols. Studies to assess the intra-individual reliability of these protocols are critical to progress towards translation of appropriate protocols (i.e. those that elicit the greatest response for each individual) into clinical practice.

  10. Introduction of a standardized multimodality image protocol for navigation-guided surgery of suspected low-grade gliomas.

    PubMed

    Mert, Aygül; Kiesel, Barbara; Wöhrer, Adelheid; Martínez-Moreno, Mauricio; Minchev, Georgi; Furtner, Julia; Knosp, Engelbert; Wolfsberger, Stefan; Widhalm, Georg

    2015-01-01

    OBJECT Surgery of suspected low-grade gliomas (LGGs) poses a special challenge for neurosurgeons due to their diffusely infiltrative growth and histopathological heterogeneity. Consequently, neuronavigation with multimodality imaging data, such as structural and metabolic data, fiber tracking, and 3D brain visualization, has been proposed to optimize surgery. However, currently no standardized protocol has been established for multimodality imaging data in modern glioma surgery. The aim of this study was therefore to define a specific protocol for multimodality imaging and navigation for suspected LGG. METHODS Fifty-one patients who underwent surgery for a diffusely infiltrating glioma with nonsignificant contrast enhancement on MRI and available multimodality imaging data were included. In the first 40 patients with glioma, the authors retrospectively reviewed the imaging data, including structural MRI (contrast-enhanced T1-weighted, T2-weighted, and FLAIR sequences), metabolic images derived from PET, or MR spectroscopy chemical shift imaging, fiber tracking, and 3D brain surface/vessel visualization, to define standardized image settings and specific indications for each imaging modality. The feasibility and surgical relevance of this new protocol was subsequently prospectively investigated during surgery with the assistance of an advanced electromagnetic navigation system in the remaining 11 patients. Furthermore, specific surgical outcome parameters, including the extent of resection, histological analysis of the metabolic hotspot, presence of a new postoperative neurological deficit, and intraoperative accuracy of 3D brain visualization models, were assessed in each of these patients. RESULTS After reviewing these first 40 cases of glioma, the authors defined a specific protocol with standardized image settings and specific indications that allows for optimal and simultaneous visualization of structural and metabolic data, fiber tracking, and 3D brain visualization. This new protocol was feasible and was estimated to be surgically relevant during navigation-guided surgery in all 11 patients. According to the authors' predefined surgical outcome parameters, they observed a complete resection in all resectable gliomas (n = 5) by using contour visualization with T2-weighted or FLAIR images. Additionally, tumor tissue derived from the metabolic hotspot showed the presence of malignant tissue in all WHO Grade III or IV gliomas (n = 5). Moreover, no permanent postoperative neurological deficits occurred in any of these patients, and fiber tracking and/or intraoperative monitoring were applied during surgery in the vast majority of cases (n = 10). Furthermore, the authors found a significant intraoperative topographical correlation of 3D brain surface and vessel models with gyral anatomy and superficial vessels. Finally, real-time navigation with multimodality imaging data using the advanced electromagnetic navigation system was found to be useful for precise guidance to surgical targets, such as the tumor margin or the metabolic hotspot. CONCLUSIONS In this study, the authors defined a specific protocol for multimodality imaging data in suspected LGGs, and they propose the application of this new protocol for advanced navigation-guided procedures optimally in conjunction with continuous electromagnetic instrument tracking to optimize glioma surgery.

  11. Advanced medical imaging protocol workflow-a flexible electronic solution to optimize process efficiency, care quality and patient safety in the National VA Enterprise.

    PubMed

    Medverd, Jonathan R; Cross, Nathan M; Font, Frank; Casertano, Andrew

    2013-08-01

    Radiologists routinely make decisions with only limited information when assigning protocol instructions for the performance of advanced medical imaging examinations. Opportunity exists to simultaneously improve the safety, quality and efficiency of this workflow through the application of an electronic solution leveraging health system resources to provide concise, tailored information and decision support in real-time. Such a system has been developed using an open source, open standards design for use within the Veterans Health Administration. The Radiology Protocol Tool Recorder (RAPTOR) project identified key process attributes as well as inherent weaknesses of paper processes and electronic emulators of paper processes to guide the development of its optimized electronic solution. The design provides a kernel that can be expanded to create an integrated radiology environment. RAPTOR has implications relevant to the greater health care community, and serves as a case model for modernization of legacy government health information systems.

  12. Profiling optimization for big data transfer over dedicated channels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yun, D.; Wu, Qishi; Rao, Nageswara S

    The transfer of big data is increasingly supported by dedicated channels in high-performance networks, where transport protocols play an important role in maximizing application-level throughput and link utilization. The performance of transport protocols largely depends on their control parameter settings, but it is prohibitively time-consuming to conduct an exhaustive search in a large parameter space to find the best set of parameter values. We propose FastProf, a stochastic approximation-based transport profiler, to quickly determine the optimal operational zone of a given data transfer protocol/method over dedicated channels. We implement and test the proposed method using both emulations based on real-life performance measurements and experiments over physical connections with short (2 ms) and long (380 ms) delays. Both the emulation and experimental results show that FastProf significantly reduces the profiling overhead while achieving a level of end-to-end throughput performance comparable to the exhaustive search-based approach.
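
    As a toy illustration of stochastic, measurement-driven profiling, the sketch below perturbs a single transport parameter (here a hypothetical parallel-stream count) and keeps a change only when the noisy measured throughput improves. The synthetic throughput function and search rule are assumptions made for illustration and do not represent the actual FastProf algorithm.

    ```python
    import random

    def measure_throughput(streams):
        """Stand-in for a real transfer trial: noisy throughput with a peak
        around 12 parallel streams (purely synthetic)."""
        ideal = 10.0 * streams / (1.0 + (streams / 12.0) ** 2)
        return ideal + random.gauss(0.0, 0.5)

    def stochastic_profile(low=1, high=64, trials=40):
        """Randomized local search: repeatedly perturb the current setting and
        keep it only if the (noisy) measured throughput improves."""
        current = random.randint(low, high)
        best_rate = measure_throughput(current)
        for _ in range(trials):
            step = random.choice([-4, -2, -1, 1, 2, 4])
            candidate = min(high, max(low, current + step))
            rate = measure_throughput(candidate)
            if rate > best_rate:
                current, best_rate = candidate, rate
        return current, best_rate

    print(stochastic_profile())
    ```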

  13. Eliciting improved quantitative judgements using the IDEA protocol: A case study in natural resource management.

    PubMed

    Hemming, Victoria; Walshe, Terry V; Hanea, Anca M; Fidler, Fiona; Burgman, Mark A

    2018-01-01

    Natural resource management uses expert judgement to estimate facts that inform important decisions. Unfortunately, expert judgement is often derived by informal and largely untested protocols, despite evidence that the quality of judgements can be improved with structured approaches. We attribute the lack of uptake of structured protocols to the dearth of illustrative examples that demonstrate how they can be applied within pressing time and resource constraints, while also improving judgements. In this paper, we demonstrate how the IDEA protocol for structured expert elicitation may be deployed to overcome operational challenges while improving the quality of judgements. The protocol was applied to the estimation of 14 future abiotic and biotic events on the Great Barrier Reef, Australia. Seventy-six participants with varying levels of expertise related to the Great Barrier Reef were recruited and allocated randomly to eight groups. Each participant provided their judgements using the four-step question format of the IDEA protocol ('Investigate', 'Discuss', 'Estimate', 'Aggregate') through remote elicitation. When the events were realised, the participant judgements were scored in terms of accuracy, calibration and informativeness. The results demonstrate that the IDEA protocol provides a practical, cost-effective, and repeatable approach to the elicitation of quantitative estimates and uncertainty via remote elicitation. We emphasise that i) the aggregation of diverse individual judgements into pooled group judgments almost always outperformed individuals, and ii) use of a modified Delphi approach helped to remove linguistic ambiguity, and further improved individual and group judgements. Importantly, the protocol encourages review, critical appraisal and replication, each of which is required if judgements are to be used in place of data in a scientific context. The results add to the growing body of literature that demonstrates the merit of using structured elicitation protocols. We urge decision-makers and analysts to use insights and examples to improve the evidence base of expert judgement in natural resource management.

  14. Fully Automated Sample Preparation for Ultrafast N-Glycosylation Analysis of Antibody Therapeutics.

    PubMed

    Szigeti, Marton; Lew, Clarence; Roby, Keith; Guttman, Andras

    2016-04-01

    There is a growing demand in the biopharmaceutical industry for high-throughput, large-scale N-glycosylation profiling of therapeutic antibodies in all phases of product development, but especially during clone selection when hundreds of samples should be analyzed in a short period of time to assure their glycosylation-based biological activity. Our group has recently developed a magnetic bead-based protocol for N-glycosylation analysis of glycoproteins to alleviate the hard-to-automate centrifugation and vacuum-centrifugation steps of the currently used protocols. Glycan release, fluorophore labeling, and cleanup were all optimized, resulting in a <4 h magnetic bead-based process with excellent yield and good repeatability. This article demonstrates the next level of this work by automating all steps of the optimized magnetic bead-based protocol from endoglycosidase digestion, through fluorophore labeling and cleanup with high-throughput sample processing in 96-well plate format, using an automated laboratory workstation. Capillary electrophoresis analysis of the fluorophore-labeled glycans was also optimized for rapid (<3 min) separation to accommodate the high-throughput processing of the automated sample preparation workflow. Ultrafast N-glycosylation analyses of several commercially relevant antibody therapeutics are also shown and compared to their biosimilar counterparts, addressing the biological significance of the differences. © 2015 Society for Laboratory Automation and Screening.

  15. Optimizing radiotherapy protocols using computer automata to model tumour cell death as a function of oxygen diffusion processes.

    PubMed

    Paul-Gilloteaux, Perrine; Potiron, Vincent; Delpon, Grégory; Supiot, Stéphane; Chiavassa, Sophie; Paris, François; Costes, Sylvain V

    2017-05-23

    The concept of hypofractionation is gaining momentum in radiation oncology centres, enabled by recent advances in radiotherapy apparatus. The gain of efficacy of this innovative treatment must be defined. We present a computer model based on translational murine data for in silico testing and optimization of various radiotherapy protocols with respect to tumour resistance and the microenvironment heterogeneity. This model combines automata approaches with image processing algorithms to simulate the cellular response of tumours exposed to ionizing radiation, modelling the alteration of oxygen permeabilization in blood vessels against repeated doses, and introducing mitotic catastrophe (as opposed to arbitrary delayed cell-death) as a means of modelling radiation-induced cell death. Published data describing cell death in vitro as well as tumour oxygenation in vivo are used to inform parameters. Our model is validated by comparing simulations to in vivo data obtained from the radiation treatment of mice transplanted with human prostate tumours. We then predict the efficacy of untested hypofractionation protocols, hypothesizing that tumour control can be optimized by adjusting daily radiation dosage as a function of the degree of hypoxia in the tumour environment. Further biological refinement of this tool will permit the rapid development of more sophisticated strategies for radiotherapy.

  16. Quantitative Analysis of the Effect of Iterative Reconstruction Using a Phantom: Determining the Appropriate Blending Percentage

    PubMed Central

    Kim, Hyun Gi; Lee, Young Han; Choi, Jin-Young; Park, Mi-Suk; Kim, Myeong-Jin; Kim, Ki Whang

    2015-01-01

    Purpose To investigate the optimal blending percentage of adaptive statistical iterative reconstruction (ASIR) at a reduced radiation dose while preserving image quality and texture similar to those of standard-dose computed tomography (CT). Materials and Methods A CT performance phantom was scanned with standard and dose-reduction protocols, including reduced mAs or kVp. Image quality parameters, including noise, spatial resolution, and low-contrast resolution, as well as image texture, were quantitatively evaluated after applying various blending percentages of ASIR. The optimal blending percentage of ASIR that preserved image quality and texture compared to standard-dose CT was investigated for each radiation dose-reduction protocol. Results As the percentage of ASIR increased, noise and spatial resolution decreased, whereas low-contrast resolution increased. In the texture analysis, an increasing percentage of ASIR resulted in an increase of angular second moment, inverse difference moment, and correlation and in a decrease of contrast and entropy. The 20% and 40% dose-reduction protocols with 20% and 40% ASIR blending, respectively, resulted in optimal image quality with preservation of image texture. Conclusion Blending 40% ASIR into the 40% reduced tube-current protocol can maximize radiation dose reduction while preserving adequate image quality and texture. PMID:25510772
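
    The texture metrics mentioned (angular second moment, contrast, entropy and related features) are typically derived from a grey-level co-occurrence matrix (GLCM). The sketch below computes a GLCM and a few such features with plain NumPy on a synthetic image; the quantization, pixel offset and test image are illustrative choices and are not the study's analysis pipeline.

    ```python
    import numpy as np

    def glcm_features(image, levels=8, dx=1, dy=0):
        """Grey-level co-occurrence matrix and a few Haralick-style features
        (angular second moment, contrast, entropy) for one pixel offset."""
        img = np.digitize(image, np.linspace(image.min(), image.max(), levels + 1)[1:-1])
        glcm = np.zeros((levels, levels), dtype=float)
        rows, cols = img.shape
        for r in range(rows - dy):
            for c in range(cols - dx):
                glcm[img[r, c], img[r + dy, c + dx]] += 1
        glcm /= glcm.sum()
        i, j = np.indices(glcm.shape)
        asm = np.sum(glcm ** 2)
        contrast = np.sum(glcm * (i - j) ** 2)
        entropy = -np.sum(glcm[glcm > 0] * np.log2(glcm[glcm > 0]))
        return {"ASM": asm, "contrast": contrast, "entropy": entropy}

    # Synthetic stand-in for a CT slice: smooth gradient plus noise.
    rng = np.random.default_rng(1)
    slice_img = np.linspace(0, 1, 64)[None, :] * np.ones((64, 1)) + 0.05 * rng.normal(size=(64, 64))
    print(glcm_features(slice_img))
    ```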

  17. Extracting DNA from 'jaws': high yield and quality from archived tiger shark (Galeocerdo cuvier) skeletal material.

    PubMed

    Nielsen, E E; Morgan, J A T; Maher, S L; Edson, J; Gauthier, M; Pepperell, J; Holmes, B J; Bennett, M B; Ovenden, J R

    2017-05-01

    Archived specimens are highly valuable sources of DNA for retrospective genetic/genomic analysis. However, often limited effort has been made to evaluate and optimize extraction methods, which may be crucial for downstream applications. Here, we assessed and optimized the usefulness of abundant archived skeletal material from sharks as a source of DNA for temporal genomic studies. Six different methods for DNA extraction, encompassing two different commercial kits and three different protocols, were applied to material, so-called bio-swarf, from contemporary and archived jaws and vertebrae of tiger sharks (Galeocerdo cuvier). Protocols were compared for DNA yield and quality using a qPCR approach. For jaw swarf, all methods provided relatively high DNA yield and quality, while large differences in yield between protocols were observed for vertebrae. Similar results were obtained from samples of white shark (Carcharodon carcharias). Application of the optimized methods to 38 museum and private angler trophy specimens dating back to 1912 yielded sufficient DNA for downstream genomic analysis for 68% of the samples. No clear relationships between age of samples, DNA quality and quantity were observed, likely reflecting different preparation and storage methods for the trophies. Trial sequencing of DNA capture genomic libraries using 20 000 baits revealed that a significant proportion of captured sequences were derived from tiger sharks. This study demonstrates that archived shark jaws and vertebrae are potential high-yield sources of DNA for genomic-scale analysis. It also highlights that even for similar tissue types, a careful evaluation of extraction protocols can vastly improve DNA yield. © 2016 John Wiley & Sons Ltd.

  18. Optimal vitrification protocol for mouse ovarian tissue cryopreservation: effect of cryoprotective agents and in vitro culture on vitrified-warmed ovarian tissue survival.

    PubMed

    Youm, Hye Won; Lee, Jung Ryeol; Lee, Jaewang; Jee, Byung Chul; Suh, Chang Suk; Kim, Seok Hyun

    2014-04-01

    What is the optimal vitrification protocol according to the cryoprotective agent (CPA) for ovarian tissue (OT) cryopreservation? The two-step protocol with 7.5% ethylene glycol (EG) and 7.5% dimethyl sulfoxide (DMSO) for 10 min then 20% EG, 20% DMSO and 0.5 M sucrose for 5 min showed the best results in mouse OT vitrification. Establishing the optimal cryopreservation protocol is one of the most important steps to improve OT survival. However, only a few studies have compared vitrification protocols with different CPAs and investigated the effect of in vitro culture (IVC) on vitrified-warmed OT survival. Some recent papers proposed that a combination of CPAs has less toxicity than one type of CPA. However, the efficacy of different types and concentrations of CPA are not yet well documented. A total of 644 ovaries were collected from 4-week-old BDF1 mice, of which 571 ovaries were randomly assigned to 8 groups and vitrified using different protocols according to CPA composition and the remaining 73 ovaries were used as controls. After warming, each of the eight groups of ovaries was further randomly divided into four subgroups and in vitro cultured for 0, 0.5, 2 and 4 h, respectively. Ovaries of the best two groups among the eight groups were autotransplanted after IVC. The CPA solutions for the eight groups were composed of EDS, ES, ED, EPS, EF, EFS, E and EP, respectively (E, EG; D, DMSO; P, propanediol; S, sucrose; F, Ficoll). The IVC medium was composed of α-minimal essential medium, 10% fetal bovine serum and 10 mIU/ml follicle-stimulating hormone (FSH). Autotransplantation of vitrified-warmed OTs after IVC (0 to 4 h) using the EDS or ES protocol was performed, and the grafts were recovered after 3 weeks. Ovarian follicles were assessed for morphology, apoptosis, proliferation and FSH level. The percentages of the morphologically intact (G1) and apoptotic follicles in each group at 0, 0.5, 2 and 4 h of IVC were compared. For G1 follicles at 0 and 4 h of IVC, the EDS group showed the best results at 63.8 and 46.6%, respectively, whereas the EP group showed the worst results at 42.2 and 12.8%, respectively. The apoptotic follicle ratio was lowest in the EDS group at 0 h (8.1%) and 0.5 h (12.7%) of IVC. All of the eight groups showed significant decreases in G1 follicles and increases in apoptotic follicles as IVC duration progressed. After autotransplantation, the EDS 0 h group showed a significantly higher G1 percentage (84.9%) than did the other groups (42.4-58.8%), while only the ES 4 h group showed a significant decrease in the number of proliferative cells (80.6%, 87.6-92.9%). However, no significant differences in apoptotic rates and FSH levels were observed between the groups after autotransplantation. The limitation of this study was the absence of in vitro fertilization using oocytes obtained from OT grafts, which should be performed to confirm the outcomes of ovarian cryopreservation and transplantation. We compared eight vitrification protocols according to CPA composition and found the EDS protocol to be the optimal method among them. The data presented herein will help improve OT cryopreservation protocols for humans or other animals.

  19. Optimality of Gaussian attacks in continuous-variable quantum cryptography.

    PubMed

    Navascués, Miguel; Grosshans, Frédéric; Acín, Antonio

    2006-11-10

    We analyze the asymptotic security of the family of Gaussian modulated quantum key distribution protocols for continuous-variable systems. We prove that the Gaussian unitary attack is optimal for all the considered bounds on the key rate when the first and second moments of the canonical variables involved are known by the honest parties.

  20. Returning Individual Research Results: Development of a Cancer Genetics Education and Risk Communication Protocol

    PubMed Central

    Roberts, J. Scott; Shalowitz, David I.; Christensen, Kurt D.; Everett, Jessica N.; Kim, Scott Y. H.; Raskin, Leon; Gruber, Stephen B.

    2011-01-01

    The obligations of researchers to disclose clinically and/or personally significant individual research results are highly debated, but few empirical studies have addressed this topic. We describe the development of a protocol for returning research results to participants at one site of a multicenter study of the genetic epidemiology of melanoma. Protocol development involved numerous challenges: (1) deciding whether genotype results merited disclosure; (2) achieving an appropriate format for communicating results; (3) developing education materials; (4) deciding whether to retest samples for additional laboratory validation; (5) identifying and notifying selected participants; and (6) assessing the impact of disclosure. Our experience suggests potential obstacles depending on researcher resources and the design of the parent study, but offers a process by which researchers can responsibly return individual study results and evaluate the impact of disclosure. PMID:20831418

  1. A self-optimizing scheme for energy balanced routing in Wireless Sensor Networks using SensorAnt.

    PubMed

    Shamsan Saleh, Ahmed M; Ali, Borhanuddin Mohd; Rasid, Mohd Fadlee A; Ismail, Alyani

    2012-01-01

    Planning of energy-efficient protocols is critical for Wireless Sensor Networks (WSNs) because of the constraints on the sensor nodes' energy. The routing protocol should be able to provide uniform power dissipation during transmission to the sink node. In this paper, we present a self-optimization scheme for WSNs which is able to utilize and optimize the sensor nodes' resources, especially the batteries, to achieve balanced energy consumption across all sensor nodes. This method is based on the Ant Colony Optimization (ACO) metaheuristic, which is adopted to reinforce the paths with the best quality function. The assessment of this function depends on multi-criteria metrics such as the minimum residual battery power, hop count and average energy of both route and network. This method also distributes the traffic load of sensor nodes throughout the WSN, leading to reduced energy usage, extended network lifetime and reduced packet loss. Simulation results show that our scheme performs much better than the Energy Efficient Ant-Based Routing (EEABR) in terms of energy consumption, balancing and efficiency.
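
    A minimal sketch of a multi-criteria route quality function and a standard ACO pheromone update is given below. The weights, normalizations, variable names and data are invented for illustration and are not the SensorAnt formulas; they only show how residual battery, hop count and average energy might be combined into a single score that reinforces good paths.

    ```python
    def route_quality(path_energies, avg_network_energy, w=(0.5, 0.3, 0.2)):
        """Multi-criteria quality of a candidate route (illustrative weighting):
        higher minimum residual battery, fewer hops and higher average route
        energy relative to the network average all increase the score."""
        min_residual = min(path_energies)
        hop_count = len(path_energies)
        avg_route = sum(path_energies) / hop_count
        return (w[0] * min_residual
                + w[1] * (1.0 / hop_count)
                + w[2] * (avg_route / avg_network_energy))

    def update_pheromone(pheromone, path, quality, evaporation=0.1):
        """Standard ACO update: evaporate on every link, then reinforce the
        links of the chosen path in proportion to the route quality."""
        for link in pheromone:
            pheromone[link] *= (1.0 - evaporation)
        for link in zip(path, path[1:]):
            pheromone[link] = pheromone.get(link, 0.0) + quality
        return pheromone

    path = ['S', 'A', 'D', 'SINK']
    residual = [0.9, 0.7, 0.8]        # residual battery along the route (normalized)
    pher = {('S', 'A'): 1.0, ('A', 'D'): 1.0, ('D', 'SINK'): 1.0}
    q = route_quality(residual, avg_network_energy=0.75)
    print(q, update_pheromone(pher, path, q))
    ```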

  2. Towards a hybrid energy efficient multi-tree-based optimized routing protocol for wireless networks.

    PubMed

    Mitton, Nathalie; Razafindralambo, Tahiry; Simplot-Ryl, David; Stojmenovic, Ivan

    2012-12-13

    This paper considers the problem of designing power efficient routing with guaranteed delivery for sensor networks with unknown geographic locations. We propose HECTOR, a hybrid energy efficient tree-based optimized routing protocol, based on two sets of virtual coordinates. One set is based on rooted tree coordinates, and the other is based on hop distances toward several landmarks. In HECTOR, the node currently holding the packet forwards it to its neighbor that optimizes ratio of power cost over distance progress with landmark coordinates, among nodes that reduce landmark coordinates and do not increase distance in tree coordinates. If such a node does not exist, then forwarding is made to the neighbor that reduces tree-based distance only and optimizes power cost over tree distance progress ratio. We theoretically prove the packet delivery and propose an extension based on the use of multiple trees. Our simulations show the superiority of our algorithm over existing alternatives while guaranteeing delivery, and only up to 30% additional power compared to centralized shortest weighted path algorithm.
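
    The forwarding rule can be paraphrased in code as the hedged sketch below, which reduces the tree coordinate to a single scalar and compares landmark coordinates by a plain Euclidean norm; both are simplifications of HECTOR's actual virtual coordinates, the power-cost function is a placeholder, and the tree-based fallback that guarantees delivery is only hinted at.

    ```python
    import math

    def next_hop(current, neighbors, dest, power_cost):
        """Greedy forwarding sketch: among neighbours that reduce the landmark
        distance to the destination without increasing the tree distance, pick
        the one minimizing power cost per unit of landmark progress; otherwise
        fall back to tree-distance progress only."""
        def landmark_dist(n):               # hop-count coordinates as a vector
            return math.dist(n['landmark'], dest['landmark'])
        def tree_dist(n):                   # toy 1-D stand-in for tree coordinates
            return abs(n['tree'] - dest['tree'])

        d_lm, d_tree = landmark_dist(current), tree_dist(current)
        primary = [n for n in neighbors
                   if landmark_dist(n) < d_lm and tree_dist(n) <= d_tree]
        if primary:
            return min(primary, key=lambda n: power_cost(current, n) / (d_lm - landmark_dist(n)))
        fallback = [n for n in neighbors if tree_dist(n) < d_tree]
        if fallback:
            return min(fallback, key=lambda n: power_cost(current, n) / (d_tree - tree_dist(n)))
        return None   # HECTOR's delivery guarantee relies on the tree; omitted here

    current = {'landmark': (3, 4), 'tree': 5}
    dest = {'landmark': (0, 0), 'tree': 0}
    neighbors = [{'landmark': (2, 4), 'tree': 5}, {'landmark': (3, 3), 'tree': 6}]
    unit_cost = lambda a, b: 1.0            # unit power cost for the toy example
    print(next_hop(current, neighbors, dest, unit_cost))
    ```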

  3. Towards a Hybrid Energy Efficient Multi-Tree-Based Optimized Routing Protocol for Wireless Networks

    PubMed Central

    Mitton, Nathalie; Razafindralambo, Tahiry; Simplot-Ryl, David; Stojmenovic, Ivan

    2012-01-01

    This paper considers the problem of designing power efficient routing with guaranteed delivery for sensor networks with unknown geographic locations. We propose HECTOR, a hybrid energy efficient tree-based optimized routing protocol, based on two sets of virtual coordinates. One set is based on rooted tree coordinates, and the other is based on hop distances toward several landmarks. In HECTOR, the node currently holding the packet forwards it to its neighbor that optimizes ratio of power cost over distance progress with landmark coordinates, among nodes that reduce landmark coordinates and do not increase distance in tree coordinates. If such a node does not exist, then forwarding is made to the neighbor that reduces tree-based distance only and optimizes power cost over tree distance progress ratio. We theoretically prove the packet delivery and propose an extension based on the use of multiple trees. Our simulations show the superiority of our algorithm over existing alternatives while guaranteeing delivery, and only up to 30% additional power compared to centralized shortest weighted path algorithm. PMID:23443398

  4. Comparison of the Liaison® Calprotectin kit with a well established point of care test (Quantum Blue - Bühlmann-Alere®) in terms of analytical performances and ability to detect relapses amongst a Crohn population in follow-up.

    PubMed

    Delefortrie, Quentin; Schatt, Patricia; Grimmelprez, Alexandre; Gohy, Patrick; Deltour, Didier; Collard, Geneviève; Vankerkhoven, Patrick

    2016-02-01

    Although colonoscopy associated with histopathological sampling remains the gold standard in the diagnosis and follow-up of inflammatory bowel disease (IBD), calprotectin is becoming an essential biomarker in gastroenterology. The aim of this work is to compare a newly developed kit (Liaison® Calprotectin - Diasorin®) and its two distinct extraction protocols (weighing and extraction device protocol) with a well established point of care test (Quantum Blue® - Bühlmann-Alere®) in terms of analytical performance and ability to detect relapses amongst a Crohn's population in follow-up. Stool specimens were collected over a six-month period from control subjects and Crohn's patients. Amongst the Crohn's population, disease activity (active vs. quiescent) was evaluated by gastroenterologists. A significant difference was found between all three procedures in terms of calprotectin measurements (weighing protocol=30.3μg/g (median); stool extraction device protocol=36.9μg/g (median); Quantum Blue® (median)=63; Friedman test, P value=0.05). However, a good correlation was found between both extraction methods coupled with the Liaison® analyzer and between the Quantum Blue® (weighing protocol/extraction device protocol Rs=0.844, P=0.01; Quantum Blue®/extraction device protocol Rs=0.708, P=0.01; Quantum Blue®/weighing protocol, Rs=0.808, P=0.01). Finally, optimal cut-offs (and associated negative predictive values - NPV) for detecting relapses were in accordance with the above results (Quantum Blue® 183.5μg/g and NPV of 100%>extraction device protocol+Liaison® analyzer 124.5μg/g and NPV of 93.5%>weighing protocol+Liaison® analyzer 106.5μg/g and NPV of 95%). Although all three methods correlated well and had relatively good NPVs for detecting relapses amongst a Crohn's population in follow-up, the lack of an international standard explains the different optimal cut-offs among the three procedures. Copyright © 2015 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  5. New Protocol Based on UHPLC-MS/MS for Quantitation of Metabolites in Xylose-Fermenting Yeasts

    NASA Astrophysics Data System (ADS)

    Campos, Christiane Gonçalves; Veras, Henrique César Teixeira; de Aquino Ribeiro, José Antônio; Costa, Patrícia Pinto Kalil Gonçalves; Araújo, Katiúscia Pereira; Rodrigues, Clenilson Martins; de Almeida, João Ricardo Moreira; Abdelnur, Patrícia Verardi

    2017-12-01

    Xylose fermentation is a bottleneck in second-generation ethanol production. As such, a comprehensive understanding of xylose metabolism in naturally xylose-fermenting yeasts is essential for prospection and construction of recombinant yeast strains. The objective of the current study was to establish a reliable metabolomics protocol for quantification of key metabolites of xylose catabolism pathways in yeast, and to apply this protocol to Spathaspora arborariae. Ultra-high performance liquid chromatography coupled to tandem mass spectrometry (UHPLC-MS/MS) was used to quantify metabolites, and afterwards, sample preparation was optimized to examine yeast intracellular metabolites. S. arborariae was cultivated using xylose as a carbon source under aerobic and oxygen-limited conditions. Ion pair chromatography (IPC) and hydrophilic interaction liquid chromatography-tandem mass spectrometry (HILIC-MS/MS) were shown to efficiently quantify 14 and 5 metabolites, respectively, in a more rapid chromatographic protocol than previously described. Thirteen and eleven metabolites were quantified in S. arborariae under aerobic and oxygen-limited conditions, respectively. This targeted metabolomics protocol is shown here to quantify a total of 19 metabolites, including sugars, phosphates, coenzymes, monosaccharides, and alcohols, from xylose catabolism pathways (glycolysis, pentose phosphate pathway, and tricarboxylic acid cycle) in yeast. Furthermore, to our knowledge, this is the first time that intracellular metabolites have been quantified in S. arborariae after xylose consumption. The results indicated that fine control of oxygen levels during fermentation is necessary to optimize ethanol production by S. arborariae. The protocol presented here may be applied to other yeast species and could support yeast genetic engineering to improve second generation ethanol production.

  6. Developing an Optimum Protocol for Thermoluminescence Dosimetry with GR-200 Chips using Taguchi Method.

    PubMed

    Sadeghi, Maryam; Faghihi, Reza; Sina, Sedigheh

    2017-06-15

    Thermoluminescence dosimetry (TLD) is a powerful technique with wide applications in personal, environmental and clinical dosimetry. The choice of annealing, storage and reading protocols strongly affects the accuracy of the TLD response. The purpose of this study is to obtain an optimum protocol for GR-200 (LiF:Mg,Cu,P) by optimizing the effective parameters, to increase the reliability of the TLD response using the Taguchi method. The Taguchi method has been used in this study for optimization of the annealing, storage and reading protocols of the TLDs. A total of 108 GR-200 chips were divided into 27 groups, each containing four chips. The TLDs were exposed to three different doses, and stored, annealed and read out by different procedures as suggested by the Taguchi method. By comparing the signal-to-noise ratios, the optimum dosimetry procedure was obtained. According to the results, the optimum values for annealing temperature (°C), annealing time (s), annealing-to-exposure time (d), exposure-to-readout time (d), pre-heat temperature (°C), pre-heat time (s), heating rate (°C/s), maximum readout temperature (°C), readout time (s) and storage temperature (°C) are 240, 90, 1, 2, 50, 0, 15, 240, 13 and -20, respectively. Using the optimum protocol, an efficient glow curve with low residual signals can be achieved. Using the optimum protocol obtained by the Taguchi method, dosimetry can be performed with great accuracy. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
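
    The Taguchi analysis described here rests on signal-to-noise (S/N) ratios compared across factor levels. The Python sketch below illustrates the larger-the-better S/N calculation and the level-averaging step on a small synthetic data set; the readings, the single factor shown and its level assignment are hypothetical and much smaller than the study's 27-group design.

      import numpy as np

      # Illustrative Taguchi-style analysis: 9 hypothetical groups, 4 TLD chips each.
      # Rows: groups; columns: repeated readings (arbitrary units).
      readings = np.array([
          [98, 102, 101, 99], [90, 94, 92, 91], [110, 108, 112, 111],
          [97, 95, 96, 98],   [105, 107, 104, 106], [88, 90, 89, 87],
          [115, 113, 116, 114], [100, 99, 101, 100], [93, 92, 95, 94],
      ], dtype=float)

      # Larger-the-better signal-to-noise ratio: S/N = -10*log10(mean(1/y^2)).
      sn = -10 * np.log10(np.mean(1.0 / readings**2, axis=1))

      # Example three-level factor assigned to the 9 groups (a real L27
      # orthogonal array would assign several 3-level factors to 27 groups).
      annealing_temp_level = np.array([0, 1, 2, 0, 1, 2, 0, 1, 2])
      for level in range(3):
          mean_sn = sn[annealing_temp_level == level].mean()
          print(f"annealing-temperature level {level}: mean S/N = {mean_sn:.2f} dB")
      # The level with the highest mean S/N is taken as the optimum setting.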

  7. Automating individualized coaching and authentic role-play practice for brief intervention training.

    PubMed

    Hayes-Roth, B; Saker, R; Amano, K

    2010-01-01

    Brief intervention helps to reduce alcohol abuse, but there is a need for accessible, cost-effective training of clinicians. This study evaluated STAR Workshop , a web-based training system that automates efficacious techniques for individualized coaching and authentic role-play practice. We compared STAR Workshop to a web-based, self-guided e-book and a no-treatment control, for training the Engage for Change (E4C) brief intervention protocol. Subjects were medical and nursing students. Brief written skill probes tested subjects' performance of individual protocol steps, in different clinical scenarios, at three test times: pre-training, post-training, and post-delay (two weeks). Subjects also did live phone interviews with a standardized patient, post-delay. STAR subjects performed significantly better than both other groups. They showed significantly greater improvement from pre-training probes to post-training and post-delay probes. They scored significantly higher on post-delay phone interviews. STAR Workshop appears to be an accessible, cost-effective approach for training students to use the E4C protocol for brief intervention in alcohol abuse. It may also be useful for training other clinical interviewing protocols.

  8. Evaluation of parameters affecting switchgrass tissue culture: toward a consolidated procedure for Agrobacterium-mediated transformation of switchgrass (Panicum virgatum)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Chien-Yuan; Donohoe, Bryon S.; Ahuja, Neha

    Switchgrass (Panicum virgatum), a robust perennial C4-type grass, has been evaluated and designated as a model bioenergy crop by the U.S. DOE and USDA. Conventional breeding of switchgrass biomass is difficult because it displays self-incompatible hindrance. Therefore, direct genetic modifications of switchgrass have been considered the more effective approach to tailor switchgrass with traits of interest. Successful transformations have demonstrated increased biomass yields, reduction in the recalcitrance of cell walls and enhanced saccharification efficiency. Several tissue culture protocols have been previously described to produce transgenic switchgrass lines using different nutrient-based media, co-cultivation approaches, and antibiotic strengths for selection. After evaluating the published protocols, we consolidated these approaches and optimized the process to develop a more efficient protocol for producing transgenic switchgrass. First, seed sterilization was optimized, which led to a 20% increase in yield of induced calluses. Second, we have selected a N6 macronutrient/B5 micronutrient (NB)-based medium for callus induction from mature seeds of the Alamo cultivar, and chose a Murashige and Skoog-based medium to regenerate both Type I and Type II calluses. Third, Agrobacterium-mediated transformation was adopted that resulted in 50-100% positive regenerated transformants after three rounds (2 weeks/round) of selection with antibiotic. Genomic DNA PCR, RT-PCR, Southern blot, visualization of the red fluorescent protein and histochemical β-glucuronidase (GUS) staining were conducted to confirm the positive switchgrass transformants. The optimized methods developed here provide an improved strategy to promote the production and selection of callus and generation of transgenic switchgrass lines. The process for switchgrass transformation has been evaluated and consolidated to devise an improved approach for transgenic switchgrass production. With the optimization of seed sterilization, callus induction, and regeneration steps, a reliable and effective protocol is established to facilitate switchgrass engineering.

  9. Identification of the Optimal Protocol for Automated Office Blood Pressure Measurement Among Patients With Treated Hypertension.

    PubMed

    Moore, Myles N; Schultz, Martin G; Nelson, Mark R; Black, J Andrew; Dwyer, Nathan B; Hoban, Ella; Jose, Matthew D; Kosmala, Wojciech; Przewlocka-Kosmala, Monika; Zachwyc, Jowita; Otahal, Petr; Picone, Dean S; Roberts-Thomson, Philip; Veloudi, Panagiota; Sharman, James E

    2018-02-09

    Automated office blood pressure (AOBP) involving repeated, unobserved blood pressure (BP) readings during one clinic visit is recommended for in-office diagnosis and assessment of hypertension. However, the optimal AOBP protocol to determine BP control in the least amount of time with the fewest BP readings is yet to be determined and was the aim of this study. One hundred and eighty-nine patients (mean age 62.8 ± 12.1 years; 50.3% female) with treated hypertension referred to specialist clinics at 2 sites underwent AOBP in a quiet room alone. Eight BP measurements were taken starting immediately after sitting and then at 2-minute intervals (15 minutes total). The optimal AOBP protocol was defined by the smallest mean difference and highest intraclass correlation coefficient (ICC) compared with daytime ambulatory BP (ABP). The same BP device (Mobil-o-graph, IEM) was used for both AOBP and daytime ABP. Average 15-minute AOBP and daytime ABP were 134 ± 22/82 ± 13 and 137 ± 17/83 ± 11 mm Hg, respectively. The optimal AOBP protocol was derived within a total duration of 6 minutes from the average of 2 measures started after 2 and 4 minutes of seated rest (systolic BP: mean difference (95% confidence interval) 0.004(-2.21, 2.21) mm Hg, P = 1.0; ICC = 0.81; diastolic BP: mean difference 0.37(-0.90, 1.63) mm Hg, P = 0.57; ICC = 0.86). AOBP measures taken after 8 minutes tended to underestimate daytime ABP (whether as a single BP or the average of more than 1 BP reading). Only 2 AOBP readings taken over 6 minutes (excluding an initial reading immediately after sitting) may be needed to be comparable with daytime ABP. © American Journal of Hypertension, Ltd 2017. All rights reserved. For Permissions, please email: journals.permissions@oup.com
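
    The optimal-protocol criterion used in this record combines the mean difference against daytime ABP with an intraclass correlation coefficient. A minimal sketch of that comparison, assuming the common two-way random-effects single-measure ICC(2,1) formula and synthetic blood-pressure data, is given below.

      import numpy as np

      def icc_2_1(a, b):
          """Single-measure, two-way random-effects ICC(2,1) for two paired ratings."""
          data = np.column_stack([a, b]).astype(float)
          n, k = data.shape
          grand = data.mean()
          ss_rows = k * np.sum((data.mean(axis=1) - grand) ** 2)
          ss_cols = n * np.sum((data.mean(axis=0) - grand) ** 2)
          ss_total = np.sum((data - grand) ** 2)
          ms_rows = ss_rows / (n - 1)
          ms_cols = ss_cols / (k - 1)
          ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
          return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

      rng = np.random.default_rng(1)
      daytime_abp = rng.normal(137, 17, 189)                         # daytime ambulatory systolic BP
      readings = daytime_abp[:, None] + rng.normal(0, 8, (189, 8))   # 8 AOBP readings per patient

      # Candidate protocol from the study: average of the readings taken after
      # 2 and 4 minutes of seated rest (indices 1 and 2 if reading 0 is at 0 min).
      aobp = readings[:, 1:3].mean(axis=1)
      diff = aobp - daytime_abp
      print(f"mean difference = {diff.mean():.2f} mm Hg, ICC = {icc_2_1(aobp, daytime_abp):.2f}")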

  10. Evaluation of parameters affecting switchgrass tissue culture: toward a consolidated procedure for Agrobacterium-mediated transformation of switchgrass (Panicum virgatum)

    DOE PAGES

    Lin, Chien-Yuan; Donohoe, Bryon S.; Ahuja, Neha; ...

    2017-12-19

    Switchgrass (Panicum virgatum), a robust perennial C4-type grass, has been evaluated and designated as a model bioenergy crop by the U.S. DOE and USDA. Conventional breeding of switchgrass biomass is difficult because it displays self-incompatible hindrance. Therefore, direct genetic modifications of switchgrass have been considered the more effective approach to tailor switchgrass with traits of interest. Successful transformations have demonstrated increased biomass yields, reduction in the recalcitrance of cell walls and enhanced saccharification efficiency. Several tissue culture protocols have been previously described to produce transgenic switchgrass lines using different nutrient-based media, co-cultivation approaches, and antibiotic strengths for selection. After evaluating the published protocols, we consolidated these approaches and optimized the process to develop a more efficient protocol for producing transgenic switchgrass. First, seed sterilization was optimized, which led to a 20% increase in yield of induced calluses. Second, we have selected a N6 macronutrient/B5 micronutrient (NB)-based medium for callus induction from mature seeds of the Alamo cultivar, and chose a Murashige and Skoog-based medium to regenerate both Type I and Type II calluses. Third, Agrobacterium-mediated transformation was adopted that resulted in 50-100% positive regenerated transformants after three rounds (2 weeks/round) of selection with antibiotic. Genomic DNA PCR, RT-PCR, Southern blot, visualization of the red fluorescent protein and histochemical β-glucuronidase (GUS) staining were conducted to confirm the positive switchgrass transformants. The optimized methods developed here provide an improved strategy to promote the production and selection of callus and generation of transgenic switchgrass lines. The process for switchgrass transformation has been evaluated and consolidated to devise an improved approach for transgenic switchgrass production. With the optimization of seed sterilization, callus induction, and regeneration steps, a reliable and effective protocol is established to facilitate switchgrass engineering.

  11. Components of an Anticancer Diet: Dietary Recommendations, Restrictions and Supplements of the Bill Henderson Protocol

    PubMed Central

    Mannion, Cynthia; Page, Stacey; Bell, Laurie Heilman; Verhoef, Marja

    2010-01-01

    The use of complementary and alternative medicines including dietary supplements, herbals and special diets to prevent or treat disease continues to be popular. The following paper provides a description of an alternative dietary approach to the self-management and treatment of cancer, the Bill Henderson Protocol (BHP). This diet encourages daily intake of raw foods, a combination of cottage cheese and flaxseed oil and a number of supplements. Some foods and food groups are restricted (e.g., gluten, meat, dairy). Early background theory that contributed to the protocol’s development is presented as is a summary of relevant evidence concerning the anti-cancer fighting properties of the individual components. Supplement intake is considered in relation to daily recommended intakes. Challenges and risks to protocol adherence are discussed. As with many complementary and alternative interventions, clear evidence of this dietary protocol’s safety and efficacy is lacking. Consumers of this protocol may require guidance on the ability of this protocol to meet their individual nutritional needs. PMID:22254073

  12. Diverse Protocols for Correlative Super-Resolution Fluorescence Imaging and Electron Microscopy of Cells and Tissue

    DTIC Science & Technology

    2016-05-25

    tissue is critical to biology. Many factors determine optimal experimental design, including attainable localization precision, ultrastructural...both imaging modalities. Examples include: weak tissue preservation protocols resulting in poor ultrastructure, e.g. mitochondrial cristae membranes...tension effects during sample drying that may result in artifacts. Samples dried in the presence of polyvinyl alcohol do not have the haziness

  13. Advertisement-Based Energy Efficient Medium Access Protocols for Wireless Sensor Networks

    NASA Astrophysics Data System (ADS)

    Ray, Surjya Sarathi

    One of the main challenges that prevents the large-scale deployment of Wireless Sensor Networks (WSNs) is providing the applications with the required quality of service (QoS) given the sensor nodes' limited energy supplies. WSNs are an important tool in supporting applications ranging from environmental and industrial monitoring, to battlefield surveillance and traffic control, among others. Most of these applications require sensors to function for long periods of time without human intervention and without battery replacement. Therefore, energy conservation is one of the main goals for protocols for WSNs. Energy conservation can be performed in different layers of the protocol stack. In particular, as the medium access control (MAC) layer can access and control the radio directly, large energy savings is possible through intelligent MAC protocol design. To maximize the network lifetime, MAC protocols for WSNs aim to minimize idle listening of the sensor nodes, packet collisions, and overhearing. Several approaches such as duty cycling and low power listening have been proposed at the MAC layer to achieve energy efficiency. In this thesis, I explore the possibility of further energy savings through the advertisement of data packets in the MAC layer. In the first part of my research, I propose Advertisement-MAC or ADV-MAC, a new MAC protocol for WSNs that utilizes the concept of advertising for data contention. This technique lets nodes listen dynamically to any desired transmission and sleep during transmissions not of interest. This minimizes the energy lost in idle listening and overhearing while maintaining an adaptive duty cycle to handle variable loads. Additionally, ADV-MAC enables energy efficient MAC-level multicasting. An analytical model for the packet delivery ratio and the energy consumption of the protocol is also proposed. The analytical model is verified with simulations and is used to choose an optimal value of the advertisement period. Simulations show that the optimized ADV-MAC provides substantial energy gains (50% to 70% less than other MAC protocols for WSNs such as T-MAC and S-MAC for the scenarios investigated) while faring as well as T-MAC in terms of packet delivery ratio and latency. Although ADV-MAC provides substantial energy gains over S-MAC and T-MAC, it is not optimal in terms of energy savings because contention is done twice -- once in the Advertisement Period and once in the Data Period. In the next part of my research, the second contention in the Data Period is eliminated and the advantages of contention-based and TDMA-based protocols are combined to form Advertisement based Time-division Multiple Access (ATMA), a distributed TDMA-based MAC protocol for WSNs. ATMA utilizes the bursty nature of the traffic to prevent energy waste through advertisements and reservations for data slots. Extensive simulations and qualitative analysis show that with bursty traffic, ATMA outperforms contention-based protocols (S-MAC, T-MAC and ADV-MAC), a TDMA based protocol (TRAMA) and hybrid protocols (Z-MAC and IEEE 802.15.4). ATMA provides energy reductions of up to 80%, while providing the best packet delivery ratio (close to 100%) and latency among all the investigated protocols. Simulations alone cannot reflect many of the challenges faced by real implementations of MAC protocols, such as clock-drift, synchronization, imperfect physical layers, and irregular interference from other transmissions. 
Such issues may cripple a protocol that otherwise performs very well in software simulations. Hence, to validate my research, I conclude with a hardware implementation of the ATMA protocol on SORA (Software Radio), developed by Microsoft Research Asia. SORA is a reprogrammable Software Defined Radio (SDR) platform that satisfies the throughput and timing requirements of modern wireless protocols while utilizing the rich general purpose PC development environment. Experimental results obtained from the hardware implementation of ATMA closely mirror the simulation results obtained for a single hop network with 4 nodes.
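
    The energy argument behind advertisement-based MAC protocols such as those proposed here can be illustrated with a back-of-envelope duty-cycle model: nodes listen only during a short advertisement window (plus any data exchanges of interest) and sleep otherwise. The sketch below uses hypothetical radio power figures, not values from the thesis.

      # Simplified radio energy model (hypothetical parameters, not from the thesis).
      P_RX_MW = 60.0        # radio power while listening/receiving, mW
      P_SLEEP_MW = 0.03     # radio power while sleeping, mW
      FRAME_S = 1.0         # length of one MAC frame, s
      ADV_S = 0.05          # advertisement window a node must listen to each frame, s
      DATA_S = 0.20         # time spent awake handling a data exchange, s
      data_fraction = 0.10  # fraction of frames in which this node sends/receives data

      def energy_per_frame(listen_s):
          """Energy (mJ) for one frame given total awake/listening time within it."""
          sleep_s = FRAME_S - listen_s
          return P_RX_MW * listen_s + P_SLEEP_MW * sleep_s

      always_on = energy_per_frame(FRAME_S)
      adv_based = energy_per_frame(ADV_S + data_fraction * DATA_S)
      print(f"always-on listening : {always_on:.2f} mJ/frame")
      print(f"advertisement-based : {adv_based:.2f} mJ/frame "
            f"({100 * (1 - adv_based / always_on):.0f}% saving)")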

  14. Community-based exercise programs as a strategy to optimize function in chronic disease: a systematic review.

    PubMed

    Desveaux, Laura; Beauchamp, Marla; Goldstein, Roger; Brooks, Dina

    2014-03-01

    Chronic diseases are the leading cause of death and disability worldwide. Preliminary evidence suggests that community-based exercise (CBE) improves functional capacity (FC) and health-related quality of life (HRQL). To describe the structure and delivery of CBE programs for chronic disease populations and compare their impact on FC and HRQL to standard care. Randomized trials examining CBE programs for individuals with stroke, chronic obstructive pulmonary disease, osteoarthritis, diabetes, and cardiovascular disease were identified. Quality was assessed using the Cochrane risk of bias tool. Meta-analyses were conducted using Review Manager 5.1. The protocol was registered on PROSPERO (CRD42012002786). Sixteen studies (2198 individuals, mean age 66.8±4.9 y) were included to describe program structures, which were comparable in their design and components, irrespective of the chronic disease. Aerobic exercise and resistance training were the primary interventions in 85% of studies. Nine studies were included in the meta-analysis. The weighted mean difference for FC, evaluated using the 6-minute walk test, was 41.7 m (95% confidence interval [CI], 20.5-62.8). The standardized mean difference for all FC measures was 0.18 (95% CI, 0.05-0.3). The standardized mean difference for the physical component of HRQL measures was 0.21 (95% CI, 0.05-0.4) and 0.38 (95% CI, 0.04-0.7) for the total score. CBE programs across chronic disease populations have similar structures. These programs appear superior to standard care with respect to optimizing FC and HRQL in individuals with osteoarthritis; however, the effect beyond this population is unknown. Long-term sustainability of these programs remains to be established.
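
    The pooled estimates quoted in this review (e.g. a weighted mean difference for the 6-minute walk test with a 95% CI) come from inverse-variance meta-analysis. A minimal fixed-effect version of that calculation, on synthetic per-study values rather than the review's data, looks like this:

      import numpy as np

      # Synthetic per-study data: (mean difference in 6MWT metres, standard error).
      studies = [(35.0, 14.0), (50.0, 20.0), (28.0, 18.0), (60.0, 25.0), (40.0, 12.0)]

      md = np.array([m for m, _ in studies])
      se = np.array([s for _, s in studies])
      w = 1.0 / se**2                         # inverse-variance weights
      pooled = np.sum(w * md) / np.sum(w)     # fixed-effect pooled mean difference
      pooled_se = np.sqrt(1.0 / np.sum(w))
      ci_low, ci_high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
      print(f"pooled WMD = {pooled:.1f} m (95% CI {ci_low:.1f} to {ci_high:.1f})")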

  15. A distance limited method for sampling downed coarse woody debris

    Treesearch

    Jeffrey H. Gove; Mark J. Ducey; Harry T. Valentine; Michael S. Williams

    2012-01-01

    A new sampling method for down coarse woody debris is proposed based on limiting the perpendicular distance from individual pieces to a randomly chosen sample point. Two approaches are presented that allow different protocols to be used to determine field measurements; estimators for each protocol are also developed. Both protocols are compared via simulation against...

  16. Mathematical model formulation and validation of water and solute transport in whole hamster pancreatic islets.

    PubMed

    Benson, James D; Benson, Charles T; Critser, John K

    2014-08-01

    Optimization of cryopreservation protocols for cells and tissues requires accurate models of heat and mass transport. Model selection often depends on the configuration of the tissue. Here, a mathematical and conceptual model of water and solute transport for whole hamster pancreatic islets has been developed and experimentally validated incorporating fundamental biophysical data from previous studies on individual hamster islet cells while retaining whole-islet structural information. It describes coupled transport of water and solutes through the islet by three methods: intracellularly, intercellularly, and in combination. In particular we use domain decomposition techniques to couple a transmembrane flux model with an interstitial mass transfer model. The only significant undetermined variable is the cellular surface area which is in contact with the intercellularly transported solutes, Ais. The model was validated and Ais determined using a 3×3 factorial experimental design blocked for experimental day. Whole islet physical experiments were compared with model predictions at three temperatures, three perfusing solutions, and three islet size groups. A mean of 4.4 islets were compared at each of the 27 experimental conditions and found to correlate with a coefficient of determination of 0.87±0.06 (mean ± SD). Only the treatment variable of perfusing solution was found to be significant (p<0.05). We have devised a model that retains much of the intrinsic geometric configuration of the system, and thus fewer laboratory experiments are needed to determine model parameters and thus to develop new optimized cryopreservation protocols. Additionally, extensions to ovarian follicles and other concentric tissue structures may be made. Copyright © 2014 Elsevier Inc. All rights reserved.
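
    The whole-islet model builds on the standard single-cell two-parameter formalism for coupled water and permeating-solute transport. The sketch below integrates that basic formalism (not the authors' domain-decomposition islet model) for one cell exposed to a hypertonic permeating solute; all parameter values are hypothetical and in arbitrary consistent units.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Hypothetical parameters in self-consistent (illustrative) units.
      Lp = 0.2      # hydraulic conductivity
      Ps = 0.05     # permeating-solute permeability
      A = 1.0       # membrane area
      RT = 1.0      # gas constant x temperature (folded into Lp's units here)
      Me_n = 0.3    # external non-permeating osmolality
      Me_s = 1.0    # external permeating-solute (e.g. CPA) osmolality
      Ni_n = 0.3    # intracellular non-permeating osmoles (fixed)

      def two_parameter(t, y):
          Vw, Ns = y                        # cell water volume, intracellular solute moles
          osm_in = (Ni_n + Ns) / Vw
          dVw = -Lp * A * RT * ((Me_n + Me_s) - osm_in)  # water leaves if outside is hypertonic
          dNs = Ps * A * (Me_s - Ns / Vw)                # solute enters down its gradient
          return [dVw, dNs]

      sol = solve_ivp(two_parameter, (0.0, 60.0), [1.0, 0.0], dense_output=True)
      t = np.linspace(0, 60, 7)
      for ti, (Vw, Ns) in zip(t, sol.sol(t).T):
          print(f"t={ti:5.1f}  water volume={Vw:.3f}  intracellular solute={Ns:.3f}")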

  17. Clinically Effective Treatment of Fibromyalgia Pain With High-Definition Transcranial Direct Current Stimulation: Phase II Open-Label Dose Optimization.

    PubMed

    Castillo-Saavedra, Laura; Gebodh, Nigel; Bikson, Marom; Diaz-Cruz, Camilo; Brandao, Rivail; Coutinho, Livia; Truong, Dennis; Datta, Abhishek; Shani-Hershkovich, Revital; Weiss, Michal; Laufer, Ilan; Reches, Amit; Peremen, Ziv; Geva, Amir; Parra, Lucas C; Fregni, Felipe

    2016-01-01

    Despite promising preliminary results in treating fibromyalgia (FM) pain, no neuromodulation technique has been adopted in clinical practice because of limited efficacy, low response rate, or poor tolerability. This phase II open-label trial aims to define a methodology for a clinically effective treatment of pain in FM by establishing treatment protocols and screening procedures to maximize efficacy and response rate. High-definition transcranial direct current stimulation (HD-tDCS) provides targeted subthreshold brain stimulation, combining tolerability with specificity. We aimed to establish the number of HD-tDCS sessions required to achieve a 50% FM pain reduction, and to characterize the biometrics of the response, including brain network activation pain scores of contact heat-evoked potentials. We report a clinically significant benefit of a 50% pain reduction in half (n = 7) of the patients (N = 14), with responders and nonresponders alike benefiting from a cumulative effect of treatment, reflected in significant pain reduction (P = .035) as well as improved quality of life (P = .001) over time. We also report an aggregate 6-week response rate of 50% of patients and estimate 15 as the median number of HD-tDCS sessions to reach clinically meaningful outcomes. The methodology for a pivotal FM neuromodulation clinical trial with individualized treatment is thus supported. Registered in Clinicaltrials.gov under registry number NCT01842009. In this article, an optimized protocol for the treatment of fibromyalgia pain with targeted subthreshold brain stimulation using high-definition transcranial direct current stimulation is outlined. Copyright © 2016 American Pain Society. Published by Elsevier Inc. All rights reserved.

  18. Standardization and optimization of fluorescence in situ hybridization (FISH) for HER-2 assessment in breast cancer: A single center experience.

    PubMed

    Bogdanovska-Todorovska, Magdalena; Petrushevska, Gordana; Janevska, Vesna; Spasevska, Liljana; Kostadinova-Kunovska, Slavica

    2018-05-20

    Accurate assessment of human epidermal growth factor receptor 2 (HER-2) is crucial in selecting patients for targeted therapy. Commonly used methods for HER-2 testing are immunohistochemistry (IHC) and fluorescence in situ hybridization (FISH). Here we presented the implementation, optimization and standardization of two FISH protocols using breast cancer samples and assessed the impact of pre-analytical and analytical factors on HER-2 testing. Formalin fixed paraffin embedded (FFPE) tissue samples from 70 breast cancer patients were tested for HER-2 using PathVysion™ HER-2 DNA Probe Kit and two different paraffin pretreatment kits, Vysis/Abbott Paraffin Pretreatment Reagent Kit (40 samples) and DAKO Histology FISH Accessory Kit (30 samples). The concordance between FISH and IHC results was determined. Pre-analytical and analytical factors (i.e., fixation, baking, digestion, and post-hybridization washing) affected the efficiency and quality of hybridization. The overall hybridization success in our study was 98.6% (69/70); the failure rate was 1.4%. The DAKO pretreatment kit was more time-efficient and resulted in more uniform signals that were easier to interpret, compared to the Vysis/Abbott kit. The overall concordance between IHC and FISH was 84.06%, kappa coefficient 0.5976 (p < 0.0001). The greatest discordance (82%) between IHC and FISH was observed in IHC 2+ group. A standardized FISH protocol for HER-2 assessment, with high hybridization efficiency, is necessary due to variability in tissue processing and individual tissue characteristics. Differences in the pre-analytical and analytical steps can affect the hybridization quality and efficiency. The use of DAKO pretreatment kit is time-saving and cost-effective.
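
    Concordance between IHC and FISH of the kind reported here is typically summarized with overall agreement and Cohen's kappa. A minimal sketch of that calculation on a synthetic 2x2 cross-tabulation (not the study's counts):

      import numpy as np

      # Synthetic 2x2 cross-tabulation: rows = IHC call (neg, pos), cols = FISH call.
      table = np.array([[40, 3],
                        [8, 19]], dtype=float)

      n = table.sum()
      po = np.trace(table) / n                                      # observed agreement
      pe = (table.sum(axis=1) * table.sum(axis=0)).sum() / n**2     # chance agreement
      kappa = (po - pe) / (1 - pe)
      print(f"concordance = {100 * po:.1f}%, Cohen's kappa = {kappa:.3f}")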

  19. Rapid analysis and exploration of fluorescence microscopy images.

    PubMed

    Pavie, Benjamin; Rajaram, Satwik; Ouyang, Austin; Altschuler, Jason M; Steininger, Robert J; Wu, Lani F; Altschuler, Steven J

    2014-03-19

    Despite rapid advances in high-throughput microscopy, quantitative image-based assays still pose significant challenges. While a variety of specialized image analysis tools are available, most traditional image-analysis-based workflows have steep learning curves (for fine tuning of analysis parameters) and result in long turnaround times between imaging and analysis. In particular, cell segmentation, the process of identifying individual cells in an image, is a major bottleneck in this regard. Here we present an alternate, cell-segmentation-free workflow based on PhenoRipper, an open-source software platform designed for the rapid analysis and exploration of microscopy images. The pipeline presented here is optimized for immunofluorescence microscopy images of cell cultures and requires minimal user intervention. Within half an hour, PhenoRipper can analyze data from a typical 96-well experiment and generate image profiles. Users can then visually explore their data, perform quality control on their experiment, ensure response to perturbations and check reproducibility of replicates. This facilitates a rapid feedback cycle between analysis and experiment, which is crucial during assay optimization. This protocol is useful not just as a first pass analysis for quality control, but also may be used as an end-to-end solution, especially for screening. The workflow described here scales to large data sets such as those generated by high-throughput screens, and has been shown to group experimental conditions by phenotype accurately over a wide range of biological systems. The PhenoBrowser interface provides an intuitive framework to explore the phenotypic space and relate image properties to biological annotations. Taken together, the protocol described here will lower the barriers to adopting quantitative analysis of image based screens.
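
    The segmentation-free idea behind this workflow, profiling an image by the appearance of small blocks rather than by individual cells, can be illustrated with a much-simplified sketch; this is only a conceptual illustration, not PhenoRipper's actual algorithm.

      import numpy as np

      def block_profile(image, block=16, n_bins=8):
          """Summarize an image by a histogram over its blocks' mean intensities,
          skipping near-empty (background) blocks. Segmentation-free by design."""
          h = (image.shape[0] // block) * block
          w = (image.shape[1] // block) * block
          blocks = image[:h, :w].reshape(h // block, block, w // block, block)
          means = blocks.mean(axis=(1, 3)).ravel()
          means = means[means > 0.05]                  # drop background blocks
          hist, _ = np.histogram(means, bins=n_bins, range=(0.0, 1.0))
          return hist / max(hist.sum(), 1)             # normalized image profile

      rng = np.random.default_rng(2)
      img_a = rng.beta(2, 8, (512, 512))               # synthetic "dim" condition
      img_b = rng.beta(4, 4, (512, 512))               # synthetic "bright" condition
      print("profile A:", np.round(block_profile(img_a), 3))
      print("profile B:", np.round(block_profile(img_b), 3))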

  20. Patient-specific radiation dose and cancer risk estimation in pediatric chest CT: a study in 30 patients

    NASA Astrophysics Data System (ADS)

    Li, Xiang; Samei, Ehsan; Segars, W. Paul; Sturgeon, Gregory M.; Colsher, James G.; Frush, Donald P.

    2010-04-01

    Radiation-dose awareness and optimization in CT can greatly benefit from a dose-reporting system that provides radiation dose and cancer risk estimates specific to each patient and each CT examination. Recently, we reported a method for estimating patient-specific dose from pediatric chest CT. The purpose of this study is to extend that effort to patient-specific risk estimation and to a population of pediatric CT patients. Our study included thirty pediatric CT patients (16 males and 14 females; 0-16 years old), for whom full-body computer models were recently created based on the patients' clinical CT data. Using a validated Monte Carlo program, organ dose received by the thirty patients from a chest scan protocol (LightSpeed VCT, 120 kVp, 1.375 pitch, 40-mm collimation, pediatric body scan field-of-view) was simulated and used to estimate patient-specific effective dose. Risks of cancer incidence were calculated for radiosensitive organs using gender-, age-, and tissue-specific risk coefficients and were used to derive patient-specific effective risk. The thirty patients had normalized effective dose of 3.7-10.4 mSv/100 mAs and normalized effective risk of 0.5-5.8 cases/1000 exposed persons/100 mAs. Normalized lung dose and risk of lung cancer correlated strongly with average chest diameter (correlation coefficient: r = -0.98 to -0.99). Normalized effective risk also correlated strongly with average chest diameter (r = -0.97 to -0.98). These strong correlations can be used to estimate patient-specific dose and risk prior to or after an imaging study to potentially guide healthcare providers in justifying CT examinations and to guide individualized protocol design and optimization.
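
    Strong correlations between normalized dose and body size of the kind reported here are often exploited through a log-linear (exponential) fit, so that a patient's chest diameter yields a dose estimate before the scan. The sketch below does this on synthetic data; the fitted coefficients and the example patient are purely illustrative.

      import numpy as np

      rng = np.random.default_rng(3)
      # Synthetic data: average chest diameter (cm) vs normalized effective dose (mSv/100 mAs).
      diameter = rng.uniform(12, 26, 30)
      dose = 20.0 * np.exp(-0.09 * diameter) * rng.normal(1.0, 0.05, 30)

      # Log-linear least-squares fit: dose ~= a * exp(b * diameter).
      b, log_a = np.polyfit(diameter, np.log(dose), 1)
      a = np.exp(log_a)
      print(f"fit: dose = {a:.1f} * exp({b:.3f} * diameter)")

      # Predict a patient-specific normalized dose before the scan (hypothetical patient).
      d_patient = 18.0
      print(f"predicted dose at {d_patient} cm: {a * np.exp(b * d_patient):.1f} mSv/100 mAs")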

  1. The optimal timing of stimulation to induce long-lasting positive effects on episodic memory in physiological aging.

    PubMed

    Manenti, Rosa; Sandrini, Marco; Brambilla, Michela; Cotelli, Maria

    2016-09-15

    Episodic memory displays the largest degree of age-related decline. A noninvasive brain stimulation technique that can be used to modulate memory in physiological aging is transcranial Direct Current Stimulation (tDCS). However, an aspect that has not been adequately investigated in previous studies is the optimal timing of stimulation to induce long-lasting positive effects on episodic memory function. Our previous studies showed episodic memory enhancement in older adults when anodal tDCS was applied over the left lateral prefrontal cortex during encoding or after memory consolidation with or without a contextual reminder. Here we directly compared the two studies to explore which of the tDCS protocols would induce longer-lasting positive effects on episodic memory function in older adults. In addition, we aimed to determine whether subjective memory complaints would be related to the changes in memory performance (forgetting) induced by tDCS, a relevant issue in aging research since individuals with subjective memory complaints seem to be at higher risk of later memory decline. The results showed that anodal tDCS applied after consolidation with a contextual reminder induced longer-lasting positive effects on episodic memory, conceivably through reconsolidation, than anodal tDCS during encoding. Furthermore, we reported, providing new data, a moderate negative correlation between subjective memory complaints and forgetting when anodal tDCS was applied after consolidation with a contextual reminder. This study sheds light on the best-suited timing of stimulation to induce long-lasting positive effects on memory function and might help the clinicians to select the most effective tDCS protocol to prevent memory decline. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Optimization of scat detection methods for a social ungulate, the wild pig, and experimental evaluation of factors affecting detection of scat

    USGS Publications Warehouse

    Keiter, David A.; Cunningham, Fred L.; Rhodes, Olin E.; Irwin, Brian J.; Beasley, James

    2016-01-01

    Collection of scat samples is common in wildlife research, particularly for genetic capture-mark-recapture applications. Due to high degradation rates of genetic material in scat, large numbers of samples must be collected to generate robust estimates. Optimization of sampling approaches to account for taxa-specific patterns of scat deposition is, therefore, necessary to ensure sufficient sample collection. While scat collection methods have been widely studied in carnivores, research to maximize scat collection and noninvasive sampling efficiency for social ungulates is lacking. Further, environmental factors or scat morphology may influence detection of scat by observers. We contrasted performance of novel radial search protocols with existing adaptive cluster sampling protocols to quantify differences in observed amounts of wild pig (Sus scrofa) scat. We also evaluated the effects of environmental (percentage of vegetative ground cover and occurrence of rain immediately prior to sampling) and scat characteristics (fecal pellet size and number) on the detectability of scat by observers. We found that 15- and 20-m radial search protocols resulted in greater numbers of scats encountered than the previously used adaptive cluster sampling approach across habitat types, and that fecal pellet size, number of fecal pellets, percent vegetative ground cover, and recent rain events were significant predictors of scat detection. Our results suggest that use of a fixed-width radial search protocol may increase the number of scats detected for wild pigs, or other social ungulates, allowing more robust estimation of population metrics using noninvasive genetic sampling methods. Further, as fecal pellet size affected scat detection, juvenile or smaller-sized animals may be less detectable than adult or large animals, which could introduce bias into abundance estimates. Knowledge of relationships between environmental variables and scat detection may allow researchers to optimize sampling protocols to maximize utility of noninvasive sampling for wild pigs and other social ungulates.

  3. Optimization of Scat Detection Methods for a Social Ungulate, the Wild Pig, and Experimental Evaluation of Factors Affecting Detection of Scat.

    PubMed

    Keiter, David A; Cunningham, Fred L; Rhodes, Olin E; Irwin, Brian J; Beasley, James C

    2016-01-01

    Collection of scat samples is common in wildlife research, particularly for genetic capture-mark-recapture applications. Due to high degradation rates of genetic material in scat, large numbers of samples must be collected to generate robust estimates. Optimization of sampling approaches to account for taxa-specific patterns of scat deposition is, therefore, necessary to ensure sufficient sample collection. While scat collection methods have been widely studied in carnivores, research to maximize scat collection and noninvasive sampling efficiency for social ungulates is lacking. Further, environmental factors or scat morphology may influence detection of scat by observers. We contrasted performance of novel radial search protocols with existing adaptive cluster sampling protocols to quantify differences in observed amounts of wild pig (Sus scrofa) scat. We also evaluated the effects of environmental (percentage of vegetative ground cover and occurrence of rain immediately prior to sampling) and scat characteristics (fecal pellet size and number) on the detectability of scat by observers. We found that 15- and 20-m radial search protocols resulted in greater numbers of scats encountered than the previously used adaptive cluster sampling approach across habitat types, and that fecal pellet size, number of fecal pellets, percent vegetative ground cover, and recent rain events were significant predictors of scat detection. Our results suggest that use of a fixed-width radial search protocol may increase the number of scats detected for wild pigs, or other social ungulates, allowing more robust estimation of population metrics using noninvasive genetic sampling methods. Further, as fecal pellet size affected scat detection, juvenile or smaller-sized animals may be less detectable than adult or large animals, which could introduce bias into abundance estimates. Knowledge of relationships between environmental variables and scat detection may allow researchers to optimize sampling protocols to maximize utility of noninvasive sampling for wild pigs and other social ungulates.

  4. Effect of age, diet, and tissue type on PCr response to creatine supplementation.

    PubMed

    Solis, Marina Yazigi; Artioli, Guilherme Giannini; Otaduy, Maria Concepción García; Leite, Claudia da Costa; Arruda, Walquiria; Veiga, Raquel Ramos; Gualano, Bruno

    2017-08-01

    Creatine/phosphorylcreatine (PCr) responses to creatine supplementation may be modulated by age, diet, and tissue, but studies assessing this possibility are lacking. Therefore we aimed to determine whether PCr responses vary as a function of age, diet, and tissue. Fifteen children, 17 omnivorous and 14 vegetarian adults, and 18 elderly individuals ("elderly") participated in this study. Participants were given placebo and subsequently creatine (0.3 g·kg⁻¹·day⁻¹) for 7 days in a single-blind fashion. PCr was measured through phosphorus magnetic resonance spectroscopy (³¹P-MRS) in muscle and brain. Creatine supplementation increased muscle PCr in children (P < 0.0003) and elderly (P < 0.001), whereas the increase in omnivores did not reach statistically significant difference (P = 0.3348). Elderly had greater PCr increases than children and omnivores (P < 0.0001 for both), whereas children experienced greater PCr increases than omnivores (P = 0.0022). In relation to diet, vegetarians (P < 0.0001), but not omnivores, had significant increases in muscle PCr content. Brain PCr content was not affected by creatine supplementation in any group, and delta changes in brain PCr (-0.7 to +3.9%) were inferior to those in muscle PCr content (+10.3 to +27.6%; P < 0.0001 for all comparisons). PCr responses to a standardized creatine protocol (0.3 g·kg⁻¹·day⁻¹ for 7 days) may be affected by age, diet, and tissue. Whereas creatine supplementation was able to increase muscle PCr in all groups, although to different extents, brain PCr was shown to be unresponsive overall. These findings demonstrate the need to tailor creatine protocols to optimize creatine/PCr accumulation both in muscle and in brain, enabling a better appreciation of the pleiotropic properties of creatine. NEW & NOTEWORTHY A standardized creatine supplementation protocol (0.3 g·kg⁻¹·day⁻¹ for 7 days) effectively increased muscle, but not brain, phosphorylcreatine. Older participants responded better than younger participants whereas vegetarians responded better than omnivores. Responses to supplementation are thus dependent on age, tissue, and diet. This suggests that a single "universal" protocol, originally designed for increasing muscle creatine in young individuals, may lead to heterogeneous muscle responses in different populations or even no responses in tissues other than skeletal muscle. Copyright © 2017 the American Physiological Society.

  5. Pharmaceutical care for patients with COPD in Belgium and views on protocol implementation.

    PubMed

    Tommelein, Eline; Tollenaere, Kathleen; Mehuys, Els; Boussery, Koen

    2014-08-01

    A protocol-based pharmaceutical care program (the PHARMACOP-protocol) focusing on patient counselling during prescription filling has been shown to be effective in patients with chronic obstructive pulmonary disease (COPD). However, implementation of this protocol in daily practice has not yet been studied. To describe current implementation level of the items included in the PHARMACOP-protocol in Belgian community pharmacies and to evaluate pharmacists' perspectives on the implementation of this protocol in daily practice. A cross-sectional study was conducted from April to June 2012, in randomly selected community pharmacies in Flanders. Pharmacists were questioned using structured interviews. 125 pharmacies were contacted and 80 managing pharmacists (64 %) participated. In >70 % of pharmacies, 4/7 protocol items for first prescriptions and 3/5 protocol items for follow-up prescriptions were already routinely implemented. For first and follow-up prescriptions, 39 (49 %) and 34 (43 %) pharmacists, respectively, stated they would need to spend at least 5 min extra to offer optimal patient counselling. The most frequently mentioned barriers to protocol implementation were lack of time (80 %), no integration in pharmacy software (61 %) and too much administrative burden (58 %). Approximately 50 % of the PHARMACOP-protocol items are currently routinely provided in Belgian community pharmacies. Nearly all interviewed pharmacists are willing to implement the protocol fully or partially in daily practice.

  6. Model-based imaging of cardiac electrical function in human atria

    NASA Astrophysics Data System (ADS)

    Modre, Robert; Tilg, Bernhard; Fischer, Gerald; Hanser, Friedrich; Messnarz, Bernd; Schocke, Michael F. H.; Kremser, Christian; Hintringer, Florian; Roithinger, Franz

    2003-05-01

    Noninvasive imaging of electrical function in the human atria is attained by the combination of data from electrocardiographic (ECG) mapping and magnetic resonance imaging (MRI). An anatomical computer model of the individual patient is the basis for our computer-aided diagnosis of cardiac arrhythmias. Three patients suffering from Wolff-Parkinson-White syndrome, from paroxysmal atrial fibrillation, and from atrial flutter underwent an electrophysiological study. After successful treatment of the cardiac arrhythmia with an invasive catheter technique, pacing protocols with stimuli at several anatomical sites (coronary sinus, left and right pulmonary vein, posterior site of the right atrium, right atrial appendage) were performed. Reconstructed activation time (AT) maps were validated with catheter-based electroanatomical data, with invasively determined pacing sites, and with pacing at anatomical markers. The individual complex anatomical model of the atria of each patient in combination with a high-quality mesh optimization enables accurate AT imaging, resulting in a localization error for the estimated pacing sites within 1 cm. Our findings may have implications for imaging of atrial activity in patients with focal arrhythmias.

  7. Nomogram for 30-day morbidity after primary cytoreductive surgery for advanced stage ovarian cancer.

    PubMed

    Nieuwenhuyzen-de Boer, G M; Gerestein, C G; Eijkemans, M J C; Burger, C W; Kooi, G S

    2016-01-01

    Extensive surgical procedures to achieve maximal cytoreduction in patients with advanced stage epithelial ovarian cancer (EOC) are inevitably associated with postoperative morbidity and mortality. This study aimed to identify preoperative predictors of 30-day morbidity after primary cytoreductive surgery for advanced stage EOC and to develop a nomogram for individual risk assessment. The study included patients in The Netherlands who underwent primary cytoreductive surgery for advanced stage EOC between January 2004 and December 2007. All peri- and postoperative complications within 30 days after surgery were registered and classified. To investigate predictors of 30-day morbidity, a Cox proportional hazard model with backward stepwise elimination was utilized. The identified predictors were entered into a nomogram. The main outcome was to identify parameters that predict operative risk. 293 patients entered the study protocol. Optimal cytoreduction was achieved in 136 (46%) patients. Thirty-day morbidity was seen in 99 (34%) patients. Morbidity could be predicted by age (p = 0.033; OR 1.024), preoperative hemoglobin (p = 0.194; OR 0.843), and WHO performance status (p = 0.015; OR 1.821) with an optimism-corrected c-statistic of 0.62. The determinants co-morbidity status, serum CA125 level, platelet count, and presence of ascites were comparable in both groups. Thirty-day morbidity after primary cytoreductive surgery for advanced stage EOC could be predicted by age, hemoglobin, and WHO performance status. The generated nomogram could be valuable for predicting operative risk in the individual patient.
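
    A nomogram of this kind corresponds to a logistic model over the selected predictors. The sketch below assembles such a model from the odds ratios quoted in the abstract (age 1.024, preoperative hemoglobin 0.843, WHO performance status 1.821); the intercept and the predictor units are hypothetical, chosen only to make the example runnable, since the abstract does not report them.

      import math

      # Coefficients are ln(odds ratios) quoted in the record; the intercept is
      # hypothetical (the abstract does not report one) and is for illustration only.
      B_AGE = math.log(1.024)      # per year of age
      B_HB = math.log(0.843)       # per unit of preoperative hemoglobin (units assumed)
      B_WHO = math.log(1.821)      # per WHO performance-status point
      B0 = -2.0                    # hypothetical intercept

      def predicted_30day_morbidity(age, hb, who):
          lp = B0 + B_AGE * age + B_HB * hb + B_WHO * who   # linear predictor
          return 1.0 / (1.0 + math.exp(-lp))                # logistic link

      print(f"example patient (age 65, Hb 7.5, WHO 1): "
            f"{100 * predicted_30day_morbidity(65, 7.5, 1):.0f}% predicted risk")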

  8. Cross-Layer Protocol Combining Tree Routing and TDMA Slotting in Wireless Sensor Networks

    NASA Astrophysics Data System (ADS)

    Bai, Ronggang; Ji, Yusheng; Lin, Zhiting; Wang, Qinghua; Zhou, Xiaofang; Qu, Yugui; Zhao, Baohua

    Unlike in other networks, the load and direction of data traffic in wireless sensor networks are rather predictable, and the relationships between nodes are cooperative rather than competitive. These features allow the protocol stack to be designed in a cross-layer, interactive way instead of as a strictly hierarchical structure. The proposed cross-layer protocol CLWSN optimizes the channel allocation in the MAC layer using information from the routing tables, reducing the conflicting set and improving the throughput. Simulations revealed that it outperforms SMAC and MINA in terms of delay and energy consumption.

  9. Integration of Molecular Dynamics Based Predictions into the Optimization of De Novo Protein Designs: Limitations and Benefits.

    PubMed

    Carvalho, Henrique F; Barbosa, Arménio J M; Roque, Ana C A; Iranzo, Olga; Branco, Ricardo J F

    2017-01-01

    Recent advances in de novo protein design have gained considerable insight from the intrinsic dynamics of proteins, through the integration of molecular dynamics simulation protocols into the state-of-the-art de novo protein design protocols used nowadays. With this protocol we illustrate how to set up and run a molecular dynamics simulation followed by a functional protein dynamics analysis. New users are introduced to useful open-source computational tools, including the GROMACS molecular dynamics simulation software package and ProDy for protein structural dynamics analysis.

  10. Squeezed-state quantum key distribution with a Rindler observer

    NASA Astrophysics Data System (ADS)

    Zhou, Jian; Shi, Ronghua; Guo, Ying

    2018-03-01

    Lengthening the maximum transmission distance of quantum key distribution plays a vital role in quantum information processing. In this paper, we propose a directional squeezed-state protocol with signals detected by a Rindler observer in the relativistic quantum field framework. We derive an analytical solution to the transmission problem of squeezed states from the inertial sender to the accelerated receiver. The variance of the involved signal mode is closer to optimality than that of the coherent-state-based protocol. Simulation results show that the proposed protocol has better performance than the coherent-state counterpart especially in terms of the maximal transmission distance.

  11. Molecular Docking Study on Galantamine Derivatives as Cholinesterase Inhibitors.

    PubMed

    Atanasova, Mariyana; Yordanov, Nikola; Dimitrov, Ivan; Berkov, Strahil; Doytchinova, Irini

    2015-06-01

    A training set of 22 synthetic galantamine derivatives binding to acetylcholinesterase was docked by GOLD and the protocol was optimized in terms of scoring function, rigidity/flexibility of the binding site, presence/absence of a water molecule inside and radius of the binding site. A moderate correlation was found between the affinities of compounds expressed as pIC50 values and their docking scores. The optimized docking protocol was validated by an external test set of 11 natural galantamine derivatives and the correlation coefficient between the docking scores and the pIC50 values was 0.800. The derived relationship was used to analyze the interactions between galantamine derivatives and AChE. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Moderate reagent mixing on an orbital shaker reduces the incubation time of enzyme-linked immunosorbent assay.

    PubMed

    Kumar, Saroj; Ahirwar, Rajesh; Rehman, Ishita; Nahar, Pradip

    2017-07-01

    Rapid diagnostic tests based on ELISA can be developed for detection of diseases in emergency conditions. Conventional ELISA takes 1-2 days, making it unsuitable for rapid diagnostics. Here, we report the effect of reagent mixing via shaking or vortexing on the assay timing of ELISA. A 48-min ELISA protocol involving 12-min incubations with reagent mixing at 750 rpm for every step was optimized. By contrast, a time-optimized control ELISA performed without mixing required about 8 h to produce similar results, so the developed protocol saves roughly 7 h. Collectively, the findings support the development of ELISA-based rapid diagnostics. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. International Laboratory Comparison of Influenza Microneutralization Assays for A(H1N1)pdm09, A(H3N2), and A(H5N1) Influenza Viruses by CONSISE

    PubMed Central

    Engelhardt, Othmar G.; Wood, John; Heath, Alan; Katz, Jacqueline M.; Peiris, Malik; Hoschler, Katja; Hungnes, Olav; Zhang, Wenqing; Van Kerkhove, Maria D.

    2015-01-01

    The microneutralization assay is commonly used to detect antibodies to influenza virus, and multiple protocols are used worldwide. These protocols differ in the incubation time of the assay as well as in the order of specific steps, and even within protocols there are often further adjustments in individual laboratories. The impact these protocol variations have on influenza serology data is unclear. Thus, a laboratory comparison of the 2-day enzyme-linked immunosorbent assay (ELISA) and 3-day hemagglutination (HA) microneutralization (MN) protocols, using A(H1N1)pdm09, A(H3N2), and A(H5N1) viruses, was performed by the CONSISE Laboratory Working Group. Individual laboratories performed both assay protocols, on multiple occasions, using different serum panels. Thirteen laboratories from around the world participated. Within each laboratory, serum sample titers for the different assay protocols were compared between assays to determine the sensitivity of each assay and were compared between replicates to assess the reproducibility of each protocol for each laboratory. There was good correlation of the results obtained using the two assay protocols in most laboratories, indicating that these assays may be interchangeable for detecting antibodies to the influenza A viruses included in this study. Importantly, participating laboratories have aligned their methodologies to the CONSISE consensus 2-day ELISA and 3-day HA MN assay protocols to enable better correlation of these assays in the future. PMID:26108286

  14. Three-dimensional image technology in forensic anthropology: Assessing the validity of biological profiles derived from CT-3D images of the skeleton

    NASA Astrophysics Data System (ADS)

    Garcia de Leon Valenzuela, Maria Julia

    This project explores the reliability of building a biological profile for an unknown individual based on three-dimensional (3D) images of the individual's skeleton. 3D imaging technology has been widely researched for medical and engineering applications, and it is increasingly being used as a tool for anthropological inquiry. While the question of whether a biological profile can be derived from 3D images of a skeleton with the same accuracy as achieved when using dry bones has been explored, bigger sample sizes, a standardized scanning protocol and more interobserver error data are needed before 3D methods can become widely and confidently used in forensic anthropology. 3D images of Computed Tomography (CT) scans were obtained from 130 innominate bones from Boston University's skeletal collection (School of Medicine). For each bone, both 3D images and original bones were assessed using the Phenice and Suchey-Brooks methods. Statistical analysis was used to determine the agreement between 3D image assessment versus traditional assessment. A pool of six individuals with varying experience in the field of forensic anthropology scored a subsample (n = 20) to explore interobserver error. While a high agreement was found for age and sex estimation for specimens scored by the author, the interobserver study shows that observers found it difficult to apply standard methods to 3D images. Higher levels of experience did not result in higher agreement between observers, as would be expected. Thus, a need for training in 3D visualization before applying anthropological methods to 3D bones is suggested. Future research should explore interobserver error using a larger sample size in order to test the hypothesis that training in 3D visualization will result in a higher agreement between scores. The need for the development of a standard scanning protocol focusing on the optimization of 3D image resolution is highlighted. Applications for this research include the possibility of digitizing skeletal collections in order to expand their use and for deriving skeletal collections from living populations and creating population-specific standards. Further research for the development of a standard scanning and processing protocol is needed before 3D methods in forensic anthropology are considered as reliable tools for generating biological profiles.

  15. Improving Operational Effectiveness of Tactical Long Endurance Unmanned Aerial Systems (TALEUAS) by Utilizing Solar Power

    DTIC Science & Technology

    2014-06-01

    ...discretized map, and use the map to optimally solve the navigation task. The optimal navigation solution utilizes the well-known travelling salesman problem...

  16. Wireless Cooperative Networks: Self-Configuration and Optimization

    DTIC Science & Technology

    2011-09-09

    Keywords: wireless sensor networks, wireless cooperative networks, resource optimization, ultra-wideband, localization, ranging. We consider two prevalent relay protocols for wireless sensor networks: decode-and-forward (DF) and amplify-and-forward (AF). ...sensor networks where each node may have its own sensing data to transmit, since they can maximally conserve energy while helping others as relays

  17. Magnetic nanobeads present during enzymatic amplification and labeling for a simplified DNA detection protocol based on AC susceptometry

    NASA Astrophysics Data System (ADS)

    Bejhed, Rebecca S.; Strømme, Maria; Svedlindh, Peter; Ahlford, Annika; Strömberg, Mattias

    2015-12-01

    Magnetic biosensors are promising candidates for low-cost point-of-care biodiagnostic devices. For optimal efficiency it is crucial to minimize the time and complexity of the assay protocol including target recognition, amplification, labeling and read-out. In this work, possibilities for protocol simplifications for a DNA biodetection principle relying on hybridization of magnetic nanobeads to rolling circle amplification (RCA) products are investigated. The target DNA is recognized through a padlock ligation assay resulting in DNA circles serving as templates for the RCA process. It is found that beads can be present during amplification without noticeably interfering with the enzyme used for RCA (phi29 polymerase). As a result, the bead-coil hybridization can be performed immediately after amplification in a one-step manner at elevated temperature within a few minutes prior to read-out in an AC susceptometer setup, i.e. a combined protocol approach. Moreover, by recording the phase angle ξ = arctan(χ″/χ′), where χ′ and χ″ are the in-phase and out-of-phase components of the AC susceptibility, respectively, at a single frequency, the total assay time for the optimized combined protocol would be no more than 1.5 hours, often a relevant time frame for diagnosis of cancer and infectious disease. Also, when applying the phase angle method, normalization of the AC susceptibility data is not needed. These findings are useful for the development of point-of-care biodiagnostic devices relying on bead-coil binding and magnetic AC susceptometry.
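
    The single-frequency read-out described here reduces to evaluating ξ = arctan(χ″/χ′) and comparing it against a cut-off. A minimal sketch with synthetic susceptibility readings (values illustrative only):

      import numpy as np

      def phase_angle_deg(chi_prime, chi_double_prime):
          """Phase angle xi = arctan(chi''/chi') of the complex AC susceptibility."""
          return np.degrees(np.arctan2(chi_double_prime, chi_prime))

      # Synthetic single-frequency readings (arbitrary units): binding to RCA coils
      # slows the beads' Brownian relaxation and shifts chi'' relative to chi'.
      negative_control = phase_angle_deg(chi_prime=1.00, chi_double_prime=0.08)
      positive_sample = phase_angle_deg(chi_prime=0.70, chi_double_prime=0.35)
      print(f"negative control: xi = {negative_control:.1f} deg")
      print(f"positive sample : xi = {positive_sample:.1f} deg")
      # A phase angle above a validated cut-off would be read as target detected;
      # no normalization of the raw susceptibility data is required.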

  18. Time-saving design of experiment protocol for optimization of LC-MS data processing in metabolomic approaches.

    PubMed

    Zheng, Hong; Clausen, Morten Rahr; Dalsgaard, Trine Kastrup; Mortensen, Grith; Bertram, Hanne Christine

    2013-08-06

    We describe a time-saving protocol for the processing of LC-MS-based metabolomics data by optimizing parameter settings in XCMS and threshold settings for removing noisy and low-intensity peaks using design of experiment (DoE) approaches including Plackett-Burman design (PBD) for screening and central composite design (CCD) for optimization. A reliability index, which is based on evaluation of the linear response to a dilution series, was used as a parameter for the assessment of data quality. After identifying the significant parameters in the XCMS software by PBD, CCD was applied to determine their values by maximizing the reliability and group indexes. Optimal settings by DoE resulted in improvements of 19.4% and 54.7% in the reliability index for a standard mixture and human urine, respectively, as compared with the default setting, and a total of 38 h was required to complete the optimization. Moreover, threshold settings were optimized by using CCD for further improvement. The approach combining optimal parameter setting and the threshold method improved the reliability index about 9.5 times for a standards mixture and 14.5 times for human urine data, which required a total of 41 h. Validation results also showed improvements in the reliability index of about 5-7 times even for urine samples from different subjects. It is concluded that the proposed methodology can be used as a time-saving approach for improving the processing of LC-MS-based metabolomics data.
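
    The screening step described here uses a Plackett-Burman design. The sketch below constructs the standard 12-run Plackett-Burman matrix from its cyclic generator and estimates main effects on a synthetic response; the factors and effect sizes are illustrative, not the study's XCMS parameters.

      import numpy as np

      def plackett_burman_12():
          """Standard 12-run Plackett-Burman design (+1/-1) for up to 11 factors."""
          gen = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])
          rows = [np.roll(gen, i) for i in range(11)]   # cyclic shifts of the generator
          rows.append(-np.ones(11, dtype=int))          # final row of all -1
          return np.array(rows)

      design = plackett_burman_12()

      # Synthetic screening: pretend factors 0 and 4 truly affect a reliability
      # index; all factor names and values here are illustrative.
      rng = np.random.default_rng(4)
      response = 50 + 6 * design[:, 0] - 4 * design[:, 4] + rng.normal(0, 1, 12)

      # Main effect of each factor = mean(response at +1) - mean(response at -1).
      effects = np.array([response[design[:, j] == 1].mean()
                          - response[design[:, j] == -1].mean() for j in range(11)])
      for j in np.argsort(-np.abs(effects))[:3]:
          print(f"factor {j}: estimated effect = {effects[j]:+.1f}")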

  19. Quantum dense key distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Degiovanni, I.P.; Ruo Berchera, I.; Castelletto, S.

    2004-03-01

    This paper proposes a protocol for quantum dense key distribution. This protocol embeds the benefits of quantum dense coding and quantum key distribution and is able to generate shared secret keys four times more efficiently than the Bennett-Brassard 1984 protocol. We hereinafter prove the security of this scheme against individual eavesdropping attacks, and we present preliminary experimental results, showing its feasibility.

  20. Energy and Process Assessment Protocol for Industrial Buildings

    DTIC Science & Technology

    2007-05-01

    address production and maintenance needs at U.S. Army Arsenals and Depots. The Protocol is partly the result of an international collaboration under...the International Energy Agency “Energy Conservation in Buildings and Community Systems” Annex 46, Subtask A. A group of government, institutional...Optimization Technology.” This is also a part of the IEA-ECBCS (International Energy Agency – Energy Conservation in Buildings and Community Systems

  1. In Situ Chemical Oxidation for Groundwater Remediation: Site-Specific Engineering & Technology Application

    DTIC Science & Technology

    2010-10-01

    PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES) Colorado School of Mines, 1500 Illinois St, Golden, CO, 80401 8. PERFORMING ORGANIZATION REPORT NUMBER 9...Protocol page 13 Overall ISCO Protocol Flow Diagram addition, laboratory studies may be used to select optimal chemistry parameters to maximize oxidant...Design Process 5. Because of the complexity of these oxidants’ chemistry and implementation, with much of the knowledge base residing with those

  2. Performance Analysis and Optimization of the Winnow Secret Key Reconciliation Protocol

    DTIC Science & Technology

    2011-06-01

    use in a quantum key system can be defined in two ways: the number of messages passed between Alice and Bob; the...classical and quantum environment. Post-quantum cryptography, which is generally used to describe classical quantum-resilient protocols, includes...composed of a one-way quantum channel and a two-way classical channel. Owing to the physics of the channel, the quantum channel is subject to

  3. Comparison of Diffusion MRI Acquisition Protocols for the In Vivo Characterization of the Mouse Spinal Cord: Variability Analysis and Application to an Amyotrophic Lateral Sclerosis Model

    PubMed Central

    Marcuzzo, Stefania; Bonanno, Silvia; Padelli, Francesco; Moreno-Manzano, Victoria; García-Verdugo, José Manuel; Bernasconi, Pia; Mantegazza, Renato; Bruzzone, Maria Grazia; Zucca, Ileana

    2016-01-01

    Diffusion-weighted Magnetic Resonance Imaging (dMRI) has relevant applications in the microstructural characterization of the spinal cord, especially in neurodegenerative diseases. Animal models have a pivotal role in the study of such diseases; however, in vivo spinal dMRI of small animals entails additional challenges that require a systematic investigation of acquisition parameters. The purpose of this study is to compare three acquisition protocols and identify the scanning parameters allowing a robust estimation of the main diffusion quantities and a good sensitivity to neurodegeneration in the mouse spinal cord. For all the protocols, the signal-to-noise and contrast-to-noise ratios and the mean value and variability of Diffusion Tensor metrics were evaluated in healthy controls. For the estimation of fractional anisotropy, less variability was provided by protocols with more diffusion directions; for the estimation of mean, axial and radial diffusivity, by protocols with fewer diffusion directions and higher diffusion weighting. Intermediate features (12 directions, b = 1200 s/mm²) provided the overall minimum inter- and intra-subject variability in most cases. In order to test the diagnostic sensitivity of the protocols, 7 G93A-SOD1 mice (model of amyotrophic lateral sclerosis) at 10 and 17 weeks of age were scanned and the derived diffusion parameters compared with those estimated in age-matched healthy animals. The protocols with an intermediate or high number of diffusion directions provided the best differentiation between the two groups at week 17, whereas only a few local significant differences were highlighted at week 10. According to our results, a dMRI protocol with an intermediate number of diffusion gradient directions and a relatively high diffusion weighting is optimal for spinal cord imaging. Further work is needed to confirm these results and for a finer tuning of acquisition parameters. Nevertheless, our findings could be important for the optimization of acquisition protocols for preclinical and clinical dMRI studies on the spinal cord. PMID:27560686
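
    For readers unfamiliar with the Diffusion Tensor metrics compared in this record, they follow from the three tensor eigenvalues by standard definitions (not specific to this study's pipeline). A short sketch with hypothetical eigenvalues:

```python
import numpy as np

def dti_metrics(eigenvalues):
    """Standard Diffusion Tensor metrics from the three tensor eigenvalues
    (sorted so that l1 >= l2 >= l3), diffusivities in mm^2/s."""
    l1, l2, l3 = sorted(eigenvalues, reverse=True)
    md = (l1 + l2 + l3) / 3.0                      # mean diffusivity
    ad = l1                                        # axial diffusivity
    rd = (l2 + l3) / 2.0                           # radial diffusivity
    fa = np.sqrt(1.5 * ((l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2)
                 / (l1 ** 2 + l2 ** 2 + l3 ** 2))  # fractional anisotropy
    return {"FA": fa, "MD": md, "AD": ad, "RD": rd}

# Hypothetical eigenvalues for a white-matter-like voxel (mm^2/s)
print(dti_metrics([1.6e-3, 0.4e-3, 0.3e-3]))
```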

  4. A comprehensive study on the relationship between the image quality and imaging dose in low-dose cone beam CT

    NASA Astrophysics Data System (ADS)

    Yan, Hao; Cervino, Laura; Jia, Xun; Jiang, Steve B.

    2012-04-01

    While compressed sensing (CS)-based algorithms have been developed for the low-dose cone beam CT (CBCT) reconstruction, a clear understanding of the relationship between the image quality and imaging dose at low-dose levels is needed. In this paper, we qualitatively investigate this subject in a comprehensive manner with extensive experimental and simulation studies. The basic idea is to plot both the image quality and imaging dose together as functions of the number of projections and mAs per projection over the whole clinically relevant range. On this basis, a clear understanding of the tradeoff between the image quality and imaging dose can be achieved and optimal low-dose CBCT scan protocols can be developed to maximize the dose reduction while minimizing the image quality loss for various imaging tasks in image-guided radiation therapy (IGRT). Main findings of this work include (1) under the CS-based reconstruction framework, image quality has little degradation over a large range of dose variation. Image quality degradation becomes evident when the imaging dose (approximated with the x-ray tube load) is decreased below 100 total mAs. An imaging dose lower than 40 total mAs leads to a dramatic image degradation, and thus should be used cautiously. Optimal low-dose CBCT scan protocols likely fall in the dose range of 40-100 total mAs, depending on the specific IGRT applications. (2) Among different scan protocols at a constant low-dose level, the super sparse-view reconstruction with the projection number less than 50 is the most challenging case, even with strong regularization. Better image quality can be acquired with low mAs protocols. (3) The optimal scan protocol is the combination of a medium number of projections and a medium level of mAs/view. This is more evident when the dose is around 72.8 total mAs or below and when the ROI is a low-contrast or high-resolution object. Based on our results, the optimal number of projections is around 90 to 120. (4) The clinically acceptable lowest imaging dose level is task dependent. In our study, 72.8 mAs is a safe dose level for visualizing low-contrast objects, while 12.2 total mAs is sufficient for detecting high-contrast objects of diameter greater than 3 mm.
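
    The dose figures quoted in this record are total tube load, i.e. the number of projections multiplied by the mAs per projection. A small sketch that applies the abstract's own thresholds (40 and 100 total mAs) to hypothetical candidate protocols, purely as bookkeeping rather than clinical guidance:

```python
# Total tube load (mAs) of a CBCT protocol is the number of projections times
# the mAs per projection; the record above uses this as a surrogate for
# imaging dose. The thresholds below are the ones quoted in the abstract,
# applied to hypothetical candidate protocols.
candidates = [
    (90, 0.8),   # (number of projections, mAs per projection) -- hypothetical
    (120, 0.6),
    (45, 0.25),
    (360, 0.4),
]

for n_proj, mas_per_proj in candidates:
    total_mas = n_proj * mas_per_proj
    if total_mas < 40:
        band = "below 40 total mAs: dramatic quality loss reported"
    elif total_mas <= 100:
        band = "40-100 total mAs: range where the optimal low-dose protocols fell"
    else:
        band = "above 100 total mAs: little quality degradation reported"
    print(f"{n_proj:4d} proj x {mas_per_proj:.2f} mAs = {total_mas:6.1f} total mAs -> {band}")
```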

  5. Optimizing estimation of hemispheric dominance for language using magnetic source imaging

    PubMed Central

    Passaro, Antony D.; Rezaie, Roozbeh; Moser, Dana C.; Li, Zhimin; Dias, Nadeeka; Papanicolaou, Andrew C.

    2011-01-01

    The efficacy of magnetoencephalography (MEG) as an alternative to invasive methods for investigating the cortical representation of language has been explored in several studies. Recently, studies comparing MEG to the gold-standard Wada procedure have found inconsistent and often less-than-accurate estimates of laterality across various MEG studies. Here we attempted to address this issue among normal right-handed adults (N=12) by supplementing a well-established MEG protocol involving word recognition and the single dipole method with a sentence comprehension task and a beamformer approach localizing neural oscillations. Beamformer analysis of word recognition and sentence comprehension tasks revealed a desynchronization in the 10–18 Hz range, localized to the temporo-parietal cortices. Inspection of individual profiles of localized desynchronization (10–18 Hz) revealed left hemispheric dominance in 91.7% and 83.3% of individuals during the word recognition and sentence comprehension tasks, respectively. In contrast, single dipole analysis yielded lower estimates, such that activity in temporal language regions was left-lateralized in 66.7% and 58.3% of individuals during word recognition and sentence comprehension, respectively. The results obtained from the word recognition task and localization of oscillatory activity using a beamformer appear to be in line with general estimates of left hemispheric dominance for language in normal right-handed individuals. Furthermore, the current findings support the growing notion that changes in neural oscillations underlie critical components of linguistic processing. PMID:21890118

  6. Using generalizability theory to develop clinical assessment protocols.

    PubMed

    Preuss, Richard A

    2013-04-01

    Clinical assessment protocols must produce data that are reliable, with a clinically attainable minimal detectable change (MDC). In a reliability study, generalizability theory has 2 advantages over classical test theory. These advantages provide information that allows assessment protocols to be adjusted to match individual patient profiles. First, generalizability theory allows the user to simultaneously consider multiple sources of measurement error variance (facets). Second, it allows the user to generalize the findings of the main study across the different study facets and to recalculate the reliability and MDC based on different combinations of facet conditions. In doing so, clinical assessment protocols can be chosen based on minimizing the number of measures that must be taken to achieve a realistic MDC, using repeated measures to minimize the MDC, or simply based on the combination that best allows the clinician to monitor an individual patient's progress over a specified period of time.
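
    The quantities this record manipulates, a reliability-like coefficient and the minimal detectable change, follow from variance components by standard formulas (SEM = SD·sqrt(1 − reliability), MDC95 = 1.96·sqrt(2)·SEM). Below is a single-facet sketch with hypothetical variance components; the multi-facet designs the record describes are not captured here:

```python
import math

def g_coefficient(var_person: float, var_error: float, n_conditions: int = 1) -> float:
    """Reliability-like coefficient when scores are averaged over n_conditions
    of the error facet: the error variance shrinks by a factor of 1/n."""
    return var_person / (var_person + var_error / n_conditions)

def mdc95(var_person: float, var_error: float, n_conditions: int = 1) -> float:
    """Minimal detectable change at the 95% level:
    SEM = SD * sqrt(1 - reliability); MDC95 = 1.96 * sqrt(2) * SEM."""
    total_sd = math.sqrt(var_person + var_error / n_conditions)
    rel = g_coefficient(var_person, var_error, n_conditions)
    sem = total_sd * math.sqrt(1.0 - rel)
    return 1.96 * math.sqrt(2.0) * sem

# Hypothetical variance components (e.g. squared degrees of joint motion)
var_person, var_error = 25.0, 9.0
for k in (1, 2, 3, 5):
    print(f"average of {k} measure(s): G = {g_coefficient(var_person, var_error, k):.2f}, "
          f"MDC95 = {mdc95(var_person, var_error, k):.1f}")
```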

  7. Optimal diabatic dynamics of Majorana-based quantum gates

    NASA Astrophysics Data System (ADS)

    Rahmani, Armin; Seradjeh, Babak; Franz, Marcel

    2017-08-01

    In topological quantum computing, unitary operations on qubits are performed by adiabatic braiding of non-Abelian quasiparticles, such as Majorana zero modes, and are protected from local environmental perturbations. In the adiabatic regime, with timescales set by the inverse gap of the system, the errors can be made arbitrarily small by performing the process more slowly. To enhance the performance of quantum information processing with Majorana zero modes, we apply the theory of optimal control to the diabatic dynamics of Majorana-based qubits. While we sacrifice complete topological protection, we impose constraints on the optimal protocol to take advantage of the nonlocal nature of topological information and increase the robustness of our gates. By using Pontryagin's maximum principle, we show that robust gates equivalent to perfect adiabatic braiding can be implemented in finite times through optimal pulses. In our implementation, modifications to the device Hamiltonian are avoided. Focusing on thermally isolated systems, we study the effects of calibration errors and external white and 1/f (pink) noise on Majorana-based gates. While a noise-induced antiadiabatic behavior, where a slower process creates more diabatic excitations, prohibits indefinite enhancement of the robustness of the adiabatic scheme, our fast optimal protocols exhibit remarkable stability to noise and have the potential to significantly enhance the practical performance of Majorana-based information processing.

  8. Evaluation and optimization of microbial DNA extraction from fecal samples of wild Antarctic bird species

    PubMed Central

    Eriksson, Per; Mourkas, Evangelos; González-Acuna, Daniel; Olsen, Björn; Ellström, Patrik

    2017-01-01

    Introduction: Advances in the development of nucleic acid-based methods have dramatically facilitated studies of host–microbial interactions. Fecal DNA analysis can provide information about the host’s microbiota and gastrointestinal pathogen burden. Numerous studies have been conducted in mammals, yet birds are less well studied. Avian fecal DNA extraction has proved challenging, partly due to the mixture of fecal and urinary excretions and the deficiency of optimized protocols. This study evaluates the performance of six commercial kits for DNA extraction from the feces of different bird species, focusing on penguins. Material and methods: Six DNA extraction kits were first tested according to the manufacturers’ instructions using mallard feces. The kit giving the highest DNA yield was selected for further optimization and evaluation using Antarctic bird feces. Results: Penguin feces constitute a challenging sample type: most of the DNA extraction kits failed to yield acceptable amounts of DNA. The QIAamp cador Pathogen kit (Qiagen) performed the best in the initial investigation. Further optimization of the protocol resulted in good yields of high-quality DNA from seven bird species of different avian orders. Conclusion: This study presents an optimized approach to DNA extraction from challenging avian fecal samples. PMID:29152162

  9. AMMOS2: a web server for protein-ligand-water complexes refinement via molecular mechanics.

    PubMed

    Labbé, Céline M; Pencheva, Tania; Jereva, Dessislava; Desvillechabrol, Dimitri; Becot, Jérôme; Villoutreix, Bruno O; Pajeva, Ilza; Miteva, Maria A

    2017-07-03

    AMMOS2 is an interactive web server for efficient computational refinement of protein-small organic molecule complexes. The AMMOS2 protocol employs atomic-level energy minimization of a large number of experimental or modeled protein-ligand complexes. The web server is based on the previously developed standalone software AMMOS (Automatic Molecular Mechanics Optimization for in silico Screening). AMMOS utilizes the physics-based force field AMMP sp4 and performs optimization of protein-ligand interactions at five levels of flexibility of the protein receptor. The new version 2 of AMMOS implemented in the AMMOS2 web server allows the users to include explicit water molecules and individual metal ions in the protein-ligand complexes during minimization. The web server provides comprehensive analysis of computed energies and interactive visualization of refined protein-ligand complexes. The ligands are ranked by the minimized binding energies allowing the users to perform additional analysis for drug discovery or chemical biology projects. The web server has been extensively tested on 21 diverse protein-ligand complexes. AMMOS2 minimization shows consistent improvement over the initial complex structures in terms of minimized protein-ligand binding energies and water positions optimization. The AMMOS2 web server is freely available without any registration requirement at the URL: http://drugmod.rpbs.univ-paris-diderot.fr/ammosHome.php. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  10. Goals and Objectives to Optimize the Value of an Acute Pain Service in Perioperative Pain Management.

    PubMed

    Le-Wendling, Linda; Glick, Wesley; Tighe, Patrick

    2017-12-01

    As newer pharmacologic and procedural interventions, technology, and data on outcomes in pain management are becoming available, effective acute pain management will require a dedicated Acute Pain Service (APS) to help determine the optimal pain management plan for each patient. Goals for pain management must take into consideration the side effect profile of drugs and potential complications of procedural interventions. Multiple-objective optimization means balancing several different objectives for acute pain management at once. Simple use of opioids, for example, can reduce all pain to minimal levels, but at what cost to the patient, the medical system, and to public health as a whole? Many models for an APS exist based on personnel's skills, knowledge and experience, but effective use of an APS will also require allocation of time, space, financial, and personnel resources with clear objectives and a feedback mechanism to guide changes to acute pain medicine practices to keep pace with the constantly evolving medical field. Physician-based practices have the advantage of developing protocols for the management of low-variability, high-occurrence scenarios in addition to tailoring care to individual patients with high-variability, low-occurrence scenarios. Frequent feedback and data collection/assessment on patient outcomes is essential in evaluating the efficacy of the Acute Pain Service's intervention in improving patient outcomes in the acute and perioperative setting.

  11. AMMOS2: a web server for protein–ligand–water complexes refinement via molecular mechanics

    PubMed Central

    Labbé, Céline M.; Pencheva, Tania; Jereva, Dessislava; Desvillechabrol, Dimitri; Becot, Jérôme; Villoutreix, Bruno O.; Pajeva, Ilza

    2017-01-01

    AMMOS2 is an interactive web server for efficient computational refinement of protein–small organic molecule complexes. The AMMOS2 protocol employs atomic-level energy minimization of a large number of experimental or modeled protein–ligand complexes. The web server is based on the previously developed standalone software AMMOS (Automatic Molecular Mechanics Optimization for in silico Screening). AMMOS utilizes the physics-based force field AMMP sp4 and performs optimization of protein–ligand interactions at five levels of flexibility of the protein receptor. The new version 2 of AMMOS implemented in the AMMOS2 web server allows the users to include explicit water molecules and individual metal ions in the protein–ligand complexes during minimization. The web server provides comprehensive analysis of computed energies and interactive visualization of refined protein–ligand complexes. The ligands are ranked by the minimized binding energies allowing the users to perform additional analysis for drug discovery or chemical biology projects. The web server has been extensively tested on 21 diverse protein–ligand complexes. AMMOS2 minimization shows consistent improvement over the initial complex structures in terms of minimized protein–ligand binding energies and water positions optimization. The AMMOS2 web server is freely available without any registration requirement at the URL: http://drugmod.rpbs.univ-paris-diderot.fr/ammosHome.php. PMID:28486703

  12. ABM Clinical Protocol #2: Guidelines for Hospital Discharge of the Breastfeeding Term Newborn and Mother: “The Going Home Protocol,” Revised 2014

    PubMed Central

    Evans, Amy; Taylor, Julie Scott

    2014-01-01

    A central goal of The Academy of Breastfeeding Medicine is the development of clinical protocols for managing common medical problems that may impact breastfeeding success. These protocols serve only as guidelines for the care of breastfeeding mothers and infants and do not delineate an exclusive course of treatment or serve as standards of medical care. Variations in treatment may be appropriate according to the needs of an individual patient. PMID:24456024

  13. Protein substitute dosage in PKU: how much do young patients need?

    PubMed

    MacDonald, A; Chakrapani, A; Hendriksz, C; Daly, A; Davies, P; Asplin, D; Hall, K; Booth, I W

    2006-07-01

    The optimal dose of protein substitute has not been determined in children with phenylketonuria (PKU). The aim was to determine whether a lower dose of protein substitute could achieve the same or better degree of blood phenylalanine control when compared to the dosage recommended by the UK MRC.(1) In a six-week randomised, crossover study, two doses of protein substitute (Protocol A: 2 g/kg/day of protein equivalent; Protocol B: 1.2 g/kg/day protein equivalent) were compared in 25 children with well controlled PKU aged 2-10 years (median 6 years). Each dose of protein substitute was taken for 14 days, with a 14-day washout period in between. Twice daily blood samples (fasting pre-breakfast and evening, at standard times) for plasma phenylalanine were taken on days 8-14 of each protocol. The median usual dose of protein substitute was 2.2 g/kg/day (range 1.5-3.1 g/kg/day). When compared with control values, median plasma phenylalanine on the low dose of protein substitute increased at pre-breakfast by 301 μmol/l (95% CI 215 to 386) and in the evening by 337 μmol/l (95% CI 248 to 431). On the high dose of protein substitute, plasma phenylalanine concentrations remained unchanged when compared to control values. However, wide variability was seen between subjects. A higher dosage of protein substitute appeared to contribute to lower blood phenylalanine concentrations in PKU, but it did have a variable and individual impact and may have been influenced by the carbohydrate (+/- fat) content of the protein substitute.

  14. Paraformaldehyde fixation of neutrophils for immunolabeling of granule antigens in cryoultrasections.

    PubMed

    Elliott, E; Dennison, C; Fortgens, P H; Travis, J

    1995-10-01

    Paraformaldehyde (PFA) fixation was optimized to facilitate the immobilization and labeling of multiple granule antigens, using short fixation regimens and cryoultramicrotomy of unembedded neutrophils (PMNs). In the optimal protocol, extraction of azurophil granule antigens (especially of the abundant elastase) was obviated by manipulating the polymeric state of PFA, and hence its rate of cross-linking, by altering its concentration and pH in a multistep process. Primary fixation conditions used (4% PFA, pH 8.0, 5 min) favor fixative penetration and rapid cross-linking. Stable cross-linking of the antigen was achieved in a secondary fixation step using conditions that favor larger, more cross-linking polymeric forms of PFA (8% PFA, pH 7.2, 15 min). Immobilization of granule antigens was enhanced by flotation of cut sections on fixative (8% PFA, pH 8.0) before labeling and by using post-labeling fixation with 1% glutaraldehyde. The optimized protocol facilitated immobilization and immunolabeling of elastase, myeloperoxidase, lactoferrin, and cathepsin D in highly hydrated, unembedded PMNs.

  15. Optimal quantum operations at zero energy cost

    NASA Astrophysics Data System (ADS)

    Chiribella, Giulio; Yang, Yuxiang

    2017-08-01

    Quantum technologies are developing powerful tools to generate and manipulate coherent superpositions of different energy levels. Envisaging a new generation of energy-efficient quantum devices, here we explore how coherence can be manipulated without exchanging energy with the surrounding environment. We start from the task of converting a coherent superposition of energy eigenstates into another. We identify the optimal energy-preserving operations, both in the deterministic and in the probabilistic scenario. We then design a recursive protocol, wherein a branching sequence of energy-preserving filters increases the probability of success while reaching maximum fidelity at each iteration. Building on the recursive protocol, we construct efficient approximations of the optimal fidelity-probability trade-off, by taking coherent superpositions of the different branches generated by probabilistic filtering. The benefits of this construction are illustrated in applications to quantum metrology, quantum cloning, coherent state amplification, and ancilla-driven computation. Finally, we extend our results to transitions where the input state is generally mixed and we apply our findings to the task of purifying quantum coherence.

  16. Nasal irrigation: From empiricism to evidence-based medicine. A review.

    PubMed

    Bastier, P-L; Lechot, A; Bordenave, L; Durand, M; de Gabory, L

    2015-11-01

    Nasal irrigation plays a non-negligible role in the treatment of numerous sinonasal pathologies and postoperative care. There is, however, a wide variety of protocols. The present review of the evidence-based literature sought objective arguments for optimization and efficacy. It emerged that large-volume low-pressure nasal douche optimizes the distribution and cleansing power of the irrigation solution in the nasal cavity. Ionic composition and pH also influence mucociliary clearance and epithelium trophicity. Seawater is less rich in sodium ions and richer in bicarbonates, potassium, calcium and magnesium than is isotonic normal saline, while alkaline pH and elevated calcium concentration optimized ciliary motility in vitro. Bicarbonates reduce secretion viscosity. Potassium and magnesium promote healing and limit local inflammation. These results show that the efficacy of nasal irrigation is multifactorial. Large-volume low-pressure nasal irrigation using undiluted seawater seems, in the present state of knowledge, to be the most effective protocol. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  17. Optimal single-shot strategies for discrimination of quantum measurements

    NASA Astrophysics Data System (ADS)

    Sedlák, Michal; Ziman, Mário

    2014-11-01

    We study discrimination of m quantum measurements in the scenario when the unknown measurement with n outcomes can be used only once. We show that ancilla-assisted discrimination procedures provide a nontrivial advantage over simple (ancilla-free) schemes for perfect distinguishability and we prove that inevitably m ≤ n. We derive necessary and sufficient conditions of perfect distinguishability of general binary measurements. We show that the optimization of the discrimination of projective qubit measurements and their mixtures with white noise is equivalent to the discrimination of specific quantum states. In particular, the optimal protocol for discrimination of projective qubit measurements with fixed failure rate (exploiting a maximally entangled test state) is described. While minimum-error discrimination of two projective qubit measurements can be realized without any need of entanglement, we show that discrimination of three projective qubit measurements requires a bipartite probe state. Moreover, when the measurements are not projective, the non-maximally entangled test states can outperform the maximally entangled ones. Finally, we rephrase the unambiguous discrimination of measurements as a quantum key distribution protocol.

  18. SU-E-I-23: A General KV Constrained Optimization of CNR for CT Abdominal Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weir, V; Zhang, J

    Purpose: While tube current modulation has been well accepted for CT dose reduction, kV adjusting in clinical settings is still at its early stage. This is mainly due to the limited kV options of most current CT scanners. kV adjusting can potentially reduce radiation dose and optimize image quality. This study aims to optimize CT abdomen imaging acquisition based on the assumption of a continuous kV, with the goal of providing the best contrast-to-noise ratio (CNR). Methods: For a given dose (CTDIvol) level, the CNRs at different kV and pitches were measured with an ACR GAMMEX phantom. The phantom was scanned in a Siemens Sensation 64 scanner and a GE VCT 64 scanner. A constrained mathematical optimization was used to find the kV which led to the highest CNR for the anatomy and pitch setting. Parametric equations were obtained from polynomial fitting of plots of kVs vs CNRs. A suitable constraint region for optimization was chosen. Subsequent optimization yielded a peak CNR at a particular kV for different collimations and pitch settings. Results: The constrained mathematical optimization approach yields kV values of 114.83 and 113.46, with CNRs of 1.27 and 1.11 at pitches of 1.2 and 1.4, respectively, for the Siemens Sensation 64 scanner with a collimation of 32 x 0.625 mm. An optimized kV of 134.25 and a CNR of 1.51 are obtained for a GE VCT 64 slice scanner with a collimation of 32 x 0.625 mm and a pitch of 0.969. At a pitch of 0.516 and 32 x 0.625 mm, an optimized kV of 133.75 and a CNR of 1.14 were found for the GE VCT 64 slice scanner. Conclusion: CNR in CT image acquisition can be further optimized with a continuous kV option instead of current discrete or fixed kV settings. A continuous kV option is a key for individualized CT protocols.
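
    The workflow described above, fitting a parametric curve of CNR against kV and then maximizing it inside a constraint region, can be sketched with a low-order polynomial fit and a bounded grid search. All (kV, CNR) pairs below are hypothetical; the optimal kV values reported in the record come from its own phantom measurements:

```python
import numpy as np

# Hypothetical (kV, CNR) measurements at one pitch and collimation, standing in
# for the ACR phantom data described above.
kv = np.array([80.0, 100.0, 110.0, 120.0, 130.0, 140.0])
cnr = np.array([0.82, 1.10, 1.21, 1.25, 1.22, 1.12])

# Parametric equation: low-order polynomial fit of CNR as a function of kV.
poly = np.poly1d(np.polyfit(kv, cnr, deg=2))

# Constrained optimization: restrict the search to an allowed kV window and
# take the maximizer of the fitted curve on a fine grid.
kv_lo, kv_hi = 90.0, 140.0
grid = np.linspace(kv_lo, kv_hi, 1001)
best_kv = grid[np.argmax(poly(grid))]
print(f"optimal kV in [{kv_lo:.0f}, {kv_hi:.0f}]: {best_kv:.1f} kV, "
      f"predicted CNR = {poly(best_kv):.2f}")
```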

  19. Generation of insulin-producing cells from human bone marrow-derived mesenchymal stem cells: comparison of three differentiation protocols.

    PubMed

    Gabr, Mahmoud M; Zakaria, Mahmoud M; Refaie, Ayman F; Khater, Sherry M; Ashamallah, Sylvia A; Ismail, Amani M; El-Badri, Nagwa; Ghoneim, Mohamed A

    2014-01-01

    Many protocols were utilized for directed differentiation of mesenchymal stem cells (MSCs) to form insulin-producing cells (IPCs). We compared the relative efficiency of three differentiation protocols. Human bone marrow-derived MSCs (HBM-MSCs) were obtained from three insulin-dependent type 2 diabetic patients. Differentiation into IPCs was carried out by three protocols: conophylline-based (one-step protocol), trichostatin-A-based (two-step protocol), and β-mercaptoethanol-based (three-step protocol). At the end of differentiation, cells were evaluated by immunolabeling for insulin production, expression of pancreatic endocrine genes, and release of insulin and c-peptide in response to increasing glucose concentrations. By immunolabeling, the proportion of generated IPCs was modest (≃3%) in all the three protocols. All relevant pancreatic endocrine genes, insulin, glucagon, and somatostatin, were expressed. There was a stepwise increase in insulin and c-peptide release in response to glucose challenge, but the released amounts were low when compared with those of pancreatic islets. The yield of functional IPCs following directed differentiation of HBM-MSCs was modest and was comparable among the three tested protocols. Protocols for directed differentiation of MSCs need further optimization in order to be clinically meaningful. To this end, addition of an extracellular matrix and/or a suitable template should be attempted.

  20. Deterministic and unambiguous dense coding

    NASA Astrophysics Data System (ADS)

    Wu, Shengjun; Cohen, Scott M.; Sun, Yuqing; Griffiths, Robert B.

    2006-04-01

    Optimal dense coding using a partially-entangled pure state of Schmidt rank D̄ and a noiseless quantum channel of dimension D is studied both in the deterministic case where at most L_d messages can be transmitted with perfect fidelity, and in the unambiguous case where, when the protocol succeeds (probability τ_x), Bob knows for sure that Alice sent message x, and when it fails (probability 1-τ_x) he knows it has failed. Alice is allowed any single-shot (one use) encoding procedure, and Bob any single-shot measurement. For D̄ ⩽ D a bound is obtained for L_d in terms of the largest Schmidt coefficient of the entangled state, and is compared with published results by Mozes [Phys. Rev. A 71, 012311 (2005)]. For D̄ > D it is shown that L_d is strictly less than D² unless D̄ is an integer multiple of D, in which case uniform (maximal) entanglement is not needed to achieve the optimal protocol. The unambiguous case is studied for D̄ ⩽ D, assuming τ_x > 0 for a set of D̄D messages, and a bound is obtained for the average ⟨1/τ⟩. A bound on the average ⟨τ⟩ requires an additional assumption of encoding by isometries (unitaries when D̄ = D) that are orthogonal for different messages. Both bounds are saturated when τ_x is a constant independent of x, by a protocol based on one-shot entanglement concentration. For D̄ > D it is shown that (at least) D² messages can be sent unambiguously. Whether unitary (isometric) encoding suffices for optimal protocols remains a major unanswered question, both for our work and for previous studies of dense coding using partially-entangled states, including noisy (mixed) states.

  1. Towards optimized anesthesia protocols for stereotactic surgery in rats: Analgesic, stress and general health effects of injectable anesthetics. A comparison of a recommended complete reversal anesthesia with traditional chloral hydrate monoanesthesia.

    PubMed

    Hüske, Christin; Sander, Svenja Esther; Hamann, Melanie; Kershaw, Olivia; Richter, Franziska; Richter, Angelika

    2016-07-01

    Although injectable anesthetics are still widely used in laboratory rodents, scientific data concerning pain and distress during and after stereotactic surgery are rare. However, optimal anesthesia protocols have a high impact on the quality of the derived data. We therefore investigated the suitability of recommended injectable anesthesia with a traditionally used monoanesthesia for stereotactic surgery in view of optimization and refinement in rats. The influence of the recommended complete reversal anesthesia (MMF; 0.15mg/kg medetomidine, 2mg/kg midazolam, 0.005mg/kg fentanyl; i.m.) with or without reversal and of chloral hydrate (430mg/kg, 3.6%, i.p.) on various physiological, biochemical and behavioral parameters (before, during, after surgery) was analyzed. Isoflurane was also included in stress parameter analysis. In all groups, depth of anesthesia was sufficient for stereotactic surgery with no animal losses. MMF caused transient exophthalmos, myositis at the injection site and increased early postoperative pain scores. Reversal induced agitation, restlessness and hypothermia. Even the low concentrated chloral hydrate led to peritonitis and multifocal liver necrosis, corresponding to increased stress hormone levels and loss in body weight. Increased stress response was also exerted by isoflurane anesthesia. Pronounced systemic toxicity of chloral hydrate strongly questions its further use in rodent anesthesia. In view of undesired effects of MMF and isoflurane, thorough consideration of anesthesia protocols for particular research projects is indispensable. Reversal should be restricted to emergency situations. Our data support further refinement of the current protocols and the importance of sham operated controls. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Multipinhole SPECT helical scan parameters and imaging volume

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yao, Rutao, E-mail: rutaoyao@buffalo.edu; Deng, Xiao; Wei, Qingyang

    Purpose: The authors developed SPECT imaging capability on an animal PET scanner using a multiple-pinhole collimator and step-and-shoot helical data acquisition protocols. The objective of this work was to determine the preferred helical scan parameters, i.e., the angular and axial step sizes, and the imaging volume, that provide optimal imaging performance. Methods: The authors studied nine helical scan protocols formed by permuting three rotational and three axial step sizes. These step sizes were chosen around the reference values analytically calculated from the estimated spatial resolution of the SPECT system and the Nyquist sampling theorem. The nine helical protocols were evaluated by two figures-of-merit: the sampling completeness percentage (SCP) and the root-mean-square (RMS) resolution. SCP was an analytically calculated numerical index based on projection sampling. RMS resolution was derived from the reconstructed images of a sphere-grid phantom. Results: The RMS resolution results show that (1) the start and end pinhole planes of the helical scheme determine the axial extent of the effective field of view (EFOV), and (2) the diameter of the transverse EFOV is adequately calculated from the geometry of the pinhole opening, since the peripheral region beyond the EFOV would introduce projection multiplexing and consequent effects. The RMS resolution results of the nine helical scan schemes show optimal resolution is achieved when the axial step size is half of, and the angular step size is about twice, the corresponding values derived from the Nyquist theorem. The SCP results agree in general with those of the RMS resolution but are less critical in assessing the effects of helical parameters and EFOV. Conclusions: The authors quantitatively validated the effective FOV of multiple pinhole helical scan protocols and proposed a simple method to calculate optimal helical scan parameters.
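
    One plausible way to turn a resolution estimate into Nyquist-based reference step sizes, and then apply the empirical adjustment reported above (axial step about half, angular step about twice the reference), is sketched below. The record does not give its exact formulas, so both the formulas and the numbers are assumptions for illustration:

```python
import math

# Assumed reading of the Nyquist-based reference values: linear sampling at
# half the estimated resolution, and an angular step whose arc length at the
# edge of the transverse field of view equals that same linear interval.
resolution_mm = 1.5      # estimated SPECT spatial resolution (hypothetical)
fov_radius_mm = 15.0     # radius of the transverse field of view (hypothetical)

axial_nyquist_mm = resolution_mm / 2.0
angular_nyquist_deg = math.degrees((resolution_mm / 2.0) / fov_radius_mm)

# Empirical adjustment reported in the record: best resolution with an axial
# step of about half, and an angular step of about twice, the Nyquist values.
axial_step_mm = 0.5 * axial_nyquist_mm
angular_step_deg = 2.0 * angular_nyquist_deg

print(f"Nyquist reference: {axial_nyquist_mm:.2f} mm axial, {angular_nyquist_deg:.2f} deg angular")
print(f"suggested helical steps: {axial_step_mm:.2f} mm axial, {angular_step_deg:.2f} deg angular")
```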

  3. Loaded hip thrust-based PAP protocol effects on acceleration and sprint performance of handball players.

    PubMed

    Dello Iacono, Antonio; Padulo, Johnny; Seitz, Laurent D

    2018-06-01

    This study aimed to investigate the acute effects of two barbell hip thrust-based (BHT) post-activation potentiation (PAP) protocols on subsequent sprint performance. Using a crossover design, eighteen handball athletes performed maximal 15-m sprints before and 15s, 4min and 8min after two experimental protocols consisting of BHT loaded with either 50% or 85% 1RM (50PAP and 85PAP, respectively), in order to profile the transient PAP effects. The resulting sprint performances were significantly impaired at 15s only after the 85PAP protocol, which induced likely and very likely greater decreases compared to the 50PAP. At 4min and 8min, significant improvements and very likely beneficial effects were observed in the 10m and 15m performances following both protocols. Significant differences were found when comparing the two PAPs over time; the results suggested very likely greater performance improvements in 10m following the 85PAP after 4min and 8min, and possible greater performance improvements in 15m after 4min. Positive correlations between BHT 1RM values and the greatest individual PAP responses on sprint performance were found. This investigation showed that both moderate and intensive BHT exercises can induce a PAP response, but the effects may differ according to the recovery following the potentiating stimulus and the individual's strength level.

  4. Quantitative and qualitative comparison of MR imaging of the temporomandibular joint at 1.5 and 3.0 T using an optimized high-resolution protocol

    PubMed Central

    Spinner, Georg; Wyss, Michael; Erni, Stefan; Ettlin, Dominik A; Nanz, Daniel; Ulbrich, Erika J; Gallo, Luigi M; Andreisek, Gustav

    2016-01-01

    Objectives: To quantitatively and qualitatively compare MRI of the temporomandibular joint (TMJ) using an optimized high-resolution protocol at 3.0 T and a clinical standard protocol at 1.5 T. Methods: A phantom and 12 asymptomatic volunteers were MR imaged using a 2-channel surface coil (standard TMJ coil) at 1.5 and 3.0 T (Philips Achieva and Philips Ingenia, respectively; Philips Healthcare, Best, Netherlands). Imaging protocol consisted of coronal and oblique sagittal proton density-weighted turbo spin echo sequences. For quantitative evaluation, a spherical phantom was imaged. Signal-to-noise ratio (SNR) maps were calculated on a voxelwise basis. For qualitative evaluation, all volunteers underwent MRI of the TMJ with the jaw in closed position. Two readers independently assessed visibility and delineation of anatomical structures of the TMJ and overall image quality on a 5-point Likert scale. Quantitative and qualitative measurements were compared between field strengths. Results: The quantitative analysis showed similar SNR for the high-resolution protocol at 3.0 T compared with the clinical protocol at 1.5 T. The qualitative analysis showed significantly better visibility and delineation of clinically relevant anatomical structures of the TMJ, including the TMJ disc and pterygoid muscle as well as better overall image quality at 3.0 T than at 1.5 T. Conclusions: The presented results indicate that expected gains in SNR at 3.0 T can be used to increase the spatial resolution when imaging the TMJ, which translates into increased visibility and delineation of anatomical structures of the TMJ. Therefore, imaging at 3.0 T should be preferred over 1.5 T for imaging the TMJ. PMID:26371077

  5. Quantitative and qualitative comparison of MR imaging of the temporomandibular joint at 1.5 and 3.0 T using an optimized high-resolution protocol.

    PubMed

    Manoliu, Andrei; Spinner, Georg; Wyss, Michael; Erni, Stefan; Ettlin, Dominik A; Nanz, Daniel; Ulbrich, Erika J; Gallo, Luigi M; Andreisek, Gustav

    2016-01-01

    To quantitatively and qualitatively compare MRI of the temporomandibular joint (TMJ) using an optimized high-resolution protocol at 3.0 T and a clinical standard protocol at 1.5 T. A phantom and 12 asymptomatic volunteers were MR imaged using a 2-channel surface coil (standard TMJ coil) at 1.5 and 3.0 T (Philips Achieva and Philips Ingenia, respectively; Philips Healthcare, Best, Netherlands). Imaging protocol consisted of coronal and oblique sagittal proton density-weighted turbo spin echo sequences. For quantitative evaluation, a spherical phantom was imaged. Signal-to-noise ratio (SNR) maps were calculated on a voxelwise basis. For qualitative evaluation, all volunteers underwent MRI of the TMJ with the jaw in closed position. Two readers independently assessed visibility and delineation of anatomical structures of the TMJ and overall image quality on a 5-point Likert scale. Quantitative and qualitative measurements were compared between field strengths. The quantitative analysis showed similar SNR for the high-resolution protocol at 3.0 T compared with the clinical protocol at 1.5 T. The qualitative analysis showed significantly better visibility and delineation of clinically relevant anatomical structures of the TMJ, including the TMJ disc and pterygoid muscle as well as better overall image quality at 3.0 T than at 1.5 T. The presented results indicate that expected gains in SNR at 3.0 T can be used to increase the spatial resolution when imaging the TMJ, which translates into increased visibility and delineation of anatomical structures of the TMJ. Therefore, imaging at 3.0 T should be preferred over 1.5 T for imaging the TMJ.

  6. Radionuclide bone scan SPECT-CT: lowering the dose of CT significantly reduces radiation dose without impacting CT image quality

    PubMed Central

    Gupta, Sandeep Kumar; Trethewey, Scott; Brooker, Bree; Rutherford, Natalie; Diffey, Jenny; Viswanathan, Suresh; Attia, John

    2017-01-01

    The CT component of SPECT-CT is required for attenuation correction and anatomical localization of the uptake on SPECT, but there is no guideline about the optimal CT acquisition parameters. In our department, a standard CT acquisition protocol was changed in 2013 to give a lower radiation dose to the patient. In this study, we retrospectively compared the effects on patient dose as well as the CT image quality with current versus older CT protocols. Ninety-nine consecutive patients [n=51 standard dose ‘old’ protocol (SDP); n=48 lower dose ‘new’ protocol (LDP)] with lumbar spine SPECT-CT for bone scan were examined. The main differences between the two protocols were that the SDP used a 130 kVp tube voltage and a reference current-time product of 70 mAs, whereas the LDP used 110 kVp and 40 mAs, respectively. Various quantitative parameters from the CT images were obtained and the images were also rated blindly by two experienced nuclear medicine physicians for bony definition and noise. The mean calculated dose length product of the LDP group (121.5±39.6 mGy.cm) was significantly lower compared to the SDP group patients (266.9±96.9 mGy.cm; P<0.0001). This translated into a significant reduction in the mean effective dose to 1.8 mSv from 4.0 mSv. The physicians reported better CT image quality for the bony structures in the LDP group, although for soft tissue structures, the SDP group had better image quality. The optimized new CT acquisition protocol significantly reduced the radiation dose to the patient and in fact improved CT image quality for the assessment of bony structures. PMID:28533938
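
    The effective doses in this record scale from the dose-length product (DLP) by a body-region conversion coefficient, E ≈ k·DLP. Working backwards from the reported numbers (4.0 mSv from 266.9 mGy·cm) gives k ≈ 0.015 mSv per mGy·cm, which matches the commonly quoted adult abdomen/lumbar value; the study's exact methodology is not stated, so the coefficient below is an inference:

```python
# Effective dose is commonly estimated as E = k * DLP, with k a body-region
# conversion coefficient in mSv/(mGy.cm). The value below is inferred from the
# record's own numbers (4.0 mSv / 266.9 mGy.cm ~= 0.015); treat it as an
# assumption rather than the study's stated methodology.
K_LUMBAR = 0.015  # mSv per mGy.cm

for label, dlp in [("standard-dose protocol (SDP)", 266.9),
                   ("lower-dose protocol (LDP)", 121.5)]:
    print(f"{label}: DLP = {dlp:.1f} mGy.cm -> E ~= {K_LUMBAR * dlp:.1f} mSv")
```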

  7. Osteogenic differentiation of equine adipose tissue derived mesenchymal stem cells using CaCl2.

    PubMed

    Elashry, Mohamed I; Baulig, Nadine; Heimann, Manuela; Bernhardt, Caroline; Wenisch, Sabine; Arnhold, Stefan

    2018-04-01

    Adipose tissue derived mesenchymal stem cells (ASCs) may be used to cure bone defects after osteogenic differentiation. In this study we tried to optimize osteogenic differentiation for equine ASCs using various concentrations of CaCl2 in comparison to the standard osteogenic protocol. ASCs were isolated from subcutaneous adipose tissue from mixed breed horses. The osteogenic induction protocols were (1) the standard osteogenic medium (OM) composed of dexamethasone, ascorbic acid and β-glycerol phosphate; (2) a CaCl2-based protocol composed of 3, 5 and 7.5 mM CaCl2. Differentiation and proliferation were evaluated at 7, 10, 14 and 21 days post-differentiation induction using the alizarin red staining (ARS) detecting matrix calcification. Semi-quantification of cell protein content, ARS and alkaline phosphatase activity (ALP) were performed using an ELISA reader. Quantification of the transcription level for the common osteogenic markers alkaline phosphatase (ALP) and Osteopontin (OP) was performed using RT-qPCR. In the presence of CaCl2, a concentration dependent effect on the osteogenic differentiation capacity was evident by the ARS evaluation and OP gene expression. We provide evidence that 5 and 7 mM CaCl2 enhance the osteogenic differentiation compared to the OM protocol. Although there was a clear commitment of ASCs to the osteogenic fate in the presence of 5 and 7 mM CaCl2, cell proliferation was increased compared to OM. We report that an optimized CaCl2 protocol reliably influences ASCs osteogenesis while conserving the proliferation capacity. Thus, using these protocols provides a platform for using ASCs as a cell source in bone tissue engineering. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Optimization and validation of sample preparation for metagenomic sequencing of viruses in clinical samples.

    PubMed

    Lewandowska, Dagmara W; Zagordi, Osvaldo; Geissberger, Fabienne-Desirée; Kufner, Verena; Schmutz, Stefan; Böni, Jürg; Metzner, Karin J; Trkola, Alexandra; Huber, Michael

    2017-08-08

    Sequence-specific PCR is the most common approach for virus identification in diagnostic laboratories. However, as specific PCR only detects pre-defined targets, novel virus strains or viruses not included in routine test panels will be missed. Recently, advances in high-throughput sequencing have allowed for virus-sequence-independent identification of entire virus populations in clinical samples, yet standardized protocols are needed to allow broad application in clinical diagnostics. Here, we describe a comprehensive sample preparation protocol for high-throughput metagenomic virus sequencing using random amplification of total nucleic acids from clinical samples. In order to optimize metagenomic sequencing for application in virus diagnostics, we tested different enrichment and amplification procedures on plasma samples spiked with RNA and DNA viruses. A protocol including filtration, nuclease digestion, and random amplification of RNA and DNA in separate reactions provided the best results, allowing reliable recovery of viral genomes and a good correlation of the relative number of sequencing reads with the virus input. We further validated our method by sequencing a multiplexed viral pathogen reagent containing a range of human viruses from different virus families. Our method proved successful in detecting the majority of the included viruses with high read numbers and compared well to other protocols in the field validated against the same reference reagent. Our sequencing protocol works not only with plasma but also with other clinical samples such as urine and throat swabs. The workflow for virus metagenomic sequencing that we established proved successful in detecting a variety of viruses in different clinical samples. Our protocol supplements existing virus-specific detection strategies, providing opportunities to identify atypical and novel viruses commonly not accounted for in routine diagnostic panels.

  9. Rapid Bacterial Testing for Spacecraft Water

    NASA Technical Reports Server (NTRS)

    Lisle, John T.; Pyle, Barry H.; McFeters, Gordon A.

    1996-01-01

    Evaluations of the fluorogenic stains and probes will continue. E. coli O157:H7 will be used as the reference strain for optimizing protocols. We anticipate the continued use of the fluorescent antibodies (TRITC and FITC labeled) in conjunction with CTC, Rhl23, DiBAC4(3), DAPI and acridine orange. Chemunex, the manufacturer of the ChemScan analyzer system, also makes a fluorogenic probe, Chemchrome B, which will be incorporated into the suite of probes to evaluate once their system is on site. Regardless of the combination of stains and probes, all will be evaluated on membrane filters. Development of a FISH protocol that will be applicable to our conditions will be continued. Complementary 16S rRNA probes to Ps. aeruginosa currently in our laboratory will be evaluated first. Once this protocol has been adequately optimized, other probes will be ordered for a select number of other species. Currently, protocols to evaluate the effects of disinfection and the resulting lethality and injury on stain and/or probe specificity and reliability are being developed. E. coli O157:H7 is the reference strain, and chlorine the disinfectant, around which the reference protocol is being developed. Upon completion of this work, the resulting protocol will be extended to other species and disinfectants (e.g., iodine). Similar disinfectant experiments will then be conducted on the same species after starvation to evaluate the effects of starvation on disinfection resistance and the applicability of the stains and probes. Development of the immunomagnetic separation system will continue. Combined with the rapid methods described above, with enumeration by the ChemScan, we anticipate that this will provide a highly sensitive technique for the detection of specific, active bacteria.

  10. Model Based Optimal Control, Estimation, and Validation of Lithium-Ion Batteries

    NASA Astrophysics Data System (ADS)

    Perez, Hector Eduardo

    This dissertation focuses on developing and experimentally validating model based control techniques to enhance the operation of lithium ion batteries, safely. An overview of the contributions to address the challenges that arise are provided below. Chapter 1: This chapter provides an introduction to battery fundamentals, models, and control and estimation techniques. Additionally, it provides motivation for the contributions of this dissertation. Chapter 2: This chapter examines reference governor (RG) methods for satisfying state constraints in Li-ion batteries. Mathematically, these constraints are formulated from a first principles electrochemical model. Consequently, the constraints explicitly model specific degradation mechanisms, such as lithium plating, lithium depletion, and overheating. This contrasts with the present paradigm of limiting measured voltage, current, and/or temperature. The critical challenges, however, are that (i) the electrochemical states evolve according to a system of nonlinear partial differential equations, and (ii) the states are not physically measurable. Assuming available state and parameter estimates, this chapter develops RGs for electrochemical battery models. The results demonstrate how electrochemical model state information can be utilized to ensure safe operation, while simultaneously enhancing energy capacity, power, and charge speeds in Li-ion batteries. Chapter 3: Complex multi-partial differential equation (PDE) electrochemical battery models are characterized by parameters that are often difficult to measure or identify. This parametric uncertainty influences the state estimates of electrochemical model-based observers for applications such as state-of-charge (SOC) estimation. This chapter develops two sensitivity-based interval observers that map bounded parameter uncertainty to state estimation intervals, within the context of electrochemical PDE models and SOC estimation. Theoretically, this chapter extends the notion of interval observers to PDE models using a sensitivity-based approach. Practically, this chapter quantifies the sensitivity of battery state estimates to parameter variations, enabling robust battery management schemes. The effectiveness of the proposed sensitivity-based interval observers is verified via a numerical study for the range of uncertain parameters. Chapter 4: This chapter seeks to derive insight on battery charging control using electrochemistry models. Directly using full order complex multi-partial differential equation (PDE) electrochemical battery models is difficult and sometimes impossible to implement. This chapter develops an approach for obtaining optimal charge control schemes, while ensuring safety through constraint satisfaction. An optimal charge control problem is mathematically formulated via a coupled reduced order electrochemical-thermal model which conserves key electrochemical and thermal state information. The Legendre-Gauss-Radau (LGR) pseudo-spectral method with adaptive multi-mesh-interval collocation is employed to solve the resulting nonlinear multi-state optimal control problem. Minimum time charge protocols are analyzed in detail subject to solid and electrolyte phase concentration constraints, as well as temperature constraints. The optimization scheme is examined using different input current bounds, and an insight on battery design for fast charging is provided. 
Experimental results are provided to compare the tradeoffs between an electrochemical-thermal model based optimal charge protocol and a traditional charge protocol. Chapter 5: Fast and safe charging protocols are crucial for enhancing the practicality of batteries, especially for mobile applications such as smartphones and electric vehicles. This chapter proposes an innovative approach to devising optimally health-conscious fast-safe charge protocols. A multi-objective optimal control problem is mathematically formulated via a coupled electro-thermal-aging battery model, where electrical and aging sub-models depend upon the core temperature captured by a two-state thermal sub-model. The Legendre-Gauss-Radau (LGR) pseudo-spectral method with adaptive multi-mesh-interval collocation is employed to solve the resulting highly nonlinear six-state optimal control problem. Charge time and health degradation are therefore optimally traded off, subject to both electrical and thermal constraints. Minimum-time, minimum-aging, and balanced charge scenarios are examined in detail. Sensitivities to the upper voltage bound, ambient temperature, and cooling convection resistance are investigated as well. Experimental results are provided to compare the tradeoffs between a balanced and traditional charge protocol. Chapter 6: This chapter provides concluding remarks on the findings of this dissertation and a discussion of future work.
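
    The reference-governor idea from Chapter 2 (scale the requested input by the largest admissible factor for which a model prediction keeps the constrained output within its limit) can be illustrated on a deliberately crude surrogate. The first-order voltage model, limits and numbers below are assumptions for illustration only and stand in for the electrochemical model used in the dissertation:

```python
# Toy illustration of a reference-governor mechanism: at each step, scale the
# requested charge current by the largest factor in [0, 1] for which a model
# prediction keeps a constrained output under its limit.
def predict_voltage(v_now: float, current: float, steps: int, a=0.98, b=0.004) -> float:
    """Crude first-order surrogate for a terminal-voltage-like state under
    constant current (assumed dynamics, not an electrochemical model)."""
    v = v_now
    for _ in range(steps):
        v = a * v + (1 - a) * 3.4 + b * current
    return v

def governed_current(v_now: float, i_requested: float, v_max=4.2, horizon=50) -> float:
    """Bisect the scaling factor so the predicted voltage stays below v_max."""
    if predict_voltage(v_now, i_requested, horizon) <= v_max:
        return i_requested
    lo, hi = 0.0, 1.0
    for _ in range(30):
        mid = 0.5 * (lo + hi)
        if predict_voltage(v_now, mid * i_requested, horizon) <= v_max:
            lo = mid
        else:
            hi = mid
    return lo * i_requested

v, i_req = 3.6, 50.0   # hypothetical state (V) and requested charge current (A)
for step in range(5):
    i = governed_current(v, i_req)
    v = predict_voltage(v, i, 1)
    print(f"step {step}: applied current {i:5.1f} A, voltage {v:.3f} V")
```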

  11. Lead optimization in the nondrug-like space.

    PubMed

    Zhao, Hongyu

    2011-02-01

    Drug-like space might be more densely populated with orally available compounds than the remaining chemical space, but lead optimization can still occur outside this space. Oral drug space is more dynamic than the relatively static drug-like space. As new targets emerge and optimization tools advance the oral drug space might expand. Lead optimization protocols are becoming more complex with greater optimization needs to be satisfied, which consequently could change the role of drug-likeness in the process. Whereas drug-like space should usually be explored preferentially, it can be easier to find oral drugs for certain targets in the nondrug-like space. Copyright © 2010 Elsevier Ltd. All rights reserved.

  12. Entanglement distillation protocols and number theory

    NASA Astrophysics Data System (ADS)

    Bombin, H.; Martin-Delgado, M. A.

    2005-09-01

    We show that the analysis of entanglement distillation protocols for qudits of arbitrary dimension D benefits from applying basic concepts from number theory, since the set Z_D^n associated with Bell diagonal states is a module rather than a vector space. We find that a partition of Z_D^n into divisor classes characterizes the invariant properties of mixed Bell diagonal states under local permutations. We construct a very general class of recursion protocols by means of unitary operations implementing these local permutations. We study these distillation protocols depending on whether we use twirling operations in the intermediate steps or not, and we study them both analytically and numerically with Monte Carlo methods. In the absence of twirling operations, we construct extensions of the quantum privacy algorithms valid for secure communications with qudits of any dimension D. When D is a prime number, we show that distillation protocols are optimal both qualitatively and quantitatively.

  13. Finite-size analysis of continuous-variable measurement-device-independent quantum key distribution

    NASA Astrophysics Data System (ADS)

    Zhang, Xueying; Zhang, Yichen; Zhao, Yijia; Wang, Xiangyu; Yu, Song; Guo, Hong

    2017-10-01

    We study the impact of the finite-size effect on the continuous-variable measurement-device-independent quantum key distribution (CV-MDI QKD) protocol, mainly considering the finite-size effect on the parameter estimation procedure. The central-limit theorem and maximum-likelihood estimation are used to estimate the parameters. We also analyze the relationship between the number of exchanged signals and the optimal modulation variance in the protocol. It is proved that when Charlie's position is close to Bob's, the CV-MDI QKD protocol achieves the longest transmission distance in the finite-size scenario. Finally, we discuss the impact of finite-size effects related to the practical detection in the CV-MDI QKD protocol. The overall results indicate that the finite-size effect has a great influence on the secret-key rate of the CV-MDI QKD protocol and should not be ignored.
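
    The parameter-estimation step discussed above can be sketched with a toy calculation: model the effective channel as y = t*x + z with Gaussian noise, form maximum-likelihood estimates of t and the noise variance from the revealed samples, and widen them into worst-case bounds using the central-limit theorem. The channel model, sample size, and confidence factor below are illustrative assumptions, not the paper's exact CV-MDI procedure.

```python
# Toy finite-size parameter estimation for a Gaussian channel y = t*x + z (illustrative).
import numpy as np

rng = np.random.default_rng(1)
m, Vx = 10**5, 4.0                  # revealed samples and modulation variance (assumed)
t_true, s2_true = 0.8, 1.2          # "true" channel gain and noise variance (assumed)

x = rng.normal(0.0, np.sqrt(Vx), m)
y = t_true * x + rng.normal(0.0, np.sqrt(s2_true), m)

# Maximum-likelihood estimators for the linear Gaussian model
t_hat  = np.sum(x * y) / np.sum(x * x)
s2_hat = np.mean((y - t_hat * x) ** 2)

# CLT-based confidence half-widths (z ~ 6.5 corresponds to a very small failure probability)
z  = 6.5
dt = z * np.sqrt(s2_hat / (m * Vx))
ds = z * s2_hat * np.sqrt(2.0 / m)

# Pessimistic values used for the key-rate bound: lowest gain, highest noise
print(f"t >= {t_hat - dt:.4f}, noise variance <= {s2_hat + ds:.4f}")
```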

  14. Optimization of the scan protocols for CT-based material extraction in small animal PET/CT studies

    NASA Astrophysics Data System (ADS)

    Yang, Ching-Ching; Yu, Jhih-An; Yang, Bang-Hung; Wu, Tung-Hsin

    2013-12-01

    We investigated the effects of scan protocols on CT-based material extraction to minimize radiation dose while maintaining sufficient image information in small animal studies. The phantom simulation experiments were performed with the high dose (HD), medium dose (MD) and low dose (LD) protocols at 50, 70 and 80 kVp with varying mAs. The reconstructed CT images were segmented based on Hounsfield unit (HU)-physical density (ρ) calibration curves and the dual-energy CT-based (DECT) method. Compared to the (HU, ρ) method performed on CT images acquired with the 80 kVp HD protocol, a 2-fold improvement in segmentation accuracy and a 7.5-fold reduction in radiation dose were observed when the DECT method was performed on CT images acquired with the 50/80 kVp LD protocol, showing that it is possible to reduce radiation dose while achieving high segmentation accuracy.
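
    The dual-energy segmentation idea can be illustrated with a minimal sketch: each voxel is described by its HU value at the low and high tube voltages and is assigned to the reference material whose (HU_low, HU_high) pair is nearest. The reference values below are rough placeholders, not the calibrated numbers from this study.

```python
# Nearest-reference dual-energy (DECT) material classification (illustrative values only).
import numpy as np

materials = {                       # (HU at 50 kVp, HU at 80 kVp) -- placeholder pairs
    "air":         (-1000.0, -1000.0),
    "soft tissue": (60.0, 40.0),
    "bone":        (1400.0, 900.0),
}
names = list(materials)
refs = np.array([materials[n] for n in names])              # shape (num_materials, 2)

def segment(hu_low, hu_high):
    """Assign each voxel to the material with the closest (HU_low, HU_high) pair."""
    voxels = np.stack([hu_low.ravel(), hu_high.ravel()], axis=1)
    d2 = ((voxels[:, None, :] - refs[None, :, :]) ** 2).sum(axis=2)
    return np.array(names)[d2.argmin(axis=1)].reshape(hu_low.shape)

hu50 = np.array([[-995.0, 55.0], [1350.0, 70.0]])           # tiny synthetic 2x2 "image"
hu80 = np.array([[-998.0, 38.0], [880.0, 45.0]])
print(segment(hu50, hu80))
```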

  15. Bayesian deconvolution and quantification of metabolites in complex 1D NMR spectra using BATMAN.

    PubMed

    Hao, Jie; Liebeke, Manuel; Astle, William; De Iorio, Maria; Bundy, Jacob G; Ebbels, Timothy M D

    2014-01-01

    Data processing for 1D NMR spectra is a key bottleneck for metabolomic and other complex-mixture studies, particularly where quantitative data on individual metabolites are required. We present a protocol for automated metabolite deconvolution and quantification from complex NMR spectra by using the Bayesian automated metabolite analyzer for NMR (BATMAN) R package. BATMAN models resonances on the basis of a user-controllable set of templates, each of which specifies the chemical shifts, J-couplings and relative peak intensities for a single metabolite. Peaks are allowed to shift position slightly between spectra, and peak widths are allowed to vary by user-specified amounts. NMR signals not captured by the templates are modeled non-parametrically by using wavelets. The protocol covers setting up user template libraries, optimizing algorithmic input parameters, improving prior information on peak positions, quality control and evaluation of outputs. The outputs include relative concentration estimates for named metabolites together with associated Bayesian uncertainty estimates, as well as the fit of the remainder of the spectrum using wavelets. Graphical diagnostics allow the user to examine the quality of the fit for multiple spectra simultaneously. This approach offers a workflow to analyze large numbers of spectra and is expected to be useful in a wide range of metabolomics studies.

  16. A High Proliferation Rate is Critical for Reproducible and Standardized Embryoid Body Formation from Laminin-521-Based Human Pluripotent Stem Cell Cultures.

    PubMed

    Dziedzicka, Dominika; Markouli, Christina; Barbé, Lise; Spits, Claudia; Sermon, Karen; Geens, Mieke

    2016-12-01

    When aiming for homogeneous embryoid body (EB) differentiation, the use of equal-sized EBs is required to avoid a size-induced differentiation bias. In this study we developed an efficient and standardized EB formation protocol for human pluripotent stem cells (hPSC) cultured in a laminin-521-based xeno-free system. As the proliferation rate of the cells growing on laminin-521 strongly affected the efficiency of aggregate formation, we found that recently passaged cells, as well as the addition of ROCK inhibitor, were essential for reproducible EB formation from hPSC single-cell suspensions. EBs could be obtained in a variety of differentiation media, in 96-well round-bottom plates and in hanging drops. Gene expression studies on differentially sized EBs from three individual human embryonic stem cell lines demonstrated that the medium used for differentiation influenced the differentiation outcome to a much greater extent than the number of cells used for the initial EB formation. Our findings give new insight into factors that influence the EB formation and differentiation process. This optimized method allows us to easily manipulate EB formation and provides an excellent starting point for downstream EB-based differentiation protocols.

  17. Using procalcitonin-guided algorithms to improve antimicrobial therapy in ICU patients with respiratory infections and sepsis.

    PubMed

    Schuetz, Philipp; Raad, Issam; Amin, Devendra N

    2013-10-01

    In patients with systemic bacterial infections hospitalized in ICUs, the inflammatory biomarker procalcitonin (PCT) has been shown to aid diagnosis, antibiotic stewardship, and risk stratification. Our aim is to summarize recent evidence about the utility of PCT in the critical care setting and discuss the potential benefits and limitations of PCT when used for clinical decision-making. A growing body of evidence supports PCT use to differentiate bacterial from viral respiratory infections (including influenza), to help risk stratify patients, and to guide decisions about optimal duration of antibiotic therapy. Different PCT protocols were evaluated for these and similar purposes in randomized controlled trials in patients with varying severities of predominantly respiratory tract infection and sepsis. These trials demonstrated effectiveness of monitoring PCT to de-escalate antibiotic treatment earlier without increasing rates of relapsing infections or other adverse outcomes. Although serial PCT measurement has shown value in risk stratification of ICU patients, PCT-guided antibiotic escalation protocols have not yet shown benefit for patients. Inclusion of PCT data in clinical algorithms improves individualized decision-making regarding antibiotic treatment in patients in critical care for respiratory infections or sepsis. Future research should focus on use of repeated PCT measurements to risk-stratify patients and guide treatment to improve their outcomes.
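
    As a sketch of what a PCT-guided stopping rule looks like in algorithmic form, the function below combines the two criteria most often cited for ICU de-escalation: an absolute PCT below about 0.5 ug/L, or a drop of at least 80% from the peak value. These cutoffs are placeholders drawn from commonly published protocols, not a validated decision rule; any real use would have to follow the locally approved algorithm and clinical judgment.

```python
# Illustrative PCT-guided de-escalation check (placeholder thresholds, not medical advice).
def suggest_antibiotic_deescalation(pct_series_ug_per_l):
    """Return True if the PCT trajectory meets either commonly cited stopping criterion."""
    current = pct_series_ug_per_l[-1]
    peak = max(pct_series_ug_per_l)
    below_cutoff = current < 0.5                                # absolute threshold [ug/L]
    large_drop = peak > 0 and (peak - current) / peak >= 0.8    # >= 80% decline from peak
    return below_cutoff or large_drop

print(suggest_antibiotic_deescalation([8.4, 6.1, 2.9, 1.2]))  # ~86% drop from peak -> True
print(suggest_antibiotic_deescalation([0.9, 0.8, 0.7]))       # neither criterion met -> False
```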

  18. Remote Sensing Protocols for Parameterizing an Individual, Tree-Based, Forest Growth and Yield Model

    DTIC Science & Technology

    2014-09-01

    [Abstract not available in this record; only search-result fragments were captured. Recoverable information: report ERDC/CERL TR-14-18, "Base Facilities Environmental Quality: Remote Sensing Protocols for Parameterizing an Individual, Tree-Based, Forest Growth and Yield Model"; cited works include "Leaf-Off Tree Crowns in Small Footprint, High Sampling Density LIDAR Data from Eastern Deciduous Forests in North America" (Remote Sensing of...) and "Crown-Diameter Prediction Models for 87 Species of Stand-Grown Trees in the Eastern United States" (Southern Journal of Applied..., 2003).]

  19. A detailed description of the implementation of inpatient insulin orders with a commercial electronic health record system.

    PubMed

    Neinstein, Aaron; MacMaster, Heidemarie Windham; Sullivan, Mary M; Rushakoff, Robert

    2014-07-01

    In the setting of Meaningful Use laws and professional society guidelines, hospitals are rapidly implementing electronic glycemic management order sets. There are a number of best practices established in the literature for glycemic management protocols and programs. We believe that this is the first published account of the detailed steps to be taken to design, implement, and optimize glycemic management protocols in a commercial computerized provider order entry (CPOE) system. Prior to CPOE implementation, our hospital already had a mature glycemic management program. To transition to CPOE, we underwent the following 4 steps: (1) preparation and requirements gathering, (2) design and build, (3) implementation and dissemination, and (4) optimization. These steps required more than 2 years of coordinated work between physicians, nurses, pharmacists, and programmers. With the move to CPOE, our complex glycemic management order sets were successfully implemented without any significant interruptions in care. With feedback from users, we have continued to refine the order sets, and this remains an ongoing process. Successful implementation of glycemic management protocols in CPOE is dependent on broad stakeholder input and buy-in. When using a commercial CPOE system, there may be limitations of the system, necessitating workarounds. There should be an upfront plan to apply resources for continuous process improvement and optimization after implementation. © 2014 Diabetes Technology Society.

  20. Design of optimal hyperthermia protocols for prostate cancer by controlling HSP expression through computer modeling (Invited Paper)

    NASA Astrophysics Data System (ADS)

    Rylander, Marissa N.; Feng, Yusheng; Diller, Kenneth; Bass, J.

    2005-04-01

    Heat shock proteins (HSP) are critical components of a complex defense mechanism essential for preserving cell survival under adverse environmental conditions. It is inevitable that hyperthermia will enhance tumor tissue viability, due to HSP expression in regions where temperatures are insufficient to coagulate proteins, and would likely increase the probability of cancer recurrence. Although hyperthermia therapy is commonly used in conjunction with radiotherapy, chemotherapy, and gene therapy to increase therapeutic effectiveness, the efficacy of these therapies can be substantially hindered due to HSP expression when hyperthermia is applied prior to these procedures. Therefore, in planning hyperthermia protocols, prediction of the HSP response of the tumor must be incorporated into the treatment plan to optimize the thermal dose delivery and permit prediction of overall tissue response. In this paper, we present a highly accurate, adaptive, finite element tumor model capable of predicting the HSP expression distribution and tissue damage region based on measured cellular data when hyperthermia protocols are specified. Cubic spline representations of HSP27 and HSP70, and Arrhenius damage models were integrated into the finite element model to enable prediction of the HSP expression and damage distribution in the tissue following laser heating. Application of the model can enable optimized treatment planning by controlling of the tissue response to therapy based on accurate prediction of the HSP expression and cell damage distribution.
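
    The Arrhenius damage model mentioned above accumulates Omega(t) as the time integral of A*exp(-Ea/(R*T(t))), with Omega on the order of 1 usually taken as the threshold for significant thermal damage. The sketch below evaluates that integral for a sampled temperature history; the frequency factor and activation energy are classic literature values for protein coagulation, used purely for illustration rather than the tissue-specific constants of this study.

```python
# Arrhenius thermal-damage accumulation Omega = sum A * exp(-Ea / (R*T)) * dt (illustrative).
import numpy as np

A  = 3.1e98      # frequency factor [1/s] (commonly quoted literature value, placeholder here)
Ea = 6.28e5      # activation energy [J/mol] (placeholder)
R  = 8.314       # universal gas constant [J/(mol K)]

def arrhenius_damage(temps_celsius, dt):
    """Accumulate the damage integral over a uniformly sampled temperature history."""
    T = np.asarray(temps_celsius) + 273.15
    return float(np.sum(A * np.exp(-Ea / (R * T)) * dt))

print(arrhenius_damage(np.full(60, 43.0), dt=1.0))   # 60 s at 43 C: negligible damage
print(arrhenius_damage(np.full(10, 55.0), dt=1.0))   # 10 s at 55 C: much closer to Omega ~ 1
```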

  1. Protein structure modeling for CASP10 by multiple layers of global optimization.

    PubMed

    Joo, Keehyoung; Lee, Juyong; Sim, Sangjin; Lee, Sun Young; Lee, Kiho; Heo, Seungryong; Lee, In-Ho; Lee, Sung Jong; Lee, Jooyoung

    2014-02-01

    In the template-based modeling (TBM) category of the CASP10 experiment, we introduced a new protocol called the protein modeling system (PMS) to generate accurate protein structures in terms of side-chains as well as backbone trace. In the new protocol, a global optimization algorithm called conformational space annealing (CSA) is applied to the three layers of the TBM procedure: multiple sequence-structure alignment, 3D chain building, and side-chain re-modeling. For 3D chain building, we developed a new energy function that includes new distance restraint terms of Lorentzian type (derived from multiple templates) together with physical energy terms such as the dynamic fragment assembly (DFA) energy, the DFIRE statistical potential, and a hydrogen-bonding term. These physical energy terms are expected to guide the structure modeling, especially for loop regions where no template structures are available. In addition, we developed a new quality assessment method based on a random forest machine learning algorithm to screen templates, multiple alignments, and final models. For the TBM targets of CASP10, we find that, owing to the combination of three stages of CSA global optimization and quality assessment, the modeling accuracy of PMS improves at each additional stage of the protocol. It is especially noteworthy that the side-chains of the final PMS models are far more accurate than those of the models in the intermediate steps. Copyright © 2013 Wiley Periodicals, Inc.
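
    The abstract does not give the functional form of the Lorentzian-type restraints, but the general idea of a soft, bounded distance well centred on a template-derived distance can be sketched as follows. The shape and weights here are assumptions for illustration, not the PMS energy function.

```python
# One plausible Lorentzian-shaped distance restraint: zero at the target distance d0,
# saturating at `weight` for large violations so outliers are not over-penalized.
import numpy as np

def lorentzian_restraint(d, d0, width=1.0, weight=1.0):
    u = (d - d0) / width
    return weight * (1.0 - 1.0 / (1.0 + u * u))

d = np.linspace(0.0, 20.0, 5)                 # pairwise distances [Angstrom]
print(lorentzian_restraint(d, d0=8.0, width=2.0))
```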

  2. MRI mediated, non-invasive tracking of intratumoral distribution of nanocarriers in rat glioma

    NASA Astrophysics Data System (ADS)

    Karathanasis, Efstathios; Park, Jaekeun; Agarwal, Abhiruchi; Patel, Vijal; Zhao, Fuqiang; Annapragada, Ananth V.; Hu, Xiaoping; Bellamkonda, Ravi V.

    2008-08-01

    Nanocarrier mediated therapy of gliomas has shown promise. The success of systemic nanocarrier-based chemotherapy is critically dependent on the so-called leaky vasculature to permit drug extravasation across the blood-brain barrier. Yet, the extent of vascular permeability in individual tumors varies widely, resulting in a correspondingly wide range of responses to the therapy. However, no tools currently exist for rationally determining whether tumor blood vessels are amenable to nanocarrier mediated therapy in an individualized, patient-specific manner. To address this need for brain tumor therapy, we have developed a multifunctional 100 nm scale liposomal agent encapsulating a gadolinium-based contrast agent for contrast-enhanced magnetic resonance imaging with prolonged blood circulation. Using a 9.4 T MRI system, we were able to track the intratumoral distribution of the gadolinium-loaded nanocarrier in a rat glioma model for a period of three days due to improved magnetic properties of the contrast agent being packaged in a nanocarrier. Such a nanocarrier provides a tool for non-invasively assessing the suitability of tumors for nanocarrier mediated therapy and then optimizing the treatment protocol for each individual tumor. Additionally, the ability to image the tumor in high resolution can potentially constitute a surgical planning tool for tumor resection.

  3. MRI mediated, non-invasive tracking of intratumoral distribution of nanocarriers in rat glioma.

    PubMed

    Karathanasis, Efstathios; Park, Jaekeun; Agarwal, Abhiruchi; Patel, Vijal; Zhao, Fuqiang; Annapragada, Ananth V; Hu, Xiaoping; Bellamkonda, Ravi V

    2008-08-06

    Nanocarrier mediated therapy of gliomas has shown promise. The success of systemic nanocarrier-based chemotherapy is critically dependent on the so-called leaky vasculature to permit drug extravasation across the blood-brain barrier. Yet, the extent of vascular permeability in individual tumors varies widely, resulting in a correspondingly wide range of responses to the therapy. However, no tools currently exist for rationally determining whether tumor blood vessels are amenable to nanocarrier mediated therapy in an individualized, patient-specific manner. To address this need for brain tumor therapy, we have developed a multifunctional 100 nm scale liposomal agent encapsulating a gadolinium-based contrast agent for contrast-enhanced magnetic resonance imaging with prolonged blood circulation. Using a 9.4 T MRI system, we were able to track the intratumoral distribution of the gadolinium-loaded nanocarrier in a rat glioma model for a period of three days due to improved magnetic properties of the contrast agent being packaged in a nanocarrier. Such a nanocarrier provides a tool for non-invasively assessing the suitability of tumors for nanocarrier mediated therapy and then optimizing the treatment protocol for each individual tumor. Additionally, the ability to image the tumor in high resolution can potentially constitute a surgical planning tool for tumor resection.

  4. Development and Validation of Sandwich ELISA Microarrays with Minimal Assay Interference

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gonzalez, Rachel M.; Servoss, Shannon; Crowley, Sheila A.

    Sandwich enzyme-linked immunosorbent assay (ELISA) microarrays are emerging as a strong candidate platform for multiplex biomarker analysis because of the ELISA’s ability to quantitatively measure rare proteins in complex biological fluids. Advantages of this platform are high-throughput potential, assay sensitivity and stringency, and the similarity to the standard ELISA test, which facilitates assay transfer from a research setting to a clinical laboratory. However, a major concern with the multiplexing of ELISAs is maintaining high assay specificity. In this study, we systematically determine the amount of assay interference and noise contributed by individual components of the multiplexed 24-assay system. We find that non-specific reagent cross-reactivity problems are relatively rare. We did identify the presence of contaminant antigens in a “purified antigen”. We tested the validated ELISA microarray chip using paired serum samples that had been collected from four women at a 6-month interval. This analysis demonstrated that protein levels typically vary much more between individuals than within an individual over time, a result which suggests that longitudinal studies may be useful in controlling for biomarker variability across a population. Overall, this research demonstrates the importance of a stringent screening protocol and the value of optimizing the antibody and antigen concentrations when designing chips for ELISA microarrays.

  5. Acute effect of whole body vibration on postural control in congenitally blind subjects: a preliminary evidence.

    PubMed

    di Cagno, Alessandra; Giombini, Arrigo; Iuliano, Enzo; Moffa, Stefano; Caliandro, Tiziana; Parisi, Attilio; Borrione, Paolo; Calcagno, Giuseppe; Fiorilli, Giovanni

    2017-07-11

    The purpose of this study was to investigate the acute effects of whole body vibration at the optimal frequency on postural control in blind subjects. Twenty-four participants were recruited: 12 congenitally blind males (experimental group) and 12 non-disabled males with no visual impairment (control group). The area of the ellipse and the total distance of the center-of-pressure displacements, as postural control parameters, were evaluated at baseline (T0), immediately after the vibration (T1), after 10 min (T10), and after 20 min (T20). The whole body vibration protocol consisted of 5 sets of 1 min of vibration on a vibrating platform, with 1 min of rest between sets. The total distance of the center of pressure showed a significant difference (p < 0.05) between groups, while the area remained constant. No significant differences were detected among assessment times or in the group × time interaction. No impairments in static balance were found after an acute bout of whole body vibration at the optimal frequency in blind subjects; consequently, whole body vibration may be considered a safe application in individuals who are blind.

  6. Fighting Cancer with Mathematics and Viruses.

    PubMed

    Santiago, Daniel N; Heidbuechel, Johannes P W; Kandell, Wendy M; Walker, Rachel; Djeu, Julie; Engeland, Christine E; Abate-Daga, Daniel; Enderling, Heiko

    2017-08-23

    After decades of research, oncolytic virotherapy has recently advanced to clinical application, and currently a multitude of novel agents and combination treatments are being evaluated for cancer therapy. Oncolytic agents preferentially replicate in tumor cells, inducing tumor cell lysis and complex antitumor effects, such as innate and adaptive immune responses and the destruction of tumor vasculature. With the availability of different vector platforms and the potential of both genetic engineering and combination regimens to enhance particular aspects of safety and efficacy, the identification of optimal treatments for patient subpopulations or even individual patients becomes a top priority. Mathematical modeling can provide support in this arena by making use of experimental and clinical data to generate hypotheses about the mechanisms underlying complex biology and, ultimately, predict optimal treatment protocols. Increasingly complex models can be applied to account for therapeutically relevant parameters such as components of the immune system. In this review, we describe current developments in oncolytic virotherapy and mathematical modeling to discuss the benefit of integrating different modeling approaches into biological and clinical experimentation. Conclusively, we propose a mutual combination of these research fields to increase the value of the preclinical development and the therapeutic efficacy of the resulting treatments.

  7. Fighting Cancer with Mathematics and Viruses

    PubMed Central

    Santiago, Daniel N.; Heidbuechel, Johannes P. W.; Kandell, Wendy M.; Walker, Rachel; Djeu, Julie; Abate-Daga, Daniel; Enderling, Heiko

    2017-01-01

    After decades of research, oncolytic virotherapy has recently advanced to clinical application, and currently a multitude of novel agents and combination treatments are being evaluated for cancer therapy. Oncolytic agents preferentially replicate in tumor cells, inducing tumor cell lysis and complex antitumor effects, such as innate and adaptive immune responses and the destruction of tumor vasculature. With the availability of different vector platforms and the potential of both genetic engineering and combination regimens to enhance particular aspects of safety and efficacy, the identification of optimal treatments for patient subpopulations or even individual patients becomes a top priority. Mathematical modeling can provide support in this arena by making use of experimental and clinical data to generate hypotheses about the mechanisms underlying complex biology and, ultimately, predict optimal treatment protocols. Increasingly complex models can be applied to account for therapeutically relevant parameters such as components of the immune system. In this review, we describe current developments in oncolytic virotherapy and mathematical modeling to discuss the benefit of integrating different modeling approaches into biological and clinical experimentation. Conclusively, we propose a mutual combination of these research fields to increase the value of the preclinical development and the therapeutic efficacy of the resulting treatments. PMID:28832539
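
    To make the kind of modeling discussed above concrete, the sketch below integrates a generic three-compartment ODE model (uninfected tumor cells, infected cells, free virus) of the sort used throughout the oncolytic-virotherapy modeling literature. Both the structure and the parameter values are illustrative assumptions, not the authors' model.

```python
# Generic tumor / infected-cell / free-virus dynamics (illustrative parameters).
from scipy.integrate import solve_ivp

r, K  = 0.3, 1e9   # tumor growth rate [1/day], carrying capacity [cells]
beta  = 1e-9       # infection rate [1/(virion day)]
delta = 1.0        # infected-cell lysis rate [1/day]
burst = 50.0       # virions released per lysed cell
gamma = 2.0        # virus clearance rate [1/day]

def rhs(t, y):
    U, I, V = y
    dU = r * U * (1 - (U + I) / K) - beta * U * V
    dI = beta * U * V - delta * I
    dV = burst * delta * I - gamma * V
    return [dU, dI, dV]

sol = solve_ivp(rhs, (0.0, 60.0), [1e8, 0.0, 1e7], max_step=0.1)
U, I, V = sol.y[:, -1]
print(f"tumor burden after 60 days: {U + I:.3e} cells")
```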

  8. Optimization of proximity ligation assay (PLA) for detection of protein interactions and fusion proteins in non-adherent cells: application to pre-B lymphocytes.

    PubMed

    Debaize, Lydie; Jakobczyk, Hélène; Rio, Anne-Gaëlle; Gandemer, Virginie; Troadec, Marie-Bérengère

    2017-01-01

    Genetic abnormalities, including chromosomal translocations, are described for many hematological malignancies. From the clinical perspective, detection of chromosomal abnormalities is relevant not only for diagnostic and treatment purposes but also for prognostic risk assessment. From the translational research perspective, the identification of fusion proteins and protein interactions has allowed crucial breakthroughs in understanding the pathogenesis of malignancies and, consequently, major achievements in targeted therapy. We describe the optimization of the proximity ligation assay (PLA) to ascertain the presence of fusion proteins and protein interactions in non-adherent pre-B cells. PLA is an innovative method for detecting protein-protein colocalization that combines the advantages of microscopy with the precision of molecular biology, enabling detection of protein proximity theoretically ranging from 0 to 40 nm. We propose an optimized PLA procedure: we overcome the difficulty of retaining non-adherent hematological cells by combining traditional cytocentrifugation with optimized buffers, modified incubation times, and adjusted washing steps. Further, we provide convincing negative and positive controls and demonstrate that the optimized PLA procedure is sensitive to total protein level. The optimized procedure allows the detection of fusion proteins and protein interactions in non-adherent cells and can be readily applied to various non-adherent hematological cells, from cell lines to patients' cells. It enables detection of fusion proteins, their subcellular expression, and protein interactions, and therefore provides a new tool that can be adopted in a wide range of applications in the biological field.

  9. All-optical nanomechanical heat engine.

    PubMed

    Dechant, Andreas; Kiesel, Nikolai; Lutz, Eric

    2015-05-08

    We propose and theoretically investigate a nanomechanical heat engine. We show how a levitated nanoparticle in an optical trap inside a cavity can be used to realize a Stirling cycle in the underdamped regime. The all-optical approach enables fast and flexible control of all thermodynamical parameters and the efficient optimization of the performance of the engine. We develop a systematic optimization procedure to determine optimal driving protocols. Further, we perform numerical simulations with realistic parameters and evaluate the maximum power and the corresponding efficiency.

  10. All-Optical Nanomechanical Heat Engine

    NASA Astrophysics Data System (ADS)

    Dechant, Andreas; Kiesel, Nikolai; Lutz, Eric

    2015-05-01

    We propose and theoretically investigate a nanomechanical heat engine. We show how a levitated nanoparticle in an optical trap inside a cavity can be used to realize a Stirling cycle in the underdamped regime. The all-optical approach enables fast and flexible control of all thermodynamical parameters and the efficient optimization of the performance of the engine. We develop a systematic optimization procedure to determine optimal driving protocols. Further, we perform numerical simulations with realistic parameters and evaluate the maximum power and the corresponding efficiency.
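
    For orientation, the quasi-static textbook limit of such a Stirling cycle for a harmonically trapped particle is easy to write down: isothermal work scales as (kB*T/2)*ln(k_max/k_min), so the net work per cycle is (kB/2)*(Th - Tc)*ln(k_max/k_min). The numbers and the no-regeneration heat bookkeeping below are rough illustrative estimates, not the finite-time, underdamped optimization reported in the paper.

```python
# Back-of-the-envelope Stirling cycle for a particle in a harmonic trap (quasi-static limit).
import numpy as np

kB = 1.380649e-23           # Boltzmann constant [J/K]
Th, Tc = 1000.0, 300.0      # effective hot/cold bath temperatures [K] (illustrative)
k_min, k_max = 1e-6, 4e-6   # trap stiffness bounds [N/m] (illustrative)

W      = 0.5 * kB * (Th - Tc) * np.log(k_max / k_min)   # net work output per cycle
Q_iso  = 0.5 * kB * Th * np.log(k_max / k_min)          # heat absorbed along the hot isotherm
Q_heat = kB * (Th - Tc)                                 # heat to warm the particle at fixed stiffness
eta = W / (Q_iso + Q_heat)                              # efficiency without regeneration
print(f"work per cycle = {W:.2e} J, efficiency = {eta:.2f} (Carnot limit = {1 - Tc/Th:.2f})")
```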

  11. Reliable multicast protocol specifications protocol operations

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Montgomery, Todd; Whetten, Brian

    1995-01-01

    This appendix contains the complete state tables for Reliable Multicast Protocol (RMP) Normal Operation, Multi-RPC Extensions, Membership Change Extensions, and Reformation Extensions. First, the event types are presented; afterwards, each RMP operation state, normal and extended, is presented individually with its events. Events in the RMP specification fall into one of several categories: (1) arriving packets, (2) expired alarms, (3) user events, and (4) exceptional conditions.
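
    The event-driven structure described above can be pictured as a per-state dispatch table keyed by event class. The skeleton below is only an illustration of that structure; the states, transitions, and actions are placeholders and do not reproduce the actual RMP state tables.

```python
# Illustrative event-dispatch skeleton (placeholder transitions, not the RMP specification).
from enum import Enum, auto

class Event(Enum):
    PACKET_ARRIVED = auto()     # (1) arriving packets
    ALARM_EXPIRED  = auto()     # (2) expired alarms
    USER_REQUEST   = auto()     # (3) user events
    EXCEPTION      = auto()     # (4) exceptional conditions

class State(Enum):
    NORMAL_OPERATION  = auto()
    MEMBERSHIP_CHANGE = auto()
    REFORMATION       = auto()

TRANSITIONS = {   # state -> event -> (next_state, action name); entries are placeholders
    State.NORMAL_OPERATION: {
        Event.PACKET_ARRIVED: (State.NORMAL_OPERATION, "deliver_and_ack"),
        Event.USER_REQUEST:   (State.MEMBERSHIP_CHANGE, "start_membership_change"),
        Event.EXCEPTION:      (State.REFORMATION, "start_reformation"),
    },
    State.MEMBERSHIP_CHANGE: {
        Event.PACKET_ARRIVED: (State.NORMAL_OPERATION, "commit_new_membership"),
        Event.ALARM_EXPIRED:  (State.REFORMATION, "start_reformation"),
    },
}

def step(state, event):
    next_state, action = TRANSITIONS.get(state, {}).get(event, (state, "ignore"))
    print(f"{state.name} + {event.name} -> {next_state.name} ({action})")
    return next_state

s = State.NORMAL_OPERATION
s = step(s, Event.PACKET_ARRIVED)
s = step(s, Event.USER_REQUEST)
```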

  12. The Effect of Participating in Suicide Research: Does Participating in a Research Protocol on Suicide and Psychiatric Symptoms Increase Suicide Ideation and Attempts?

    ERIC Educational Resources Information Center

    Smith, Phillip; Poindexter, Erin; Cukrowicz, Kelly

    2010-01-01

    The effect of engaging in an intensive research protocol that inquired extensively about psychiatric and suicide symptoms and exposed participants to a number of images, including suicide-related content was explored. Individuals experiencing a major depressive episode were called at 1 and 3 months after the initial protocol. Participants were…

  13. Program to study optimal protocol for cardiovascular and muscular efficiency. [physical fitness training for manned space flight

    NASA Technical Reports Server (NTRS)

    Olree, H. D.

    1974-01-01

    Training programs necessary for the development of optimal strength during prolonged manned space flight were examined, and exercises performed on the Super Mini Gym Skylab 2 were compared with similar exercises on the Universal Gym and with calisthenics. Cardiopulmonary gains were found to be negligible, but all training groups exhibited good gains in strength.

  14. High-dimensional quantum cloning and applications to quantum hacking

    PubMed Central

    Bouchard, Frédéric; Fickler, Robert; Boyd, Robert W.; Karimi, Ebrahim

    2017-01-01

    Attempts at cloning a quantum system result in the introduction of imperfections in the state of the copies. This is a consequence of the no-cloning theorem, which is a fundamental law of quantum physics and the backbone of security for quantum communications. Although perfect copies are prohibited, a quantum state may be copied with maximal accuracy via various optimal cloning schemes. Optimal quantum cloning, which lies at the border of the physical limit imposed by the no-signaling theorem and the Heisenberg uncertainty principle, has been experimentally realized for low-dimensional photonic states. However, an increase in the dimensionality of quantum systems is greatly beneficial to quantum computation and communication protocols. Nonetheless, no experimental demonstration of optimal cloning machines has hitherto been shown for high-dimensional quantum systems. We perform optimal cloning of high-dimensional photonic states by means of the symmetrization method. We show the universality of our technique by conducting cloning of numerous arbitrary input states and fully characterize our cloning machine by performing quantum state tomography on cloned photons. In addition, a cloning attack on a Bennett and Brassard (BB84) quantum key distribution protocol is experimentally demonstrated to reveal the robustness of high-dimensional states in quantum cryptography. PMID:28168219

  15. High-dimensional quantum cloning and applications to quantum hacking.

    PubMed

    Bouchard, Frédéric; Fickler, Robert; Boyd, Robert W; Karimi, Ebrahim

    2017-02-01

    Attempts at cloning a quantum system result in the introduction of imperfections in the state of the copies. This is a consequence of the no-cloning theorem, which is a fundamental law of quantum physics and the backbone of security for quantum communications. Although perfect copies are prohibited, a quantum state may be copied with maximal accuracy via various optimal cloning schemes. Optimal quantum cloning, which lies at the border of the physical limit imposed by the no-signaling theorem and the Heisenberg uncertainty principle, has been experimentally realized for low-dimensional photonic states. However, an increase in the dimensionality of quantum systems is greatly beneficial to quantum computation and communication protocols. Nonetheless, no experimental demonstration of optimal cloning machines has hitherto been shown for high-dimensional quantum systems. We perform optimal cloning of high-dimensional photonic states by means of the symmetrization method. We show the universality of our technique by conducting cloning of numerous arbitrary input states and fully characterize our cloning machine by performing quantum state tomography on cloned photons. In addition, a cloning attack on a Bennett and Brassard (BB84) quantum key distribution protocol is experimentally demonstrated to reveal the robustness of high-dimensional states in quantum cryptography.
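
    A quick numerical note on why dimensionality matters here: the fidelity of optimal universal symmetric 1-to-2 cloning of a d-dimensional state is F = 1/2 + 1/(d+1), i.e. 5/6 for qubits and decreasing toward 1/2 as d grows, so an eavesdropper's clones become progressively less faithful in higher dimensions. The snippet below simply evaluates that standard formula for context; it is not a re-derivation of the experiment.

```python
# Optimal universal symmetric 1->2 cloning fidelity versus dimension (standard formula).
def optimal_cloning_fidelity(d):
    return 0.5 + 1.0 / (d + 1)

for d in (2, 3, 7, 16):
    print(f"d = {d:2d}: F_opt = {optimal_cloning_fidelity(d):.3f}")
```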

  16. Optimization of ultrasound-assisted extraction of charantin from Momordica charantia fruits using response surface methodology.

    PubMed

    Ahamad, Javed; Amin, Saima; Mir, Showkat R

    2015-01-01

    Momordica charantia Linn. (Cucurbitaceae) fruits are well known for their beneficial effects in diabetes, which are often attributed to the bioactive component charantin. The aim of the present study is to develop and optimize an efficient protocol for the extraction of charantin from M. charantia fruits. Response surface methodology (RSM) was used to optimize the ultrasound-assisted extraction (UAE) conditions. RSM was based on a three-level, three-variable Box-Behnken design (BBD), and the studied variables included the solid-to-solvent ratio, extraction temperature, and extraction time. The optimal conditions predicted by the BBD were UAE with methanol:water (80:20, v/v) at 46°C for 120 min with a solid-to-solvent ratio of 1:26 w/v, under which the yield of charantin was 3.18 mg/g. Confirmation trials under slightly adjusted conditions yielded 3.12 ± 0.14 mg/g of charantin on a dry-weight basis of fruits. UAE was also compared with the Soxhlet extraction method and was found to be 2.74-fold more efficient at extracting charantin. A facile UAE protocol for a high extraction yield of charantin was developed and validated.
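
    For readers unfamiliar with the design, the sketch below builds the standard three-factor Box-Behnken run matrix (coded levels for the solid-to-solvent ratio, temperature, and time) and fits a second-order response surface by least squares. The yield values in the example are synthetic placeholders so the code runs; they are not the study's measurements.

```python
# Three-factor Box-Behnken design and quadratic response-surface fit (synthetic yields).
import numpy as np
from itertools import combinations

def box_behnken_3(center_runs=3):
    runs = []
    for i, j in combinations(range(3), 2):      # +/-1 on two factors, third held at 0
        for a in (-1, 1):
            for b in (-1, 1):
                x = [0, 0, 0]
                x[i], x[j] = a, b
                runs.append(x)
    runs += [[0, 0, 0]] * center_runs           # replicated center points
    return np.array(runs, dtype=float)

def quadratic_design_matrix(X):
    cols = [np.ones(len(X))] + [X[:, i] for i in range(3)]
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(3), 2)]
    cols += [X[:, i] ** 2 for i in range(3)]
    return np.column_stack(cols)

X = box_behnken_3()                                       # 15 runs, coded levels
y = np.random.default_rng(0).normal(3.0, 0.1, len(X))     # placeholder yields [mg/g]
beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
print("fitted quadratic coefficients:", np.round(beta, 3))
```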

  17. Performance analysis of optimal power allocation in wireless cooperative communication systems

    NASA Astrophysics Data System (ADS)

    Babikir Adam, Edriss E.; Samb, Doudou; Yu, Li

    2013-03-01

    Cooperative communication has recently been proposed for wireless communication systems to exploit the inherent spatial diversity of relay channels. Amplify-and-forward (AF) cooperation protocols with multiple relays have not been sufficiently investigated, even though they have low implementation complexity. In this work we consider a cooperative diversity system in which a source transmits information to a destination with the help of multiple AF relay nodes, and we investigate the optimal allocation of power at the source and the relays by efficiently optimizing the symbol error rate (SER) performance. First, we derive a closed-form SER formulation for MPSK signals using the moment generating function and statistical approximations valid at high signal-to-noise ratio (SNR) for the system under study. We then find a tight corresponding lower bound that converges to the same limit as the theoretical upper bound, and we develop an optimal power allocation (OPA) technique based on mean channel gains to minimize the SER. Simulation results show that our scheme outperforms the equal power allocation (EPA) scheme and is tight to the theoretical approximation based on the SER upper bound at high SNR for different numbers of relays.

  18. A Survey on Multimedia-Based Cross-Layer Optimization in Visual Sensor Networks

    PubMed Central

    Costa, Daniel G.; Guedes, Luiz Affonso

    2011-01-01

    Visual sensor networks (VSNs) comprised of battery-operated electronic devices endowed with low-resolution cameras have expanded the applicability of a series of monitoring applications. Those types of sensors are interconnected by ad hoc error-prone wireless links, imposing stringent restrictions on available bandwidth, end-to-end delay and packet error rates. In such context, multimedia coding is required for data compression and error-resilience, also ensuring energy preservation over the path(s) toward the sink and improving the end-to-end perceptual quality of the received media. Cross-layer optimization may enhance the expected efficiency of VSNs applications, disrupting the conventional information flow of the protocol layers. When the inner characteristics of the multimedia coding techniques are exploited by cross-layer protocols and architectures, higher efficiency may be obtained in visual sensor networks. This paper surveys recent research on multimedia-based cross-layer optimization, presenting the proposed strategies and mechanisms for transmission rate adjustment, congestion control, multipath selection, energy preservation and error recovery. We note that many multimedia-based cross-layer optimization solutions have been proposed in recent years, each one bringing a wealth of contributions to visual sensor networks. PMID:22163908

  19. Optimization of Protein Extraction and Two-Dimensional Electrophoresis Protocols for Oil Palm Leaf.

    PubMed

    Daim, Leona Daniela Jeffery; Ooi, Tony Eng Keong; Yusof, Hirzun Mohd; Majid, Nazia Abdul; Karsani, Saiful Anuar Bin

    2015-08-01

    Oil palm (Elaeis guineensis) is an important economic crop cultivated for its nutritional palm oil. A significant amount of effort has been undertaken to understand oil palm growth and physiology at the molecular level, particularly in genomics and transcriptomics. Recently, proteomics studies have begun to garner interest; however, this effort is impeded by technical challenges, as plant sample preparation for proteomics analysis is complicated by the presence of polysaccharides, secondary metabolites and other interfering compounds. Although protein extraction methods for plant tissues exist, none work universally on all sample types. Therefore, this study aims to compare and optimize different protein extraction protocols for use with two-dimensional gel electrophoresis of young and mature leaves from the oil palm. Four protein extraction methods were evaluated: phenol-guanidine isothiocyanate, trichloroacetic acid-acetone precipitation, sucrose and trichloroacetic acid-acetone-phenol. Of these four protocols, the trichloroacetic acid-acetone-phenol method was found to give the highest resolution and most reproducible gel. The results from this study can be used in sample preparations of oil palm tissue for proteomics work.

  20. Synthesis of cis-C-Iodo-N-Tosyl-Aziridines using Diiodomethyllithium: Reaction Optimization, Product Scope and Stability, and a Protocol for Selection of Stationary Phase for Chromatography

    PubMed Central

    2013-01-01

    The preparation of C-iodo-N-Ts-aziridines with excellent cis-diastereoselectivity has been achieved in high yields by the addition of diiodomethyllithium to N-tosylimines and N-tosylimine–HSO2Tol adducts. This addition-cyclization protocol successfully provided a wide range of cis-iodoaziridines, including the first examples of alkyl-substituted iodoaziridines, with the reaction tolerating both aryl imines and alkyl imines. An ortho-chlorophenyl imine afforded a β-amino gem-diiodide under the optimized reaction conditions due to a postulated coordinated intermediate preventing cyclization. An effective protocol to assess the stability of the sensitive iodoaziridine functional group to chromatography was also developed. As a result of the judicious choice of stationary phase, the iodoaziridines could be purified by column chromatography; the use of deactivated basic alumina (activity IV) afforded high yield and purity. Rearrangements of electron-rich aryl-iodoaziridines have been promoted, selectively affording either novel α-iodo-N-Ts-imines or α-iodo-aldehydes in high yield. PMID:23738857
