Science.gov

Sample records for aapm tg-61 protocol

  1. Compliance with AAPM Practice Guideline 1.a: CT Protocol Management and Review - from the perspective of a university hospital.

    PubMed

    Szczykutowicz, Timothy P; Bour, Robert K; Pozniak, Myron; Ranallo, Frank N

    2015-01-01

    The purpose of this paper is to describe our experience with the AAPM Medical Physics Practice Guideline 1.a: "CT Protocol Management and Review Practice Guideline". Specifically, we will share how our institution's quality management system addresses the suggestions within the AAPM practice report. We feel this paper is needed as it was beyond the scope of the AAPM practice guideline to provide specific details on fulfilling individual guidelines. Our hope is that other institutions will be able to emulate some of our practices and that this article would encourage other types of centers (e.g., community hospitals) to share their methodology for approaching CT protocol optimization and quality control. Our institution had a functioning CT protocol optimization process, albeit informal, since we began using CT. Recently, we made our protocol development and validation process compliant with a number of the ISO 9001:2008 clauses and this required us to formalize the roles of the members of our CT protocol optimization team. We rely heavily on PACS-based IT solutions for acquiring radiologist feedback on the performance of our CT protocols and the performance of our CT scanners in terms of dose (scanner output) and the function of the automatic tube current modulation. Specific details on our quality management system covering both quality control and ongoing optimization have been provided. The roles of each CT protocol team member have been defined, and the critical role that IT solutions provides for the management of files and the monitoring of CT protocols has been reviewed. In addition, the invaluable role management provides by being a champion for the project has been explained; lack of a project champion will mitigate the efforts of a CT protocol optimization team. Meeting the guidelines set forth in the AAPM practice guideline was not inherently difficult, but did, in our case, require the cooperation of radiologists, technologists, physicists, IT

  2. Addendum to the AAPM's TG-51 protocol for clinical reference dosimetry of high-energy photon beams

    SciTech Connect

    McEwen, Malcolm; DeWerd, Larry; Ibbott, Geoffrey; Followill, David; Rogers, David W. O.; Seltzer, Stephen; Seuntjens, Jan

    2014-04-15

    An addendum to the AAPM's TG-51 protocol for the determination of absorbed dose to water in megavoltage photon beams is presented. This addendum continues the procedure laid out in TG-51, but new k_Q data for photon beams, based on Monte Carlo simulations, are presented and recommendations are given to improve the accuracy and consistency of the protocol's implementation. The components of the uncertainty budget in determining absorbed dose to water at the reference point are introduced and the magnitude of each component discussed. Finally, the consistency of experimental determination of N_D,w coefficients is discussed. It is expected that the implementation of this addendum will be straightforward, assuming that the user is already familiar with TG-51. The changes introduced by this report are generally minor, although new recommendations could result in procedural changes for individual users. It is expected that the effort on the medical physicist's part to implement this addendum will not be significant and could be done as part of the annual linac calibration.
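The uncertainty-budget bookkeeping described above amounts to a quadrature sum of independent standard uncertainties. A minimal sketch, with made-up component values rather than the addendum's actual budget:

```python
import math

# Hypothetical uncertainty components (percent, k=1) for an absorbed-dose
# determination; the addendum's real budget uses its own components and values.
components = {
    "N_D,w calibration": 0.4,
    "k_Q from Monte Carlo": 0.4,
    "charge measurement": 0.2,
    "depth/SSD setup": 0.3,
}

# Combined standard uncertainty: quadrature sum of independent components.
u_combined = math.sqrt(sum(u**2 for u in components.values()))
# Expanded uncertainty at roughly 95% confidence (coverage factor k=2).
u_expanded = 2 * u_combined
print(f"combined: {u_combined:.2f}%  expanded (k=2): {u_expanded:.2f}%")
```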

  3. Verification of TG-61 dose for synchrotron-produced monochromatic x-ray beams using fluence-normalized MCNP5 calculations

    SciTech Connect

    Brown, Thomas A. D.; Hogstrom, Kenneth R.; Alvarez, Diane; Matthews, Kenneth L. II; Ham, Kyungmin

    2012-12-15

    Purpose: Ion chamber dosimetry is being used to calibrate dose for cell irradiations designed to investigate photoactivated Auger electron therapy at the Louisiana State University Center for Advanced Microstructures and Devices (CAMD) synchrotron facility. This study performed a dosimetry intercomparison for synchrotron-produced monochromatic x-ray beams at 25 and 35 keV. Ion chamber depth-dose measurements in a polymethylmethacrylate (PMMA) phantom were compared with the product of MCNP5 Monte Carlo calculations of dose per fluence and measured incident fluence. Methods: Monochromatic beams of 25 and 35 keV were generated on the tomography beamline at CAMD. A cylindrical, air-equivalent ion chamber was used to measure the ionization created in a 10 × 10 × 10-cm³ PMMA phantom for depths from 0.6 to 7.7 cm. The American Association of Physicists in Medicine TG-61 protocol was applied to convert measured ionization into dose. Photon fluence was determined using a NaI detector to make scattering measurements of the beam from a thin polyethylene target at angles of 30°-60°. Differential Compton and Rayleigh scattering cross sections obtained from xraylib, an ANSI C library for x-ray-matter interactions, were applied to derive the incident fluence. MCNP5 simulations of the irradiation geometry provided the dose deposition per photon fluence as a function of depth in the phantom. Results: At 25 keV the fluence-normalized MCNP5 dose overestimated the ion-chamber measured dose by 7.2 ± 3.0% to 2.1 ± 3.0% for PMMA depths from 0.6 to 7.7 cm, respectively. At 35 keV the fluence-normalized MCNP5 dose underestimated the ion-chamber measured dose by 1.0 ± 3.4% to 2.5 ± 3.4% over the same depth range.
    Conclusions: These results showed that TG-61 ion chamber dosimetry, used to calibrate dose output for cell irradiations, agreed with fluence-normalized MCNP5 calculations to within approximately 7%.
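The intercomparison above multiplies a Monte Carlo dose-per-fluence by a measured incident fluence and compares the result against the TG-61 ion-chamber dose. A minimal sketch; the function names and numeric values are illustrative assumptions, not data from the paper:

```python
def fluence_normalized_dose(dose_per_fluence_cgy_cm2, fluence_per_cm2):
    """Dose (cGy) = Monte Carlo dose per unit fluence x incident photon fluence."""
    return dose_per_fluence_cgy_cm2 * fluence_per_cm2

def percent_difference(mc_dose, chamber_dose):
    """Signed percent difference of the MC-based dose relative to measurement."""
    return 100.0 * (mc_dose - chamber_dose) / chamber_dose

# Made-up values: 2.0e-10 cGy cm^2 per photon, 5.0e9 photons/cm^2 -> 1.0 cGy.
mc_dose = fluence_normalized_dose(2.0e-10, 5.0e9)
print(percent_difference(mc_dose, 0.95))  # MC roughly 5.3% above the chamber
```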

  4. Medical Physicists and AAPM

    NASA Astrophysics Data System (ADS)

    Amols, Howard

    2006-03-01

    The American Association of Physicists in Medicine (AAPM), a member society of the AIP, is the largest professional society of medical physicists in the world, with nearly 5700 members. Members operate in medical centers, university and community hospitals, research laboratories, industry, and private practice. Medical physics specialties include radiation therapy physics, medical diagnostic and imaging physics, nuclear medicine physics, and medical radiation safety. The majority of AAPM members are based in hospital departments of radiation oncology or radiology and provide technical support for patient diagnosis and treatment in a clinical environment. Job functions include support of clinical care, calibration and quality assurance of medical devices such as linear accelerators for cancer therapy, CT, PET, MRI, and other diagnostic imaging devices, research, and teaching. Pathways into a career in medical physics require an advanced degree in medical physics, physics, engineering, or a closely related field, plus clinical training in one or more medical physics specialties (radiation therapy physics, imaging physics, or radiation safety). Most clinically based medical physicists also obtain certification from the American Board of Radiology, and some states require licensure as well.

  5. Application of AAPM TG 119 to volumetric arc therapy (VMAT).

    PubMed

    Mynampati, Dinesh Kumar; Yaparpalvi, Ravindra; Hong, Linda; Kuo, Hsiang-Chi; Mah, Dennis

    2012-09-06

    The purpose of this study was to create AAPM TG 119 benchmark plans for volumetric arc therapy (VMAT) and to compare VMAT plans with IMRT plan data. AAPM TG 119 proposes a set of clinical test cases for testing the accuracy of an IMRT planning and delivery system. For these test cases, we generated two treatment plans, the first plan using 7-9 static dMLC IMRT fields and the second utilizing a one- or two-arc VMAT technique. Dose optimization and calculations were performed using 6 MV photons and the Eclipse treatment planning system. Dose prescription and planning objectives were set according to the TG 119 goals. Plans were scored based on TG 119 planning objectives. Treatment plans were compared using the conformity index (CI) for the reference dose and the homogeneity index (HI) (for D(5)-D(95)). For the prostate, head-and-neck, C-shape, and multitarget test cases, the prescription doses are 75.6 Gy, 50.4 Gy, 50 Gy, and 50 Gy, respectively. VMAT dose distributions were comparable to dMLC IMRT plans. Our planning results matched TG 119 planning results. For the treatment plans studied, conformity indices ranged from 1.05-1.23 (IMRT) and 1.04-1.23 (VMAT). Homogeneity indices ranged from 4.6%-11.0% (IMRT) and 4.6%-10.5% (VMAT). The ratio of total monitor units necessary for dMLC IMRT to that of VMAT was in the range of 1.1-2.0. AAPM TG 119 test cases are useful for generating VMAT benchmark plans. At the preclinical implementation stage, comparison of VMAT and IMRT plans for the AAPM TG 119 test cases allowed us to understand the basic capabilities of the VMAT technique.
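The two plan-comparison metrics above can be sketched in a few lines. The dose values are synthetic and the exact CI/HI conventions shown are common simplifications, not necessarily the study's definitions; a real evaluation would use DVH data exported from the treatment planning system:

```python
import numpy as np

def conformity_index(dose, target_mask, prescription):
    """Simplified CI: voxels receiving the prescription dose / target voxels."""
    v_ref = np.count_nonzero(dose >= prescription)
    return v_ref / np.count_nonzero(target_mask)

def homogeneity_index(target_dose, prescription):
    """HI as percent: (D5 - D95) / prescription dose."""
    d5 = np.percentile(target_dose, 95)   # dose to the hottest 5% of the volume
    d95 = np.percentile(target_dose, 5)   # dose covering 95% of the volume
    return 100.0 * (d5 - d95) / prescription

rng = np.random.default_rng(0)
dose = rng.normal(51.0, 1.0, 10_000)      # synthetic target voxel doses (Gy)
mask = np.ones_like(dose, dtype=bool)
print(conformity_index(dose, mask, 50.0), homogeneity_index(dose, 50.0))
```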

  6. Monitor unit calculations for external photon and electron beams: Report of the AAPM Therapy Physics Committee Task Group No. 71.

    PubMed

    Gibbons, John P; Antolak, John A; Followill, David S; Huq, M Saiful; Klein, Eric E; Lam, Kwok L; Palta, Jatinder R; Roback, Donald M; Reid, Mark; Khan, Faiz M

    2014-03-01

    A protocol is presented for the calculation of monitor units (MU) for photon and electron beams, delivered with and without beam modifiers, for constant source-surface distance (SSD) and source-axis distance (SAD) setups. This protocol was written by Task Group 71 of the Therapy Physics Committee of the American Association of Physicists in Medicine (AAPM) and has been formally approved by the AAPM for clinical use. The protocol defines the nomenclature for the dosimetric quantities used in these calculations, along with instructions for their determination and measurement. Calculations are made using the dose per MU under normalization conditions, D'0, that is determined for each user's photon and electron beams. For electron beams, the depth of normalization is taken to be the depth of maximum dose along the central axis for the same field incident on a water phantom at the same SSD, where D'0 = 1 cGy/MU. For photon beams, this task group recommends that a normalization depth of 10 cm be selected, where an energy-dependent D'0 ≤ 1 cGy/MU is required. This recommendation differs from the more common approach of a normalization depth of dm, with D'0 = 1 cGy/MU, although both systems are acceptable within the current protocol. For photon beams, the formalism includes the use of blocked fields, physical or dynamic wedges, and (static) multileaf collimation. No formalism is provided for intensity modulated radiation therapy calculations, although some general considerations and a review of current calculation techniques are included. For electron beams, the formalism provides for calculations at the standard and extended SSDs using either an effective SSD or an air-gap correction factor. Example tables and problems are included to illustrate the basic concepts within the presented formalism.
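The core of such a formalism is a product of dosimetric factors dividing the prescribed dose. A rough sketch of an SAD-type photon calculation, with factor names that follow the TG-71 nomenclature only loosely and purely illustrative values:

```python
def monitor_units(dose_cgy, d0_prime, sc, sp, tpr, wf=1.0, off_axis=1.0):
    """MU = prescribed dose / (dose per MU under normalization conditions
    x collimator scatter Sc x phantom scatter Sp x tissue-phantom ratio
    x wedge factor x off-axis ratio). All factors here are made-up examples."""
    return dose_cgy / (d0_prime * sc * sp * tpr * wf * off_axis)

# 200 cGy prescribed; illustrative factors for a reference-like field at depth.
mu = monitor_units(200.0, d0_prime=1.0, sc=1.0, sp=1.0, tpr=0.85)
print(round(mu, 1))  # 235.3
```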

  7. Monitor unit calculations for external photon and electron beams: Report of the AAPM Therapy Physics Committee Task Group No. 71

    SciTech Connect

    Gibbons, John P.; Antolak, John A.; Followill, David S.; Huq, M. Saiful; Klein, Eric E.; Lam, Kwok L.; Palta, Jatinder R.; Roback, Donald M.; Reid, Mark; Khan, Faiz M.

    2014-03-15

    A protocol is presented for the calculation of monitor units (MU) for photon and electron beams, delivered with and without beam modifiers, for constant source-surface distance (SSD) and source-axis distance (SAD) setups. This protocol was written by Task Group 71 of the Therapy Physics Committee of the American Association of Physicists in Medicine (AAPM) and has been formally approved by the AAPM for clinical use. The protocol defines the nomenclature for the dosimetric quantities used in these calculations, along with instructions for their determination and measurement. Calculations are made using the dose per MU under normalization conditions, D'0, that is determined for each user's photon and electron beams. For electron beams, the depth of normalization is taken to be the depth of maximum dose along the central axis for the same field incident on a water phantom at the same SSD, where D'0 = 1 cGy/MU. For photon beams, this task group recommends that a normalization depth of 10 cm be selected, where an energy-dependent D'0 ≤ 1 cGy/MU is required. This recommendation differs from the more common approach of a normalization depth of dm, with D'0 = 1 cGy/MU, although both systems are acceptable within the current protocol. For photon beams, the formalism includes the use of blocked fields, physical or dynamic wedges, and (static) multileaf collimation. No formalism is provided for intensity modulated radiation therapy calculations, although some general considerations and a review of current calculation techniques are included. For electron beams, the formalism provides for calculations at the standard and extended SSDs using either an effective SSD or an air-gap correction factor. Example tables and problems are included to illustrate the basic concepts within the presented formalism.

  8. Essentials and guidelines for clinical medical physics residency training programs: executive summary of AAPM Report Number 249.

    PubMed

    Prisciandaro, Joann I; Willis, Charles E; Burmeister, Jay W; Clarke, Geoffrey D; Das, Rupak K; Esthappan, Jacqueline; Gerbi, Bruce J; Harkness, Beth A; Patton, James A; Peck, Donald J; Pizzutiello, Robert J; Sandison, George A; White, Sharon L; Wichman, Brian D; Ibbott, Geoffrey S; Both, Stefan

    2014-05-08

    There is a clear need for established standards for medical physics residency training. The complexity of techniques in imaging, nuclear medicine, and radiation oncology continues to increase with each passing year. It is therefore imperative that training requirements and competencies are routinely reviewed and updated to reflect the changing environment in hospitals and clinics across the country. In 2010, the AAPM Work Group on Periodic Review of Medical Physics Residency Training was formed and charged with updating AAPM Report Number 90. This work group includes AAPM members with extensive experience in clinical, professional, and educational aspects of medical physics. The resulting report, AAPM Report Number 249, concentrates on the clinical and professional knowledge needed to function independently as a practicing medical physicist in the areas of radiation oncology, imaging, and nuclear medicine, and constitutes a revision to AAPM Report Number 90. This manuscript presents an executive summary of AAPM Report Number 249.

  9. AAPM Medical Physics Practice Guideline 3.a: Levels of supervision for medical physicists in clinical training.

    PubMed

    Seibert, J Anthony; Clements, Jessica B; Halvorsen, Per H; Herman, Michael G; Martin, Melissa C; Palta, Jatinder; Pfeiffer, Douglas E; Pizzutiello, Robert J; Schueler, Beth A; Shepard, S Jeff; Fairobrent, Lynne A

    2015-05-08

    The American Association of Physicists in Medicine (AAPM) is a nonprofit professional society whose primary purposes are to advance the science, education and professional practice of medical physics. The AAPM has more than 8,000 members and is the principal organization of medical physicists in the United States. The AAPM will periodically define new practice guidelines for medical physics practice to help advance the science of medical physics and to improve the quality of service to patients throughout the United States. Existing medical physics practice guidelines will be reviewed for the purpose of revision or renewal, as appropriate, on their fifth anniversary or sooner. Each medical physics practice guideline represents a policy statement by the AAPM, has undergone a thorough consensus process in which it has been subjected to extensive review, and requires the approval of the Professional Council. The medical physics practice guidelines recognize that the safe and effective use of diagnostic and therapeutic radiology requires specific training, skills, and techniques, as described in each document. Reproduction or modification of the published practice guidelines and technical standards by those entities not providing these services is not authorized. The following terms are used in the AAPM practice guidelines:
    • Must and Must Not: Used to indicate that adherence to the recommendation is considered necessary to conform to this practice guideline.
    • Should and Should Not: Used to indicate a prudent practice to which exceptions may occasionally be made in appropriate circumstances.

  10. History, organization, and oversight of the accredited dosimetry calibration laboratories by the AAPM

    SciTech Connect

    Rozenfeld, M.

    1993-12-31

    For more than 20 years, the American Association of Physicists in Medicine (AAPM) has operated an accreditation program for secondary standards laboratories that calibrate radiation measuring instruments. Except for one short period, that program has been able to provide the facilities to satisfy the national need for accurate calibrations of such instruments. That exception, in 1981, was due to the combination of the U.S. Nuclear Regulatory Commission (NRC) requiring instrument calibrations by users of cobalt-60 teletherapy units and the withdrawal of one of the three laboratories accredited at that time. However, after successful operation as a Task Group of the Radiation Therapy Committee (RTC) of the AAPM for two decades, a reorganization of this structure is now under serious consideration by the administration of the AAPM.

  11. AAPM/SNMMI Joint Task Force: report on the current state of nuclear medicine physics training.

    PubMed

    Harkness, Beth A; Allison, Jerry D; Clements, Jessica B; Coffey, Charles W; Fahey, Frederic H; Gress, Dustin A; Kinahan, Paul E; Nickoloff, Edward L; Mawlawi, Osama R; MacDougall, Robert D; Pizzutiello, Robert J

    2015-09-08

    The American Association of Physicists in Medicine (AAPM) and the Society of Nuclear Medicine and Molecular Imaging (SNMMI) recognized the need for a review of the current state of nuclear medicine physics training and the need to explore pathways for improving nuclear medicine physics training opportunities. For these reasons, the two organizations formed a joint AAPM/SNMMI Ad Hoc Task Force on Nuclear Medicine Physics Training. The mission of this task force was to assemble a representative group of stakeholders to:
    • Estimate the demand for board-certified nuclear medicine physicists in the next 5-10 years,
    • Identify the critical issues related to supplying an adequate number of physicists who have received the appropriate level of training in nuclear medicine physics, and
    • Identify approaches that may be considered to facilitate the training of nuclear medicine physicists.
    As a result, a task force was appointed and chaired by an active member of both organizations that included representation from the AAPM, SNMMI, the American Board of Radiology (ABR), the American Board of Science in Nuclear Medicine (ABSNM), and the Commission for the Accreditation of Medical Physics Educational Programs (CAMPEP). The Task Force first met at the AAPM Annual Meeting in Charlotte in July 2012 and has met regularly face-to-face, online, and by conference calls. This manuscript reports the findings of the Task Force, as well as recommendations to achieve the stated mission.

  12. AA-PMe, a novel asiatic acid derivative, induces apoptosis and suppresses proliferation, migration, and invasion of gastric cancer cells

    PubMed Central

    Jing, Yue; Wang, Gang; Ge, Ying; Xu, Minjie; Tang, Shuainan; Gong, Zhunan

    2016-01-01

    Asiatic acid (AA; 2α,3β,23-trihydroxyurs-12-ene-28-oic acid) is widely used for medicinal purposes in many Asian countries due to its various bioactivities. A series of AA derivatives has been synthesized in attempts to improve its therapeutic potencies. Herein we investigated the anti-tumor activities of N-(2α,3β,23-acetoxyurs-12-en-28-oyl)-l-proline methyl ester (AA-PMe), a novel AA derivative. AA-PMe exhibited a stronger anti-cancer activity than its parent compound AA. AA-PMe inhibited the proliferation of SGC7901 and HGC27 human gastric cancer cells in a dose-dependent manner but had no significant toxicity in human gastric mucosa epithelial cells (GES-1). AA-PMe induced cell cycle arrest in G0/G1 phase and blocked G1-S transition, which correlated well with marked decreases in levels of cyclin D1, cyclin-dependent kinase CDK4, and phosphorylated retinoblastoma protein, and an increase in the cyclin-dependent kinase inhibitor P15. Further, AA-PMe induced apoptosis of human gastric cancer cells by affecting Bcl-2, Bax, c-Myc, and caspase-3. Moreover, AA-PMe suppressed the migration and invasion of human gastric cancer (SGC7901 and HGC27) cells by downregulating the expression of MMP-2 and MMP-9. Overall, this study investigated the potential anti-cancer activities of AA-PMe, including inducing apoptosis and suppressing proliferation, migration, and invasion of gastric cancer cells, as well as the underlying mechanisms, suggesting that AA-PMe is a promising anti-cancer drug candidate in gastric cancer therapy. PMID:27073325

  13. AA-PMe, a novel asiatic acid derivative, induces apoptosis and suppresses proliferation, migration, and invasion of gastric cancer cells.

    PubMed

    Jing, Yue; Wang, Gang; Ge, Ying; Xu, Minjie; Tang, Shuainan; Gong, Zhunan

    2016-01-01

    Asiatic acid (AA; 2α,3β,23-trihydroxyurs-12-ene-28-oic acid) is widely used for medicinal purposes in many Asian countries due to its various bioactivities. A series of AA derivatives has been synthesized in attempts to improve its therapeutic potencies. Herein we investigated the anti-tumor activities of N-(2α,3β,23-acetoxyurs-12-en-28-oyl)-l-proline methyl ester (AA-PMe), a novel AA derivative. AA-PMe exhibited a stronger anti-cancer activity than its parent compound AA. AA-PMe inhibited the proliferation of SGC7901 and HGC27 human gastric cancer cells in a dose-dependent manner but had no significant toxicity in human gastric mucosa epithelial cells (GES-1). AA-PMe induced cell cycle arrest in G0/G1 phase and blocked G1-S transition, which correlated well with marked decreases in levels of cyclin D1, cyclin-dependent kinase CDK4, and phosphorylated retinoblastoma protein, and an increase in the cyclin-dependent kinase inhibitor P15. Further, AA-PMe induced apoptosis of human gastric cancer cells by affecting Bcl-2, Bax, c-Myc, and caspase-3. Moreover, AA-PMe suppressed the migration and invasion of human gastric cancer (SGC7901 and HGC27) cells by downregulating the expression of MMP-2 and MMP-9. Overall, this study investigated the potential anti-cancer activities of AA-PMe, including inducing apoptosis and suppressing proliferation, migration, and invasion of gastric cancer cells, as well as the underlying mechanisms, suggesting that AA-PMe is a promising anti-cancer drug candidate in gastric cancer therapy. PMID:27073325

  14. AAPM Medical Physics Practice Guideline 5.a.: Commissioning and QA of Treatment Planning Dose Calculations - Megavoltage Photon and Electron Beams.

    PubMed

    Smilowitz, Jennifer B; Das, Indra J; Feygelman, Vladimir; Fraass, Benedick A; Kry, Stephen F; Marshall, Ingrid R; Mihailidis, Dimitris N; Ouhib, Zoubir; Ritter, Timothy; Snyder, Michael G; Fairobent, Lynne

    2015-09-08

    The American Association of Physicists in Medicine (AAPM) is a nonprofit professional society whose primary purposes are to advance the science, education and professional practice of medical physics. The AAPM has more than 8,000 members and is the principal organization of medical physicists in the United States. The AAPM will periodically define new practice guidelines for medical physics practice to help advance the science of medical physics and to improve the quality of service to patients throughout the United States. Existing medical physics practice guidelines will be reviewed for the purpose of revision or renewal, as appropriate, on their fifth anniversary or sooner. Each medical physics practice guideline represents a policy statement by the AAPM, has undergone a thorough consensus process in which it has been subjected to extensive review, and requires the approval of the Professional Council. The medical physics practice guidelines recognize that the safe and effective use of diagnostic and therapeutic radiology requires specific training, skills, and techniques, as described in each document. Reproduction or modification of the published practice guidelines and technical standards by those entities not providing these services is not authorized. The following terms are used in the AAPM practice guidelines:
    • Must and Must Not: Used to indicate that adherence to the recommendation is considered necessary to conform to this practice guideline.
    • Should and Should Not: Used to indicate a prudent practice to which exceptions may occasionally be made in appropriate circumstances.

  15. SU-E-J-204: Radiation Dose to Patients Resulting From Image Guidance Procedures and AAPM TG-180 Update

    SciTech Connect

    Ding, G; Alaei, P

    2014-06-01

    Purpose: Image-guided radiation therapy (IGRT) is the new paradigm for patient positioning and target localization in radiotherapy. Daily imaging procedures add dose to the patient's treatment volume and normal tissues and may expose the organs at risk to unaccounted doses. This presentation updates the progress of AAPM TG-180, which aims to provide strategies to quantify and account for the dose from both MV and kV imaging in patient treatment planning. Methods: Our current knowledge on image guidance dose is presented. A summary of doses from image guidance procedures delivered to patients, in relationship to therapeutic doses, is given. Different techniques for reducing the image guidance dose are summarized. Typical organ doses resulting from different image acquisition procedures used in IGRT are tabulated. Results: Many techniques to reduce the imaging dose are available in clinical applications. There are large variations between dose to bone and dose to soft tissues for x-rays in the kilovoltage energy range. Methods for clinical implementation of accounting for the imaging dose from an imaging procedure are available. Beam data from imaging systems can be generated by combining Monte Carlo simulations and experimental measurements for commissioning imaging beams in the treatment planning system. Conclusion: Current treatment planning systems are not yet equipped to perform patient-specific dose calculations for kV imaging procedures. The imaging dose from current kV imaging devices has been significantly reduced and is generally much less than that from MV imaging. Because the magnitude of the kV imaging dose is relatively low and the variation between patients is modest, it is feasible to estimate dose based on imaging procedures or protocols using tabulated values, which provides an alternative means of accounting for and reporting imaging doses.

  16. Medical Physics Practice Guidelines - the AAPM's minimum practice recommendations for medical physicists.

    PubMed

    Mills, Michael D; Chan, Maria F; Prisciandaro, Joann I; Shepard, Jeff; Halvorsen, Per H

    2013-11-04

    The AAPM has long advocated a consistent level of medical physics practice, and has published many recommendations and position statements toward that goal, such as Science Council Task Group reports related to calibration and quality assurance, Education Council and Professional Council Task Group reports related to education, training, and peer review, and Board-approved Position Statements related to the Scope of Practice, physicist qualifications, and other aspects of medical physics practice. Despite these concerted and enduring efforts, the profession does not have clear and concise statements of the acceptable practice guidelines for routine clinical medical physics. As accreditation of clinical practices becomes more common, Medical Physics Practice Guidelines (MPPGs) will be crucial to ensuring a consistent benchmark for accreditation programs. To this end, the AAPM has recently endorsed the development of MPPGs, which may be generated in collaboration with other professional societies. The MPPGs are intended to be freely available to the general public. Accrediting organizations, regulatory agencies, and legislators will be encouraged to reference these MPPGs when defining their respective requirements. MPPGs are intended to provide the medical community with a clear description of the minimum level of medical physics support that the AAPM would consider prudent in clinical practice settings. Support includes, but is not limited to, staffing, equipment, machine access, and training. These MPPGs are not designed to replace extensive Task Group reports or review articles, but rather to describe the recommended minimum level of medical physics support for specific clinical services. This article has described the purpose, scope, and process for the development of MPPGs.

  17. MO-F-16A-03: AAPM Online Learning Support of New ABR MOC Requirements

    SciTech Connect

    Bloch, C; Ogburn, J; Woodward, M

    2014-06-15

    In 2002 the American Board of Radiology (ABR) discontinued issuing lifetime board certification. Since that time, diplomates have received a time-limited certificate and must participate in the Maintenance of Certification (MOC) program in order to maintain their certification. Initially certificates were issued with a 10-year expiration period, and the MOC had requirements to be met over that 10-year period. The goal was to demonstrate continuous maintenance of clinical competency; however, some diplomates were attempting to fulfill most or all of the requirements near the end of the 10-year period. This failed to meet the continuous aspect of the goal, and so the ABR changed to a sliding 3-year window. This was done to recognize that not every year would be the same, but that diplomates should be able to maintain a reasonable average over any 3-year period. A second significant change occurred in 2013. The initial requirements included 20 self-assessment modules (SAMs) over the original 10-year term. SAMs are a special type of continuing education (CE) credit that were an addition to the 250 standard CE credits required over the 10-year period. In 2013, however, the new requirement became 75 CE credits over the previous 3 years, of which 25 must include self-assessment. Effectively this raised the self-assessment requirement from 20 in 10 years to 25 in 3 years. Previously, SAMs were an interactive presentation available in limited quantities at live meetings. The new requirement, however, is not for SAMs but for CE-SA, which includes SAMs but also the online quizzes provided at the AAPM online learning center. All credits earned at the AAPM online learning center fulfill the ABR SA requirement. This talk will be an interactive demonstration of the AAPM online learning center along with a discussion of the MOC requirements.

  18. Analyzing the performance of the planning system by use of AAPM TG 119 test cases.

    PubMed

    Nithya, L; Raj, N Arunai Nambi; Rathinamuthu, Sasikumar; Pandey, Manish Bhushan

    2016-01-01

    Our objective in this study was to create AAPM TG 119 test plans for intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT) in the Monaco planning system. The results were compared with the published studies, and the performance of the Monaco planning system was analyzed. AAPM TG 119 proposed a set of test cases called multi-target, mock prostate, mock head and neck and C-shape to ascertain the overall accuracy of IMRT planning, measurement, and analysis. We used these test cases to investigate the performance of the Monaco planning system for the complex plans. For these test cases, we created IMRT plans with static multi-leaf collimator (MLC) and dynamic MLC by using 7-9 static beams as explained in TG-119. VMAT plans were also created with a 320° arc length and a single or double arc. The planning objectives and dose were set as described in TG 119. The dose prescriptions for multi-target, mock prostate, mock head and neck, and C-shape were taken as 50, 75.6, 50 and 50 Gy, respectively. All plans were compared with the results of TG 119 and the study done by Mynampati et al. Point dose and fluence measurements were done with a CC13 chamber and ArcCHECK phantom, respectively. Gamma analysis was done for the calculated and measured dose. Using the Monaco planning system, we achieved the goals mentioned in AAPM TG-119, and the plans were comparable to those of other studies. A comparison of point dose and fluence showed good results. From these results, we conclude that the performance of the Monaco planning system is good for complex plans.
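The gamma analysis used to compare calculated and measured dose combines a dose-difference criterion with a distance-to-agreement criterion. A rough 1-D sketch; real QA software (such as the ArcCHECK analysis tools) works on 2-D/3-D dose grids, and the 3%/3 mm criteria and synthetic profile below are illustrative assumptions:

```python
import numpy as np

def gamma_index(ref_pos, ref_dose, eval_pos, eval_dose, dd=0.03, dta=3.0):
    """Per-reference-point gamma: minimum over evaluated points of the
    combined dose-difference / distance metric; gamma <= 1 is a pass."""
    gammas = []
    for rp, rd in zip(ref_pos, ref_dose):
        dose_term = (eval_dose - rd) / (dd * rd)   # normalized dose difference
        dist_term = (eval_pos - rp) / dta          # normalized distance (mm)
        gammas.append(np.sqrt(dose_term**2 + dist_term**2).min())
    return np.array(gammas)

pos = np.linspace(0, 50, 101)                  # positions in mm
ref = 100 * np.exp(-((pos - 25) / 15) ** 2)    # synthetic dose profile
meas = ref * 1.01                              # measurement with a 1% offset
g = gamma_index(pos, ref, pos, meas)
print((g <= 1).mean())                         # pass rate: all points pass
```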

  19. MO-C-BRE-01: The WMIS-AAPM Joint Symposium: Advances in Molecular Imaging

    SciTech Connect

    Contag, C; Pogue, B; Lewis, J

    2014-06-15

    This joint symposium of the World Molecular Imaging Society (WMIS) and the AAPM includes three luminary speakers discussing new paradigms of molecular imaging in cancer (Contag), applications of optical imaging technologies to radiation therapy (Pogue), and an update on PET imaging as a surrogate biomarker for cancer progression and response to therapy (Lewis). Learning Objectives: Appreciate the current trends in molecular and systems imaging. Understand how optical imaging technologies, and particularly Cerenkov detectors, can be used to advance radiation oncology. Stay current on new PET tracers and targets of interest in cancer treatment.

  20. Accuracy and calibration of integrated radiation output indicators in diagnostic radiology: A report of the AAPM Imaging Physics Committee Task Group 190

    SciTech Connect

    Lin, Pei-Jan P.; Schueler, Beth A.; Balter, Stephen; Strauss, Keith J.; Wunderle, Kevin A.; LaFrance, M. Terry; Kim, Don-Soo; Behrman, Richard H.; Shepard, S. Jeff; Bercha, Ishtiaq H.

    2015-12-15

    Due to the proliferation of disciplines employing fluoroscopy as their primary imaging tool and the prolonged extensive use of fluoroscopy in interventional and cardiovascular angiography procedures, “dose-area-product” (DAP) meters were installed to monitor and record the radiation dose delivered to patients. In some cases, the radiation dose or the output value is calculated, rather than measured, using the pertinent radiological parameters and geometrical information. The AAPM Task Group 190 (TG-190) was established to evaluate the accuracy of the DAP meter in 2008. Since then, the term “DAP-meter” has been revised to air kerma-area product (KAP) meter. The charge of TG 190 (Accuracy and Calibration of Integrated Radiation Output Indicators in Diagnostic Radiology) has also been realigned to investigate the “Accuracy and Calibration of Integrated Radiation Output Indicators” which is reflected in the title of the task group, to include situations where the KAP may be acquired with or without the presence of a physical “meter.” To accomplish this goal, validation test protocols were developed to compare the displayed radiation output value to an external measurement. These test protocols were applied to a number of clinical systems to collect information on the accuracy of dose display values in the field.
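    The validation protocol described above reduces to comparing the console-displayed KAP against an external reference measurement. A hypothetical sketch of that comparison (the function names are illustrative, and the default ±35% limit is an assumption based on the commonly cited IEC 60601-2-43 accuracy requirement, not a TG-190 recommendation):

```python
def kap_percent_error(displayed_kap, measured_kap):
    """Percent error of the console-displayed kerma-area product (KAP)
    relative to an externally measured reference value."""
    return 100.0 * (displayed_kap - measured_kap) / measured_kap

def within_tolerance(displayed_kap, measured_kap, limit_pct=35.0):
    """Compare against a tolerance; the +/-35% default follows the commonly
    cited IEC 60601-2-43 limit and is an assumption in this sketch."""
    return abs(kap_percent_error(displayed_kap, measured_kap)) <= limit_pct

err = kap_percent_error(10.7, 10.0)   # displayed vs. measured, in Gy*cm^2
ok = within_tolerance(10.7, 10.0)     # a 7% error passes a 35% limit
```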

  1. Accuracy and calibration of integrated radiation output indicators in diagnostic radiology: A report of the AAPM Imaging Physics Committee Task Group 190.

    PubMed

    Lin, Pei-Jan P; Schueler, Beth A; Balter, Stephen; Strauss, Keith J; Wunderle, Kevin A; LaFrance, M Terry; Kim, Don-Soo; Behrman, Richard H; Shepard, S Jeff; Bercha, Ishtiaq H

    2015-12-01

    Due to the proliferation of disciplines employing fluoroscopy as their primary imaging tool and the prolonged extensive use of fluoroscopy in interventional and cardiovascular angiography procedures, "dose-area-product" (DAP) meters were installed to monitor and record the radiation dose delivered to patients. In some cases, the radiation dose or the output value is calculated, rather than measured, using the pertinent radiological parameters and geometrical information. The AAPM Task Group 190 (TG-190) was established to evaluate the accuracy of the DAP meter in 2008. Since then, the term "DAP-meter" has been revised to air kerma-area product (KAP) meter. The charge of TG 190 (Accuracy and Calibration of Integrated Radiation Output Indicators in Diagnostic Radiology) has also been realigned to investigate the "Accuracy and Calibration of Integrated Radiation Output Indicators" which is reflected in the title of the task group, to include situations where the KAP may be acquired with or without the presence of a physical "meter." To accomplish this goal, validation test protocols were developed to compare the displayed radiation output value to an external measurement. These test protocols were applied to a number of clinical systems to collect information on the accuracy of dose display values in the field.

  2. Third-party brachytherapy source calibrations and physicist responsibilities: Report of the AAPM Low Energy Brachytherapy Source Calibration Working Group

    SciTech Connect

    Butler, Wayne M.; Bice, William S. Jr.; DeWerd, Larry A.; Hevezi, James M.; Huq, M. Saiful; Ibbott, Geoffrey S.; Palta, Jatinder R.; Rivard, Mark J.; Seuntjens, Jan P.; Thomadsen, Bruce R.

    2008-09-15

    The AAPM Low Energy Brachytherapy Source Calibration Working Group was formed to investigate and recommend quality control and quality assurance procedures for brachytherapy sources prior to clinical use. Compiling and clarifying recommendations established by previous AAPM Task Groups 40, 56, and 64 were among the working group's charges, which also included the role of third-party handlers to perform loading and assay of sources. This document presents the findings of the working group on the responsibilities of the institutional medical physicist and a clarification of the existing AAPM recommendations in the assay of brachytherapy sources. Responsibility for the performance and attestation of source assays rests with the institutional medical physicist, who must use calibration equipment appropriate for each source type used at the institution. Such equipment and calibration procedures shall ensure secondary traceability to a national standard. For each multi-source implant, 10% of the sources or ten sources, whichever is greater, are to be assayed. Procedures for presterilized source packaging are outlined. The mean source strength of the assayed sources must agree with the manufacturer's stated strength to within 3%, or action must be taken to resolve the difference. Third party assays do not absolve the institutional physicist from the responsibility to perform the institutional measurement and attest to the strength of the implanted sources. The AAPM leaves it to the discretion of the institutional medical physicist whether the manufacturer's or institutional physicist's measured value should be used in performing dosimetry calculations.
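    The assay rules summarized in this report reduce to simple arithmetic. A sketch under stated assumptions (function names are illustrative; the cap at the batch size for very small implants is my addition, not from the report):

```python
import math

def sources_to_assay(total_sources):
    """Per the working group: assay 10% of the sources or ten sources,
    whichever is greater (capped here at the batch size, an assumption)."""
    return min(total_sources, max(math.ceil(0.10 * total_sources), 10))

def assay_agrees(mean_measured_sk, manufacturer_sk, tolerance=0.03):
    """True if the mean assayed source strength agrees with the manufacturer's
    stated strength to within 3%; otherwise action is required."""
    return abs(mean_measured_sk - manufacturer_sk) / manufacturer_sk <= tolerance

n = sources_to_assay(80)          # 80-seed implant -> assay 10 sources
ok = assay_agrees(0.41, 0.40)     # 2.5% difference: within tolerance
bad = assay_agrees(0.42, 0.40)    # 5% difference: must be resolved
```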

  3. Third-party brachytherapy source calibrations and physicist responsibilities: report of the AAPM Low Energy Brachytherapy Source Calibration Working Group.

    PubMed

    Butler, Wayne M; Bice, William S; DeWerd, Larry A; Hevezi, James M; Huq, M Saiful; Ibbott, Geoffrey S; Palta, Jatinder R; Rivard, Mark J; Seuntjens, Jan P; Thomadsen, Bruce R

    2008-09-01

    The AAPM Low Energy Brachytherapy Source Calibration Working Group was formed to investigate and recommend quality control and quality assurance procedures for brachytherapy sources prior to clinical use. Compiling and clarifying recommendations established by previous AAPM Task Groups 40, 56, and 64 were among the working group's charges, which also included the role of third-party handlers to perform loading and assay of sources. This document presents the findings of the working group on the responsibilities of the institutional medical physicist and a clarification of the existing AAPM recommendations in the assay of brachytherapy sources. Responsibility for the performance and attestation of source assays rests with the institutional medical physicist, who must use calibration equipment appropriate for each source type used at the institution. Such equipment and calibration procedures shall ensure secondary traceability to a national standard. For each multi-source implant, 10% of the sources or ten sources, whichever is greater, are to be assayed. Procedures for presterilized source packaging are outlined. The mean source strength of the assayed sources must agree with the manufacturer's stated strength to within 3%, or action must be taken to resolve the difference. Third party assays do not absolve the institutional physicist from the responsibility to perform the institutional measurement and attest to the strength of the implanted sources. The AAPM leaves it to the discretion of the institutional medical physicist whether the manufacturer's or institutional physicist's measured value should be used in performing dosimetry calculations. PMID:18841836

  4. Anniversary Paper: Development of x-ray computed tomography: The role of Medical Physics and AAPM from the 1970s to present

    SciTech Connect

    Pan Xiaochuan; Siewerdsen, Jeffrey; La Riviere, Patrick J.; Kalender, Willi A.

    2008-08-15

    The AAPM, through its members, meetings, and its flagship journal Medical Physics, has played an important role in the development and growth of x-ray tomography in the last 50 years. From a spate of early articles in the 1970s characterizing the first commercial computed tomography (CT) scanners through the "slice wars" of the 1990s and 2000s, the history of CT and related techniques such as tomosynthesis can readily be traced through the pages of Medical Physics and the annals of the AAPM and RSNA/AAPM Annual Meetings. In this article, the authors intend to give a brief review of the role of Medical Physics and the AAPM in CT and tomosynthesis imaging over the last few decades.

  5. TU-C-16A-01: Joint AAPM/SEFM/AMPR Educational Workshop On “Education of Radiotherapy Physicists”

    SciTech Connect

    Mahesh, M; Borras, C; Frey, G; Ribas-Morales, M; Ballester, F; Kazantsev, P; Kostylev, D

    2014-06-01

    This workshop is jointly organized by the AAPM and the Spanish (SEFM) and Russian (AMPR) medical physics societies, as part of formal educational exchange agreements signed by the AAPM with each of these two societies. With the rapid technological advances in radiation therapy, in both treatment and imaging, teaching physics to medical physicists practicing in radiation therapy is challenging. The main objective of this workshop is to present the current status, challenges, and issues related to the education of radiation therapy physicists in the US, Spain, and Russia. Medical physicists from each of these countries will present the educational requirements of international recommendations and directives and analyze their impact on national legislation. Current and future educational models and plans for harmonization will be described. The role of universities, professional societies, and examination boards, such as the American Board of Radiology, will be discussed. Minimum standards will be agreed upon. Learning Objectives: Review medical physics educational models supported by the AAPM, SEFM, and AMPR. Discuss the role of governmental and non-governmental organizations in elaborating and adopting medical physics syllabi. Debate minimum educational standards for medical physics education based on country-specific resources.

  6. Ethics and professionalism in medical physics: A survey of AAPM members

    PubMed Central

    Ozturk, Naim; Armato, Samuel G.; Giger, Maryellen L.; Serago, Christopher F.; Ross, Lainie F.

    2013-01-01

    Purpose: To assess current education, practices, attitudes, and perceptions pertaining to ethics and professionalism in medical physics. Methods: A link to a web-based survey was distributed to the American Association of Physicists in Medicine (AAPM) e-mail membership list, with a follow-up e-mail sent two weeks later. The survey included questions about ethics/professionalism education, direct personal knowledge of ethically questionable practices in clinical care, research, education (teaching and mentoring), and professionalism, respondents’ assessment of their ability to address ethical/professional dilemmas, and demographics. For analysis, reports of unethical or ethically questionable practices or behaviors by approximately 40% or more of respondents were classified as “frequent.” Results: Partial or complete responses were received from 18% (1394/7708) of AAPM members. Overall, 60% (827/1377) of the respondents stated that they had not received ethics/professionalism education during their medical physics training. Respondents currently in training were more likely to state that they received instruction in ethics/professionalism (80%, 127/159) versus respondents who were post-training (35%, 401/1159). Respondents’ preferred method of instruction in ethics/professionalism was structured periodic discussions involving both faculty and students/trainees. More than 90% (1271/1384) supported continuing education in ethics/professionalism and 75% (1043/1386) stated they would attend ethics/professionalism sessions at professional/scientific meetings. In the research setting, reports about ethically questionable authorship assignment were frequent (approximately 40%) whereas incidents of ethically questionable practices about human subjects protections were quite infrequent (5%). In the clinical setting, there was frequent recollection of incidents regarding lack of training, resources and skills, and error/incident reporting. In the educational setting

  7. Influence of 320-detector-row volume scanning and AAPM report 111 CT dosimetry metrics on size-specific dose estimate: a Monte Carlo study.

    PubMed

    Haba, Tomonobu; Koyama, Shuji; Kinomura, Yutaka; Ida, Yoshihiro; Kobayashi, Masanao

    2016-09-01

    The American Association of Physicists in Medicine (AAPM) task group 204 has recommended the use of size-dependent conversion factors to calculate size-specific dose estimate (SSDE) values from volume computed tomography dose index (CTDIvol) values. However, these conversion factors do not consider the effects of 320-detector-row volume computed tomography (CT) examinations or the new CT dosimetry metrics proposed by AAPM task group 111. This study aims to investigate the influence of these examinations and metrics on the conversion factors reported by AAPM task group 204, using Monte Carlo simulations. Simulations were performed modelling a Toshiba Aquilion ONE CT scanner, in order to compute dose values in water for cylindrical phantoms with 8-40-cm diameters at 2-cm intervals for each scanning parameter (tube voltage, bow-tie filter, longitudinal beam width). Then, the conversion factors were obtained by applying exponential regression analysis between the dose values for a given phantom diameter and the phantom diameter combined with various scanning parameters. The conversion factors for each scanning method (helical, axial, or volume scanning) and CT dosimetry method (i.e., the CTDI100 method or the AAPM task group 111 method) were in agreement with those reported by AAPM task group 204, within a percentage error of 14.2 % for phantom diameters ≥11.2 cm. The results obtained in this study indicate that the conversion factors previously presented by AAPM task group 204 can be used to provide appropriate SSDE values for 320-detector-row volume CT examinations and the CT dosimetry metrics proposed by the AAPM task group 111.
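    The exponential regression step described in this abstract, relating the conversion factor (phantom dose divided by CTDIvol) to phantom diameter, can be sketched as follows. The coefficients and data below are synthetic placeholders chosen for illustration, not values from TG 204 or from this study:

```python
import numpy as np

# Synthetic conversion-factor data following f(d) = a * exp(-b * d),
# the functional form used for the TG 204-style fit described above.
diameters = np.arange(8.0, 42.0, 2.0)        # phantom diameters in cm
true_a, true_b = 3.7, 0.037                  # illustrative constants only
factors = true_a * np.exp(-true_b * diameters)

# Exponential regression via a linear least-squares fit in log space
b_fit, log_a_fit = np.polyfit(diameters, np.log(factors), 1)
a_fit = np.exp(log_a_fit)

def ssde(ctdi_vol, diameter_cm):
    """Size-specific dose estimate = conversion factor x CTDIvol."""
    return ctdi_vol * a_fit * np.exp(b_fit * diameter_cm)
```

    Fitting in log space keeps the regression linear; `b_fit` comes out negative, so the conversion factor falls off with increasing patient size, as expected.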

  8. Influence of 320-detector-row volume scanning and AAPM report 111 CT dosimetry metrics on size-specific dose estimate: a Monte Carlo study.

    PubMed

    Haba, Tomonobu; Koyama, Shuji; Kinomura, Yutaka; Ida, Yoshihiro; Kobayashi, Masanao

    2016-09-01

    The American Association of Physicists in Medicine (AAPM) task group 204 has recommended the use of size-dependent conversion factors to calculate size-specific dose estimate (SSDE) values from volume computed tomography dose index (CTDIvol) values. However, these conversion factors do not consider the effects of 320-detector-row volume computed tomography (CT) examinations or the new CT dosimetry metrics proposed by AAPM task group 111. This study aims to investigate the influence of these examinations and metrics on the conversion factors reported by AAPM task group 204, using Monte Carlo simulations. Simulations were performed modelling a Toshiba Aquilion ONE CT scanner, in order to compute dose values in water for cylindrical phantoms with 8-40-cm diameters at 2-cm intervals for each scanning parameter (tube voltage, bow-tie filter, longitudinal beam width). Then, the conversion factors were obtained by applying exponential regression analysis between the dose values for a given phantom diameter and the phantom diameter combined with various scanning parameters. The conversion factors for each scanning method (helical, axial, or volume scanning) and CT dosimetry method (i.e., the CTDI100 method or the AAPM task group 111 method) were in agreement with those reported by AAPM task group 204, within a percentage error of 14.2 % for phantom diameters ≥11.2 cm. The results obtained in this study indicate that the conversion factors previously presented by AAPM task group 204 can be used to provide appropriate SSDE values for 320-detector-row volume CT examinations and the CT dosimetry metrics proposed by the AAPM task group 111. PMID:27444155

  9. The management of respiratory motion in radiation oncology report of AAPM Task Group 76

    SciTech Connect

    Keall, Paul J.; Mageras, Gig S.; Balter, James M.

    2006-10-15

    This document is the report of a task group of the AAPM and has been prepared primarily to advise medical physicists involved in the external-beam radiation therapy of patients with thoracic, abdominal, and pelvic tumors affected by respiratory motion. This report describes the magnitude of respiratory motion, discusses radiotherapy specific problems caused by respiratory motion, explains techniques that explicitly manage respiratory motion during radiotherapy and gives recommendations in the application of these techniques for patient care, including quality assurance (QA) guidelines for these devices and their use with conformal and intensity modulated radiotherapy. The technologies covered by this report are motion-encompassing methods, respiratory gated techniques, breath-hold techniques, forced shallow-breathing methods, and respiration-synchronized techniques. The main outcome of this report is a clinical process guide for managing respiratory motion. Included in this guide is the recommendation that tumor motion should be measured (when possible) for each patient for whom respiratory motion is a concern. If target motion is greater than 5 mm, a method of respiratory motion management is available, and the patient can tolerate the procedure, then respiratory motion management technology is appropriate. Respiratory motion management is also appropriate when the procedure will increase normal tissue sparing. Respiratory motion management involves further resources, education and the development of and adherence to QA procedures.
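    The clinical process guide summarized above reduces to a simple decision rule. This sketch is illustrative only and omits the report's additional clause that motion management is also appropriate when it increases normal-tissue sparing:

```python
def manage_respiratory_motion(motion_mm, method_available, patient_tolerates):
    """TG-76's headline decision, sketched: motion management is appropriate
    when measured tumor motion exceeds 5 mm, a management technique is
    available, and the patient can tolerate the procedure."""
    return motion_mm > 5.0 and method_available and patient_tolerates

use_it = manage_respiratory_motion(8.0, True, True)    # motion exceeds threshold
skip_it = manage_respiratory_motion(3.0, True, True)   # below the 5 mm threshold
```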

  10. WE-E-BRF-01: The ESTRO-AAPM Joint Symposium On Imaging for Proton Treatment Planning and Guidance

    SciTech Connect

    Parodi, K; Dauvergne, D; Kruse, J

    2014-06-15

    In this inaugural joint ESTRO-AAPM session we will attempt to provide some answers to the problems encountered in the clinical application of particle therapy. The main advantage of particle therapy is that the physical properties of ion beams offer high ballistic accuracy for tightly conformal irradiation of the tumour volume, with excellent sparing of surrounding healthy tissue and critical organs. This is also its Achilles' heel, calling for an increasing role of imaging to ensure safe application of the intended dose to the targeted area during the entire course of fractionated therapy. We have three distinguished speakers addressing possible solutions. Katia Parodi (Ludwig Maximilians University, Munich, Germany): To date, Positron Emission Tomography (PET) is the only technique that has been clinically investigated for in-vivo visualization of the beam range during or shortly after ion beam delivery. The method exploits the transient amount of β+ activity induced in nuclear interactions between the primary beam and the irradiated tissue, depending on the ion beam species, the tissue elemental composition and physiological properties (in terms of biological clearance), as well as the time course of irradiation and imaging. This contribution will review initial results, ongoing methodological developments, and remaining challenges related to the clinical usage of viable but often suboptimal instrumentation and workflows for PET-based treatment verification. Moreover, it will present and discuss promising new detector developments towards next-generation dedicated PET scanners relying on full-ring or dual-head designs for in-beam, quasi real-time imaging. Denis Dauvergne (Institut de Physique Nucleaire de Lyon, Lyon, France): Prompt gamma radiation monitoring of hadron therapy offers real-time capability to measure the ion range. 
Both simulations and experiments show that millimetric verification of the range can be achieved at the pencil beam

  11. SU-E-I-20: Comprehensive Quality Assurance Test of Second Generation Toshiba Aquilion Large Bore CT Simulator Based On AAPM TG-66 Recommendations

    SciTech Connect

    Zhang, D

    2015-06-15

    Purpose: AAPM radiation therapy committee Task Group No. 66 (TG-66) published a report describing a general approach to CT simulator QA. The report outlines the testing procedures and specifications for the evaluation of patient dose, radiation safety, electromechanical components, and image quality for a CT simulator. The purpose of this study is to thoroughly evaluate the performance of a second generation Toshiba Aquilion Large Bore CT simulator with a 90 cm bore size (Toshiba, Nasu, JP) against the TG-66 criteria. The testing procedures and results from this study provide baselines for a routine QA program. Methods: Measurements and analyses were performed, including CTDIvol measurements, alignment and orientation of gantry lasers, orientation of the tabletop with respect to the imaging plane, table movement and indexing accuracy, scanogram location accuracy, high-contrast spatial resolution, low-contrast resolution, field uniformity, CT number accuracy, and mA linearity and reproducibility, using a number of different phantoms and measuring devices, such as a CTDI phantom, an ACR image quality phantom, a TG-66 laser QA phantom, a pencil ion chamber (Fluke Victoreen), and an electrometer (RTI Solidose 400). Results: The CTDI measurements were within 20% of the console-displayed values. The alignment and orientation of both the gantry lasers and the tabletop, as well as the table movement and indexing and scanogram location accuracy, were within 2 mm as specified in TG-66. The spatial resolution, low-contrast resolution, field uniformity, and CT number accuracy were all within the ACR's recommended limits. The mA linearity and reproducibility were both well below the TG-66 threshold. Conclusion: The 90 cm bore size second generation Toshiba Aquilion Large Bore CT simulator, which comes with a 70 cm true FOV, can consistently meet various clinical needs. The results demonstrated that this simulator complies with the TG-66 protocol in all aspects including electromechanical component
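    Several of the tolerances cited in this abstract (CTDIvol within 20% of the displayed value; lasers, table indexing, and scanogram location within 2 mm) lend themselves to a simple pass/fail check. The structure below is an illustrative sketch, not the authors' QA software:

```python
def qa_report(displayed_ctdi, measured_ctdi, laser_offset_mm, table_index_error_mm):
    """Check a few TG-66-style tolerances mentioned above: CTDIvol within
    20% of the console-displayed value; laser alignment and table indexing
    within 2 mm. Returns a dict of pass/fail results."""
    return {
        "ctdi": abs(displayed_ctdi - measured_ctdi) / displayed_ctdi <= 0.20,
        "lasers": abs(laser_offset_mm) <= 2.0,
        "table": abs(table_index_error_mm) <= 2.0,
    }

# Hypothetical measurements, all within tolerance
results = qa_report(displayed_ctdi=15.0, measured_ctdi=13.9,
                    laser_offset_mm=0.8, table_index_error_mm=1.5)
all_pass = all(results.values())
```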

  12. Flattening filter-free accelerators: a report from the AAPM Therapy Emerging Technology Assessment Work Group.

    PubMed

    Xiao, Ying; Kry, Stephen F; Popple, Richard; Yorke, Ellen; Papanikolaou, Niko; Stathakis, Sotirios; Xia, Ping; Huq, Saiful; Bayouth, John; Galvin, James; Yin, Fang-Fang

    2015-05-08

    This report describes the current state of flattening filter-free (FFF) radiotherapy beams implemented on conventional linear accelerators, and is aimed primarily at practicing medical physicists. The Therapy Emerging Technology Assessment Work Group of the American Association of Physicists in Medicine (AAPM) formed a writing group to assess FFF technology. The published literature on FFF technology was reviewed, along with technical specifications provided by vendors. Based on this information, supplemented by the clinical experience of the group members, consensus guidelines and recommendations for implementation of FFF technology were developed. Areas in need of further investigation were identified. Removing the flattening filter increases beam intensity, especially near the central axis. Increased intensity reduces treatment time, especially for high-dose stereotactic radiotherapy/radiosurgery (SRT/SRS). Furthermore, removing the flattening filter reduces out-of-field dose and improves beam modeling accuracy. FFF beams are advantageous for small field (e.g., SRS) treatments and are appropriate for intensity-modulated radiotherapy (IMRT). For conventional 3D radiotherapy of large targets, FFF beams may be disadvantageous compared to flattened beams because of the heterogeneity of FFF beam across the target (unless modulation is employed). For any application, the nonflat beam characteristics and substantially higher dose rates require consideration during the commissioning and quality assurance processes relative to flattened beams, and the appropriate clinical use of the technology needs to be identified. Consideration also needs to be given to these unique characteristics when undertaking facility planning. Several areas still warrant further research and development. Recommendations pertinent to FFF technology, including acceptance testing, commissioning, quality assurance, radiation safety, and facility planning, are presented. Examples of clinical

  13. Flattening filter-free accelerators: a report from the AAPM Therapy Emerging Technology Assessment Work Group.

    PubMed

    Xiao, Ying; Kry, Stephen F; Popple, Richard; Yorke, Ellen; Papanikolaou, Niko; Stathakis, Sotirios; Xia, Ping; Huq, Saiful; Bayouth, John; Galvin, James; Yin, Fang-Fang

    2015-01-01

    This report describes the current state of flattening filter-free (FFF) radiotherapy beams implemented on conventional linear accelerators, and is aimed primarily at practicing medical physicists. The Therapy Emerging Technology Assessment Work Group of the American Association of Physicists in Medicine (AAPM) formed a writing group to assess FFF technology. The published literature on FFF technology was reviewed, along with technical specifications provided by vendors. Based on this information, supplemented by the clinical experience of the group members, consensus guidelines and recommendations for implementation of FFF technology were developed. Areas in need of further investigation were identified. Removing the flattening filter increases beam intensity, especially near the central axis. Increased intensity reduces treatment time, especially for high-dose stereotactic radiotherapy/radiosurgery (SRT/SRS). Furthermore, removing the flattening filter reduces out-of-field dose and improves beam modeling accuracy. FFF beams are advantageous for small field (e.g., SRS) treatments and are appropriate for intensity-modulated radiotherapy (IMRT). For conventional 3D radiotherapy of large targets, FFF beams may be disadvantageous compared to flattened beams because of the heterogeneity of FFF beam across the target (unless modulation is employed). For any application, the nonflat beam characteristics and substantially higher dose rates require consideration during the commissioning and quality assurance processes relative to flattened beams, and the appropriate clinical use of the technology needs to be identified. Consideration also needs to be given to these unique characteristics when undertaking facility planning. Several areas still warrant further research and development. Recommendations pertinent to FFF technology, including acceptance testing, commissioning, quality assurance, radiation safety, and facility planning, are presented. Examples of clinical

  14. SU-E-P-22: AAPM Task Group 263 Tackling Standardization of Nomenclature for Radiation Therapy

    SciTech Connect

    Matuszak, M; Feng, M; Moran, J; Xiao, Y; Mayo, C; Miller, R; Bosch, W; Popple, R; Marks, L; Wu, Q; Molineu, A; Martel, M; Yock, T; McNutt, T; Brown, N; Purdie, T; Yorke, E; Santanam, L; Gabriel, P; Michalski, J; and others

    2015-06-15

    Purpose: There is growing recognition of the need for increased clarity and consistency in the nomenclatures used for body and organ structures, DVH metrics, toxicity, dose and volume units, etc. Standardization has multiple benefits, e.g., facilitating data collection for clinical trials, enabling the pooling of data between institutions, making transfers (i.e., hand-offs) between centers safer, and enabling vendors to define “default” settings. Towards this goal, the American Association of Physicists in Medicine (AAPM) formed a task group (TG263) in July of 2014, operating under the Work Group on Clinical Trials, to develop consensus statements. Guiding principles derived from the investigation and example nomenclatures will be presented for public feedback. Methods: We formed a multi-institutional and multi-vendor collaborative group of 39 physicists, physicians, and others involved in the clinical use and electronic transfer of information. Members include individuals from IROC, NRG, IHE-RO, DICOM WG-7, ASTRO, and EORTC groups with overlapping interests, to maximize the quality of the consensus and increase the likelihood of adoption. Surveys of group and NRG members were used to define current nomenclatures and requirements. The technical requirements of vendor systems and the proposed DICOM standards were examined. Results: There is a marked degree of inter- and intra-institutional variation in current approaches, resulting from inter-vendor differences in capabilities, clinic-specific conceptualizations, and inconsistencies. Using a consensus approach, the group defined optimal formats for the naming of targets and normal structures. A formal objective assessment of 13 existing clinically used software packages showed that all had the capability to accommodate the recommended nomenclatures. Conclusions: A multi-stakeholder effort is making significant steps forward in developing a standard nomenclature that will work across platforms. Our current working list includes > 550

  15. I-125 seed calibration using the SeedSelectron® afterloader: a practical solution to fulfill AAPM-ESTRO recommendations

    PubMed Central

    Perez-Calatayud, Jose; Richart, Jose; Guirado, Damián; Pérez-García, Jordi; Rodríguez, Silvia; Santos, Manuel

    2012-01-01

    Purpose SeedSelectron® v1.26b (Nucletron BV, The Netherlands) is an afterloader system used in prostate interstitial permanent brachytherapy with I-125 selectSeed seeds. It contains a diode array to assay all implanted seeds. Only one or two seeds can be extracted during the surgical procedure and assayed using a well chamber to check the manufacturer's air-kerma strength (SK) and to calibrate the diode array. Therefore, it is not feasible to assay 5–10% of the seeds as required by the AAPM-ESTRO. In this study, we present a practical solution for SeedSelectron® users to fulfill the AAPM-ESTRO recommendations. Material and methods The method is based on: a) the SourceCheck® well ionization chamber (PTW, Germany) provided with a PTW insert; b) n = 10 selectSeed seeds from the same batch and class as the seeds for the implant; c) the Nucletron insert to accommodate the n = 10 seeds in the SourceCheck® and to measure their averaged SK. Results for 56 implants have been studied, comparing the SK value from the manufacturer with the one obtained with the n = 10 seeds using the Nucletron insert prior to the implant and with the SK of just one seed measured with the PTW insert during the implant. Results We observed SK deviations for individual seeds of up to 7.8%. However, in the majority of cases SK is in agreement with the manufacturer's value. With the proposed method using the Nucletron insert, the large deviations of SK are reduced, and for the 56 implants studied no deviations outside the class range were found. Conclusions The new Nucletron insert and the proposed procedure allow evaluation of the SK of the n = 10 seeds prior to the implant, fulfilling the AAPM-ESTRO recommendations. It has been adopted by Nucletron to be made available to seedSelectron® users upon request. PMID:23346136

  16. Quality assurance of U.S.-guided external beam radiotherapy for prostate cancer: report of AAPM Task Group 154.

    PubMed

    Molloy, Janelle A; Chan, Gordon; Markovic, Alexander; McNeeley, Shawn; Pfeiffer, Doug; Salter, Bill; Tome, Wolfgang A

    2011-02-01

    Task Group 154 (TG154) of the American Association of Physicists in Medicine (AAPM) was created to produce a guidance document for clinical medical physicists describing recommended quality assurance (QA) procedures for ultrasound (U.S.)-guided external beam radiotherapy localization. This report describes the relevant literature, state of the art, and briefly summarizes U.S. imaging physics. Simulation, treatment planning and treatment delivery considerations are presented in order to improve consistency and accuracy. User training is emphasized in the report and recommendations regarding peer review are included. A set of thorough, yet practical, QA procedures, frequencies, and tolerances are recommended. These encompass recommendations to ensure both spatial accuracy and image quality.

  17. Off-label use of medical products in radiation therapy: Summary of the Report of AAPM Task Group No. 121

    SciTech Connect

    Thomadsen, Bruce R.; Thompson, Heaton H. II; Jani, Shirish K.; and others

    2010-05-15

    approval process, along with manufacturers' responsibilities, labeling, marketing and promotion, and off-label use. This is an educational and descriptive report and does not contain prescriptive recommendations. This report addresses the role of the medical physicist in clinical situations involving off-label use. Case studies in radiation therapy are presented. Any mention of commercial products is for identification only; it does not imply recommendations or endorsements of any of the authors or the AAPM. The full report, containing extensive background on off-label use with several appendices, is available on the AAPM website (http://www.aapm.org/pubs/reports/).

  18. The effect of differences in data base on the determination of absorbed dose in high-energy photon beams using the American Association of Physicists in Medicine protocol.

    PubMed

    Mijnheer, B J; Chin, L M

    1989-01-01

    Exposure rates were adjusted at the National Institute of Standards and Technology (NIST) on January 1, 1986 to take into account more recent values for some physical parameters, mainly in electron stopping power ratios. Exposure calibration factors for 60Co gamma rays Nx will therefore be lowered by 1.1%. Consequently, absorbed dose determinations in high-energy photon beams will be reduced by the same amount if the values for these physical parameters remain unchanged in the American Association of Physicists in Medicine (AAPM) protocol. If the same data base as used at NIST is applied in the AAPM protocol, then Ngas/Nx values, water-air stopping power ratios, and Pwall values will be different. The overall change in absorbed dose determinations using a consistent set of data will be a reduction of 0.8% for 60Co gamma rays and 1.5% for a 20-MV x-ray beam compared to the values before January 1, 1986. Since the net effect is small when different sets of data are applied, the new NIST exposure calibration factors may be used in combination with the AAPM protocol without significant error.
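
The proportionality in the abstract, that absorbed dose scales directly with the exposure calibration factor Nx when all other protocol data are unchanged, can be checked with trivial arithmetic (the numbers below come from the abstract itself):

```python
# Worked arithmetic for the abstract's numbers: if the NIST exposure
# calibration factor Nx drops by 1.1% and all other AAPM protocol data
# are left unchanged, the absorbed dose determination scales by the
# same factor.

old_nx = 1.000                    # arbitrary normalization
new_nx = old_nx * (1 - 0.011)     # 1.1% reduction in Nx

dose_change_pct = 100.0 * (new_nx - old_nx) / old_nx   # about -1.1%
```

With a consistent data set the net change is smaller (the abstract's -0.8% for 60Co and -1.5% at 20 MV), because the revised stopping-power data also shift Ngas/Nx and Pwall in partially compensating directions.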

  19. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management.

    PubMed

    Huq, M Saiful; Fraass, Benedick A; Dunscombe, Peter B; Gibbons, John P; Ibbott, Geoffrey S; Mundt, Arno J; Mutic, Sasa; Palta, Jatinder R; Rath, Frank; Thomadsen, Bruce R; Williamson, Jeffrey F; Yorke, Ellen D

    2016-07-01

The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for "intensity modulated radiation therapy (IMRT)" as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic. The development and implementation of risk-assessment techniques will make radiation therapy safer and more efficient.
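
The FMEA step mentioned in the abstract scores each failure mode for occurrence (O), severity (S), and lack of detectability (D), typically on 1-10 scales, and ranks them by the risk priority number RPN = O x S x D. A minimal sketch, with illustrative failure modes and scores that are not taken from the report:

```python
# Minimal sketch of the FMEA scoring used in risk-based QM programs such
# as the one TG-100 describes. The failure modes and scores below are
# illustrative assumptions, not values from the report.

def rpn(occurrence, severity, detectability):
    """Risk priority number: O * S * D, each on a 1-10 scale."""
    for score in (occurrence, severity, detectability):
        if not 1 <= score <= 10:
            raise ValueError("FMEA scores are on a 1-10 scale")
    return occurrence * severity * detectability

failure_modes = {
    "wrong isocenter transferred to console": (3, 9, 6),   # illustrative scores
    "MLC leaf calibration drift": (4, 5, 3),
}

# Highest-RPN failure modes get QM resources first.
ranked = sorted(failure_modes, key=lambda k: rpn(*failure_modes[k]), reverse=True)
```

Ranking by RPN is what lets a clinic direct limited QM effort at the failure modes with the largest expected clinical impact instead of spreading checks uniformly over equipment parameters.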

  20. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management

    PubMed Central

    Huq, M. Saiful; Fraass, Benedick A.; Dunscombe, Peter B.; Gibbons, John P.; Mundt, Arno J.; Mutic, Sasa; Palta, Jatinder R.; Rath, Frank; Thomadsen, Bruce R.; Williamson, Jeffrey F.; Yorke, Ellen D.

    2016-01-01

The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for “intensity modulated radiation therapy (IMRT)” as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic. The development and implementation of risk-assessment techniques will make radiation therapy safer and more efficient.

  2. Communications protocol

    NASA Technical Reports Server (NTRS)

    Zhou, Xiaoming (Inventor); Baras, John S. (Inventor)

    2010-01-01

The present invention relates to an improved communications protocol which increases the efficiency of transmission in return channels on a multi-channel slotted Aloha system by incorporating advanced error-correction algorithms, selective retransmission protocols, and the use of reserved channels to satisfy retransmission requests.

  3. Dosimetric characterization of the M-15 high-dose-rate Iridium-192 brachytherapy source using the AAPM and ESTRO formalism.

    PubMed

Ho Thanh, Minh-Tri; Munro III, John J; Medich, David C

    2015-05-08

The Source Production & Equipment Co. (SPEC) model M-15 is a new Iridium-192 brachytherapy source model intended for use as a temporary high-dose-rate (HDR) brachytherapy source for the Nucletron microSelectron Classic afterloading system. The purpose of this study is to characterize this HDR source for clinical application by obtaining a complete set of Monte Carlo calculated dosimetric parameters for the M-15, as recommended by the AAPM and ESTRO for isotopes with average energies greater than 50 keV. This was accomplished by using the MCNP6 Monte Carlo code to simulate the resulting source dosimetry at various points within a pseudoinfinite water phantom. These dosimetric values were then converted into the AAPM and ESTRO dosimetry parameters, and the statistical uncertainty in each parameter was also calculated and presented. The M-15 source was modeled in an MCNP6 Monte Carlo environment using the physical source specifications provided by the manufacturer. Iridium-192 photons were generated uniformly inside the iridium core of the model M-15, with photon and secondary electron transport replicated using photoatomic cross-sectional tables supplied with MCNP6. Simulations were performed for both water and air/vacuum computer models with a total of 4 × 10⁹ source photon histories for each simulation, and the in-air photon spectrum was filtered to remove low-energy photons below δ = 10 keV. Dosimetric data, including Ḋ(r,θ), gL(r), F(r,θ), φan(r), and φ̄an, and their statistical uncertainties were calculated from the output of an MCNP model consisting of an M-15 source placed at the center of a spherical water phantom of 100 cm diameter. The air-kerma strength in free space, SK, and the dose-rate constant, Λ, were also computed from an MCNP model in which the M-15 Iridium-192 source was centered at the origin of an evacuated phantom to which a critical volume containing air at STP was added 100 cm from the source center. The reference dose rate, Ḋ(r0,θ0) ≡ Ḋ(1 cm
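
The parameters listed in the abstract (SK, Λ, gL(r), F(r,θ)) combine through the standard TG-43 2D dose-rate equation. A minimal sketch follows; the line-source geometry function is the standard TG-43 form, but every numeric value and the flat placeholder functions are illustrative assumptions, not the published M-15 data:

```python
import math

# Sketch of the AAPM TG-43 2D dose-rate equation that the abstract's
# parameters feed into. The numeric values and the flat placeholder
# radial-dose and anisotropy functions are illustrative, not the M-15
# consensus data.

def geometry_line(r, theta, L):
    """TG-43 line-source geometry function G_L(r, theta) in cm^-2 (off-axis)."""
    z = r * math.cos(theta)          # coordinate along the source axis
    y = r * math.sin(theta)          # perpendicular distance to the axis
    # beta is the angle the active length subtends at the calculation point.
    beta = math.atan2(z + L / 2.0, y) - math.atan2(z - L / 2.0, y)
    return beta / (L * y)

def tg43_dose_rate(sk, dose_rate_const, r, theta, L, g_L, F):
    """Dose rate = SK * Lambda * [G_L(r,th)/G_L(r0,th0)] * g_L(r) * F(r,th)."""
    r0, th0 = 1.0, math.pi / 2.0     # TG-43 reference point: 1 cm, 90 degrees
    ratio = geometry_line(r, theta, L) / geometry_line(r0, th0, L)
    return sk * dose_rate_const * ratio * g_L(r) * F(r, theta)

# Illustrative call: SK = 10 U, Lambda = 1.11 cGy/(h*U), 0.35 cm active
# length, with flat (placeholder) radial dose and anisotropy functions.
rate = tg43_dose_rate(10.0, 1.11, 2.0, math.pi / 2.0, 0.35,
                      lambda r: 1.0, lambda r, t: 1.0)
```

At 2 cm on the transverse axis the geometry-function ratio is close to the inverse-square value 1/4, as expected for a source whose active length is short compared with the distance.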

  4. Dose calculation for photon-emitting brachytherapy sources with average energy higher than 50 keV: Report of the AAPM and ESTRO

    SciTech Connect

    Perez-Calatayud, Jose; Ballester, Facundo; Das, Rupak K.; DeWerd, Larry A.; Ibbott, Geoffrey S.; Meigooni, Ali S.; Ouhib, Zoubir; Rivard, Mark J.; Sloboda, Ron S.; Williamson, Jeffrey F.

    2012-05-15

Purpose: Recommendations of the American Association of Physicists in Medicine (AAPM) and the European Society for Radiotherapy and Oncology (ESTRO) on dose calculations for high-energy (average energy higher than 50 keV) photon-emitting brachytherapy sources are presented, including the physical characteristics of specific ¹⁹²Ir, ¹³⁷Cs, and ⁶⁰Co source models. Methods: This report has been prepared by the High Energy Brachytherapy Source Dosimetry (HEBD) Working Group. This report includes considerations in the application of the TG-43U1 formalism to high-energy photon-emitting sources with particular attention to phantom size effects, interpolation accuracy dependence on dose calculation grid size, and dosimetry parameter dependence on source active length. Results: Consensus datasets for commercially available high-energy photon sources are provided, along with recommended methods for evaluating these datasets. Recommendations on dosimetry characterization methods, mainly using experimental procedures and Monte Carlo, are established and discussed. Also included are methodological recommendations on detector choice, detector energy response characterization and phantom materials, and measurement specification methodology. Uncertainty analyses are discussed and recommendations for high-energy sources without consensus datasets are given. Conclusions: Recommended consensus datasets for high-energy sources have been derived for sources that were commercially available as of January 2010. Data are presented according to the AAPM TG-43U1 formalism, with modified interpolation and extrapolation techniques of the AAPM TG-43U1S1 report for the 2D anisotropy function and radial dose function.

  5. Overview on the dosimetric uncertainty analysis for photon-emitting brachytherapy sources, in the light of the AAPM Task Group No 138 and GEC-ESTRO report

    NASA Astrophysics Data System (ADS)

    DeWerd, Larry A.; Venselaar, Jack L. M.; Ibbott, Geoffrey S.; Meigooni, Ali S.; Stump, Kurt E.; Thomadsen, Bruce R.; Rivard, Mark J.

    2012-10-01

    In 2011, the American Association of Physicists in Medicine (AAPM) and the Groupe Européen de Curiethérapie-European Society for Radiotherapy and Oncology (GEC-ESTRO) published a report pertaining to uncertainties in brachytherapy single-source dosimetry preceding clinical use. The International Organization for Standardization's Guide to the Expression of Uncertainty in Measurement and Technical Note 1297 by the National Institute of Standards and Technology are taken as reference standards for uncertainty formalism. Uncertainties involved in measurements or Monte Carlo methods to estimate brachytherapy dose distributions are provided with discussion of the components intrinsic to the overall dosimetric assessment. The uncertainty propagation from the primary calibration standard through transfer to the clinic for air-kerma strength is given with uncertainties in each of the brachytherapy dosimetry parameters of the AAPM TG-43 dose-calculation formalism. For low-energy and high-energy brachytherapy sources of low dose-rate and high dose-rate, a combined dosimetric uncertainty <5% (k = 1) is estimated, which is consistent with prior literature estimates. Recommendations are provided for clinical medical physicists, dosimetry investigators, and manufacturers of brachytherapy sources and treatment planning systems. These recommendations reflect the guidance of the AAPM and GEC-ESTRO for their members, and may also be used as guidance to manufacturers and regulatory agencies in developing good manufacturing practices for conventional brachytherapy sources used in routine clinical treatments.
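
The GUM-style combination of independent relative uncertainty components described above is addition in quadrature. A minimal sketch, with an illustrative component budget (the values are assumptions, not the report's tabulated budget):

```python
import math

# Sketch of the GUM-style quadrature combination of independent relative
# uncertainty components used in the AAPM TG-138 / GEC-ESTRO analysis.
# The component values below are illustrative assumptions, not the
# report's actual budget.

def combined_uncertainty(components_pct):
    """Combine independent relative uncertainties in quadrature (k = 1)."""
    return math.sqrt(sum(u * u for u in components_pct))

budget = {
    "SK calibration transfer to the clinic": 1.5,
    "dose-rate constant": 1.9,
    "TG-43 geometric/anisotropy parameters": 1.7,
    "TPS interpolation": 1.0,
}

u_c = combined_uncertainty(budget.values())   # combined, k = 1 (percent)
u_expanded = 2.0 * u_c                        # expanded, k = 2 (percent)
```

With these illustrative components the combined k = 1 value lands near 3%, consistent with the report's conclusion that a total dosimetric uncertainty below 5% (k = 1) is achievable.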

  6. AAPM and GEC-ESTRO guidelines for image-guided robotic brachytherapy: Report of Task Group 192

    SciTech Connect

    Podder, Tarun K.; Beaulieu, Luc; Caldwell, Barrett; Cormack, Robert A.; Crass, Jostin B.; Dicker, Adam P.; Yu, Yan; Fenster, Aaron; Fichtinger, Gabor; Meltsner, Michael A.; Moerland, Marinus A.; Nath, Ravinder; Rivard, Mark J.; Salcudean, Tim; Song, Danny Y.; Thomadsen, Bruce R.

    2014-10-15

In the last decade, there have been significant developments in the integration of robots and automation tools with brachytherapy delivery systems. These systems aim to improve the current paradigm by executing higher precision and accuracy in seed placement, improving calculation of optimal seed locations, minimizing surgical trauma, and reducing radiation exposure to medical staff. Most of the applications of this technology have been in the implantation of seeds in patients with early-stage prostate cancer. Nevertheless, the techniques apply to any clinical site where interstitial brachytherapy is appropriate. In consideration of the rapid developments in this area, the American Association of Physicists in Medicine (AAPM) commissioned Task Group 192 to review the state-of-the-art in the field of robotic interstitial brachytherapy. This is a joint Task Group with the Groupe Européen de Curiethérapie-European Society for Radiotherapy and Oncology (GEC-ESTRO). All developed and reported robotic brachytherapy systems were reviewed. Commissioning and quality assurance procedures for the safe and consistent use of these systems are also provided. Manual seed placement techniques with a rigid template have an estimated in vivo accuracy of 3–6 mm. In addition to the placement accuracy, factors such as tissue deformation, needle deviation, and edema may result in a delivered dose distribution that differs from the preimplant or intraoperative plan. However, real-time needle tracking and seed identification for dynamic updating of dosimetry may improve the quality of seed implantation. The AAPM and GEC-ESTRO recommend that robotic systems should demonstrate a spatial accuracy of seed placement ≤1.0 mm in a phantom. This recommendation is based on the current performance of existing robotic brachytherapy systems and propagation of uncertainties. During clinical commissioning, tests should be conducted to ensure that this level of accuracy is achieved. These tests

  7. AAPM and GEC-ESTRO guidelines for image-guided robotic brachytherapy: report of Task Group 192.

    PubMed

    Podder, Tarun K; Beaulieu, Luc; Caldwell, Barrett; Cormack, Robert A; Crass, Jostin B; Dicker, Adam P; Fenster, Aaron; Fichtinger, Gabor; Meltsner, Michael A; Moerland, Marinus A; Nath, Ravinder; Rivard, Mark J; Salcudean, Tim; Song, Danny Y; Thomadsen, Bruce R; Yu, Yan

    2014-10-01

In the last decade, there have been significant developments in the integration of robots and automation tools with brachytherapy delivery systems. These systems aim to improve the current paradigm by executing higher precision and accuracy in seed placement, improving calculation of optimal seed locations, minimizing surgical trauma, and reducing radiation exposure to medical staff. Most of the applications of this technology have been in the implantation of seeds in patients with early-stage prostate cancer. Nevertheless, the techniques apply to any clinical site where interstitial brachytherapy is appropriate. In consideration of the rapid developments in this area, the American Association of Physicists in Medicine (AAPM) commissioned Task Group 192 to review the state-of-the-art in the field of robotic interstitial brachytherapy. This is a joint Task Group with the Groupe Européen de Curiethérapie-European Society for Radiotherapy & Oncology (GEC-ESTRO). All developed and reported robotic brachytherapy systems were reviewed. Commissioning and quality assurance procedures for the safe and consistent use of these systems are also provided. Manual seed placement techniques with a rigid template have an estimated in vivo accuracy of 3-6 mm. In addition to the placement accuracy, factors such as tissue deformation, needle deviation, and edema may result in a delivered dose distribution that differs from the preimplant or intraoperative plan. However, real-time needle tracking and seed identification for dynamic updating of dosimetry may improve the quality of seed implantation. The AAPM and GEC-ESTRO recommend that robotic systems should demonstrate a spatial accuracy of seed placement ≤1.0 mm in a phantom. This recommendation is based on the current performance of existing robotic brachytherapy systems and propagation of uncertainties. During clinical commissioning, tests should be conducted to ensure that this level of accuracy is achieved. These tests should

  9. AAPM recommendations on dose prescription and reporting methods for permanent interstitial brachytherapy for prostate cancer: Report of Task Group 137

    SciTech Connect

    Nath, Ravinder; Bice, William S.; Butler, Wayne M.; Chen Zhe; Meigooni, Ali S.; Narayana, Vrinda; Rivard, Mark J.; Yu Yan

    2009-11-15

tumor cure probability models, are reviewed. Based on these developments in the literature, the AAPM recommends guidelines for dose prescription from a physics perspective for routine patient treatment, clinical trials, and for treatment planning software developers. The authors continue to follow the current recommendations on using D90 and V100 as the primary quantities, with more specific guidelines on the use of the imaging modalities and the timing of the imaging. The AAPM recommends that the postimplant evaluation should be performed at the optimum time for specific radionuclides. In addition, they encourage the use of a radiobiological model with a specific set of parameters to facilitate relative comparisons of treatment plans reported by different institutions using different loading patterns or radionuclides.

  10. AAPM Task Group 108: PET and PET/CT shielding requirements.

    PubMed

    Madsen, Mark T; Anderson, Jon A; Halama, James R; Kleck, Jeff; Simpkin, Douglas J; Votaw, John R; Wendt, Richard E; Williams, Lawrence E; Yester, Michael V

    2006-01-01

    The shielding of positron emission tomography (PET) and PET/CT (computed tomography) facilities presents special challenges. The 0.511 MeV annihilation photons associated with positron decay are much higher energy than other diagnostic radiations. As a result, barrier shielding may be required in floors and ceilings as well as adjacent walls. Since the patient becomes the radioactive source after the radiopharmaceutical has been administered, one has to consider the entire time that the subject remains in the clinic. In this report we present methods for estimating the shielding requirements for PET and PET/CT facilities. Information about the physical properties of the most commonly used clinical PET radionuclides is summarized, although the report primarily refers to fluorine-18. Typical PET imaging protocols are reviewed and exposure rates from patients are estimated including self-attenuation by body tissues and physical decay of the radionuclide. Examples of barrier calculations are presented for controlled and noncontrolled areas. Shielding for adjacent rooms with scintillation cameras is also discussed. Tables and graphs of estimated transmission factors for lead, steel, and concrete at 0.511 MeV are also included. Meeting the regulatory limits for uncontrolled areas can be an expensive proposition. Careful planning with the equipment vendor, facility architect, and a qualified medical physicist is necessary to produce a cost effective design while maintaining radiation safety standards. PMID:16485403
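
The patient-as-source estimate described above (inverse square, with self-attenuation and physical decay) can be sketched as a point-source calculation. The dose-rate constant and workload numbers below are illustrative assumptions, not the report's tabulated values, and self-attenuation by body tissues is deliberately omitted for brevity:

```python
import math

# Sketch of a TG-108-style estimate of the unshielded annual dose from
# PET patients: point source, inverse square, F-18 decay averaged over
# the occupancy interval. GAMMA_F18 and the workload are illustrative
# assumptions, and patient self-attenuation is neglected here.

GAMMA_F18 = 0.143              # assumed dose-rate constant, uSv*m^2/(MBq*h)
HALF_LIFE_H = 109.77 / 60.0    # F-18 half-life in hours

def decay_reduction(t_h):
    """Average of exp(-lambda*t) over [0, t]: (1 - e^-lt) / (lt)."""
    lam = math.log(2.0) / HALF_LIFE_H
    return (1.0 - math.exp(-lam * t_h)) / (lam * t_h)

def annual_dose_uSv(activity_mbq, distance_m, t_h,
                    patients_per_week=30, weeks_per_year=50):
    rate0 = GAMMA_F18 * activity_mbq / distance_m ** 2   # uSv/h at t = 0
    per_patient = rate0 * t_h * decay_reduction(t_h)     # uSv per patient
    return per_patient * patients_per_week * weeks_per_year

# Example: 400 MBq patient, 3 m from the barrier, 1 h uptake occupancy.
annual = annual_dose_uSv(activity_mbq=400.0, distance_m=3.0, t_h=1.0)
```

In a real design the result is multiplied by the barrier transmission factor (from the report's lead/steel/concrete curves at 0.511 MeV) and compared against the controlled or uncontrolled area limit to solve for the required shielding thickness.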

  11. SU-E-T-348: Verification MU Calculation for Conformal Radiotherapy with Multileaf Collimator Using Report AAPM TG 114

    SciTech Connect

    Adrada, A; Tello, Z; Medina, L; Garrigo, E; Venencia, D

    2014-06-01

Purpose: The purpose of this work was to develop and validate an open-source independent MU calculation program for 3D conformal radiotherapy with high- and low-resolution multileaf collimators, according to the report of AAPM TG 114. Methods: Treatment plans were created using the iPlan v4.5 (BrainLAB) TPS for a 6 MV photon beam produced by Primus and Novalis linear accelerators equipped with an Optifocus MLC and an HDMLC, respectively. The TPS dose calculation algorithms were pencil beam and Monte Carlo. 1082 treatment plans were selected for the study. The algorithm was written on the free and open-source Code::Blocks C++ platform. Treatment plans were imported into the software using the RTP format. The equivalent field size is obtained from the positions of the leaves; the effective calculation depth can be introduced from the TPS's dosimetry report or calculated automatically starting from the SSD. The inverse square law is calculated from the 3D coordinates of the isocenter and the normalization point of the treatment plan. The dosimetric parameters TPR, Sc, Sp, and WF are linearly interpolated. Results: 1082 plans from both machines were analyzed. The average difference between the TPS and the independent calculation was −0.43% ± 2.42% [−7.90%, 7.50%]. Specifically, the variation obtained was −0.85% ± 2.53% for the Primus and 0.00% ± 2.23% for the Novalis. The data show that in 94.8% of cases the difference was less than or equal to 5%, and in 98.9% it was less than or equal to 6%. Conclusion: The developed software is appropriate for use in independent MU calculations. The software can be obtained upon request.
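
The dosimetric factors listed in the abstract (TPR, Sc, Sp, WF, inverse square law) combine in the usual TG-114-style point-dose/MU check. A minimal sketch, with placeholder lookup values; a real check interpolates each factor from commissioning tables for the plan's equivalent field size and effective depth:

```python
# Minimal sketch of a TG-114-style independent MU check like the one the
# abstract describes. All dosimetric values below are illustrative
# placeholders, not commissioning data.

def inverse_square(scd_ref_cm, scd_calc_cm):
    """Inverse-square law factor between reference and calculation distances."""
    return (scd_ref_cm / scd_calc_cm) ** 2

def monitor_units(dose_cgy, dose_rate_ref=1.0, tpr=0.85, sc=0.99,
                  sp=1.00, wf=1.0, isl=1.0):
    """MU = D / (Dref * TPR * Sc * Sp * WF * ISL), Dref in cGy/MU."""
    return dose_cgy / (dose_rate_ref * tpr * sc * sp * wf * isl)

def percent_difference(mu_tps, mu_check):
    """Signed TPS-vs-check difference, as reported in the abstract."""
    return 100.0 * (mu_tps - mu_check) / mu_check

# Example: 200 cGy at isocenter, isocentric setup (no ISL correction).
mu = monitor_units(200.0, tpr=0.85, sc=0.99,
                   isl=inverse_square(100.0, 100.0))
```

The check value is then compared with the TPS MU and flagged when `percent_difference` exceeds the action level (5% in the abstract's analysis).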

  12. Evaluation of cassette-based digital radiography detectors using standardized image quality metrics: AAPM TG-150 Draft Image Detector Tests.

    PubMed

    Li, Guang; Greene, Travis C; Nishino, Thomas K; Willis, Charles E

    2016-09-08

The purpose of this study was to evaluate several of the standardized image quality metrics proposed by the American Association of Physicists in Medicine (AAPM) Task Group 150. The task group suggested region-of-interest (ROI)-based techniques to measure nonuniformity, minimum signal-to-noise ratio (SNR), number of anomalous pixels, and modulation transfer function (MTF). This study evaluated the effects of ROI size and layout on the image metrics by using four different ROI sets, assessed result uncertainty by repeating measurements, and compared results with two commercially available quality control tools, namely the Carestream DIRECTVIEW Total Quality Tool (TQT) and the GE Healthcare Quality Assurance Process (QAP). Seven Carestream DRX-1C (CsI) detectors on mobile DR systems and four GE FlashPad detectors in radiographic rooms were tested. Images were analyzed using MATLAB software that had been previously validated and reported. Our values for signal and SNR nonuniformity and MTF agree with values published by other investigators. Our results show that ROI size affects nonuniformity and minimum SNR measurements, but not detection of anomalous pixels. Exposure geometry affects all tested image metrics except for the MTF. TG-150 metrics in general agree with the TQT, but agree with the QAP only for local and global signal nonuniformity. The difference in SNR nonuniformity and MTF values between the TG-150 and QAP may be explained by differences in the calculation of noise and acquisition beam quality, respectively. TG-150's SNR nonuniformity metrics are also more sensitive to detector nonuniformity compared to the QAP. Our results suggest that fixed ROI size should be used for consistency because nonuniformity metrics depend on ROI size. Ideally, detector tests should be performed at the exact calibration position. If not feasible, a baseline should be established from the mean of several repeated measurements. Our study indicates that the TG-150 tests can be
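
The ROI-based uniformity analysis the study evaluates can be sketched as tiling a flat-field image into ROIs and computing a statistic over the ROI means. The exact TG-150 draft definitions differ in detail; the (max - min)/(2 x mean) statistic and the synthetic image below are illustrative assumptions only:

```python
# Sketch of an ROI-based uniformity analysis in the spirit of the TG-150
# draft tests the study evaluates. The nonuniformity definition and the
# synthetic flat-field image are illustrative, not TG-150's exact forms.

def roi_means(image, roi):
    """Mean pixel value of each non-overlapping roi x roi tile."""
    rows, cols = len(image), len(image[0])
    means = []
    for r in range(0, rows - roi + 1, roi):
        for c in range(0, cols - roi + 1, roi):
            vals = [image[i][j] for i in range(r, r + roi)
                                for j in range(c, c + roi)]
            means.append(sum(vals) / len(vals))
    return means

def global_nonuniformity(means):
    """(max - min) / (2 * mean) over the ROI means, as a fraction."""
    avg = sum(means) / len(means)
    return (max(means) - min(means)) / (2.0 * avg)

# Synthetic 8x8 "flat field" with a faint diagonal structure.
flat = [[100 + (1 if (i + j) % 7 == 0 else 0) for j in range(8)]
        for i in range(8)]
gnu = global_nonuniformity(roi_means(flat, 4))
```

Because the ROI means (and hence the statistic) change when the tile size changes, a fixed ROI size is needed for trend consistency, which is the paper's own recommendation.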

  14. Rational Protocols

    NASA Astrophysics Data System (ADS)

    Cachin, Christian

    Security research continues to provide a plethora of new protocols and mechanisms; these solutions either patch existing vulnerabilities found in practical systems or solve hypothetical security problems, in the sense that the problem is often conceived at the same time that the first solution is proposed. Yet only a very small fraction of this research is relevant to ordinary users, in the sense that they are willing to actually deploy the technology.

  15. Calculations of two new dose metrics proposed by AAPM Task Group 111 using the measurements with standard CT dosimetry phantoms

    SciTech Connect

    Li, Xinhua; Zhang, Da; Liu, Bob

    2013-08-15

    Purpose: AAPM Task Group 111 proposed to measure the equilibrium dose-pitch product D-caret{sub eq} for scan modes involving table translation and the midpoint dose D{sub L}(0) for stationary-table modes on the central and peripheral axes of sufficiently long (e.g., at least 40 cm) phantoms. This paper presents an alternative approach to calculate both metrics using the measurements of scanning the standard computed tomographic (CT) dosimetry phantoms on CT scanners.Methods: D-caret{sub eq} was calculated from CTDI{sub 100} and ε(CTDI{sub 100}) (CTDI{sub 100} efficiency), and D{sub L}(0) was calculated from D-caret{sub eq} and the approach to equilibrium function H(L) =D{sub L}(0)/D{sub eq}, where D{sub eq} was the equilibrium dose. CTDI{sub 100} may be directly obtained from several sources (such as medical physicist's CT scanner performance evaluation or the IMPACT CT patient dosimetry calculator), or be derived from CTDI{sub Vol} using the central to peripheral CTDI{sub 100} ratio (R{sub 100}). The authors have provided the required ε(CTDI{sub 100}) and H(L) data in two previous papers [X. Li, D. Zhang, and B. Liu, Med. Phys. 39, 901–905 (2012); and ibid. 40, 031903 (10pp.) (2013)]. R{sub 100} was assessed for a series of GE, Siemens, Philips, and Toshiba CT scanners with multiple settings of scan field of view, tube voltage, and bowtie filter.Results: The calculated D{sub L}(0) and D{sub L}(0)/D{sub eq} in PMMA and water cylinders were consistent with the measurements on two GE CT scanners (LightSpeed 16 and VCT) by Dixon and Ballard [Med. Phys. 34, 3399–3413 (2007)], the measurements on a Siemens CT scanner (SOMATOM Spirit Power) by Descamps et al. [J. Appl. Clin. Med. Phys. 13, 293–302 (2012)], and the Monte Carlo simulations by Boone [Med. Phys. 36, 4547–4554 (2009)].Conclusions: D-caret{sub eq} and D{sub L}(0) can be calculated using the alternative approach. The authors have provided the required ε(CTDI{sub 100}) and H(L) data in two previous
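    The calculation chain described above can be sketched in a few lines. This is an illustrative sketch only: it assumes the relations quoted in the abstract, namely that the efficiency satisfies eps = CTDI100 / D_eq and that D_L(0) = D_eq * H(L); the function names and numeric values are made up, not the authors' code or data.

```python
# Illustrative sketch of the TG-111 metric calculation described above.
# Assumes eps = CTDI100 / D_eq (the CTDI100 efficiency) and
# H(L) = D_L(0) / D_eq, per the abstract; numeric values are made up.

def equilibrium_dose(ctdi100_mgy, eps):
    """Equilibrium dose D_eq recovered from CTDI100 and its efficiency eps."""
    return ctdi100_mgy / eps

def midpoint_dose(d_eq_mgy, h_of_L):
    """Stationary-table midpoint dose D_L(0) = D_eq * H(L)."""
    return d_eq_mgy * h_of_L

d_eq = equilibrium_dose(10.0, 0.8)  # hypothetical CTDI100 = 10 mGy, eps = 0.8
d_L0 = midpoint_dose(d_eq, 0.9)     # hypothetical H(L) = 0.9 for the scan length
```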

  16. Dose calculation formalisms and consensus dosimetry parameters for intravascular brachytherapy dosimetry: Recommendations of the AAPM Therapy Physics Committee Task Group No. 149

    SciTech Connect

    Chiu-Tsao, Sou-Tung; Schaart, Dennis R.; Soares, Christopher G.; Nath, Ravinder

    2007-11-15

    Since the publication of AAPM Task Group 60 report in 1999, a considerable amount of dosimetry data for the three coronary brachytherapy systems in use in the United States has been reported. A subgroup, Task Group 149, of the AAPM working group on Special Brachytherapy Modalities (Bruce Thomadsen, Chair) was charged to develop recommendations for dose calculation formalisms and the related consensus dosimetry parameters. The recommendations of this group are presented here. For the Cordis {sup 192}Ir and Novoste {sup 90}Sr/{sup 90}Y systems, the original TG-43 formalism in spherical coordinates should be used along with the consensus values of the dose rate constant, geometry function, radial dose function, and anisotropy function for the single seeds. Contributions from the single seeds should be added linearly for the calculation of dose distributions from a source train. For the Guidant {sup 32}P wire system, the modified TG-43 formalism in cylindrical coordinates along with the recommended data for the 20 and 27 mm wires should be used. Data tables for the 6, 10, 14, 18, and 22 seed trains of the Cordis system, 30, 40, and 60 mm seed trains of the Novoste system, and the 20 and 27 mm wires of the Guidant system are presented along with our rationale and methodology for selecting the consensus data. Briefly, all available datasets were compared with each other and the consensus dataset was either an average of available data or the one obtained from the most densely populated study; in most cases this was a Monte Carlo calculation.

  17. TH-C-18A-08: A Management Tool for CT Dose Monitoring, Analysis, and Protocol Review

    SciTech Connect

    Wang, J; Chan, F; Newman, B; Larson, D; Leung, A; Fleischmann, D; Molvin, L; Marsh, D; Zorich, C; Phillips, L

    2014-06-15

    Purpose: To develop a customizable tool for enterprise-wide management of CT protocols and analysis of radiation dose information of CT exams for a variety of quality control applications. Methods: All clinical CT protocols implemented on the 11 CT scanners at our institution were extracted in digital format. The original protocols had been preset by our CT management team. A commercial CT dose tracking software (DoseWatch, GE Healthcare, WI) was used to collect exam information (exam date, patient age, etc.), scanning parameters, and radiation doses for all CT exams. We developed a MATLAB-based program (MathWorks, MA) with a graphical user interface which allows the user to analyze the scanning protocols together with the actual dose estimates, and to compare the data to national (ACR, AAPM) and internal reference values for CT quality control. Results: The CT protocol review portion of our tool allows the user to look up the scanning and image reconstruction parameters of any protocol on any of the installed CT systems, among about 120 protocols per scanner. In the dose analysis tool, dose information of all CT exams (from 05/2013 to 02/2014) was stratified on a protocol level, and within a protocol down to series level, i.e., each individual exposure event. This allows numerical and graphical review of dose information for any combination of scanner models, protocols, and series. The key functions of the tool include: statistics of CTDI, DLP, and SSDE; dose monitoring using user-set CTDI/DLP/SSDE thresholds; look-up of any CT exam dose data; and CT protocol review. Conclusion: Our in-house CT management tool provides radiologists, technologists, and administration first-hand, near real-time, enterprise-wide knowledge of CT dose levels of different exam types. Medical physicists use this tool to manage CT protocols, and to compare and optimize dose levels across different scanner models. It provides technologists feedback on CT scanning operation, and knowledge of important dose baselines and thresholds.
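    The threshold-based dose monitoring function described in the abstract can be approximated in a few lines. This is a hypothetical sketch, not the authors' MATLAB tool; the metric names and threshold values are illustrative only.

```python
# Hypothetical sketch of user-set CTDI/DLP/SSDE threshold flagging,
# in the spirit of the tool described above (not the authors' code).
THRESHOLDS = {"CTDIvol": 25.0, "DLP": 1000.0, "SSDE": 30.0}  # illustrative limits

def flag_exam(exam):
    """Return the dose metrics of one exam record that exceed their thresholds."""
    return [metric for metric, limit in THRESHOLDS.items()
            if exam.get(metric, 0.0) > limit]
```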

  18. A round-robin gamma stereotactic radiosurgery dosimetry interinstitution comparison of calibration protocols

    SciTech Connect

    Drzymala, R. E.; Alvarez, P. E.; Bednarz, G.; Bourland, J. D.; DeWerd, L. A.; Ma, L.; Meltsner, S. G.; Neyman, G.; Novotny, J.; Petti, P. L.; Rivard, M. J.; Shiu, A. S.; Goetsch, S. J.

    2015-11-15

    Purpose: Absorbed dose calibration for gamma stereotactic radiosurgery is challenging due to the unique geometric conditions, dosimetry characteristics, and nonstandard field size of these devices. Members of the American Association of Physicists in Medicine (AAPM) Task Group 178 on Gamma Stereotactic Radiosurgery Dosimetry and Quality Assurance have participated in a round-robin exchange of calibrated measurement instrumentation and phantoms exploring two approved and two proposed calibration protocols or formalisms on ten gamma radiosurgery units. The objectives of this study were to benchmark and compare new formalisms to existing calibration methods, while maintaining traceability to U.S. primary dosimetry calibration laboratory standards. Methods: Nine institutions made measurements using ten gamma stereotactic radiosurgery units in three different 160 mm diameter spherical phantoms [acrylonitrile butadiene styrene (ABS) plastic, Solid Water, and liquid water] and in air using a positioning jig. Two calibrated miniature ionization chambers and one calibrated electrometer were circulated for all measurements. Reference dose-rates at the phantom center were determined using the well-established AAPM TG-21 or TG-51 dose calibration protocols and using two proposed dose calibration protocols/formalisms: an in-air protocol and a formalism proposed by the International Atomic Energy Agency (IAEA) working group for small and nonstandard radiation fields. Each institution’s results were normalized to the dose-rate determined at that institution using the TG-21 protocol in the ABS phantom. Results: Percentages of dose-rates within 1.5% of the reference dose-rate (TG-21 + ABS phantom) for the eight chamber-protocol-phantom combinations were the following: 88% for TG-21, 70% for TG-51, 93% for the new IAEA nonstandard-field formalism, and 65% for the new in-air protocol. Averages and standard deviations for dose-rates over all measurements relative to the TG-21 + ABS

  19. Estimating peak skin and eye lens dose from neuroperfusion examinations: Use of Monte Carlo based simulations and comparisons to CTDIvol, AAPM Report No. 111, and ImPACT dosimetry tool values

    PubMed Central

    Zhang, Di; Cagnon, Chris H.; Villablanca, J. Pablo; McCollough, Cynthia H.; Cody, Dianna D.; Zankl, Maria; Demarco, John J.; McNitt-Gray, Michael F.

    2013-01-01

    Purpose: CT neuroperfusion examinations are capable of delivering high radiation dose to the skin or lens of the eyes of a patient and can possibly cause deterministic radiation injury. The purpose of this study is to: (a) estimate peak skin dose and eye lens dose from CT neuroperfusion examinations based on several voxelized adult patient models of different head size and (b) investigate how well those doses can be approximated by some commonly used CT dose metrics or tools, such as CTDIvol, American Association of Physicists in Medicine (AAPM) Report No. 111 style peak dose measurements, and the ImPACT organ dose calculator spreadsheet. Methods: Monte Carlo simulation methods were used to estimate peak skin and eye lens dose on voxelized patient models, including GSF's Irene, Frank, Donna, and Golem, on four scanners from the major manufacturers at the widest collimation under all available tube potentials. Doses were reported on a per 100 mAs basis. CTDIvol measurements for a 16 cm CTDI phantom, AAPM Report No. 111 style peak dose measurements, and ImPACT calculations were performed for available scanners at all tube potentials. These were then compared with results from Monte Carlo simulations. Results: The dose variations across the different voxelized patient models were small. Dependent on the tube potential and scanner and patient model, CTDIvol values overestimated peak skin dose by 26%–65%, and overestimated eye lens dose by 33%–106%, when compared to Monte Carlo simulations. AAPM Report No. 111 style measurements were much closer to peak skin estimates ranging from a 14% underestimate to a 33% overestimate, and with eye lens dose estimates ranging from a 9% underestimate to a 66% overestimate. The ImPACT spreadsheet overestimated eye lens dose by 2%–82% relative to voxelized model simulations. Conclusions: CTDIvol consistently overestimates dose to eye lens and skin. The ImPACT tool also overestimated dose to eye lenses. As such they are still

  20. Considerations on the practical application of the size-specific dose estimation (SSDE) method of AAPM Report 204.

    PubMed

    Noferini, Linhsia; Fulcheri, Christian; Taddeucci, Adriana; Bartolini, Marco; Gori, Cesare

    2014-07-01

    Computed tomography (CT) is responsible for much of the radiation exposure to the population for medical purposes. The technique requires high doses that vary widely from center to center, and for different scanners and radiologists as well. In order to monitor doses to patients, the American Association of Physicists in Medicine has developed the size-specific dose estimate (SSDE), which consists of the determination of patient size dependent coefficients for converting the standard dosimetric index, CTDIvol, into an estimate of the dose actually absorbed by the patient. The present work deals with issues concerning the use of SSDE in the clinical practice. First the issue regarding how much SSDE varies when, for a given CT protocol, the scan covers slightly different volumes is addressed. Then, the differences among SSDE values derived from different patient size descriptors are investigated. For these purposes, data from a clinical archive are analyzed by an automatic procedure specifically developed for SSDE.
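    The SSDE conversion itself is a simple scaling of CTDIvol by a size-dependent coefficient. The sketch below uses the exponential fit commonly quoted for the 32 cm reference phantom in AAPM Report 204; the coefficient values and the effective-diameter size descriptor should be verified against the report before any clinical use.

```python
import math

def effective_diameter(ap_cm, lat_cm):
    """Effective diameter from AP and lateral patient dimensions (a Report 204
    size descriptor): the diameter of a circle with the same area."""
    return math.sqrt(ap_cm * lat_cm)

def ssde(ctdivol_mgy, eff_diam_cm, a=3.704369, b=0.03671937):
    """SSDE = f(size) * CTDIvol, with f = a*exp(-b*d); a and b are the 32 cm
    phantom fit coefficients commonly quoted for Report 204 (verify there)."""
    return a * math.exp(-b * eff_diam_cm) * ctdivol_mgy
```

Note that f decreases with patient size, so for the same CTDIvol a smaller patient receives a larger estimated dose.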

  1. Implementation of a Lateral TBI protocol in a Mexican Cancer Center

    NASA Astrophysics Data System (ADS)

    Mesa, Francisco; Esquivel, Carlos; Eng, Tony; Papanikolaou, Niko; Sosa, Modesto A.

    2008-08-01

    The development of a Lateral Total Body Irradiation protocol to be implemented at a High Specialty Medical Unit in Mexico as a preparatory regimen for bone marrow transplant and treatment of several lymphomas is presented. This protocol was developed following AAPM specifications and has been validated for application at a cancer care center in the United States. The protocol fundamentally focuses on patient care, avoiding the instability and discomfort that may be encountered with other treatment regimes. In vivo dose verification with TLD-100 chips for each anatomical region of interest was utilized. TLD-100 chips were calibrated using a 6 MV photon beam for 10-120 cGy. Experimental results show TLD measurements with an error of less than 1%. Standard deviations for calculated and measured doses for seven patients have been obtained. Data gathered for different levels of compensation indicate that a 3% measured tolerance level is acceptable. TLD point-dose measurements have been used to verify the dose beyond partial transmission lung blocks. Dose measurements beyond the lung block showed a variation of about 50% with respect to the prescribed dose. Midplane doses to the other anatomical sites were within 2.5% of the prescribed dose.
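    As a minimal illustration of the kind of tolerance check implied by the 3% level quoted above (a hypothetical sketch, not the authors' procedure):

```python
def within_tolerance(measured_cgy, prescribed_cgy, tol=0.03):
    """Check an in vivo TLD point dose against the prescribed dose using a
    fractional tolerance (3% here, matching the level quoted above)."""
    return abs(measured_cgy - prescribed_cgy) / prescribed_cgy <= tol
```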

  3. Alternative parallel ring protocols

    NASA Technical Reports Server (NTRS)

    Mukkamala, R.; Foudriat, E. C.; Maly, Kurt J.; Kale, V.

    1990-01-01

    Communication protocols are known to influence the utilization and performance of communication networks. The effect of two token ring protocols on a gigabit network with a multiple ring structure is investigated. In the first protocol, a node sends at most one message on receiving a token. In the second protocol, a node sends all the waiting messages when a token is received. The behavior of these protocols is shown to be highly dependent on the number of rings as well as the load in the network.
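    The two service disciplines can be contrasted with a toy model: per token rotation, the first protocol sends at most one queued message per node, while the second drains each node's queue. A hypothetical sketch (not the paper's simulator):

```python
from collections import deque

def serve_single(queues):
    """One token rotation, first protocol: each node sends at most one message."""
    sent = 0
    for q in queues:
        if q:
            q.popleft()
            sent += 1
    return sent

def serve_exhaustive(queues):
    """One token rotation, second protocol: each node drains its whole queue."""
    sent = 0
    for q in queues:
        while q:
            q.popleft()
            sent += 1
    return sent
```

Under light load the two behave alike; the difference appears when nodes accumulate backlogs between token visits.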

  4. SU-D-204-07: Comparison of AAPM TG150 Draft Image Receptor Tests with Vendor Automated QC Tests for Five Mobile DR Units

    SciTech Connect

    Li, G; Nishino, T; Greene, T; Willis, C

    2015-06-15

    Purpose: To determine the consistency of digital detector (DR) tests recommended by AAPM TG150 and tests provided by the commercially available DirectView Total Quality Tool (TQT). Methods: The DR tests recommended by the TG150 Detector Subgroup [1] were performed on 4 new Carestream DRX-Revolution units and one Carestream DRX1C retrofit of a GE AMX-4 that had been in service for three years. After detector calibration, flat-field images plus images of two bar patterns, oriented parallel and perpendicular to the A-C axis, were acquired at conditions recommended by TG150. Raw images were harvested and then analyzed using MATLAB software previously validated [2,3,4]. Data were analyzed using ROIs of two different dimensions: 1) 128 x 128 ROIs matching the detector electronics; and 2) 256 x 256 ROIs, each including 4 adjacent smaller ROIs. TG150 metrics from 128 x 128 ROIs were compared to TQT metrics, which are also obtained from 128 x 128 ROIs [5]. Results: The results show that both TG150 and TQT measurements were consistent among these detectors. Differences between TG150 and TQT values appear systematic. Compared with 128 x 128 ROIs, noise and SNR non-uniformity were lower with 256 x 256 ROIs, although signal non-uniformity was similar, indicating the detectors were appropriately calibrated for gain and offset. The MTF of the retrofit unit remained essentially the same between 2012 and 2015, but was inferior to that of the new units. The older generator's focal spot is smaller (0.75 mm vs. 1.2 mm), and the SID for acquisition is 182 cm as well, so focal spot dimensions cannot explain the difference. The difference in MTF may be secondary to differences in generator X-ray spectrum or to unannounced changes in detector architecture. Further investigation is needed. Conclusion: The study shows that both TG150 and TQT tests are consistent. The numerical values of some metrics are dependent on ROI size.
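    As an illustration of the kind of ROI-based non-uniformity metric discussed above, the sketch below computes a global signal non-uniformity over a grid of square ROIs. The exact TG150 definitions differ in detail; this is a hypothetical reimplementation, not the validated MATLAB software cited in the abstract.

```python
import numpy as np

def signal_nonuniformity(image, roi=128):
    """Global signal non-uniformity: spread of per-ROI mean signal over a grid
    of roi x roi blocks, normalized by the overall mean ROI signal."""
    h, w = image.shape
    means = [image[r:r + roi, c:c + roi].mean()
             for r in range(0, h - roi + 1, roi)
             for c in range(0, w - roi + 1, roi)]
    return (max(means) - min(means)) / float(np.mean(means))
```

As the abstract notes, the value of such a metric depends on the ROI size chosen, so the `roi` parameter must be held fixed for consistent trending.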

  5. A dosimetric uncertainty analysis for photon-emitting brachytherapy sources: report of AAPM Task Group No. 138 and GEC-ESTRO.

    PubMed

    DeWerd, Larry A; Ibbott, Geoffrey S; Meigooni, Ali S; Mitch, Michael G; Rivard, Mark J; Stump, Kurt E; Thomadsen, Bruce R; Venselaar, Jack L M

    2011-02-01

    This report addresses uncertainties pertaining to brachytherapy single-source dosimetry preceding clinical use. The International Organization for Standardization (ISO) Guide to the Expression of Uncertainty in Measurement (GUM) and the National Institute of Standards and Technology (NIST) Technical Note 1297 are taken as reference standards for uncertainty formalism. Uncertainties in using detectors to measure or utilizing Monte Carlo methods to estimate brachytherapy dose distributions are provided with discussion of the components intrinsic to the overall dosimetric assessment. Uncertainties provided are based on published observations and cited when available. The uncertainty propagation from the primary calibration standard through transfer to the clinic for air-kerma strength is covered first. Uncertainties in each of the brachytherapy dosimetry parameters of the TG-43 formalism are then explored, ending with transfer to the clinic and recommended approaches. Dosimetric uncertainties during treatment delivery are considered briefly but are not included in the detailed analysis. For low- and high-energy brachytherapy sources of low dose rate and high dose rate, a combined dosimetric uncertainty <5% (k=1) is estimated, which is consistent with prior literature estimates. Recommendations are provided for clinical medical physicists, dosimetry investigators, and source and treatment planning system manufacturers. These recommendations include the use of the GUM and NIST reports, a requirement of constancy of manufacturer source design, dosimetry investigator guidelines, provision of the lowest uncertainty for patient treatment dosimetry, and the establishment of an action level based on dosimetric uncertainty. These recommendations reflect the guidance of the American Association of Physicists in Medicine (AAPM) and the Groupe Européen de Curiethérapie-European Society for Therapeutic Radiology and Oncology (GEC-ESTRO) for their members and may also be used as
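    For uncorrelated components, the quadrature combination underlying such an uncertainty budget (per the GUM) reduces to a one-liner; the component values in the example are purely illustrative.

```python
import math

def combined_standard_uncertainty(components_pct):
    """Combine independent relative standard uncertainties (k=1) in quadrature,
    as prescribed by the GUM for uncorrelated components."""
    return math.sqrt(sum(u * u for u in components_pct))

# e.g. combining illustrative 3% and 4% components gives 5% (k=1)
```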

  8. A modern Monte Carlo investigation of the TG-43 dosimetry parameters for an {sup 125}I seed already having AAPM consensus data

    SciTech Connect

    Aryal, Prakash; Molloy, Janelle A.; Rivard, Mark J.

    2014-02-15

    Purpose: To investigate potential causes for differences in TG-43 brachytherapy dosimetry parameters in the existent literature for the model IAI-125A{sup 125}I seed and to propose new standard dosimetry parameters. Methods: The MCNP5 code was used for Monte Carlo (MC) simulations. Sensitivity of dose distributions, and subsequently TG-43 dosimetry parameters, was explored to reproduce historical methods upon which American Association of Physicists in Medicine (AAPM) consensus data are based. Twelve simulation conditions varying{sup 125}I coating thickness, coating mass density, photon interaction cross-section library, and photon emission spectrum were examined. Results: Varying{sup 125}I coating thickness, coating mass density, photon cross-section library, and photon emission spectrum for the model IAI-125A seed changed the dose-rate constant by up to 0.9%, about 1%, about 3%, and 3%, respectively, in comparison to the proposed standard value of 0.922 cGy h{sup −1} U{sup −1}. The dose-rate constant values by Solberg et al. [“Dosimetric parameters of three new solid core {sup 125}I brachytherapy sources,” J. Appl. Clin. Med. Phys. 3, 119–134 (2002)], Meigooni et al. [“Experimental and theoretical determination of dosimetric characteristics of IsoAid ADVANTAGE™ {sup 125}I brachytherapy source,” Med. Phys. 29, 2152–2158 (2002)], and Taylor and Rogers [“An EGSnrc Monte Carlo-calculated database of TG-43 parameters,” Med. Phys. 35, 4228–4241 (2008)] for the model IAI-125A seed and Kennedy et al. [“Experimental and Monte Carlo determination of the TG-43 dosimetric parameters for the model 9011 THINSeed™ brachytherapy source,” Med. Phys. 37, 1681–1688 (2010)] for the model 6711 seed were +4.3% (0.962 cGy h{sup −1} U{sup −1}), +6.2% (0.98 cGy h{sup −1} U{sup −1}), +0.3% (0.925 cGy h{sup −1} U{sup −1}), and −0.2% (0.921 cGy h{sup −1} U{sup −1}), respectively, in comparison to the proposed standard

  9. Determination of the Sensibility Factors for TLD-100 Powder on the Energy of X-Ray of 50, 250 kVp; 192Ir, 137Cs and 60Co

    SciTech Connect

    Loaiza, Sandra P.; Alvarez, Jose T.

    2006-09-08

    TLD-100 powder is calibrated in terms of absorbed dose to water, Dw, using the protocols AAPM TG-61, AAPM TG-43, and IAEA TRS-398 for x-ray energies of 50 and 250 kVp, 137Cs, and 60Co, respectively. The calibration curves, TLD response R versus Dw, are fitted by weighted least squares with quadratic polynomials, which are validated with the lack-of-fit and Anderson-Darling normality tests. The slope of these curves corresponds to the sensitivity factor Fs = R/Dw, [Fs] = nC Gy-1. The expanded uncertainties U for these factors are obtained from the ANOVA tables. The Fs values are then interpolated at the effective energy for 192Ir. The SSDL sent a set of capsules with TLD-100 powder to two hospitals, which irradiated them to a nominal dose of Dw = 2 Gy. The results determined at the SSDL are: Hospital A overestimated Dw by 4.8%, and Hospital B underestimated it in the range from -1.4% to -17.5%.
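    The weighted quadratic fit described above can be sketched with NumPy; the synthetic data and coefficient values here are illustrative, not the paper's measurements.

```python
import numpy as np

def fit_tld_response(dose_gy, response_nc, weights=None):
    """Weighted least-squares quadratic fit R(D) = a*D**2 + b*D + c.
    Near the origin the linear coefficient b plays the role of the
    sensitivity factor Fs = R/Dw in nC/Gy."""
    a, b, c = np.polyfit(dose_gy, response_nc, 2, w=weights)
    return a, b, c

# Synthetic check: data generated from a known quadratic are recovered.
d = np.linspace(0.1, 1.2, 12)      # illustrative doses in Gy
r = 0.1 * d ** 2 + 5.0 * d + 0.2   # illustrative response in nC
a, b, c = fit_tld_response(d, r)
```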

  10. The RTS2 protocol

    NASA Astrophysics Data System (ADS)

    Kubánek, Petr; Jelínek, Martin; French, John; Prouza, Michal; Vítek, Stanislav; Castro-Tirado, Alberto J.; Reglero, Victor

    2008-07-01

    Remote Telescope System 2nd version (RTS2) is an open source project aimed at developing a software environment to control a fully robotic observatory. RTS2 consists of various components which communicate via an ASCII-based protocol. As the protocol was designed from the beginning for an observatory control system, it provides some unique features which are hard to find in other communication systems. These features include advanced synchronisation mechanisms and strategies for setting variables. This presentation describes the protocol and its unique features. It also assesses protocol performance and provides examples of how the RTS2 library can be used to quickly build an observatory control system.

  11. National Sample Assessment Protocols

    ERIC Educational Resources Information Center

    Ministerial Council on Education, Employment, Training and Youth Affairs (NJ1), 2012

    2012-01-01

    These protocols represent a working guide for planning and implementing national sample assessments in connection with the national Key Performance Measures (KPMs). The protocols are intended for agencies involved in planning or conducting national sample assessments and personnel responsible for administering associated tenders or contracts,…

  12. Montreal protocol: Business opportunities

    SciTech Connect

    1998-12-31

    The Montreal Protocol on Substances that Deplete the Ozone Layer was signed by 24 countries in 1987, establishing measures for controlling the production and consumption of ozone-depleting substances. This publication begins with some background information on ozone depletion and the history of the Protocol. It then describes aspects of the Protocol's Multilateral Fund, created to assist developing countries to meet Protocol deadlines: its administration, structure, and how projects are initiated. Names, addresses, and phone/fax numbers of Fund contacts are provided. Canadian projects under the Fund are then reviewed and opportunities for Canadian environmental companies are noted. Finally, information sheets are presented which summarize Fund-related Canadian bilateral projects undertaken to date.

  13. Reliable broadcast protocols

    NASA Technical Reports Server (NTRS)

    Joseph, T. A.; Birman, Kenneth P.

    1989-01-01

    A number of broadcast protocols that are reliable subject to a variety of ordering and delivery guarantees are considered. Developing applications that are distributed over a number of sites and/or must tolerate the failures of some of them becomes a considerably simpler task when such protocols are available for communication. Without such protocols the kinds of distributed applications that can reasonably be built will have a very limited scope. As the trend towards distribution and decentralization continues, it will not be surprising if reliable broadcast protocols have the same role in distributed operating systems of the future that message passing mechanisms have in the operating systems of today. On the other hand, the problems of engineering such a system remain large. For example, deciding which protocol is the most appropriate to use in a certain situation or how to balance the latency-communication-storage costs is not an easy question.

  14. Reliable multicast protocol specifications protocol operations

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Montgomery, Todd; Whetten, Brian

    1995-01-01

    This appendix contains the complete state tables for Reliable Multicast Protocol (RMP) Normal Operation, Multi-RPC Extensions, Membership Change Extensions, and Reformation Extensions. First the event types are presented. Afterwards, each RMP operation state, normal and extended, is presented individually and its events shown. Events in the RMP specification are one of several things: (1) arriving packets, (2) expired alarms, (3) user events, (4) exceptional conditions.
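    A state-table-driven protocol engine of the kind described by such state tables can be sketched as follows; the states, events, and transitions below are illustrative placeholders, not RMP's actual tables, and only the four event kinds come from the abstract.

```python
# Sketch of a state-table-driven protocol engine (illustrative, not RMP's
# actual specification). Each table entry maps (state, event) to
# (next_state, action); unlisted pairs are exceptional conditions.
STATES = {"IDLE", "SENDING", "AWAIT_ACK"}
EVENTS = {"packet", "alarm", "user", "exception"}  # the four RMP event kinds

# transition table: (state, event) -> (next_state, action name)
TABLE = {
    ("IDLE", "user"):        ("SENDING", "send_data"),
    ("SENDING", "packet"):   ("AWAIT_ACK", "record_ack_request"),
    ("AWAIT_ACK", "packet"): ("IDLE", "deliver"),
    ("AWAIT_ACK", "alarm"):  ("SENDING", "retransmit"),
}

def step(state, event):
    """Look up the transition; an unexpected event raises, mirroring an
    exceptional-condition entry in a state table."""
    try:
        return TABLE[(state, event)]
    except KeyError:
        raise ValueError(f"no transition for {event} in {state}")
```

Presenting each operation state with its events, as the appendix does, corresponds to grouping this table's rows by their first component.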

  15. Precise clock synchronization protocol

    NASA Astrophysics Data System (ADS)

    Luit, E. J.; Martin, J. M. M.

    1993-12-01

    A distributed clock synchronization protocol is presented which achieves a very high precision without the need for very frequent resynchronizations. The protocol tolerates failures of the clocks: clocks may be too slow or too fast, exhibit omission failures and report inconsistent values. Synchronization takes place in synchronization rounds as in many other synchronization protocols. At the end of each round, clock times are exchanged between the clocks. Each clock applies a convergence function (CF) to the values obtained. This function estimates the difference between its clock and an average clock and corrects its clock accordingly. Clocks are corrected for drift relative to this average clock during the next synchronization round. The protocol is based on the assumption that clock reading errors are small with respect to the required precision of synchronization. It is shown that the CF resynchronizes the clocks with high precision even when relatively large clock drifts are possible. It is also shown that the drift-corrected clocks remain synchronized until the end of the next synchronization round. The stability of the protocol is proven.
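    The end-of-round convergence-function step can be sketched as below; the trimmed-mean CF and the sample readings are assumptions for illustration, not the paper's exact function.

```python
def convergence_fn(readings, f=1):
    """Fault-tolerant averaging convergence function (sketch).

    Discards the f highest and f lowest clock readings (possibly
    faulty clocks), then averages the rest to estimate an
    'average clock' that each node adjusts toward.
    """
    s = sorted(readings)
    trimmed = s[f:len(s) - f]
    return sum(trimmed) / len(trimmed)

def correction(own_time, readings, f=1):
    """Adjustment this clock applies at the end of a synchronization round."""
    return convergence_fn(readings, f) - own_time

# Example round: four clocks, one fast outlier that the trimming discards
readings = [100.0, 100.2, 99.9, 150.0]
adj = correction(100.0, readings, f=1)
```

Drift correction during the next round would then slew the clock by `adj` gradually rather than stepping it, which is consistent with the protocol's goal of keeping clocks synchronized between rounds.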

  16. InterGroup Protocols

    2003-04-02

    Existing reliable ordered group communication protocols have been developed for local-area networks and do not in general scale well to a large number of nodes and wide-area networks. The InterGroup suite of protocols is a scalable group communication system that introduces an unusual approach to handling group membership, and supports a receiver-oriented selection of service. The protocols are intended for a wide-area network, with a large number of nodes, that has highly variable delays and a high message loss rate, such as the Internet. The levels of the message delivery service range from unreliable unordered to reliable timestamp ordered.

  17. Optimal protocols for nonlocality distillation

    SciTech Connect

    Hoeyer, Peter; Rashid, Jibran

    2010-10-15

    Forster et al. recently showed that weak nonlocality can be amplified by giving the first protocol that distills a class of nonlocal boxes (NLBs) [Phys. Rev. Lett. 102, 120401 (2009)]. We first show that their protocol is optimal among all nonadaptive protocols. We next consider adaptive protocols. We show that the depth-2 protocol of Allcock et al. [Phys. Rev. A 80, 062107 (2009)] performs better than previously known adaptive depth-2 protocols for all symmetric NLBs. We present a depth-3 protocol that extends the known region of distillable NLBs. We give examples of NLBs for which each of the Forster et al., the Allcock et al., and our protocols performs best. The understanding we develop is that there is no single optimal protocol for NLB distillation. The choice of which protocol to use depends on the noise parameters of the NLB.

  18. Comparison between the TRS-398 code of practice and the TG-51 dosimetry protocol for flattening filter free beams

    NASA Astrophysics Data System (ADS)

    Lye, J. E.; Butler, D. J.; Oliver, C. P.; Alves, A.; Lehmann, J.; Gibbons, F. P.; Williams, I. M.

    2016-07-01

    Dosimetry protocols for external beam radiotherapy currently in use, such as the IAEA TRS-398 and AAPM TG-51, were written for conventional linear accelerators. In these accelerators, a flattening filter is used to produce a beam which is uniform at water depths where the ionization chamber is used to measure the absorbed dose. Recently, clinical linacs have been implemented without the flattening filter, and published theoretical analysis suggested that with these beams a dosimetric error of order 0.6% could be expected for IAEA TRS-398, because the TPR20,10 beam quality index does not accurately predict the stopping power ratio (water to air) for the softer flattening-filter-free (FFF) beam spectra. We measured doses on eleven FFF linacs at 6 MV and 10 MV using both dosimetry protocols and found average differences of 0.2% or less. The expected shift due to stopping powers was not observed. We present Monte Carlo kQ calculations which show a much smaller difference between FFF and flattened beams than originally predicted. These results are explained by the inclusion of the added backscatter plates and build-up filters used in modern clinical FFF linacs, compared to a Monte Carlo model of an FFF linac in which the flattening filter is removed and no additional build-up or backscatter plate is added.

  19. Comparison between the TRS-398 code of practice and the TG-51 dosimetry protocol for flattening filter free beams.

    PubMed

    Lye, J E; Butler, D J; Oliver, C P; Alves, A; Lehmann, J; Gibbons, F P; Williams, I M

    2016-07-21

    Dosimetry protocols for external beam radiotherapy currently in use, such as the IAEA TRS-398 and AAPM TG-51, were written for conventional linear accelerators. In these accelerators, a flattening filter is used to produce a beam which is uniform at water depths where the ionization chamber is used to measure the absorbed dose. Recently, clinical linacs have been implemented without the flattening filter, and published theoretical analysis suggested that with these beams a dosimetric error of order 0.6% could be expected for IAEA TRS-398, because the TPR20,10 beam quality index does not accurately predict the stopping power ratio (water to air) for the softer flattening-filter-free (FFF) beam spectra. We measured doses on eleven FFF linacs at 6 MV and 10 MV using both dosimetry protocols and found average differences of 0.2% or less. The expected shift due to stopping powers was not observed. We present Monte Carlo kQ calculations which show a much smaller difference between FFF and flattened beams than originally predicted. These results are explained by the inclusion of the added backscatter plates and build-up filters used in modern clinical FFF linacs, compared to a Monte Carlo model of an FFF linac in which the flattening filter is removed and no additional build-up or backscatter plate is added. PMID:27366933

  20. WOODSTOVE DURABILITY TESTING PROTOCOL

    EPA Science Inventory

    The report discusses the development of an accelerated laboratory test to simulate in-home woodstove aging and degradation. Known as a stress test, the protocol determines the long-term durability of woodstove models in a 1- to 2-week time frame. Two avenues of research have been t...

  1. Protocols for distributive scheduling

    NASA Technical Reports Server (NTRS)

    Richards, Stephen F.; Fox, Barry

    1993-01-01

    The increasing complexity of space operations and the inclusion of interorganizational and international groups in the planning and control of space missions lead to requirements for greater communication, coordination, and cooperation among mission schedulers. These schedulers must jointly allocate scarce shared resources among the various operational and mission oriented activities while adhering to all constraints. This scheduling environment is complicated by such factors as the presence of varying perspectives and conflicting objectives among the schedulers, the need for different schedulers to work in parallel, and limited communication among schedulers. Smooth interaction among schedulers requires the use of protocols that govern such issues as resource sharing, authority to update the schedule, and communication of updates. This paper addresses the development and characteristics of such protocols and their use in a distributed scheduling environment that incorporates computer-aided scheduling tools. An example problem is drawn from the domain of space shuttle mission planning.

  2. Generalized teleportation protocol

    SciTech Connect

    Gordon, Goren; Rigolin, Gustavo

    2006-04-15

    A generalized teleportation protocol (GTP) for N qubits is presented, where the teleportation channels are nonmaximally entangled and all the free parameters of the protocol are considered: Alice's measurement basis, her sets of acceptable results, and Bob's unitary operations. The full range of fidelity (F) of the teleported state and the probability of success (P_suc) to obtain a given fidelity are achieved by changing these free parameters. A channel efficiency bound is found, where one can determine how to divide it between F and P_suc. A one-qubit formulation is presented and then expanded to N qubits. A proposed experimental setup that implements the GTP is given using linear optics.

  3. Dysphonia risk screening protocol

    PubMed Central

    Nemr, Katia; Simões-Zenari, Marcia; da Trindade Duarte, João Marcos; Lobrigate, Karen Elena; Bagatini, Flavia Alves

    2016-01-01

    OBJECTIVE: To propose and test the applicability of a dysphonia risk screening protocol with score calculation in individuals with and without dysphonia. METHOD: This descriptive cross-sectional study included 365 individuals (41 children, 142 adult women, 91 adult men and 91 seniors) divided into a dysphonic group and a non-dysphonic group. The protocol consisted of 18 questions and a score was calculated using a 10-cm visual analog scale. The measured value on the visual analog scale was added to the overall score, along with other partial scores. Speech samples allowed for analysis/assessment of the overall degree of vocal deviation and initial definition of the respective groups and after six months, the separation of the groups was confirmed using an acoustic analysis. RESULTS: The mean total scores were different between the groups in all samples. Values ranged between 37.0 and 57.85 in the dysphonic group and between 12.95 and 19.28 in the non-dysphonic group, with overall means of 46.09 and 15.55, respectively. High sensitivity and specificity were demonstrated when discriminating between the groups with the following cut-off points: 22.50 (children), 29.25 (adult women), 22.75 (adult men), and 27.10 (seniors). CONCLUSION: The protocol demonstrated high sensitivity and specificity in differentiating groups of individuals with and without dysphonia in different sample groups and is thus an effective instrument for use in voice clinics. PMID:27074171
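    The score-plus-cutoff classification can be sketched as follows; only the group cut-off values come from the abstract, while the question weights and the `risk_score`/`at_risk` names are illustrative assumptions.

```python
# Group-specific cut-off points reported in the study (score units)
CUTOFFS = {"children": 22.50, "adult_women": 29.25,
           "adult_men": 22.75, "seniors": 27.10}

def risk_score(question_scores, vas_cm):
    """Total score: partial scores from the 18 questions plus the value
    measured on the 10-cm visual analog scale (sketch; the real
    protocol's per-question weighting is not given in the abstract)."""
    return sum(question_scores) + vas_cm

def at_risk(total, group):
    """Classify a total score against the group-specific cut-off."""
    return total > CUTOFFS[group]

total = risk_score([2, 3, 1, 4], vas_cm=6.5)
flagged = at_risk(total, "children")
```

The reported sensitivity and specificity correspond to how well this thresholding separates the dysphonic and non-dysphonic groups in each sample.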

  4. Mars Communication Protocols

    NASA Technical Reports Server (NTRS)

    Kazz, G. J.; Greenberg, E.

    2000-01-01

    Over the next decade, international plans and commitments are underway to develop an infrastructure at Mars to support future exploration of the red planet. The purpose of this infrastructure is to provide reliable global communication and navigation coverage for on-approach, landed, roving, and in-flight assets at Mars. The claim is that this infrastructure will: 1) eliminate the need of these assets to carry Direct to Earth (DTE) communications equipment, 2) significantly increase data return and connectivity, 3) enable small mission exploration of Mars without DTE equipment, 4) provide precision navigation i.e., 10 to 100m position resolution, 5) supply timing reference accurate to 10ms. This paper in particular focuses on two CCSDS recommendations for that infrastructure: CCSDS Proximity-1 Space Link Protocol and CCSDS File Delivery Protocol (CFDP). A key aspect of Mars exploration will be the ability of future missions to interoperate. These protocols establish a framework for interoperability by providing standard communication, navigation, and timing services. In addition, these services include strategies to recover gracefully from communication interruptions and interference while ensuring backward compatibility with previous missions from previous phases of exploration.

  5. Satellite Communications Using Commercial Protocols

    NASA Technical Reports Server (NTRS)

    Ivancic, William D.; Griner, James H.; Dimond, Robert; Frantz, Brian D.; Kachmar, Brian; Shell, Dan

    2000-01-01

    NASA Glenn Research Center has been working with industry, academia, and other government agencies in assessing commercial communications protocols for satellite and space-based applications. In addition, NASA Glenn has been developing and advocating new satellite-friendly modifications to existing communications protocol standards. This paper summarizes recent research into the applicability of various commercial standard protocols for use over satellite and space- based communications networks as well as expectations for future protocol development. It serves as a reference point from which the detailed work can be readily accessed. Areas that will be addressed include asynchronous-transfer-mode quality of service; completed and ongoing work of the Internet Engineering Task Force; data-link-layer protocol development for unidirectional link routing; and protocols for aeronautical applications, including mobile Internet protocol routing for wireless/mobile hosts and the aeronautical telecommunications network protocol.

  6. Robust Optimization of Biological Protocols

    PubMed Central

    Flaherty, Patrick; Davis, Ronald W.

    2015-01-01

    When conducting high-throughput biological experiments, it is often necessary to develop a protocol that is both inexpensive and robust. Standard approaches are either not cost-effective or arrive at an optimized protocol that is sensitive to experimental variations. We show here a novel approach that directly minimizes the cost of the protocol while ensuring the protocol is robust to experimental variation. Our approach uses a risk-averse conditional value-at-risk criterion in a robust parameter design framework. We demonstrate this approach on a polymerase chain reaction protocol and show that our improved protocol is less expensive than the standard protocol and more robust than a protocol optimized without consideration of experimental variation. PMID:26417115
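    The risk-averse criterion can be sketched as follows; the CVaR estimator and the cost numbers are illustrative assumptions, not the paper's actual optimization framework.

```python
def cvar(costs, alpha=0.95):
    """Conditional value-at-risk: mean of the worst (1 - alpha) fraction
    of simulated protocol costs (a simple empirical sketch)."""
    s = sorted(costs)
    k = max(1, int(round((1 - alpha) * len(s))))
    tail = s[-k:]  # the k largest costs
    return sum(tail) / len(tail)

def robust_choice(protocols, alpha=0.95):
    """Pick the protocol whose cost distribution has the lowest CVaR,
    i.e. the best behaviour under experimental variation."""
    return min(protocols, key=lambda name: cvar(protocols[name], alpha))

protocols = {
    "cheap_but_fragile": [1.0, 1.1, 1.0, 9.0],  # occasional failure
    "robust":            [2.0, 2.1, 2.0, 2.2],
}
best = robust_choice(protocols, alpha=0.75)
```

Minimizing mean cost alone would favor the fragile protocol; minimizing CVaR instead penalizes the rare expensive failures, which is the trade-off the abstract describes.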

  7. Overview of wideband packet protocols

    NASA Astrophysics Data System (ADS)

    Sherif, M. H.

    1992-10-01

    Wideband packet networks operate at rates equal to, or higher than, 64 kb/s, but lower than the basic broadband rate of 150 Mb/s, on cables or satellite links. Wideband packet protocols are the transmission protocols for these networks. They define open interfaces that can be used for public and private ISDNs. The protocols are defined in CCITT Recommendations G.764 and G.765. This paper describes the objectives of the wideband protocols, and how the objectives were achieved.

  8. Communication complexity protocols for qutrits

    SciTech Connect

    Tamir, Boaz

    2007-03-15

    Consider a function whose entries are distributed among many parties. Suppose each party is allowed to send only a limited amount of information to a referee. The referee can use a classical protocol to compute the value of the global function. Is there a quantum protocol improving on the results of all classical protocols? In recent work, Brukner et al. showed the deep connection between such problems and the theory of Bell inequalities. Here we generalize the theory to trits. There, the best classical protocol fails whereas the quantum protocol yields the correct answer.

  9. Optical Circuit Switched Protocol

    NASA Technical Reports Server (NTRS)

    Monacos, Steve P. (Inventor)

    2000-01-01

    The present invention is a system and method embodied in an optical circuit switched protocol for the transmission of data through a network. The optical circuit switched protocol is an all-optical circuit switched network and includes novel optical switching nodes for transmitting optical data packets within a network. Each optical switching node comprises a detector for receiving the header, header detection logic for translating the header into routing information and eliminating the header, and a controller for receiving the routing information and configuring an all-optical path within the node. The all-optical path located within the node is solely an optical path, without electronic storage of the data and without optical delay of the data. Since electronic storage of the header is not necessary and the initial header is eliminated by the first detector of the first switching node, multiple identical headers are sent throughout the network so that subsequent switching nodes can receive and read the header for setting up an optical data path.

  10. Protocol Architecture Model Report

    NASA Technical Reports Server (NTRS)

    Dhas, Chris

    2000-01-01

    NASA's Glenn Research Center (GRC) defines and develops advanced technology for high priority national needs in communications technologies for application to aeronautics and space. GRC tasked Computer Networks and Software Inc. (CNS) to examine protocols and architectures for an In-Space Internet Node. CNS has developed a methodology for network reference models to support NASA's four mission areas: Earth Science, Space Science, Human Exploration and Development of Space (HEDS), and Aerospace Technology. This report applies the methodology to three space Internet-based communications scenarios for future missions. CNS has conceptualized, designed, and developed space Internet-based communications protocols and architectures for each of the independent scenarios. The scenarios are: Scenario 1: Unicast communications between a Low-Earth-Orbit (LEO) spacecraft in-space Internet node and a ground terminal Internet node via a Tracking and Data Relay Satellite (TDRS) transfer; Scenario 2: Unicast communications between a Low-Earth-Orbit (LEO) International Space Station and a ground terminal Internet node via a TDRS transfer; Scenario 3: Multicast Communications (or "Multicasting"), 1 Spacecraft to N Ground Receivers, N Ground Transmitters to 1 Ground Receiver via a Spacecraft.

  11. Licklider Transmission Protocol Implementation

    NASA Technical Reports Server (NTRS)

    Burleigh, Scott C.; Krupiarz, Chris

    2011-01-01

    This software is an implementation of the Licklider Transmission Protocol (LTP), a communications protocol intended to support the Bundle Protocol in Delay-Tolerant Network (DTN) operations. LTP is designed to provide retransmission-based reliability over links characterized by extremely long message round-trip times and/or frequent interruptions in connectivity. Communication in interplanetary space is the most prominent example of this sort of environment, and LTP is principally aimed at supporting long-haul reliable transmission over deep-space RF links. Like any reliable transport service employing ARQ (Automatic Repeat reQuest), LTP is stateful. In order to assure the reception of a block of data it has sent, LTP must retain for possible retransmission all portions of that block which might not have been received yet. In order to do so, it must keep track of which portions of the block are known to have been received so far, and which are not, together with any additional information needed for purposes of retransmitting part, or all, of the block. Long round-trip times mean substantial delay between the transmission of a block of data and the reception of an acknowledgement from the block's destination, signaling arrival of the block. If LTP postponed transmission of additional blocks of data until it received acknowledgement of the arrival of all prior blocks, valuable opportunities to use what little deep space transmission bandwidth is available would be forever lost. For this reason, LTP is based in part on a notion of massive state retention. Any number of requested transmission conversations (sessions) may be concurrently in flight at various displacements along the link between two LTP engines, and the LTP engines must necessarily retain transmission status and retransmission resources for all of them. Moreover, if any of the data of a given block are lost en route, it will be necessary to retain the state of that transmission during an additional
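    The "massive state retention" described above amounts to per-session bookkeeping of which parts of each block have been acknowledged, so that only the gaps need retransmission. A minimal sketch, with illustrative names rather than LTP's actual data structures:

```python
class SendSession:
    """Per-block send-side state: which byte ranges of the block the
    receiver has reported, and which gaps still need retransmission.
    (Sketch only; real LTP tracks segments, checkpoints, and timers.)"""

    def __init__(self, block_len):
        self.block_len = block_len
        self.acked = []  # sorted, disjoint (start, end) byte ranges

    def ack(self, start, end):
        """Record a reception report covering bytes [start, end)."""
        ranges = sorted(self.acked + [(start, end)])
        merged = [ranges[0]]
        for s, e in ranges[1:]:
            if s <= merged[-1][1]:                       # overlaps/abuts
                merged[-1] = (merged[-1][0], max(merged[-1][1], e))
            else:
                merged.append((s, e))
        self.acked = merged

    def gaps(self):
        """Byte ranges still unacknowledged (candidates for retransmission)."""
        out, pos = [], 0
        for s, e in self.acked:
            if pos < s:
                out.append((pos, s))
            pos = max(pos, e)
        if pos < self.block_len:
            out.append((pos, self.block_len))
        return out
```

An engine with many concurrent sessions in flight simply holds one such record per session, which is why long round-trip times translate directly into large retained state.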

  12. Clinical Protocol Information System

    PubMed Central

    Wirtschafter, David D.; Gams, Richard; Ferguson, Carol; Blackwell, William; Boackle, Paul

    1980-01-01

    The Clinical Protocol Information System (CPIS) supports the clinical research and patient care objectives of the SouthEastern Cancer Study Group (SEG). The information system goals are to improve the evaluability of clinical trials, decrease the frequency of adverse patient events, implement drug toxicity surveillance, improve the availability of study data and demonstrate the criteria for computer networks that can impact on the general medical care of the community. Nodes in the network consist of Data General MicroNova MP-100 minicomputers that drive the interactive data dialogue and communicate with the network concentrator (another DG MicroNova) in Birmingham. Functions supported include: source data editing, care “advice,” care “audit,” care “explanation,” and treatment note printing. The complete database is updated nightly and resides on UAB's IBM 370/158-AP.

  13. Indoor air quality investigation protocols

    SciTech Connect

    Greene, R.E.; Williams, P.L.

    1996-10-01

    Over the past 10 to 15 years, an increasing number of complaints about discomfort and health effects related to indoor air quality (IAQ) have been reported. The increase in complaints has been accompanied by an increase in requests for IAQ investigations. This study presents an overview of the many IAQ investigation protocols published since 1984. For analysis, the protocols are divided into four categories: solution-oriented, building diagnostics, industrial hygiene, and epidemiology. In general, the protocols begin with general observations, proceed to collect more specific data as indicated, and end with conclusions and recommendations. A generic IAQ protocol is presented that incorporates the common aspects of the various protocols. All of the current protocols place heavy emphasis on the ventilation system during the investigation. A major problem affecting all of the current protocols is the lack of generally accepted IAQ standards. In addition, the use of questionnaires, occupant interviews, and personal diaries (as well as the point in the investigation at which they are administered) differs among the protocols. Medical evaluations and verification procedures also differ among the protocols.

  14. Publishing protocols for partnered research.

    PubMed

    Hysong, Sylvia J; Woodard, LeChauncy; Garvin, Jennifer H; Murawsky, Jeffrey; Petersen, Laura A

    2014-12-01

    Published scientific protocols are advocated as a means of controlling bias in research reporting. Indeed, many journals require a study protocol with manuscript submission. However, publishing protocols of partnered research (PPR) can be challenging in light of the research model's dynamic nature, especially as no current reporting standards exist. Nevertheless, as these protocols become more prevalent, a priori documentation of methods in partnered research studies becomes increasingly important. Using as illustration a suite of studies aimed at improving coordination and communication in the primary care setting, we sought to identify challenges in publishing PPR relative to traditional designs, present alternative solutions to PPR publication, and propose an initial checklist of content to be included in protocols of partnered research. Challenges to publishing PPR include reporting details of research components intended to be co-created with operational partners, changes to sampling and entry strategy, and alignment of scientific and operational goals. Proposed solutions include emulating reporting standards of qualitative research, participatory action research, and adaptive trial designs, as well as embracing technological tools that facilitate publishing adaptive protocols, with version histories that are able to be updated as major protocol changes occur. Finally, we present a proposed checklist of reporting elements for partnered research protocols. PMID:25355092

  15. The simulation of communication protocols

    NASA Astrophysics Data System (ADS)

    de Carvalho Viola, F. E.

    1985-12-01

    Simulators for communication protocols specified in the ESTELLE and LC/1 languages are developed. The general principles of protocol simulation are reviewed; ESTELLE and LC/1 are characterized; a prototype LC/1 simulator based on predicate Petri nets is described; and a detailed specification for a driving interface for an ESTELLE simulator is given.

  17. Protocols.io: Virtual Communities for Protocol Development and Discussion.

    PubMed

    Teytelman, Leonid; Stoliartchouk, Alexei; Kindler, Lori; Hurwitz, Bonnie L

    2016-08-01

    The detailed know-how to implement research protocols frequently remains restricted to the research group that developed the method or technology. This knowledge often exists at a level that is too detailed for inclusion in the methods section of scientific articles. Consequently, methods are not easily reproduced, leading to a loss of time and effort by other researchers. The challenge is to develop a method-centered collaborative platform to connect with fellow researchers and discover state-of-the-art knowledge. Protocols.io is an open-access platform for detailing, sharing, and discussing molecular and computational protocols that can be useful before, during, and after publication of research results.

  18. Trial of a proposed protocol for constancy control of digital mammography systems

    SciTech Connect

    Pedersen, Kristin; Landmark, Ingrid Dypvik

    2009-12-15

    Purpose: Evaluate the utility of tests in a proposed protocol for constancy control of digital mammography systems. Methods: The protocol contained tests for image acquisition, mechanical function and safety, monitors and printers, and viewing conditions. Nine sites with digital systems from four equipment manufacturers were recruited. Dedicated PMMA test objects and Excel spreadsheets were developed. Quantitative measurements were done on processed images for systems where these images were the ones most readily available. For daily assessment of the automatic exposure control system, a homogeneous PMMA phantom was exposed under clinical conditions. The mAs and signal to noise ratio (SNR) were recorded, the deviation from a target value calculated, and the resulting image inspected for artifacts. For thickness tracking, the signal difference to noise ratio obtained for three thicknesses was calculated. Detector uniformity was assessed through comparison of SNR values for regions of interest in the center and corners of an image of a homogeneous test object. Mechanical function and safety control included a compression test, a checklist for mechanical aspects, and control of field alignment. Monitor performance was evaluated by visual inspection of the AAPM TG 18 QC test image [E. Samei et al., "Assessment of display performance for medical imaging systems," Task Group 18 (Madison, WI, April 2005)]. Results: For quantitative parameters, target values and tolerance limits were established. Test results exceeding the limits were registered. Most systems exhibited stable mAs values, indicating that the tolerance limit of ±10% was readily achievable. The SNR also showed little variation, indicating that the tolerance limit of ±20% was too wide. At one site, a defective grid caused artifacts that were visible in the test images. The monitor controls proved more difficult to implement due to both difficulties importing and displaying the test image, and the
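    The daily deviation-from-target check can be sketched as follows; the tolerance values (±10% for mAs, ±20% for SNR) are those discussed in the abstract, while the function names and readings are illustrative.

```python
def deviation_pct(measured, target):
    """Percent deviation of a constancy-test reading from its target value."""
    return 100.0 * (measured - target) / target

def within_tolerance(measured, target, tol_pct):
    """True if the reading passes its tolerance limit
    (e.g. tol_pct=10 for mAs, tol_pct=20 for SNR)."""
    return abs(deviation_pct(measured, target)) <= tol_pct

# Illustrative daily readings against previously established targets
mAs_ok = within_tolerance(measured=104.0, target=100.0, tol_pct=10)
snr_ok = within_tolerance(measured=55.0, target=70.0, tol_pct=20)
```

A spreadsheet implementing the protocol would apply exactly this comparison to each day's mAs and SNR entries and flag any result exceeding its limit.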

  19. Absorbed dose to water reference dosimetry using solid phantoms in the context of absorbed-dose protocols

    SciTech Connect

    Seuntjens, Jan; Olivares, Marina; Evans, Michael; Podgorsak, Ervin

    2005-09-15

    For reasons of phantom material reproducibility, the absorbed dose protocols of the American Association of Physicists in Medicine (AAPM) (TG-51) and the International Atomic Energy Agency (IAEA) (TRS-398) have made the use of liquid water as a phantom material for reference dosimetry mandatory. In this work we provide a formal framework for the measurement of absorbed dose to water using ionization chambers calibrated in terms of absorbed dose to water but irradiated in solid phantoms. Such a framework is useful when there is a desire to put dose measurements using solid phantoms on an absolute basis. Putting solid phantom measurements on an absolute basis has distinct advantages in verification measurements and quality assurance. We introduce a phantom dose conversion factor that converts a measurement made in a solid phantom and analyzed using an absorbed dose calibration protocol into absorbed dose to water under reference conditions. We provide techniques to measure and calculate the dose transfer from solid phantom to water. For an Exradin A12 ionization chamber, we measured and calculated the phantom dose conversion factor for six Solid Water™ phantoms and for a single Lucite phantom for photon energies between ⁶⁰Co and 18 MV photons. For Solid Water™ of certified grade, the difference between measured and calculated factors varied between 0.0% and 0.7% with the average dose conversion factor being low by 0.4% compared with the calculation whereas for Lucite, the agreement was within 0.2% for the one phantom examined. The composition of commercial plastic phantoms and their homogeneity may not always be reproducible and consistent with assumed composition. By comparing measured and calculated phantom conversion factors, our work provides methods to verify the consistency of a given plastic for the purpose of clinical reference dosimetry.

  20. [Diagnostic protocol and special tests].

    PubMed

    Bellia, M; Pennarola, R

    2008-01-01

Diagnostic protocols and special tests have a preventive function in the medical surveillance of workers exposed to ionizing radiation. The protocol must include laboratory and special tests for assessing fitness to work at risk of exposure to ionizing radiation. The health of workers must be compatible with working conditions and radiation risk, and it is evaluated over time to support the fitness-for-work assessment. For this purpose, the basic diagnostic protocol must guarantee minimum information about the state of organs and systems, in addition to the normality of metabolic function. The diagnostic protocol for a worker exposed to ionizing radiation must be adapted to the specific clinical situation, so that a cost-benefit assessment can finally be made. PMID:19288808

  1. EPA Protocol Gas Verification Program

    EPA Science Inventory

    Accurate compressed gas calibration standards are needed to calibrate continuous emission monitors (CEMs) and ambient air quality monitors that are being used for regulatory purposes. US Environmental Protection Agency (EPA) established its traceability protocol to ensure that co...

  2. QUALITY CONTROL - VARIABILITY IN PROTOCOLS

    EPA Science Inventory

    The EPA Risk Reduction Engineering Laboratory’s Quality Assurance Office, which published the popular pocket guide Preparing Perfect Project Plans, is now introducing another quality assurance reference aid. The document Variability in Protocols (VIP) was initially designed as a ...

  3. Protocols.io: Virtual Communities for Protocol Development and Discussion

    PubMed Central

Teytelman, Leonid; Stoliartchouk, Alexei; Kindler, Lori; Hurwitz, Bonnie L.

    2016-01-01

    The detailed know-how to implement research protocols frequently remains restricted to the research group that developed the method or technology. This knowledge often exists at a level that is too detailed for inclusion in the methods section of scientific articles. Consequently, methods are not easily reproduced, leading to a loss of time and effort by other researchers. The challenge is to develop a method-centered collaborative platform to connect with fellow researchers and discover state-of-the-art knowledge. Protocols.io is an open-access platform for detailing, sharing, and discussing molecular and computational protocols that can be useful before, during, and after publication of research results. PMID:27547938

  4. Protocols.io: Virtual Communities for Protocol Development and Discussion.

    PubMed

    Teytelman, Leonid; Stoliartchouk, Alexei; Kindler, Lori; Hurwitz, Bonnie L

    2016-08-01

    The detailed know-how to implement research protocols frequently remains restricted to the research group that developed the method or technology. This knowledge often exists at a level that is too detailed for inclusion in the methods section of scientific articles. Consequently, methods are not easily reproduced, leading to a loss of time and effort by other researchers. The challenge is to develop a method-centered collaborative platform to connect with fellow researchers and discover state-of-the-art knowledge. Protocols.io is an open-access platform for detailing, sharing, and discussing molecular and computational protocols that can be useful before, during, and after publication of research results. PMID:27547938

  5. SU-E-J-113: The Influence of Optimizing Pediatric CT Simulator Protocols On the Treatment Dose Calculation in Radiotherapy

    SciTech Connect

    Zhang, Y; Zhang, J; Hu, Q; Tie, J; Wu, H; Deng, J

    2014-06-01

Purpose: To investigate the possibility of applying optimized scanning protocols for pediatric CT simulation by quantifying the dosimetric inaccuracy introduced by using a fixed HU-to-density conversion. Methods: Images of a CIRS electron density reference phantom (Model 062) were acquired on a Siemens CT simulator (Sensation Open) using the following tube voltage and beam current settings: 120 kV/190 mA (the reference protocol used to calibrate CT numbers for our treatment planning system (TPS)); fixed 190 mA combined with all other available kV settings (80, 100, and 140); and fixed 120 kV with currents from 37 to 444 mA (the scanner extremes) in 30 mA intervals. To avoid the HU uncertainty of point sampling in the various inserts of known electron densities, the mean CT numbers of the central cylindrical volume were calculated using DICOMan software. The doses per 100 MU to the reference point (SAD = 100 cm, depth = 10 cm, field = 10×10 cm, 6 MV photon beam) in a virtual cubic phantom (30×30×30 cm) were calculated using the Eclipse TPS (calculation model: AcurosXB-11031) by assigning the CT numbers acquired with the various protocols to the HU values of typical materials. Results: For the inserts with densities less than muscle, CT number fluctuations of all protocols were within the 10 HU tolerance accepted by AAPM TG-66. For more condensed materials, fixed kV yielded stable HU with any mA combination; the largest disparities were found in the 1750 mg/cc insert: HU_reference = 1801 (106.6 cGy), HU_minimum = 1799 (106.6 cGy, dose error = 0.00%), HU_maximum = 1815 (106.8 cGy, dose error = 0.19%). Yet greater disagreements were observed with increasing density when kV was modified: HU_minimum = 1646 (104.5 cGy, dose error = -1.97%), HU_maximum = 2487 (116.4 cGy, dose error = 9.19%) in the 1750 mg/cc insert.
Conclusion: Personalized mA optimization of a CT simulator can be conducted at fixed kV without affecting treatment dose calculation, improving the cost-effectiveness of imaging dose and image quality.
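The reported dose-error percentages follow directly from the per-100-MU doses quoted in the abstract, taking the reference protocol's 106.6 cGy as the baseline:

```python
# Reproducing the abstract's dose-error percentages from its quoted
# doses per 100 MU at the reference point (reference protocol: 106.6 cGy).
REF_DOSE_CGY = 106.6

def dose_error_pct(dose_cgy, reference=REF_DOSE_CGY):
    """Percent deviation of a calculated dose from the reference dose."""
    return 100.0 * (dose_cgy - reference) / reference

print(round(dose_error_pct(116.4), 2))  # HU = 2487 after kV change → 9.19
print(round(dose_error_pct(104.5), 2))  # HU = 1646 after kV change → -1.97
```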

  6. Lightweight Distance Bounding Protocol against Relay Attacks

    NASA Astrophysics Data System (ADS)

    Kim, Jin Seok; Cho, Kookrae; Yum, Dae Hyun; Hong, Sung Je; Lee, Pil Joong

    Traditional authentication protocols are based on cryptographic techniques to achieve identity verification. Distance bounding protocols are an enhanced type of authentication protocol built upon both signal traversal time measurement and cryptographic techniques to accomplish distance verification as well as identity verification. A distance bounding protocol is usually designed to defend against the relay attack and the distance fraud attack. As there are applications to which the distance fraud attack is not a serious threat, we propose a streamlined distance bounding protocol that focuses on the relay attack. The proposed protocol is more efficient than previous protocols and has a low false acceptance rate under the relay attack.
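The core timing check behind any distance bounding protocol can be sketched in a few lines. This is illustrative only: real protocols (including the one proposed here) interleave many rapid single-bit challenge/response exchanges with cryptographic commitments, and the names below are assumptions:

```python
# Minimal sketch of the timing side of a distance bounding round.
# A relay attack adds extra signal flight time, so the implied distance
# exceeds the verifier's bound and the round is rejected.
C = 299_792_458.0  # speed of light in vacuum, m/s

def within_bound(rtt_s, processing_s, max_distance_m):
    """Accept iff the round-trip time, minus the prover's processing
    time, implies a one-way distance at or below the allowed bound."""
    flight = rtt_s - processing_s
    if flight < 0:
        return False  # implausible timing: reject
    return (flight * C) / 2.0 <= max_distance_m

# Direct response from a nearby prover vs. a relayed response:
print(within_bound(rtt_s=100e-9, processing_s=50e-9, max_distance_m=10))  # True
print(within_bound(rtt_s=500e-9, processing_s=50e-9, max_distance_m=10))  # False
```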

  7. A Simple XML Producer-Consumer Protocol

    NASA Technical Reports Server (NTRS)

    Smith, Warren; Gunter, Dan; Quesnel, Darcy

    2000-01-01

This document describes a simple XML-based protocol that can be used for producers of events to communicate with consumers of events. The protocol described here is not meant to be the most efficient protocol, the most logical protocol, or the best protocol in any way. This protocol was defined quickly and its intent is to give us a reasonable protocol that we can implement relatively easily and then use to gain experience in distributed event services. This experience will help us evaluate proposals for event representations, XML-based encoding of information, and communication protocols. The next section of this document describes how we represent events in this protocol and then defines the two events that we choose to use for our initial experiments. These definitions are made by example so that they are informal and easy to understand. The following section then proceeds to define the producer-consumer protocol we have agreed upon for our initial experiments.
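A toy version of the kind of XML event message such a producer-consumer protocol might carry can be built with the standard library. The `<event>`/`<param>` schema here is invented for illustration; the memo defines its own event formats by example:

```python
# Hypothetical XML event encoding/decoding for a producer-consumer
# exchange (the schema is an illustrative assumption, not the memo's).
import xml.etree.ElementTree as ET

def make_event(name, source, **params):
    """Producer side: serialize an event with named parameters."""
    ev = ET.Element("event", name=name, source=source)
    for key, value in params.items():
        ET.SubElement(ev, "param", name=key, value=str(value))
    return ET.tostring(ev, encoding="unicode")

def parse_event(xml_text):
    """Consumer side: recover (name, source, params) from the XML."""
    ev = ET.fromstring(xml_text)
    params = {p.get("name"): p.get("value") for p in ev.findall("param")}
    return ev.get("name"), ev.get("source"), params

msg = make_event("job.started", "node42", pid=1234)
print(parse_event(msg))  # → ('job.started', 'node42', {'pid': '1234'})
```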

  8. Developing protocols for obstetric emergencies.

    PubMed

    Roth, Cheryl K; Parfitt, Sheryl E; Hering, Sandra L; Dent, Sarah A

    2014-01-01

    There is potential for important steps to be missed in emergency situations, even in the presence of many health care team members. Developing a clear plan of response for common emergencies can ensure that no tasks are redundant or omitted, and can create a more controlled environment that promotes positive health outcomes. A multidisciplinary team was assembled in a large community hospital to create protocols that would help ensure optimum care and continuity of practice in cases of postpartum hemorrhage, shoulder dystocia, emergency cesarean surgical birth, eclamptic seizure and maternal code. Assignment of team roles and responsibilities led to the evolution of standardized protocols for each emergency situation.

  9. FTP Extensions for Variable Protocol Specification

    NASA Technical Reports Server (NTRS)

    Allman, Mark; Ostermann, Shawn

    2000-01-01

The specification for the File Transfer Protocol (FTP) assumes that the underlying network protocols use a 32-bit network address and a 16-bit transport address (specifically IP version 4 and TCP). With the deployment of version 6 of the Internet Protocol, network addresses will no longer be 32 bits. This paper specifies extensions to FTP that will allow the protocol to work over a variety of network and transport protocols.
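The shape of such an extension can be seen in the address-family-tagged port command standardized as EPRT in RFC 2428 (by the same authors), which replaces the IPv4-only PORT command. A minimal formatter, using RFC 2428's example addresses:

```python
# Sketch of an address-family-tagged FTP port command in the style of
# EPRT (RFC 2428), which removes the hard-coded 32-bit-address assumption.
def format_eprt(family, address, port, delim="|"):
    """family: 1 = IPv4, 2 = IPv6 (RFC 2428 address-family numbers)."""
    return f"EPRT {delim}{family}{delim}{address}{delim}{port}{delim}"

print(format_eprt(1, "132.235.1.2", 6275))
# → EPRT |1|132.235.1.2|6275|
print(format_eprt(2, "1080::8:800:200C:417A", 5282))
# → EPRT |2|1080::8:800:200C:417A|5282|
```

Because the address is a delimited, family-tagged string rather than six comma-separated byte values, the same command syntax accommodates any future address family.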

  10. FIELD SAMPLING PROTOCOLS AND ANALYSIS

    EPA Science Inventory

    I have been asked to speak again to the environmental science class regarding actual research scenarios related to my work at Kerr Lab. I plan to discuss sampling protocols along with various field analyses performed during sampling activities. Many of the students have never see...

  11. A Student Teamwork Induction Protocol

    ERIC Educational Resources Information Center

    Kamau, Caroline; Spong, Abigail

    2015-01-01

    Faulty group processes have harmful effects on performance but there is little research about intervention protocols to pre-empt them in higher education. This naturalistic experiment compared a control cohort with an inducted cohort. The inducted cohort attended a workshop, consultations, elected a leader and used tools (a group log and group…

  12. Bundle Security Protocol for ION

    NASA Technical Reports Server (NTRS)

    Burleigh, Scott C.; Birrane, Edward J.; Krupiarz, Christopher

    2011-01-01

    This software implements bundle authentication, conforming to the Delay-Tolerant Networking (DTN) Internet Draft on Bundle Security Protocol (BSP), for the Interplanetary Overlay Network (ION) implementation of DTN. This is the only implementation of BSP that is integrated with ION.

  13. SU-E-I-34: Evaluating Use of AEC to Lower Dose for Lung Cancer Screening CT Protocols

    SciTech Connect

    Arbique, G; Anderson, J; Guild, J; Duan, X; Malguria, N; Omar, H; Brewington, C; Zhang, D

    2015-06-15

Purpose: The National Lung Screening Trial mandated manual low dose CT technique factors, where up to a doubling of radiation output could be used over a regular to large patient size range. Recent guidance from the AAPM and ACR for lung cancer CT screening recommends radiation output adjustment for patient size either through AEC or a manual technique chart. This study evaluated the use of AEC for output control and dose reduction. Methods: The study was performed on a multidetector helical CT scanner (Aquilion ONE, Toshiba Medical) equipped with iterative reconstruction (AIDR 3D); AEC was adjusted with a standard deviation (SD) image quality noise index. The protocol SD parameter was incrementally increased to reduce patient population dose while image quality was evaluated by radiologist readers scoring the clinical utility of images on a Likert scale. Results: Plots of effective dose vs. body size (water cylinder diameter reported by the scanner) demonstrate monotonic increase in patient dose with increasing patient size. At the initial SD setting of 19 the average CTDIvol for a standard size patient was ∼2.0 mGy (1.2 mSv effective dose). This was reduced to ∼1.0 mGy (0.5 mSv) at an SD of 25 with no noticeable reduction in clinical utility of images as demonstrated by Likert scoring. Plots of effective patient diameter and BMI vs. body size indicate that these metrics could also be used for manual technique charts. Conclusion: AEC offered consistent and reliable control of radiation output in this study. Dose for a standard size patient was reduced to one-third of the 3 mGy CTDIvol limit required for ACR accreditation of lung cancer CT screening. Gary Arbique: Research Grant, Toshiba America Medical Systems; Cecelia Brewington: Research Grant, Toshiba America Medical Systems; Di Zhang: Employee, Toshiba America Medical Systems.

  14. Implementation and evaluation of a protocol management system for automated review of CT protocols.

    PubMed

    Grimes, Joshua; Leng, Shuai; Zhang, Yi; Vrieze, Thomas; McCollough, Cynthia

    2016-09-08

Protocol review is important to decrease the risk of patient injury and increase the consistency of CT image quality. A large volume of CT protocols makes manual review labor-intensive, error-prone, and costly. To address these challenges, we have developed a software system for automatically managing and monitoring CT protocols on a frequent basis. This article describes our experiences in the implementation and evaluation of this protocol monitoring system. In particular, we discuss various strategies for addressing each of the steps in our protocol-monitoring workflow, which are: maintaining an accurate set of master protocols, retrieving protocols from the scanners, comparing scanner protocols to master protocols, reviewing flagged differences between the scanner and master protocols, and updating the scanner and/or master protocols. In our initial evaluation focusing only on abdomen and pelvis protocols, we detected 309 modified protocols in a 24-week trial period. About one-quarter of these modified protocols were determined to contain inappropriate (i.e., erroneous) protocol parameter modifications that needed to be corrected on the scanner. The most frequently affected parameter was the series description, which was inappropriately modified 47 times. Two inappropriate modifications were made to the tube current, which is particularly important to flag as this parameter impacts both radiation dose and image quality. The CT protocol changes detected in this work provide strong motivation for the use of an automated CT protocol quality control system to ensure protocol accuracy and consistency.
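The comparison step of such a workflow, flagging any scanner parameter that differs from the master copy, reduces to a dictionary diff. The field names below are illustrative, not the system's actual schema:

```python
# Minimal sketch of the "compare scanner protocol to master protocol"
# step of an automated CT protocol review workflow (field names are
# illustrative assumptions).
def diff_protocol(master, scanner):
    """Return {parameter: (master_value, scanner_value)} for every
    parameter that differs (or exists on only one side)."""
    keys = set(master) | set(scanner)
    return {k: (master.get(k), scanner.get(k))
            for k in keys if master.get(k) != scanner.get(k)}

master = {"series_description": "ABD/PEL W", "kV": 120, "mA_mod": "auto"}
scanner = {"series_description": "ABD/PEL W C-", "kV": 120, "mA_mod": "auto"}
print(diff_protocol(master, scanner))
# → {'series_description': ('ABD/PEL W', 'ABD/PEL W C-')}
```

Flagged differences would then go to a human reviewer, who either corrects the scanner or promotes the change into the master protocol.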

  15. Implementation and evaluation of a protocol management system for automated review of CT protocols.

    PubMed

    Grimes, Joshua; Leng, Shuai; Zhang, Yi; Vrieze, Thomas; McCollough, Cynthia

    2016-01-01

Protocol review is important to decrease the risk of patient injury and increase the consistency of CT image quality. A large volume of CT protocols makes manual review labor-intensive, error-prone, and costly. To address these challenges, we have developed a software system for automatically managing and monitoring CT protocols on a frequent basis. This article describes our experiences in the implementation and evaluation of this protocol monitoring system. In particular, we discuss various strategies for addressing each of the steps in our protocol-monitoring workflow, which are: maintaining an accurate set of master protocols, retrieving protocols from the scanners, comparing scanner protocols to master protocols, reviewing flagged differences between the scanner and master protocols, and updating the scanner and/or master protocols. In our initial evaluation focusing only on abdomen and pelvis protocols, we detected 309 modified protocols in a 24-week trial period. About one-quarter of these modified protocols were determined to contain inappropriate (i.e., erroneous) protocol parameter modifications that needed to be corrected on the scanner. The most frequently affected parameter was the series description, which was inappropriately modified 47 times. Two inappropriate modifications were made to the tube current, which is particularly important to flag as this parameter impacts both radiation dose and image quality. The CT protocol changes detected in this work provide strong motivation for the use of an automated CT protocol quality control system to ensure protocol accuracy and consistency. PMID:27685112

  16. Chapter 14: Chiller Evaluation Protocol

    SciTech Connect

    Tiessen, A.

    2014-09-01

    This protocol defines a chiller measure as a project that directly impacts equipment within the boundary of a chiller plant. A chiller plant encompasses a chiller--or multiple chillers--and associated auxiliary equipment. This protocol primarily covers electric-driven chillers and chiller plants. It does not include thermal energy storage and absorption chillers fired by natural gas or steam, although a similar methodology may be applicable to these chilled water system components. Chillers provide mechanical cooling for commercial, institutional, multiunit residential, and industrial facilities. Cooling may be required for facility heating, ventilation, and air conditioning systems or for process cooling loads (e.g., data centers, manufacturing process cooling). The vapor compression cycle, or refrigeration cycle, cools water in the chilled water loop by absorbing heat and rejecting it to either a condensing water loop (water cooled chillers) or to the ambient air (air-cooled chillers).

  17. Neonatal euthanasia: The Groningen Protocol.

    PubMed

    Vizcarrondo, Felipe E

    2014-11-01

    For the past thirty years, voluntary euthanasia and physician-assisted suicide of adult patients have been common practice in the Netherlands. Neonatal euthanasia was recently legalized in the Netherlands and the Groningen Protocol (GP) was developed to regulate the practice. Supporters claim compliance with the GP criteria makes neonatal euthanasia ethically permissible. An examination of the criteria used by the Protocol to justify the euthanasia of seriously ill neonates reveals the criteria are not based on firm moral principles. The taking of the life of a seriously ill person is not the solution to the pain and suffering of the dying process. It is the role of the medical professional to care for the ailing patient with love and compassion, always preserving the person's dignity. Neonatal euthanasia is not ethically permissible. PMID:25473136

  18. Building America House Simulation Protocols

    SciTech Connect

    Hendron, Robert; Engebrecht, Cheryn

    2010-09-01

    The House Simulation Protocol document was developed to track and manage progress toward Building America's multi-year, average whole-building energy reduction research goals for new construction and existing homes, using a consistent analytical reference point. This report summarizes the guidelines for developing and reporting these analytical results in a consistent and meaningful manner for all home energy uses using standard operating conditions.

  19. Multiple protocol fluorometer and method

    DOEpatents

    Kolber, Zbigniew S.; Falkowski, Paul G.

    2000-09-19

A multiple protocol fluorometer measures photosynthetic parameters of phytoplankton and higher plants using actively stimulated fluorescence protocols. The measured parameters include spectrally-resolved functional and optical absorption cross sections of PSII, extent of energy transfer between reaction centers of PSII, F0 (minimal), Fm (maximal) and Fv (variable) components of PSII fluorescence, photochemical and non-photochemical quenching, size of the plastoquinone (PQ) pool, and the kinetics of electron transport between Qa and the PQ pool and between the PQ pool and PSI. The multiple protocol fluorometer, in one embodiment, is equipped with an excitation source having a controlled spectral output range between 420 nm and 555 nm and capable of generating flashlets having a duration of 0.125-32 μs, an interval between 0.5 μs and 2 seconds, and peak optical power of up to 2 W/cm². The excitation source is also capable of generating, simultaneous with the flashlets, a controlled continuous background illumination.

  20. The reliable multicast protocol application programming interface

    NASA Technical Reports Server (NTRS)

Montgomery, Todd; Whetten, Brian

    1995-01-01

    The Application Programming Interface for the Berkeley/WVU implementation of the Reliable Multicast Protocol is described. This transport layer protocol is implemented as a user library that applications and software buses link against.

  1. Integrating protocol schedules with patients' personal calendars.

    PubMed

    Civan, Andrea; Gennari, John H; Pratt, Wanda

    2006-01-01

    We propose a new approach for integrating protocol care schedules into patients' personal calendars. This approach could provide patients with greater control over their current and future scheduling demands as they seek and receive protocol-based care. PMID:17238511

  2. A Wiki Based CT Protocol Management System.

    PubMed

    Szczykutowicz, Timothy P; Rubert, Nicholas; Belden, Daryn; Ciano, Amanda; Duplissis, Andrew; Hermanns, Ashley; Monette, Stephen; Saldivar, Elliott Janssen

    2015-01-01

At the University of Wisconsin Madison Department of Radiology, CT protocol management requires maintenance of thousands of parameters for each scanner. Managing CT protocols is further complicated by the unique configurability of each scanner. Due to recent Joint Commission requirements, all CT protocol changes must now be documented and reviewed by a site's CT protocol optimization team. The difficulty of managing the CT protocols was not in assembling the protocols, but in managing and implementing changes. This is why a wiki-based solution for protocol management was implemented. A wiki inherently keeps track of all changes, logging who made the changes and when, allowing editing and viewing permissions to be controlled, and allowing protocol changes to be instantly relayed to all scanner locations.

  3. Developing family planning nurse practitioner protocols.

    PubMed

    Hawkins, J W; Roberto, D

    1984-01-01

    This article focuses on the process of development of protocols for family planning nurse practitioners. A rationale for the use of protocols, a definition of the types and examples, and the pros and cons of practice with protocols are presented. A how-to description for the development process follows, including methods and a suggested tool for critique and evaluation. The aim of the article is to assist nurse practitioners in developing protocols for their practice.

  4. 40 CFR 792.120 - Protocol.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 33 2012-07-01 2012-07-01 false Protocol. 792.120 Section 792.120 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT (CONTINUED) GOOD LABORATORY PRACTICE STANDARDS Protocol for and Conduct of A Study § 792.120 Protocol. (a)...

  5. Cryptanalysis on Cheng et al. protocol

    NASA Astrophysics Data System (ADS)

    Thakur, Tejeshwari

    2016-06-01

Deployment of a new node in a wireless sensor network is a sensitive task, which is why an access control protocol is required in a WSN. In this paper, we demonstrate that the Access Control Protocol proposed by Cheng et al. [1] for wireless sensor networks is insecure: the protocol fails to resist an active attack.

  6. 21 CFR 312.83 - Treatment protocols.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 5 2013-04-01 2013-04-01 false Treatment protocols. 312.83 Section 312.83 Food...-debilitating Illnesses § 312.83 Treatment protocols. If the preliminary analysis of phase 2 test results appears promising, FDA may ask the sponsor to submit a treatment protocol to be reviewed under...

  7. 21 CFR 312.83 - Treatment protocols.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 5 2011-04-01 2011-04-01 false Treatment protocols. 312.83 Section 312.83 Food...-debilitating Illnesses § 312.83 Treatment protocols. If the preliminary analysis of phase 2 test results appears promising, FDA may ask the sponsor to submit a treatment protocol to be reviewed under...

  8. 21 CFR 312.83 - Treatment protocols.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 5 2012-04-01 2012-04-01 false Treatment protocols. 312.83 Section 312.83 Food...-debilitating Illnesses § 312.83 Treatment protocols. If the preliminary analysis of phase 2 test results appears promising, FDA may ask the sponsor to submit a treatment protocol to be reviewed under...

  9. 21 CFR 312.83 - Treatment protocols.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 5 2014-04-01 2014-04-01 false Treatment protocols. 312.83 Section 312.83 Food...-debilitating Illnesses § 312.83 Treatment protocols. If the preliminary analysis of phase 2 test results appears promising, FDA may ask the sponsor to submit a treatment protocol to be reviewed under...

  10. New practical treadmill protocol for clinical use.

    PubMed

    Wolthuis, R A; Froelicher, V F; Fischer, J; Noguera, I; Davis, G; Stewart, A J; Triebwasser, J H

    1977-05-01

    A new continuous treadmill protocol (USAFSAM) has been designed using a constant treadmill speed (3.3 miles/hour) and regular equal increments in treadmill grade (5%/3min). The constant treadmill speed requires only initial adaptation in patient stride, reduces technician adjustments and produces less electrocardiographic motion artifact than do protocols using multiple or higher treadmill speeds, or both. The regular equal increments in treadmill grade are easy to implement and provide a larger number of work loads than do protocols that are discontinuous or require larger changes in work load. The USAFSAM protocol was compared with the older Balke-Ware protocol in 26 healthy men (aged 30 to 59 years). Each fasting subject completed two maximal treadmill tests from each protocol. Measurements included minute heart rate from the electrocardiogram, auscultatory blood pressures and oxygen consumption obtained with standard techniques. Similarities in between-protocol measurements for submaximal and maximal treadmill efforts were impressive; differences were small and unimportant. Further, both protocols showed equal reproducibility for the measurements noted. Importantly, time to maximal effort was reduced by 24% with the USAFSAM protocol. The USAFSAM treadmill protocol has since been used in more than 500 clinical and screening examinations, thus confirming its advantages and practicality for routine clinical stress testing. Normal reference values previously established for the Balke-Ware protocol are shown to apply to the new USAFSAM protocol as well. PMID:857630

  11. Reactive broadcasting protocol for video on demand

    NASA Astrophysics Data System (ADS)

    Paris, Jehan-Francois; Carter, Steven W.; Long, Darrell D. E.

    1999-12-01

We propose a reactive broadcasting protocol that addresses the problem of distributing moderately popular videos in a more efficient fashion. Like all efficient broadcasting protocols, reactive broadcasting assumes that the customer set-top box has enough local storage to store at least one half of each video being watched. Unlike other broadcasting protocols, reactive broadcasting only broadcasts the later portions of each video. The initial segment of each video is distributed on demand using a stream tapping protocol. Our simulations show that reactive broadcasting outperforms both conventional broadcasting protocols and pure stream tapping for a wide range of video request rates.

  12. Nonblocking and orphan free message logging protocols

    NASA Technical Reports Server (NTRS)

    Alvisi, Lorenzo; Hoppe, Bruce; Marzullo, Keith

    1992-01-01

Currently existing message logging protocols demonstrate a classic pessimistic vs. optimistic tradeoff. We show that the optimistic-pessimistic tradeoff is not inherent to the problem of message logging. We construct a message-logging protocol that has the positive features of both optimistic and pessimistic protocols: our protocol prevents orphans and allows simple failure recovery; however, it requires no blocking in failure-free runs. Furthermore, this protocol does not introduce any additional message overhead as compared to one implemented for a system in which messages may be lost but processes do not crash.

  13. INEEL AIR MODELING PROTOCOL

    SciTech Connect

    C. S. Staley; M. L. Abbott; P. D. Ritter

    2004-12-01

    Various laws stemming from the Clean Air Act of 1970 and the Clean Air Act amendments of 1990 require air emissions modeling. Modeling is used to ensure that air emissions from new projects and from modifications to existing facilities do not exceed certain standards. For radionuclides, any new airborne release must be modeled to show that downwind receptors do not receive exposures exceeding the dose limits and to determine the requirements for emissions monitoring. For criteria and toxic pollutants, emissions usually must first exceed threshold values before modeling of downwind concentrations is required. This document was prepared to provide guidance for performing environmental compliance-driven air modeling of emissions from Idaho National Engineering and Environmental Laboratory facilities. This document assumes that the user has experience in air modeling and dose and risk assessment. It is not intended to be a "cookbook," nor should all recommendations herein be construed as requirements. However, there are certain procedures that are required by law, and these are pointed out. It is also important to understand that air emissions modeling is a constantly evolving process. This document should, therefore, be reviewed periodically and revised as needed. The document is divided into two parts. Part A is the protocol for radiological assessments, and Part B is for nonradiological assessments. This document is an update of and supersedes document INEEL/INT-98-00236, Rev. 0, INEEL Air Modeling Protocol. This updated document incorporates changes in some of the rules, procedures, and air modeling codes that have occurred since the protocol was first published in 1998.

  14. Planetary Protection Alternate Protocol Certification

    NASA Astrophysics Data System (ADS)

    Baker, Amy; Barengoltz, Jack; Tisdale, David

    The talk presents a standardized approach for new method certification or alternative testing protocol (ATP) certification against the existing U.S. Planetary Protection Standards. In consideration of new method certification there are two phases of activities that are relevant to the certification process. The first is sample acquisition, which typically incorporates swab or wipe sampling on relevant hardware, associated facilities, and ground support equipment. The sampling methods introduce considerations of field sampling efficiency as it relates to spore distribution on the spacecraft, spacecraft material influences on the ability of the swab or wipe to remove spores from the hardware, the types of swabs and wipes used (polyester, cotton, macrofoam), and human sampling influences. The second portion of a new protocol certification looks specifically at the laboratory work-up or analysis of the samples provided to the laboratory. Variables in this process include selection of appropriate biomarkers, extraction efficiencies (removal of spores or constituents of interest from the sampling device), and a method's ability to accurately determine the number of spores present in the sample with a statistically valid level of confidence as described by parameters such as precision, accuracy, robustness, specificity, and sensitivity. Considerations for alternative testing protocols, such as those that utilize bioburden reduction techniques, include selection of appropriate biomarkers for testing, test materials, and a defined statistical approach that provides sufficient scientific data to support the modification of an existing NASA specification or the generation of a new NASA specification. Synergies between the U.S. and European Space Agency approaches will also be discussed.

  15. Effective Protocols for Mobile Communications and Networking

    SciTech Connect

    Espinoza, J.; Sholander, P.; Van Leeuwen, B.

    1998-12-01

    This report examines methods of mobile communications with an emphasis on mobile computing and wireless communications. Many of the advances in communications involve the use of Internet Protocol (IP), Asynchronous Transfer Mode (ATM), and ad hoc network protocols. However, many of the advances in these protocols have been focused on wired communications. Recently much focus has been directed at advancing communication technology in the area of mobile wireless networks. This report discusses various protocols used in mobile communications and proposes a number of extensions to existing protocols. A detailed discussion is also included on desirable protocol characteristics and evaluation criteria. In addition, the report includes a discussion on several network simulation tools that may be used to evaluate network protocols.

  16. Protocolized Resuscitation of Burn Patients.

    PubMed

    Cancio, Leopoldo C; Salinas, Jose; Kramer, George C

    2016-10-01

    Fluid resuscitation of burn patients is commonly initiated using the modified Brooke or Parkland formula. The fluid infusion rate is titrated up or down hourly to maintain adequate urine output and other endpoints. Over-resuscitation leads to morbid complications. Adherence to paper-based protocols, flow sheets, and clinical practice guidelines is associated with decreased fluid resuscitation volumes and complications. Computerized tools assist providers. Although completely autonomous closed-loop control of resuscitation has been demonstrated in animal models of burn shock, the major advantages of open-loop and decision-support systems are identifying trends, enhancing situational awareness, and encouraging burn team communication. PMID:27600131
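For illustration only (not the authors' software), the widely published Parkland formula and an hourly open-loop titration rule can be sketched as follows; the function names, example patient, and the urine-output band in titrate() are our own assumptions:

```python
def parkland_total_ml(weight_kg, tbsa_pct):
    """Parkland formula: 4 mL x body weight (kg) x %TBSA over the first 24 h."""
    return 4.0 * weight_kg * tbsa_pct

def initial_rate_ml_per_hr(total_ml):
    """Half of the 24-h volume is given over the first 8 hours."""
    return (total_ml / 2.0) / 8.0

def titrate(rate, urine_ml_per_hr, low=30.0, high=50.0, step=0.1):
    """Hourly open-loop adjustment toward a urine-output target band
    (band limits here are illustrative assumptions)."""
    if urine_ml_per_hr < low:
        return rate * (1 + step)
    if urine_ml_per_hr > high:
        return rate * (1 - step)
    return rate

total = parkland_total_ml(70, 40)     # 11200.0 mL for a 70-kg, 40% TBSA burn
rate = initial_rate_ml_per_hr(total)  # 700.0 mL/h starting rate
```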

  17. A structured data transfer protocol

    NASA Technical Reports Server (NTRS)

    Barrett, P.; Rots, A.

    1992-01-01

    The transfer of data between different computers and programs can be a major obstacle during data analysis. We present a new data transfer protocol which is based on a simple structure containing a value, an error, and a unit. Each of these members can be arrays or another structure. The ability to nest structures allows for the concept of objects. When using an object-oriented language such as C++, reference can be made to the object name instead of each element explicitly. Prototype code has been written which implements the basic design with enhancements planned for the future.
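A minimal sketch of such a value/error/unit structure, with nesting for object-like grouping (the abstract's prototype was written for C++; the class and field names here are our own invention):

```python
from dataclasses import dataclass, field

@dataclass
class Quantity:
    """One datum in the protocol: a value, its error, and a unit.
    Each member may itself be an array or another structure."""
    value: object
    error: object = None
    unit: str = ""

@dataclass
class Record:
    """A named collection of quantities; nesting records gives the
    object-like grouping the abstract describes."""
    name: str
    members: dict = field(default_factory=dict)

flux = Quantity(value=[1.2, 1.4], error=[0.1, 0.1], unit="mJy")
obs = Record(name="observation", members={"flux": flux})
```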

  18. Mars Sample Quarantine Protocol Workshop

    NASA Technical Reports Server (NTRS)

    DeVincenzi, Donald L. (Editor); Bagby, John (Editor); Race, Margaret (Editor); Rummel, John (Editor)

    1999-01-01

    The Mars Sample Quarantine Protocol (QP) Workshop was convened to deal with three specific aspects of the initial handling of a returned Mars sample: 1) biocontainment, to prevent uncontrolled release of sample material into the terrestrial environment; 2) life detection, to examine the sample for evidence of live organisms; and 3) biohazard testing, to determine if the sample poses any threat to terrestrial life forms and the Earth's biosphere. During the first part of the Workshop, several tutorials were presented on topics related to the workshop in order to give all participants a common basis in the technical areas necessary to achieve the objectives of the Workshop.

  19. Analyzing the effect of routing protocols on media access control protocols in radio networks

    SciTech Connect

    Barrett, C. L.; Drozda, M.; Marathe, A.; Marathe, M. V.

    2002-01-01

    We study the effect of routing protocols on the performance of media access control (MAC) protocols in wireless radio networks. Three well-known MAC protocols are considered: 802.11, CSMA, and MACA. Similarly, three recently proposed routing protocols are considered: AODV, DSR and LAR scheme 1. The experimental analysis was carried out using GloMoSim, a tool for simulating wireless networks. The main focus of our experiments was to study how the routing protocols affect the performance of the MAC protocols when the underlying network and traffic parameters are varied. The performance of the protocols was measured w.r.t. five important parameters: (i) number of received packets, (ii) average latency of each packet, (iii) throughput, (iv) long-term fairness and (v) number of control packets at the MAC layer level. Our results show that combinations of routing and MAC protocols yield varying performance under varying network topology and traffic situations. The result has an important implication: no combination of routing protocol and MAC protocol is the best over all situations. Also, the performance of protocols at a given level in the protocol stack needs to be studied not locally in isolation but as a part of the complete protocol stack. A novel aspect of our work is the use of a statistical technique, ANOVA (Analysis of Variance), to characterize the effect of routing protocols on MAC protocols. This technique is of independent interest and can be utilized in several other simulation and empirical studies.

  20. Protocols for calibrating multibeam sonar.

    PubMed

    Foote, Kenneth G; Chu, Dezhang; Hammar, Terence R; Baldwin, Kenneth C; Mayer, Larry A; Hufnagle, Lawrence C; Jech, J Michael

    2005-04-01

    Development of protocols for calibrating multibeam sonar by means of the standard-target method is documented. Particular systems used in the development work included three that provide the water-column signals, namely the SIMRAD SM2000/90- and 200-kHz sonars and RESON SeaBat 8101 sonar, with operating frequency of 240 kHz. Two facilities were instrumented specifically for the work: a sea well at the Woods Hole Oceanographic Institution and a large, indoor freshwater tank at the University of New Hampshire. Methods for measuring the transfer characteristics of each sonar, with transducers attached, are described and illustrated with measurement results. The principal results, however, are the protocols themselves. These are elaborated for positioning the target, choosing the receiver gain function, quantifying the system stability, mapping the directionality in the plane of the receiving array and in the plane normal to the central axis, measuring the directionality of individual beams, and measuring the nearfield response. General preparations for calibrating multibeam sonars and a method for measuring the receiver response electronically are outlined. Advantages of multibeam sonar calibration and outstanding problems, such as that of validation of the performance of multibeam sonars as configured for use, are mentioned.

  1. A Unified Fault-Tolerance Protocol

    NASA Technical Reports Server (NTRS)

    Miner, Paul; Geser, Alfons; Pike, Lee; Maddalon, Jeffrey

    2004-01-01

    Davies and Wakerly show that Byzantine fault tolerance can be achieved by a cascade of broadcasts and middle value select functions. We present an extension of the Davies and Wakerly protocol, the unified protocol, and its proof of correctness. We prove that it satisfies validity and agreement properties for communication of exact values. We then introduce bounded communication error into the model. Inexact communication is inherent for clock synchronization protocols. We prove that validity and agreement properties hold for inexact communication, and that exact communication is a special case. As a running example, we illustrate the unified protocol using the SPIDER family of fault-tolerant architectures. In particular we demonstrate that the SPIDER interactive consistency, distributed diagnosis, and clock synchronization protocols are instances of the unified protocol.
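The middle-value-select primitive at the heart of the Davies and Wakerly scheme is the middle element of the sorted inputs; with an odd count of sources this is the median, which a single arbitrarily faulty source cannot drag outside the range spanned by the correct sources. A sketch (our own illustration, not the paper's formalization):

```python
def middle_value_select(values):
    """Return the middle value of the sorted inputs. With 2f+1 sources,
    up to f Byzantine values cannot move the result outside the range
    of the correct sources."""
    s = sorted(values)
    return s[len(s) // 2]

# Two correct clock readings and one arbitrarily faulty one:
chosen = middle_value_select([10.0, 10.2, 99.9])  # -> 10.2
```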

  2. [The research protocol. Part I].

    PubMed

    Miranda-Novales, María Guadalupe; Villasís-Keever, Miguel Ángel

    2015-01-01

    One of the principal aims in research is the publication of the study in scientific journals. This implies two challenges: the first, the choice of an adequate research design; the second, concise and simple wording of the results so that the study is accepted by the most appropriate journal for its scope. Although numerous supporting documents are available for both issues, the publication process is long and tiresome, and can discourage even the most enthusiastic researcher. This is the first of a series of articles whose objective is to describe the steps from the research question to the publication of the study. First, the importance of the research design will be addressed. The structure of the protocol is essential to achieving the objectives, and provides a way to organize the investigation in a logical, comprehensible and efficient manner.

  3. [Climate change and Kyoto protocol].

    PubMed

    Ergasti, G; Pippia, V; Murzilli, G; De Luca D'Alessandro, E

    2009-01-01

    Due to the industrial revolution and the heavy use of fossil fuels, the concentration of greenhouse gases in the atmosphere has increased dramatically during the last hundred years, and this has led to an increase in mean global temperature. The environmental consequences are: the melting of the ice caps, a rise in mean sea level, catastrophic events such as floods, hurricanes and earthquakes, changes to the animal and vegetable kingdoms, a growth in vectors and bacteria in water (thus increasing the risk of infectious diseases), and damage to agriculture. The toxic effects of pollution on human health are both acute and chronic. The Kyoto Protocol is an important step in the campaign against climate change, but it is not sufficient. A possible solution might be for the States which produce the most pollution to adopt a better political stance toward the environment and to use renewable resources for the production of energy.

  4. Canine adenovirus downstream processing protocol.

    PubMed

    Puig, Meritxell; Piedra, Jose; Miravet, Susana; Segura, María Mercedes

    2014-01-01

    Adenovirus vectors are efficient gene delivery tools. A major caveat with vectors derived from common human adenovirus serotypes is that most adults are likely to have been exposed to the wild-type virus and exhibit active immunity against the vectors. This preexisting immunity limits their clinical success. Strategies to circumvent this problem include the use of nonhuman adenovirus vectors. Vectors derived from canine adenovirus type 2 (CAV-2) are among the best-studied representatives. CAV-2 vectors are particularly attractive for the treatment of neurodegenerative disorders. In addition, CAV-2 vectors have shown great promise as oncolytic agents in virotherapy approaches and as vectors for recombinant vaccines. The rising interest in CAV-2 vectors calls for the development of scalable GMP compliant production and purification strategies. A detailed protocol describing a complete scalable downstream processing strategy for CAV-2 vectors is reported here. Clarification of CAV-2 particles is achieved by microfiltration. CAV-2 particles are subsequently concentrated and partially purified by ultrafiltration-diafiltration. A Benzonase(®) digestion step is carried out between ultrafiltration and diafiltration operations to eliminate contaminating nucleic acids. Chromatography purification is accomplished in two consecutive steps. CAV-2 particles are first captured and concentrated on a propyl hydrophobic interaction chromatography column followed by a polishing step using DEAE anion exchange monoliths. Using this protocol, high-quality CAV-2 vector preparations containing low levels of contamination with empty viral capsids and other inactive vector forms are typically obtained. The complete process yield was estimated to be 38-45 %. PMID:24132487

  5. Proposal for a Simple and Efficient Monthly Quality Management Program Assessing the Consistency of Robotic Image-Guided Small Animal Radiation Systems.

    PubMed

    Brodin, N Patrik; Guha, Chandan; Tomé, Wolfgang A

    2015-11-01

    Modern pre-clinical radiation therapy (RT) research requires high precision and accurate dosimetry to facilitate the translation of research findings into clinical practice. Several systems are available that provide precise delivery and on-board imaging capabilities, highlighting the need for a quality management program (QMP) to ensure consistent and accurate radiation dose delivery. An ongoing, simple, and efficient QMP for image-guided robotic small animal irradiators used in pre-clinical RT research is described. Protocols were developed and implemented to assess the dose output constancy (based on the AAPM TG-61 protocol), cone-beam computed tomography (CBCT) image quality and object representation accuracy (using a custom-designed imaging phantom), CBCT-guided target localization accuracy and consistency of the CBCT-based dose calculation. To facilitate an efficient read-out and limit the user dependence of the QMP data analysis, a semi-automatic image analysis and data representation program was developed using the technical computing software MATLAB. The results of the first 6-mo experience using the suggested QMP for a Small Animal Radiation Research Platform (SARRP) are presented, with data collected on a bi-monthly basis. The dosimetric output constancy was established to be within ±1 %, the consistency of the image resolution was within ±0.2 mm, the accuracy of CBCT-guided target localization was within ±0.5 mm, and dose calculation consistency was within ±2 s (±3%) per treatment beam. Based on these results, this simple quality assurance program allows for the detection of inconsistencies in dosimetric or imaging parameters that are beyond the acceptable variability for a reliable and accurate pre-clinical RT system, on a monthly or bi-monthly basis. PMID:26425981
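A constancy check of this kind reduces to comparing each periodic reading against its commissioning baseline; a minimal sketch with invented readings, using the ±1% dose-output tolerance quoted above (function and variable names are our own):

```python
def qa_constancy(measured, baseline, tol_pct):
    """Return (within_tolerance, percent_deviation) for one QA reading
    compared against its commissioning baseline."""
    dev_pct = 100.0 * (measured - baseline) / baseline
    return abs(dev_pct) <= tol_pct, dev_pct

# Hypothetical monthly dose-output readings (arbitrary units) vs. baseline
ok_pass, dev_pass = qa_constancy(measured=2.01, baseline=2.00, tol_pct=1.0)
ok_fail, dev_fail = qa_constancy(measured=2.05, baseline=2.00, tol_pct=1.0)
```

The same comparison applies to the imaging and localization metrics, with the tolerance swapped for ±0.2 mm or ±0.5 mm as appropriate.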

  6. Layered protocols in voice interaction with computers

    NASA Astrophysics Data System (ADS)

    Taylor, M. M.

    1987-02-01

    The Layered Protocol model for human computer interfaces is described, with special reference to the problems of voice input and output. In a layered protocol, each level passes virtual messages back and forth between human and computer. These virtual messages are realized in the form of interchanges at the level below. The protocol at a level is analogous to the syntax of a sentence, in that it is the method by which the content of a message can be given an agreed interpretation. Each protocol can be designed or evaluated independently of all the others in an interface. The stability of a protocol is determined by its response delays and by the channel capacity of the lower level protocols that support its messages. Sometimes an unstable protocol can be stabilized and speeded by reducing the message rate of the supporting protocols. Users have been observed to do this intuitively. Voice input provides special problems because of the relatively high error probability inherent in the recognizer: errors in other modalities are likely to be due to operator fault. This tends to lead to unwarranted distrust of voice input, and to demands for types of feedback that are probably inappropriate to the level of protocol to which the recognizer is suited. Voice output can be used by the computer to initiate protocols, or to provide a response channel for protocols under conditions where the user's eyes are otherwise occupied. Consideration of protocol demands helps to clarify the requirements for precision in recognition, and for the characteristics of computer responses to voice input; it helps also in judging appropriate conditions for the use of voice output.

  7. Status Report on the UNIDROIT Space Protocol

    NASA Astrophysics Data System (ADS)

    Larsen, Paul B.

    2002-01-01

    In my status report on the UNIDROIT Space Protocol I will describe the history and purpose of the Space Protocol; I will state the Protocol's relationship to the UNIDROIT Convention on International Interests in Mobile Equipment in particular after the 2001 Cape Town Diplomatic Conference on the Convention. I will describe the COPUOS study of possible conflicts with the existing space law treaties and explain UNIDROIT's objective of avoiding conflicts between existing space law and the Space Protocol. Finally I will describe future steps to be taken.

  8. Quantum three-pass cryptography protocol

    NASA Astrophysics Data System (ADS)

    Yang, Li; Wu, Ling-An; Liu, Songhao

    2002-09-01

    We present a new kind of quantum cryptography protocol based on Shamir's three-pass protocol of classical cryptography, which allows the direct, secret transmission of qubits with the aid of an unjammable classical channel. In this protocol we implement the encryption and decryption transformations as rotations on the Poincaré sphere of the photons' polarization parameters. The key requirement is that Bob's encryption rotation must commute with Alice's decryption rotation; this means that the axes of these two rotations must be parallel. We also present a security analysis of the protocol under a man-in-the-middle attack.
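The classical Shamir three-pass protocol that this work builds on can be sketched with commutative modular exponentiation; the quantum version replaces the exponentiations with commuting polarization rotations. The prime and message below are arbitrary choices for illustration:

```python
import math
import secrets

P = 2**61 - 1  # a Mersenne prime; exponents compose modulo P - 1

def keypair():
    """Pick a secret exponent coprime to P-1 and its inverse mod P-1."""
    while True:
        e = secrets.randbelow(P - 3) + 2
        if math.gcd(e, P - 1) == 1:
            return e, pow(e, -1, P - 1)

def three_pass(m):
    """Shamir three-pass: Alice locks, Bob locks, Alice unlocks, Bob unlocks.
    Commutativity (exponents multiply mod P-1) makes the order irrelevant."""
    a, a_inv = keypair()        # Alice's secret exponents
    b, b_inv = keypair()        # Bob's secret exponents
    c1 = pow(m, a, P)           # pass 1: Alice -> Bob
    c2 = pow(c1, b, P)          # pass 2: Bob -> Alice
    c3 = pow(c2, a_inv, P)      # pass 3: Alice removes her lock
    return pow(c3, b_inv, P)    # Bob removes his lock, recovering m

recovered = three_pass(123456789)  # -> 123456789
```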

  9. GENERIC VERIFICATION PROTOCOL: DISTRIBUTED GENERATION AND COMBINED HEAT AND POWER FIELD TESTING PROTOCOL

    EPA Science Inventory

    This report is a generic verification protocol by which EPA’s Environmental Technology Verification program tests newly developed equipment for distributed generation of electric power, usually micro-turbine generators and internal combustion engine generators. The protocol will ...

  10. EXACT2: the semantics of biomedical protocols

    PubMed Central

    2014-01-01

    Background The reliability and reproducibility of experimental procedures is a cornerstone of scientific practice. There is a pressing technological need for the better representation of biomedical protocols to enable other agents (human or machine) to better reproduce results. A framework that ensures that all information required for the replication of experimental protocols is captured is essential to achieving reproducibility. Methods We have developed the ontology EXACT2 (EXperimental ACTions), which is designed to capture the full semantics of biomedical protocols required for their reproducibility. To construct EXACT2 we manually inspected hundreds of published and commercial biomedical protocols from several areas of biomedicine. After establishing a clear pattern for extracting the required information we utilized text-mining tools to translate the protocols into a machine-amenable format. We have verified the utility of EXACT2 through the successful processing of previously 'unseen' (not used for the construction of EXACT2) protocols. Results The paper reports on a fundamentally new version, EXACT2, that supports the semantically-defined representation of biomedical protocols. The ability of EXACT2 to capture the semantics of biomedical procedures was verified through a text mining use case, in which EXACT2 is used as a reference model for text mining tools to identify terms pertinent to experimental actions, and their properties, in biomedical protocols expressed in natural language. An EXACT2-based framework for the translation of biomedical protocols to a machine-amenable format is proposed. Conclusions The EXACT2 ontology is sufficient to record, in a machine processable form, the essential information about biomedical protocols. EXACT2 defines explicit semantics of experimental actions, and can be used by various computer applications. It can serve as a reference model for the translation of biomedical protocols in natural language into a semantically

  11. 21 CFR 1301.18 - Research protocols.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... to 21 U.S.C. 355(i) and 21 CFR 130.3, I, (Name and Address of IND Sponsor) submitted a Notice of... 21 Food and Drugs 9 2014-04-01 2014-04-01 false Research protocols. 1301.18 Section 1301.18 Food..., DISTRIBUTORS, AND DISPENSERS OF CONTROLLED SUBSTANCES Registration § 1301.18 Research protocols. (a) A...

  12. Delay Tolerant Networking - Bundle Protocol Simulation

    NASA Technical Reports Server (NTRS)

    SeGui, John; Jenning, Esther

    2006-01-01

    In this paper, we report on the addition of MACHETE models needed to support DTN, namely the Bundle Protocol (BP) model. To illustrate the use of MACHETE with the additional DTN model, we provide an example simulation to benchmark its performance. We demonstrate the use of the DTN protocol and discuss statistics gathered concerning the total time needed to simulate numerous bundle transmissions.
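The store-and-forward behavior that distinguishes DTN bundling can be sketched in a few lines; this is a toy model of our own, not the MACHETE BP model, and all names are invented:

```python
from collections import deque

class DTNNode:
    """Minimal store-and-forward sketch: bundles are held in local
    storage while no contact is available and forwarded when the
    link comes up."""
    def __init__(self, name):
        self.name = name
        self.stored = deque()
        self.delivered = []

    def send(self, bundle, link_up, peer):
        if link_up:
            peer.delivered.append(bundle)
        else:
            self.stored.append(bundle)  # hold the bundle until a contact

    def flush(self, peer):
        """Forward all stored bundles during a contact window."""
        while self.stored:
            peer.delivered.append(self.stored.popleft())

a, b = DTNNode("lander"), DTNNode("orbiter")
a.send("telemetry-1", link_up=False, peer=b)  # no contact: stored locally
a.send("telemetry-2", link_up=False, peer=b)
a.flush(b)                                    # contact window opens
```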

  13. STANDARD MEASUREMENT PROTOCOLS - FLORIDA RADON RESEARCH PROGRAM

    EPA Science Inventory

    The manual, in support of the Florida Radon Research Program, contains standard protocols for key measurements where data quality is vital to the program. It contains two sections. The first section, soil measurements, contains field sampling protocols for soil gas permeability and...

  14. Open commit protocols tolerating commission failures

    SciTech Connect

    Rothermel, K.; Pappe, S. )

    1993-06-01

    To ensure atomicity of transactions in distributed systems, so-called 2-phase commit (2PC) protocols have been proposed. The basic assumption of these protocols is that the processing nodes involved in transactions are "sane," i.e., they only fail with omission failures, and nodes eventually recover from failures. Unfortunately, this assumption is not realistic for so-called Open Distributed Systems (ODSs), in which nodes may have totally different reliability characteristics. In ODSs, nodes can be classified into trusted nodes (e.g., a banking server) and nontrusted nodes (e.g., a home PC requesting a remote banking service). While trusted nodes are assumed to be sane, nontrusted nodes may fail permanently and even cause commission failures to occur. In this paper, we propose a family of 2PC protocols that tolerate any number of omission failures at trusted nodes and any number of commission and omission failures at nontrusted nodes. The proposed protocols ensure that (at least) the trusted nodes participating in a transaction eventually terminate the transaction in a consistent manner. Unlike Byzantine commit protocols, our protocols do not incorporate mechanisms for achieving Byzantine agreement, which has advantages in terms of complexity: our protocols have the same or only a slightly higher message complexity than traditional 2PC protocols. 31 refs., 10 figs., 3 tabs.
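For reference, the baseline 2PC decision rule that the paper extends can be sketched as follows; this is the classical protocol only, without the commission-failure handling the paper adds, and all class and method names are our own:

```python
class Participant:
    """A resource manager that votes in phase 1 and obeys in phase 2."""
    def __init__(self, can_commit):
        self.can_commit = can_commit
        self.state = "init"

    def prepare(self):                 # phase 1: vote yes/no
        self.state = "prepared" if self.can_commit else "abort-voted"
        return self.can_commit

    def finish(self, decision):        # phase 2: apply coordinator decision
        self.state = decision

def two_phase_commit(participants):
    """Commit only if every participant votes yes; otherwise abort."""
    votes = [p.prepare() for p in participants]      # phase 1: collect votes
    decision = "commit" if all(votes) else "abort"
    for p in participants:                           # phase 2: broadcast
        p.finish(decision)
    return decision

outcome = two_phase_commit([Participant(True), Participant(True)])   # "commit"
mixed = two_phase_commit([Participant(True), Participant(False)])    # "abort"
```

A commission failure would correspond to a participant lying in prepare() or ignoring the decision in finish(), which is exactly what the classical protocol cannot tolerate.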

  15. Communication protocol standards for space data systems

    NASA Technical Reports Server (NTRS)

    Hooke, Adrian J.; Desjardins, Richard

    1990-01-01

    The main elements and requirements of advanced space data networks are identified. The communication protocol standards for use on space missions during the coming decades are described. In particular, the blending of high-performance space-unique data transmission techniques with off-the-shelf open systems interconnection (OSI) protocols is described.

  16. 21 CFR 1301.18 - Research protocols.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 9 2013-04-01 2013-04-01 false Research protocols. 1301.18 Section 1301.18 Food... duration of the project. (v) Location where the research will be conducted. (vi) Statement of the security... security provisions (as proscribed in paragraph (a)(2)(vi) of this section for a research protocol) to,...

  17. 21 CFR 1301.18 - Research protocols.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 9 2012-04-01 2012-04-01 false Research protocols. 1301.18 Section 1301.18 Food... duration of the project. (v) Location where the research will be conducted. (vi) Statement of the security... security provisions (as proscribed in paragraph (a)(2)(vi) of this section for a research protocol) to,...

  18. Massive transfusion and massive transfusion protocol

    PubMed Central

    Patil, Vijaya; Shetmahajan, Madhavi

    2014-01-01

    Haemorrhage remains a major cause of potentially preventable deaths. Rapid transfusion of large volumes of blood products is required in patients with haemorrhagic shock, which may lead to a unique set of complications. Recently, protocol-based management of these patients using a massive transfusion protocol has shown improved outcomes. This section discusses in detail both management and complications of massive blood transfusion. PMID:25535421

  19. 16 CFR 1212.4 - Test protocol.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 16 Commercial Practices 2 2012-01-01 2012-01-01 false Test protocol. 1212.4 Section 1212.4... STANDARD FOR MULTI-PURPOSE LIGHTERS Requirements for Child-Resistance § 1212.4 Test protocol. (a) Child test panel. (1) The test to determine if a multi-purpose lighter is resistant to successful...

  20. 16 CFR 1210.4 - Test protocol.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 16 Commercial Practices 2 2012-01-01 2012-01-01 false Test protocol. 1210.4 Section 1210.4... STANDARD FOR CIGARETTE LIGHTERS Requirements for Child Resistance § 1210.4 Test protocol. (a) Child test panel. (1) The test to determine if a lighter is resistant to successful operation by children uses...

  1. Massive transfusion and massive transfusion protocol.

    PubMed

    Patil, Vijaya; Shetmahajan, Madhavi

    2014-09-01

    Haemorrhage remains a major cause of potentially preventable deaths. Rapid transfusion of large volumes of blood products is required in patients with haemorrhagic shock, which may lead to a unique set of complications. Recently, protocol-based management of these patients using a massive transfusion protocol has shown improved outcomes. This section discusses in detail both management and complications of massive blood transfusion.

  2. Cryptanalysis of the arbitrated quantum signature protocols

    SciTech Connect

    Gao Fei; Qin Sujuan; Guo Fenzhuo; Wen Qiaoyan

    2011-08-15

    As a new model for signing quantum messages, arbitrated quantum signature (AQS) has recently received a lot of attention. In this paper we study the cryptanalysis of previous AQS protocols from the aspects of forgery and disavowal. We show that in these protocols the receiver, Bob, can realize existential forgery of the sender's signature under known message attack. Bob can even achieve universal forgery when the protocols are used to sign a classical message. Furthermore, the sender, Alice, can successfully disavow any of her signatures by a simple attack. The attack strategies are described in detail and some discussions about the potential improvements of the protocols are given. Finally we also present several interesting topics on AQS protocols that can be studied in the future.

  3. Accuracy of NHANES periodontal examination protocols.

    PubMed

    Eke, P I; Thornton-Evans, G O; Wei, L; Borgnakke, W S; Dye, B A

    2010-11-01

    This study evaluates the accuracy of periodontitis prevalence determined by the National Health and Nutrition Examination Survey (NHANES) partial-mouth periodontal examination protocols. True periodontitis prevalence was determined in a new convenience sample of 454 adults ≥ 35 years old, by a full-mouth "gold standard" periodontal examination. This actual prevalence was compared with prevalence resulting from analysis of the data according to the protocols of NHANES III and NHANES 2001-2004, respectively. Both NHANES protocols substantially underestimated the prevalence of periodontitis by 50% or more, depending on the periodontitis case definition used, and thus performed below threshold levels for moderate-to-high levels of validity for surveillance. Adding measurements from lingual or interproximal sites to the NHANES 2001-2004 protocol did not improve the accuracy sufficiently to reach acceptable sensitivity thresholds. These findings suggest that NHANES protocols produce high levels of misclassification of periodontitis cases and thus have low validity for surveillance and research.

  4. Security Weaknesses in Arbitrated Quantum Signature Protocols

    NASA Astrophysics Data System (ADS)

    Liu, Feng; Zhang, Kejia; Cao, Tianqing

    2014-01-01

    Arbitrated quantum signature (AQS) is a cryptographic scenario in which the sender (signer), Alice, generates the signature of a message and then a receiver (verifier), Bob, can verify the signature with the help of a trusted arbitrator, Trent. In this paper, we point out there exist some security weaknesses in two AQS protocols. Our analysis shows Alice can successfully disavow any of her signatures by a simple attack in the first protocol. Furthermore, we study the security weaknesses of the second protocol from the aspects of forgery and disavowal. Some potential improvements of this kind of protocols are given. We also design a new method to authenticate a signature or a message, which makes AQS protocols immune to Alice's disavowal attack and Bob's forgery attack effectively.

  5. Cryptanalysis of the arbitrated quantum signature protocols

    NASA Astrophysics Data System (ADS)

    Gao, Fei; Qin, Su-Juan; Guo, Fen-Zhuo; Wen, Qiao-Yan

    2011-08-01

    As a new model for signing quantum messages, arbitrated quantum signature (AQS) has recently received a lot of attention. In this paper we study the cryptanalysis of previous AQS protocols from the aspects of forgery and disavowal. We show that in these protocols the receiver, Bob, can realize existential forgery of the sender's signature under known message attack. Bob can even achieve universal forgery when the protocols are used to sign a classical message. Furthermore, the sender, Alice, can successfully disavow any of her signatures by a simple attack. The attack strategies are described in detail and some discussions about the potential improvements of the protocols are given. Finally we also present several interesting topics on AQS protocols that can be studied in the future.

  6. The EXACT description of biomedical protocols

    PubMed Central

    Soldatova, Larisa N.; Aubrey, Wayne; King, Ross D.; Clare, Amanda

    2008-01-01

    Motivation: Many published manuscripts contain experiment protocols which are poorly described or deficient in information. This means that the published results are very hard or impossible to repeat. This problem is being made worse by the increasing complexity of high-throughput/automated methods. There is therefore a growing need to represent experiment protocols in an efficient and unambiguous way. Results: We have developed the Experiment ACTions (EXACT) ontology as the basis of a method of representing biological laboratory protocols. We provide example protocols that have been formalized using EXACT, and demonstrate the advantages and opportunities created by using this formalization. We argue that the use of EXACT will result in the publication of protocols with increased clarity and usefulness to the scientific community. Availability: The ontology, examples and code can be downloaded from http://www.aber.ac.uk/compsci/Research/bio/dss/EXACT/ Contact: Larisa Soldatova lss@aber.ac.uk PMID:18586727

  7. A Simple XML Producer-Consumer Protocol

    NASA Technical Reports Server (NTRS)

    Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)

    2001-01-01

    There are many different projects from government, academia, and industry that provide services for delivering events in distributed environments. The problem with these event services is that they are not general enough to support all uses and they speak different protocols so that they cannot interoperate. We require such interoperability when we, for example, wish to analyze the performance of an application in a distributed environment. Such an analysis might require performance information from the application, computer systems, networks, and scientific instruments. In this work we propose and evaluate a standard XML-based protocol for the transmission of events in distributed systems. One recent trend in government and academic research is the development and deployment of computational grids. Computational grids are large-scale distributed systems that typically consist of high-performance compute, storage, and networking resources. Examples of such computational grids are the DOE Science Grid, the NASA Information Power Grid (IPG), and the NSF Partnerships for Advanced Computing Infrastructure (PACIs). The major effort to deploy these grids is in the area of developing the software services to allow users to execute applications on these large and diverse sets of resources. These services include security, execution of remote applications, managing remote data, access to information about resources and services, and so on. There are several toolkits for providing these services such as Globus, Legion, and Condor. As part of these efforts to develop computational grids, the Global Grid Forum is working to standardize the protocols and APIs used by various grid services. This standardization will allow interoperability between the client and server software of the toolkits that are providing the grid services. 
The goal of the Performance Working Group of the Grid Forum is to standardize protocols and representations related to the storage and distribution of
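
    The event exchange this record describes can be sketched with a minimal XML encoding. This is an illustrative producer/consumer pair only; the element and attribute names are assumptions, not the actual Grid Forum schema.

```python
# Minimal sketch of an XML-encoded performance event, in the spirit of the
# producer-consumer protocol described above. The element and attribute
# names are illustrative assumptions, not the actual Grid Forum schema.
import xml.etree.ElementTree as ET

def produce_event(source, name, value, timestamp):
    """Producer side: serialize one performance event to an XML string."""
    ev = ET.Element("event", {"source": source, "timestamp": repr(timestamp)})
    ET.SubElement(ev, "name").text = name
    ET.SubElement(ev, "value").text = repr(value)
    return ET.tostring(ev, encoding="unicode")

def consume_event(xml_text):
    """Consumer side: parse an event back into a plain dictionary."""
    ev = ET.fromstring(xml_text)
    return {
        "source": ev.get("source"),
        "timestamp": float(ev.get("timestamp")),
        "name": ev.find("name").text,
        "value": float(ev.find("value").text),
    }

wire = produce_event("node42.example.org", "cpu_load", 0.87, 100.0)
event = consume_event(wire)
```

    Any producer and consumer that agree on such a schema can interoperate regardless of toolkit, which is the interoperability argument the abstract makes.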

  8. STANDARD OPERATING PROTOCOLS FOR DECOMMISSIONING

    SciTech Connect

    Foss, D. L.; Stevens, J. L.; Gerdeman, F. W.

    2002-02-25

    Decommissioning projects at Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) sites are conducted under project-specific decision documents, which involve extensive preparation time, public comment periods, and regulatory approvals. Often, the decision documents must be initiated at least one year before commencing the decommissioning project, and they are expensive and time consuming to prepare. The Rocky Flats Environmental Technology Site (RFETS) is a former nuclear weapons production plant at which hazardous substances and wastes were released or disposed of during operations. As a result of the releases, RFETS was placed on the National Priorities List in 1989, and is conducting cleanup activities under a federal facilities compliance agreement. Working closely with interested stakeholders and state and federal regulatory agencies, RFETS has developed and implemented an improved process for obtaining the approvals. The key to streamlining the approval process has been the development of sitewide decision documents called Rocky Flats Cleanup Agreement Standard Operating Protocols, or "RSOPs." RSOPs have broad applicability and could be used instead of project-specific documents. Although no two decommissioning projects are exactly the same and they may vary widely in contamination and other hazards, the basic steps taken for cleanup are usually similar. Because of this, using RSOPs is more efficient than preparing a separate project-specific decision document for each cleanup action. Over the Rocky Flats cleanup life cycle, using RSOPs has the potential to: (1) save over 5 million dollars and 6 months on the site closure schedule; (2) eliminate preparing one hundred and twenty project-specific decision documents; and (3) eliminate writing seventy-five closure description documents for hazardous waste unit closures and corrective actions.

  9. Per-packet reservation MAC protocols

    NASA Astrophysics Data System (ADS)

    Hrasnica, Halid

    2006-10-01

    Recent and future communications networks have to provide QoS guarantees for a rapidly growing number of telecommunications services. Therefore, various communications systems, such as wireless and fixed access networks, apply reservation MAC protocols, which provide good network utilization, particularly important for access networks with typically limited data rates, and ensure realization of different QoS guarantees for various telecommunications services. This is important because of the hard competition among communications technologies applied in the access area. The considered MAC protocols apply a per-packet reservation method to avoid the transmission gaps caused by per-burst reservation and, accordingly, to achieve better network utilization. However, per-packet reservation increases the network load caused by signaling, which calls for an efficient resource-sharing strategy in the signaling channel. There are two basic solutions for capacity sharing in the signaling channel: random access, usually using slotted ALOHA, and dedicated access, realized by a polling method. Performance improvement of the basic protocols can be carried out in different ways: by protocol extensions, by combining different protocol solutions, and by applying adaptive protocols that change access parameters according to the current network status. The best network performance is achieved by a two-step reservation protocol which, combined with additional features such as an appropriate signaling procedure, priority and fairness mechanisms, and combined reservation domains, can fulfill the requirements of services with high QoS demands.
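
    The load-control problem in the random-access signaling channel can be made concrete with the textbook slotted-ALOHA throughput model; this is standard theory used for illustration, not code from the paper.

```python
# Textbook throughput model for a slotted-ALOHA signaling channel:
# S = G * exp(-G), where G is the mean number of reservation attempts
# per slot. Illustrates why the signaling channel discussed above needs
# load control; not taken from the paper itself.
import math

def slotted_aloha_throughput(g):
    """Expected fraction of slots carrying exactly one (successful) attempt."""
    return g * math.exp(-g)

# Throughput peaks at G = 1 with S = 1/e (about 0.368); past that point,
# additional reservation attempts only add collisions.
peak = slotted_aloha_throughput(1.0)
overloaded = slotted_aloha_throughput(3.0)
```

    The sharp drop beyond G = 1 is why adaptive protocols that tune access parameters to the current network load, as the abstract describes, pay off.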

  10. New logistics protocols for distributed interactive simulation

    NASA Astrophysics Data System (ADS)

    Taylor, Darrin; Morrison, John; Katz, Warren; Felton, Erik; Herman, Deborah A.

    1995-06-01

    In today's environment, the transportation and maintenance of military forces is nearly as important as combat operations. Rapid deployment to regions of low-intensity conflict will become a very common training scenario for the U.S. military. Thus it is desirable to apply distributed simulation technology to train logistics personnel in their combat support roles. Currently, distributed interactive simulation (DIS) only contains rudimentary logistics protocols. This paper introduces new protocols designed to handle the logistics problem. The Newtonian protocol takes a physics-based approach to modeling interactions on the simulation network. This protocol consists of a family of protocol data units (PDUs) which are used to communicate forces in different circumstances. The protocol implements a small set of physical relations. This represents a flexible and general mechanism to describe battlefield interactions between network entities. The migratory object protocol (MOP) family addresses the transfer of control. General mechanisms provide the means to simulate resupply, repair, and maintenance of entities at any level of abstraction (individual soldier to division). It can also increase the fidelity of mine laying, enable handover of weapons for terminal guidance, allow for the distribution of aggregate-level simulation entities, provide capabilities for the simulation of personnel, etc.

  11. Snakebite management in Iran: Devising a protocol

    PubMed Central

    Monzavi, Seyed Mostafa; Dadpour, Bita; Afshari, Reza

    2014-01-01

    Background: Snakebite in Iran has been a health concern. However, management of snakebite is not standardized and varies from center to center. This study aimed to devise an evidence-based comprehensive protocol for snakebite management in Iran, to reduce unnecessary variations in practice. Materials and Methods: A narrative search in electronic databases was performed. Fifty peer-reviewed articles, guidelines, and textbooks were reviewed and practical details were extracted. Our currently used protocol in the Mashhad Toxicology Center was supplemented with this information. Consequently, an improved wide-range protocol was developed. The protocol was then discussed and amended within a focus group composed of medical toxicologists and internal medicine specialists. The amended version was finally discussed with expert physicians specialized in different areas of medicine, to be optimized by supplementing other specific considerations. Results: During a one-year process, the protocol was finalized. The final version of the protocol, which was designed in six steps, comprised three components: a schematic algorithm, a severity grading scale, and instructions for supportive and adjunctive treatments. The algorithm pertains to both Viperidae and Elapidae snakebite envenomations and consists of a planned course of action and dosing of antivenom, based on the severity of the envenomation. Conclusion: Snakebite envenomation is a clinical toxicologic emergency, which needs to be treated in a timely and organized manner. Hence, a multi-aspect protocol was designed to improve clinical outcomes, reduce unnecessary administration of antivenom, and help physicians make more proper clinical judgments. PMID:24778670

  12. Quantum key distribution protocol using random bases

    NASA Astrophysics Data System (ADS)

    Meslouhi, A.; Amellal, H.; Hassouni, Y.; El Baz, M.; El Allati, A.

    2016-04-01

    In order to enhance quantum key distribution (QKD) security, a new protocol, “QKDPRB,” based on random bases is proposed. It consists of using standard encoding bases moving circularly with a variable rotational angle α which depends on the angular velocity ω(t); thus, the traditional bases turn into relative ones. To prove the security and efficiency of the protocol, we present a universal demonstration which shows a high level of security, even in the presence of the intercept-and-resend attack. Finally, the QKDPRB may improve the security of QKD.
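
    The rotating-basis idea can be sketched with a toy classical simulation of the measurement statistics: sender and receiver offset their bases by a shared angle α(t) = ωt, so their frames stay aligned while a fixed-basis eavesdropper would not. All details below are illustrative assumptions, not the QKDPRB protocol itself.

```python
# Toy classical simulation of the rotating-basis idea: both parties offset
# the standard encoding bases by a shared time-varying angle alpha(t), so
# the legitimate channel stays aligned. Illustrative only; not QKDPRB.
import math
import random

def measure(state_angle, basis_angle, rng):
    """Projective measurement: P(outcome 0) = cos^2(state - basis)."""
    delta = state_angle - basis_angle
    return 0 if rng.random() < math.cos(delta) ** 2 else 1

def alpha(t, omega=0.3):
    """Shared rotation angle at (discrete) time t, angular velocity omega."""
    return (omega * t) % math.pi

rng = random.Random(1)
errors = 0
for t in range(200):
    bit = rng.randint(0, 1)
    state = alpha(t) + bit * (math.pi / 2)   # sender encodes bit in rotated basis
    outcome = measure(state, alpha(t), rng)  # receiver tracks the same rotation
    errors += (outcome != bit)
# With synchronized rotation the legitimate channel decodes without error.
```

    An eavesdropper measuring in fixed bases sees a nonzero δ that drifts with t, so their error rate is uncontrollable, which is the intuition behind the claimed resistance to intercept-and-resend.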

  13. Performance evaluation of TCP over ABT protocols

    NASA Astrophysics Data System (ADS)

    Ata, Shingo; Murata, Masayuki; Miyahara, Hideo

    1998-10-01

    ABT is promising for effectively transferring highly bursty data traffic in ATM networks. Most past studies focused on the data transfer capability of ABT within the ATM layer. In practice, however, we need to consider the upper-layer transport protocol, since the transport layer also supports a network congestion control mechanism. One such example is TCP, which is now widely used in the Internet. In this paper, we evaluate the performance of TCP over ABT protocols. Simulation results show that the retransmission mechanism of ABT can effectively overlay the TCP congestion control mechanism, so that TCP operates in a stable fashion and serves mainly as an error recovery mechanism.

  14. Protocol dependence of the jamming transition

    NASA Astrophysics Data System (ADS)

    Bertrand, Thibault; Behringer, Robert P.; Chakraborty, Bulbul; O'Hern, Corey S.; Shattuck, Mark D.

    2016-01-01

    We propose a theoretical framework for predicting the protocol dependence of the jamming transition for frictionless spherical particles that interact via repulsive contact forces. We study isostatic jammed disk packings obtained via two protocols: isotropic compression and simple shear. We show that for frictionless systems, all jammed packings can be obtained via either protocol. However, the probability to obtain a particular jammed packing depends on the packing-generation protocol. We predict the average shear strain required to jam initially unjammed isotropically compressed packings from the density of jammed packings, shape of their basins of attraction, and path traversed in configuration space. We compare our predictions to simulations of shear strain-induced jamming and find quantitative agreement. We also show that the packing fraction range, over which shear strain-induced jamming occurs, tends to zero in the large system limit for frictionless packings with overdamped dynamics.

  15. Authentication Protocol using Quantum Superposition States

    SciTech Connect

    Kanamori, Yoshito; Yoo, Seong-Moo; Gregory, Don A.; Sheldon, Frederick T

    2009-01-01

    When it became known that quantum computers could break the RSA (named for its creators - Rivest, Shamir, and Adleman) encryption algorithm in polynomial time, quantum cryptography began to be actively studied. Other classical cryptographic algorithms are only secure when malicious users do not have sufficient computational power to break security within a practical amount of time. Recently, many quantum authentication protocols that share quantum entangled particles between communicators have been proposed, providing unconditional security. An issue caused by sharing quantum entangled particles is that it may not be simple to apply these protocols to authenticate a specific user in a group of many users. An authentication protocol using quantum superposition states instead of quantum entangled particles is proposed. The random number shared between a sender and a receiver can be used for classical encryption after the authentication has succeeded. The proposed protocol can be implemented with current technologies, which we introduce in this paper.

  16. Protocol to Exploit Waiting Resources for UASNs.

    PubMed

    Hung, Li-Ling; Luo, Yung-Jeng

    2016-01-01

    The transmission speed of acoustic waves in water is much slower than that of radio waves in terrestrial wireless sensor networks. Thus, the propagation delay in underwater acoustic sensor networks (UASNs) is much greater. Longer propagation delay leads to complicated communication and collision problems. To solve collision problems, some studies have proposed waiting mechanisms; however, long waiting mechanisms result in low bandwidth utilization. To improve throughput, this study proposes a slotted medium access control protocol to enhance bandwidth utilization in UASNs. The proposed mechanism increases communication by exploiting temporal and spatial resources that are typically idle, in order to protect communication against interference. By reducing wait time, network performance and energy consumption can be improved. A performance evaluation demonstrates that when data packets are large or sensor deployment is dense, the energy consumption of the proposed protocol is lower, and its throughput higher, than those of existing protocols.
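
    The delay gap the abstract starts from is easy to quantify with rough textbook speeds (not measurements from the paper): sound travels at about 1500 m/s in sea water versus about 3×10⁸ m/s for radio.

```python
# Propagation delay over a 1 km link: underwater acoustic versus radio.
# Speeds are rough nominal figures, not values from the paper.
SPEED_ACOUSTIC_M_S = 1500.0   # approximate speed of sound in sea water
SPEED_RADIO_M_S = 3.0e8       # approximate speed of light

def propagation_delay(distance_m, speed_m_s):
    return distance_m / speed_m_s

acoustic_delay = propagation_delay(1000.0, SPEED_ACOUSTIC_M_S)  # ~0.667 s
radio_delay = propagation_delay(1000.0, SPEED_RADIO_M_S)        # ~3.3 us
ratio = acoustic_delay / radio_delay                            # ~200,000x
```

    A five-order-of-magnitude difference in per-hop delay is why UASN MAC designs can afford far less idle waiting than their terrestrial counterparts.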

  17. NREL Test-to-Failure Protocol (Presentation)

    SciTech Connect

    Hacke, P.

    2012-03-01

    The presentation describes the test-to-failure protocol that was developed and piloted at NREL, stressing PV modules with multiple applications of damp heat (with bias) and thermal cycling until they fail.

  19. Oncotherapy: A System for Requesting Chemotherapy Protocols.

    PubMed

    Righi, Laura Vera

    2015-01-01

    A clinical decision support system is able to provide oncologists with suitable treatment options at the moment of decision making regarding which chemotherapy protocol is the best to apply to a particular oncological case. The National Cancer Institute has created a Guidelines Committee that establishes therapeutical options for each clinical case. The Health Informatics Department has developed Oncotherapy, a knowledge database that incorporates information provided by the Guidelines Committee. Oncotherapy includes a tailored information repository to provide oncologists in the public health system with the chemotherapy protocols available given three types of data: clinical diagnosis, clinical stage and therapy criteria. The protocol selected by the treating oncologist is sent back to Oncotherapy, which may create new knowledge that can be incorporated into the knowledge database. In this way, the system supports making the best decision according to the chemotherapy protocol options available. Furthermore, it can warn of errors that could result from mistakenly chosen therapies. PMID:26262420
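
    The three-key lookup described above can be sketched as a small keyed repository; the table contents and protocol names below are invented for illustration and are not from the Oncotherapy knowledge database.

```python
# Sketch of the lookup Oncotherapy performs: (clinical diagnosis,
# clinical stage, therapy criteria) -> available chemotherapy protocols.
# Table contents and protocol names are hypothetical examples.
REPOSITORY = {
    ("breast carcinoma", "II", "adjuvant"): ["AC-T", "TC"],
    ("colon carcinoma", "III", "adjuvant"): ["FOLFOX", "CAPOX"],
}

def protocol_options(diagnosis, stage, criteria):
    """Return the protocol options on file, or [] when none match."""
    return REPOSITORY.get((diagnosis, stage, criteria), [])

options = protocol_options("colon carcinoma", "III", "adjuvant")
```

    Feeding the oncologist's final selection back into such a table is how the system can accumulate new knowledge and flag choices outside the approved options.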

  20. Routing protocols in wireless sensor networks.

    PubMed

    Villalba, Luis Javier García; Orozco, Ana Lucila Sandoval; Cabrera, Alicia Triviño; Abbas, Cláudia Jacy Barenco

    2009-01-01

    The applications of wireless sensor networks comprise a wide variety of scenarios. In most of them, the network is composed of a significant number of nodes deployed in an extensive area in which not all nodes are directly connected. Then, the data exchange is supported by multihop communications. Routing protocols are in charge of discovering and maintaining the routes in the network. However, the appropriateness of a particular routing protocol mainly depends on the capabilities of the nodes and on the application requirements. This paper presents a review of the main routing protocols proposed for wireless sensor networks. Additionally, the paper includes the efforts carried out by Spanish universities on developing optimization techniques in the area of routing protocols for wireless sensor networks. PMID:22291515

  1. 21 CFR 58.120 - Protocol.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ..., body weight range, sex, source of supply, species, strain, substrain, and age of the test system. (5... methods to be used. (b) All changes in or revisions of an approved protocol and the reasons...

  3. Evolution of Natural Attenuation Evaluation Protocols

    EPA Science Inventory

    Traditionally the evaluation of the efficacy of natural attenuation was based on changes in contaminant concentrations and mass reduction. Statistical tools and models such as Bioscreen provided evaluation protocols which now are being approached via other vehicles including m...

  4. A practical quantum bit commitment protocol

    NASA Astrophysics Data System (ADS)

    Arash Sheikholeslam, S.; Aaron Gulliver, T.

    2012-01-01

    In this paper, we introduce a new quantum bit commitment protocol which is secure against entanglement attacks. A general cheating strategy is examined and shown to be practically ineffective against the proposed approach.

  5. Putting the Human Back in the Protocol

    NASA Astrophysics Data System (ADS)

    Christianson, Bruce

    Hello, everyone, and welcome to the 14th International Security Protocols Workshop. I’m going to start with a quotation from someone who, at least in principle, is in charge of a very different security community than ours:

  6. A Look Back at the Montreal Protocol

    NASA Video Gallery

    The Montreal Protocol is an international treaty designed to protect the ozone layer. This video takes a look back at how scientists, industry leaders, and policy makers came together to regulate C...

  7. Protocol dependence of the jamming transition.

    PubMed

    Bertrand, Thibault; Behringer, Robert P; Chakraborty, Bulbul; O'Hern, Corey S; Shattuck, Mark D

    2016-01-01

    We propose a theoretical framework for predicting the protocol dependence of the jamming transition for frictionless spherical particles that interact via repulsive contact forces. We study isostatic jammed disk packings obtained via two protocols: isotropic compression and simple shear. We show that for frictionless systems, all jammed packings can be obtained via either protocol. However, the probability to obtain a particular jammed packing depends on the packing-generation protocol. We predict the average shear strain required to jam initially unjammed isotropically compressed packings from the density of jammed packings, shape of their basins of attraction, and path traversed in configuration space. We compare our predictions to simulations of shear strain-induced jamming and find quantitative agreement. We also show that the packing fraction range, over which shear strain-induced jamming occurs, tends to zero in the large system limit for frictionless packings with overdamped dynamics. PMID:26871137

  8. Absolute dosimetry on a dynamically scanned sample for synchrotron radiotherapy using graphite calorimetry and ionization chambers

    NASA Astrophysics Data System (ADS)

    Lye, J. E.; Harty, P. D.; Butler, D. J.; Crosbie, J. C.; Livingstone, J.; Poole, C. M.; Ramanathan, G.; Wright, T.; Stevenson, A. W.

    2016-06-01

    The absolute dose delivered to a dynamically scanned sample in the Imaging and Medical Beamline (IMBL) on the Australian Synchrotron was measured with a graphite calorimeter anticipated to be established as a primary standard for synchrotron dosimetry. The calorimetry was compared to measurements using a free-air chamber (FAC), a PTW 31 014 Pinpoint ionization chamber, and a PTW 34 001 Roos ionization chamber. The IMBL beam height is limited to approximately 2 mm. To produce clinically useful beams of a few centimetres the beam must be scanned in the vertical direction. In practice it is the patient/detector that is scanned and the scanning velocity defines the dose that is delivered. The calorimeter, FAC, and Roos chamber measure the dose area product which is then converted to central axis dose with the scanned beam area derived from Monte Carlo (MC) simulations and film measurements. The Pinpoint chamber measures the central axis dose directly and does not require beam area measurements. The calorimeter and FAC measure dose from first principles. The calorimetry requires conversion of the measured absorbed dose to graphite to absorbed dose to water using MC calculations with the EGSnrc code. Air kerma measurements from the free air chamber were converted to absorbed dose to water using the AAPM TG-61 protocol. The two ionization chambers are secondary standards requiring calibration with kilovoltage x-ray tubes. The Roos and Pinpoint chambers were calibrated against the Australian primary standard for air kerma at the Australian Radiation Protection and Nuclear Safety Agency (ARPANSA). Agreement of order 2% or better was obtained between the calorimetry and ionization chambers. The FAC measured a dose 3-5% higher than the calorimetry, within the stated uncertainties.
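
    The air kerma to dose-to-water step mentioned above follows the AAPM TG-61 in-air method, Dw = M · Nk · Bw · Pstem,air · [(μ̄en/ρ)w/air]. The formula is from TG-61; the numeric inputs in this sketch are placeholders for illustration, not data from the paper.

```python
# AAPM TG-61 in-air method: convert an air-kerma measurement to absorbed
# dose to water at the phantom surface. The formula follows TG-61; the
# numeric inputs below are placeholder values, not data from this paper.
# Real coefficients come from the TG-61 tables for the beam quality in use.
def dose_to_water_tg61(M, Nk, Bw, Pstem_air, mu_en_ratio):
    """M: corrected chamber reading (C); Nk: air-kerma calibration
    coefficient (Gy/C); Bw: backscatter factor; Pstem_air: chamber stem
    correction; mu_en_ratio: water-to-air mass energy-absorption
    coefficient ratio, averaged over the photon spectrum."""
    return M * Nk * Bw * Pstem_air * mu_en_ratio

Dw = dose_to_water_tg61(M=2.0e-8, Nk=4.5e7, Bw=1.10, Pstem_air=1.0,
                        mu_en_ratio=1.05)
```

    For the scanned synchrotron beam, the paper converts dose-area product to central-axis dose using the beam area from Monte Carlo and film, a step this sketch omits.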

  11. A Novel Technique for Image-Guided Local Heart Irradiation in the Rat

    PubMed Central

    Sharma, Sunil; Moros, Eduardo G.; Boerma, Marjan; Sridharan, Vijayalakshmi; Han, Eun Young; Clarkson, Richard; Hauer-Jensen, Martin; Corry, Peter M.

    2014-01-01

    In radiotherapy treatment of thoracic, breast and chest wall tumors, the heart may be included (partially or fully) in the radiation field. As a result, patients may develop radiation-induced heart disease (RIHD) several years after exposure to radiation. There are few methods available to prevent or reverse RIHD, and the biological mechanisms remain poorly understood. In order to further study the effects of radiation on the heart, we developed a model of local heart irradiation in rats using an image-guided small animal conformal radiation therapy device (SACRTD) developed at our institution. First, Monte Carlo-based simulations were used to design an appropriate collimator. EBT-2 films were used to measure relative dosimetry, and the absolute dose rate at the isocenter was measured using the AAPM protocol TG-61. The hearts of adult male Sprague-Dawley rats were irradiated with a total dose of 21 Gy. For this purpose, rats were anesthetized with isoflurane and placed in a custom-made vertical rat holder. Each heart was irradiated with a 3-beam technique (one AP field and 2 lateral fields), with each beam delivering 7 Gy. For each field, the heart was visualized with a digital flat panel X-ray imager and placed at the isocenter of the 1.8 cm diameter beam. In biological analysis of radiation exposure, immunohistochemistry showed γH2Ax foci and nitrotyrosine throughout the irradiated hearts but not in the lungs. Long-term follow-up of animals revealed histopathological manifestations of RIHD, including myocardial degeneration and fibrosis. These results demonstrate that the rat heart irradiation technique using the SACRTD was successful and that surrounding untargeted tissues were spared, making this approach a powerful tool for in vivo radiobiological studies of RIHD. Studies of functional and structural changes in the rat heart after local irradiation are ongoing. PMID:24000983

  12. Protocol Development | Division of Cancer Prevention

    Cancer.gov

    The chemoprevention Phase I and II consortia must submit Letters of Intent (LOIs) for review and approval prior to the submission and review of the protocol. DCP will solicit Letters of Intent from investigators who want to conduct clinical trials with specific agents.

  13. Broadening and Simplifying the First SETI Protocol

    NASA Astrophysics Data System (ADS)

    Michaud, M. A. G.

    The Declaration of Principles Concerning Activities Following the Detection of Extraterrestrial Intelligence, known informally as the First SETI Protocol, is the primary existing international guidance on this subject. During the fifteen years since the document was issued, several people have suggested revisions or additional protocols. This article proposes a broadened and simplified text that would apply to the detection of alien technology in our solar system as well as to electromagnetic signals from more remote sources.

  14. Field Monitoring Protocol: Heat Pump Water Heaters

    SciTech Connect

    Sparn, B.; Earle, L.; Christensen, D.; Maguire, J.; Wilson, E.; Hancock, E.

    2013-02-01

    This document provides a standard field monitoring protocol for evaluating the installed performance of Heat Pump Water Heaters in residential buildings. The report is organized to be consistent with the chronology of field test planning and execution. Research questions are identified first, followed by a discussion of analysis methods, and then the details of measuring the required information are laid out. A field validation of the protocol at a house near the NREL campus is included for reference.

  15. Protocol for communications in potentially noisy environments

    DOEpatents

    Boyd, Gerald M.; Farrow, Jeffrey

    2016-02-09

    A communications protocol that is designed for transmission of data in networks that are subjected to harsh conditions is described herein. A network includes a plurality of devices, where the devices comprise respective nodes. The nodes are in communication with one another by way of a central network hub. The protocol causes the nodes to transmit data over a network bus at different data rates depending upon whether the nodes are operating normally or an arbitration procedure has been invoked.
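
    The rate-switching behavior this record describes can be sketched as a simple selection rule; the specific rates below are invented for illustration, since the abstract does not give numbers.

```python
# Sketch of the rate-selection rule described above: nodes use the normal
# bus rate during regular operation and drop to a slower, more robust rate
# while an arbitration procedure is in progress. The rates are hypothetical.
NORMAL_RATE_BPS = 1_000_000
ARBITRATION_RATE_BPS = 125_000

def select_data_rate(arbitration_active: bool) -> int:
    """Pick the transmit rate for a node based on the bus state."""
    return ARBITRATION_RATE_BPS if arbitration_active else NORMAL_RATE_BPS

normal = select_data_rate(False)
fallback = select_data_rate(True)
```

    Slowing the bus during arbitration trades throughput for reliability, which fits the harsh-environment setting the patent targets.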

  17. A verification system of RMAP protocol controller

    NASA Astrophysics Data System (ADS)

    Khanov, V. Kh; Shakhmatov, A. V.; Chekmarev, S. A.

    2015-01-01

    The functional verification problem for IP blocks of an RMAP protocol controller is considered. The application of a verification method using fully functional models of the processor and the internal bus of a system-on-chip is justified. Principles for constructing a verification system based on this approach are proposed, and practical results of creating a verification system for an RMAP protocol controller IP block are presented.

  18. The Variable Rate Intravenous Insulin Infusion Protocol.

    PubMed

    Collard, Benjamin; Sturgeon, Jonathan; Patel, Natasha; Asharia, Shabbar

    2014-01-01

    Insulin use among inpatients is high and associated with severe and regular medication errors. An initial baseline audit showed a wide variation in the prescription of intravenous insulin within the trust, including variation in the choice of fluid prescribed, electrolyte levels not consistently checked, handwritten illegible prescriptions, and varying parameters set for adjustment of the prescription. A Variable Rate Intravenous Insulin Infusion (VRIII) protocol was introduced to standardize intravenous insulin prescription throughout the trust by all members of the clinical team. We measured uptake and effects of the VRIII protocol in improving standardization of insulin prescription for inpatients on insulin at St George's NHS trust. The protocol was uploaded to the intranet to allow access 24 hours a day, and staff were educated about it. The VRIII protocol was routinely and successfully used throughout the trust. Any initial problems were addressed through education of clinical staff. The protocol has shown decreased prescribing and administrative errors, whilst demonstrating good glucose and electrolyte control. Use of a standardized protocol helps reduce medication errors and demonstrates good glycaemic control. Regular and continued education of clinical staff is necessary to maintain its efficacy. PMID:26734228

  19. Protocol for Communication Networking for Formation Flying

    NASA Technical Reports Server (NTRS)

    Jennings, Esther; Okino, Clayton; Gao, Jay; Clare, Loren

    2009-01-01

    An application-layer protocol and a network architecture have been proposed for data communications among multiple autonomous spacecraft that are required to fly in a precise formation in order to perform scientific observations. The protocol could also be applied to other autonomous vehicles operating in formation, including robotic aircraft, robotic land vehicles, and robotic underwater vehicles. A group of spacecraft or other vehicles to which the protocol applies could be characterized as a precision-formation-flying (PFF) network, and each vehicle could be characterized as a node in the PFF network. In order to support precise formation flying, it would be necessary to establish a corresponding communication network, through which the vehicles could exchange position and orientation data and formation-control commands. The communication network must enable communication during early phases of a mission, when little positional knowledge is available. Particularly during early mission phases, the distances among vehicles may be so large that communication could be achieved only by relaying across multiple links. The large distances and need for omnidirectional coverage would limit communication links to operation at low bandwidth during these mission phases. Once the vehicles were in formation and distances were shorter, the communication network would be required to provide high-bandwidth, low-jitter service to support tight formation-control loops. The proposed protocol and architecture, intended to satisfy the aforementioned and other requirements, are based on a standard layered-reference-model concept. The proposed application protocol would be used in conjunction with conventional network, data-link, and physical-layer protocols. The proposed protocol includes the ubiquitous Institute of Electrical and Electronics Engineers (IEEE) 802.11 medium access control (MAC) protocol to be used in the data-link layer.
In addition to its widespread and proven use in

  20. The Space Communications Protocol Standards Program

    NASA Astrophysics Data System (ADS)

    Jeffries, Alan; Hooke, Adrian J.

    1994-11-01

    In the fall of 1992 NASA and the Department of Defense chartered a technical team to explore the possibility of developing a common set of space data communications standards for potential dual-use across the U.S. national space mission support infrastructure. The team focused on the data communications needs of those activities associated with on-line control of civil and military aircraft. A two-pronged approach was adopted: a top-down survey of representative civil and military space data communications requirements was conducted; and a bottom-up analysis of available standard data communications protocols was performed. A striking intersection of civil and military space mission requirements emerged, and an equally striking consensus on the approach towards joint civil and military space protocol development was reached. The team concluded that wide segments of the U.S. civil and military space communities have common needs for: (1) an efficient file transfer protocol; (2) various flavors of underlying data transport service; (3) an optional data protection mechanism to assure end-to-end security of message exchange; and (4) an efficient internetworking protocol. These recommendations led to initiating a program to develop a suite of protocols based on these findings. This paper describes the current status of this program.

  1. Standardized North American marsh bird monitoring protocol

    USGS Publications Warehouse

    Conway, Courtney J.

    2011-01-01

    Little is known about the population status of many marsh-dependent birds in North America but recent efforts have focused on collecting more reliable information and estimates of population trends. As part of that effort, a standardized survey protocol was developed in 1999 that provided guidance for conducting marsh bird surveys throughout North America such that data would be consistent among locations. The original survey protocol has been revised to provide greater clarification on many issues as the number of individuals using the protocol has grown. The Standardized North American Marsh Bird Monitoring Protocol instructs surveyors to conduct an initial 5-minute passive point-count survey followed by a series of 1-minute segments during which marsh bird calls are broadcast into the marsh following a standardized approach. Surveyors are instructed to record each individual bird from the suite of 26 focal species that are present in their local area on separate lines of a datasheet and estimate the distance to each bird. Also, surveyors are required to record whether each individual bird was detected within each 1-minute subsegment of the survey. These data allow analysts to use several different approaches for estimating detection probability. The Standardized North American Marsh Bird Monitoring Protocol provides detailed instructions that explain the field methods used to monitor marsh birds in North America.
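
    The per-minute detection records described above can be represented as simple detection histories. The sketch below computes a naive per-minute detection probability from such histories; real marsh bird analyses use removal or distance-sampling models, and the data here are invented for illustration.

```python
# Illustrative only: detection histories across the 1-minute subsegments of
# a survey. Each row is one individual bird; 1 = detected in that minute.
# The naive estimator below is a simplification of the approaches the
# protocol's data actually support (removal and distance-sampling models).

histories = [
    [1, 0, 1, 0, 1],
    [0, 0, 1, 1, 0],
    [1, 1, 1, 0, 0],
]

def naive_detection_probability(histories):
    """Fraction of bird-minutes in which a known-present bird was detected."""
    detections = sum(sum(h) for h in histories)
    bird_minutes = sum(len(h) for h in histories)
    return detections / bird_minutes

p = naive_detection_probability(histories)  # 8 detections / 15 bird-minutes
```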

  2. Collaborative patient care protocols: a development process.

    PubMed

    Blaufuss, J; Wynn, J; Hujcs, M

    1993-01-01

    Computerization of these protocols is in progress. This project is funded for one year with projected completion in January 1994. This study will form a framework in which further research can be completed. Utilizing protocols will allow the measurement of nursing decision making by testing relationships between parameters and interventions and by identifying rules for decision making. For example, questions that may be answered include which physiologic parameters do clinicians treat and in what order or priority, as well as what is the impact on patient outcomes in regard to cost of care and complications. Computerized patient care protocols can be further developed to meet patient-specific needs. A computerized data base will facilitate managing large amounts of patient data and tailoring instructions to these patients. One of the goals of this project was to measure the feasibility of developing computerized patient care protocols and implementing them in a critical care setting. Eventually, this experience will facilitate implementing computerized protocols at other sites. An additional benefit is the ability to implement continuous quality improvement strategies in a prospective manner rather than by retrospective review. PMID:10171735

  3. Sensitive localized surface plasmon resonance multiplexing protocols.

    PubMed

    Jia, Kun; Bijeon, Jean L; Adam, Pierre M; Ionescu, Rodica E

    2012-09-18

    Herein are reported two new protocols to obtain different zones of localized surface plasmon resonance (LSPR) gold nanostructures on single glass substrate by using a vacuum evaporation technique followed by a high-temperature annealing (550 °C). The thickness of the gold film, considered as the essential parameter to determine specific LSPR properties, is successfully modulated. In the first protocol, a metal mask is integrated onto the glass substrate during vacuum evaporation to vary the gold film thickness by a "shadowing effect", while in the second protocol several evaporation cycles (up to four cycles) at predefined areas onto the single substrate are performed. The resulting gold-modified samples are characterized using a transmission UV-vis extinction optical setup and scanning electron microscopy (SEM). The size distribution histograms of nanoparticles are also acquired. By employing the first protocol, thanks to the presence of different zones of gold nanoparticles on a single substrate, optimized LSPR responses to different (bio)functionalization zones are rapidly screened. Independently, the second protocol exhibited an excellent correlation between the nominal evaporated gold film thickness, gold nanoparticle sizes, and plasmonic properties (resonant wavelength and peak amplitude). Such substrates are further used in the construction of LSPR immunosensors for the detection of atrazine herbicide.

  4. The Interplanetary Overlay Networking Protocol Accelerator

    NASA Technical Reports Server (NTRS)

    Pang, Jackson; Torgerson, Jordan L.; Clare, Loren P.

    2008-01-01

    A document describes the Interplanetary Overlay Networking Protocol Accelerator (IONAC), an electronic apparatus, now under development, for relaying data at high rates in spacecraft and interplanetary radio-communication systems utilizing a delay-tolerant networking protocol. The protocol includes provisions for transmission and reception of data in bundles (essentially, messages), transfer of custody of a bundle to a recipient relay station at each step of a relay, and return receipts. Because of limitations on energy resources available for such relays, data rates attainable in a conventional software implementation of the protocol are lower than those needed, at any given reasonable energy-consumption rate. Therefore, a main goal in developing the IONAC is to reduce energy consumption by an order of magnitude and to increase data-throughput capability by two orders of magnitude. The IONAC prototype is a field-programmable gate array that serves as a reconfigurable hybrid (hardware/firmware) system for implementation of the protocol. The prototype can decode 108,000 bundles per second and encode 100,000 bundles per second. It includes a bundle-cache static random-access memory that enables maintenance of a throughput of 2.7 Gb/s, and an Ethernet convergence layer that supports a duplex throughput of 1 Gb/s.

  5. STATISTICAL PRINCIPLES FOR PROSPECTIVE STUDY PROTOCOLS:

    PubMed Central

    Langberg, Henning

    2012-01-01

    In the design of scientific studies it is essential to decide on which scientific questions one aims to answer, just as it is important to decide on the correct statistical methods to use to answer these questions. The correct use of statistical methods is crucial in all aspects of research to quantify relationships in data. Despite an increased focus on statistical content and complexity of biomedical research these topics remain difficult for most researchers. Statistical methods enable researchers to condense large spreadsheets with data into means, proportions, differences between means, risk differences, and other quantities that convey information. One of the goals in biomedical research is to develop parsimonious models ‐ meaning as simple as possible. This approach is valid if the subsequent research report (the article) is written independent of whether the results are “statistically significant” or not. In the present paper we outline the considerations and suggestions on how to build a trial protocol, with an emphasis on having a rigorous protocol stage, always leading to a full article manuscript, independent of statistical findings. We conclude that authors, who find (rigorous) protocol writing too troublesome, will realize that they have already written the first half of the final paper if they follow these recommendations; authors simply need to change the protocol's future tense into past tense. Thus, the aim of this clinical commentary is to describe and explain the statistical principles for trial protocols in terms of design, analysis, and reporting of findings. PMID:23091782

  6. Incorporating ethical principles into clinical research protocols: a tool for protocol writers and ethics committees.

    PubMed

    Li, Rebecca H; Wacholtz, Mary C; Barnes, Mark; Boggs, Liam; Callery-D'Amico, Susan; Davis, Amy; Digilova, Alla; Forster, David; Heffernan, Kate; Luthin, Maeve; Lynch, Holly Fernandez; McNair, Lindsay; Miller, Jennifer E; Murphy, Jacquelyn; Van Campen, Luann; Wilenzick, Mark; Wolf, Delia; Woolston, Cris; Aldinger, Carmen; Bierer, Barbara E

    2016-04-01

    A novel Protocol Ethics Tool Kit ('Ethics Tool Kit') has been developed by a multi-stakeholder group of the Multi-Regional Clinical Trials Center of Brigham and Women's Hospital and Harvard. The purpose of the Ethics Tool Kit is to facilitate effective recognition, consideration and deliberation of critical ethical issues in clinical trial protocols. The Ethics Tool Kit may be used by investigators and sponsors to develop a dedicated Ethics Section within a protocol to improve the consistency and transparency between clinical trial protocols and research ethics committee reviews. It may also streamline ethics review and may facilitate and expedite the review process by anticipating the concerns of ethics committee reviewers. Specific attention was given to issues arising in multinational settings. With the use of this Tool Kit, researchers have the opportunity to address critical research ethics issues proactively, potentially speeding the time and easing the process to final protocol approval. PMID:26811365

  7. Incorporating ethical principles into clinical research protocols: a tool for protocol writers and ethics committees

    PubMed Central

    Li, Rebecca H; Wacholtz, Mary C; Barnes, Mark; Boggs, Liam; Callery-D'Amico, Susan; Davis, Amy; Digilova, Alla; Forster, David; Heffernan, Kate; Luthin, Maeve; Lynch, Holly Fernandez; McNair, Lindsay; Miller, Jennifer E; Murphy, Jacquelyn; Van Campen, Luann; Wilenzick, Mark; Wolf, Delia; Woolston, Cris; Aldinger, Carmen; Bierer, Barbara E

    2016-01-01

    A novel Protocol Ethics Tool Kit (‘Ethics Tool Kit’) has been developed by a multi-stakeholder group of the Multi-Regional Clinical Trials Center of Brigham and Women's Hospital and Harvard. The purpose of the Ethics Tool Kit is to facilitate effective recognition, consideration and deliberation of critical ethical issues in clinical trial protocols. The Ethics Tool Kit may be used by investigators and sponsors to develop a dedicated Ethics Section within a protocol to improve the consistency and transparency between clinical trial protocols and research ethics committee reviews. It may also streamline ethics review and may facilitate and expedite the review process by anticipating the concerns of ethics committee reviewers. Specific attention was given to issues arising in multinational settings. With the use of this Tool Kit, researchers have the opportunity to address critical research ethics issues proactively, potentially speeding the time and easing the process to final protocol approval. PMID:26811365

  9. Automated plan-recognition of chemotherapy protocols

    PubMed Central

    Bhatia, Haresh; Levy, Mia

    2011-01-01

    Cancer patients are often treated with multiple sequential chemotherapy protocols ranging in complexity from simple to highly complex patterns of multiple repeating drugs. Clinical documentation procedures that focus on details of single drug events, however, make it difficult for providers and systems to efficiently abstract the sequence and nature of treatment protocols. We have developed a data driven method for cancer treatment plan recognition that takes as input pharmacy chemotherapy dispensing records and produces the sequence of identified chemotherapy protocols. Compared to a manually annotated gold standard, our method was 75% accurate and 80% precise for a breast cancer testing set (110 patients, 2,029 drug events), and 54% accurate and 63% precise for a lung cancer testing set (53 patients, 670 drug events). This method for cancer treatment plan recognition may provide clinicians and systems an abstracted view of the patient’s treatment history. PMID:22195061
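
    The accuracy and precision figures above can be illustrated with a toy scoring sketch. The paper's exact matching rules are not reproduced here; the protocol names and the per-item comparison below are illustrative assumptions.

```python
# Hypothetical sketch of scoring a recognized protocol sequence against a
# manually annotated gold standard. Protocol names are invented examples.

gold      = ["AC", "AC", "paclitaxel", "tamoxifen"]   # annotated sequence
predicted = ["AC", "AC", "paclitaxel"]                # recognizer output

# Count positions where the recognized protocol matches the annotation.
matches = sum(g == p for g, p in zip(gold, predicted))

accuracy = matches / len(gold)        # fraction of gold items recovered
precision = matches / len(predicted)  # fraction of predictions that are correct
```

    With these toy sequences the recognizer recovers 3 of 4 annotated protocols (accuracy 0.75) and every protocol it emits is correct (precision 1.0), illustrating why the two metrics can differ.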

  10. Efficient Controlled Quantum Secure Direct Communication Protocols

    NASA Astrophysics Data System (ADS)

    Patwardhan, Siddharth; Moulick, Subhayan Roy; Panigrahi, Prasanta K.

    2016-07-01

    We study controlled quantum secure direct communication (CQSDC), a cryptographic scheme in which a sender can transmit a secret bit-string to an intended recipient, without any secure classical channel, such that the recipient can obtain the complete bit-string only with the permission of a controller. We report an efficient protocol to realize CQSDC using the cluster state and then go on to construct a (2-3)-CQSDC using the Brown state, where a coalition of any two of the three controllers is required to retrieve the complete message. We argue that both protocols are unconditionally secure and analyze their efficiency, showing that they outperform the existing schemes while maintaining the same security specifications.

  11. Chapter 22: Compressed Air Evaluation Protocol

    SciTech Connect

    Benton, N.

    2014-11-01

    Compressed-air systems are used widely throughout industry for many operations, including pneumatic tools, packaging and automation equipment, conveyors, and other industrial process operations. Compressed-air systems are defined as a group of subsystems composed of air compressors, air treatment equipment, controls, piping, pneumatic tools, pneumatically powered machinery, and process applications using compressed air. A compressed-air system has three primary functional subsystems: supply, distribution, and demand. Air compressors are the primary energy consumers in a compressed-air system and are the primary focus of this protocol. The two compressed-air energy efficiency measures specifically addressed in this protocol are: high-efficiency/variable speed drive (VSD) compressor replacing modulating compressor; compressed-air leak survey and repairs. This protocol provides direction on how to reliably verify savings from these two measures using a consistent approach for each.

  12. Chapter 15: Commercial New Construction Protocol

    SciTech Connect

    Keates, S.

    2014-09-01

    This protocol is intended to describe the recommended method when evaluating the whole-building performance of new construction projects in the commercial sector. The protocol focuses on energy conservation measures (ECMs), or packages of measures, where evaluators can best analyze impacts using building simulation. These ECMs typically require the use of calibrated building simulations under Option D of the International Performance Measurement and Verification Protocol. Examples of such measures include Leadership in Energy and Environmental Design building certification, novel and/or efficient heating, ventilation, and air conditioning system designs, and extensive building controls systems. In general, it is best to evaluate any ECM (or set of measures) expected to significantly interact with other systems within the building and with savings sensitive to seasonal variations in weather.

  13. An object-oriented communication protocol

    NASA Astrophysics Data System (ADS)

    Chapman, Lee J.

    1990-08-01

    OOC is a high-level (OSI application layer and above) controls communication protocol. While natural languages are too complex for the controls environment, computer protocols are often insufficiently expressive. OOC is an attempt to balance simplicity and expressivity, and is sufficiently flexible to express data acquisition, control requests, alarm messages and error messages in a straightforward generic way. OOC supports dynamic creation of objects. It can be used in networks, for intertask and even for intratask communication. OOC, the protocol, is supported by OOC, the code, written in portable C. The lower level of this code supports a tagged-data scheme, with tags as elementary as INTEGER and BOOLEAN and as complex as OBJECT and MESSAGE. These tagged data can be evaluated in a LISP-like way.
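
    The tagged-data scheme can be sketched as follows. The actual OOC code is written in C; this Python fragment only illustrates the LISP-like recursive evaluation of tags such as INTEGER, BOOLEAN, OBJECT, and MESSAGE, with evaluation rules assumed for illustration.

```python
# Illustrative tagged-data evaluator. Tag names follow the abstract
# (INTEGER, BOOLEAN, OBJECT, MESSAGE); the evaluation rules are assumptions.

def evaluate(tagged):
    tag, value = tagged
    if tag in ("INTEGER", "BOOLEAN"):
        return value                  # elementary tags evaluate to themselves
    if tag == "OBJECT":
        # treat an object as a named collection of tagged attributes
        return {name: evaluate(attr) for name, attr in value.items()}
    if tag == "MESSAGE":
        # treat a message as a sequence of tagged items, evaluated in order
        return [evaluate(item) for item in value]
    raise ValueError(f"unknown tag: {tag}")

msg = ("MESSAGE", [
    ("INTEGER", 42),
    ("OBJECT", {"armed": ("BOOLEAN", True)}),
])
assert evaluate(msg) == [42, {"armed": True}]
```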

  14. A class-chest for deriving transport protocols

    SciTech Connect

    Strayer, W.T.

    1996-10-01

    Development of new transport protocols or protocol algorithms suffers from the complexity of the environment in which they are intended to run. Modeling techniques attempt to avoid this by simulating the environment. Another approach to promoting rapid prototyping of protocols and protocol algorithms is to provide a pre-built infrastructure that is common to transport protocols, so that the focus is placed on the protocol-specific aspects. The Meta-Transport Library is a library of C++ base classes that implement or abstract out the mundane functions of a protocol; new protocol implementations are derived from these base classes. The result is a fully viable user-level transport protocol implementation, with emphasis on modularity. The collection of base classes forms a "class-chest" of tools from which protocols can be developed and studied with as little change to a normal UNIX environment as possible.
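
    The Meta-Transport Library itself is written in C++; the following Python analog merely sketches the "class-chest" idea: a base class supplies the mundane, protocol-independent machinery, and a new protocol derives from it and overrides only the protocol-specific hook. All class and method names here are hypothetical.

```python
# Python analog (not the actual C++ library) of deriving a protocol from a
# base class that already implements the common machinery.

class TransportBase:
    """Implements bookkeeping shared by all derived protocols."""
    def __init__(self):
        self.sent = []

    def send(self, payload):
        packet = self.encapsulate(payload)   # protocol-specific hook
        self.sent.append(packet)             # common, protocol-independent part
        return packet

    def encapsulate(self, payload):
        raise NotImplementedError("derived protocols supply this")

class ToyReliableProtocol(TransportBase):
    """Only the protocol-specific part is written by the protocol author."""
    def encapsulate(self, payload):
        return {"seq": len(self.sent), "data": payload}

proto = ToyReliableProtocol()
assert proto.send("hello") == {"seq": 0, "data": "hello"}
assert proto.send("world") == {"seq": 1, "data": "world"}
```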

  15. Cryptanalysis of the Quantum Group Signature Protocols

    NASA Astrophysics Data System (ADS)

    Zhang, Ke-Jia; Sun, Ying; Song, Ting-Ting; Zuo, Hui-Juan

    2013-11-01

    Recently, the researches of quantum group signature (QGS) have attracted a lot of attentions and some typical protocols have been designed for e-payment system, e-government, e-business, etc. In this paper, we analyze the security of the quantum group signature with the example of two novel protocols. It can be seen that both of them cannot be implemented securely since the arbitrator cannot solve the disputes fairly. In order to show that, some possible attack strategies, which can be used by the malicious participants, are proposed. Moreover, the further discussions of QGS are presented finally, including some insecurity factors and improved ideas.

  16. A Protocol for Evaluating Contextual Design Principles

    PubMed Central

    Stamps, Arthur

    2014-01-01

    This paper explains how scientific data can be incorporated into urban design decisions, such as evaluating contextual design principles. The recommended protocols are based on the Cochrane Reviews that have been widely used in medical research. The major concepts of a Cochrane Review are explained, as well as the underlying mathematics, which is meta-analysis. Data are reported for three applications and seven contextual design policies. It is suggested that use of the Cochrane protocols will be of great assistance to planners by providing scientific data that can be used to evaluate the efficacies of contextual design policies prior to implementing those policies. PMID:25431448
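
    The underlying meta-analysis can be illustrated with a minimal fixed-effect (inverse-variance) pooling sketch. The effect sizes and standard errors below are invented for illustration and are not data from the paper.

```python
# Fixed-effect (inverse-variance) meta-analysis: each study is weighted by
# the inverse of its variance, so more precise studies count for more.
# All numbers below are illustrative, not from the paper.

def fixed_effect_pool(effects, std_errs):
    weights = [1.0 / se ** 2 for se in std_errs]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = (1.0 / sum(weights)) ** 0.5
    return pooled, pooled_se

effects = [0.30, 0.10, 0.25]     # per-study effect sizes (illustrative)
std_errs = [0.10, 0.05, 0.20]    # per-study standard errors (illustrative)
pooled, se = fixed_effect_pool(effects, std_errs)
# weights are 100, 400, 25, so the precise middle study dominates the pool
```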

  17. Building multiservice Internet protocol virtual private networks

    NASA Astrophysics Data System (ADS)

    Cheung, William

    1999-11-01

    Multiservice Internet Protocol-based Virtual Private Networks (MIP-VPNs) with Quality of Service (QoS) are becoming a reality due to the availability of new standards from the Internet Engineering Task Force (IETF). This paper describes how components including security models, IP tunneling protocols, and service differentiation schemes fit together in order to construct such a VPN. First, the concept and rationale of VPN is presented, followed by a discussion of its supporting components. A comparison is made among the various VPN technologies.

  18. Social Protocols for Agile Virtual Teams

    NASA Astrophysics Data System (ADS)

    Picard, Willy

    Despite many works on collaborative networked organizations (CNOs), CSCW, groupware, workflow systems and social networks, computer support for virtual teams is still insufficient, especially support for agility, i.e. the capability of virtual team members to rapidly and cost-efficiently adapt the way they interact to changes. In this paper, requirements for computer support for agile virtual teams are presented. Next, an extension of the concept of social protocol is proposed as a novel model supporting agile interactions within virtual teams. The extended concept of social protocol consists of an extended social network and a workflow model.

  19. In Brief: Kyoto Protocol moves forward

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    2004-10-01

    The Russian cabinet's 30 September endorsement of the Kyoto Protocol to the United Nations Framework Convention on Climate Change (UNFCCC) likely clears the way for the treaty's ratification by that country's parliament and for its entry into force. The protocol enters into force when not less than 55 Parties to the Convention, including industrialized countries (so-called "Annex I Parties") which accounted in total for at least 55% of the total carbon dioxide emissions for 1990 from that group, officially have agreed to the treaty.
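
    The entry-into-force condition described above can be expressed as a simple check: at least 55 Parties must have agreed, and the agreeing Annex I Parties must together account for at least 55% of that group's 1990 carbon dioxide emissions. The ratifier data below is purely illustrative.

```python
# The two-part entry-into-force test described in the abstract.
# The ratifier list is invented for illustration.

def enters_into_force(ratifiers):
    """ratifiers: list of (is_annex_i, share_of_annex_i_1990_co2) tuples."""
    enough_parties = len(ratifiers) >= 55
    annex_i_share = sum(share for is_a1, share in ratifiers if is_a1)
    return enough_parties and annex_i_share >= 0.55

# 60 ratifiers, of which the Annex I ones cover 57% of 1990 group emissions
ratifiers = [(True, 0.19)] * 3 + [(False, 0.0)] * 57
assert enters_into_force(ratifiers)
```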

  20. The Parasol Protocol for computational mutagenesis.

    PubMed

    Aronica, P G A; Verma, C; Popovic, B; Leatherbarrow, R J; Gould, I R

    2016-07-01

    To aid in the discovery and development of peptides and proteins as therapeutic agents, a virtual screen can be used to predict trends and direct workflow. We have developed the Parasol Protocol, a dynamic method implemented using the AMBER MD package, for computational site-directed mutagenesis. This tool can mutate between any pair of amino acids in a computationally expedient, automated manner. To demonstrate the potential of this methodology, we have employed the protocol to investigate a test case involving stapled peptides, and have demonstrated good agreement with experiment. PMID:27255759

  1. NADA protocol: integrative acupuncture in addictions.

    PubMed

    Carter, Kenneth; Olshan-Perlmutter, Michelle

    2014-01-01

    National Acupuncture Detoxification Association (NADA) acupuncture is a simple, standardized, 1- to 5-point auricular needling protocol that originated as a grass-roots response to the opiate epidemic of the 1970s. NADA acupuncture is increasingly recognized as a universally useful intervention in the treatment of addictions specifically and in behavioral health more generally. It is recognized as a best practice in the treatment of substance use disorders. Integrative programs using the NADA protocol are likely to see improvements in engagement and retention, decreased drug cravings and anxiety, and fewer physical symptoms.

  2. The Kyoto Protocol: A business perspective

    SciTech Connect

    Malin, C.B.

    1998-01-19

    Governments have made a tentative start in responding to climate change. In marathon negotiating sessions that extended into an extra day Dec. 1--11 in Kyoto, Japan, representatives from more than 160 governments hammered out the Kyoto Protocol to the United Nations Framework Convention on Climate Change (FCCC). The protocol calls for developed countries to reduce emissions of greenhouse gases (GHGs) on average by 5.2% below 1990 levels by the years 2008--2012. Developing countries have no new obligations. The paper discusses the agreement, ratification, future questions, business role, and the challenge.

  3. Quantum Private Comparison Protocol with Linear Optics

    NASA Astrophysics Data System (ADS)

    Luo, Qing-bin; Yang, Guo-wu; She, Kun; Li, Xiaoyu

    2016-09-01

    In this paper, we propose an innovative quantum private comparison (QPC) protocol based on partial Bell-state measurement from the view of linear optics, enabling two parties to compare the equality of their private information with the help of a semi-honest third party. Partial Bell-state measurement has been realized by using only linear optical elements in experimental measurement-device-independent quantum key distribution (MDI-QKD) schemes, which makes us believe that our protocol can be realized in the near future. The security analysis shows that the participants will not leak their private information.

  4. Interpolation of recurrence and hashing entanglement distillation protocols

    SciTech Connect

    Vollbrecht, Karl Gerd H.; Verstraete, Frank

    2005-06-15

    We construct interesting entanglement distillation protocols by interpolating between the recurrence and hashing protocols. This leads to asymptotic two-way distillation protocols, resulting in an improvement of the distillation rate for all mixed Bell diagonal entangled states, even for the ones with very high fidelity. We also present a method by which entanglement-assisted distillation protocols can be converted into non-entanglement-assisted protocols with the same yield.

  5. From Postpartum Haemorrhage Guideline to Local Protocol: A Study of Protocol Quality.

    PubMed

    Woiski, Mallory D; van Vugt, Helena C; Dijkman, Anneke; Grol, Richard P; Marcus, Abraham; Middeldorp, Johanna M; Mol, Ben W; Mols, Femke; Oudijk, Martijn A; Porath, Martina; Scheepers, Hubertina J; Hermens, Rosella P

    2016-10-01

    Objective Postpartum hemorrhage (PPH) has a continuously rising incidence worldwide, suggesting suboptimal care. An important step in optimizing care is the translation of evidence-based guidelines into comprehensive hospital protocols. However, knowledge about the quality of these protocols is lacking. The objective of this study was to evaluate the quality of PPH-protocols in structure and content in the Netherlands. Methods We performed an observational multicenter study. Eighteen PPH-protocols from 3 University Hospitals (UH), 8 Teaching Hospitals (TH) and 7 Non-Teaching Hospitals (NTH) throughout the Netherlands were acquired. The structure of the PPH-protocols was assessed using the Appraisal of Guidelines for Research and Evaluation (AGREE-II) Instrument. The content was appraised using previously developed quality indicators, based on international guidelines and Advanced-Trauma-Life-Support (ATLS)-based course instructions. Results The quality of the protocols for postpartum hemorrhage, in both structure and content, varied widely between hospitals, but all showed room for improvement. The protocols scored mainly below average on the different items of the AGREE-II instrument (8 of the 10 items scored <4 on a 1-7 scale). Regarding content, adoption of guideline recommendations in protocols was 46%. In addition, a timely indication of 'when to perform' a recommendation was lacking for three-fourths of the items. Conclusion This study shows that the quality of the PPH-protocols, in both structure and content, in the Netherlands is suboptimal. This makes adherence to the guideline and ATLS-based course instructions difficult. PMID:27395381

  6. Evaluating Computer-Tutors: A Protocol Study.

    ERIC Educational Resources Information Center

    Strickland, James

    A protocol study investigated whether computer tutors (programs that interactively guide writers while they freewrite with a word processing program) promote or hinder a richer understanding of the composing process. The analysis focused on writers' attitudes toward computer tutors in the invention process. Data were collected by tape recording a…

  7. 16 CFR 1212.4 - Test protocol.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... results and acceptance criterion. To determine whether a surrogate multi-purpose lighter resists operation... STANDARD FOR MULTI-PURPOSE LIGHTERS Requirements for Child-Resistance § 1212.4 Test protocol. (a) Child test panel. (1) The test to determine if a multi-purpose lighter is resistant to successful...

  8. A Bayesian approach to optimizing cryopreservation protocols.

    PubMed

    Sambu, Sammy

    2015-01-01

    Cryopreservation is beset with the challenge of protocol alignment across a wide range of cell types and process variables. By taking a cross-sectional assessment of previously published cryopreservation data (sample means and standard errors) as preliminary meta-data, a decision tree learning analysis (DTLA) was performed to develop an understanding of target survival using optimized pruning methods based on different approaches. Briefly, a clear decision pathway for method selection was developed, with the key choices being cooling rate and plunge temperature on the one hand, and biomaterial choice, use of composites (sugars and proteins as additional constituents), loading procedure, and cell location in 3D scaffolding on the other. Secondly, using machine learning and generalized approaches via the Naïve Bayes Classification (NBC) method, these meta-data were used to develop posterior probabilities for combinatorial approaches that were implicitly recorded in the meta-data. These latter results showed that newer protocol choices developed using probability elicitation techniques can unearth improved protocols consistent with multiple unidimensionally-optimized physical protocols. In conclusion, this article proposes the use of DTLA models, and subsequently NBC, for the improvement of modern cryopreservation techniques through an integrative approach.
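The NBC step described above amounts to scoring protocol outcomes from categorical process choices. A minimal sketch of that idea, with Laplace smoothing, is below; the data rows, feature names, and outcomes are invented placeholders, not the paper's meta-data.

```python
from collections import Counter, defaultdict

# Hypothetical meta-data rows: (cooling_rate, additive, survival_outcome)
data = [
    ("slow", "trehalose", "high"),
    ("slow", "none",      "high"),
    ("fast", "none",      "low"),
    ("fast", "trehalose", "low"),
    ("slow", "trehalose", "high"),
]

def train(rows):
    """Count class priors and per-feature conditional counts."""
    prior = Counter(row[-1] for row in rows)
    cond = defaultdict(Counter)          # keyed by (feature_index, class)
    for *x, y in rows:
        for i, v in enumerate(x):
            cond[(i, y)][v] += 1
    return prior, cond

def predict(prior, cond, x):
    """Pick the class with the highest smoothed posterior score."""
    total = sum(prior.values())
    scores = {}
    for y, ny in prior.items():
        p = ny / total
        for i, v in enumerate(x):
            p *= (cond[(i, y)][v] + 1) / (ny + 2)   # Laplace smoothing
        scores[y] = p
    return max(scores, key=scores.get)

prior, cond = train(data)
print(predict(prior, cond, ("slow", "trehalose")))  # high
```

Real use would replace the toy rows with the published sample means and process variables the authors aggregated.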

  9. Protocols for growing plant symbioses; mycorrhiza.

    PubMed

    Schultze, Michael

    2013-01-01

    Arbuscular mycorrhizal symbiosis is receiving increased attention as a potential contributor to sustainable crop plant nutrition. This chapter details a set of protocols for plant growth to study the development and physiology of the arbuscular mycorrhizal symbiosis, and how to establish root organ cultures for the production of axenic inoculum.

  10. Reliable multicasting in the Xpress Transport Protocol

    SciTech Connect

    Atwood, J.W.; Catrina, O.; Fenton, J.; Strayer, W.T.

    1996-12-01

    The Xpress Transport Protocol (XTP) is designed to meet the needs of distributed, real-time, and multimedia systems. This paper describes the genesis of recent improvements to XTP that provide mechanisms for reliable management of multicast groups, and gives details of the mechanisms used.

  11. A Geographical Heuristic Routing Protocol for VANETs.

    PubMed

    Urquiza-Aguiar, Luis; Tripp-Barba, Carolina; Aguilar Igartua, Mónica

    2016-01-01

    Vehicular ad hoc networks (VANETs) leverage the communication system of Intelligent Transportation Systems (ITS). Recently, Delay-Tolerant Network (DTN) routing protocols have increased their popularity among the research community for use in non-safety VANET applications and services like traffic reporting. Vehicular DTN protocols use geographical and local information to make forwarding decisions. However, current proposals only consider the selection of the best candidate based on a local search. In this paper, we propose a generic Geographical Heuristic Routing (GHR) protocol that can be applied to any DTN geographical routing protocol that makes forwarding decisions hop by hop. GHR incorporates adaptations of the simulated annealing and Tabu-search meta-heuristics, which have long been used to improve local-search results in discrete optimization. We include a complete performance evaluation of GHR in a multi-hop VANET simulation scenario for a reporting service. Our study analyzes all of the meaningful configurations of GHR and offers a statistical analysis of our findings by means of MANOVA tests. Our results indicate that the use of a Tabu list contributes to improving the packet delivery ratio by around 5% to 10%. Moreover, if Tabu is used, then the simulated annealing routing strategy outperforms the default operation of selecting the best node with carry-and-forwarding. PMID:27669254
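The Tabu-list idea above can be sketched generically: greedy geographic forwarding that skips recently used neighbours, falling back to carry-and-forward when no non-tabu neighbour makes progress. Node names, coordinates, and the fallback convention below are assumptions for illustration, not GHR's actual rules.

```python
import math

def dist(a, b):
    """Euclidean distance between two (x, y) positions."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def next_hop(current, neighbours, dest, tabu):
    """Greedy geographic choice that skips tabu nodes.

    Returns None (carry-and-forward) when every non-tabu neighbour
    is farther from the destination than the current node.
    """
    best, best_d = None, dist(current, dest)
    for node, pos in neighbours.items():
        if node in tabu:
            continue
        d = dist(pos, dest)
        if d < best_d:
            best, best_d = node, d
    return best

neigh = {"a": (1, 1), "b": (4, 0), "c": (2, 3)}
print(next_hop((0, 0), neigh, (5, 0), tabu=set()))   # b (closest to dest)
print(next_hop((0, 0), neigh, (5, 0), tabu={"b"}))   # a (b is tabu)
```

In GHR proper, the tabu membership and the annealing acceptance rule would be tuned per the paper's evaluated configurations.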

  12. 40 CFR 161.70 - Acceptable protocols.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... ingredient, mixture, or product. Accordingly, failure to follow a suggested protocol will not invalidate a... part. Readers should note, however, that certain of the OECD recommended test standards, such as test duration and selection of test species, are less restrictive than those recommended by EPA. Therefore,...

  13. 21 CFR 312.30 - Protocol amendments.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 5 2010-04-01 2010-04-01 false Protocol amendments. 312.30 Section 312.30 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) DRUGS FOR HUMAN USE INVESTIGATIONAL NEW DRUG APPLICATION Investigational New Drug Application (IND) §...

  14. 21 CFR 312.83 - Treatment protocols.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 5 2010-04-01 2010-04-01 false Treatment protocols. 312.83 Section 312.83 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) DRUGS FOR HUMAN USE INVESTIGATIONAL NEW DRUG APPLICATION Drugs Intended to Treat Life-threatening and...

  15. The Vocational Assessment Protocol: Development and Validation.

    ERIC Educational Resources Information Center

    Thomas, Dale F.; Menz, Fredrick E.

    This report describes a 48-month project which developed, field tested, and evaluated the utility of the Vocational Assessment Protocol (VAP) for use with persons with traumatic brain injury resulting in a severe and persistent disability. The VAP is intended to assist in the community-based vocational rehabilitation of these individuals. The VAP…

  16. A Geographical Heuristic Routing Protocol for VANETs.

    PubMed

    Urquiza-Aguiar, Luis; Tripp-Barba, Carolina; Aguilar Igartua, Mónica

    2016-09-23

    Vehicular ad hoc networks (VANETs) leverage the communication system of Intelligent Transportation Systems (ITS). Recently, Delay-Tolerant Network (DTN) routing protocols have increased their popularity among the research community for use in non-safety VANET applications and services like traffic reporting. Vehicular DTN protocols use geographical and local information to make forwarding decisions. However, current proposals only consider the selection of the best candidate based on a local search. In this paper, we propose a generic Geographical Heuristic Routing (GHR) protocol that can be applied to any DTN geographical routing protocol that makes forwarding decisions hop by hop. GHR incorporates adaptations of the simulated annealing and Tabu-search meta-heuristics, which have long been used to improve local-search results in discrete optimization. We include a complete performance evaluation of GHR in a multi-hop VANET simulation scenario for a reporting service. Our study analyzes all of the meaningful configurations of GHR and offers a statistical analysis of our findings by means of MANOVA tests. Our results indicate that the use of a Tabu list contributes to improving the packet delivery ratio by around 5% to 10%. Moreover, if Tabu is used, then the simulated annealing routing strategy outperforms the default operation of selecting the best node with carry-and-forwarding.

  17. An international computer protocol standard is essential

    SciTech Connect

    Marks, J.

    1994-02-01

    This article examines the need for the development of an international communication protocol to avoid building or buying customized interfaces or gateways in order to connect two separate vendors' devices to the same computer. The article discusses the need for standards and details one electric cooperative's experience in converting its automated mapping and facilities management system to the EPRI-sponsored Utility Communications Architecture.

  18. Measurement Protocols for Optimized Fuel Assembly Tags

    SciTech Connect

    Gerlach, David C.; Mitchell, Mark R.; Reid, Bruce D.; Gesh, Christopher J.; Hurley, David E.

    2008-11-01

    This report describes the measurement protocols for optimized tags that can be applied to standard fuel assemblies used in light water reactors. This report describes work performed by the authors at Pacific Northwest National Laboratory for NA-22 as part of research to identify specific signatures that can be developed to support counter-proliferation technologies.

  19. Direct data access protocols benchmarking on DPM

    NASA Astrophysics Data System (ADS)

    Furano, Fabrizio; Devresse, Adrien; Keeble, Oliver; Mancinelli, Valentina

    2015-12-01

    The Disk Pool Manager is an example of a multi-protocol, multi-VO system for data access on the Grid that went through a considerable technical evolution in recent years. Among other features, its architecture offers the opportunity of testing its different data access frontends under exactly the same conditions, including hardware and backend software. This characteristic inspired the idea of collecting monitoring information from various testbeds in order to benchmark the behaviour of the HTTP and Xrootd protocols for the use case of data analysis, batch or interactive. A source of information is the set of continuous tests that are run towards the worldwide endpoints belonging to the DPM Collaboration, which accumulated relevant statistics in its first year of activity. On top of that, the DPM releases are based on multiple levels of automated testing that include performance benchmarks of various kinds, executed regularly every day. At the same time, the recent releases of DPM can report monitoring information about any data access protocol to the same monitoring infrastructure that is used to monitor the Xrootd deployments. Our goal is to evaluate under which circumstances the HTTP-based protocols can be good enough for batch or interactive data access. In this contribution we show and discuss the results that our test systems have collected under circumstances that include ROOT analyses using TTreeCache and stress tests on metadata performance.

  20. 40 CFR 792.120 - Protocol.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) Justification for selection of the test system. (6) Where applicable, the number, body weight, sex, source of... than established by the specifications. (10) The route of administration and the reason for its choice... changes in or revisions of an approved protocol and the reasons therefor shall be documented, signed...

  1. 40 CFR 160.120 - Protocol.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... selection of the test system. (6) Where applicable, the number, body weight range, sex, source of supply... specifications. (10) The route of administration and the reason for its choice. (11) Each dosage level, expressed... of an approved protocol and the reasons therefore shall be documented, signed by the study...

  2. 40 CFR 160.120 - Protocol.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... selection of the test system. (6) Where applicable, the number, body weight range, sex, source of supply... specifications. (10) The route of administration and the reason for its choice. (11) Each dosage level, expressed... of an approved protocol and the reasons therefore shall be documented, signed by the study...

  3. 40 CFR 160.120 - Protocol.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... selection of the test system. (6) Where applicable, the number, body weight range, sex, source of supply... specifications. (10) The route of administration and the reason for its choice. (11) Each dosage level, expressed... of an approved protocol and the reasons therefore shall be documented, signed by the study...

  4. Teledermatology protocol for screening of Skin Cancer*

    PubMed Central

    Piccoli, Maria Fernanda; Amorim, Bruna Dücker Bastos; Wagner, Harley Miguel; Nunes, Daniel Holthausen

    2015-01-01

    BACKGROUND Telemedicine refers to the use of technology to improve healthcare delivery in places where distance is an obstacle. Its use represents great potential for dermatology, a specialty in which visual analysis is essential to diagnosis. OBJECTIVES To analyze the compatibility index of skin cancer diagnoses between primary care and teledermatology, and to validate a protocol for standardization of digital imaging to obtain reports in teledermatology. METHODS An observational cross-sectional study based on a census of 333 examination requests received between January/2012 and July/2012 in the Center for Telemedicine and Telehealth of SES-SC. We used a protocol for photographic lesion standardization consisting of three steps (panoramic photo, close-up with ruler, and dermoscopy). After collection, the data were sent to a site on the Internet and recorded in an electronic health record containing the images, the skin phototype and demographic characteristics. RESULTS The level of compatibility between the diagnosis of skin cancer in Santa Catarina's primary care and the diagnosis proposed by teledermatology was 19.02%. Proportionally, it was 21.21% for BCC, 44.44% for SCC and 6.98% for MM. The protocol was statistically significant (p<0.05), with an OR of 38.77. CONCLUSION The rate of diagnostic compatibility of skin cancer was low, and the use of the protocol improved the chance of validating requests for examination. PMID:25830990

  5. A Generic Archive Protocol and an Implementation

    NASA Astrophysics Data System (ADS)

    Jordan, J. M.; Jennings, D. G.; McGlynn, T. A.; Ruggiero, N. G.; Serlemitsos, T. A.

    1993-01-01

    Archiving vast amounts of data has become a major part of every scientific space mission today. GRASP, the Generic Retrieval/Archive Services Protocol, addresses the question of how to archive the data collected in an environment where the underlying hardware archives and computer hosts may be rapidly changing.

  6. 16 CFR 1210.4 - Test protocol.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... of children from each 100-child test panel, photograph(s) or video tape to show how the lighter was... STANDARD FOR CIGARETTE LIGHTERS Requirements for Child Resistance § 1210.4 Test protocol. (a) Child test panel. (1) The test to determine if a lighter is resistant to successful operation by children uses...

  7. 16 CFR 1210.4 - Test protocol.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... of children from each 100-child test panel, photograph(s) or video tape to show how the lighter was... STANDARD FOR CIGARETTE LIGHTERS Requirements for Child Resistance § 1210.4 Test protocol. (a) Child test panel. (1) The test to determine if a lighter is resistant to successful operation by children uses...

  8. 16 CFR 1210.4 - Test protocol.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... of children from each 100-child test panel, photograph(s) or video tape to show how the lighter was... STANDARD FOR CIGARETTE LIGHTERS Requirements for Child Resistance § 1210.4 Test protocol. (a) Child test panel. (1) The test to determine if a lighter is resistant to successful operation by children uses...

  9. 16 CFR 1212.4 - Test protocol.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... STANDARD FOR MULTI-PURPOSE LIGHTERS Requirements for Child-Resistance § 1212.4 Test protocol. (a) Child... by children uses a panel of children to test a surrogate multi-purpose lighter representing the... of a child before the child participates in the test. (2) The test shall be conducted using at...

  10. 16 CFR 1212.4 - Test protocol.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... STANDARD FOR MULTI-PURPOSE LIGHTERS Requirements for Child-Resistance § 1212.4 Test protocol. (a) Child... by children uses a panel of children to test a surrogate multi-purpose lighter representing the... of a child before the child participates in the test. (2) The test shall be conducted using at...

  11. 16 CFR 1212.4 - Test protocol.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... STANDARD FOR MULTI-PURPOSE LIGHTERS Requirements for Child-Resistance § 1212.4 Test protocol. (a) Child... by children uses a panel of children to test a surrogate multi-purpose lighter representing the... of a child before the child participates in the test. (2) The test shall be conducted using at...

  12. 16 CFR 1210.4 - Test protocol.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... of children from each 100-child test panel, photograph(s) or video tape to show how the lighter was... STANDARD FOR CIGARETTE LIGHTERS Requirements for Child Resistance § 1210.4 Test protocol. (a) Child test panel. (1) The test to determine if a lighter is resistant to successful operation by children uses...

  13. Montreal Protocol benefits simulated with CCM SOCOL

    NASA Astrophysics Data System (ADS)

    Egorova, T.; Rozanov, E.; Gröbner, J.; Hauser, M.; Schmutz, W.

    2012-07-01

    Ozone depletion is caused by the anthropogenic increase of halogen-containing species in the atmosphere, which results in the enhancement of the concentration of reactive chlorine and bromine in the stratosphere. To reduce the influence of anthropogenic ozone-depleting substances (ODS), the Montreal Protocol was agreed by governments in 1987, with several Amendments adopted later. In order to assess the benefits of the Montreal Protocol and its Amendments (MPA) on ozone and UV radiation, two different runs of the chemistry-climate model (CCM) SOCOL have been carried out. The first run was driven by the emission of ODS prescribed according to the restrictions of the Montreal Protocol and all its Amendments. For the second run we allowed the ODS to grow by 3% annually. We find that the MPA would have saved up to 80% of the global annual total ozone by the end of the 21st century. Our calculations also show substantial changes in surface temperature and precipitation that could occur in a world without MPA implementation. To illustrate the changes in UV radiation at the surface, and to emphasize certain features which appear in particular regions only when changes in cloud cover are accounted for, we calculate the geographical distribution of the erythemally weighted irradiance (Eery). For the no-Montreal-Protocol simulation, Eery increases by a factor of 4 to 16 between the 1970s and 2100. For the scenario including the Montreal Protocol, UV radiation starts to decrease in 2000, with a continuous decline of 5% to 10% at middle latitudes in the Northern and Southern hemispheres.

  14. Protocol for determining bull trout presence

    USGS Publications Warehouse

    Peterson, James; Dunham, Jason B.; Howell, Philip; Thurow, Russell; Bonar, Scott

    2002-01-01

    The Western Division of the American Fisheries Society was requested to develop protocols for determining presence/absence and potential habitat suitability for bull trout. The general approach adopted is similar to the process for the marbled murrelet, whereby interim guidelines are initially used, and the protocols are subsequently refined as data are collected. Current data were considered inadequate to precisely identify suitable habitat but could be useful in stratifying sampling units for presence/absence surveys. The presence/absence protocol builds on previous approaches (Hillman and Platts 1993; Bonar et al. 1997), except it uses the variation in observed bull trout densities instead of a minimum threshold density and adjusts for measured differences in sampling efficiency due to gear types and habitat characteristics. The protocol consists of: (1) recommended sample sizes with 80% and 95% detection probabilities for juvenile and resident adult bull trout for day and night snorkeling and electrofishing, adjusted for varying habitat characteristics for 50 m and 100 m sampling units; (2) sampling design considerations, including possible habitat characteristics for stratification; (3) habitat variables to be measured in the sampling units; and (4) guidelines for training sampling crews. Criteria for habitat strata consist of coarse, watershed-scale characteristics (e.g., mean annual air temperature) and fine-scale, reach- and habitat-specific features (e.g., water temperature, channel width). The protocols will be revised in the future using data from ongoing presence/absence surveys, additional research on sampling efficiencies, and development of models of habitat/species occurrence.
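The 80% and 95% detection-probability targets above follow from simple sample-size arithmetic. A minimal sketch, assuming a constant per-unit detection probability (the protocol itself adjusts this for gear type and habitat, which the sketch does not):

```python
import math

def units_needed(p_detect, confidence):
    """Number of independent sampling units needed so that, if fish are
    present, at least one detection occurs with the given confidence.

    Solves (1 - p_detect)**n <= 1 - confidence for the smallest integer n.
    """
    return math.ceil(math.log(1 - confidence) / math.log(1 - p_detect))

# Illustrative per-unit detectability of 0.4 (not a published value):
print(units_needed(0.4, 0.80))  # 4 units for 80% overall detection
print(units_needed(0.4, 0.95))  # 6 units for 95% overall detection
```

The published tables would replace the illustrative 0.4 with gear- and habitat-specific efficiencies.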

  15. Bayesian adaptive survey protocols for resource management

    USGS Publications Warehouse

    Halstead, Brian J.; Wylie, Glenn D.; Coates, Peter S.; Casazza, Michael L.

    2011-01-01

    Transparency in resource management decisions requires a proper accounting of uncertainty at multiple stages of the decision-making process. As information becomes available, periodic review and updating of resource management protocols reduces uncertainty and improves management decisions. One of the most basic steps to mitigating anthropogenic effects on populations is determining if a population of a species occurs in an area that will be affected by human activity. Species are rarely detected with certainty, however, and falsely declaring a species absent can cause improper conservation decisions or even extirpation of populations. We propose a method to design survey protocols for imperfectly detected species that accounts for multiple sources of uncertainty in the detection process, is capable of quantitatively incorporating expert opinion into the decision-making process, allows periodic updates to the protocol, and permits resource managers to weigh the severity of consequences if the species is falsely declared absent. We developed our method using the giant gartersnake (Thamnophis gigas), a threatened species precinctive to the Central Valley of California, as a case study. Survey date was negatively related to the probability of detecting the giant gartersnake, and water temperature was positively related to the probability of detecting the giant gartersnake at a sampled location. Reporting sampling effort, timing and duration of surveys, and water temperatures would allow resource managers to evaluate the probability that the giant gartersnake occurs at sampled sites where it is not detected. This information would also allow periodic updates and quantitative evaluation of changes to the giant gartersnake survey protocol. Because it naturally allows multiple sources of information and is predicated upon the idea of updating information, Bayesian analysis is well-suited to solving the problem of developing efficient sampling protocols for species of
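The core calculation the abstract describes, the probability a species occupies a site despite not being detected, is a standard Bayesian update. A minimal sketch, with the prior occupancy, per-survey detectability, and survey count all invented for illustration:

```python
def prob_present_given_no_detection(prior, p_detect, n_surveys):
    """Posterior probability the species occupies a site after n surveys
    with no detections.

    Assumes a constant per-survey detection probability; the paper's
    protocol instead lets detectability vary with covariates such as
    survey date and water temperature.
    """
    miss = (1 - p_detect) ** n_surveys          # P(no detections | present)
    return prior * miss / (prior * miss + (1 - prior))

# With a 60% prior and 30% per-survey detectability, five empty surveys
# still leave roughly a 20% chance the snake is present.
print(round(prob_present_given_no_detection(0.6, 0.3, 5), 3))
```

This is the quantity that lets managers weigh the consequences of falsely declaring the species absent.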

  16. The braided single-stage protocol for quantum secure communication

    NASA Astrophysics Data System (ADS)

    Darunkar, Bhagyashri; Verma, Pramode K.

    2014-05-01

    This paper presents the concept and implementation of a Braided Single-stage Protocol for quantum secure communication. The braided single-stage protocol is a multi-photon tolerant secure protocol. This multi-photon tolerant protocol has been implemented in the laboratory using free-space optics technology. The proposed protocol capitalizes on strengths of the three-stage protocol and extends it with a new concept of braiding. This protocol overcomes the limitations associated with the three-stage protocol in the following ways: It uses the transmission channel only once as opposed to three times in the three-stage protocol, and it is invulnerable to man-in-the-middle attack. This paper also presents the error analysis resulting from the misalignment of the devices in the implementation. The experimental results validate the efficient use of transmission resources and improvement in the data transfer rate.
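The three-stage protocol that the braided protocol extends relies on each party applying and later undoing a secret transformation, which works because the transformations commute. A toy classical analogue using 2D rotations is below; real implementations act on quantum polarization states, and the angles and bit encoding here are arbitrary illustrations.

```python
import math

def rotate(vec, theta):
    """Rotate a 2D vector (a stand-in for a polarization state) by theta."""
    c, s = math.cos(theta), math.sin(theta)
    x, y = vec
    return (c * x - s * y, s * x + c * y)

# Secret rotation angles known only to each party.
alice, bob = 0.7, 1.9
photon = (1.0, 0.0)                      # bit 0 encoded as horizontal polarization

stage1 = rotate(photon, alice)           # Alice applies her rotation, sends to Bob
stage2 = rotate(stage1, bob)             # Bob adds his rotation, returns to Alice
stage3 = rotate(stage2, -alice)          # Alice removes hers, sends to Bob again
received = rotate(stage3, -bob)          # Bob removes his and recovers the photon

print(received)  # ~(1.0, 0.0): the original state, never sent unprotected
```

Because rotations commute, the undo operations work in any order; the braided protocol's contribution, per the abstract, is achieving this kind of protection with a single use of the channel.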

  17. Hybrid protocol of remote implementations of quantum operations

    SciTech Connect

    Zhao Ningbo; Wang Anmin

    2007-12-15

    We propose a protocol of remote implementations of quantum operations by hybridizing bidirectional quantum-state teleportation (BQST) [Huelga et al., Phys. Rev. A 63, 042303 (2001)] and the Wang protocol [Wang, Phys. Rev. A 74, 032317 (2006)]. The protocol is available for remote implementations of quantum operations in the restricted sets specified in the paper. We also give a proof of the protocol and point out its optimization. As an extension, this hybrid protocol can be reduced to the BQST and Wang protocols.

  18. Protocol Directed Patient Care using a Computer

    PubMed Central

    Blum, B.; Lenhard, R.; McColligan, E.

    1980-01-01

    The Johns Hopkins Oncology Center has developed a clinical information system which assists in the care of the 2,000 patients currently under treatment at the Center. The system maintains a data base containing a summary diagnostic and treatment history plus complete tabulations of laboratory results, therapies, and other clinical findings. These data are organized and displayed in formats which aid decision-making. For the past year the Center has been working with an extension to the data system which produces daily care plans for each inpatient and outpatient treated at the Center. These plans are a function of the disease, treatment protocol, and current clinical status of each patient. This paper describes the objectives, organization, and experience to date with the use of computer generated plans for protocol directed patient care.

  19. An Evaluation of UDP Transport Protocols

    SciTech Connect

    Carter, S

    2004-12-21

    Although the speed of LAN and WAN networking is growing at an exponential rate, the applications that use those networks have not followed suit. With fiber optic interconnects, gigahertz processor speeds, and 10 gigabit per second network interface cards, hardware does not seem to be the limiting factor. It is becoming increasingly obvious that the protocols that are the basis of networking today are ill-suited to a new generation of networking technology. For this reason, Oak Ridge National Laboratory is particularly interested in improving bulk transfers over high-bandwidth, high-latency networks because of its involvement in storage and in the transfer of data for cutting-edge scientific applications. This report summarizes our evaluation of a new group of protocols specifically designed to get more useful bandwidth from today's high speed, wide area networks.

  20. UV Impacts Avoided by the Montreal Protocol

    NASA Technical Reports Server (NTRS)

    Newman, Paul; McKenzie, Richard

    2010-01-01

    Temporal and geographical variabilities in the future "World Expected" UV environment are compared with the "World Avoided", which would have occurred without the Montreal Protocol on protection of the ozone layer and its subsequent amendments and adjustments. Based on calculations of clear-sky UV irradiances, the effects of the Montreal Protocol have been hugely beneficial to avoid the health risks, such as skin cancer, which are associated with high UV, while there is only a small increase in health risks, such as vitamin D deficiency, that are associated with low UV. However, interactions with climate change may lead to changes in cloud and albedo, and possibly behavioural changes which could also be important.

  1. Da Vinci robot emergency undocking protocol.

    PubMed

    O'Sullivan, O E; O'Sullivan, S; Hewitt, M; O'Reilly, B A

    2016-09-01

    The role of robot-assisted surgery across gynaecology is evolving, with increasing numbers of procedures being undertaken with varying degrees of complexity. While the risk of conversion is low, at approximately 1%, the reasons for conversion are variable. These range from technical issues with the robot, to surgical complications such as haemorrhage, to anaesthetic issues such as an inability to ventilate the patient adequately. While many conversions to an open or laparoscopic approach are not due to life-threatening indications, it is important that the theatre staff are aware of the indication and can perform an emergency undocking as effectively, efficiently and safely as possible when the need arises. Unfortunately, there is a paucity of literature available outlining such protocols. For this reason, we developed an emergency undocking protocol clearly outlining the role of each theatre staff member and the need for clear, concise communication. PMID:27126584

  2. Multipass Steering Protocols at Jefferson Lab

    SciTech Connect

    Ryan Bodenstein; Michael Tiefenback

    2007-06-22

    The CEBAF recirculating accelerator consists of two CW superconducting RF linacs, through which an electron beam is accelerated for up to 5 passes. Focusing and steering elements affect each pass differently, requiring a multipass steering protocol to correct the orbits. Perturbations include lens misalignments (including long-term ground motion), BPM offsets, and focusing and steering from RF fields inside the cavities. A previous treatment of this problem assumed all perturbations were localized at the quadrupoles and the absence of x-y coupling. Having analyzed the problem and characterized the solutions, we developed an empirical iterative protocol to compare against previous results in the presence of skew fields and cross-plane coupling. We plan to characterize static and acceleration-dependent components of the beam line perturbations to allow systematic and rapid configuration of the accelerator at different linac energy gains.

  3. Staining protocol for organotypic hippocampal slice cultures.

    PubMed

    Gogolla, Nadine; Galimberti, Ivan; DePaola, Vincenzo; Caroni, Pico

    2006-01-01

    This protocol details a method to immunostain organotypic slice cultures from mouse hippocampus. The cultures are based on the interface method, which does not require special equipment, is easy to execute and yields slice cultures that can be imaged repeatedly, from the time of isolation at postnatal day 6-9 up to 6 months in vitro. The preserved tissue architecture facilitates the analysis of defined hippocampal synapses, cells and entire projections. Time-lapse imaging is based on transgenes expressed in the mice or on constructs introduced through transfection or viral vectors; it can reveal processes that develop over periods ranging from seconds to months. Subsequent to imaging, the slices can be processed for immunocytochemistry to collect further information about the imaged structures. This protocol can be completed in 3 d.

  4. UV impacts avoided by the Montreal Protocol.

    PubMed

    Newman, Paul A; McKenzie, Richard

    2011-07-01

    Temporal and geographical variabilities in the future "world expected" UV environment are compared with the "world avoided", which would have occurred without the Montreal Protocol on Substances That Deplete the Ozone Layer and its subsequent amendments and adjustments. Based on calculations of clear-sky UV irradiances, the effects of the Montreal Protocol have been hugely beneficial to avoid the health risks, such as skin cancer, which are associated with high UV, while there is only a small increase in health risks, such as vitamin D deficiency, that are associated with low UV. However, interactions with climate change may lead to changes in cloud and albedo, and possibly behavioural changes that could also be important.

  5. Montreal Protocol Benefits simulated with CCM SOCOL

    NASA Astrophysics Data System (ADS)

    Egorova, T.; Rozanov, E.; Gröbner, J.; Hauser, M.; Schmutz, W.

    2013-04-01

    Ozone depletion is caused by the anthropogenic increase of halogen-containing species in the atmosphere, which results in the enhancement of the concentration of reactive chlorine and bromine in the stratosphere. To reduce the influence of anthropogenic ozone-depleting substances (ODS), the Montreal Protocol was agreed by governments in 1987, with several Amendments and Adjustments adopted later. In order to assess the benefits of the Montreal Protocol and its Amendments and Adjustments (MPA) on ozone and UV radiation, two different runs of the chemistry-climate model (CCM) SOCOL have been carried out. The first run was driven by ODS emissions prescribed according to the restrictions of the MPA. For the second run we allowed the ODS to grow by 3% annually. We find that the MPA would have saved up to 80% of the global annual total ozone by the end of the 21st century. Our calculations also show substantial changes in the stratospheric circulation pattern as well as in surface temperature and precipitation that could occur in a world without MPA implementation. To illustrate the changes in UV radiation at the surface, and to emphasise certain features that can only be seen for particular regions if the influence of cloud cover changes is accounted for, we calculate the geographical distribution of the erythemally weighted irradiance (Eery). For the no-Montreal-Protocol simulation, Eery increases by a factor of 4 to 16 between the 1970s and 2100. For the scenario including the Montreal Protocol, UV radiation starts to decrease in 2000, with a continuous decline of 5% to 10% at middle latitudes in both the Northern and Southern Hemispheres.
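    The no-MPA scenario's assumption of 3% annual ODS growth compounds quickly; a back-of-envelope sketch (only the rate and time span are taken from the abstract, the rest is arithmetic):

```python
# Back-of-envelope compounding for the "no Montreal Protocol" scenario:
# ODS emissions growing 3% per year, compounded annually.
def ods_growth_factor(years, rate=0.03):
    """Multiplicative growth of emissions after the given number of years."""
    return (1 + rate) ** years

factor = ods_growth_factor(2100 - 1987)   # unconstrained growth, 1987-2100
print(round(factor, 1))
```

Unconstrained 3% growth over 1987-2100 corresponds to roughly a 28-fold increase in emissions, which is why the modeled no-MPA ozone losses and Eery increases are so large.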

  6. Mars Sample Handling Protocol Workshop Series

    NASA Technical Reports Server (NTRS)

    Rummel, John D. (Editor); Race, Margaret S. (Editor); Acevedo, Sara (Technical Monitor)

    2000-01-01

    This document is the report resulting from the first workshop of the series on development of the criteria for a Mars sample handling protocol. Workshop 1 was held in Bethesda, Maryland on March 20-22, 2000. This report serves to document the proceedings of Workshop 1; it summarizes relevant background information, provides an overview of the deliberations to date, and helps frame issues that will need further attention or resolution in upcoming workshops. Specific recommendations are not part of this report.

  7. D-RATS 2011: RAFT Protocol Overview

    NASA Technical Reports Server (NTRS)

    Utz, Hans

    2011-01-01

    A brief overview presentation of the protocol used during the D-RATS 2011 field test for file transfer from the field-test robots at Black Point Lava Flow, AZ, to Johnson Space Center, Houston, TX, over a simulated time delay. The file transfer uses a commercial implementation of an open communications standard. The focus of the work lies in how to make the state of the distributed system observable.

  8. Protocol for mosquito rearing (A. gambiae).

    PubMed

    Das, Suchismita; Garver, Lindsey; Dimopoulos, George

    2007-01-01

    This protocol describes mosquito rearing in the insectary. The insectary rooms are maintained at 28 degrees C and approximately 80% humidity, with a 12 hr day/night cycle. For this procedure, you'll need mosquito cages, 10% sterile sucrose solution, paper towels, a beaker, Whatman filter paper, glass feeders, human blood and serum, a water bath, Parafilm, distilled water, clean plastic trays, mosquito food (described below), mosquito netting to cover the trays, a vacuum, and a collection chamber to collect adults. PMID:18979019

  9. Building America House Simulation Protocols (Revised)

    SciTech Connect

    Hendron, R.; Engebrecht, C.

    2010-10-01

    The House Simulation Protocol document was developed to track and manage progress toward Building America's multi-year, average whole-building energy reduction research goals for new construction and existing homes, using a consistent analytical reference point. This report summarizes the guidelines for developing and reporting these analytical results in a consistent and meaningful manner for all home energy uses using standard operating conditions.

  10. Avian study protocols and wind energy development

    SciTech Connect

    Fisher, K.

    1995-12-01

    This paper identifies the need to develop and use standardized avian study protocols to determine avian impacts at new and existing wind energy facilities. This will allow data collected from various sites to be correlated for a better understanding of wind-energy-related avian impacts. Factors contributing to increased interest in wind energy facilities by electric utilities include: (1) increased demand for electricity; (2) increased constraints on traditional electrical generating facilities (i.e., hydroelectric and nuclear power plants); and (3) improved wind turbine technology. During the 1980s, generous tax credits spawned the development of wind energy facilities, known as wind farms, in California. Commercial-scale wind farm proposals are being actively considered in states across the country - Washington, Oregon, Wyoming, Wisconsin, Texas, and Vermont, to name a few. From the wind farms in California, the unexpected issue of avian impacts, especially to birds of prey, or raptors, surfaced and continues to plague the wind industry. However, most of the avian studies did not follow a standardized protocol or methodology and, therefore, data are unavailable to analyze and compare impacts at different sites or with differing technologies and configurations. Effective mitigation cannot be designed and applied until these differences are understood. The Bonneville Power Administration is using comparable avian study protocols to collect data for two environmental impact statements being prepared for two separate wind farm proposals. A similar protocol will be required for any other avian impact analysis performed by the agency on proposed or existing wind farms. The knowledge gained from these studies should contribute to a better understanding of avian interactions with wind energy facilities and the identification of effective mitigation measures.

  11. CREATION OF THE MODEL ADDITIONAL PROTOCOL

    SciTech Connect

    Houck, F.; Rosenthal, M.; Wulf, N.

    2010-05-25

    In 1991, the international nuclear nonproliferation community was dismayed to discover that the implementation of safeguards by the International Atomic Energy Agency (IAEA) under its NPT INFCIRC/153 safeguards agreement with Iraq had failed to detect Iraq's nuclear weapon program. It was now clear that ensuring that states were fulfilling their obligations under the NPT would require not just detecting diversion but also the ability to detect undeclared materials and activities. To achieve this, the IAEA initiated what would turn out to be a five-year effort to reappraise the NPT safeguards system. The effort engaged the IAEA and its Member States and led to agreement in 1997 on a new safeguards agreement, the Model Protocol Additional to the Agreement(s) between States and the International Atomic Energy Agency for the Application of Safeguards. The Model Protocol makes explicit that one IAEA goal is to provide assurance of the absence of undeclared nuclear material and activities. The Model Protocol requires an expanded declaration that identifies a State's nuclear potential, empowers the IAEA to raise questions about the correctness and completeness of the State's declaration, and, if needed, allows IAEA access to locations. The information required and the locations available for access are much broader than those provided for under INFCIRC/153. The negotiation was completed in quite a short time because it started with a relatively complete draft of an agreement prepared by the IAEA Secretariat. This paper describes how the Model Protocol was constructed and reviews key decisions that were made both during the five-year period and in the actual negotiation.

  12. Protocols for Authorized Release of Concrete

    SciTech Connect

    Smith, Agatha Marie; Meservey, Richard Harlan; Chen, S.Y.; Powell, James Edward; Parker, F.

    2000-06-01

    Much of the clean or slightly contaminated concrete from Decontamination and Decommissioning (D&D) activities could be re-used. Currently, there is no standardized approach, or protocol, for managing the disposition of such materials. Namely, all potential disposition options for concrete, including authorized release for re-use, are generally not fully evaluated in D&D projects, so large quantities have been unduly disposed of as low-level radioactive waste. As a result, costs of D&D have become prohibitively high, hindering expedient cleanup of surplus facilities. The ability to evaluate and implement the option of authorized release of concrete from demolition would result in significant cost savings, while maintaining protection of environmental health and safety, across the Department of Energy (DOE) complex. The Idaho National Engineering and Environmental Laboratory (INEEL), Argonne National Laboratory East (ANL-E), and Vanderbilt University have teamed to develop a protocol for the authorized release of concrete, based on the existing DOE guidance of Order 5400.5, that applies across the DOE complex. The protocol will provide a streamlined method for assessing risks and costs, and reaching optimal disposal options, including re-use of the concrete within the DOE system.

  13. Analysis of Security Protocols for Mobile Healthcare.

    PubMed

    Wazid, Mohammad; Zeadally, Sherali; Das, Ashok Kumar; Odelu, Vanga

    2016-11-01

    Mobile Healthcare (mHealth) continues to improve because of significant advances in, and the decreasing costs of, Information and Communication Technologies (ICTs). mHealth is a medical and public health practice supported by mobile devices (for example, smartphones) and patient monitoring devices (for example, various types of wearable sensors). An mHealth system enables healthcare experts and professionals to have ubiquitous access to a patient's health data, along with providing any ongoing medical treatment at any time, any place, and from any device. It also helps patients requiring continuous medical monitoring to stay in touch with the appropriate medical staff and healthcare experts remotely. Thus, mHealth has become a major driving force in improving the health of citizens today. First, we discuss the security requirements, issues and threats to the mHealth system. We then present a taxonomy of recently proposed security protocols for the mHealth system based on features supported, possible attacks, computation cost and communication cost. Our detailed taxonomy demonstrates the strengths and weaknesses of recently proposed security protocols for the mHealth system. Finally, we identify some of the challenges in the area of security protocols for mHealth systems that still need to be addressed in the future to enable cost-effective, secure and robust mHealth systems. PMID:27640159

  15. 21 CFR 814.19 - Product development protocol (PDP).

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...) MEDICAL DEVICES PREMARKET APPROVAL OF MEDICAL DEVICES General § 814.19 Product development protocol (PDP). A class III device for which a product development protocol has been declared completed by FDA...

  16. 21 CFR 814.19 - Product development protocol (PDP).

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...) MEDICAL DEVICES PREMARKET APPROVAL OF MEDICAL DEVICES General § 814.19 Product development protocol (PDP). A class III device for which a product development protocol has been declared completed by FDA...

  17. 21 CFR 814.19 - Product development protocol (PDP).

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...) MEDICAL DEVICES PREMARKET APPROVAL OF MEDICAL DEVICES General § 814.19 Product development protocol (PDP). A class III device for which a product development protocol has been declared completed by FDA...

  18. SPIRIT 2013 Statement: defining standard protocol items for clinical trials.

    PubMed

    Chan, An-Wen; Tetzlaff, Jennifer M; Altman, Douglas G; Laupacis, Andreas; Gøtzsche, Peter C; Krleža-Jerić, Karmela; Hróbjartsson, Asbjørn; Mann, Howard; Dickersin, Kay; Berlin, Jesse A; Doré, Caroline J; Parulekar, Wendy R; Summerskill, William S M; Groves, Trish; Schulz, Kenneth F; Sox, Harold C; Rockhold, Frank W; Rennie, Drummond; Moher, David

    2015-12-01

    The protocol of a clinical trial serves as the foundation for study planning, conduct, reporting, and appraisal. However, trial protocols and existing protocol guidelines vary greatly in content and quality. This article describes the systematic development and scope of SPIRIT (Standard Protocol Items: Recommendations for Interventional Trials) 2013, a guideline for the minimum content of a clinical trial protocol. The 33-item SPIRIT checklist applies to protocols for all clinical trials and focuses on content rather than format. The checklist recommends a full description of what is planned; it does not prescribe how to design or conduct a trial. By providing guidance for key content, the SPIRIT recommendations aim to facilitate the drafting of high-quality protocols. Adherence to SPIRIT would also enhance the transparency and completeness of trial protocols for the benefit of investigators, trial participants, patients, sponsors, funders, research ethics committees or institutional review boards, peer reviewers, journals, trial registries, policymakers, regulators, and other key stakeholders. PMID:27440100

  19. METHODS AND ANALYSES FOR IMPLEMENTING NATURAL ATTENUATION PROTOCOLS

    EPA Science Inventory

    Technical protocols for evaluating natural attenuation at petroleum hydrocarbon and chlorinated solvent contaminated sites specify the analysis of electron acceptors and metabolic by-products for identifying and quantifying natural attenuation processes. However, these protocols ...

  20. 21 CFR 814.19 - Product development protocol (PDP).

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...) MEDICAL DEVICES PREMARKET APPROVAL OF MEDICAL DEVICES General § 814.19 Product development protocol (PDP). A class III device for which a product development protocol has been declared completed by FDA...

  1. A Model Based Security Testing Method for Protocol Implementation

    PubMed Central

    Fu, Yu Long; Xin, Xiao Long

    2014-01-01

    The security of protocol implementations is important and hard to verify. Since penetration testing is usually based on the experience of the security tester and the specific protocol specifications, a formal and automatic verification method is always required. In this paper, we propose an extended model of IOLTS to describe the legal roles and intruders of security protocol implementations, and then combine them to generate suitable test cases to verify the security of a protocol implementation. PMID:25105163
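    The idea of deriving security test cases from a formal protocol model can be sketched with a toy labeled transition system (the states, messages and depth bound below are invented; the paper's IOLTS formalism is richer):

```python
# Toy model-based security testing: the protocol is a labeled transition
# system, test cases are input sequences, and any sequence the model rejects
# becomes a negative (attack) test the implementation must also refuse.
from itertools import product

LEGAL = {                      # protocol model: state -> {input: next state}
    "idle":      {"hello": "wait_auth"},
    "wait_auth": {"auth": "open"},
    "open":      {"data": "open", "bye": "idle"},
}

def model_accepts(seq):
    """Follow the transitions; reject as soon as an input is not allowed."""
    state = "idle"
    for msg in seq:
        state = LEGAL.get(state, {}).get(msg)
        if state is None:
            return False
    return True

# exhaustively enumerate all input sequences up to length 3
alphabet = ("hello", "auth", "data", "bye")
tests = [seq for n in range(1, 4) for seq in product(alphabet, repeat=n)]
attacks = [t for t in tests if not model_accepts(t)]

print(model_accepts(("hello", "auth", "data")))  # a legal run
print(("data",) in attacks)                      # data before auth is illegal
```

Running the rejected sequences against a real implementation and checking that each is refused is the essence of the conformance side of such testing.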

  2. Study on the conversion and test of protocols

    NASA Astrophysics Data System (ADS)

    Choi, Y.

    1984-06-01

    The conversion of protocols not conforming to the Open Systems Interconnection (OSI) architecture into OSI-compatible form, and the testing of protocol conversion equipment, are studied. Specification and validation of protocols are reviewed. The test of the X.21 procedure and the test of the signal management equipment, parts of the Telecom project, are analyzed. A specification method based on extended finite state machines is developed. A test system for protocol conversion equipment is described.

  3. 48 CFR 3439.701 - Internet Protocol version 6.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 7 2014-10-01 2014-10-01 false Internet Protocol version... Requirements for Acquisition of Information Technology 3439.701 Internet Protocol version 6. The contracting officer must insert the clause at 3452.239-70 (Internet protocol version 6 (IPv6)) in all...

  4. 48 CFR 3439.701 - Internet Protocol version 6.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 7 2013-10-01 2012-10-01 true Internet Protocol version 6... for Acquisition of Information Technology 3439.701 Internet Protocol version 6. The contracting officer must insert the clause at 3452.239-70 (Internet protocol version 6 (IPv6)) in all...

  5. 48 CFR 3439.701 - Internet Protocol version 6.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 7 2011-10-01 2011-10-01 false Internet Protocol version... Requirements for Acquisition of Information Technology 3439.701 Internet Protocol version 6. The contracting officer must insert the clause at 3452.239-70 (Internet protocol version 6 (IPv6)) in all...

  6. 48 CFR 3439.701 - Internet Protocol version 6.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 7 2012-10-01 2012-10-01 false Internet Protocol version... Requirements for Acquisition of Information Technology 3439.701 Internet Protocol version 6. The contracting officer must insert the clause at 3452.239-70 (Internet protocol version 6 (IPv6)) in all...

  7. 21 CFR 814.19 - Product development protocol (PDP).

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Product development protocol (PDP). 814.19 Section...) MEDICAL DEVICES PREMARKET APPROVAL OF MEDICAL DEVICES General § 814.19 Product development protocol (PDP). A class III device for which a product development protocol has been declared completed by FDA...

  8. Tendencies in the development of utilization protocols, seminar report

    NASA Astrophysics Data System (ADS)

    Roveri, A.

    1983-07-01

    The Open Systems Interconnection (OSI) model's structured approach to specifying protocols is discussed. A stratification into seven levels characterizes the architecture of the system. The transfer-level protocol, the characterization of utilization protocols within the OSI model, data flow control, and the presentation and application levels are described.

  9. The Interlibrary Loan Protocol: An OSI Solution to ILL Messaging.

    ERIC Educational Resources Information Center

    Turner, Fay

    1990-01-01

    Discusses the interlibrary loan (ILL) protocol, a standard based on the principles of the Open Systems Interconnection (OSI) Reference Model. Benefits derived from protocol use are described, the status of the protocol as an international standard is reviewed, and steps taken by the National Library of Canada to facilitate migration to an ILL…

  10. A Biometric Authenticated Key Agreement Protocol for Secure Token

    NASA Astrophysics Data System (ADS)

    Yoon, Eun-Jun; Yoo, Kee-Young

    This letter proposes a robust biometric authenticated key agreement (BAKA) protocol for a secure token to provide strong security and minimize the computation cost of each participant. Compared with other related protocols, the proposed BAKA protocol not only is secure against well-known cryptographical attacks but also provides various functionality and performance requirements.

  11. DEVELOPMENT OF MODELING PROTOCOLS FOR USE IN DETERMINING SEDIMENT TMDLS

    EPA Science Inventory

    Modeling protocols for use in determining sediment TMDLs are being developed to provide the Office of Water, Regions and the States with assistance in determining TMDLs for sediment impaired water bodies. These protocols will supplement the protocols developed by the Office of W...

  12. Spacelab system analysis: The modified free access protocol: An access protocol for communication systems with periodic and Poisson traffic

    NASA Technical Reports Server (NTRS)

    Ingels, Frank; Owens, John; Daniel, Steven

    1989-01-01

    The protocol definition and terminal hardware for the modified free access (MFA) protocol, a communications protocol similar to Ethernet, are developed. An MFA protocol simulator and a CSMA/CD math model are also developed. The protocol is tailored to communication systems where the total traffic may be divided into scheduled traffic and Poisson traffic. The scheduled traffic should occur on a periodic basis but may occur after a given event, such as a request for data from a large number of stations. The Poisson traffic includes alarms and other random traffic. The purpose of the protocol is to guarantee that scheduled packets will be delivered without collision. This is required in many control and data collection systems. The protocol uses standard Ethernet hardware and software, requiring minimal modifications to an existing system. The modification to the protocol only affects the Ethernet transmission privileges and does not affect the Ethernet receiver.
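    A minimal sketch of the reservation idea (not the MFA specification itself; slot counts and rates are invented, and random arrivals in reserved slots are simply ignored here rather than deferred as a real system would):

```python
# Toy slotted-access simulation: slots reserved for scheduled packets can
# never collide, while Poisson traffic contends only in unreserved slots.
import numpy as np

def simulate_cycle(n_slots, reserved_slots, arrival_rate, rng):
    """Count scheduled deliveries and collisions among random traffic."""
    scheduled_delivered = 0
    random_collisions = 0
    for slot in range(n_slots):
        arrivals = rng.poisson(arrival_rate)  # random traffic offered now
        if slot in reserved_slots:
            scheduled_delivered += 1          # reserved slot: no contention
        elif arrivals > 1:                    # two or more senders collide
            random_collisions += 1
    return scheduled_delivered, random_collisions

rng = np.random.default_rng(1)
delivered, collisions = simulate_cycle(100, {0, 10, 20}, 0.5, rng)
print(delivered)  # all 3 scheduled packets arrive collision-free
```

The guarantee is structural: no matter how heavy the Poisson load, the scheduled count always equals the number of reserved slots.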

  13. BLUE-protocol and FALLS-protocol: two applications of lung ultrasound in the critically ill.

    PubMed

    Lichtenstein, Daniel A

    2015-06-01

    This review article describes two protocols adapted from lung ultrasound: the bedside lung ultrasound in emergency (BLUE)-protocol for the immediate diagnosis of acute respiratory failure and the fluid administration limited by lung sonography (FALLS)-protocol for the management of acute circulatory failure. These applications require the mastery of 10 signs indicating normal lung surface (bat sign, lung sliding, A-lines), pleural effusions (quad and sinusoid sign), lung consolidations (fractal and tissue-like sign), interstitial syndrome (lung rockets), and pneumothorax (stratosphere sign and the lung point). These signs have been assessed in adults, with diagnostic accuracies ranging from 90% to 100%, allowing consideration of ultrasound as a reasonable bedside gold standard. In the BLUE-protocol, profiles have been designed for the main diseases (pneumonia, congestive heart failure, COPD, asthma, pulmonary embolism, pneumothorax), with an accuracy > 90%. In the FALLS-protocol, the change from A-lines to lung rockets appears at a threshold of 18 mm Hg of pulmonary artery occlusion pressure, providing a direct biomarker of clinical volemia. The FALLS-protocol sequentially rules out obstructive, then cardiogenic, then hypovolemic shock for expediting the diagnosis of distributive (usually septic) shock. These applications can be done using simple grayscale machines and one microconvex probe suitable for the whole body. Lung ultrasound is a multifaceted tool also useful for decreasing radiation doses (of interest in neonates where the lung signatures are similar to those in adults), from ARDS to trauma management, and from ICUs to points of care. If done in suitable centers, training is the least of the limitations for making use of this kind of visual medicine. PMID:26033127
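    The FALLS-protocol's sequential-exclusion logic summarized above can be sketched schematically (an illustration of the decision order only, not clinical guidance; the boolean findings are simplified placeholders):

```python
# Schematic of the FALLS-protocol decision order: shock categories are
# excluded in sequence, leaving distributive shock as the diagnosis of
# exclusion. Illustration only -- not a clinical tool.
def falls_protocol(pericardial_tamponade, pneumothorax, lung_rockets,
                   improves_with_fluids):
    """Return the shock category left standing after sequential exclusion."""
    if pericardial_tamponade or pneumothorax:
        return "obstructive"
    if lung_rockets:          # B-lines appearing: interstitial syndrome
        return "cardiogenic"
    if improves_with_fluids:
        return "hypovolemic"
    return "distributive"     # usually septic; diagnosis of exclusion

print(falls_protocol(False, False, False, False))  # prints "distributive"
```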

  14. Bell Inequalities, Experimental Protocols and Contextuality

    NASA Astrophysics Data System (ADS)

    Kupczynski, Marian

    2015-07-01

    In this paper we give additional arguments in favor of the point of view that the violation of the Bell, CHSH and CH inequalities is not due to a mysterious nonlocality of nature. We concentrate on an intimate relation between the protocol of a random experiment and the probabilistic model used to describe it. We discuss in a simple way the differences between attributive joint probability distributions and generalized joint probability distributions of outcomes from distant experiments, which depend on how the pairing of these outcomes is defined. We analyze in detail the experimental protocols implied by local realistic and stochastic hidden variable models and show that they are incompatible with the protocols used in spin polarization correlation experiments. We also discuss the meaning of "free will", the differences between quantum and classical filters, the contextuality of Kolmogorov models and the contextuality of quantum theory (QT), and show how this contextuality has to be taken into account in probabilistic models trying to explain the predictions of QT in an intuitive way. The long-range imperfect correlations between the clicks of distant detectors can be explained by partially preserved correlations between the signals created by a source. These correlations can only be preserved if the clicks are produced in a local and deterministic way, depending on intrinsic parameters describing the signals and measuring devices at the moment of measurement. If an act of measurement were irreducibly random, they would be destroyed. This seems to indicate that QT may in fact be emerging from some underlying, more detailed theory of physical phenomena. If this were the case, then there is a chance to find in the time series of experimental data some fine structures not predicted by QT. This would be a major discovery, because it would not only prove that QT does not provide a complete description of individual physical systems but also that it is not predictably complete.
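    The CHSH bound for local deterministic outcomes, which the discussion above turns on, can be checked by exhaustive enumeration (a standard textbook verification, not the paper's formalism): every assignment of fixed ±1 outcomes satisfies |S| ≤ 2, and mixtures over hidden variables, being convex combinations, are bounded by 2 as well, whereas quantum theory reaches 2√2.

```python
# Enumerate every local deterministic strategy: Alice's outcomes for her two
# settings (Aa, Aa2) and Bob's for his (Bb, Bb2) are each fixed to +/-1 by
# the shared hidden variable. The CHSH combination never exceeds 2.
from itertools import product

best = max(
    abs(Aa * Bb + Aa * Bb2 + Aa2 * Bb - Aa2 * Bb2)
    for Aa, Aa2, Bb, Bb2 in product((-1, 1), repeat=4)
)
print(best)  # 2 -- the local deterministic bound |S| <= 2
```

Since Bb + Bb2 and Bb - Bb2 cannot both be nonzero, one term always vanishes and the other is at most 2 in magnitude, which is the whole proof in one line.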

  15. Towards a formalism for conversation protocols using joint intention theory

    SciTech Connect

    Kumar, Sanjeev; Huber, Marcus J.; Cohen, Philip R.; McGee, David R.

    2001-12-01

    Conversation protocols are meant to achieve certain tasks or to bring about a certain state of affairs in the world. Therefore, one may identify the landmarks, or states of affairs, that must be brought about during the execution of a protocol in order to achieve its goal. Accordingly, the most important aspect of a protocol is these landmarks rather than the communicative actions needed to achieve them. We show that families of conversation protocols can be expressed formally as partially ordered landmarks, where each landmark is characterized by propositions that are true in the state it represents. Dialogue in natural language is regarded as joint activity. Conversation protocols in multi-agent systems are treated as dialogue templates and are composed using speech acts from natural-language dialogues. As such, we treat conversation protocols as joint action expressions and gainfully apply existing formal theories of dialogue, specifically the Joint Intention Theory, to protocols and their compositions. Conversation protocols may require agents to communicate with groups as well as individuals. However, most contemporary agent communication languages, notably FIPA and KQML, have either no provision or no well-defined semantics for group communication. Furthermore, the research on protocols so far does not correctly incorporate groups into the protocols. We give a formal semantics to group communicative acts and use it to handle group communication in a formal treatment of protocols.
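    The landmark view can be sketched as a partial-order check over execution traces (the landmark names and prerequisites below are invented for illustration; the paper's semantics is far richer):

```python
# Toy "protocols as partially ordered landmarks": an execution trace is
# valid if every landmark's prerequisite landmarks were reached before it,
# regardless of which communicative acts brought each landmark about.
PREREQS = {                 # landmark -> landmarks that must precede it
    "offer_made":      set(),
    "offer_accepted":  {"offer_made"},
    "goods_delivered": {"offer_accepted"},
    "payment_done":    {"offer_accepted"},
}

def valid_execution(trace):
    """Check that each landmark appears only after all its prerequisites."""
    seen = set()
    for landmark in trace:
        if not PREREQS[landmark] <= seen:
            return False
        seen.add(landmark)
    return True

print(valid_execution(["offer_made", "offer_accepted", "payment_done"]))
print(valid_execution(["offer_accepted", "offer_made"]))  # out of order
```

Note that "goods_delivered" and "payment_done" are unordered relative to each other, which is exactly what a partial order (rather than a fixed message sequence) buys.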

  16. SPP: A data base processor data communications protocol

    NASA Technical Reports Server (NTRS)

    Fishwick, P. A.

    1983-01-01

    The design and implementation of a data communications protocol for the Intel Data Base Processor (DBP) is defined. The protocol is termed SPP (Service Port Protocol) since it enables data transfer between the host computer and the DBP service port. The protocol implementation is extensible in that it is explicitly layered and the protocol functionality is hierarchically organized. Extensive trace and performance capabilities have been supplied with the protocol software to permit optional efficient monitoring of the data transfer between the host and the Intel data base processor. Machine independence was considered to be an important attribute during the design and implementation of SPP. The protocol source is fully commented and is included in Appendix A of this report.
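    A toy illustration of a layered, checksummed framing scheme in the spirit of SPP (the field layout, names and sizes are invented; the actual SPP format is defined in the report's Appendix A):

```python
# Toy layered framing: a link layer wraps a typed payload as
# [2-byte length][1-byte type][payload][1-byte checksum], and the receive
# side verifies the checksum before handing the payload up a layer.
import struct

def frame(msg_type: int, payload: bytes) -> bytes:
    """Wrap a payload with a length/type header and a trailing checksum."""
    body = struct.pack(">HB", len(payload), msg_type) + payload
    return body + bytes([sum(body) % 256])

def unframe(data: bytes):
    """Verify the checksum, then strip the header and return (type, payload)."""
    if sum(data[:-1]) % 256 != data[-1]:
        raise ValueError("checksum mismatch")
    length, msg_type = struct.unpack(">HB", data[:3])
    return msg_type, data[3:3 + length]

msg_type, payload = unframe(frame(7, b"RETRIEVE"))
print(msg_type, payload)
```

Keeping framing, integrity checking, and payload interpretation in separate functions mirrors the explicit layering the abstract describes.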

  17. Formal Analysis of Two Buyer-Seller Watermarking Protocols

    NASA Astrophysics Data System (ADS)

    Williams, David M.; Treharne, Helen; Ho, Anthony T. S.; Waller, Adrian

    In this paper we demonstrate how the formal model constructed in our previous work [1] can be modified in order to analyse additional Buyer-Seller Watermarking Protocols, identifying which specific sections of the CSP scripts remain identical and which require modification. First, we model the protocol proposed by Memon and Wong [2], an exemplar of the Offline Watermarking Authority (OFWA) Model defined in the framework by Poh and Martin [3]. Second, we model the Shao protocol [4] as an example of a protocol fitting the Online Watermarking Authority (ONWA) Model. Our analysis of the protocols reaffirms the unbinding attack described by Lei et al. [5] on the Memon and Wong protocol, and we identify a new unbinding attack on the protocol proposed by Shao.

  18. Rosetta Ligand docking with flexible XML protocols.

    PubMed

    Lemmon, Gordon; Meiler, Jens

    2012-01-01

    RosettaLigand is premier software for predicting how a protein and a small molecule interact. Benchmark studies demonstrate that 70% of the top-scoring RosettaLigand predicted interfaces are within 2 Å RMSD of the crystal structure [1]. The latest release of the RosettaLigand software includes many new features, such as (1) docking of multiple ligands simultaneously, (2) representing ligands as fragments for greater flexibility, (3) redesign of the interface during docking, and (4) an XML-script-based interface that gives the user full control of the ligand docking protocol. PMID:22183535

  19. 2014 Building America House Simulation Protocols

    SciTech Connect

    Wilson, E.; Engebrecht, C. Metzger; Horowitz, S.; Hendron, R.

    2014-03-01

    As Building America has grown to include a large and diverse cross-section of the home building and retrofit industries, it has become more important to develop accurate, consistent analysis techniques to measure progress towards the program's goals. The House Simulation Protocol (HSP) document provides guidance to program partners and managers so they can compare energy savings for new construction and retrofit projects. The HSP provides the program with analysis methods that are proven to be effective and reliable in investigating the energy use of advanced energy systems and of entire houses.

  1. Satellite-Friendly Protocols and Standards

    NASA Astrophysics Data System (ADS)

    Koudelka, O.; Schmidt, M.; Ebert, J.; Schlemmer, H.; Kastner, S.; Riedler, W.

    2002-01-01

    We are currently observing a development unprecedented among other services: the enormous growth of the Internet. Video, voice and data applications can be supported via this network in high quality. Multi-media applications require high bandwidth, which may not be available in many areas. When making proper use of the broadcast feature of a communications satellite, the performance of a satellite-based system can compare favourably to terrestrial solutions. Internet applications are in many cases highly asymmetric, making them very well suited to systems using small and inexpensive terminals. Data from one source may be used simultaneously by a large number of users. The Internet protocol suite has become the de-facto standard. But this protocol family in its original form was not designed to support guaranteed quality of service, a prerequisite for real-time, high-quality traffic. The Internet Protocol has to be adapted for the satellite environment, because long round-trip delays and the error behaviour of the channel could make it inefficient over a GEO satellite. Another requirement is to utilise the satellite bandwidth as efficiently as possible. This can be achieved by adapting the access system to the nature of IP frames, which are variable in length. In the framework of ESA's ARTES project, a novel satellite multimedia system was developed which utilises Multi-Frequency TDMA in a meshed network topology. The system supports Quality of Service (QoS) by reserving capacity for traffic with different QoS requirements. The system is centrally controlled by a master station implementing a demand assignment (DAMA) system. A lean internal signalling system has been adopted. Network management is based on the SNMP protocol and industry-standard network management platforms, making interfaces to standard accounting and billing systems easy. Modern communication systems will have to be compliant with different standards in a very flexible manner.

  2. Quantum oblivious set-member decision protocol

    NASA Astrophysics Data System (ADS)

    Shi, Run-hua; Mu, Yi; Zhong, Hong; Zhang, Shun

    2015-08-01

    We present and define a privacy-preserving problem called the oblivious set-member decision problem, which allows a server to decide whether a private secret of a user is a member of the server's private set in an oblivious manner: if the secret belongs to the set, the server does not learn which member it is. We propose a quantum solution to the oblivious set-member decision problem. Compared to classical solutions, the proposed quantum protocol achieves an exponential reduction in communication complexity, since it needs only O(1) communication cost.

  3. Impact of Communication Protocol on Performance

    SciTech Connect

    Worley, P.H.

    1999-02-01

    We use the PSTSWM compact application benchmark code to characterize the performance behavior of interprocessor communication on the SGI/Cray Research Origin 2000 and T3E-900. We measure 1. single processor performance, 2. point-to-point communication performance, 3. performance variation as a function of communication protocols and transport layer for collective communication routines, and 4. performance sensitivity of full application code to choice of parallel implementation. We also compare and contrast these results with similar results for the previous generation of parallel platforms, evaluating how the relative importance of communication performance has changed.

  4. Toward a Standard Protocol for Micelle Simulation.

    PubMed

    Johnston, Michael A; Swope, William C; Jordan, Kirk E; Warren, Patrick B; Noro, Massimo G; Bray, David J; Anderson, Richard L

    2016-07-01

    In this paper, we present protocols for simulating micelles using dissipative particle dynamics (and in principle molecular dynamics) that we expect to be appropriate for computing micelle properties for a wide range of surfactant molecules. The protocols address challenges in equilibrating and sampling, specifically when kinetics can be very different with changes in surfactant concentration, and with minor changes in molecular size and structure, even using the same force field parameters. We demonstrate that detection of equilibrium can be automated and is robust, for the molecules in this study and others we have considered. In order to quantify the degree of sampling obtained during simulations, metrics to assess the degree of molecular exchange among micellar material are presented, and the use of correlation times is prescribed to assess sampling and to estimate statistical uncertainty in the relevant simulation observables. We show that the computational challenges facing the measurement of the critical micelle concentration (CMC) are somewhat different for high- and low-CMC materials. While a specific choice is not recommended here, we demonstrate that various methods give values that are consistent in terms of trends, even if not numerically equivalent. PMID:27096611
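
    The correlation-time prescription above can be sketched generically. This uses a common estimator of the integrated autocorrelation time (not necessarily the exact formula in the paper): the effective number of independent samples is the run length divided by twice the integrated autocorrelation time, and error bars must be widened accordingly.

```python
import numpy as np

def integrated_autocorr_time(x, max_lag=100):
    """Estimate the integrated autocorrelation time of a time series:
    0.5 plus the sum of the normalized autocorrelation function over
    positive lags, truncated at max_lag to limit noise."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    acf = np.correlate(x, x, mode="full")[n - 1:] / np.sum(x * x)
    return 0.5 + float(np.sum(acf[1:max_lag]))

def effective_samples(x):
    """Number of effectively independent samples in the series."""
    return len(x) / (2.0 * integrated_autocorr_time(x))

rng = np.random.default_rng(0)
iid = rng.normal(size=5000)          # uncorrelated observable
ar1 = np.empty(5000)                 # strongly correlated observable
ar1[0] = 0.0
for t in range(1, len(ar1)):         # AR(1) process with phi = 0.9
    ar1[t] = 0.9 * ar1[t - 1] + rng.normal()

# The correlated series yields far fewer effective samples, so its
# statistical uncertainty estimate must be scaled up correspondingly.
print(effective_samples(ar1) < effective_samples(iid))  # True
```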

  5. Chapter 18: Variable Frequency Drive Evaluation Protocol

    SciTech Connect

    Romberger, J.

    2014-11-01

    An adjustable-speed drive (ASD) includes all devices that vary the speed of a rotating load, including those that vary the motor speed and linkage devices that allow constant motor speed while varying the load speed. The Variable Frequency Drive Evaluation Protocol presented here addresses evaluation issues for variable-frequency drives (VFDs) installed on commercial and industrial motor-driven centrifugal fans and pumps for which torque varies with speed. Constant-torque load applications, such as those for positive displacement pumps, are not covered by this protocol. Other ASD devices, such as magnetic drives, eddy-current drives, variable belt sheave drives, or direct-current motor variable-voltage drives, are also not addressed. The VFD is by far the most common type of ASD hardware. With VFD speed control on a centrifugal fan or pump motor, energy use follows the affinity laws, which state that motor electricity demand varies with the cube of speed under ideal conditions. Therefore, if the motor runs at 75% speed, the motor demand will ideally be reduced to 42% of full-load power; with other losses, it is about 49% of full-load power.
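
    The cube-law arithmetic quoted above is easy to verify; a minimal sketch (idealized, no losses):

```python
def ideal_power_fraction(speed_fraction):
    """Affinity law for centrifugal fan/pump loads under VFD control:
    ideal power draw is the cube of the speed fraction."""
    return speed_fraction ** 3

# The protocol's example: 75% speed -> ~42% of full-load power ideally
# (about 49% once drive and motor losses are included).
print(round(ideal_power_fraction(0.75), 3))  # 0.422
```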

  6. Advantages of a leveled commitment contracting protocol

    SciTech Connect

    Sandholm, T.W.; Lesser, V.R.

    1996-12-31

    In automated negotiation systems consisting of self-interested agents, contracts have traditionally been binding. Such contracts do not allow agents to efficiently accommodate future events. Game theory has proposed contingency contracts to solve this problem. Among computational agents, contingency contracts are often impractical due to the large number of interdependent and unanticipated future events to be conditioned on, and because some events are not mutually observable. This paper proposes a leveled commitment contracting protocol that allows self-interested agents to efficiently accommodate future events by having the possibility of unilaterally decommitting from a contract based on local reasoning. A decommitment penalty is assigned to both agents in a contract: to be freed from the contract, an agent need only pay this penalty to the other party. It is shown through formal analysis of several contracting settings that this leveled commitment feature in a contracting protocol increases the Pareto efficiency of deals and can make contracts individually rational when no full commitment contract can. This advantage holds even if the agents decommit manipulatively.
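
    The decommitment rule can be illustrated with a toy calculation. This is a sketch under invented payoff numbers, not the paper's formal model: an agent unilaterally decommits exactly when paying the penalty and taking the outside option beats honoring the contract.

```python
def should_decommit(contract_payoff, outside_payoff, penalty):
    """Local-reasoning decommitment rule: break the contract only if
    the outside option, net of the decommitment penalty, pays more."""
    return outside_payoff - penalty > contract_payoff

# An outside offer of 18 beats a contract worth 10 even after a
# penalty of 5 (net 13), so the agent decommits ...
print(should_decommit(contract_payoff=10, outside_payoff=18, penalty=5))  # True
# ... but an offer of 13 does not (net 8), so the contract stands.
print(should_decommit(contract_payoff=10, outside_payoff=13, penalty=5))  # False
```

    The penalty is what makes partial commitment workable: it compensates the counterparty while still letting agents react to events a binding contract could not anticipate.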

  7. A high performance totally ordered multicast protocol

    NASA Technical Reports Server (NTRS)

    Montgomery, Todd; Whetten, Brian; Kaplan, Simon

    1995-01-01

    This paper presents the Reliable Multicast Protocol (RMP). RMP provides a totally ordered, reliable, atomic multicast service on top of an unreliable multicast datagram service such as IP Multicasting. RMP is fully and symmetrically distributed so that no site bears an undue portion of the communication load. RMP provides a wide range of guarantees, from unreliable delivery to totally ordered delivery, to K-resilient, majority-resilient, and totally resilient atomic delivery. These QoS guarantees are selectable on a per-packet basis. RMP provides many communication options, including virtual synchrony, a publisher/subscriber model of message delivery, an implicit naming service, mutually exclusive handlers for messages, and mutually exclusive locks. It has commonly been held that a large performance penalty must be paid in order to implement total ordering; RMP discounts this. On SparcStation 10s on a 1250 KB/sec Ethernet, RMP provides totally ordered packet delivery to one destination at 842 KB/sec throughput and with 3.1 ms packet latency. The performance stays roughly constant independent of the number of destinations. For two or more destinations on a LAN, RMP provides higher throughput than any protocol that does not use multicast or broadcast.
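
    As a generic illustration of what totally ordered delivery means (this is a plain sequence-number sketch, not RMP's actual distributed algorithm), each message carries a global sequence number and every site buffers out-of-order arrivals until the gap is filled, so all sites deliver in the same order:

```python
import heapq

class Site:
    """Delivers multicast messages in global sequence-number order,
    buffering any that arrive early."""
    def __init__(self):
        self.pending = []      # min-heap of (seq, msg) not yet deliverable
        self.next_seq = 0      # next sequence number we may deliver
        self.delivered = []    # messages delivered to the application

    def receive(self, seq, msg):
        heapq.heappush(self.pending, (seq, msg))
        # Deliver every contiguous message starting at next_seq.
        while self.pending and self.pending[0][0] == self.next_seq:
            self.delivered.append(heapq.heappop(self.pending)[1])
            self.next_seq += 1

site = Site()
for seq, msg in [(2, "c"), (0, "a"), (1, "b")]:   # out-of-order arrival
    site.receive(seq, msg)
print(site.delivered)  # ['a', 'b', 'c']
```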

  9. Resilient packet ring media access protocol

    NASA Astrophysics Data System (ADS)

    Thepot, Frederic

    2001-07-01

    The discussion will cover the new initiative to create a new MAC-layer standard for resilient packet rings: IEEE 802.17 RPR. The key aspects of the presentation include a preliminary look at the Metro Area Network today and current networking technologies, such as SONET/SDH, which are not optimized to carry IP traffic over the metro MAN. The next segment will cover the options which could change the traditional and expensive layered networking model, and address the real benefits of marrying several technologies like Ethernet, SONET/SDH and IP into one technology. The next part of the discussion will detail the technical advantages a new MAC will bring to service providers. Lastly, a summary of the view and strategy on the acceptance and deployment of this new technology in the next 12 months: specifically, how one defines and develops standards for a Resilient Packet Ring Access Protocol for use in Local, Metropolitan, and Wide Area Networks for transfer of data packets at rates scalable to multiple gigabits per second; how the standard addresses the data transmission requirements of carriers that have present and planned fiber-optic physical infrastructure in a ring topology; and how detailed specifications are defined and developed for using existing and/or new physical layers at appropriate data rates that will support transmission of this access protocol.

  10. Internet Protocol Enhanced over Satellite Networks

    NASA Technical Reports Server (NTRS)

    Ivancic, William D.

    1999-01-01

    Extensive research conducted by the Satellite Networks and Architectures Branch of the NASA Lewis Research Center led to an experimental change to the Internet's Transmission Control Protocol (TCP) that will increase performance over satellite channels. The change raises the size of the initial burst of data TCP can send from 1 packet to 4 packets or roughly 4 kilobytes (kB), whichever is less. TCP is used daily by everyone on the Internet for e-mail and World Wide Web access, as well as other services. TCP is one of the core protocols used in computer communications for reliable data delivery and file transfer. Increasing TCP's initial data burst from the previously specified single segment to approximately 4 kB may improve data transfer rates by up to 27 percent for very small files. This is significant because most file transfers in wide-area networks today are small files, 4 kilobytes or less. In addition, because data transfers over geostationary satellites can take 5 to 20 times longer than over typical terrestrial connections, increasing the initial burst of data that can be sent is extremely important. This research, along with research from other institutions, has led to the release of two new Requests for Comments from the Internet Engineering Task Force (IETF, the international body that sets Internet standards). In addition, two studies of the implications of this mechanism were also funded by NASA Lewis.
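
    The effect of a larger initial window on small transfers can be seen with a back-of-the-envelope slow-start count. This is an idealized sketch (window doubles every round trip, no loss, a 1460-byte MSS assumed), not a model of any particular TCP implementation:

```python
def rtts_to_send(segments, initial_window):
    """Round trips needed to send `segments` under idealized slow
    start: the congestion window starts at `initial_window` segments
    and doubles each round trip."""
    rtts, window, sent = 0, initial_window, 0
    while sent < segments:
        sent += window
        window *= 2
        rtts += 1
    return rtts

# A ~4 kB file is about 3 segments at a 1460-byte MSS:
print(rtts_to_send(3, 1))  # 2 round trips with the old 1-segment start
print(rtts_to_send(3, 4))  # 1 round trip with the 4-segment initial burst
```

    Saving even one round trip matters most exactly where the abstract says: over a geostationary satellite, a single round trip is on the order of half a second.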

  11. OIL SPILL DISPERSANT EFFECTIVENESS PROTOCOL. II: PERFORMANCE OF THE REVISED PROTOCOL

    EPA Science Inventory

    The current U.S. Environmental Protection Agency (EPA) protocol for testing the effectiveness of dispersants for use in treating oil spills on the open water, the swirling flask test (SFT), has been found to give widely varying results in the hands of different testing laborator...

  12. Unified Protocol for the Transdiagnostic Treatment of Emotional Disorders: Protocol Development and Initial Outcome Data

    ERIC Educational Resources Information Center

    Ellard, Kristen K.; Fairholme, Christopher P.; Boisseau, Christina L.; Farchione, Todd J.; Barlow, David H.

    2010-01-01

    The Unified Protocol (UP) is a transdiagnostic, emotion-focused cognitive-behavioral treatment developed to be applicable across the emotional disorders. The UP consists of 4 core modules: increasing emotional awareness, facilitating flexibility in appraisals, identifying and preventing behavioral and emotional avoidance, and situational and…

  13. A Survey on Underwater Acoustic Sensor Network Routing Protocols.

    PubMed

    Li, Ning; Martínez, José-Fernán; Meneses Chaus, Juan Manuel; Eckert, Martina

    2016-03-22

    Underwater acoustic sensor networks (UASNs) have become more and more important in ocean exploration applications, such as ocean monitoring, pollution detection, ocean resource management, underwater device maintenance, etc. In underwater acoustic sensor networks, since the routing protocol guarantees reliable and effective data transmission from the source node to the destination node, routing protocol design is an attractive topic for researchers, and many routing algorithms have been proposed in recent years. To present the current state of development of UASN routing protocols, we review herein the UASN routing protocol designs reported in recent years. In this paper, all the routing protocols are classified into different groups according to their characteristics and routing algorithms: non-cross-layer design routing protocols, traditional cross-layer design routing protocols, and intelligent-algorithm-based routing protocols. This is also the first paper that introduces intelligent-algorithm-based UASN routing protocols. In addition, we investigate the development trends of UASN routing protocols, which can provide researchers with clear and direct insights for further research.

  14. The Quantum Steganography Protocol via Quantum Noisy Channels

    NASA Astrophysics Data System (ADS)

    Wei, Zhan-Hong; Chen, Xiu-Bo; Niu, Xin-Xin; Yang, Yi-Xian

    2015-08-01

    As a promising branch of quantum information hiding, quantum steganography aims to transmit secret messages covertly over public quantum channels. But due to environmental noise and decoherence, quantum states easily decay and change. It is therefore very meaningful to make a quantum information hiding protocol applicable to noisy quantum channels. In this paper, we further investigate a quantum steganography protocol for noisy quantum channels. We prove that the protocol can covertly transmit secret messages over noisy quantum channels, and we present the quantum steganography protocol explicitly. In the protocol, without the cover data being published, legal receivers can extract the secret message with a certain probability, which gives the protocol good secrecy. Moreover, our protocol has independent security and can be used in general quantum communications. The communication in our protocol does not need entangled states, so the protocol can be used without the limitation of entanglement resources. More importantly, the protocol applies to noisy quantum channels and can therefore be used widely in future quantum communication.

  15. A Survey on Underwater Acoustic Sensor Network Routing Protocols

    PubMed Central

    Li, Ning; Martínez, José-Fernán; Meneses Chaus, Juan Manuel; Eckert, Martina

    2016-01-01

    Underwater acoustic sensor networks (UASNs) have become more and more important in ocean exploration applications, such as ocean monitoring, pollution detection, ocean resource management, underwater device maintenance, etc. In underwater acoustic sensor networks, since the routing protocol guarantees reliable and effective data transmission from the source node to the destination node, routing protocol design is an attractive topic for researchers, and many routing algorithms have been proposed in recent years. To present the current state of development of UASN routing protocols, we review herein the UASN routing protocol designs reported in recent years. In this paper, all the routing protocols are classified into different groups according to their characteristics and routing algorithms: non-cross-layer design routing protocols, traditional cross-layer design routing protocols, and intelligent-algorithm-based routing protocols. This is also the first paper that introduces intelligent-algorithm-based UASN routing protocols. In addition, we investigate the development trends of UASN routing protocols, which can provide researchers with clear and direct insights for further research. PMID:27011193

  17. New Routing Metrics for ADHOC Network Routing Protocols

    NASA Astrophysics Data System (ADS)

    Reddy, P. C.

    2014-12-01

    The performance and reliability of the Internet are measured using different quantities. When the quantities measured are essential and have a wide range of acceptance, they are called metrics. Performance metrics enable comparison and selection among alternatives. In computer networks, metrics are used to evaluate an application, a protocol, etc. Routing in ad hoc networks is nontrivial. Routing protocols for ad hoc networks are still evolving, and there is a need for their continuous evaluation. In the existing literature, several routing protocols are evaluated using standard metrics under different conditions. This paper proposes new metrics for the evaluation of routing protocols and uses them to evaluate the ad hoc network routing protocols AODV, DSR, DSDV and TORA. The simulation environment is created using the NS-2 simulator. Typical ranges of speeds, pause times and data rates are used. The results provide new insights into the working of the routing protocols.

  18. A Secure RFID Authentication Protocol Adopting Error Correction Code

    PubMed Central

    Zheng, Xinying; Chen, Pei-Yu

    2014-01-01

    RFID technology has become popular in many applications; however, most RFID products lack security-related functionality due to the hardware limitations of low-cost RFID tags. In this paper, we propose a lightweight mutual authentication protocol adopting error correction codes for RFID. We also propose an advanced version of our protocol to provide key updating. Based on the secrecy of shared keys, the reader and the tag can establish a mutual authenticity relationship. Further analysis of the protocol shows that it also satisfies integrity, forward secrecy, anonymity, and untraceability. Compared with other lightweight protocols, the proposed protocol provides stronger resistance to tracing attacks, compromising attacks and replay attacks. We also compare our protocol with previous works in terms of performance. PMID:24959619
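
    To make the error-correction idea concrete, here is a deliberately toy sketch (not the paper's protocol; the key, challenge, and the 3x repetition code are all invented for illustration): the tag's challenge response is encoded so the reader can correct a bit flipped on the noisy air interface before verifying it.

```python
def encode(bits):
    """3x repetition code: each bit is sent three times."""
    return [b for b in bits for _ in range(3)]

def decode(bits):
    """Majority vote per 3-bit group corrects any single flipped bit."""
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

KEY = [1, 0, 1, 1]                    # toy shared secret

def tag_reply(challenge):
    """Tag XORs the key with the challenge and ECC-encodes the result."""
    return encode([k ^ c for k, c in zip(KEY, challenge)])

def reader_verify(challenge, noisy_reply):
    """Reader decodes (correcting errors) and checks against the key."""
    expected = [k ^ c for k, c in zip(KEY, challenge)]
    return decode(noisy_reply) == expected

challenge = [0, 1, 1, 0]
reply = tag_reply(challenge)
reply[2] ^= 1                         # one bit corrupted in transit
print(reader_verify(challenge, reply))  # True: the code corrects it
```

    A real lightweight protocol would use a stronger code and fresh nonces per session; the point here is only how error correction lets authentication survive a noisy channel.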

  19. The Real Performance Drivers behind XML Lock Protocols

    NASA Astrophysics Data System (ADS)

    Bächle, Sebastian; Härder, Theo

    Fine-grained lock protocols should allow for highly concurrent transaction processing on XML document trees, which is addressed by the taDOM lock protocol family, enabling specific lock modes and lock granules adjusted to the various XML processing models. We have already proved its operational flexibility and performance superiority in comparison with competitor protocols. Here, we outline the experiences gained during the implementation and optimization of these protocols. We identify their performance drivers to maximize throughput while keeping response times at an acceptable level, fully exploiting the advantages of our tailor-made lock protocols for XML trees. Because we have implemented all options and alternatives in our prototype system XTC, benchmark runs for all “drivers” allow for comparisons in identical environments and illustrate the benefit of all implementation decisions. Finally, they reveal that careful lock protocol optimization pays off.

  20. Probabilistic protocols in quantum information science: Use and abuse

    NASA Astrophysics Data System (ADS)

    Caves, Carlton

    2014-03-01

    Protocols in quantum information science often succeed with less than unit probability, but nonetheless perform useful tasks because success occurs often enough to make tolerable the overhead from having to perform the protocol several times. Any probabilistic protocol must be analyzed from the perspective of the resources required to make the protocol succeed. I present results from analyses of two probabilistic protocols: (i) nondeterministic (or immaculate) linear amplification, in which an input coherent state is amplified some of the time to a larger-amplitude coherent state, and (ii) probabilistic quantum metrology, in which one attempts to improve estimation of a parameter (or parameters) by post-selecting on a particular outcome. The analysis indicates that there is little to be gained from probabilistic protocols in these two situations.
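
    The resource-accounting perspective above reduces to a simple geometric-distribution fact, sketched here (generic arithmetic, not an analysis of either protocol): a protocol succeeding with probability p per attempt costs 1/p attempts on average, and that overhead is what any probabilistic gain must be weighed against.

```python
import random

def expected_attempts(p):
    """Mean number of attempts until first success (geometric law)."""
    return 1.0 / p

# Monte Carlo check for p = 0.2: count attempts until success,
# averaged over many runs, and compare with the 1/p prediction.
random.seed(1)
runs = []
for _ in range(20000):
    n = 1
    while random.random() >= 0.2:     # attempt fails with prob. 0.8
        n += 1
    runs.append(n)
mean = sum(runs) / len(runs)

print(expected_attempts(0.2))         # 5.0
print(abs(mean - 5.0) < 0.3)          # True: simulation matches 1/p
```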

  1. Melanins and melanogenesis: methods, standards, protocols.

    PubMed

    d'Ischia, Marco; Wakamatsu, Kazumasa; Napolitano, Alessandra; Briganti, Stefania; Garcia-Borron, José-Carlos; Kovacs, Daniela; Meredith, Paul; Pezzella, Alessandro; Picardo, Mauro; Sarna, Tadeusz; Simon, John D; Ito, Shosuke

    2013-09-01

    Despite considerable advances in the past decade, melanin research still suffers from the lack of universally accepted and shared nomenclature, methodologies, and structural models. This paper stems from the joint efforts of chemists, biochemists, physicists, biologists, and physicians with recognized and consolidated expertise in the field of melanins and melanogenesis, who critically reviewed and experimentally revisited methods, standards, and protocols to provide for the first time a consensus set of recommended procedures to be adopted and shared by researchers involved in pigment cell research. The aim of the paper was to define an unprecedented frame of reference built on cutting-edge knowledge and state-of-the-art methodology, to enable reliable comparison of results among laboratories and new progress in the field based on standardized methods and shared information.

  2. Automatic quality assessment protocol for MRI equipment.

    PubMed

    Bourel, P; Gibon, D; Coste, E; Daanen, V; Rousseau, J

    1999-12-01

    The authors have developed a protocol and software for the quality assessment of MRI equipment with a commercial test object. Automatic image analysis consists of detecting surfaces and objects, defining regions of interest, acquiring reference point coordinates and establishing gray level profiles. Signal-to-noise ratio, image uniformity, geometrical distortion, slice thickness, slice profile, and spatial resolution are checked. The results are periodically analyzed to evaluate possible drifts with time. The measurements are performed weekly on three MRI scanners made by the Siemens Company (VISION 1.5T, EXPERT 1.0T, and OPEN 0.2T). The results obtained for the three scanners over approximately 3.5 years are presented, analyzed, and compared.
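
    Two of the checks named above can be illustrated in a few lines. These use common QA conventions (signal-to-noise as mean signal over noise standard deviation, and percent integral uniformity from the ROI extremes), which are not necessarily the exact formulas of the paper, and the synthetic phantom data here is invented:

```python
import numpy as np

def snr(signal_roi, noise_roi):
    """Signal-to-noise ratio: mean signal over the noise std. dev."""
    return float(np.mean(signal_roi) / np.std(noise_roi))

def integral_uniformity(roi):
    """Percent integral uniformity from the ROI's max/min signal."""
    hi, lo = float(np.max(roi)), float(np.min(roi))
    return 100.0 * (1.0 - (hi - lo) / (hi + lo))

# Synthetic stand-ins for a uniform-phantom ROI and a background ROI.
rng = np.random.default_rng(0)
phantom = rng.normal(200.0, 2.0, (64, 64))   # bright, nearly uniform
air = rng.normal(0.0, 2.0, (64, 64))         # background noise

print(snr(phantom, air) > 50)                # True for this phantom
print(integral_uniformity(phantom) > 90.0)   # True: nearly uniform
```

    In a weekly QA routine such as the one described, these numbers would be logged per scanner and trended over time to catch drifts.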

  3. Protocols for Handling Messages Between Simulation Computers

    NASA Technical Reports Server (NTRS)

    Balcerowski, John P.; Dunnam, Milton

    2006-01-01

    Practical Simulator Network (PSimNet) is a set of data-communication protocols designed especially for use in handling messages between computers that are engaging cooperatively in real-time or nearly-real-time training simulations. In a typical application, computers that provide individualized training at widely dispersed locations would communicate, by use of PSimNet, with a central host computer that would provide a common computational- simulation environment and common data. Originally intended for use in supporting interfaces between training computers and computers that simulate the responses of spacecraft scientific payloads, PSimNet could be especially well suited for a variety of other applications -- for example, group automobile-driver training in a classroom. Another potential application might lie in networking of automobile-diagnostic computers at repair facilities to a central computer that would compile the expertise of numerous technicians and engineers and act as an expert consulting technician.

  4. Optical protocols for advanced spacecraft networks

    NASA Technical Reports Server (NTRS)

    Bergman, Larry A.

    1991-01-01

    Most present day fiber optic networks are in fact extensions of copper wire networks. As a result, their speed is still limited by electronics even though optics is capable of running three orders of magnitude faster. Also, the fact that photons do not interact with one another (as electrons do) provides optical communication systems with some unique properties or new functionality that is not readily taken advantage of with conventional approaches. Some of the motivation for implementing network protocols in the optical domain, a few possible approaches including optical code-division multiple-access (CDMA), and how this class of networks can extend the technology life cycle of the Space Station Freedom (SSF) with increased performance and functionality are described.

  5. Intermittent kangaroo mother care: a NICU protocol.

    PubMed

    Davanzo, Riccardo; Brovedani, Pierpaolo; Travan, Laura; Kennedy, Jacqueline; Crocetta, Anna; Sanesi, Cecilia; Strajn, Tamara; De Cunto, Angela

    2013-08-01

    The practice of kangaroo mother care (KMC) is steadily increasing in high-tech settings due to its proven benefits for both infants and parents. In spite of that, clear guidelines about how to implement this method of care are lacking, and as a consequence, some restrictions are applied in many neonatal intensive care units (NICUs), preventing its practice. Based on recommendations from the Expert Group of the International Network on Kangaroo Mother Care, we developed a hospital protocol in the neonatal unit of the Institute for Maternal and Child Health in Trieste, Italy, a level 3 unit, aimed to facilitate and promote KMC implementation in high-tech settings. Our guideline is therefore proposed, based both on current scientific literature and on practical considerations and experience. Future adjustments and improvements would be considered based on increasing clinical KMC use and further knowledge.

  6. Extensible Authentication Protocol Overview and Its Applications

    NASA Astrophysics Data System (ADS)

    Youm, Heung Youl

    The Extensible Authentication Protocol (EAP) is an authentication framework that supports multiple authentication mechanisms [38] between a peer and an authentication server in a data communication network. EAP is a useful tool for enabling user authentication and distribution of session keys. Numerous EAP methods have been developed by global SDOs such as IETF, IEEE, ITU-T, and 3GPP. In this paper, we analyze the most widely deployed EAP methods, ranging from EAP-TLS [27] to EAP-PSK [25]. In addition, we derive the security requirements that EAP methods should meet, evaluate typical EAP methods in terms of those requirements, and discuss the features of the existing widely deployed EAP methods. We also identify two typical use cases for the EAP methods. Finally, recent global standardization activities in this area are reviewed.
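
    The framing that all of these EAP methods share is defined in RFC 3748: a Code byte, an Identifier byte, a two-byte Length in network order, and, for Request/Response packets, a Type byte. A minimal parser might look like this; the packet constructed at the end is an example built for illustration.

```python
import struct

# Minimal parser for the EAP packet header defined in RFC 3748.
EAP_CODES = {1: "Request", 2: "Response", 3: "Success", 4: "Failure"}
EAP_TYPES = {1: "Identity", 4: "MD5-Challenge", 13: "EAP-TLS"}

def parse_eap(packet: bytes) -> dict:
    # Code (1 byte), Identifier (1 byte), Length (2 bytes, big-endian).
    code, identifier, length = struct.unpack("!BBH", packet[:4])
    info = {"code": EAP_CODES.get(code, code),
            "identifier": identifier,
            "length": length}
    if code in (1, 2):                  # only Request/Response carry a Type
        info["type"] = EAP_TYPES.get(packet[4], packet[4])
        info["data"] = packet[5:length]
    return info

# An EAP-Request/Identity packet, identifier 7, no payload:
pkt = struct.pack("!BBHB", 1, 7, 5, 1)
print(parse_eap(pkt))
```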

  7. Hamstring Muscle Injuries, a Rehabilitation Protocol Purpose

    PubMed Central

    Valle, Xavier; L.Tol, Johannes; Hamilton, Bruce; Rodas, Gil; Malliaras, Peter; Malliaropoulos, Nikos; Rizo, Vicenc; Moreno, Marcel; Jardi, Jaume

    2015-01-01

    Context: Acute hamstring muscle injuries are prevalent in several sports, including Australian Football League (AFL) football, sprinting, and soccer, and are often associated with prolonged time away from sport. Evidence Acquisition: In response, research into prevention and management of hamstring injury has increased, but epidemiological data show no decline in injury and re-injury rates, suggesting that rehabilitation programs and return-to-play (RTP) criteria need to be improved. There continues to be a lack of consensus regarding how to assess performance, recovery, and readiness to RTP following hamstring strain injury. Results: The aim of this paper was to propose a rehabilitation protocol for hamstring muscle injuries based on current basic science and research knowledge regarding injury demographics and management options. Conclusions: Criteria-based (subjective and objective) progression through the rehabilitation program is outlined, along with exercises for each phase, from initial injury to RTP. PMID:26715969

  8. Neonatal euthanasia: lessons from the Groningen Protocol.

    PubMed

    Eduard Verhagen, A A

    2014-10-01

    Decisions about neonatal end-of-life care have been studied intensely over the last 20 years in The Netherlands. Nationwide surveys were done to quantify these decisions, provide details and monitor the effect of guidelines, new regulations and other interventions. One of those interventions was the Groningen Protocol for newborn euthanasia in severely ill newborns, published in 2005. Before publication, an estimated 20 cases of euthanasia per year were performed. After publication, only two cases in five years were reported. Studies suggested that this might be partly caused by the lack of consensus about the dividing line between euthanasia and palliative care. New recommendations about paralytic medication use in dying newborns were issued to increase transparency and to improve reporting of euthanasia. New surveys will be needed to measure the effects of these interventions. This cycle of interventions and measurements seems useful for continuous improvement of end-of-life care in newborns.

  9. Fault recovery in the reliable multicast protocol

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Montgomery, Todd L.; Whetten, Brian

    1995-01-01

    The Reliable Multicast Protocol (RMP) provides a unique, group-based model for distributed programs that need to handle reconfiguration events at the application layer. This model, called membership views, provides an abstraction in which events such as site failures, network partitions, and normal join-leave events are viewed as group reformations. RMP provides access to this model through an application programming interface (API) that notifies an application when a group is reformed as the result of some event. RMP provides applications with reliable delivery of messages, using an underlying IP Multicast (12, 5) medium, to other group members in a distributed environment even in the case of reformations. A distributed application can use the various Quality of Service (QoS) levels provided by RMP to tolerate group reformations. This paper explores the implementation details of the mechanisms in RMP that provide distributed applications with membership view information and fault recovery capabilities.
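
    The membership-view abstraction described above can be illustrated with a small sketch in which failures, partitions, and joins/leaves all surface to the application as one kind of event: a new view being installed. The class and callback names are hypothetical, not RMP's actual API.

```python
# Hypothetical sketch of a membership-view callback API in the spirit of
# RMP's group reformation model; names are illustrative.

class MembershipView:
    def __init__(self, view_id, members):
        self.view_id = view_id              # monotonically increasing
        self.members = frozenset(members)

class GroupMember:
    """Tracks the current view and notifies the application on reformation."""
    def __init__(self, on_reform):
        self.view = None
        self.on_reform = on_reform          # application callback

    def install_view(self, view_id, members):
        # Site failures, network partitions, and join-leave events all
        # appear here uniformly as installation of a new view.
        old = self.view
        self.view = MembershipView(view_id, members)
        self.on_reform(old, self.view)

events = []
m = GroupMember(lambda old, new: events.append(sorted(new.members)))
m.install_view(1, {"a", "b", "c"})
m.install_view(2, {"a", "b"})               # site "c" failed -> reformation
print(events)  # [['a', 'b', 'c'], ['a', 'b']]
```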

  10. Chest magnetic resonance imaging: a protocol suggestion*

    PubMed Central

    Hochhegger, Bruno; de Souza, Vinícius Valério Silveira; Marchiori, Edson; Irion, Klaus Loureiro; Souza Jr., Arthur Soares; Elias Junior, Jorge; Rodrigues, Rosana Souza; Barreto, Miriam Menna; Escuissato, Dante Luiz; Mançano, Alexandre Dias; Araujo Neto, César Augusto; Guimarães, Marcos Duarte; Nin, Carlos Schuler; Santos, Marcel Koenigkam; Silva, Jorge Luiz Pereira e

    2015-01-01

    In recent years, with the development of ultrafast sequences, magnetic resonance imaging (MRI) has been established as a valuable diagnostic modality in body imaging. Because of improvements in speed and image quality, MRI is now ready for routine clinical use also in the study of pulmonary diseases. The main advantage of MRI of the lungs is its unique combination of morphological and functional assessment in a single imaging session. In this article, the authors review the main technical aspects and suggest a protocol for performing chest MRI. The authors also describe the three major clinical indications for MRI of the lungs: staging of lung tumors; evaluation of pulmonary vascular diseases; and investigation of pulmonary abnormalities in patients who should not be exposed to radiation. PMID:26811555

  11. Designing an Exploration Atmosphere Prebreathe Protocol

    NASA Technical Reports Server (NTRS)

    Conkin, Johnny; Feiveson, A. H.; Gernhardt, M. L.; Norcross, J. R.; Wessel, J. H., III

    2015-01-01

    Extravehicular activities (EVAs) at remote locations must maximize limited resources such as oxygen (O2) and also minimize the risk of decompression sickness (DCS). A proposed remote denitrogenation (prebreathe) protocol requires astronauts to live in a mildly hypoxic atmosphere at 8.2 psia while periodically performing EVAs at 4.3 psia. Empirical data are required to confirm that the protocol meets the current accept requirements: less than or equal to 15% incidence of Type I DCS, less than or equal to 20% incidence of Grade IV venous gas emboli (VGE), both at 95% statistical confidence, with no Type II DCS symptom during the validation trial. METHODS: A repeated measures statistical design is proposed in which groups of 6 subjects with physical characteristics similar to active-duty astronauts would first become equilibrated to an 8.2 psia atmosphere in a hypobaric chamber containing 34% O2 and 66% N2, over 48 h, and then perform 4 simulated EVAs at 4.3 psia over the next 9 days. In the equilibration phase, subjects undergo a 3-h 100% O2 mask prebreathe prior to and during a 5-min ascent to 8.2 psia to prevent significant tissue N2 supersaturation on reaching 8.2 psia. Masks would be removed once 34% O2 is established at 8.2 psia, and subjects would then equilibrate to this atmosphere for 48 h. The hypoxia is equivalent to breathing air at 1,220 meters (4,000 ft) altitude, just as was experienced in the shuttle 10.2 psia - 26.5% O2 staged denitrogenation protocol and the current ISS campout denitrogenation protocol. For simulated EVAs, each subject dons a mask and breathes 85% O2 and 15% N2 during a 3-min depressurization to 6.0 psia, holds for 15 min, and then completes a 3-min depressurization to 4.3 psia. The simulated EVA period starts when 6.0 psia is reached and continues for a total of 240 min (222 min at 4.3 psia). During this time, subjects will follow a prescribed repetitive activity against loads in the upper and lower body with mean metabolic rate
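
    The denitrogenation logic above rests on tissue N2 tension approaching the ambient N2 pressure exponentially. A numerical sketch follows; the single 360-minute half-time compartment and the sea-level starting tension are illustrative assumptions for this example, not values taken from the study design.

```python
# Single-compartment exponential gas washout/washin:
#   P(t) = Pa + (P0 - Pa) * 2**(-t / half_time)
# where P0 is the initial tissue N2 tension, Pa the ambient (alveolar)
# N2 pressure, and half_time the compartment's N2 half-time.

def tissue_n2(p0, pa, minutes, half_time=360.0):
    """Tissue N2 tension (psia) after `minutes` at ambient N2 pressure pa."""
    return pa + (p0 - pa) * 2.0 ** (-minutes / half_time)

# Sea-level air gives roughly 11.6 psia N2 (79% of 14.7 psia).
# During a 3-h 100% O2 mask prebreathe, ambient N2 is ~0 psia:
after_prebreathe = tissue_n2(p0=11.6, pa=0.0, minutes=180)
print(round(after_prebreathe, 2))  # 8.2 -- tissue N2 tension, psia
```

    A slow compartment thus off-gases only partially during the 3-hour prebreathe, which is why the protocol also relies on the 48-hour equilibration at 8.2 psia before the simulated EVAs.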

  12. Assessment and classification of protocol deviations

    PubMed Central

    Ghooi, Ravindra Bhaskar; Bhosale, Neelambari; Wadhwani, Reena; Divate, Pathik; Divate, Uma

    2016-01-01

    Introduction: Deviations from the approved trial protocol are common during clinical trials. They have been conventionally classified as deviations or violations, depending on their impact on the trial. Methods: A new method has been proposed by which deviations are classified in five grades from 1 to 5. A deviation of Grade 1 has no impact on the subjects’ well-being or on the quality of data. At the maximum, a deviation Grade 5 leads to the death of the subject. This method of classification was applied to deviations noted in the center over the last 3 years. Results: It was observed that most deviations were of Grades 1 and 2, with fewer falling in Grades 3 and 4. There were no deviations that led to the death of the subject (Grade 5). Discussion: This method of classification would help trial managers decide on the action to be taken on the occurrence of deviations, which would be based on their impact. PMID:27453830
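
    The five-grade scheme can be sketched as a small lookup plus triage rule. Grades 1 and 5 are as described in the abstract; the intermediate grade descriptions and all of the suggested follow-up actions below are illustrative assumptions, not taken from the paper.

```python
# Sketch of the five-grade protocol-deviation classification.
# Grades 1 and 5 follow the abstract; grades 2-4 and the actions
# are illustrative assumptions.

GRADE_DESCRIPTIONS = {
    1: "No impact on subject well-being or data quality",
    2: "Minor impact on data quality only (assumed)",
    3: "Moderate impact on data quality or subject burden (assumed)",
    4: "Serious impact on subject well-being (assumed)",
    5: "Deviation leading to the death of the subject",
}

def triage(grade: int) -> str:
    """Map a deviation grade to an illustrative follow-up action."""
    if grade not in GRADE_DESCRIPTIONS:
        raise ValueError("grade must be 1-5")
    if grade <= 2:
        return "document and monitor"
    if grade <= 4:
        return "corrective action and report to sponsor/IRB"
    return "immediate reporting and trial-level review"

print(triage(1))  # document and monitor
```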

  13. Phenology monitoring protocol: Northeast Temperate Network

    USGS Publications Warehouse

    Tierney, Geri; Mitchell, Brian; Miller-Rushing, Abraham J.; Katz, Jonathan; Denny, Ellen; Brauer, Corinne; Donovan, Therese; Richardson, Andrew D.; Toomey, Michael; Kozlowski, Adam; Weltzin, Jake F.; Gerst, Kathy; Sharron, Ed; Sonnentag, Oliver; Dieffenbach, Fred

    2013-01-01

    historical parks and national historic sites in the northeastern US. This protocol was developed in collaboration with and relies upon the procedures and infrastructure of the USA National Phenology Network (USA-NPN), including Nature’s Notebook, USA-NPN’s online plant and animal phenology observation program (www.nn.usanpn.org). Organized in 2007, USA-NPN is a nation-wide partnership among federal agencies, schools and universities, citizen volunteers, and others to monitor and understand the influence of seasonal cycles on the nation’s biological resources. The overall goal of NETN’s phenology monitoring program is to determine trends in the phenology of key species in order to assist park managers with the detection and mitigation of the effects of climate change on park resources. An additional programmatic goal is to interest and educate park visitors and staff, as well as a cadre of volunteer monitors.

  14. Outcome analysis of individualized vestibular rehabilitation protocols

    NASA Technical Reports Server (NTRS)

    Black, F. O.; Angel, C. R.; Pesznecker, S. C.; Gianna, C.

    2000-01-01

    OBJECTIVE: To determine the outcome of vestibular rehabilitation protocols in subjects with peripheral vestibular disorders compared with normal and abnormal control subjects. STUDY DESIGN: Prospective study using repeated measure, matched control design. Subjects were solicited consecutively according to these criteria: vestibular disorder subjects who had abnormal results of computerized dynamic posturography (CDP) sensory organization tests (SOTs) 5 and 6 and underwent rehabilitation; vestibular disorder subjects who had abnormal results of SOTs 5 and 6 and did not undergo rehabilitation; and normal subjects (normal SOTs). SETTING: Tertiary neurotology clinic. SUBJECTS: Men and women over age 18 with chronic vestibular disorders and chief complaints of unsteadiness, imbalance, and/or motion intolerance, and normal subjects. INTERVENTIONS: Pre- and post-rehabilitation assessment included CDP, vestibular disability, and activities of daily living questionnaires. Individualized rehabilitation plans were designed and implemented to address the subject's specific complaints and functional deficits. Supervised sessions were held at weekly intervals, and self-administered programs were devised for daily home use. MAIN OUTCOME MEASURES: CDP composite and SOT scores, number of falls on CDP, and self-assessment questionnaire results. RESULTS: Subjects who underwent rehabilitation (Group A) showed statistically significant improvements in SOTs, overall composite score, and reduction in falls compared with abnormal (Group B) control groups. Group A's performances after rehabilitation were not significantly different from those of normal subjects (Group C) in SOTs 3 through 6, and close to normal on SOTs 1 and 2. Subjects in Group A also reported statistically significant symptomatic improvement. CONCLUSIONS: Outcome measures of vestibular protocol physical therapy confirmed objective and subjective improvement in subjects with chronic peripheral vestibular disorders. 

  15. Interplanetary Overlay Network Bundle Protocol Implementation

    NASA Technical Reports Server (NTRS)

    Burleigh, Scott C.

    2011-01-01

    The Interplanetary Overlay Network (ION) system's BP package, an implementation of the Delay-Tolerant Networking (DTN) Bundle Protocol (BP) and supporting services, has been specifically designed to be suitable for use on deep-space robotic vehicles. Although the ION BP implementation is unique in its use of zero-copy objects for high performance, and in its use of resource-sensitive rate control, it is fully interoperable with other implementations of the BP specification (Internet RFC 5050). The ION BP implementation is built using the same software infrastructure that underlies the implementation of the CCSDS (Consultative Committee for Space Data Systems) File Delivery Protocol (CFDP) built into the flight software of Deep Impact. It is designed to minimize resource consumption, while maximizing operational robustness. For example, no dynamic allocation of system memory is required. Like all the other ION packages, ION's BP implementation is designed to port readily between Linux and Solaris (for easy development and for ground system operations) and VxWorks (for flight systems operations). The exact same source code is exercised in both environments. Initially included in the ION BP implementation are the following: libraries of functions used in constructing bundle forwarders and convergence-layer (CL) input and output adapters; a simple prototype bundle forwarder and associated CL adapters designed to run over an IP-based local area network; administrative tools for managing a simple DTN infrastructure built from these components; a background daemon process that silently destroys bundles whose time-to-live intervals have expired; a library of functions exposed to applications, enabling them to issue and receive data encapsulated in DTN bundles; and some simple applications that can be used for system checkout and benchmarking.
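
    The expired-bundle daemon mentioned above can be illustrated with a minimal sketch: a store keyed on expiry time lets a background sweep find and destroy expired bundles cheaply. This is an illustration of the concept, not ION's actual code (ION avoids dynamic memory allocation, which this Python sketch does not model).

```python
import heapq

# Sketch of TTL-based bundle expiry: bundles whose time-to-live has
# elapsed are silently destroyed. A min-heap ordered by expiry time
# makes each sweep proportional to the number of expired bundles.

class BundleStore:
    def __init__(self):
        self._heap = []                     # (expiry_time, bundle_id)

    def admit(self, bundle_id, creation_time, ttl):
        heapq.heappush(self._heap, (creation_time + ttl, bundle_id))

    def sweep(self, now):
        """Destroy and return every bundle whose TTL has expired."""
        expired = []
        while self._heap and self._heap[0][0] <= now:
            _, bundle_id = heapq.heappop(self._heap)
            expired.append(bundle_id)
        return expired

store = BundleStore()
store.admit("b1", creation_time=0, ttl=60)
store.admit("b2", creation_time=0, ttl=600)
print(store.sweep(now=120))   # ['b1']  -- b2 still has 480 s to live
```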

  16. Extraction and anonymity protocol of medical file.

    PubMed Central

    Bouzelat, H.; Quantin, C.; Dusserre, L.

    1996-01-01

    To carry out an epidemiological study of patients suffering from a given cancer, the Department of Medical Informatics (DIM) has to link information coming from different hospitals and medical laboratories in the Burgundy region. Requirements from the French data protection authority (Commission Nationale de l'Informatique et des Libertés: CNIL), regarding compliance with the law of January 6, 1978, completed by the law of July 1, 1994 on the processing of nominal data in the framework of medical research, have to be taken into account. Notably, the CNIL advised rendering patient identities anonymous before the extraction of each establishment's file. This paper describes a recently implemented protocol, registered with the French department for computerized information security (Service Central de la Sécurité des Systèmes d'Information: SCSSI), whose purpose is to render medical files anonymous prior to their extraction. Once rendered anonymous, these files are exportable, so that they can be merged with other files and used in the framework of epidemiological studies. To allow correct linkage of the different pieces of information concerning the same patient, the protocol uses the Secure Hash Algorithm (SHA), which replaces identities with their imprints while ensuring a minimal collision rate. A first evaluation of the extraction and anonymity software with regard to the purpose of an epidemiological survey is described here. In this paper, we also show how it would be possible to implement this system by means of the Internet communication network. PMID:8947681
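
    The hashing idea can be sketched briefly: replace each patient identity with its hash "imprint" so that records from different sites can still be linked on the same imprint without revealing the identity. The protocol above used SHA; this sketch substitutes HMAC-SHA-256 with a shared secret key, an assumption added here so that imprints cannot be brute-forced from candidate identities by anyone without the key.

```python
import hashlib
import hmac

SHARED_KEY = b"distributed-out-of-band"    # illustrative key

def imprint(identity: str) -> str:
    """Keyed hash of a normalized identity; same identity -> same imprint."""
    normalized = identity.strip().upper()  # normalize before hashing
    return hmac.new(SHARED_KEY, normalized.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# The same identity yields the same imprint at every site, so records
# about one patient can be merged after anonymization:
print(imprint("Dupont, Marie 1950-03-02") ==
      imprint("dupont, marie 1950-03-02"))   # True
```

    Normalization before hashing matters in practice: without it, trivial case or spacing differences between sites would produce different imprints and break the linkage.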

  17. Wireless Sensor Networks Energy-Efficient MAC Protocol

    NASA Astrophysics Data System (ADS)

    Lijuan, Du; Yuanpeng, Wang; WeiPeng, Jing

    This paper presents ES-MAC, a new energy-efficient MAC protocol for wireless sensor networks, and reports the results of simulation experiments. During transmission, nodes do not send ACK packets; instead, a small number of new information packets is used, reducing unnecessary energy loss and wasted time. Theoretical analysis and simulation results show that the ES-MAC protocol reduces energy consumption while also reducing network latency and improving network throughput.

  18. TRIGA: Telecommunications Protocol Processing Subsystem Using Reconfigurable Interoperable Gate Arrays

    NASA Technical Reports Server (NTRS)

    Pang, Jackson; Pingree, Paula J.; Torgerson, J. Leigh

    2006-01-01

    We present the Telecommunications protocol processing subsystem using Reconfigurable Interoperable Gate Arrays (TRIGA), a novel approach that unifies fault tolerance, error correction coding and interplanetary communication protocol off-loading to implement CCSDS File Delivery Protocol and Datalink layers. The new reconfigurable architecture offers more than one order of magnitude throughput increase while reducing footprint requirements in memory, command and data handling processor utilization, communication system interconnects and power consumption.

  19. Secure authentication protocol for Internet applications over CATV network

    NASA Astrophysics Data System (ADS)

    Chin, Le-Pond

    1998-02-01

    An authentication protocol is proposed in this paper to implement secure functions, including two-way authentication and key management, between end users and the head-end. The protocol protects transmission against fraud and attacks such as replay and wiretapping. Location privacy is also achieved. A reset protocol is designed to restore the system when failures occur. The security is verified by taking several security and privacy requirements into consideration.
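
    A generic two-way challenge-response exchange of the kind targeted here can be sketched with HMAC over fresh nonces. This is an illustrative construction under the assumption of a pre-shared key, not the paper's actual protocol.

```python
import hashlib
import hmac
import secrets

# Each side authenticates the other by sending a fresh random challenge
# and checking the keyed-hash response; fresh nonces block replay of
# responses captured from earlier sessions.

def respond(shared_key: bytes, challenge: bytes) -> bytes:
    return hmac.new(shared_key, challenge, hashlib.sha256).digest()

key = secrets.token_bytes(32)          # pre-shared between user and head-end

# Head-end authenticates the user:
c1 = secrets.token_bytes(16)           # head-end's fresh challenge
user_response = respond(key, c1)
print(hmac.compare_digest(user_response, respond(key, c1)))  # True

# User authenticates the head-end with an independent fresh challenge:
c2 = secrets.token_bytes(16)
headend_response = respond(key, c2)
print(hmac.compare_digest(headend_response, respond(key, c2)))  # True
```

    `hmac.compare_digest` is used for the comparison so that verification time does not leak information about how many bytes matched.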

  20. Protocols and services for distributed data-intensive science

    NASA Astrophysics Data System (ADS)

    Allcock, William; Foster, Ian; Tuecke, Steven; Chervenak, Ann; Kesselman, Carl

    2001-08-01

    We describe work being performed in the Globus project to develop enabling protocols and services for distributed data-intensive science. These services include:
    * High-performance, secure data transfer protocols based on FTP, plus a range of libraries and tools that use these protocols
    * Replica catalog services supporting the creation and location of file replicas in distributed systems
    These components leverage the substantial body of "Grid" services and protocols developed within the Globus project and by its collaborators, and are being used in a number of data-intensive application projects.

  1. Biometrics based authentication scheme for session initiation protocol.

    PubMed

    Xie, Qi; Tang, Zhixiong

    2016-01-01

    Many two-factor challenge-response based session initiation protocol (SIP) authentication schemes have been proposed, but most of them are vulnerable to stolen smart card attacks and password guessing attacks. In this paper, we propose a novel three-factor SIP authentication scheme using biometrics, a password, and a smart card, and utilize the pi calculus-based formal verification tool ProVerif to prove that the proposed protocol achieves security and authentication. Furthermore, our protocol is highly efficient when compared with other related protocols.

  2. Variable bandwidth broadcasting protocol for video-on-demand

    NASA Astrophysics Data System (ADS)

    Paris, Jehan-Francois; Long, Darrell D. E.

    2003-01-01

    We present the first broadcasting protocol that can alter the number of channels allocated to a given video without inconveniencing the viewer and without causing any temporary bandwidth surge. Our variable bandwidth broadcasting (VBB) protocol assigns to each video a minimum number of channels whose bandwidths are all equal to the video consumption rate. Additional channels can be assigned to the video at any time to reduce customer waiting time, or reclaimed to free server bandwidth. The cost of this additional flexibility is quite reasonable, as the bandwidth requirements of our VBB protocol fall between those of the fast broadcasting protocol and the new pagoda broadcasting protocol.
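
    The abstract compares VBB's bandwidth requirements with the fast broadcasting protocol. The classic fast broadcasting bound gives a feel for the numbers involved: with k channels, each at the video consumption rate, no viewer waits longer than the video duration divided by 2^k - 1. This sketch computes that bound only; VBB's own schedule differs, so the figures are illustrative.

```python
# Fast broadcasting bound: with k equal-rate channels, a video of
# duration D can be scheduled so the maximum viewer wait is D / (2^k - 1).

def fast_broadcast_max_wait(duration_min: float, channels: int) -> float:
    if channels < 1:
        raise ValueError("need at least one channel")
    return duration_min / (2 ** channels - 1)

# Maximum wait for a 120-minute video as channels are added:
for k in range(2, 7):
    print(k, "channels ->", round(fast_broadcast_max_wait(120, k), 1), "min")
```

    Each added channel roughly halves the worst-case wait, which is why protocols in this family trade server bandwidth against customer waiting time.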

  3. A Weak Value Based QKD Protocol Robust Against Detector Attacks

    NASA Astrophysics Data System (ADS)

    Troupe, James

    2015-03-01

    We propose a variation of the BB84 quantum key distribution protocol that utilizes the properties of weak values to ensure the validity of the quantum bit error rate estimates used to detect an eavesdropper. The protocol is shown theoretically to be secure against recently demonstrated attacks utilizing detector blinding and control, and should also be robust against all detector-based hacking. Importantly, the new protocol promises to achieve this additional security without negatively impacting the secure key generation rate as compared to that originally promised by the standard BB84 scheme. Implementation of the weak measurements needed by the protocol should be very feasible using standard quantum optical techniques.
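
    The quantity the weak-value variant is designed to protect is the quantum bit error rate (QBER) estimated on a publicly compared subset of the sifted key. A toy BB84 sifting simulation illustrates where that estimate comes from; no eavesdropper or channel noise is modeled, so the estimated QBER is zero.

```python
import random

random.seed(1)

def bb84_sift(n):
    """Simulate n qubits of ideal BB84 and return the sifted key pairs."""
    alice_bits  = [random.randint(0, 1) for _ in range(n)]
    alice_bases = [random.randint(0, 1) for _ in range(n)]
    bob_bases   = [random.randint(0, 1) for _ in range(n)]
    # With no Eve and no noise, Bob reads Alice's bit when bases match;
    # otherwise his measurement outcome is random.
    bob_bits = [a if ab == bb else random.randint(0, 1)
                for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    return [(a, b) for a, b, ab, bb
            in zip(alice_bits, bob_bits, alice_bases, bob_bases)
            if ab == bb]                     # keep only matching-basis slots

def qber(sifted, sample_fraction=0.5):
    """Estimate the error rate on a sacrificed, publicly compared sample."""
    k = max(1, int(len(sifted) * sample_fraction))
    sample = sifted[:k]                      # compared in public, discarded
    return sum(1 for a, b in sample if a != b) / k

sifted = bb84_sift(1000)
print(qber(sifted))  # 0.0 -- noiseless channel, no eavesdropper
```

    An intercept-resend eavesdropper would raise this estimate toward 25%; the attacks cited above work precisely by corrupting the detector statistics this estimate relies on.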

  4. Analysis of limiting information characteristics of quantum-cryptography protocols

    SciTech Connect

    Sych, D V; Grishanin, Boris A; Zadkov, Viktor N

    2005-01-31

    The problem of increasing the critical error rate of quantum-cryptography protocols by varying the set of letters in a quantum alphabet, for a space of fixed dimensionality, is studied. Quantum alphabets forming regular polyhedra on the Bloch sphere and the continual alphabet equally including all the quantum states are considered. It is shown that, in the absence of basis reconciliation, a protocol with the tetrahedral alphabet has the highest critical error rate among the protocols considered, while after the basis reconciliation, a protocol with the continual alphabet possesses the highest critical error rate. (quantum optics and quantum computation)

  5. A Flexible CSMA based MAC Protocol for Software Defined Radios

    NASA Astrophysics Data System (ADS)

    Puschmann, André; Kalil, Mohamed A.; Mitschele-Thiel, Andreas

    2012-09-01

    In this article, we propose a flexible CSMA based MAC protocol which facilitates research and experimentation using software defined radios. The modular architecture allows the protocol to be employed on platforms with heterogeneous hardware capabilities and provides the freedom to exchange or adapt the spectrum sensing mechanism without modifying the MAC protocol internals. We discuss the architecture of the protocol and provide structural details of its main components. Furthermore, we present throughput measurements that have been obtained on an example system using host-based spectrum sensing.
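
    The listen-before-talk core of a CSMA MAC, with a pluggable sensing function as described above, can be sketched as follows. The function names, contention-window parameters, and attempt limit are illustrative, not from the paper.

```python
import random

# Toy CSMA with binary exponential backoff. The `sense` callable stands
# in for the exchangeable spectrum-sensing mechanism: it returns True
# when the channel is busy.

def csma_send(sense, max_attempts=6, rng=random.Random(0)):
    """Return the attempt number on success, or None if all attempts fail."""
    cw = 2                                     # initial contention window
    for attempt in range(1, max_attempts + 1):
        if not sense():                        # channel idle -> transmit
            return attempt
        backoff_slots = rng.randrange(cw)      # busy -> random backoff
        cw = min(cw * 2, 64)                   # binary exponential backoff
    return None                                # gave up

# Channel that is busy twice, then idle:
states = iter([True, True, False])
print(csma_send(lambda: next(states)))  # 3
```

    Keeping the sensing mechanism behind a simple callable mirrors the article's design point: the backoff logic never needs to change when the sensing implementation does.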

  6. The Protocol of Choice for Treatment of Snake Bite

    PubMed Central

    Mohammad Alizadeh, Afshin; Rahimi, Mitra; Erfantalab, Peyman; Ostadi, Ali

    2016-01-01

    The aim of the current study is to compare three different methods of treatment of snake bite to determine the most efficient one. To unify the protocol of snake bite treatment in our center, we retrospectively reviewed files of the snake-bitten patients who had been referred to us between 2010 and 2014. They were contacted for follow-up using phone calls. Demographic and on-arrival characteristics, protocol used for treatment (WHO/Haddad/GF), and outcome/complications were evaluated. Patients were entered into one of the protocol groups and compared. Of a total of 63 patients, 56 (89%) were males. Five, 19, and 28 patients were managed by Haddad, WHO, or GF protocols, respectively. Eleven patients had fallen into both GF and WHO protocols and were excluded. Serum sickness was significantly more common when WHO protocol was used while 100% of the compartment syndromes and 71% of deformities had been reported after GF protocol. The most important complications were considered to be deformity, compartment syndrome, and amputation and were more frequent after the use of WHO and GF protocols (23.1% versus 76.9%; none in Haddad; P = NS). Haddad protocol seems to be the best for treatment of snake-bitten patients in our region. However, this cannot be strictly concluded because of the limited sample size and nonsignificant P values. PMID:27738653

  8. Protocol for a Delay-Tolerant Data-Communication Network

    NASA Technical Reports Server (NTRS)

    Torgerson, Jordan; Hooke, Adrian; Burleigh, Scott; Fall, Kevin

    2004-01-01

    As its name partly indicates, the Delay-Tolerant Networking (DTN) Bundle Protocol is a protocol for delay-tolerant transmission of data via communication networks. This protocol was conceived as a result of studies of how to adapt Internet protocols so that Internet-like services could be provided across interplanetary distances in support of deep-space exploration. The protocol, and software to implement the protocol, are being developed in collaboration among experts at NASA's Jet Propulsion Laboratory and other institutions. No current Internet protocols can accommodate long transmission delay times or intermittent link connectivity. The DTN Bundle Protocol represents a departure from the standard Internet assumption that a continuous path is available from a host computer to a client computer: It provides for routing of data through networks that may be disjointed and may be characterized by long transmission delays. In addition to networks that include deep-space communication links, examples of such networks include terrestrial ones within which branches are temporarily disconnected. The protocol is based partly on the definition of a message-based overlay above the transport layers of the networks on which it is hosted.

  9. NEMVP: North American energy measurement and verification protocol

    SciTech Connect

    1996-03-01

    This measurement and verification protocol discusses procedures that, when implemented, allow buyers, sellers, and financiers of energy projects to quantify energy conservation measure performance and savings.

  10. The Xpress Transfer Protocol (XTP): A tutorial (short version)

    NASA Technical Reports Server (NTRS)

    Sanders, Robert M.; Weaver, Alfred C.

    1990-01-01

    The Xpress Transfer Protocol (XTP) is a reliable, lightweight transfer layer protocol. Current transport layer protocols such as DoD's Transmission Control Protocol (TCP) and ISO's Transport Protocol (TP) were not designed for the next generation of high speed, interconnected reliable networks such as fiber distributed data interface (FDDI) and the gigabit/second wide area networks. Unlike all previous transport layer protocols, XTP is being designed to be implemented in hardware as a VLSI chip set. By streamlining the protocol, combining the transport and network layers, and utilizing the increased speed and parallelization possible with a VLSI implementation, XTP will be able to provide the end-to-end data transmission rates demanded in the high speed networks without compromising reliability and functionality. This tutorial briefly describes the operation of the XTP protocol and in particular, its error, flow and rate control; inter-networking addressing mechanisms; and multicast support features, as defined in the XTP Protocol Definition Revision 3.4.

  11. The Xpress Transfer Protocol (XTP): A tutorial (expanded version)

    NASA Technical Reports Server (NTRS)

    Sanders, Robert M.; Weaver, Alfred C.

    1990-01-01

    The Xpress Transfer Protocol (XTP) is a reliable, real-time, lightweight transfer layer protocol. Current transport layer protocols such as DoD's Transmission Control Protocol (TCP) and ISO's Transport Protocol (TP) were not designed for the next generation of high speed, interconnected reliable networks such as fiber distributed data interface (FDDI) and the gigabit/second wide area networks. Unlike all previous transport layer protocols, XTP is being designed to be implemented in hardware as a VLSI chip set. By streamlining the protocol, combining the transport and network layers and utilizing the increased speed and parallelization possible with a VLSI implementation, XTP will be able to provide the end-to-end data transmission rates demanded in high speed networks without compromising reliability and functionality. This paper describes the operation of the XTP protocol and in particular, its error, flow and rate control; inter-networking addressing mechanisms; and multicast support features, as defined in the XTP Protocol Definition Revision 3.4.

  12. The AAPM/RSNA physics tutorial for residents: digital fluoroscopy.

    PubMed

    Pooley, R A; McKinney, J M; Miller, D A

    2001-01-01

    A digital fluoroscopy system is most commonly configured as a conventional fluoroscopy system (tube, table, image intensifier, video system) in which the analog video signal is converted to and stored as digital data. Other methods of acquiring the digital data (eg, digital or charge-coupled device video and flat-panel detectors) will become more prevalent in the future. Fundamental concepts related to digital imaging in general include binary numbers, pixels, and gray levels. Digital image data allow the convenient use of several image processing techniques including last image hold, gray-scale processing, temporal frame averaging, and edge enhancement. Real-time subtraction of digital fluoroscopic images after injection of contrast material has led to widespread use of digital subtraction angiography (DSA). Additional image processing techniques used with DSA include road mapping, image fade, mask pixel shift, frame summation, and vessel size measurement. Peripheral angiography performed with an automatic moving table allows imaging of the peripheral vasculature with a single contrast material injection.

  14. Report of AAPM TG 135: quality assurance for robotic radiosurgery.

    PubMed

    Dieterich, Sonja; Cavedon, Carlo; Chuang, Cynthia F; Cohen, Alan B; Garrett, Jeffrey A; Lee, Charles L; Lowenstein, Jessica R; d'Souza, Maximian F; Taylor, David D; Wu, Xiaodong; Yu, Cheng

    2011-06-01

    The task group (TG) for quality assurance for robotic radiosurgery was formed by the American Association of Physicists in Medicine's Science Council under the direction of the Radiation Therapy Committee and the Quality Assurance (QA) Subcommittee. The task group (TG-135) had three main charges: (1) To make recommendations on a code of practice for Robotic Radiosurgery QA; (2) To make recommendations on quality assurance and dosimetric verification techniques, especially in regard to real-time respiratory motion tracking software; (3) To make recommendations on issues which require further research and development. This report provides a general functional overview of the only clinically implemented robotic radiosurgery device, the CyberKnife. This report includes sections on device components and their individual component QA recommendations, followed by a section on the QA requirements for integrated systems. Examples of checklists for daily, monthly, annual, and upgrade QA are given as guidance for medical physicists. Areas in which QA procedures are still under development are discussed.

  15. Information searching protocol: a smart protocol for specific content search over the Internet

    NASA Astrophysics Data System (ADS)

    Bhattarakosol, Pattarasinee; Preechaveerakul, Ladda

    2006-10-01

    Currently, information is very important to Internet users. Unfortunately, searching the Internet for specific information is not as easy as one might wish. Existing search engine mechanisms cannot use a URL pathname as a search key, so users who know a partial URL pathname cannot use that knowledge to narrow down the search results and must spend a long time locating the required web site in the result list. This paper proposes a search protocol named the Information Searching Protocol (ISP) that supports multiple search contents for users who know a partial URL pathname and keywords. Moreover, the architecture of the Global Search Engine System (GSES), which cooperates with the ISP and is responsible for the search mechanism, is also proposed. The GSES consists of two separate parts: an Information Searching Protocol agent at the client site and GSES components at the server site. These components allow users to perform searches using a URL pathname combined with keywords. The functions of the GSES components show that the ISP enhances the search mechanism, so users receive more specific URLs and can quickly access the required site.
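The search idea the abstract describes — filter candidates by a partial URL pathname, then match keywords — can be sketched as follows. This is an assumed illustration of the concept, not the actual ISP/GSES implementation; the index entries and URLs are invented:

```python
def isp_search(index, partial_path, keywords):
    """index: list of (url, description) pairs. Return URLs whose
    pathname contains partial_path, ranked by keyword hits in the
    description (a toy stand-in for the GSES search components)."""
    hits = []
    for url, desc in index:
        if partial_path in url:                       # pathname filter
            score = sum(kw.lower() in desc.lower() for kw in keywords)
            if score:
                hits.append((score, url))
    return [url for score, url in sorted(hits, reverse=True)]

index = [
    ("http://example.edu/physics/aapm/tg61.html", "AAPM TG-61 dosimetry protocol"),
    ("http://example.edu/physics/qa/tg135.html",  "robotic radiosurgery QA"),
    ("http://example.org/news/index.html",        "general news"),
]
results = isp_search(index, "/physics/", ["protocol"])
# Only the /physics/ pages are considered; the TG-61 page matches "protocol".
```

Combining both keys is what narrows the result list compared with keyword-only search engines.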

  16. A Security Analysis of the 802.11s Wireless Mesh Network Routing Protocol and Its Secure Routing Protocols

    PubMed Central

    Tan, Whye Kit; Lee, Sang-Gon; Lam, Jun Huy; Yoo, Seong-Moo

    2013-01-01

    Wireless mesh networks (WMNs) can act as a scalable backbone by connecting separate sensor networks and even by connecting WMNs to a wired network. The Hybrid Wireless Mesh Protocol (HWMP) is the default routing protocol for the 802.11s WMN. The routing protocol is one of the most important parts of the network, and it requires protection, especially in the wireless environment. The existing security protocols, such as the Broadcast Integrity Protocol (BIP), Counter with cipher block chaining message authentication code protocol (CCMP), Secure Hybrid Wireless Mesh Protocol (SHWMP), Identity Based Cryptography HWMP (IBC-HWMP), Elliptic Curve Digital Signature Algorithm HWMP (ECDSA-HWMP), and Watchdog-HWMP aim to protect the HWMP frames. In this paper, we have analyzed the vulnerabilities of the HWMP and developed security requirements to protect these identified vulnerabilities. We applied the security requirements to analyze the existing secure schemes for HWMP. The results of our analysis indicate that none of these protocols is able to satisfy all of the security requirements. We also present a quantitative complexity comparison among the protocols and an example of a security scheme for HWMP to demonstrate how the result of our research can be utilized. Our research results thus provide a tool for designing secure schemes for the HWMP. PMID:24002231

  18. Exploring Shared Memory Protocols in FLASH

    SciTech Connect

    Horowitz, Mark; Kunz, Robert; Hall, Mary; Lucas, Robert; Chame, Jacqueline

    2007-04-01

    The goal of this project was to improve the performance of large scientific and engineering applications through collaborative hardware and software mechanisms to manage the memory hierarchy of non-uniform memory access time (NUMA) shared-memory machines, as well as their component individual processors. In spite of the programming advantages of shared-memory platforms, obtaining good performance for large scientific and engineering applications on such machines can be challenging. Because communication between processors is managed implicitly by the hardware, rather than expressed by the programmer, application performance may suffer from unintended communication – communication that the programmer did not consider when developing his/her application. In this project, we developed and evaluated a collection of hardware, compiler, language and performance monitoring tools to obtain high performance on scientific and engineering applications on NUMA platforms by managing communication through alternative coherence mechanisms. Alternative coherence mechanisms have often been discussed as a means for reducing unintended communication, although architecture implementations of such mechanisms are quite rare. This report describes an actual implementation of a set of coherence protocols that support coherent, non-coherent and write-update accesses for a CC-NUMA shared-memory architecture, the Stanford FLASH machine. Such an approach has the advantages of using alternative coherence only where it is beneficial, and also provides an evolutionary migration path for improving application performance. We present data on two computations, RandomAccess from the HPC Challenge benchmarks and a forward solver derived from LS-DYNA, showing the performance advantages of the alternative coherence mechanisms. For RandomAccess, the non-coherent and write-update versions can outperform the coherent version by factors of 5 and 2.5, respectively. In LS-DYNA, we obtain

  19. Judges' Agreement and Disagreement Patterns When Encoding Verbal Protocols.

    ERIC Educational Resources Information Center

    Schael, Jocelyne; Dionne, Jean-Paul

    The basis of agreement or disagreement among judges/evaluators when applying a coding scheme to concurrent verbal protocols was studied. The sample included 20 university graduates, from varied backgrounds; 10 subjects had and 10 subjects did not have experience in protocol analysis. The total sample was divided into four balanced groups according…

  20. The Geneva Protocol of 1925: Past and Present.

    ERIC Educational Resources Information Center

    Harbison, John L.

    1982-01-01

    Presents a position paper for use in high school social studies class debates on the Geneva Protocol of 1925. The Protocol was an international agreement to restrict chemical and biological warfare (CBW). The author traces the history of U.S. policies dealing with CBW since 1925. (AM)

  1. Clinicians as Communication Partners: Developing a Mediated Discourse Elicitation Protocol

    ERIC Educational Resources Information Center

    Hengst, Julie A.; Duff, Melissa C.

    2007-01-01

    This article presents the development and piloting of a mediated discourse elicitation protocol. Grounded in situated theories of communication and informed by mediated discourse analysis, this protocol selectively samples familiar discourse types in a manner designed to preserve interactional aspects of communication. Critically, the mediated…

  2. PACIFIC NORTHWEST SIDE-BY-SIDE PROTOCOL COMPARISON TEST

    EPA Science Inventory

    Eleven state, tribal, and federal agencies participated during summer 2005 in a side-by-side comparison of protocols used to measure common in-stream physical attributes to help determine which protocols are best for determining status and trend of stream/watershed condition. Th...

  3. Concurrent Think-Aloud Protocol as a Socially Situated Construct

    ERIC Educational Resources Information Center

    Sasaki, Tomomi

    2008-01-01

    Verbal report protocols have been considered as direct representations of individual cognitive processes. The present study examined the social nature of verbal reports, particularly focusing on whether and in what ways concurrent think-aloud (TA) protocol data are recipient-designed. The results of this study suggest that verbal reports elicited…

  4. Discrete rotational symmetry and quantum-key-distribution protocols

    SciTech Connect

    Shirokoff, David; Fung, Chi-Hang Fred; Lo, Hoi-Kwong

    2007-03-15

    We study the role of discrete rotational symmetry in quantum key distribution by generalizing the well-known Bennett-Brassard 1984 and Scarani-Acin-Ribordy-Gisin 2004 protocols. We observe that discrete rotational symmetry results in the protocol's invariance under continuous rotations, leading to a simplified relation between the bit and phase error rates and consequently a straightforward security proof.
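The protocols generalized in this work build on BB84-style prepare-and-measure key distribution. For orientation, here is a minimal sketch of plain BB84 basis sifting (the baseline protocol, not the paper's generalization, and with no eavesdropper, channel noise, or error correction modeled):

```python
import random

def bb84_sift(n_bits, seed=0):
    """Toy BB84 sketch: Alice sends random bits in random bases, Bob
    measures in random bases, and both keep only positions where the
    bases agree (sifting). Returns the sifted (alice, bob) bit pairs."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.randint(0, 1) for _ in range(n_bits)]
    bob_bases   = [rng.randint(0, 1) for _ in range(n_bits)]
    # With matching bases Bob recovers Alice's bit; otherwise his
    # measurement outcome is random and the position is discarded.
    bob_bits = [b if ab == bb else rng.randint(0, 1)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    return [(a, b)
            for a, b, ab, bb in zip(alice_bits, bob_bits, alice_bases, bob_bases)
            if ab == bb]

sifted = bb84_sift(1000)
errors = sum(a != b for a, b in sifted)
# In this noiseless toy the sifted bit error rate is zero; in a real run
# the estimated error rate is what reveals an eavesdropper.
```

Roughly half of the transmitted bits survive sifting, since the two independently chosen bases agree with probability 1/2.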

  5. 40 CFR 766.14 - Contents of protocols.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 32 2013-07-01 2013-07-01 false Contents of protocols. 766.14 Section 766.14 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT DIBENZO-PARA-DIOXINS/DIBENZOFURANS General Provisions § 766.14 Contents of protocols....

  6. 40 CFR 766.14 - Contents of protocols.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Contents of protocols. 766.14 Section 766.14 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT DIBENZO-PARA-DIOXINS/DIBENZOFURANS General Provisions § 766.14 Contents of protocols....

  7. 40 CFR 766.14 - Contents of protocols.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 31 2014-07-01 2014-07-01 false Contents of protocols. 766.14 Section 766.14 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT DIBENZO-PARA-DIOXINS/DIBENZOFURANS General Provisions § 766.14 Contents of protocols....

  8. 40 CFR 766.14 - Contents of protocols.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 32 2012-07-01 2012-07-01 false Contents of protocols. 766.14 Section 766.14 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT DIBENZO-PARA-DIOXINS/DIBENZOFURANS General Provisions § 766.14 Contents of protocols....

  9. SDLIP + STARTS = SDARTS: A Protocol and Toolkit for Metasearching.

    ERIC Educational Resources Information Center

    Green, Noah; Ipeirotis, Panagiotis G.; Gravano, Luis

    This paper describes how SDLIP and STARTS, two complementary protocols for searching over distributed document collections, were combined. The resulting protocol, called SDARTS, is simple yet expressible enough to enable building sophisticated metasearch engines. SDARTS can be viewed as an instantiation of SDLIP with metasearch-specific elements…

  10. Experimental eavesdropping attack against Ekert's protocol based on Wigner's inequality

    SciTech Connect

    Bovino, F. A.; Colla, A. M.; Castagnoli, G.; Castelletto, S.; Degiovanni, I. P.; Rastello, M. L.

    2003-09-01

    We experimentally implemented an eavesdropping attack against the Ekert protocol for quantum key distribution based on the Wigner inequality. We demonstrate a serious lack of security of this protocol when the eavesdropper gains total control of the source. In addition we tested a modified Wigner inequality which should guarantee a secure quantum key distribution.

  11. Power Tools for Talking: Custom Protocols Enrich Coaching Conversations

    ERIC Educational Resources Information Center

    Pomerantz, Francesca; Ippolito, Jacy

    2015-01-01

    Discussion-based protocols--an "agreed upon set of discussion or observation rules that guide coach/teacher/student work, discussion, and interactions" (Ippolito & Lieberman, 2012, p. 79)--can help focus and structure productive professional learning discussions. However, while protocols are slowly growing into essential elements of…

  12. A Network Client Using the Gopher Information Discovery Protocol

    1993-10-05

    WSGOPHER uses the protocol known as Gopher, which is described in Internet RFC 1436. Specifically Gopher is a client/server protocol. Gopher servers provide information across the network to Gopher clients. WSGOPHER is an implementation of a Gopher client for Microsoft Windows 3.1 and Windows Sockets version 1.1.
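The RFC 1436 exchange a client like WSGOPHER performs is very simple: the client sends a selector string terminated by CRLF, and the server replies with menu lines of tab-separated fields (type character, display string, selector, host, port), ended by a lone ".". A small parser for such a reply (the sample menu data is invented):

```python
def parse_gopher_menu(reply):
    """Parse an RFC 1436 Gopher menu reply into a list of item dicts."""
    items = []
    for line in reply.split("\r\n"):
        if line == ".":                      # end-of-menu marker
            break
        type_char, rest = line[0], line[1:]  # first char is the item type
        display, selector, host, port = rest.split("\t")
        items.append({"type": type_char, "display": display,
                      "selector": selector, "host": host, "port": int(port)})
    return items

reply = ("0About this server\t/about.txt\tgopher.example.org\t70\r\n"
         "1Software\t/software\tgopher.example.org\t70\r\n"
         ".")
menu = parse_gopher_menu(reply)
# Item type "0" is a text file, "1" a submenu; the client fetches an item
# by connecting to its host/port and sending its selector plus CRLF.
```

A real client would obtain `reply` by opening a TCP connection to port 70 and sending the selector; the parsing step is the same.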

  13. The Strong-Inference Protocol: Not Just for Grant Proposals

    ERIC Educational Resources Information Center

    Hiebert, Sara M.

    2007-01-01

    The strong-inference protocol puts into action the important concepts in Platt's often-assigned, classic paper on the strong-inference method (10). Yet, perhaps because students are frequently performing experiments with known outcomes, the protocols they write as undergraduates are usually little more than step-by-step instructions for performing…

  14. 21 CFR 312.320 - Treatment IND or treatment protocol.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 5 2014-04-01 2014-04-01 false Treatment IND or treatment protocol. 312.320... for Treatment Use § 312.320 Treatment IND or treatment protocol. Under this section, FDA may permit an investigational drug to be used for widespread treatment use. (a) Criteria. The criteria in § 312.305(a) must...

  15. 21 CFR 312.320 - Treatment IND or treatment protocol.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 5 2012-04-01 2012-04-01 false Treatment IND or treatment protocol. 312.320... for Treatment Use § 312.320 Treatment IND or treatment protocol. Under this section, FDA may permit an investigational drug to be used for widespread treatment use. (a) Criteria. The criteria in § 312.305(a) must...

  16. 21 CFR 312.320 - Treatment IND or treatment protocol.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 5 2011-04-01 2011-04-01 false Treatment IND or treatment protocol. 312.320... for Treatment Use § 312.320 Treatment IND or treatment protocol. Under this section, FDA may permit an investigational drug to be used for widespread treatment use. (a) Criteria. The criteria in § 312.305(a) must...

  17. 21 CFR 312.320 - Treatment IND or treatment protocol.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 5 2013-04-01 2013-04-01 false Treatment IND or treatment protocol. 312.320... for Treatment Use § 312.320 Treatment IND or treatment protocol. Under this section, FDA may permit an investigational drug to be used for widespread treatment use. (a) Criteria. The criteria in § 312.305(a) must...

  18. Enhancing Self-Regulated Learning by Writing Learning Protocols

    ERIC Educational Resources Information Center

    Nuckles, Matthias; Hubner, Sandra; Renkl, Alexander

    2009-01-01

    Learning protocols are a self-guided way of writing that allows for elaboration and reflection on learning content. In an experimental study (N = 103), we supported protocol writing with prompts to elicit important strategies as postulated by a cyclical model of self-regulated learning. Students received either (a) no prompts, (b) cognitive…

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION TEST PROTOCOL, GENERAL VENTILATION FILTERS

    EPA Science Inventory

    The Environmental Technology Verification Test Protocol, General Ventilation Filters provides guidance for verification tests.

    Reference is made in the protocol to the ASHRAE 52.2P "Method of Testing General Ventilation Air-cleaning Devices for Removal Efficiency by P...

  20. The Design of a Reliable Multipeer Protocol for DVEs.

    ERIC Educational Resources Information Center

    Stuer, Gunther; Broeckhove, Jan; Arickx, Frans

    2003-01-01

    Presents the design and implementation of a reliable multipeer protocol (RMPP), which is written in Java and is suitable for applications in the area of distributed virtual environments (DVEs). Discusses motivation, protocol classification, design goals and the error recovery algorithm and lists implementation issues. Presents two possible…

  1. Exercise countermeasure protocol management expert system

    NASA Technical Reports Server (NTRS)

    Webster, L.; Chen, J. G.; Flores, L.; Tan, S.

    1993-01-01

    Exercise will be used primarily as a countermeasure against deconditioning on extended space flights. In this paper we describe the development and evaluation of an expert system for exercise countermeasure protocol management. Currently, the system includes two major subsystems: baseline prescription and prescription adjustment. The baseline prescription subsystem provides initial exercise prescriptions, while the prescription adjustment subsystem modifies the initial prescription based on exercise progress. The system runs under three different environments: PC, Sun workstation, and Symbolics machine. The inference engine, baseline prescription module, prescription adjustment module, and explanation module were developed in the Symbolics environment using the ART (Automated Reasoning Tool) software. The Sun environment handles database management and interfaces with the PC environment to obtain physical and physiological data from exercise units on board during the flight. Data from eight subjects were used to evaluate system performance by comparing the prescriptions of nine experienced exercise physiologists with those produced by the expert system. The results of the validation test indicated that the performance of the expert system was acceptable.
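The two-subsystem structure described above — a baseline prescription followed by rule-based adjustment — can be sketched as a pair of functions. The rules and thresholds below are invented for illustration only; they are not taken from the paper's knowledge base:

```python
def baseline_prescription(vo2max):
    """Initial aerobic prescription from a fitness measure
    (hypothetical rule standing in for the baseline subsystem)."""
    intensity = 0.6 if vo2max < 35 else 0.7   # fraction of max heart rate
    return {"intensity": intensity, "minutes": 30}

def adjust(prescription, perceived_exertion):
    """Adjustment subsystem sketch: ease off if exertion is high,
    progress the duration if it is low (Borg-scale style input)."""
    p = dict(prescription)
    if perceived_exertion > 15:       # session felt hard -> reduce intensity
        p["intensity"] = round(p["intensity"] - 0.05, 2)
    elif perceived_exertion < 11:     # session felt light -> extend duration
        p["minutes"] += 5
    return p

p0 = baseline_prescription(vo2max=40)   # initial prescription
p1 = adjust(p0, perceived_exertion=9)   # progressed after an easy session
```

The actual system encoded such rules in ART on a Symbolics Lisp machine; the point here is only the baseline-then-adjust control flow.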

  2. Data Provenance in Photogrammetry Through Documentation Protocols

    NASA Astrophysics Data System (ADS)

    Carboni, N.; Bruseker, G.; Guillem, A.; Bellido Castañeda, D.; Coughenour, C.; Domajnko, M.; de Kramer, M.; Ramos Calles, M. M.; Stathopoulou, E. K.; Suma, R.

    2016-06-01

    Documenting the relevant aspects of digitisation processes such as photogrammetry in order to provide a robust provenance for their products continues to present a challenge. The creation of a product that can be re-used scientifically requires a framework for consistent, standardised documentation of the entire digitisation pipeline. This article provides an analysis of the problems inherent to such goals and presents a series of protocols to document the various steps of a photogrammetric workflow. We propose this pipeline, with descriptors to track all phases of digital product creation, in order to assure data provenance and enable the validation of the operations from an analytic and production perspective. The approach aims to support adopters of the workflow in defining procedures with a long-term perspective. The conceptual schema we present is founded on an analysis of information and actor exchanges in the digitisation process. The metadata were defined through the synthesis of previous proposals in this area and were tested on a case study. We performed the digitisation of a set of cultural heritage artefacts from an Iron Age burial in Ilmendorf, Germany. The objects were captured and processed using different techniques, including a comparison of different imaging tools and algorithms. This augmented the complexity of the process, allowing us to test the flexibility of the schema for documenting complex scenarios. Although we have only presented a photogrammetry digitisation scenario, we claim that our schema is easily applicable to a multitude of 3D documentation processes.
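To make the idea of per-step provenance descriptors concrete, here is one way a record for a single pipeline step could be serialized. The field names and values are illustrative inventions, not the schema proposed in the paper:

```python
import json

# Hypothetical provenance record for one step of a photogrammetric
# pipeline: who did what, with which device and parameters, producing
# which outputs from which inputs.
step = {
    "step": "image_acquisition",
    "actor": "operator_01",
    "device": {"camera": "DSLR", "lens_mm": 50},
    "timestamp": "2016-03-14T10:22:00Z",
    "inputs": [],
    "outputs": ["IMG_0001.raw"],
    "parameters": {"iso": 100, "aperture": "f/8"},
}
record = json.dumps(step, indent=2)
# Chaining such records (each step's inputs referencing earlier outputs)
# is what lets a later user validate how the 3D product was produced.
```

The same structure extends to processing steps (alignment, dense matching, meshing), with algorithm names and software versions under `parameters`.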

  3. [The research protocol IV: study variables].

    PubMed

    Villasís-Keever, Miguel Ángel; Miranda-Novales, María Guadalupe

    2016-01-01

    The variables in a research study are everything that is measured: the information or data collected in order to answer the research questions, which are specified in the objectives. Their selection is essential to the research protocol. This article aims to point out the elements to be considered in the variables section. To avoid ambiguity, it is necessary to select only those variables that will help achieve the study objectives. How each variable will be measured should then be defined to ensure that the findings can be replicated; it is therefore desirable to include conceptual and operational definitions. From the methodological point of view, the classification of variables helps us understand how the relationships among them are conceptualized. Depending on the study design, the independent, dependent, universal, and confounding variables should be noted. Another element indispensable for planning statistical analyses is the scale of measurement. One must therefore specify whether each variable corresponds to one of the following four: qualitative nominal, qualitative ordinal, quantitative interval, or quantitative ratio. Finally, the measurement units of each variable should be detailed. PMID:27560918

  4. Protocol for sampling environmental sites for legionellae

    SciTech Connect

    Barbaree, J.M.; Gorman, G.W.; Martin, W.T.; Fields, B.S.; Morrill, W.E.

    1987-07-01

    A protocol for sampling environmental sites was developed and used to identify possible sources of Legionella species in support of epidemiologic investigations at two hospitals. In hospital A, legionellae were isolated from 43 of 106 (40%) different sites. Three separate Legionella pneumophila serotypes and a previously unrecognized species were present in different combinations in the positive samples. Two of five cooling towers contained the same L. pneumophila serogroup 1 monoclonal type (1,2,4,5) as was isolated from patients. The same monoclonal type was also isolated from make-up water for the two cooling towers, a hot water tank, water separators in four main air compressor systems for respiratory therapy, and cold and hot water faucets. In hospital B, 13 of 37 (38%) sample sites contained legionellae, all of which were L. pneumophila serogroup 1. The monoclonal type matching isolates from patients (1,2,4,5) was found at the highest concentration in a hot water tank, but it was also present at four other sample sites. Since legionellae not related to disease may be found in many of the sites sampled, an epidemiologic association with the probable source should be established before intervention methods, such as disinfection, are undertaken.

  5. [The research protocol III. Study population].

    PubMed

    Arias-Gómez, Jesús; Villasís-Keever, Miguel Ángel; Miranda-Novales, María Guadalupe

    2016-01-01

    The study population is defined as a set of cases that is determined, limited, and accessible, which will constitute the subjects for the selection of the sample and must fulfill several characteristics and distinct criteria. The objectives of this manuscript are to specify each of the elements required to select the participants of a research project during the elaboration of the protocol, including the concepts of study population, sample, selection criteria, and sampling methods. After delineating the study population, the researcher must specify the criteria with which each participant has to comply. The criteria that define the specific characteristics are called selection or eligibility criteria; these are inclusion, exclusion, and elimination criteria, and they delineate the eligible population. Sampling methods are divided into two large groups: 1) probabilistic or random sampling and 2) non-probabilistic sampling; the difference lies in the use of statistical methods to select the subjects. In every research study, it is necessary to establish at the outset the specific number of participants to be included to achieve the study objectives. This number is the sample size, and it can be calculated or estimated with mathematical formulas and statistical software. PMID:27174763
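The abstract notes that the sample size can be calculated with mathematical formulas. One common textbook formula, for estimating a proportion with a given confidence level and margin of error, is n = z²·p·(1−p)/e² (the article does not specify which formula it has in mind, so this is just a representative example):

```python
import math

def sample_size_proportion(z=1.96, p=0.5, e=0.05):
    """Sample size for estimating a proportion:
    z = z-score for the confidence level (1.96 for 95%),
    p = anticipated proportion (0.5 is the conservative worst case),
    e = desired margin of error. Rounds up, since n must be whole."""
    return math.ceil(z ** 2 * p * (1 - p) / e ** 2)

n = sample_size_proportion()   # 95% confidence, p = 0.5, +/-5% margin
# Yields the familiar "about 385 subjects" figure for survey-type studies.
```

Comparative designs (two means, two proportions) use different formulas with effect size and power, which is where the statistical software mentioned in the abstract typically comes in.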

  6. Potential anesthesia protocols for space exploration missions.

    PubMed

    Komorowski, Matthieu; Watkins, Sharmila D; Lebuffe, Gilles; Clark, Jonathan B

    2013-03-01

    In spaceflight beyond low Earth orbit, medical conditions requiring surgery are of high concern because of their potential impact on crew health and mission success. Whereas surgical techniques have been thoroughly studied in spaceflight analogues, research focusing on anesthesia is limited. Providing safe anesthesia during an exploration mission will be a highly challenging task. The research objective is thus to describe specific anesthesia procedures enabling treatment of pre-identified surgical conditions. Among the medical conditions considered by the NASA Human Research Program Exploration Medical Capability element, those potentially necessitating anesthesia techniques have been identified, and the most appropriate procedure for each condition is thoroughly discussed. The substantial training time necessary to implement regional anesthesia is pointed out. Among general anesthetics, ketamine combines the unique advantages of preserving cardiovascular stability, protective airway reflexes, and spontaneous ventilation. Ketamine's side effects have for decades tempered enthusiasm for its use, but recent developments in means of mitigating them have broadened its indications. The extensive experience gathered in remote environments, with minimal equipment and occasionally by insufficiently trained care providers, confirms its high degree of safety. Two ketamine-based anesthesia protocols are described with their corresponding indications. They have been designed taking into account the physiological changes occurring in microgravity and the specific constraints of exploration missions. This investigation could not only improve surgical care during long-duration spaceflights but may also find a number of terrestrial applications in isolated or austere environments.

  7. Fault discovery protocol for passive optical networks

    NASA Astrophysics Data System (ADS)

    Hajduczenia, Marek; Fonseca, Daniel; da Silva, Henrique J. A.; Monteiro, Paulo P.

    2007-06-01

    All existing flavors of passive optical networks (PONs) provide an attractive alternative to legacy copper-based access lines deployed between the central office (CO) of a service provider (SP) and a customer site. One of the most challenging tasks for PON network planners is reducing the overall cost of protection schemes for the optical fiber plant while maintaining a reasonable level of survivability and reducing downtime, thus ensuring acceptable quality of service (QoS) for end subscribers. The recently growing volume of Ethernet PON deployments [Kramer, IEEE 802.3, CFI (2006)], combined with the low-cost electronic and optical components used in optical network unit (ONU) modules, creates a situation in which remote detection of faulty/active subscriber modules becomes indispensable for proper operation of an EPON system. We address the problem of remotely detecting faulty ONUs in a system where the upstream channel is flooded with continuous-wave (cw) transmission from one or more damaged ONUs and standard communication is severed, providing a solution applicable to any type of PON network regardless of operating protocol, physical structure, and data rate.

  10. Enhanced parent selection algorithms in mintroute protocol

    NASA Astrophysics Data System (ADS)

    Kim, Ki-Il

    2012-11-01

    Low-rate, short-range radio communication on small devices often hampers high reliability in wireless sensor networks. However, applications increasingly demand high reliability. To meet this requirement, various approaches have been proposed at each layer. Among these, MintRoute is a well-known network-layer approach that develops a new link-quality metric for path selection toward the sink. By choosing the link with the highest measured quality, it is more likely to transmit a packet over the link without error. However, several operational issues remain. In this paper, we propose revised algorithms that improve the MintRoute protocol: a parent selection that considers distance and level from the sink node, and a fast recovery method against failures. Simulations and analysis are performed in order to validate the reduced end-to-end delay and fast recovery from failures, and thus the enhanced reliability of communication.
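    The revised parent selection can be sketched as follows. This is an illustrative reconstruction, not the paper's algorithm; the neighbor fields and the weighting are assumptions made for the example.

```python
def select_parent(neighbors):
    """Pick a forwarding parent toward the sink.

    Each neighbor is a dict with 'link_quality' (higher is better),
    'level' (hop count from the sink) and 'distance' (estimated
    distance to the sink). Plain MintRoute keeps only link quality;
    the revised rule sketched here also prefers nodes closer to the sink.
    """
    def score(n):
        # Weights are illustrative, not taken from the paper.
        return n["link_quality"] - 0.5 * n["level"] - 0.1 * n["distance"]
    return max(neighbors, key=score)

candidates = [
    {"id": "a", "link_quality": 0.9, "level": 3, "distance": 40},
    {"id": "b", "link_quality": 0.8, "level": 1, "distance": 10},
]
print(select_parent(candidates)["id"])  # "b": slightly worse link, much closer to the sink
```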

  11. RADCAL Operations Manual Radiation Calibration Laboratory Protocol

    SciTech Connect

    Bogard, J.S.

    1998-12-01

    The Life Sciences Division (LSD) of Oak Ridge National Laboratory (ORNL) has a long record of radiation dosimetry research, primarily using the Health Physics Research Reactor (HPRR) and the Radiation Calibration Laboratory (RADCAL) in its Dosimetry Applications Research (DOSAR) Program. These facilities have been used by a broad segment of the research community to perform a variety of experiments in areas including, but not limited to, radiobiology, radiation dosimeter and instrumentation development and calibration, and the testing of materials in a variety of radiation environments. Operations of the HPRR were terminated in 1987 and the reactor was moved to storage at the Oak Ridge Y-12 Plant; however, RADCAL will continue to be operated in accordance with the guidelines of the National Institute of Standards and Technology (NIST) Secondary Calibration Laboratory program and will meet all requirements for testing dosimeters under the National Voluntary Laboratory Accreditation Program (NVLAP). This manual is to serve as the primary instruction and operation manual for the Oak Ridge National Laboratory's RADCAL facility. Its purpose is to (1) provide operating protocols for the RADCAL facility, (2) outline the organizational structure, (3) define the Quality Assurance Action Plan, and (4) describe all the procedures, operations, and responsibilities for the safe and proper operation of all routine aspects of the calibration facility.

  12. Parabiosis in Mice: A Detailed Protocol

    PubMed Central

    Kamran, Paniz; Sereti, Konstantina-Ioanna; Zhao, Peng; Ali, Shah R.; Weissman, Irving L.; Ardehali, Reza

    2013-01-01

    Parabiosis is the surgical union of two organisms, allowing them to share blood circulation. Attaching the skin of two animals promotes formation of microvasculature at the site of inflammation. Parabiotic partners share their circulating antigens and thus are free of adverse immune reactions. First described by Paul Bert in 1864 [1], the parabiosis surgery was refined by Bunster and Meyer in 1933 to improve animal survival [2]. In the current protocol, two mice are surgically joined following a modification of the Bunster and Meyer technique. Animals are connected through the elbow and knee joints, followed by attachment of the skin, allowing firm support that prevents strain on the sutured skin. Herein, we describe in detail the parabiotic joining of a ubiquitous GFP-expressing mouse to a wild type (WT) mouse. Two weeks after the procedure, the pair is separated and GFP-positive cells can be detected by flow cytometric analysis in the blood circulation of the WT mouse. The blood chimerism allows one to examine the contribution of circulating cells from one animal in the other. PMID:24145664

  13. Spherical-code key-distribution protocols for qubits

    SciTech Connect

    Renes, Joseph M.

    2004-11-01

    Recently spherical codes were introduced as potentially more capable ensembles for quantum key distribution. Here we develop specific key-creation protocols for the two qubit-based spherical codes, the trine and tetrahedron, and analyze them in the context of a suitably tailored intercept/resend attack, both in standard form, and in a 'gentler' version whose back action on the quantum state is weaker. When compared to the standard unbiased basis protocols, Bennett-Brassard 1984 (BB84) and six-state, two distinct advantages are found. First, they offer improved tolerance of eavesdropping, the trine besting its counterpart BB84 and the tetrahedron the six-state protocol. Second, the key error rate may be computed from the sift rate of the protocol itself, removing the need to sacrifice key bits for this purpose. This simplifies the protocol and improves the overall key rate.

  14. Auto-configuration protocols in mobile ad hoc networks.

    PubMed

    Villalba, Luis Javier García; Matesanz, Julián García; Orozco, Ana Lucila Sandoval; Díaz, José Duván Márquez

    2011-01-01

    The TCP/IP protocol allows the different nodes in a network to communicate by associating a different IP address to each node. In wired or wireless networks with infrastructure, we have a server or node acting as such which correctly assigns IP addresses, but in mobile ad hoc networks there is no such centralized entity capable of carrying out this function. Therefore, a protocol is needed to perform the network configuration automatically and in a dynamic way, which will use all nodes in the network (or part thereof) as if they were servers that manage IP addresses. This article reviews the major proposed auto-configuration protocols for mobile ad hoc networks, with particular emphasis on one of the most recent: D2HCP. This work also includes a comparison of auto-configuration protocols for mobile ad hoc networks by specifying the most relevant metrics, such as a guarantee of uniqueness, overhead, latency, dependency on the routing protocol and uniformity.

  15. The importance of the Montreal Protocol in protecting climate

    PubMed Central

    Velders, Guus J. M.; Andersen, Stephen O.; Daniel, John S.; Fahey, David W.; McFarland, Mack

    2007-01-01

    The 1987 Montreal Protocol on Substances that Deplete the Ozone Layer is a landmark agreement that has successfully reduced the global production, consumption, and emissions of ozone-depleting substances (ODSs). ODSs are also greenhouse gases that contribute to the radiative forcing of climate change. Using historical ODSs emissions and scenarios of potential emissions, we show that the ODS contribution to radiative forcing most likely would have been much larger if the ODS link to stratospheric ozone depletion had not been recognized in 1974 and followed by a series of regulations. The climate protection already achieved by the Montreal Protocol alone is far larger than the reduction target of the first commitment period of the Kyoto Protocol. Additional climate benefits that are significant compared with the Kyoto Protocol reduction target could be achieved by actions under the Montreal Protocol, by managing the emissions of substitute fluorocarbon gases and/or implementing alternative gases with lower global warming potentials. PMID:17360370

  16. Packet-Based Protocol Efficiency for Aeronautical and Satellite Communications

    NASA Technical Reports Server (NTRS)

    Carek, David A.

    2005-01-01

    This paper examines the relation between bit error ratios and the effective link efficiency when transporting data with a packet-based protocol. Relations are developed to quantify the impact of a protocol's packet size and header size relative to the bit error ratio of the underlying link. These relations are examined in the context of radio transmissions that exhibit variable error conditions, such as those used in satellite, aeronautical, and other wireless networks. A comparison of two packet sizing methodologies is presented. From these relations, the true ability of a link to deliver user data, or information, is determined. Relations are developed to calculate the optimal protocol packet size for given link error characteristics. These relations could be useful in future research for developing an adaptive protocol layer. They can also be used for sizing protocols in the design of static links, where bit error ratios have small variability.
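    The relations described, link efficiency as a function of payload size, header size, and bit error ratio, can be sketched numerically. The model below assumes independent bit errors and retransmission of any corrupted packet; these are assumptions for illustration, since the paper's exact model is not reproduced here.

```python
def link_efficiency(payload_bits, header_bits, ber):
    """Fraction of link capacity delivering user data: the payload
    fraction of the packet times the probability the whole packet
    arrives error-free (independent bit errors, corrupted packets resent)."""
    total = payload_bits + header_bits
    return (payload_bits / total) * (1.0 - ber) ** total

def optimal_payload(header_bits, ber, max_bits=1 << 16):
    """Numerically find the payload size maximizing efficiency."""
    return max(range(1, max_bits),
               key=lambda L: link_efficiency(L, header_bits, ber))

# e.g., a 160-bit header on a link with a bit error ratio of 1e-5
print(optimal_payload(160, 1e-5))
```

    The trade-off is visible in the two factors: larger packets amortize the header better, but are more likely to contain at least one bit error, so a higher bit error ratio pushes the optimal payload size down.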

  17. Near-optimal protocols in complex nonequilibrium transformations.

    PubMed

    Gingrich, Todd R; Rotskoff, Grant M; Crooks, Gavin E; Geissler, Phillip L

    2016-09-13

    The development of sophisticated experimental means to control nanoscale systems has motivated efforts to design driving protocols that minimize the energy dissipated to the environment. Computational models are a crucial tool in this practical challenge. We describe a general method for sampling an ensemble of finite-time, nonequilibrium protocols biased toward a low average dissipation. We show that this scheme can be carried out very efficiently in several limiting cases. As an application, we sample the ensemble of low-dissipation protocols that invert the magnetization of a 2D Ising model and explore how the diversity of the protocols varies in response to constraints on the average dissipation. In this example, we find that there is a large set of protocols with average dissipation close to the optimal value, which we argue is a general phenomenon. PMID:27573816

  18. [Immunosuppressive protocols in kidney transplantation: with or without induction?].

    PubMed

    Nehme Chelala, Dania; Mourani, Chebl; Moukarzel, Maroun; Azar, Hiba

    2015-01-01

    Kidney transplantation is the treatment of choice for end-stage kidney disease. Over the years, kidney transplantation has progressed tremendously, mainly through the improvement of the immunosuppressive drugs used to prevent acute rejection. Since the introduction of cyclosporine in the 80s, many immunosuppressive protocols have been established. These protocols are characterized by two strategies: with or without induction. The agents used in induction therapies can be polyclonal or monoclonal antibodies. The decision to use induction therapy relies mainly on the evaluation of the immunological risk in the recipient. Even if protocols with induction have improved early results concerning acute rejection, protocols without induction seem justified in some candidates. The optimal immunosuppressive protocol is not yet established, and individualization of immunosuppressive treatment is necessary. PMID:26591195

  19. Proof of fault coverage for a formal protocol test procedure

    NASA Astrophysics Data System (ADS)

    Randall, Michael A.

    1992-12-01

    Due to the speed and complexity of communication networks being designed today, it is imperative to ensure that they operate correctly. Today's fiber optic networks, which can transmit billions of bits per second over thousands of miles, are heavily dependent on sophisticated software and protocols which are becoming increasingly difficult to test. Conformance testing is a method that is used for this purpose: to test the design of a protocol against an implementation of the design. This thesis provides some insight into the conformance testing problem by first providing background on some current protocol test methods and then focusing on a newer method, which is based on a formal protocol specification. A proof is given that demonstrates the method's error detection capabilities. Two well-known local area network protocols, Token Bus and Fiber Distributed Data Interface (FDDI), are used as examples to illustrate how the test method is applied to a specification.

  20. Security of the arbitrated quantum signature protocols revisited

    NASA Astrophysics Data System (ADS)

    Kejia, Zhang; Dan, Li; Qi, Su

    2014-01-01

    Recently, much attention has been paid to the study of arbitrated quantum signature (AQS). Among these studies, the cryptanalysis of some AQS protocols and a series of improved ideas have been proposed. Compared with the previous analysis, we present a security criterion which can judge whether an AQS protocol is able to prevent the receiver (i.e. one participant in the signature protocol) from forging a legal signature. According to our results, it can be seen that most AQS protocols which are based on the Zeng and Keitel (ZK) model are susceptible to a forgery attack. Furthermore, we present an improved idea for the ZK protocol. Finally, some supplementary discussions and several interesting topics are provided.

  1. A Secured Authentication Protocol for SIP Using Elliptic Curves Cryptography

    NASA Astrophysics Data System (ADS)

    Chen, Tien-Ho; Yeh, Hsiu-Lien; Liu, Pin-Chuan; Hsiang, Han-Chen; Shih, Wei-Kuan

    Session initiation protocol (SIP) is a technology regularly performed in Internet Telephony, and Hyper Text Transport Protocol (HTTP) as digest authentication is one of the major methods for SIP authentication mechanism. In 2005, Yang et al. pointed out that HTTP could not resist server spoofing attack and off-line guessing attack and proposed a secret authentication with Diffie-Hellman concept. In 2009, Tsai proposed a nonce based authentication protocol for SIP. In this paper, we demonstrate that their protocol could not resist the password guessing attack and insider attack. Furthermore, we propose an ECC-based authentication mechanism to solve their issues and present security analysis of our protocol to show that ours is suitable for applications with higher security requirement.
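    For reference, the HTTP digest response that SIP reuses (RFC 2617, shown here without the optional qop fields for brevity) is a nested MD5 hash; a minimal sketch:

```python
import hashlib

def md5_hex(s):
    return hashlib.md5(s.encode()).hexdigest()

def digest_response(username, realm, password, method, uri, nonce):
    """RFC 2617 digest response as used in SIP (qop omitted).
    Only hashes of the password cross the wire, but the scheme
    remains vulnerable to offline guessing, as the abstract notes."""
    ha1 = md5_hex(f"{username}:{realm}:{password}")   # user credentials
    ha2 = md5_hex(f"{method}:{uri}")                  # request being authenticated
    return md5_hex(f"{ha1}:{nonce}:{ha2}")

resp = digest_response("alice", "example.com", "s3cret",
                       "REGISTER", "sip:example.com", "abc123")
```

    Because the response is a deterministic function of the password, an attacker who captures one exchange can test password guesses offline, which is the class of weakness the ECC-based proposal aims to remove.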

  2. The importance of the Montreal Protocol in protecting climate.

    PubMed

    Velders, Guus J M; Andersen, Stephen O; Daniel, John S; Fahey, David W; McFarland, Mack

    2007-03-20

    The 1987 Montreal Protocol on Substances that Deplete the Ozone Layer is a landmark agreement that has successfully reduced the global production, consumption, and emissions of ozone-depleting substances (ODSs). ODSs are also greenhouse gases that contribute to the radiative forcing of climate change. Using historical ODSs emissions and scenarios of potential emissions, we show that the ODS contribution to radiative forcing most likely would have been much larger if the ODS link to stratospheric ozone depletion had not been recognized in 1974 and followed by a series of regulations. The climate protection already achieved by the Montreal Protocol alone is far larger than the reduction target of the first commitment period of the Kyoto Protocol. Additional climate benefits that are significant compared with the Kyoto Protocol reduction target could be achieved by actions under the Montreal Protocol, by managing the emissions of substitute fluorocarbon gases and/or implementing alternative gases with lower global warming potentials.

  3. Reducing the write traffic for a hybrid cache protocol

    SciTech Connect

    Dahlgren, F.; Stenstroem, P.

    1994-12-31

    Coherence misses limit the performance of write-invalidate cache protocols in large-scale shared-memory multiprocessors. By contrast, hybrid protocols mix updates with invalidations and can reduce the coherence miss rate. The gains from fewer coherence misses, however, can sometimes be outweighed by contention due to the extra traffic, making techniques to cut the write traffic important. We study in this paper how write traffic for hybrid protocols can be reduced by incorporating a write cache in each node. Detailed architectural simulations reveal that write caches are effective in exploiting locality in write accesses under relaxed memory consistency models. Hybrid protocols augmented with write caches with only a few entries are shown to outperform a write-invalidate protocol for all five benchmark applications under study.
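    The idea of a small per-node write cache can be sketched as follows. This is an illustrative model, not the simulated hardware design from the paper; the size and eviction policy are assumptions.

```python
class WriteCache:
    """Tiny write-coalescing cache sketch. Writes to the same line are
    merged locally; an update message goes on the interconnect only on
    eviction or an explicit flush, as relaxed consistency models permit."""
    def __init__(self, entries=4):
        self.entries = entries
        self.lines = {}            # line address -> merged data
        self.updates_sent = 0

    def write(self, line, data):
        if line not in self.lines and len(self.lines) >= self.entries:
            self._evict()
        self.lines[line] = data    # repeated writes coalesce here

    def _evict(self):
        victim = next(iter(self.lines))
        del self.lines[victim]
        self.updates_sent += 1

    def flush(self):               # e.g., at a release synchronization point
        self.updates_sent += len(self.lines)
        self.lines.clear()

wc = WriteCache(entries=2)
for _ in range(10):
    wc.write(0x40, "x")            # ten writes to one line
wc.flush()
print(wc.updates_sent)             # 1 update message instead of 10
```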

  4. A Self-Stabilizing Synchronization Protocol for Arbitrary Digraphs

    NASA Technical Reports Server (NTRS)

    Malekpour, Mahyar R.

    2011-01-01

    This paper presents a self-stabilizing distributed clock synchronization protocol in the absence of faults in the system. It is focused on the distributed clock synchronization of an arbitrary, non-partitioned digraph ranging from fully connected to 1-connected networks of nodes while allowing for differences in the network elements. This protocol does not rely on assumptions about the initial state of the system, other than the presence of at least one node, and no central clock or a centrally generated signal, pulse, or message is used. Nodes are anonymous, i.e., they do not have unique identities. There is no theoretical limit on the maximum number of participating nodes. The only constraint on the behavior of the node is that the interactions with other nodes are restricted to defined links and interfaces. This protocol deterministically converges within a time bound that is a linear function of the self-stabilization period. We present an outline of a deductive proof of the correctness of the protocol. A bounded model of the protocol was mechanically verified for a variety of topologies. Results of the mechanical proof of the correctness of the protocol are provided. The model checking results have verified the correctness of the protocol as they apply to the networks with unidirectional and bidirectional links. In addition, the results confirm the claims of determinism and linear convergence. As a result, we conjecture that the protocol solves the general case of this problem. We also present several variations of the protocol and discuss that this synchronization protocol is indeed an emergent system.

  5. Ocean Optics Protocols for Satellite Ocean Color Sensor Validation. Volume 6; Special Topics in Ocean Optics Protocols and Appendices; Revised

    NASA Technical Reports Server (NTRS)

    Mueller, J. L. (Editor); Fargion, Giulietta S. (Editor); McClain, Charles R. (Editor)

    2003-01-01

    This document stipulates protocols for measuring bio-optical and radiometric data for the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project activities and algorithm development. The document is organized into 6 separate volumes as Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 4. Volume I: Introduction, Background and Conventions; Volume II: Instrument Specifications, Characterization and Calibration; Volume III: Radiometric Measurements and Data Analysis Methods; Volume IV: Inherent Optical Properties: Instruments, Characterization, Field Measurements and Data Analysis Protocols; Volume V: Biogeochemical and Bio-Optical Measurements and Data Analysis Methods; Volume VI: Special Topics in Ocean Optics Protocols and Appendices. The earlier version of Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 3 (Mueller and Fargion 2002, Volumes 1 and 2) is entirely superseded by the six volumes of Revision 4 listed above.

  6. Mars Sample Handling Protocol Workshop Series: Workshop 4

    NASA Technical Reports Server (NTRS)

    Race Margaret S. (Editor); DeVincenzi, Donald L. (Editor); Rummel, John D. (Editor); Acevedo, Sara E. (Editor)

    2001-01-01

    In preparation for missions to Mars that will involve the return of samples to Earth, it will be necessary to prepare for the receiving, handling, testing, distributing, and archiving of martian materials here on Earth. Previous groups and committees have studied selected aspects of sample return activities, but specific detailed protocols for the handling and testing of returned samples must still be developed. To further refine the requirements for sample hazard testing and to develop the criteria for subsequent release of sample materials from quarantine, the NASA Planetary Protection Officer convened a series of workshops in 2000-2001. The overall objective of the Workshop Series was to produce a Draft Protocol by which returned martian sample materials can be assessed for biological hazards and examined for evidence of life (extant or extinct) while safeguarding the purity of the samples from possible terrestrial contamination. This report also provides a record of the proceedings of Workshop 4, the final Workshop of the Series, which was held in Arlington, Virginia, June 5-7, 2001. During Workshop 4, the sub-groups were provided with a draft of the protocol compiled in May 2001 from the work done at prior Workshops in the Series. Then eight sub-groups were formed to discuss the following assigned topics: Review and Assess the Draft Protocol for Physical/Chemical Testing; Review and Assess the Draft Protocol for Life Detection Testing; Review and Assess the Draft Protocol for Biohazard Testing; Environmental and Health/Monitoring and Safety Issues; Requirements of the Draft Protocol for Facilities and Equipment; Contingency Planning for Different Outcomes of the Draft Protocol; Personnel Management Considerations in Implementation of the Draft Protocol; and Draft Protocol Implementation Process and Update Concepts. This report provides the first complete presentation of the Draft Protocol for Mars Sample Handling to meet planetary protection needs. This Draft Protocol

  7. Fault-tolerant quantum blind signature protocols against collective noise

    NASA Astrophysics Data System (ADS)

    Zhang, Ming-Hui; Li, Hui-Fang

    2016-07-01

    This work proposes two fault-tolerant quantum blind signature protocols based on the entanglement swapping of logical Bell states, which are robust against two kinds of collective noises: the collective-dephasing noise and the collective-rotation noise, respectively. Both of the quantum blind signature protocols are constructed from four-qubit decoherence-free (DF) states, i.e., logical Bell qubits. The initial message is encoded on the logical Bell qubits with logical unitary operations, which will not destroy the anti-noise trait of the logical Bell qubits. Based on the fundamental property of quantum entanglement swapping, the receiver simply performs two Bell-state measurements (rather than four-qubit joint measurements) on the logical Bell qubits to verify the signature, which makes the protocols more convenient in a practical application. Different from the existing quantum signature protocols, our protocols can offer the high fidelity of quantum communication with the employment of logical qubits. Moreover, we hereinafter prove the security of the protocols against some individual eavesdropping attacks, and we show that our protocols have the characteristics of unforgeability, undeniability and blindness.

  8. Compositional mining of multiple object API protocols through state abstraction.

    PubMed

    Dai, Ziying; Mao, Xiaoguang; Lei, Yan; Qi, Yuhua; Wang, Rui; Gu, Bin

    2013-01-01

    API protocols specify correct sequences of method invocations. Despite their usefulness, API protocols are often unavailable in practice because writing them is cumbersome and error prone. Multiple object API protocols are more expressive than single object API protocols. However, the huge number of objects of typical object-oriented programs poses a major challenge to the automatic mining of multiple object API protocols: besides maintaining scalability, it is important to capture various object interactions. Current approaches utilize various heuristics to focus on small sets of methods. In this paper, we present a general, scalable, multiple object API protocols mining approach that can capture all object interactions. Our approach uses abstract field values to label object states during the mining process. We first mine single object typestates as finite state automata whose transitions are annotated with states of interacting objects before and after the execution of the corresponding method and then construct multiple object API protocols by composing these annotated single object typestates. We implement our approach for Java and evaluate it through a series of experiments. PMID:23844378
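    The core idea, labeling object states with abstract field values and recording method-induced transitions, can be sketched as below. This is a simplified single-object version under assumed trace and abstraction formats, not the authors' implementation.

```python
from collections import defaultdict

def mine_typestate(traces, abstract):
    """Mine a finite-state automaton from method-call traces.
    `abstract` maps a concrete object state to an abstract label
    (e.g., derived from field values); each transition records which
    method moved the object between abstract states."""
    transitions = defaultdict(set)
    for trace in traces:
        state = "INIT"
        for method, new_concrete_state in trace:
            new_state = abstract(new_concrete_state)
            transitions[(state, method)].add(new_state)
            state = new_state
    return dict(transitions)

# Hypothetical file-like object whose 'open' field abstracts its state.
traces = [
    [("open", {"open": True}), ("read", {"open": True}), ("close", {"open": False})],
    [("open", {"open": True}), ("close", {"open": False})],
]
fsm = mine_typestate(traces,
                     abstract=lambda s: "OPEN" if s["open"] else "CLOSED")
print(fsm[("INIT", "open")])   # {'OPEN'}
```

    The mined automaton makes protocol violations checkable: a `read` from the `CLOSED` state has no mined transition, so it would be flagged as a suspect call sequence.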

  9. How to Implement a Protocol for Babel RMI

    SciTech Connect

    Kumfert, G; Leek, J

    2006-03-30

    RMI support in Babel has two main goals: transparency & flexibility. Transparency meaning that the new RMI features are entirely transparent to existing Babelized code; flexibility meaning the RMI capability should also be flexible enough to support a variety of RMI transport implementations. Babel RMI is a big success in both areas. Babel RMI is completely transparent to already Babelized implementation code, allowing painless upgrade, and only very minor setup changes are required in client code to take advantage of RMI. The Babel RMI transport mechanism is also extremely flexible. Any protocol that implements Babel's minimal, but complete, interface may be used as a Babel RMI protocol. The Babel RMI API allows users to select the best protocol and connection model for their application, whether that means a WebServices-like client-server model for use over a WAP, or a faster binary peer-to-peer protocol for use on different nodes in a leadership-class supercomputer. Users can even change protocols without recompiling their code. The goal of this paper is to give network researchers and protocol implementors the information they need to develop new protocols for Babel RMI. This paper will cover both the high-level interfaces in the Babel RMI API, and the low level details about how Babel RMI handles RMI objects.

  10. Development of a novel protocol for generating flavivirus reporter particles.

    PubMed

    Fernández, Igor Velado; Okamoto, Natsumi; Ito, Aki; Fukuda, Miki; Someya, Azusa; Nishino, Yosii; Sasaki, Nobuya; Maeda, Akihiko

    2014-11-01

    Infection with West Nile virus (WNV), a mosquito-borne flavivirus, is a growing public and animal health concern worldwide. Prevention, diagnosis and treatment strategies for the infection are urgently required. Recently, viral reverse genetic systems have been developed and applied to clinical WNV virology. We developed a protocol for generating reporter virus particles (RVPs) of WNV with the aim of overcoming two major problems associated with conventional protocols: the difficulty of generating RVPs due to the specific skills required for handling RNAs, and the potential for environmental contamination by antibiotic-resistance genes encoded within the genome RNA of the RVPs. Using the proposed protocol, cells were established in which the RVP genome RNA is replicated constitutively and does not encode any antibiotic-resistance genes, and these were used as the cell supply for RVP genome RNA. Generation of the WNV RVPs requires only the simple transfection of the expression vectors for the viral structural proteins into the cells. Therefore, no RNA handling is required in this protocol. The WNV RVP yield obtained using this protocol was similar to that obtained using the conventional protocol. According to these results, the newly developed protocol appears to be a good alternative for the generation of WNV RVPs, particularly for clinical applications.

  11. Fault-tolerant quantum blind signature protocols against collective noise

    NASA Astrophysics Data System (ADS)

    Zhang, Ming-Hui; Li, Hui-Fang

    2016-10-01

    This work proposes two fault-tolerant quantum blind signature protocols based on the entanglement swapping of logical Bell states, which are robust against two kinds of collective noises: the collective-dephasing noise and the collective-rotation noise, respectively. Both of the quantum blind signature protocols are constructed from four-qubit decoherence-free (DF) states, i.e., logical Bell qubits. The initial message is encoded on the logical Bell qubits with logical unitary operations, which will not destroy the anti-noise trait of the logical Bell qubits. Based on the fundamental property of quantum entanglement swapping, the receiver simply performs two Bell-state measurements (rather than four-qubit joint measurements) on the logical Bell qubits to verify the signature, which makes the protocols more convenient in a practical application. Different from the existing quantum signature protocols, our protocols can offer the high fidelity of quantum communication with the employment of logical qubits. Moreover, we hereinafter prove the security of the protocols against some individual eavesdropping attacks, and we show that our protocols have the characteristics of unforgeability, undeniability and blindness.

  12. Protocols, practices, and the reproduction of technique in molecular biology.

    PubMed

    Lynch, Michael

    2002-06-01

    Protocols are one of the main organizational resources in molecular biology. They are written instructions that specify ingredients, equipment, and sequences of steps for making technical preparations. Some protocols are published in widely used manuals, while others are hand-written variants used by particular laboratories and individual technicians. It is widely understood, both in molecular biology and in social studies of science, that protocols do not describe exactly what practitioners do in the laboratory workplace. In social studies of science, the difference between protocols and the actual practices of doing them often is used to set up ironic contrasts between 'messy' laboratory practices and the appearance of technical order. Alternatively, in ethnomethodological studies of work, the difference is examined as a constitutive feature, both of the lived-work of doing technical projects, and of the administrative work of regulating and evaluating such projects. The present article takes its point of departure from ethnomethodology, and begins with a discussion of local problems with performing molecular biology protocols on specific occasions. The discussion then moves to particular cases in criminal law in which defense attorneys cross-examine forensic technicians and lab administrators. In these interrogations, the distinction between protocols and actual practices animates the dialogue and becomes consequential for judgments in the case at hand. The article concludes with a discussion of administrative science: the work of treating protocols and paper trails as proxies for actual 'scientific' practices.

  13. A study of treadmill exercise protocols for Chinese males.

    PubMed

    Ho, B L

    1982-02-01

    Treadmill stress testing currently is used in screening for coronary artery disease. Maximal oxygen consumption is the best index of work capacity and maximal cardiovascular function. Clinically, there are many exercise protocols being utilized, including Bruce, Kattus, Balke, Naughton, and Chinese Air Force (CAF). The purpose of this study is to compare five of them and evaluate their reproducibility. Each of 24 volunteers performed one exercise test per week for 3 periods totalling 15 weeks. During each period, the five different protocols were performed in individually randomized order. Maximal and submaximal oxygen consumption and heart rate were determined. Statistical analysis revealed no difference in maximal oxygen consumption among the various protocols; however, significant differences did exist in maximal treadmill time. Maximal exertional duration was not affected by the test periods. All five protocols were equally reproducible. The Chinese Air Force has adopted a treadmill protocol with a constant speed of 3.5 mph and a 5% increase in elevation every 3 min. The physiological parameters measured by this protocol include a maximal heart rate of 181 beats/min; maximal oxygen consumption of 48 ml/kg/min; and maximal exercise duration of 17 min. Maximal durations for the Bruce and Balke protocols were 13 and 24 min, respectively, in our study. The regression equation of oxygen consumption and duration was: Y (ml/kg/min) = 1.26X (min) + 26.3. The CAF protocol is a safe, reproducible, easily performed method with moderate exercise duration. We confirm its advantages and prefer to select this protocol to serve as a routine screening or clinical method for use in testing Chinese people.
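    The reported regression can be applied directly; as a consistency check, the 17-min maximal duration of the CAF protocol reproduces the reported maximal oxygen consumption of about 48 ml/kg/min:

```python
def vo2_from_duration(minutes):
    """Estimated maximal oxygen consumption (ml/kg/min) from maximal
    CAF-protocol treadmill time, using the study's regression
    Y = 1.26 X + 26.3."""
    return 1.26 * minutes + 26.3

# 17 min maximal duration reported for the CAF protocol
print(round(vo2_from_duration(17)))  # 48, matching the reported value
```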

  14. Quantum secure communication using a multi-photon tolerant protocol

    NASA Astrophysics Data System (ADS)

    El Rifai, Mayssaa; Verma, Pramode K.

    2015-03-01

    This paper proposes a quantum secure communication protocol that uses multiple photons to represent each bit of a message to be shared. The multi-photon tolerant approach to quantum cryptography provides quantum-level security while using more than a single photon per transmission. The proposed protocol is a multi-stage protocol; an explanation of its operation and implementation is provided. The multi-stage protocol is based on the use of unitary transformations known only to Alice and Bob. This paper studies the security aspects of the multi-stage protocol by assessing its vulnerability to different attacks. It is well known that as the number of photons increases, so does the vulnerability of the multi-stage protocol. This paper sets a limit on the number of photons that can be used while keeping the multi-stage protocol a multi-photon tolerant quantum secure method for communication. The analysis of the number of photons to be used is based on the probability of success of a Helstrom discrimination performed by an eavesdropper on the channel. Limiting the number of photons per stage to a certain threshold makes it impossible for an eavesdropper to decipher the message sent over the channel. The proposed protocol obviates the disadvantages associated with single-photon implementations, such as limited data rates and distances, along with the need to have no more than a single photon per time slot. The multi-stage protocol is a step toward direct quantum communication rather than the quantum key distribution associated with single-photon approaches.
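    The role of the photon cap can be illustrated with the Helstrom bound. As a hedged sketch (the actual states and overlap in the paper are not given in the abstract), assume the eavesdropper must discriminate two pure states with a fixed single-copy overlap, and that n photons give n independent copies:

```python
import math

def helstrom_success(overlap: float, n_photons: int) -> float:
    """Optimal (Helstrom) success probability for discriminating two
    pure states whose single-copy inner-product magnitude is `overlap`,
    given n independent copies (the n-copy overlap is overlap**n)."""
    ov_n = overlap ** n_photons
    return 0.5 * (1.0 + math.sqrt(1.0 - ov_n ** 2))

# The eavesdropper's success probability grows toward 1 with the photon
# number, which motivates capping the photons used per stage.
for n in (1, 5, 20):
    print(n, round(helstrom_success(math.cos(math.pi / 8), n), 3))
```

    Under these assumptions, keeping n below a threshold keeps the eavesdropper's discrimination probability bounded away from certainty.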

  15. Modeling Techniques for High Dependability Protocols and Architecture

    NASA Technical Reports Server (NTRS)

    LaValley, Brian; Ellis, Peter; Walter, Chris J.

    2012-01-01

    This report documents an investigation into modeling high dependability protocols and some specific challenges that were identified as a result of the experiments. The need for an approach was established, and foundational concepts were proposed for modeling the different layers of a complex protocol and capturing the compositional properties that provide high dependability services for a system architecture. The approach centers on the definition of an architecture layer, its interfaces for composability with other layers, and its bindings to a platform-specific architecture model that implements the protocols required for the layer.

  16. A Quantum Watermarking Protocol Based on Bell Dual Basis

    NASA Astrophysics Data System (ADS)

    Mo, Jia; Ma, Zhaofeng; Yang, Yixian; Niu, Xinxin

    2013-11-01

    This paper presents a quantum watermarking protocol based on the Bell dual basis, composed of three major algorithms: watermark embedding, watermark extraction, and an interception test. The first two are accomplished using the entanglement-swapping property of the Bell dual basis, and the test is carried out through the IBF protocol to guarantee baseline security. The watermarking protocol is mainly designed for the protection of digital copyright in the presence of classical information. Owing to its zero-watermark attributes, embedding watermarks does not degrade the quality of the digital content.

  17. An XML-based protocol for distributed event services

    SciTech Connect

    Gunter, Dan K.; Smith, Warren; Quesnel, Darcy

    2001-06-25

    A recent trend in distributed computing is the construction of high-performance distributed systems called computational grids. One difficulty we have encountered is that there is no standard format for the representation of performance information and no standard protocol for transmitting this information. This limits the types of performance analysis that can be undertaken in complex distributed systems. To address this problem, we present an XML-based protocol for transmitting performance events in distributed systems and evaluate the performance of this protocol.

  18. An XML-Based Protocol for Distributed Event Services

    NASA Technical Reports Server (NTRS)

    Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)

    2001-01-01

    A recent trend in distributed computing is the construction of high-performance distributed systems called computational grids. One difficulty we have encountered is that there is no standard format for the representation of performance information and no standard protocol for transmitting this information. This limits the types of performance analysis that can be undertaken in complex distributed systems. To address this problem, we present an XML-based protocol for transmitting performance events in distributed systems and evaluate the performance of this protocol.
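    The event format itself is not given in either abstract; a hypothetical XML performance event, and a round trip through Python's standard parser, might look like the following (element and attribute names are our assumptions):

```python
import xml.etree.ElementTree as ET

# Hypothetical event shape; the schema actually used in the papers may
# differ.
event = ET.Element("event", name="task.start", target="grid-node-7")
ET.SubElement(event, "timestamp").text = "2001-06-25T12:00:00Z"
ET.SubElement(event, "value").text = "0.42"

wire = ET.tostring(event, encoding="unicode")   # serialized for transmission
parsed = ET.fromstring(wire)                    # receiver side
print(parsed.get("name"), parsed.find("value").text)  # → task.start 0.42
```

    A self-describing text encoding like this trades compactness for interoperability, which is the trade-off such a protocol would need to evaluate.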

  19. A Robust Conditional Privacy-Preserving Authentication Protocol in VANET

    NASA Astrophysics Data System (ADS)

    Jung, Chae Duk; Sur, Chul; Park, Youngho; Rhee, Kyung-Hyune

    Recently, Lu et al. proposed an efficient conditional privacy preservation protocol, named ECPP, based on a group signature scheme for secure vehicular communications. However, ECPP does not provide unlinkability and traceability when multiple RSUs are compromised. In this paper, we address these limitations and propose a robust conditional privacy-preserving authentication protocol without loss of efficiency compared with ECPP. Furthermore, in our protocol, RSUs can issue multiple anonymous certificates to an OBU to alleviate system overheads for the validity check of RSUs. In order to achieve these goals, we adopt a universal re-encryption scheme as our building block.

  20. A Scalable and Practical Authentication Protocol in Mobile IP

    NASA Astrophysics Data System (ADS)

    Lee, Yong; Lee, Goo-Yeon; Kim, Hwa-Long

    Due to the proliferation of mobile devices connected to the Internet, implementing a secure and practical Mobile IP has become an important goal. Mobile IP cannot work properly without authentication among the mobile node (MN), the home agent (HA) and the foreign agent (FA). In this paper, we propose a practical Mobile IP authentication protocol that uses public key cryptography only during the initial authentication. The proposed scheme is compatible with the conventional Mobile IP protocol and scales well with the number of MNs. We also show that the proposed protocol operates securely.

  1. Membrane Protein Production in Escherichia coli: Protocols and Rules.

    PubMed

    Angius, Federica; Ilioaia, Oana; Uzan, Marc; Miroux, Bruno

    2016-01-01

    Functional and structural studies on membrane proteins are limited by the difficulty of producing them in large amounts and in a functional state. In this review, we provide protocols to achieve high-level expression of membrane proteins in Escherichia coli. The T7 RNA polymerase-based expression system is presented in detail, and protocols to assess and improve its efficiency are discussed. Protocols to isolate either membranes or inclusion bodies and to perform an initial qualitative test to assess the solubility of the recombinant protein are also included. PMID:27485328

  2. MASSIVE TRANSFUSION PROTOCOL: STANDARDIZING CARE TO IMPROVE PATIENT OUTCOMES.

    PubMed

    Porteous, Joan

    2015-06-01

    Providing rapid response is a primary goal when caring for surgical patients with injuries involving massive blood loss. Massive transfusion protocols have been developed in some tertiary care health care facilities to ensure a rapid and efficient response in the provision of care to patients with a massive and uncontrolled hemorrhage. The purpose of this article is to discuss a massive transfusion protocol and to describe the process used to implement a massive transfusion protocol at Winnipeg's Health Sciences Centre (the site) as well as to describe its impact in the operating room department. PMID:26310036

  3. Performance enhancement of OSPF protocol in the private network

    NASA Astrophysics Data System (ADS)

    Yang, Yang; Lu, Yang; Lin, Xiaokang

    2005-11-01

    The private network serves as an information exchange platform that supports integrated services over microwave channels and accordingly uses open shortest path first (OSPF) as its IP routing protocol. However, the standard OSPF does not fit the private network well, owing to the network's special characteristics. This paper presents our modifications to the standard protocol in such aspects as the single-area scheme, link state advertisement (LSA) types and formats, OSPF packet formats, important state machines, the setting of protocol parameters, and link flap damping. Finally, simulations are performed in various scenarios, and the results indicate that our modifications can enhance OSPF performance in the private network effectively.

  4. Study & Analysis of various Protocols in popular Web Browsers

    NASA Astrophysics Data System (ADS)

    Mishra, Bharat; Baghel, Harish Singh; Patil, Manoj; Singh, Pramod

    2012-08-01

    Web browsers are application software used to access information from the World Wide Web. With the increasing popularity of web browsers, modern browsers are designed with more features than their predecessors. To transfer information, these browsers implement a variety of protocols. The protocols used at different layers serve different functions, and improving the efficiency of these protocols makes the browsers themselves work more efficiently.

  5. RFID authentication protocol to enhance patient medication safety.

    PubMed

    Kaul, Sonam Devgan; Awasthi, Amit K

    2013-12-01

    Medication errors can cause substantial harm to patients. Automated patient medication systems with RFID technology are used to reduce medication errors, improve patient safety, provide personalized patient medication and identification, and provide counterfeit protection for patients. In order to enhance medication safety for patients, we propose a new dynamic-ID-based lightweight RFID authentication protocol. Due to the low storage capacity and limited computational and communication capacity of tags, only a pseudo-random number generator, a one-way hash function and the bitwise XOR operation are used in our authentication protocol. The proposed protocol is practical, secure and efficient for the health care domain.
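    One session of such a tag-to-server exchange can be sketched with exactly those primitives. This is our own minimal illustration, not the paper's protocol; the message structure is an assumption:

```python
import hashlib
import secrets

def h(data: bytes) -> bytes:
    """One-way hash (SHA-256, chosen here for illustration)."""
    return hashlib.sha256(data).digest()

def xor(a: bytes, b: bytes) -> bytes:
    """Bitwise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

# Secret key shared by the tag and the back-end server.
key = secrets.token_bytes(32)

# Tag side: fresh nonce, dynamic pseudo-ID, masked authentication message.
nonce = secrets.token_bytes(32)            # from the PRNG
pseudo_id = h(key + nonce)                 # dynamic ID, changes each session
auth_msg = xor(pseudo_id, h(nonce + key))  # hides the pseudo-ID in transit

# Server side: recompute both values from the shared key and the nonce,
# then check that the tag's message matches.
expected = xor(h(key + nonce), h(nonce + key))
print(auth_msg == expected)  # → True
```

    Because the pseudo-ID changes with every nonce, an eavesdropper cannot link two sessions of the same tag, which is the point of a dynamic-ID design.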

  6. Membrane Protein Production in Escherichia coli: Protocols and Rules.

    PubMed

    Angius, Federica; Ilioaia, Oana; Uzan, Marc; Miroux, Bruno

    2016-01-01

    Functional and structural studies on membrane proteins are limited by the difficulty of producing them in large amounts and in a functional state. In this review, we provide protocols to achieve high-level expression of membrane proteins in Escherichia coli. The T7 RNA polymerase-based expression system is presented in detail, and protocols to assess and improve its efficiency are discussed. Protocols to isolate either membranes or inclusion bodies and to perform an initial qualitative test to assess the solubility of the recombinant protein are also included.

  7. High-Performance CCSDS AOS Protocol Implementation in FPGA

    NASA Technical Reports Server (NTRS)

    Clare, Loren P.; Torgerson, Jordan L.; Pang, Jackson

    2010-01-01

    The Consultative Committee for Space Data Systems (CCSDS) Advanced Orbiting Systems (AOS) space data link protocol provides a framing layer between channel coding such as LDPC (low-density parity-check) and higher-layer link multiplexing protocols such as CCSDS Encapsulation Service, which is described in the following article. Recent advancement in RF modem technology has allowed multi-megabit transmission over space links. With this increase in data rate, the CCSDS AOS protocol implementation needs to be optimized to both reduce energy consumption and operate at a high rate.

  8. Design and Implementation of a Secure Modbus Protocol

    NASA Astrophysics Data System (ADS)

    Fovino, Igor Nai; Carcano, Andrea; Masera, Marcelo; Trombetta, Alberto

    The interconnectivity of modern and legacy supervisory control and data acquisition (SCADA) systems with corporate networks and the Internet has significantly increased the threats to critical infrastructure assets. Meanwhile, traditional IT security solutions such as firewalls, intrusion detection systems and antivirus software are relatively ineffective against attacks that specifically target vulnerabilities in SCADA protocols. This paper describes a secure version of the Modbus SCADA protocol that incorporates integrity, authentication, non-repudiation and anti-replay mechanisms. Experimental results using a power plant testbed indicate that the augmented protocol provides good security functionality without significant overhead.
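    The integrity, authentication and anti-replay mechanisms can be sketched with a pre-shared key, a monotonic sequence number, and an HMAC tag. This is a hedged illustration of the idea, not the paper's actual frame format, and the key, hash choice and layout below are our assumptions:

```python
import hashlib
import hmac
import struct

KEY = b"shared-plant-key"  # hypothetical pre-shared key

def secure_frame(seq: int, pdu: bytes) -> bytes:
    """Wrap a Modbus PDU with a sequence number (anti-replay) and an
    HMAC tag (integrity and authentication)."""
    header = struct.pack(">Q", seq)
    tag = hmac.new(KEY, header + pdu, hashlib.sha256).digest()
    return header + pdu + tag

def verify_frame(frame: bytes, last_seq: int):
    """Return (seq, pdu) if the tag verifies and the sequence number is
    fresh; raise ValueError otherwise."""
    header, pdu, tag = frame[:8], frame[8:-32], frame[-32:]
    if not hmac.compare_digest(tag, hmac.new(KEY, header + pdu,
                                             hashlib.sha256).digest()):
        raise ValueError("bad tag: frame modified or not authentic")
    seq = struct.unpack(">Q", header)[0]
    if seq <= last_seq:
        raise ValueError("stale sequence number: replayed frame")
    return seq, pdu

frame = secure_frame(41, b"\x03\x00\x6b\x00\x03")  # a read-registers PDU
seq, pdu = verify_frame(frame, last_seq=40)
print(seq, pdu.hex())  # → 41 03006b0003
```

    Replaying a captured frame fails the freshness check, and flipping any bit of the PDU fails the tag check, which is the property the augmented protocol targets.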

  9. Comment on "Quantum oblivious set-member decision protocol"

    NASA Astrophysics Data System (ADS)

    Liu, Bin; Xiao, Di; Huang, Wei; Song, Ting-Ting

    2016-03-01

    In a recent paper [Phys. Rev. A 92, 022309 (2015), 10.1103/PhysRevA.92.022309], the authors proposed a quantum oblivious set-member decision protocol, which is designed to allow a server to check whether a private secret of a user is a member of the server's private set in an oblivious manner. Such protocols should protect the privacy of both the server and the user. However, we find that the user in their protocol can steal l-1 bits of information about the server's private set by sending false decoy states.

  10. A Secure Authenticated Key Exchange Protocol for Credential Services

    NASA Astrophysics Data System (ADS)

    Shin, Seonghan; Kobara, Kazukuni; Imai, Hideki

    In this paper, we propose a leakage-resilient and proactive authenticated key exchange (LRP-AKE) protocol for credential services, which provides not only a higher level of security against leakage of stored secrets but also secrecy of the private key with respect to the server involved. We show that the LRP-AKE protocol is provably secure in the random oracle model, with a reduction to the computational Diffie-Hellman problem. In addition, we discuss some possible applications of the LRP-AKE protocol.

  11. MASSIVE TRANSFUSION PROTOCOL: STANDARDIZING CARE TO IMPROVE PATIENT OUTCOMES.

    PubMed

    Porteous, Joan

    2015-06-01

    Providing rapid response is a primary goal when caring for surgical patients with injuries involving massive blood loss. Massive transfusion protocols have been developed in some tertiary care health care facilities to ensure a rapid and efficient response in the provision of care to patients with a massive and uncontrolled hemorrhage. The purpose of this article is to discuss a massive transfusion protocol and to describe the process used to implement a massive transfusion protocol at Winnipeg's Health Sciences Centre (the site) as well as to describe its impact in the operating room department.

  12. A comparative study of protocols for secure quantum communication under noisy environment: single-qubit-based protocols versus entangled-state-based protocols

    NASA Astrophysics Data System (ADS)

    Sharma, Vishal; Thapliyal, Kishore; Pathak, Anirban; Banerjee, Subhashish

    2016-07-01

    The effect of noise on various protocols of secure quantum communication has been studied. Specifically, we have investigated the effect of amplitude damping, phase damping, squeezed generalized amplitude damping, Pauli type as well as various collective noise models on the protocols of quantum key distribution, quantum key agreement, quantum secure direct quantum communication and quantum dialogue. From each type of protocol of secure quantum communication, we have chosen two protocols for our comparative study: one based on single-qubit states and the other one on entangled states. The comparative study reported here has revealed that single-qubit-based schemes are generally found to perform better in the presence of amplitude damping, phase damping, squeezed generalized amplitude damping noises, while entanglement-based protocols turn out to be preferable in the presence of collective noises. It is also observed that the effect of noise depends upon the number of rounds of quantum communication involved in a scheme of quantum communication. Further, it is observed that squeezing, a completely quantum mechanical resource present in the squeezed generalized amplitude channel, can be used in a beneficial way as it may yield higher fidelity compared to the corresponding zero squeezing case.
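    The flavor of such noise comparisons can be seen in the simplest case. As a self-contained sketch (our example, not the paper's calculation), the fidelity of the single-qubit state |+> under the amplitude-damping channel has a closed form obtained from the channel's Kraus operators:

```python
import math

def plus_state_fidelity(gamma: float) -> float:
    """Fidelity of |+> after the amplitude-damping channel with damping
    probability gamma.

    Kraus operators: K0 = [[1, 0], [0, sqrt(1-gamma)]],
                     K1 = [[0, sqrt(gamma)], [0, 0]].
    Applying them to rho = |+><+| and projecting back onto |+> gives
    F = (1 + sqrt(1 - gamma)) / 2.
    """
    return 0.5 * (1.0 + math.sqrt(1.0 - gamma))

# Fidelity degrades smoothly from 1 as the damping grows.
for g in (0.0, 0.2, 0.5):
    print(g, round(plus_state_fidelity(g), 3))
```

    Comparisons like the one in the paper repeat this kind of calculation across channels (phase damping, squeezed generalized amplitude damping, collective noise) and across single-qubit versus entangled carriers.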

  13. A blueprint for a sepsis protocol.

    PubMed

    Shapiro, Nathan I; Howell, Michael; Talmor, Daniel

    2005-04-01

    ... Although complex and challenging, these therapies must be brought to the patient's bedside. We propose and describe the Multiple Urgent Sepsis Therapies (MUST) protocol as a practical way to implement a comprehensive treatment plan using available evidence-based therapies.

  14. An Accurate Link Correlation Estimator for Improving Wireless Protocol Performance

    PubMed Central

    Zhao, Zhiwei; Xu, Xianghua; Dong, Wei; Bu, Jiajun

    2015-01-01

    Wireless link correlation has shown significant impact on the performance of various sensor network protocols. Many works have been devoted to exploiting link correlation for protocol improvements. However, the effectiveness of these designs heavily relies on the accuracy of link correlation measurement. In this paper, we investigate state-of-the-art link correlation measurement and analyze the limitations of existing works. We then propose a novel lightweight and accurate link correlation estimation (LACE) approach based on the reasoning of link correlation formation. LACE combines both long-term and short-term link behaviors for link correlation estimation. We implement LACE as a stand-alone interface in TinyOS and incorporate it into both routing and flooding protocols. Simulation and testbed results show that LACE: (1) achieves more accurate and lightweight link correlation measurements than the state-of-the-art work; and (2) greatly improves the performance of protocols exploiting link correlation. PMID:25686314

  15. An accurate link correlation estimator for improving wireless protocol performance.

    PubMed

    Zhao, Zhiwei; Xu, Xianghua; Dong, Wei; Bu, Jiajun

    2015-02-12

    Wireless link correlation has shown significant impact on the performance of various sensor network protocols. Many works have been devoted to exploiting link correlation for protocol improvements. However, the effectiveness of these designs heavily relies on the accuracy of link correlation measurement. In this paper, we investigate state-of-the-art link correlation measurement and analyze the limitations of existing works. We then propose a novel lightweight and accurate link correlation estimation (LACE) approach based on the reasoning of link correlation formation. LACE combines both long-term and short-term link behaviors for link correlation estimation. We implement LACE as a stand-alone interface in TinyOS and incorporate it into both routing and flooding protocols. Simulation and testbed results show that LACE: (1) achieves more accurate and lightweight link correlation measurements than the state-of-the-art work; and (2) greatly improves the performance of protocols exploiting link correlation.
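    The pairwise quantity being estimated can be sketched from reception traces. The blend of long- and short-term behavior below is our own assumption; the abstract does not give LACE's actual formula:

```python
def joint_reception_ratio(rx_a, rx_b):
    """Fraction of packets heard by receiver B among those heard by
    receiver A: a simple pairwise link-correlation indicator."""
    both = sum(1 for a, b in zip(rx_a, rx_b) if a and b)
    heard_by_a = sum(rx_a)
    return both / heard_by_a if heard_by_a else 0.0

def blended_estimate(rx_a, rx_b, window=8, alpha=0.5):
    """Blend a long-term ratio (whole trace) with a short-term ratio
    (last `window` packets); the weight alpha is an assumption."""
    long_term = joint_reception_ratio(rx_a, rx_b)
    short_term = joint_reception_ratio(rx_a[-window:], rx_b[-window:])
    return alpha * long_term + (1.0 - alpha) * short_term

# 1 = packet received, 0 = lost, for two neighboring receivers.
rx_a = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 1, 1]
rx_b = [1, 0, 0, 1, 0, 1, 0, 1, 1, 1, 1, 1]
print(round(blended_estimate(rx_a, rx_b), 3))  # → 0.829
```

    A flooding or routing protocol can use such an estimate to pick forwarders whose receptions are least correlated, so retransmissions cover different neighbors.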

  16. 21 CFR 660.6 - Samples; protocols; official release.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Hepatitis B Surface Antigen § 660.6 Samples; protocols; official release. (a) Samples. (1) For the purposes... continued evaluation is necessary to ensure the potency, quality, and reliability of the product....

  17. 21 CFR 660.6 - Samples; protocols; official release.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Hepatitis B Surface Antigen § 660.6 Samples; protocols; official release. (a) Samples. (1) For the purposes... continued evaluation is necessary to ensure the potency, quality, and reliability of the product....

  18. 21 CFR 660.6 - Samples; protocols; official release.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Hepatitis B Surface Antigen § 660.6 Samples; protocols; official release. (a) Samples. (1) For the purposes... continued evaluation is necessary to ensure the potency, quality, and reliability of the product....

  19. Factors influencing adherence to an emergency department national protocol.

    PubMed

    Ebben, Remco H A; Vloet, Lilian C M; de Groot, Joke Mintjes; van Achterberg, Theo

    2012-02-01

    The objective of this study was to identify factors that influence emergency nurses' adherence to an emergency department national protocol (EDNP). A survey of emergency nurses (n=200) and physicians with end responsibility for medical care in an emergency department (n=103) was carried out. Emergency nurses' self-reported adherence to the EDNP was 38%; 55% of the nurses and 44% of the physicians were aware of the protocol. Interference with professional autonomy, insufficient organizational support and the EDNP's applicability were indicated as barriers to adherence. The main influencing factor seems to be awareness. Other factors related to the individual, the organization and protocol characteristics. Simply disseminating the EDNP is not enough to get the protocol used in clinical practice. PMID:21552130

  20. 21 CFR 660.6 - Samples; protocols; official release.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Hepatitis B Surface Antigen § 660.6 Samples; protocols; official release. (a) Samples. (1) For the purposes... continued evaluation is necessary to ensure the potency, quality, and reliability of the product....

  1. Joint Architecture Standard (JAS) Reliable Data Delivery Protocol (RDDP) specification.

    SciTech Connect

    Enderle, Justin Wayne; Daniels, James W.; Gardner, Michael T.; Eldridge, John M.; Hunt, Richard D.; Gallegos, Daniel E.

    2011-05-01

    The Joint Architecture Standard (JAS) program at Sandia National Laboratories requires the use of a reliable data delivery protocol over SpaceWire. The National Aeronautics and Space Administration at the Goddard Space Flight Center in Greenbelt, Maryland, developed and specified a reliable protocol for its Geostationary Operational Environmental Satellite, known as the GOES-R Reliable Data Delivery Protocol (GRDDP). The JAS program implemented and tested GRDDP and then suggested a number of modifications to the original specification to meet its program-specific requirements. This document details the full RDDP specification as modified for JAS. The JAS Reliable Data Delivery Protocol uses the lower-level SpaceWire data link layer to provide reliable packet delivery services to one or more higher-level host application processes. This document specifies the functional requirements for the JAS RDDP but does not specify the interfaces to the lower- or higher-level processes, which may be implementation-dependent.

  2. Improvement on "Quantum Key Agreement Protocol with Maximally Entangled States"

    NASA Astrophysics Data System (ADS)

    Chong, Song-Kong; Tsai, Chia-Wei; Hwang, Tzonelih

    2011-06-01

    Recently, Hsueh and Chen [in Proceedings of the 14th Information Security Conference, National Taiwan University of Science and Technology, Taipei, pp. 236-242, 2004] proposed a quantum key agreement (QKA) protocol with maximally entangled states. Their protocol allows two users to negotiate a secret key in such a way that no one can predetermine the shared key alone. This study points out two security flaws in their protocol: (1) a legitimate but malicious user can fully control the shared key alone; (2) an eavesdropper can obtain the shared key without being detected. A possible solution is presented that avoids these attacks, including Tsai et al.'s CNOT attack [in Proceedings of the 20th Cryptology and Information Security Conference, National Chiao Tung University, Hsinchu, pp. 210-213, 2010], in which the shared key of the Hsueh and Chen protocol is obtained without detection.

  3. Electronic Voting Protocol Using Identity-Based Cryptography

    PubMed Central

    Gallegos-Garcia, Gina; Tapia-Recillas, Horacio

    2015-01-01

    Electronic voting protocols proposed to date meet their properties based on Public Key Cryptography (PKC), which offers high flexibility through key agreement protocols and authentication mechanisms. However, when PKC is used, it is necessary to implement a Certification Authority (CA) to provide certificates that bind public keys to entities and enable verification of such public key bindings. Consequently, the components of the protocol increase notably. An alternative is to use Identity-Based Encryption (IBE). With this kind of cryptography, it is possible to have all the benefits offered by PKC without the need for certificates or the core components of a Public Key Infrastructure (PKI). In light of the above, in this paper we propose an electronic voting protocol that meets the privacy and robustness properties by using bilinear maps. PMID:26090515

  4. Distributed reservation control protocols for random access broadcasting channels

    NASA Technical Reports Server (NTRS)

    Greene, E. P.; Ephremides, A.

    1981-01-01

    Attention is given to a communication network consisting of an arbitrary number of nodes which can communicate with each other via a time-division multiple access (TDMA) broadcast channel. The reported investigation is concerned with the development of efficient distributed multiple access protocols for traffic consisting primarily of single packet messages in a datagram mode of operation. The motivation for the design of the protocols came from the consideration of efficient multiple access utilization of moderate to high bandwidth (4-40 Mbit/s capacity) communication satellite channels used for the transmission of short (1000-10,000 bits) fixed length packets. Under these circumstances, the ratio of roundtrip propagation time to packet transmission time is between 100 to 10,000. It is shown how a TDMA channel can be adaptively shared by datagram traffic and constant bandwidth users such as in digital voice applications. The distributed reservation control protocols described are a hybrid between contention and reservation protocols.

  5. Current Status of EPA Protocol Gas Verification Program

    EPA Science Inventory

    Accurate compressed gas reference standards are needed to calibrate and audit continuous emission monitors (CEMs) and ambient air quality monitors that are being used for regulatory purposes. US Environmental Protection Agency (EPA) established its traceability protocol to ensur...

  6. Minimal computational-space implementation of multiround quantum protocols

    SciTech Connect

    Bisio, Alessandro; D'Ariano, Giacomo Mauro; Perinotti, Paolo; Chiribella, Giulio

    2011-02-15

    A single-party strategy in a multiround quantum protocol can be implemented by sequential networks of quantum operations connected by internal memories. Here, we provide an efficient realization in terms of computational-space resources.

  7. Meta-Envy-Free Cake-Cutting Protocols

    NASA Astrophysics Data System (ADS)

    Manabe, Yoshifumi; Okamoto, Tatsuaki

    This paper discusses cake-cutting protocols when the cake is a heterogeneous good represented by an interval on the real line. We propose a new desirable property, the meta-envy-freeness of cake-cutting, which has not been formally considered before. Though envy-freeness has been considered one of the most important desirable properties, it does not prevent envy about role assignment in the protocols. We define meta-envy-freeness to formalize this kind of envy. We show that current envy-free cake-cutting protocols do not satisfy meta-envy-freeness. Previously proposed properties such as strong envy-freeness, exactness, and equitability do not directly consider this type of envy, and these properties are very difficult to realize. This paper then presents meta-envy-free cake-cutting protocols for the two- and three-party cases.

  8. Photographic protocol for image acquisition in craniofacial microsomia

    PubMed Central

    2011-01-01

    Craniofacial microsomia (CFM) is a congenital condition associated with orbital, mandibular, ear, nerve, and soft tissue anomalies. We present a standardized, two-dimensional, digital photographic protocol designed to capture the common craniofacial features associated with CFM. PMID:22208766

  9. Electronic Voting Protocol Using Identity-Based Cryptography.

    PubMed

    Gallegos-Garcia, Gina; Tapia-Recillas, Horacio

    2015-01-01

    Electronic voting protocols proposed to date meet their properties based on Public Key Cryptography (PKC), which offers high flexibility through key agreement protocols and authentication mechanisms. However, when PKC is used, it is necessary to implement a Certification Authority (CA) to provide certificates that bind public keys to entities and enable verification of such public key bindings. Consequently, the components of the protocol increase notably. An alternative is to use Identity-Based Encryption (IBE). With this kind of cryptography, it is possible to have all the benefits offered by PKC without the need for certificates or the core components of a Public Key Infrastructure (PKI). In light of the above, in this paper we propose an electronic voting protocol that meets the privacy and robustness properties by using bilinear maps.

  10. Current Status of EPA Verification Program for EPA Protocol Gases

    EPA Science Inventory

    Accurate compressed gas calibration standards are needed to calibrate continuous emission monitors (CEMs) and ambient air quality monitors that are being used for regulatory purposes. US Environmental Protection Agency (EPA) established its traceability protocol to ensure that c...

  11. 40 CFR 766.28 - Expert review of protocols.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Expert review of protocols. EPA will gather a panel of experts in analysis of chemical matrices for HDDs... and/or of other U.S. Government agencies who have had experience in analysis of chemical matrices...

  12. Developing a notebook protocol for the high school chemistry classroom

    NASA Astrophysics Data System (ADS)

    Rensing, Roselyn I.

    The focus of this project is to increase science literacy in high school students using a protocol that emphasizes writing. A protocol or lesson sequence comprises a code of behavior to encourage learning through reflection, writing, and self-assessment. A basic protocol may have a sequence of writing elements or tasks which checks for prior knowledge, looks at lesson standards, studies content, and summarizes learning. Using the protocol, students will demonstrate evidence of their learning through writing. The project will identify a progression of tasks which enable students to master content and express mastery through writing. The student's interactive notebook will record evidence of their learning. Several styles of writing and reporting tasks will be explored using the notebook. Students will help implement and identify tasks that demonstrate their knowledge and understanding of science content.

  13. A Voting Protocol Based on the Controlled Quantum Operation Teleportation

    NASA Astrophysics Data System (ADS)

    Tian, Juan-Hong; Zhang, Jian-Zhong; Li, Yan-Ping

    2016-05-01

    Based on controlled quantum operation teleportation, a secure voting protocol is proposed in this paper. A genuine four-qubit entangled state functions as the quantum channel. The eligible voter's quantum operation, which represents his vote information, can be transmitted to the tallyman Bob with the help of the scrutineer Charlie. The voter's quantum identity authentication preserves the anonymity of voters' IDs, which is ensured by a zero-knowledge proof of the notary organization CA. Charlie's supervision throughout the voting process makes the protocol satisfy verifiability and non-reusability, so as to prevent dishonest behaviour by Bob. The security analysis shows that the voting protocol satisfies unforgeability and has significant advantages over some related work. Additionally, the quantum operation can be transmitted successfully with probability 1, which makes the protocol reliable and practical.

  14. Protocol for Addressing Induced Seismicity Associated with Enhanced Geothermal Systems

    SciTech Connect

    Majer, Ernie; Nelson, James; Robertson-Tait, Ann; Savy, Jean; Wong, Ivan

    2012-01-01

    This Protocol is a living guidance document for geothermal developers, public officials, regulators and the general public that provides a set of general guidelines detailing useful steps to evaluate and manage the effects of induced seismicity related to EGS projects.

  15. Terrestrial Photovoltaic Module Accelerated Test-To-Failure Protocol

    SciTech Connect

    Osterwald, C. R.

    2008-03-01

    This technical report documents a test-to-failure protocol that may be used to obtain quantitative information about the reliability of photovoltaic modules using accelerated testing in environmental temperature-humidity chambers.

  16. Quantum Private Comparison Protocol Based on Cluster States

    NASA Astrophysics Data System (ADS)

    Sun, Zhiwei; Long, Dongyang

    2013-01-01

    We present a quantum private comparison (QPC) protocol, enabling two players to compare the equality of their information without revealing anything about their respective private inputs, in which four-particle cluster states are used as the information carriers. The presented protocol can ensure correctness, privacy, and fairness with the assistance of a semi-trusted third party (TP). Meanwhile, the participants, including the TP, need only be able to perform single-particle measurements, which makes the presented protocol more technically feasible. Furthermore, the photon transmission is a one-way distribution, so Trojan horse attacks are automatically avoided. The security of the protocol is also analyzed.

  17. Protocol to Exploit Waiting Resources for UASNs †

    PubMed Central

    Hung, Li-Ling; Luo, Yung-Jeng

    2016-01-01

    The transmission speed of acoustic waves in water is much slower than that of radio waves in terrestrial wireless sensor networks, so the propagation delay in underwater acoustic sensor networks (UASNs) is much greater. Longer propagation delay leads to complicated communication and collision problems. To solve collision problems, some studies have proposed waiting mechanisms; however, long waits result in low bandwidth utilization. To improve throughput, this study proposes a slotted medium access control protocol that enhances bandwidth utilization in UASNs. The proposed mechanism increases communication by exploiting temporal and spatial resources that are typically left idle to protect communication against interference. By reducing wait time, network performance and energy consumption can be improved. A performance evaluation demonstrates that when data packets are large or sensor deployment is dense, the proposed protocol consumes less energy and achieves higher throughput than existing protocols. PMID:27005624
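    The collision-avoidance arithmetic behind such slotted underwater MACs can be sketched as follows; the parameter values and function names are illustrative, not taken from the paper:

```python
# Sketch: in a conservative slotted underwater MAC, each slot must cover the
# packet transmission time plus the worst-case acoustic propagation delay so
# that transmissions from any two nodes cannot collide. The low utilization
# this yields is the waiting-mechanism cost the abstract refers to.
SOUND_SPEED_MPS = 1500.0  # nominal speed of sound in water


def slot_length(packet_bits, bitrate_bps, max_range_m):
    """Conservative slot length: transmit time + worst-case propagation delay."""
    tx_time = packet_bits / bitrate_bps
    prop_delay = max_range_m / SOUND_SPEED_MPS
    return tx_time + prop_delay


def utilization(packet_bits, bitrate_bps, max_range_m):
    """Fraction of each slot actually spent transmitting."""
    return (packet_bits / bitrate_bps) / slot_length(packet_bits, bitrate_bps, max_range_m)
```

For a 1500 m network at 10 kbps with 10,000-bit packets, the propagation delay equals the transmit time, so half of every slot is idle; exploiting that idle time is precisely the opportunity the proposed protocol targets.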

  18. Data transmission protocol for Pi-of-the-Sky cameras

    NASA Astrophysics Data System (ADS)

    Uzycki, J.; Kasprowicz, G.; Mankiewicz, M.; Nawrocki, K.; Sitek, P.; Sokolowski, M.; Sulej, R.; Tlaczala, W.

    2006-10-01

    The large amount of data collected by automatic astronomical cameras has to be transferred to fast computers in a reliable way. The method chosen should ensure data streaming in both directions, but in a nonsymmetrical way. The Ethernet interface is a very good choice because of its popularity and proven performance; however, it requires a TCP/IP stack implementation in devices such as cameras for full compliance with existing networks and operating systems. This paper describes the NUDP protocol, which was developed as a supplement to the standard UDP protocol and can be used as a simple network protocol. NUDP does not need a TCP implementation and makes it possible to run an Ethernet network with simple devices based on microcontroller and/or FPGA chips. The data transmission scheme was created especially for the "Pi of the Sky" project.
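    The abstract does not give NUDP's actual frame format, but the general idea of a lightweight UDP-style datagram protocol simple enough for a microcontroller or FPGA can be sketched with a hypothetical header layout:

```python
# Hypothetical frame layout (NOT the real NUDP format, which the abstract does
# not specify): a 4-byte sequence number and a 1-byte flags field in network
# byte order, followed by the raw payload. Fixed-offset parsing like this is
# what makes such protocols feasible in FPGA/microcontroller logic.
import struct

HEADER = struct.Struct("!IB")  # uint32 sequence, uint8 flags, big-endian


def pack_frame(seq, payload, want_ack=True):
    """Build a datagram: header + payload; bit 0 of flags requests an ACK."""
    return HEADER.pack(seq, 1 if want_ack else 0) + payload


def unpack_frame(frame):
    """Parse a datagram back into (sequence, ack_requested, payload)."""
    seq, flags = HEADER.unpack_from(frame)
    return seq, bool(flags & 1), frame[HEADER.size:]
```

Sequence numbers let the receiver detect loss and reordering without the connection state a full TCP stack would require.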

  19. Electronic Voting Protocol Using Identity-Based Cryptography.

    PubMed

    Gallegos-Garcia, Gina; Tapia-Recillas, Horacio

    2015-01-01

    Electronic voting protocols proposed to date meet their properties based on Public Key Cryptography (PKC), which offers high flexibility through key agreement protocols and authentication mechanisms. However, when PKC is used, it is necessary to implement a Certification Authority (CA) to provide certificates that bind public keys to entities and enable verification of such public key bindings. Consequently, the components of the protocol increase notably. An alternative is to use Identity-Based Encryption (IBE). With this kind of cryptography, it is possible to have all the benefits offered by PKC without the need for certificates or all the core components of a Public Key Infrastructure (PKI). Considering the aforementioned, in this paper we propose an electronic voting protocol that meets the privacy and robustness properties by using bilinear maps. PMID:26090515

  20. CURRENT STATUS OF THE EPA PROTOCOL GAS PROGRAM

    EPA Science Inventory

    Accurate compressed gas calibration standards are needed to calibrate continuous emission monitors (CEMs) and ambient air quality monitors that are being used for regulatory purposes. EPA has published a protocol to establish the traceability of these standards to national refer...

  1. [Implementation of a massive transfusion protocol in an emergency department].

    PubMed

    Tonglet, M; Minon, J M; Damas, F; Clanet, M; Vergnion, M

    2014-02-01

    We present here the massive transfusion protocol implemented in our institution in 2013. It will improve our management of critical massive bleeding, a situation which is rare in our hospital but carries a high mortality risk.

  2. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  3. Yoga protocol for treatment of breast cancer-related lymphedema

    PubMed Central

    Narahari, SR; Aggithaya, Madhur Guruprasad; Thernoe, Liselotte; Bose, Kuthaje S; Ryan, Terence J

    2016-01-01

    Introduction: Vaqas and Ryan (2003) advocated yoga and breathing exercises for lymphedema. Narahari et al. (2007) developed an integrative medicine protocol for lower-limb lymphedema using yoga. Studies have hypothesized that yoga plays a role similar to that of the central manual lymph drainage of Foldi's technique. This study explains how we have used yoga and breathing as a self-care intervention for breast cancer-related lymphedema (BCRL). Methods: The aim of the study was to create a yoga protocol for BCRL. Selection of exercises was based on the actions of muscles on joints, anatomical areas associated with different groups of lymph nodes, stretching of skin, and the method of breathing in each exercise. The protocol was piloted in eight BCRL patients, and its difficulties were identified through interaction with the patients. A literature search was conducted in PubMed and the Cochrane library to identify existing yoga protocols for BCRL. Results: Twenty yoga exercises and five breathing exercises were adopted. They have slow, methodical joint movements, which helped patients tolerate pain. Breathing was long and diaphragmatic. Flexion of joints was coordinated with exhalation and extension with inhalation. Alternative exercises were introduced to help patients perform complex movements. The joint movements, initial positions, and mode of breathing were compared to those of two other protocols. Limb volume decreased from 2.4 to 1.2 L in eight patients after continuous practice of yoga and compression at home for 3 months. There was improvement in range of movement and in intensity of pain. Discussion: Yoga exercises were selected on the basis of their role in chest expansion and in maximizing range of movement: flexion of large muscles, maximum stretch of skin, and thus part-by-part lymph drainage from center and periphery. This protocol addressed the functional, volume, and movement issues of BCRL and was found to be superior to other BCRL yoga protocols. However, this protocol needs to be tested in centers routinely managing BCRL.

  4. IPV6 Mobile Network Protocol Weaknesses and a Cryptosystem Approach

    NASA Astrophysics Data System (ADS)

    Balitanas, Maricel; Kim, Tai-Hoon

    This paper reviews some of the improvements associated with the new Internet Protocol version 6, with an emphasis on its security-related functionality, particularly authentication, and concludes with a hybrid cryptosystem for the authentication issue. The new generation of the Internet Protocol is on its way to solving IP address depletion, a process that may take several years to complete. Thus, this review has been made as a step toward an effective solution and efficient implementation.

  5. IP- -: A Reduced Internet Protocol for Optical Packet Networking

    NASA Astrophysics Data System (ADS)

    Ohta, Masataka; Fujikawa, Kenji

    IP- - is proposed as an Internet Protocol suitable for optical packet networking. Because optical routers require much faster control than electronic ones, and the lack of optical buffers other than fiber delay lines requires fixed-time control, Internet Protocols must be at least as simple as IPv4 and much simpler than IPv6. IP- - also addresses the issues of IP address space exhaustion and IP routing table explosion.

  6. Internet-Protocol-Based Satellite Bus Architecture Designed

    NASA Technical Reports Server (NTRS)

    Slywczak, Richard A.

    2004-01-01

    NASA is designing future complex satellite missions ranging from single satellites and constellations to space networks and sensor webs. These missions require more interoperability, autonomy, and coordination than previous missions; in addition, a desire exists to have scientists retrieve data directly from the satellite rather than from a central distribution source. To meet these goals, NASA has been studying the possibility of extending the Transmission Control Protocol/Internet Protocol (TCP/IP) suite for space-based applications.

  7. ABM Clinical Protocol #18: Use of Antidepressants in Breastfeeding Mothers

    PubMed Central

    Sriraman, Natasha K.; Melvin, Kathryn; Meltzer-Brody, Samantha

    2015-01-01

    A central goal of The Academy of Breastfeeding Medicine is the development of clinical protocols for managing common medical problems that may impact breastfeeding success. These protocols serve only as guidelines for the care of breastfeeding mothers and infants and do not delineate an exclusive course of treatment or serve as standards of medical care. Variations in treatment may be appropriate according to the needs of an individual patient. PMID:26204124

  8. The Electronic Nose: A Protocol To Evaluate Fresh Meat Flavor

    NASA Astrophysics Data System (ADS)

    Isoppo, S.; Cornale, P.; Barbera, S.

    2009-05-01

    An Electronic Nose comprising 10 MOS sensors was used to carry out meat aroma measurements in order to define an analytical protocol. Every meat sample (Longissimus Dorsi) was tested before, during, and after cooking in an oven (at 165 °C for 600 seconds). Analysis took place in these three steps because consumers perceive odor when they buy (raw aroma), cook (cooking aroma), and eat meat (cooked aroma). These tests therefore yielded a protocol useful for measuring the aroma perceived daily by meat eaters.

  9. Hydrogen Hazards Assessment Protocol (HHAP): Approach and Methodology

    NASA Technical Reports Server (NTRS)

    Woods, Stephen

    2009-01-01

    This viewgraph presentation reviews the approach and methodology used to develop an assessment protocol for hydrogen hazards. Included in the presentation are the reasons to perform hazards assessment, the types of hazard assessments that exist, an analysis of hydrogen hazards, and specific information about the Hydrogen Hazards Assessment Protocol (HHAP). The assessment is specifically tailored to hydrogen behavior. The end product of the assessment is a compilation of hazards, mitigations, and associated factors to facilitate decision making and achieve best practice.

  10. Biocoder: A programming language for standardizing and automating biology protocols

    PubMed Central

    2010-01-01

    Background Published descriptions of biology protocols are often ambiguous and incomplete, making them difficult to replicate in other laboratories. However, there is increasing benefit to formalizing the descriptions of protocols, as laboratory automation systems (such as microfluidic chips) are becoming increasingly capable of executing them. Our goal in this paper is to improve both the reproducibility and automation of biology experiments by using a programming language to express the precise series of steps taken. Results We have developed BioCoder, a C++ library that enables biologists to express the exact steps needed to execute a protocol. In addition to being suitable for automation, BioCoder converts the code into a readable, English-language description for use by biologists. We have implemented over 65 protocols in BioCoder; the most complex of these was successfully executed by a biologist in the laboratory using BioCoder as the only reference. We argue that BioCoder exposes and resolves ambiguities in existing protocols, and could provide the software foundations for future automation platforms. BioCoder is freely available for download at http://research.microsoft.com/en-us/um/india/projects/biocoder/. Conclusions BioCoder represents the first practical programming system for standardizing and automating biology protocols. Our vision is to change the way that experimental methods are communicated: rather than publishing a written account of the protocols used, researchers will simply publish the code. Our experience suggests that this practice is tractable and offers many benefits. We invite other researchers to leverage BioCoder to improve the precision and completeness of their protocols, and also to adapt and extend BioCoder to new domains. PMID:21059251
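    BioCoder itself is a C++ library; the following Python toy (with invented class and method names, not BioCoder's actual API) illustrates the core idea the abstract describes: protocol steps recorded as machine-readable data from which a readable English description is generated.

```python
# Toy illustration of the protocol-as-code idea. Steps are stored as tuples
# (suitable for an automation back end), and render() produces the
# English-language view a biologist would follow at the bench.
class Protocol:
    def __init__(self, name):
        self.name = name
        self.steps = []  # machine-readable trace

    def measure(self, volume_ul, reagent, dest):
        self.steps.append(("measure", volume_ul, reagent, dest))

    def incubate(self, vessel, temp_c, minutes):
        self.steps.append(("incubate", vessel, temp_c, minutes))

    def render(self):
        """Generate the English-language description of the recorded steps."""
        lines = []
        for step in self.steps:
            if step[0] == "measure":
                _, v, r, d = step
                lines.append(f"Measure {v} uL of {r} into {d}.")
            elif step[0] == "incubate":
                _, ves, t, m = step
                lines.append(f"Incubate {ves} at {t} C for {m} min.")
        return "\n".join(lines)
```

Because the same step list drives both outputs, the prose and the automated execution cannot drift apart, which is the reproducibility argument the paper makes.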

  11. ABM Clinical Protocol #18: Use of Antidepressants in Breastfeeding Mothers.

    PubMed

    Sriraman, Natasha K; Melvin, Kathryn; Meltzer-Brody, Samantha

    2015-01-01

    A central goal of The Academy of Breastfeeding Medicine is the development of clinical protocols for managing common medical problems that may impact breastfeeding success. These protocols serve only as guidelines for the care of breastfeeding mothers and infants and do not delineate an exclusive course of treatment or serve as standards of medical care. Variations in treatment may be appropriate according to the needs of an individual patient. PMID:26204124

  12. A Passive Testing Approach for Protocols in Wireless Sensor Networks

    PubMed Central

    Che, Xiaoping; Maag, Stephane; Tan, Hwee-Xian; Tan, Hwee-Pink; Zhou, Zhangbing

    2015-01-01

    Smart systems are increasingly being developed today, with the number of wireless sensor devices growing drastically. They are deployed in several contexts throughout our environment, so the data sensed and transported in these ubiquitous systems are important, and the way they are carried must be efficient and reliable. For that purpose, several routing protocols have been proposed for wireless sensor networks (WSN). However, one stage that is often neglected before their deployment is the conformance testing process, a crucial and challenging step. Compared to the active testing techniques commonly used in wired networks, passive approaches are more suitable to the WSN environment. While some works propose to specify the protocol with state models or to analyze it with simulators and emulators, we here propose a logic-based approach for formally specifying some functional requirements of a novel WSN routing protocol. We provide an algorithm to evaluate these properties on collected protocol execution traces, and we demonstrate the efficiency and suitability of our approach by applying it to common WSN functional properties as well as to specific ones designed for our own routing protocol. We provide relevant testing verdicts through a real indoor testbed and an implementation of our protocol. Furthermore, the flexibility, genericity, and practicability of our approach are demonstrated by the experimental results. PMID:26610495
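    The paper's logic syntax is not reproduced in the abstract, but the trace-evaluation idea behind passive testing can be sketched as a predicate over a recorded message trace; the message names below are illustrative, not from the protocol under test:

```python
# Sketch of passive conformance testing: rather than injecting stimuli
# (active testing), we only observe a trace of messages and check whether a
# functional property holds over it. Example property: every `response`
# observed must be preceded by a matching `request` somewhere in the trace.
def holds_response_property(trace, request, response):
    """Return True iff no `response` appears before the first `request`."""
    seen_request = False
    for msg in trace:
        if msg == request:
            seen_request = True
        elif msg == response and not seen_request:
            return False  # a response with no prior request: verdict FAIL
    return True
```

Because the check never transmits anything, it imposes no load on the energy-constrained sensor nodes, which is why passive approaches suit WSNs.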

  13. A Passive Testing Approach for Protocols in Wireless Sensor Networks.

    PubMed

    Che, Xiaoping; Maag, Stephane; Tan, Hwee-Xian; Tan, Hwee-Pink; Zhou, Zhangbing

    2015-01-01

    Smart systems are increasingly being developed today, with the number of wireless sensor devices growing drastically. They are deployed in several contexts throughout our environment, so the data sensed and transported in these ubiquitous systems are important, and the way they are carried must be efficient and reliable. For that purpose, several routing protocols have been proposed for wireless sensor networks (WSN). However, one stage that is often neglected before their deployment is the conformance testing process, a crucial and challenging step. Compared to the active testing techniques commonly used in wired networks, passive approaches are more suitable to the WSN environment. While some works propose to specify the protocol with state models or to analyze it with simulators and emulators, we here propose a logic-based approach for formally specifying some functional requirements of a novel WSN routing protocol. We provide an algorithm to evaluate these properties on collected protocol execution traces, and we demonstrate the efficiency and suitability of our approach by applying it to common WSN functional properties as well as to specific ones designed for our own routing protocol. We provide relevant testing verdicts through a real indoor testbed and an implementation of our protocol. Furthermore, the flexibility, genericity, and practicability of our approach are demonstrated by the experimental results. PMID:26610495

  14. Developing and implementing computerized protocols for standardization of clinical decisions.

    PubMed

    Morris, A H

    2000-03-01

    Humans have only a limited ability to incorporate information in decision making. In certain situations, the mismatch between this limitation and the availability of extensive information contributes to the varying performance and high error rate of clinical decision makers. Variation in clinical practice is due in part to clinicians' poor compliance with guidelines and recommended therapies. The use of decision-support tools is a response to both the information revolution and poor compliance. Computerized protocols used to deliver decision support can be configured to contain much more detail than textual guidelines or paper-based flow diagrams. Such protocols can generate patient-specific instructions for therapy that can be carried out with little interclinician variability; however, clinicians must be willing to modify personal styles of clinical management. Protocols need not be perfect. Several defensible and reasonable approaches are available for clinical problems. However, one of these reasonable approaches must be chosen and incorporated into the protocol to promote consistent clinical decisions. This reasoning is the basis of an explicit method of decision support that allows the rigorous evaluation of interventions, including use of the protocols themselves. Computerized protocols for mechanical ventilation and management of intravenous fluid and hemodynamic factors in patients with the acute respiratory distress syndrome provide case studies for this discussion. PMID:10691588
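    The kind of explicit, patient-specific rule the author describes might look like the following sketch; the variable names, thresholds, and adjustments are invented purely for illustration and are not clinical guidance:

```python
# Illustrative decision-support rule (made-up thresholds, NOT medical advice).
# The point is the one the abstract makes: the rule is explicit and
# deterministic, so two clinicians applying it to the same patient data
# receive the same instruction, eliminating interclinician variability.
def vent_instruction(spo2_percent, fio2):
    """Return a patient-specific FiO2 instruction from explicit rules."""
    if spo2_percent < 88:
        return f"Increase FiO2 from {fio2:.2f} to {min(fio2 + 0.10, 1.0):.2f}"
    if spo2_percent > 95 and fio2 > 0.30:
        return f"Decrease FiO2 from {fio2:.2f} to {max(fio2 - 0.05, 0.21):.2f}"
    return "No change"
```

Encoding one defensible rule, even if it is not the only reasonable one, is exactly the trade-off the author argues for: consistency enables rigorous evaluation of the protocol itself.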

  15. Design and Verification of a Distributed Communication Protocol

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    The safety of remotely operated vehicles depends on the correctness of the distributed protocol that facilitates the communication between the vehicle and the operator. A failure in this communication can result in catastrophic loss of the vehicle. To complicate matters, the communication system may be required to satisfy several, possibly conflicting, requirements. The design of protocols is typically an informal process based on successive iterations of a prototype implementation. Yet distributed protocols are notoriously difficult to get correct using such informal techniques. We present a formal specification of the design of a distributed protocol intended for use in a remotely operated vehicle, which is built from the composition of several simpler protocols. We demonstrate proof strategies that allow us to prove properties of each component protocol individually while ensuring that the property is preserved in the composition forming the entire system. Given that designs are likely to evolve as additional requirements emerge, we show how we have automated most of the repetitive proof steps to enable verification of rapidly changing designs.

  16. On the Cost of Prioritized Atomic Multicast Protocols

    NASA Astrophysics Data System (ADS)

    Miedes, Emili; Muñoz-Escoí, Francesc D.

    A prioritized atomic multicast protocol allows an application to tag messages with a priority that expresses their urgency and tries to deliver first those with a higher priority. For instance, such a service can be used in a database replication context, to reduce the transaction abort rate when integrity constraints are used. We present a study of the three most important and well-known classes of atomic multicast protocols in which we evaluate the cost imposed by the prioritization mechanisms, in terms of additional latency overhead, computational cost and memory use. This study reveals that the behavior of the protocols depends on the particular properties of the setting (number of nodes, message sending rates, etc.) and that the extra work done by a prioritized protocol does not introduce any additional latency overhead in most of the evaluated settings. This study is also a performance comparison of these classes of total order protocols and can be used by system designers to choose the proper prioritized protocol for a given deployment.
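    The prioritization mechanism itself (though not the total-order agreement around it) can be sketched as a priority queue over messages already safe to deliver, with a FIFO tiebreak so equal-priority messages keep arrival order; this is an illustration, not any of the protocols the paper evaluates:

```python
# Sketch: once the ordering layer marks messages deliverable, a prioritized
# protocol hands the application the most urgent one first. A monotonically
# increasing counter breaks ties so equal-priority messages stay FIFO.
import heapq
import itertools


class PriorityDelivery:
    def __init__(self):
        self._heap = []
        self._count = itertools.count()

    def deliverable(self, msg, priority):
        """Register a message; lower priority number = more urgent."""
        heapq.heappush(self._heap, (priority, next(self._count), msg))

    def next_message(self):
        """Pop the most urgent deliverable message."""
        return heapq.heappop(self._heap)[2]
```

The extra cost of such a mechanism is the heap bookkeeping, which is the kind of latency/computation/memory overhead the paper's evaluation quantifies.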

  17. Effects of two neuromuscular fatigue protocols on landing performance.

    PubMed

    James, C Roger; Scheuermann, Barry W; Smith, Michael P

    2010-08-01

    The purpose of the study was to investigate the effects of two fatigue protocols on landing performance. A repeated measures design was used to examine the effects of fatigue and fatigue protocol on neuromuscular and biomechanical performance variables. Ten volunteers performed non-fatigued and fatigued landings on two days using different fatigue protocols. Repeated maximum isometric squats were used to induce fatigue on day one. Sub-maximum cycling was used to induce fatigue on day two. Isometric squat maximum voluntary contraction (MVC) was measured before and after fatigued landings on each day. During the landings, ground reaction force (GRF), knee kinematics, and electromyographic (EMG) data were recorded. Isometric MVC, GRF peaks, loading rates, impulse, knee flexion at contact, range of motion, maximum angular velocity, and EMG root mean square (RMS) values were compared pre- and post-fatiguing exercise and between fatigue protocols using repeated-measures ANOVA. Fatigue decreased MVC strength (p < 0.05), GRF second peak, and initial impulse (p < 0.01), but increased quadriceps medium-latency stretch reflex EMG activity (p = 0.012). Knee flexion at contact was 5.2 degrees greater (p < 0.05) during fatigued landings following the squat exercise compared to cycling. Several variables exhibited non-significant but large effect sizes when comparing the effects of fatigue and fatigue protocol. In conclusion, fatigue alters landing performance, and different fatigue protocols result in different performance changes.

  18. The NICHD Investigative Interview Protocol: A Meta-Analytic Review.

    PubMed

    Benia, Luis Roberto; Hauck-Filho, Nelson; Dillenburg, Mariana; Stein, Lilian Milnitsky

    2015-01-01

    A systematic review and meta-analysis of the literature were conducted examining the effectiveness of the National Institute for Child Health and Human Development Investigative Interview Protocol in improving the quality of child forensic interviews. Online databases were searched for journal articles published between the years 2000 and 2013. Measures of interview quality were the type of interviewer utterances and the amount of information provided by children. Five studies met criteria for inclusion in the meta-analysis. A weighted mean of the effect sizes was calculated for each outcome measure. Protocol interviews had more invitations (g = 1.60) and fewer option-posing (g = -.95) and suggestive prompts (g = -.63) than standard interviews. Children interviewed with the protocol provided more central details (g = .90) in response to invitations than controls. Meta-analyses of a subset of preschool children samples revealed that protocol interviews had more invitations (g = 1.46), fewer suggestive prompts (g = -.61), and fewer option-posing prompts (g = -1.05) than controls. Findings corroborate results from previous studies that suggested the benefits of the protocol on the interviewers' performance and on children's informativeness. However, the protocol did not show the same performance with regard to preschool children.
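    The weighted-mean pooling step mentioned above can be sketched as follows; the weights are left generic here, since the abstract does not state the weighting scheme (meta-analyses commonly use inverse-variance weights):

```python
# Sketch of pooling per-study effect sizes (e.g. Hedges' g) into a single
# weighted mean. With inverse-variance weighting, each weight would be
# 1 / variance(g_i); here the weights are whatever the analyst supplies.
def weighted_mean_effect(effects, weights):
    """Weighted mean of effect sizes: sum(w_i * g_i) / sum(w_i)."""
    total_w = sum(weights)
    return sum(g * w for g, w in zip(effects, weights)) / total_w
```

A pooled g near 1.60 for invitations, as reported above, would indicate a large, consistent advantage for protocol interviews across the five included studies.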

  19. Experimental protocols and preparations to study respiratory long term facilitation

    PubMed Central

    Mateika, Jason H.; Sandhu, Kulraj S.

    2011-01-01

    Respiratory long-term facilitation is a form of neuronal plasticity that is induced following exposure to intermittent hypoxia. Long-term facilitation is characterized by a progressive increase in respiratory motor output during normoxic periods that separate hypoxic episodes and by a sustained elevation in respiratory activity for up to 90 min after exposure to intermittent hypoxia. This phenomenon is associated with increases in phrenic, hypoglossal or carotid sinus nerve inspiratory-modulated discharge. The examination of long-term facilitation has been steadily ongoing for approximately 3 decades. During this period of time a variety of animal models (e.g. cats, rats and humans), experimental preparations and intermittent hypoxia protocols have been used to study long-term facilitation. This review is designed to summarize the strengths and weaknesses of the models, preparations and protocols that have been used to study LTF over the past 30 years. The review is divided into two primary sections. Initially, the models and protocols used to study LTF in animals other than humans will be discussed, followed by a section specifically focused on human studies. Each section will begin with a discussion of various factors that must be considered when selecting an experimental preparation and intermittent hypoxia protocol to examine LTF. Model and protocol design recommendations will follow, with the goal of presenting a prevailing model and protocol that will ultimately ensure standardized comparisons across studies. PMID:21292044

  20. Developing and implementing computerized protocols for standardization of clinical decisions.

    PubMed

    Morris, A H

    2000-03-01

    Humans have only a limited ability to incorporate information in decision making. In certain situations, the mismatch between this limitation and the availability of extensive information contributes to the varying performance and high error rate of clinical decision makers. Variation in clinical practice is due in part to clinicians' poor compliance with guidelines and recommended therapies. The use of decision-support tools is a response to both the information revolution and poor compliance. Computerized protocols used to deliver decision support can be configured to contain much more detail than textual guidelines or paper-based flow diagrams. Such protocols can generate patient-specific instructions for therapy that can be carried out with little interclinician variability; however, clinicians must be willing to modify personal styles of clinical management. Protocols need not be perfect. Several defensible and reasonable approaches are available for clinical problems. However, one of these reasonable approaches must be chosen and incorporated into the protocol to promote consistent clinical decisions. This reasoning is the basis of an explicit method of decision support that allows the rigorous evaluation of interventions, including use of the protocols themselves. Computerized protocols for mechanical ventilation and management of intravenous fluid and hemodynamic factors in patients with the acute respiratory distress syndrome provide case studies for this discussion.

  1. Protocol for miRNA isolation from biofluids.

    PubMed

    Lekchnov, Evgeny A; Zaporozhchenko, Ivan A; Morozkin, Evgeny S; Bryzgunova, Olga E; Vlassov, Valentin V; Laktionov, Pavel P

    2016-04-15

    MicroRNAs (miRNAs) have been identified as promising biomarkers in cancer and other diseases. Packaging of miRNAs into vesicles and complexes with proteins ensures their stability in biological fluids but also complicates their isolation. Conventional protocols used to isolate cell-free RNA are generally successful in overcoming these difficulties; however, they are costly, labor-intensive, or heavily reliant on the use of hazardous chemicals. Here we describe a protocol that is suitable for isolating miRNAs from biofluids, including blood plasma and urine. The protocol is based on precipitation of proteins, denaturation of miRNA-containing complexes with octanoic acid and guanidine isothiocyanate, and subsequent purification of miRNA on spin columns. The efficacy of miRNA extraction by phenol-chloroform extraction, miRCURY RNA isolation kit--biofluids (Exiqon), and the proposed protocol was compared by quantitative reverse-transcription PCR of miR-16 and miR-126. The proposed protocol was slightly more effective for isolating miRNA from plasma and significantly superior to the other two methods for miRNA isolation from urine. Spectrophotometry and SDS-PAGE data suggest that the disparity in performance between miRCURY Biofluids and the proposed protocol can be attributed to differences in precipitation mechanisms, as confirmed by the retention of different proteins in the supernatant. PMID:26874020
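    When isolation methods are compared by quantitative RT-PCR, as in the miR-16/miR-126 comparison above, recovery differences are typically expressed as fold changes derived from cycle thresholds (Ct). A minimal sketch, assuming ideal doubling per PCR cycle (real assays calibrate amplification efficiency):

```python
# Sketch: a lower Ct means the target was detected earlier, i.e. more miRNA
# was recovered. Under ideal 2x amplification per cycle, a Ct difference of
# d corresponds to a 2**d fold difference in recovered material.
def fold_difference(ct_method_a, ct_method_b):
    """Fold more target recovered by method A relative to method B."""
    return 2.0 ** (ct_method_b - ct_method_a)
```

For example, a method yielding Ct 25 versus a competitor's Ct 28 recovers about eight times as much target under this idealized model.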

  2. Dose-response curve of EBT, EBT2, and EBT3 radiochromic films to synchrotron-produced monochromatic x-ray beams

    SciTech Connect

    Brown, Thomas A. D.; Hogstrom, Kenneth R.; Alvarez, Diane; Matthews, Kenneth L. II; Ham, Kyungmin; Dugas, Joseph P.

    2012-12-15

    Purpose: This work investigates the dose-response curves of GAFCHROMIC® EBT, EBT2, and EBT3 radiochromic films using synchrotron-produced monochromatic x-ray beams. EBT2 film is being utilized for dose verification in photoactivated Auger electron therapy at the Louisiana State University Center for Advanced Microstructures and Devices (CAMD) synchrotron facility. Methods: Monochromatic beams of 25, 30, and 35 keV were generated on the tomography beamline at CAMD. Ion chamber depth-dose measurements were used to determine the dose delivered to films irradiated at depths from 0.7 to 8.5 cm in a 10 × 10 × 10 cm³ polymethylmethacrylate phantom. The AAPM TG-61 protocol was applied to convert measured ionization into dose. Films were digitized using an Epson 1680 Professional flatbed scanner and analyzed using the net optical density (NOD) derived from the red channel. A dose-response curve was obtained at 35 keV for EBT film, and at 25, 30, and 35 keV for EBT2 and EBT3 films. Calibrations of films for 4 MV x-rays were obtained for comparison using a radiotherapy accelerator at Mary Bird Perkins Cancer Center. Results: The sensitivity (NOD per unit dose) of EBT film at 35 keV relative to that for 4-MV x-rays was 0.73 and 0.76 for doses 50 and 100 cGy, respectively. The sensitivity of EBT2 film at 25, 30, and 35 keV relative to that for 4-MV x-rays varied from 1.09-1.07, 1.23-1.17, and 1.27-1.19 for doses 50-200 cGy, respectively. For EBT3 film the relative sensitivity was within 3% of unity for all three monochromatic x-ray beams. Conclusions: EBT and EBT2 film sensitivity showed strong energy dependence over an energy range of 25 keV to 4 MV, although this dependence becomes weaker for larger doses. EBT3 film shows weak energy dependence, indicating that it would be a better dosimeter for kV x-ray beams where beam hardening effects can result in large changes in the effective energy.
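    Two quantities underlie the film analysis above: the net optical density derived from the scanner's red channel, and the sensitivity of one beam relative to another. A minimal sketch, assuming the standard NOD definition from unexposed/exposed pixel values (the abstract does not spell out the formula):

```python
# Sketch: net optical density (NOD) is the log10 ratio of unexposed to
# exposed pixel value in the scanned red channel; sensitivity is NOD per
# unit dose, and the abstract's "relative sensitivity" compares a kV beam
# against the 4-MV reference at the same dose.
import math


def net_optical_density(pv_unexposed, pv_exposed):
    """NOD = log10(PV_unexposed / PV_exposed)."""
    return math.log10(pv_unexposed / pv_exposed)


def relative_sensitivity(nod_kv, nod_mv, dose_cgy):
    """Sensitivity (NOD/dose) of the kV beam relative to the MV beam."""
    return (nod_kv / dose_cgy) / (nod_mv / dose_cgy)
```

A relative sensitivity near 1.0 at all energies, as reported for EBT3, is what makes a film usable across kV beams whose effective energy shifts with depth.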

  3. Absolute x-ray dosimetry on a synchrotron medical beam line with a graphite calorimeter

    SciTech Connect

    Harty, P. D. Ramanathan, G.; Butler, D. J.; Johnston, P. N.; Lye, J. E.; Hall, C. J.; Stevenson, A. W.

    2014-05-15

    Purpose: The absolute dose rate of the Imaging and Medical Beamline (IMBL) on the Australian Synchrotron was measured with a graphite calorimeter. The calorimetry results were compared to measurements from the existing free-air chamber, to provide a robust determination of the absolute dose in the synchrotron beam and provide confidence in the first implementation of a graphite calorimeter on a synchrotron medical beam line. Methods: The graphite calorimeter has a core which rises in temperature when irradiated by the beam. A collimated x-ray beam from the synchrotron with well-defined edges was used to partially irradiate the core. Two filtration sets were used, one corresponding to an average beam energy of about 80 keV, with dose rate about 50 Gy/s, and the second filtration set corresponding to average beam energy of 90 keV, with dose rate about 20 Gy/s. The temperature rise from this beam was measured by a calibrated thermistor embedded in the core which was then converted to absorbed dose to graphite by multiplying the rise in temperature by the specific heat capacity for graphite and the ratio of cross-sectional areas of the core and beam. Conversion of the measured absorbed dose to graphite to absorbed dose to water was achieved using Monte Carlo calculations with the EGSnrc code. The air kerma measurements from the free-air chamber were converted to absorbed dose to water using the AAPM TG-61 protocol. Results: Absolute measurements of the IMBL dose rate were made using the graphite calorimeter and compared to measurements with the free-air chamber. The measurements were at three different depths in graphite and two different filtrations. The calorimetry measurements at depths in graphite show agreement within 1% with free-air chamber measurements, when converted to absorbed dose to water. The calorimetry at the surface and free-air chamber results show agreement of order 3% when converted to absorbed dose to water. The combined standard uncertainty is 3
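
    The conversion described above (temperature rise times specific heat, scaled by the core-to-beam area ratio because the beam only partially irradiates the core) can be sketched directly. The heat capacity and geometry below are nominal assumptions for illustration, not the IMBL values.

    ```python
    # Nominal specific heat capacity of graphite near room temperature,
    # J/(kg*K); the calorimeter would use a characterized value.
    C_P_GRAPHITE = 710.0

    def dose_to_graphite(delta_T_K, area_core_mm2, area_beam_mm2,
                         c_p=C_P_GRAPHITE):
        """Mean absorbed dose (Gy) in the beam footprint on the core.

        The thermistor reads a temperature rise averaged over the whole
        core, so the dose in the irradiated footprint is recovered by
        scaling with the core-to-beam area ratio.
        """
        return c_p * delta_T_K * (area_core_mm2 / area_beam_mm2)

    # Hypothetical numbers: 20 mK rise, 1000 mm^2 core, 200 mm^2 beam.
    d = dose_to_graphite(0.020, 1000.0, 200.0)  # -> 71.0 Gy
    ```

    Conversion of this graphite dose to dose to water would then apply a Monte Carlo-derived factor, as the abstract describes for EGSnrc.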

  4. SU-F-19A-06: Experimental Investigation of the Energy Dependence of TLD Sensitivity in Low-Energy Photon Beams

    SciTech Connect

    Chen, Z; Nath, R

    2014-06-15

    Purpose: To measure the energy dependence of TLD sensitivity in low-energy photon beams with equivalent mono-energetic energies matching those of 103Pd, 125I and 131Cs brachytherapy sources. Methods: A Pantek DXT 300 x-ray unit (Precision X-ray, Branford, CT), with stable digital voltage control down to 20 kV, was used to establish three low-energy photon beams with narrow energy spread and equivalent mono-energetic energies matching those of 103Pd, 125I and 131Cs brachytherapy sources. The low-energy x-ray beams and a reference 6 MV photon beam were calibrated according to the AAPM TG-61 and TG-51 protocols, respectively, using a parallel-plate low-energy chamber and a Farmer cylindrical chamber with NIST-traceable calibration factors. The dose response of model TLD-100 micro-cubes (1×1×1 mm³) in each beam was measured for five different batches of TLDs (each containing approximately 100 TLDs) that have different histories of irradiation and usage. Relative absorbed-dose sensitivity was determined as the quotient of the slope of the dose response for a beam of interest to that of the reference beam. Results: Equivalent mono-energetic photon energies of the low-energy beams established for 103Pd, 125I and 131Cs sources were 20.5, 27.5, and 30.1 keV, respectively. Each beam exhibited narrow spectral spread, with an energy homogeneity index close to 90%. The relative absorbed-dose sensitivity was found to vary between different batches of TLD, with maximum differences of up to 8%. The means and standard deviations determined from the five TLD batches were 1.453 ± 0.026, 1.541 ± 0.035 and 1.529 ± 0.051 for the simulated 103Pd, 125I and 131Cs beams, respectively. Conclusion: Our measured relative absorbed-dose sensitivities are greater than the historically measured value of 1.41. We find that the relative absorbed-dose sensitivity of TLD in the 103Pd beam is approximately 5% lower than that of the 125I and 131Cs beams. Comparison of our results with other studies will be presented.
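
    The abstract defines relative absorbed-dose sensitivity as the quotient of dose-response slopes (beam of interest over the 6 MV reference). A sketch of that calculation with invented TLD readings, not the study's data:

    ```python
    import numpy as np

    def response_slope(doses_cGy, tld_readings):
        """Slope of a linear fit to the TLD dose response."""
        slope, _intercept = np.polyfit(doses_cGy, tld_readings, 1)
        return slope

    def relative_sensitivity(doses, readings_beam, readings_ref):
        """Quotient of the dose-response slope of the beam of interest
        to that of the reference beam, per the abstract's definition."""
        return (response_slope(doses, readings_beam) /
                response_slope(doses, readings_ref))

    # Hypothetical readings (nC) at four calibration dose points.
    doses = [50.0, 100.0, 150.0, 200.0]
    ref_6MV = [10.0, 20.0, 30.0, 40.0]   # slope 0.20 nC/cGy
    low_kv = [15.0, 30.0, 45.0, 60.0]    # slope 0.30 nC/cGy

    rel = relative_sensitivity(doses, low_kv, ref_6MV)  # -> 1.5
    ```

    The invented ratio of 1.5 falls in the same range as the batch means reported above (1.45-1.54).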

  5. National protocol framework for the inventory and monitoring of bees

    USGS Publications Warehouse

    Droege, Sam; Engler, Joseph D.; Sellers, Elizabeth A.; O'Brien, Lee

    2016-01-01

    This national protocol framework is a standardized tool for the inventory and monitoring of the approximately 4,200 species of native and non-native bee species that may be found within the National Wildlife Refuge System (NWRS) administered by the U.S. Fish and Wildlife Service (USFWS). However, this protocol framework may also be used by other organizations and individuals to monitor bees in any given habitat or location. Our goal is to provide USFWS stations within the NWRS (NWRS stations are land units managed by the USFWS such as national wildlife refuges, national fish hatcheries, wetland management districts, conservation areas, leased lands, etc.) with techniques for developing an initial baseline inventory of what bee species are present on their lands and to provide an inexpensive, simple technique for monitoring bees continuously and for monitoring and evaluating long-term population trends and management impacts. The latter long-term monitoring technique requires a minimal time burden for the individual station, yet can provide a good statistical sample of changing populations that can be investigated at the station, regional, and national levels within the USFWS’ jurisdiction, and compared to other sites within the United States and Canada. This protocol framework was developed in cooperation with the United States Geological Survey (USGS), the USFWS, and a worldwide network of bee researchers who have investigated the techniques and methods for capturing bees and tracking population changes. The protocol framework evolved from field and lab-based investigations at the USGS Bee Inventory and Monitoring Laboratory at the Patuxent Wildlife Research Center in Beltsville, Maryland starting in 2002 and was refined by a large number of USFWS, academic, and state groups. It includes a Protocol Introduction and a set of 8 Standard Operating Procedures or SOPs and adheres to national standards of protocol content and organization. The Protocol Narrative

  6. A mechanical protocol to replicate impact in walking footwear.

    PubMed

    Price, Carina; Cooper, Glen; Graham-Smith, Philip; Jones, Richard

    2014-01-01

    Impact testing is undertaken to quantify the shock absorption characteristics of footwear. The current widely reported mechanical testing method mimics the heel impact in running and therefore applies excessive energy to walking footwear. The purpose of this study was to modify the ASTM protocol F1614 (Procedure A) to better represent walking gait. This was achieved by collecting kinematic and kinetic data while participants walked in four different styles of walking footwear (trainer, oxford shoe, flip-flop and triple-density sandal). The quantified heel velocity and effective mass at ground impact were then replicated in a mechanical protocol. The kinematic data identified different impact characteristics in the footwear styles. Significantly faster heel velocity towards the floor was recorded walking in the toe-post sandals (flip-flop and triple-density sandal) compared with other conditions (e.g. flip-flop: 0.36±0.05 m/s versus trainer: 0.18±0.06 m/s). The mechanical protocol was adapted by altering the mass and drop height specific to the data captured for each shoe (e.g. flip-flop: drop height 7 mm, mass 16.2 kg). As expected, the adapted mechanical protocol produced significantly lower peak force and accelerometer values than the ASTM protocol (p<.001). The mean difference between the human and adapted protocol was 12.7±17.5% (p<.001) for peak acceleration and 25.2±17.7% (p=.786) for peak force. This paper demonstrates that altered mechanical test protocols can more closely replicate loading on the lower limb in walking. This therefore suggests that testing of material properties of footbeds not only needs to be gait style specific (e.g. running versus walking), but also footwear style specific.
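
    Since the adapted rig tunes drop height and mass to reproduce the measured heel velocity and effective mass, the settings follow from free-fall kinematics. A sketch under that simplifying assumption (a real ASTM-style fixture may add guide friction or other losses):

    ```python
    G = 9.81  # gravitational acceleration, m/s^2

    def drop_height_mm(heel_velocity_ms):
        """Drop height reproducing a measured impact velocity, assuming
        free fall of the missile: v^2 = 2*g*h  =>  h = v^2 / (2*g)."""
        return heel_velocity_ms ** 2 / (2.0 * G) * 1000.0

    def impact_energy_J(mass_kg, heel_velocity_ms):
        """Kinetic energy at contact for the tuned effective mass."""
        return 0.5 * mass_kg * heel_velocity_ms ** 2

    # Flip-flop values reported in the study: 0.36 m/s, 16.2 kg.
    h = drop_height_mm(0.36)         # ~6.6 mm, close to the 7 mm used
    e = impact_energy_J(16.2, 0.36)  # ~1.05 J delivered at impact
    ```

    The ~6.6 mm result is consistent with the 7 mm drop height quoted for the flip-flop condition, which supports the free-fall reading of the protocol.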

  7. Development and validation of rapid magnetic particle based extraction protocols

    PubMed Central

    2014-01-01

    Background In order to control and eradicate transboundary animal diseases, early diagnosis and reaction is essential for the implementation of control activities. Thus, mobile diagnostic units which allow analytical testing close to the site of occurrence could provide valuable support for centralized laboratories. Consequently, the availability of diagnostic tests using mobile amplification and detection technologies has been increasing over the past years. However, methods enabling rapid and simple nucleic acid extraction also under resource-limited settings are still scarce. Methods In the present study rapid extraction protocols based on magnetic particle technology have been developed. For this purpose, the two open extraction platforms KingFisher™ Duo (Thermo Fisher Scientific) and BioSprint® 15 (Qiagen) as well as the fully automated EZ1® advanced XL instrument (Qiagen) were used. All protocols were validated in comparison to standard manual extraction using blood and serum samples from animals infected with Schmallenberg virus or bovine viral diarrhea virus. Results All newly developed protocols allowed a complete extraction within 30 minutes of time. The fully automated EZ1-extraction yielded the highest reproducibility, whereas slightly higher intra- and inter-assay variations were observed using the open platforms. Compared to the manual procedure, the analytical sensitivity of all the rapid protocols was 1 log10 step reduced for extraction from blood samples. For sera a reduced dynamic range could only be observed using the maximally shortened BioSprint 15 protocol. Validation using clinical samples showed an excellent concordance of all the rapid extraction protocols to the standard manual extraction procedure, independent of sample materials and target viruses. Conclusions The results of this study show that the speed-optimized novel extraction protocols allow rapid and simple nucleic acid extractions for a variety of target viruses without

  8. Quantum And Relativistic Protocols For Secure Multi-Party Computation

    NASA Astrophysics Data System (ADS)

    Colbeck, Roger

    2009-11-01

    After a general introduction, the thesis is divided into four parts. In the first, we discuss the task of coin tossing, principally in order to highlight the effect different physical theories have on security in a straightforward manner, but, also, to introduce a new protocol for non-relativistic strong coin tossing. This protocol matches the security of the best protocol known to date while using a conceptually different approach to achieve the task. In the second part variable bias coin tossing is introduced. This is a variant of coin tossing in which one party secretly chooses one of two biased coins to toss. It is shown that this can be achieved with unconditional security for a specified range of biases, and with cheat-evident security for any bias. We also discuss two further protocols which are conjectured to be unconditionally secure for any bias. The third section looks at other two-party secure computations for which, prior to our work, protocols and no-go theorems were unknown. We introduce a general model for such computations, and show that, within this model, a wide range of functions are impossible to compute securely. We give explicit cheating attacks for such functions. In the final chapter we discuss the task of expanding a private random string, while dropping the usual assumption that the protocol's user trusts her devices. Instead we assume that all quantum devices are supplied by an arbitrarily malicious adversary. We give two protocols that we conjecture securely perform this task. The first allows a private random string to be expanded by a finite amount, while the second generates an arbitrarily large expansion of such a string.

  9. A literature review: polypharmacy protocol for primary care.

    PubMed

    Skinner, Mary

    2015-01-01

    The purpose of this literature review is to critically evaluate published protocols on polypharmacy in adults ages 65 and older that are currently used in primary care settings that may potentially lead to fewer adverse drug events. A review of OVID, CINAHL, EBSCO, Cochrane Library, Medline, and PubMed databases was completed using the following key words: protocol, guideline, geriatrics, elderly, older adult, polypharmacy, and primary care. Inclusion criteria were: articles in medical, nursing, and pharmacology journals with an intervention, protocol, or guideline addressing polypharmacy that led to fewer adverse drug events. Qualitative and quantitative studies were included. Exclusion criteria were: publications prior to the year 1992. A gap exists in the literature. No standardized protocol for addressing polypharmacy in the primary care setting was found. Mnemonics, algorithms, clinical practice guidelines, and clinical strategies for addressing polypharmacy in a variety of health care settings were found throughout the literature. Several screening instruments for use in primary care to assess potentially inappropriate prescription of medications in the elderly, such as the Beers Criteria and the STOPP screening tool, were identified. However, these screening instruments were not included in a standardized protocol to manage polypharmacy in primary care. Polypharmacy in the elderly is a critical problem that may result in adverse drug events such as falls, hospitalizations, and increased expenditures for both the patient and the health care system. No standardized protocols to address polypharmacy specific to the primary care setting were identified in this review of the literature. Given the growing population of elderly in this country and the high number of medications they consume, it is critical to focus on the utilization of a standardized protocol to address the potential harm of polypharmacy in the primary care setting and evaluate its effects on

  10. Implementation of an epilepsy self-management protocol.

    PubMed

    Cole, Kimberly A; Gaspar, Phyllis M

    2015-02-01

    It is essential that patients with epilepsy receive educational information about their disease and its management, but there is dissatisfaction with the education received. The purposes of this evidence-based project were to examine the current knowledge level and disease management behaviors of patients with epilepsy in an outpatient clinic and to measure the effectiveness of implementing a self-management protocol using the Epilepsy Self-Management Scale (ESMS). Pender's health promotion model and Rogers' diffusion of innovation theory were used to guide the development and completion of this project. An evidence-based epilepsy self-management protocol was developed and implemented at an outpatient neurology clinic by an interprofessional clinic team that consisted of (a) evaluation of self-management behaviors (ESMS), (b) individual education using the ESMS and developed resources, (c) follow-up telephone call, and (d) measurement of outcomes of the self-management protocol (patient self-management [ESMS] and process). Twenty patients participated in all or portions of the protocol. Scores on the ESMS increased from preimplementation to postimplementation of the protocol (t = -2.67). Seizure management and information management were identified as the most difficult self-management areas. Recommended changes in protocol implementation include adding information about safety measures such as medical alert bracelets and driving to the educational packets. Follow-up telephone calls were discontinued because of difficulties reaching patients. The results of this study suggest that the ESMS is an acceptable tool for evaluating patients' self-management behaviors. Epilepsy self-management protocols need to include both verbal and written educational materials. Educating patients with epilepsy about positive self-management behaviors may lead to better health outcomes.

  11. Programmed electrical stimulation protocols: variations on a theme.

    PubMed

    Fisher, J D; Kim, S G; Ferrick, K J; Roth, J

    1992-11-01

    A series of prospective protocols were designed to determine the yield ratio (true positives vs. false positives = nonclinical) in various patient groups using a variety of programmed electrical stimulation (PES) variables. First, a PES protocol was used in 772 patients. Single, double, and triple extrastimuli were delivered in sequence (leaving each successive extrastimulus just beyond its refractory period before moving to the next extrastimulus) during sinus rhythm and two ventricular paced rates at the RV apex, before moving to the outflow tract and repeating the sequence and then moving on to isoproterenol infusion with the PES sequence repeated at the apex. This protocol met NASPE standards for induction of VT in patients with coronary artery disease and a history of VT, while failing to induce monomorphic VT in any control patient. The best yield ratios combined with the greatest likelihood of inducing clinical tachycardia were achieved with sinus rhythm and three extrastimuli, and pacing at the lower rate and three extrastimuli. Pacing at the faster rate and triple extrastimuli was highly inductive of clinical arrhythmias, but had a low yield ratio due to induction of more nonclinical arrhythmias than other steps. The next protocol was performed in 61 patients with inducible ventricular tachycardia. In each case, the protocol described above was completed at the RV apex, even if tachycardia was also induced at an earlier point in the protocol. This allowed for more accurate yield ratios to be established for each step in the protocol, since each patient was exposed to each of these steps.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:1279622

  12. Toward Synthesis, Analysis, and Certification of Security Protocols

    NASA Technical Reports Server (NTRS)

    Schumann, Johann

    2004-01-01

    Implemented security protocols are basically pieces of software which are used to (a) authenticate the other communication partners, (b) establish a secure communication channel between them (using insecure communication media), and (c) transfer data between the communication partners in such a way that these data are only available to the desired receiver, but not to anyone else. Such an implementation usually consists of the following components: the protocol engine, which controls the sequence in which the messages of the protocol are sent over the network, and which controls the assembly/disassembly and processing (e.g., decryption) of the data; the cryptographic routines to actually encrypt or decrypt the data (using given keys); and the interface to the operating system and to the application. For a correct working of such a security protocol, all of these components must work flawlessly. Many formal-methods-based techniques for the analysis of security protocols have been developed. They range from specific logics (e.g., BAN logic [4] or higher-order logics [12]) to model-checking [2] approaches. In each approach, the analysis tries to prove that no one (or at least no modeled intruder) can get access to secret data. Otherwise, a scenario illustrating the attack may be produced. Despite the seeming simplicity of security protocols ("only" a few messages are sent between the protocol partners in order to ensure a secure communication), many flaws have been detected. Unfortunately, even a perfect protocol engine does not guarantee flawless working of a security protocol, as incidents show. Many break-ins and security vulnerabilities are caused by exploiting errors in the implementation of the protocol engine or the underlying operating system. Attacks using buffer overflows are a very common class of such attacks. Errors in the implementation of exception or error handling can open up additional vulnerabilities. For example, on a website with a log-in screen
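
    The protocol-engine role described above (controlling the sequence in which protocol messages are processed) can be illustrated with a toy state machine. The handshake steps and message names here are invented for illustration; they do not come from any protocol in the text.

    ```python
    # Expected message order for a made-up three-step handshake.
    EXPECTED_SEQUENCE = ["client_hello", "server_challenge", "client_response"]

    class ProtocolEngine:
        """Minimal protocol engine: accepts messages only in protocol
        order and fails closed on anything unexpected."""

        def __init__(self):
            self.step = 0

        def handle(self, msg_type):
            if (self.step >= len(EXPECTED_SEQUENCE)
                    or msg_type != EXPECTED_SEQUENCE[self.step]):
                # Out-of-order or surplus message: abort the run rather
                # than trying to continue in an undefined state.
                raise RuntimeError(
                    f"unexpected message {msg_type!r} at step {self.step}")
            self.step += 1
            # True once the handshake has completed successfully.
            return self.step == len(EXPECTED_SEQUENCE)

    engine = ProtocolEngine()
    done = [engine.handle(m) for m in EXPECTED_SEQUENCE]  # [False, False, True]
    ```

    Failing closed on out-of-order messages is exactly the discipline whose implementation errors (in real engines, far larger than this sketch) open the vulnerabilities the abstract discusses.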

  13. Intelligent routing protocol for ad hoc wireless network

    NASA Astrophysics Data System (ADS)

    Peng, Chaorong; Chen, Chang Wen

    2006-05-01

    A novel routing scheme for mobile ad hoc networks (MANETs), which combines hybrid and multi-path inter-routing properties with a distributed topology-discovery mechanism using control agents, is proposed in this paper. In recent years, a variety of hybrid routing protocols for MANETs have been developed. These protocols proactively maintain routing information for a local neighborhood, while reactively acquiring routes to destinations beyond it. A hybrid protocol reduces route-discovery latency and end-to-end delay by providing high connectivity without requiring much of the scarce network capacity. On the other hand, hybrid routing protocols in MANETs, such as the Zone Routing Protocol, still need route re-discovery time when an inter-zone link on a route breaks, since topology update information must be broadcast as routing requests within the local zone. Due to this delay, the routing protocol may not be applicable to real-time data and multimedia communication. We utilize the advantages of a clustering organization and multi-path routing to achieve several goals at the same time. Firstly, IRP efficiently saves network bandwidth and reduces route-reconstruction time when a routing path fails; the IRP protocol does not require global periodic routing advertisements, and local control agents automatically monitor and repair broken links. Secondly, it efficiently reduces congestion and traffic "bottlenecks" at cluster heads in a clustered network. Thirdly, it reduces the significant overheads associated with maintaining clusters. Fourthly, it improves cluster stability under frequently changing topology. In this paper, we present the Intelligent Routing Protocol. First, we discuss the problem of routing in ad hoc networks and the motivation for IRP. We describe the hierarchical architecture of IRP. We describe the routing process and illustrate it with an example. 
Further, we describe the control manage

  14. Adult Rhabdomyosarcoma Survival Improved With Treatment on Multimodality Protocols

    SciTech Connect

    Gerber, Naamit Kurshan; Wexler, Leonard H.; Singer, Samuel; Alektiar, Kaled M.; Keohan, Mary Louise; Shi, Weiji; Zhang, Zhigang; Wolden, Suzanne

    2013-05-01

    Purpose: Rhabdomyosarcoma (RMS) is a pediatric sarcoma rarely occurring in adults. For unknown reasons, adults with RMS have worse outcomes than do children. Methods and Materials: We analyzed data from all patients who presented to Memorial Sloan-Kettering Cancer Center between 1990 and 2011 with RMS diagnosed at age 16 or older. One hundred forty-eight patients met the study criteria. Ten were excluded for lack of adequate data. Results: The median age was 28 years. The histologic diagnoses were as follows: embryonal 54%, alveolar 33%, pleomorphic 12%, and not otherwise specified 2%. The tumor site was unfavorable in 67% of patients. Thirty-three patients (24%) were at low risk, 61 (44%) at intermediate risk, and 44 (32%) at high risk. Forty-six percent were treated on or according to a prospective RMS protocol. The 5-year rate of overall survival (OS) was 45% for patients with nonmetastatic disease. The failure rates at 5 years for patients with nonmetastatic disease were 34% for local failure and 42% for distant failure. Among patients with nonmetastatic disease (n=94), significant factors associated with OS were histologic diagnosis, site, risk group, age, and protocol treatment. On multivariate analysis, risk group and protocol treatment were significant after adjustment for age. The 5-year OS was 54% for protocol patients versus 36% for nonprotocol patients. Conclusions: Survival in adult patients with nonmetastatic disease was significantly improved for those treated on RMS protocols, most of which are now open to adults.

  15. RadNet: Open network protocol for radiation data

    SciTech Connect

    Rees, B.; Olson, K.; Beckes-Talcott, J.; Kadner, S.; Wenderlich, T.; Hoy, M.; Doyle, W.; Koskelo, M.

    1998-12-31

    Safeguards instrumentation is increasingly being incorporated into remote monitoring applications. In the past, vendors of radiation monitoring instruments typically provided the tools for uploading the monitoring data to a host. However, the proprietary nature of communication protocols lends itself to increased computer support needs and increased installation expenses. As a result, a working group of suppliers and customers of radiation monitoring instruments defined an open network protocol for transferring packets on a local area network from radiation monitoring equipment to network hosts. The protocol was termed RadNet. While it is now primarily used for health physics instruments, RadNet's flexibility and strength make it ideal for remote monitoring of nuclear materials. The incorporation of standard, open protocols ensures that future work will not render present work obsolete, because RadNet utilizes standard Internet protocols and is itself a non-proprietary standard. The use of industry standards also simplifies the development and implementation of ancillary services, e.g., e-mail generation or even pager systems.

  16. An Enhanced Security Protocol for VANET-Based Entertainment Services

    NASA Astrophysics Data System (ADS)

    Kim, Jung-Yoon; Choi, Hyoung-Kee

    Multimedia transactions between vehicles are expected to become a promising application in VANETs but security is a fundamental issue that must be resolved before such transactions can become practical and trusted. Existing certificate-based digital signature schemes are ineffective for ensuring the security of multimedia transactions in VANETs. This ineffectiveness exists because there is no guarantee that (1) vehicles can download the latest certificate revocation lists or that (2) vehicles can complete a multimedia transaction before leaving their communication range. These two problems result, respectively, from a lack of infrastructure and from the inconsistent connectivity inherent in VANETs. In this paper, we propose a digital signature approach that combines a certificateless signature scheme and short-lived public keys to alleviate these problems. We then propose a security protocol that uses the proposed signature approach for multimedia transactions between vehicles. The proposed protocol enables vehicles to trade in multimedia resources without an online trust authority. We provide an analytical approach to optimizing the security of the proposed protocol. The security and performance of our protocol are evaluated via simulation and theoretical analysis. Based on these evaluations, we contend that the proposed protocol is practical for multimedia transactions in VANETs in terms of security and performance.

  17. Ocean Optics Protocols for Satellite Ocean Color Sensor Validation. Revised

    NASA Technical Reports Server (NTRS)

    Fargion, Giulietta S.; Mueller, James L.

    2000-01-01

    The document stipulates protocols for measuring bio-optical and radiometric data for the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project activities and algorithm development. This document supersedes the earlier version (Mueller and Austin 1995) published as Volume 25 in the SeaWiFS Technical Report Series. This document marks a significant departure from, and improvement on, the format and content of Mueller and Austin (1995). The authorship of the protocols has been greatly broadened to include experts specializing in some key areas. New chapters have been added to provide detailed and comprehensive protocols for stability monitoring of radiometers using portable sources, above-water measurements of remote-sensing reflectance, spectral absorption measurements for discrete water samples, HPLC pigment analysis and fluorometric pigment analysis. Protocols were included in Mueller and Austin (1995) for each of these areas, but the new treatment makes significant advances in each topic area. There are also new chapters prescribing protocols for calibration of sun photometers and sky radiance sensors, sun photometer and sky radiance measurements and analysis, and data archival. These topic areas were barely mentioned in Mueller and Austin (1995).

  18. Comparative Study on Various Authentication Protocols in Wireless Sensor Networks

    PubMed Central

    Rajeswari, S. Raja; Seenivasagam, V.

    2016-01-01

    Wireless sensor networks (WSNs) consist of lightweight devices with low cost, low power, and short-ranged wireless communication. The sensors can communicate with each other to form a network. In WSNs, broadcast transmission is widely used along with the maximum usage of wireless networks and their applications. Hence, it has become crucial to authenticate broadcast messages. Key management is also an active research topic in WSNs. Several key management schemes have been introduced, and their benefits are not recognized in a specific WSN application. Security services are vital for ensuring the integrity, authenticity, and confidentiality of the critical information. Therefore, the authentication mechanisms are required to support these security services and to be resilient to distinct attacks. Various authentication protocols such as key management protocols, lightweight authentication protocols, and broadcast authentication protocols are compared and analyzed for all secure transmission applications. The major goal of this survey is to compare and find out the appropriate protocol for further research. Moreover, the comparisons between various authentication techniques are also illustrated. PMID:26881272

  19. Meeting pragmatism halfway: making a pragmatic clinical trial protocol.

    PubMed

    Rushforth, Alexander

    2015-11-01

    Pragmatic clinical trials (PCTs) are today an increasingly prominent means of measuring the 'effectiveness' of healthcare interventions in 'real world' clinical settings, in order to produce evidence on which to base regulatory and clinical decision-making. Although several sociological studies have shown persuasively how PCTs are co-constructed within the particular healthcare systems in which they are based, they have tended to focus on relatively late stages in the careers of trials. The paper contributes to the literature by considering how the 'real world' of the UK National Health Service (NHS) is incorporated into the design of a research protocol. Drawing on a meeting held just prior to patient recruitment for a PCT in maternal health, the paper analyses a trial collective's efforts to purify the messy domain of NHS clinical care into the orderly confines of the protocol (Law 2004), which meant satisfying demands for both scientific and social robustness (c.f. Nowotny et al. 2001). The findings show how efforts to inscribe robustness into the PCT protocol were themselves mediated through epistemic and regulatory conventions surrounding protocols as devices in healthcare research. Finally it is argued that meetings constitute an important epistemic instrument through which to settle various emerging tensions in PCT protocol design.

  20. A novel protocol for multiparty quantum key management

    NASA Astrophysics Data System (ADS)

    Xu, Gang; Chen, Xiu-Bo; Dou, Zhao; Yang, Yi-Xian; Li, Zongpeng

    2015-08-01

    Key management plays a fundamental role in the field of cryptography. In this paper, we propose a novel multiparty quantum key management (QKM) protocol. Departing from single-function quantum cryptography protocols, our protocol has a salient feature in that it accomplishes a complete QKM process. In this process, we can simultaneously realize the functions of key generation, key distribution and key backup by executing the protocol once. Meanwhile, for the first time, we propose the idea of multi-function QKM. Firstly, the secret key is randomly generated by managers via quantum measurements in the d-level Bell basis. Then, through entanglement swapping, the secret key is successfully distributed to users. Under circumstances of urgent requirement, all managers can cooperate to recover the users' secret key, but none of them can recover it unilaterally. Furthermore, this protocol is further generalized into the multi-manager and multi-user QKM scenario. It has clear advantages in the burgeoning area of quantum security group communication. In this system, all group members share the same group key, and group key management is the foundation of secure group communication and hence an important subject of study.